CN114660588A - Distributed photoelectric target tracking system for anti-unmanned aerial vehicle - Google Patents

Distributed photoelectric target tracking system for anti-unmanned aerial vehicle

Info

Publication number
CN114660588A
CN114660588A, CN202210176675.6A, CN202210176675A
Authority
CN
China
Prior art keywords
target
unmanned aerial
aerial vehicle
photoelectric
photoelectric tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210176675.6A
Other languages
Chinese (zh)
Other versions
CN114660588B (en)
Inventor
梁中岩
王坤
宫世杰
郭乔进
王森
沈琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 28 Research Institute
Original Assignee
CETC 28 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 28 Research Institute filed Critical CETC 28 Research Institute
Priority to CN202210176675.6A priority Critical patent/CN114660588B/en
Priority claimed from CN202210176675.6A external-priority patent/CN114660588B/en
Publication of CN114660588A publication Critical patent/CN114660588A/en
Application granted granted Critical
Publication of CN114660588B publication Critical patent/CN114660588B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 - Radar-tracking systems; Analogous systems
    • G01S13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar, by using numerical data
    • G01S13/726 - Multiple target tracking
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 - Combination of radar systems with cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a distributed photoelectric target tracking system for anti-unmanned aerial vehicle use, comprising a multi-target tracking radar, a data processing server, photoelectric tracking observation equipment, a video processing server, a video storage server, a situation display client and a video tracking client. The system monitors unmanned aerial vehicle targets over a specific protection area in real time and forms a three-dimensional observation network covering that area; it provides real-time image monitoring and tracking measurement of aerial unmanned aerial vehicle targets, supplies more accurate target position information for striking the unmanned aerial vehicle, supports real-time target tracking from long range down to close range as well as after-the-fact analysis of the whole process, displays the global situation of the tracking process, and forms a flexibly reconfigurable, three-dimensional unmanned aerial vehicle target observation network covering the specific protection area.

Description

Distributed photoelectric target tracking system for anti-unmanned aerial vehicle
Technical Field
The invention relates to a target tracking system, in particular to a distributed photoelectric target tracking system for an anti-unmanned aerial vehicle.
Background
In recent years, unmanned aerial vehicles have developed rapidly in the civil and commercial fields thanks to their low cost, small size, light weight, ease of operation, flexibility, adaptability and stability, and their adoption keeps rising. At the same time, abuse of unmanned aerial vehicles, unauthorized ("black") flights and malicious use pose many threats to national security. In public areas such as airports in particular, incidents occur frequently, making real-time monitoring of unmanned aerial vehicles necessary.
An unmanned aerial vehicle has a small radar reflection area, flies low and slow, and in environments such as cities and airports ground clutter is heavy, so radar alone has difficulty detecting it effectively. Photoelectric detection equipment offers high measurement precision, reliable data and stable tracking, and fusing radar and photoelectric data improves the accuracy of target judgment.
Disclosure of Invention
Purpose of the invention: the invention aims to solve the above technical problems of the prior art by providing a distributed photoelectric target tracking system for anti-unmanned aerial vehicle use.
In order to solve the technical problem, the invention discloses a distributed photoelectric target tracking system for an anti-unmanned aerial vehicle, which comprises: the system comprises a multi-target tracking radar, a data processing server, a photoelectric tracking observation device, a video processing server, a video storage server, a situation display client and a video tracking client;
the multi-target tracking radar is used for acquiring the real-time three-dimensional position of the target unmanned aerial vehicle and transmitting the real-time three-dimensional position to the data processing server;
the data processing server is used for fusing and processing data of the multi-target tracking radar and the photoelectric tracking observation equipment and controlling the photoelectric tracking observation equipment to track the target unmanned aerial vehicle in real time;
the photoelectric tracking observation equipment is deployed around or in a protection area in a distributed manner by adopting a plurality of pieces of equipment, receives a control instruction sent by a data processing server and adjusts the azimuth angle and the pitch angle of the equipment in real time;
the video processing server is used for carrying out real-time target identification on the unmanned aerial vehicle target in the video picture of the photoelectric tracking observation device and sending the target position in the video to the data processing server;
the video Storage server provides functions of storing, inquiring, replaying and downloading historical videos based on a centralized Storage mode of an IP-SAN (referred to as SAN, Storage Area Network is a technology of integrating Storage equipment, connecting equipment and an interface in a high-speed Network, and the IP-SAN is an SAN technology based on an IP Network);
the situation display client is used for displaying flight track data of the target unmanned aerial vehicle, a target sign and two-dimensional and three-dimensional map information;
the video tracking client is used for displaying real-time video pictures of the photoelectric tracking observation equipment.
In the present invention, the data processing server and the video processing server operate as follows:
the data processing server performs fusion processing on data of the multi-target tracking radar and the photoelectric tracking observation device, receives position information of the target unmanned aerial vehicle sent by the multi-target tracking radar in real time, calculates an azimuth angle and a pitch angle of the photoelectric tracking observation device closest to the target unmanned aerial vehicle according to the longitude, the latitude and the altitude of the target unmanned aerial vehicle and the longitude, the latitude and the altitude of the photoelectric tracking observation device, and controls the photoelectric tracking observation device to adjust the orientation of a holder to perform target tracking;
the video processing server identifies the target of the unmanned aerial vehicle in a real-time video picture returned by the photoelectric tracking observation equipment, video verification is carried out on the target indicated by the radar, if the unmanned aerial vehicle is identified, the target position in the video is sent to the data processing server, and the data processing server carries out unmanned aerial vehicle position correction to enable the target unmanned aerial vehicle to be positioned in the center of the video picture and continuously track the target unmanned aerial vehicle; if no drone is identified, the tracking is stopped.
In the invention, the data processing server provides multi-target cooperative guidance:
after the data processing server receives the track data of the target unmanned aerial vehicle, according to the position of the guided photoelectric tracking observation device and the holder information, processing and generating multi-channel guide data aiming at the guided photoelectric tracking observation device at different positions, according to the nearest optimal guide principle, issuing different guide data to the corresponding guided photoelectric tracking observation device, and controlling the holder to rotate so as to track the target, so that the target is always positioned in the central range of the observation image.
In the present invention, the data processing server provides single-target multi-device guidance:
after the target unmanned aerial vehicle enters the optimal detection area of the guided photoelectric tracking observation device, the data processing server issues the guiding data to the guided photoelectric tracking observation device for target guidance, after the target unmanned aerial vehicle moves into the optimal detection area of another guided photoelectric tracking observation device after a period of time, the data processing server automatically issues another path of guiding data to another guided photoelectric tracking observation device, and the two guided photoelectric tracking observation devices realize the whole-process tracking of the target through linkage relay; when the aerial target is less or a certain target unmanned aerial vehicle needs to be focused, the system simultaneously moves the multi-path photoelectric tracking observation equipment to observe from multiple angles.
In the invention, the method for anti-unmanned aerial vehicle tracking using the system comprises the following steps:
step 1, detecting a real-time position of a target Unmanned Aerial Vehicle (UAV) by a multi-target tracking radar, wherein the real-time position comprises longitude x, latitude y and height z, and transmitting the position to a data processing server;
step 2, the data processing server receives the position information of the target unmanned aerial vehicle sent by the multi-target tracking radar in real time; from the longitude x, latitude y and altitude z of the target unmanned aerial vehicle and the manually calibrated longitude $x_0$, latitude $y_0$ and altitude $z_0$ of the photoelectric tracking observation equipment, it resolves the azimuth angle $\varphi$ and pitch angle $\theta$ of the photoelectric tracking observation equipment nearest to the target unmanned aerial vehicle and controls that equipment to adjust the orientation of its holder (pan-tilt head) for target tracking; the target tracking comprises multi-target cooperative guidance and single-target multi-device guidance;
step 3, the video processing server identifies the unmanned aerial vehicle target in the real-time video picture returned by the photoelectric tracking observation equipment and performs video verification of the target indicated by the radar; if an unmanned aerial vehicle is identified, the target position in the video is sent to the data processing server, which corrects the unmanned aerial vehicle position so that the target unmanned aerial vehicle stays in the center of the video picture and is tracked continuously; if no unmanned aerial vehicle is identified, tracking is stopped.
In the invention, the step 2 comprises the following steps:
step 2-1, calculating a pitch angle of the photoelectric tracking observation equipment;
step 2-2, calculating the azimuth angle of the photoelectric tracking observation equipment;
step 2-3, multi-target cooperative guidance;
and step 2-4, single-target multi-device guidance.
In the invention, the pitch angle theta of the photoelectric tracking observation device in the step 2-1 is calculated by the following steps:
step 2-1-1, the ground distance D from the point where the spatial position of the target unmanned aerial vehicle maps onto the ground to the photoelectric tracking observation device can be calculated with the Haversine formula (reference: R. W. Sinnott, "Virtues of the Haversine", Sky and Telescope, vol. 68, no. 2, 1984):
$\Delta\sigma = 2\arcsin\sqrt{\sin^2\frac{\Delta y}{2} + \cos y \cos y_0 \sin^2\frac{\Delta x}{2}}$
$D = \Delta\sigma \times R_E$
where the earth is assumed to be an ideal sphere, $R_E = 6371\,\mathrm{km}$ is the radius of the earth, $\Delta y = |y - y_0|$ is the absolute value of the latitude difference (in radians) between the two points, $\Delta x = |x - x_0|$ is the absolute value of the longitude difference (in radians) between the two points, and $\Delta\sigma$ is the central angle subtended at the earth's center by the two points $(x, y, z)$ and $(x_0, y_0, z_0)$;
step 2-1-2, relative to the earth, the ground distance D can be approximated as a straight line, and the trigonometric relationship gives:
$\theta = \arctan\frac{z - z_0}{D}$
step 2-1-3, calculating the spatial distance L between the photoelectric tracking observation device and the target unmanned aerial vehicle:
$L = \sqrt{D^2 + (z - z_0)^2}$
step 2-1-4, collecting suitable focal length data for each distance segment to enable rapid focusing and form a mapping table of the distance L and the focal length f; the focal length of the photoelectric tracking observation device is adjusted by querying the distance-focal-length mapping table with the spatial distance L;
step 2-1-5, correcting the pitch angle of the photoelectric tracking observation equipment, as follows:
the pitch-angle correction angle $\Delta\theta$ of the photoelectric tracking observation equipment is adjusted downward when negative and upward when positive; from the trigonometric relationship it can be derived that:
$\Delta\theta = \arctan\left(\frac{2h}{H}\tan\frac{\beta}{2}\right)$
where $\beta$ is the vertical field angle of the photoelectric tracking observation equipment,
$\beta = 2\arctan\frac{C_h}{2f}$
$C_h$ is the target-surface height of the CCD (charge-coupled device) image sensor in the photoelectric tracking observation equipment, f is the focal length, H is the picture height in pixels, and h is the pixel coordinate of the target measured from the picture center point.
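For illustration, the geometric relations of steps 2-1-1 to 2-1-5 can be written as a short Python sketch. It is only a minimal reading of the formulas above; the function names, the argument conventions (coordinates in radians, altitudes and distances in meters) and the use of math.atan2 are choices made for this example, not part of the patent.

```python
import math

EARTH_RADIUS_M = 6_371_000  # R_E, the earth treated as an ideal sphere

def ground_distance(lon, lat, lon0, lat0):
    """Haversine ground distance D (step 2-1-1); angles in radians."""
    d_lat = abs(lat - lat0)
    d_lon = abs(lon - lon0)
    central_angle = 2 * math.asin(math.sqrt(
        math.sin(d_lat / 2) ** 2
        + math.cos(lat) * math.cos(lat0) * math.sin(d_lon / 2) ** 2))
    return central_angle * EARTH_RADIUS_M

def pitch_and_range(lon, lat, alt, lon0, lat0, alt0):
    """Pitch angle theta and spatial distance L (steps 2-1-2 / 2-1-3)."""
    d = ground_distance(lon, lat, lon0, lat0)
    theta = math.atan2(alt - alt0, d)
    space_dist = math.hypot(d, alt - alt0)
    return theta, space_dist

def pitch_correction(h_offset_px, frame_height_px, sensor_height_mm, focal_mm):
    """Correction angle delta-theta from the vertical pixel offset h
    (step 2-1-5): negative -> tilt down, positive -> tilt up."""
    beta = 2 * math.atan(sensor_height_mm / (2 * focal_mm))  # vertical field angle
    return math.atan(2 * h_offset_px / frame_height_px * math.tan(beta / 2))
```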
In the invention, the azimuth angle $\varphi$ required for the photoelectric tracking observation equipment in step 2-2 to observe the target unmanned aerial vehicle is calculated as follows:
step 2-2-1, establishing an auxiliary line and calculating the due-north distance d between the photoelectric target tracking equipment and the target unmanned aerial vehicle; within the actual observation range of the system, the Haversine formula (with the longitude difference set to zero) gives:
$d = \Delta y \times R_E$
where $\pi$ is the circular constant (entering through the degree-to-radian conversion of the coordinates);
step 2-2-2, with respect to the earth, the ground distances D and d are approximated as straight lines, and the trigonometric relationship gives:
$\varphi = \arccos\frac{d}{D}$
step 2-2-3, correcting the azimuth angle of the photoelectric tracking observation equipment, as follows:
the azimuth correction angle $\Delta\varphi$ of the photoelectric tracking observation equipment is adjusted to the left when negative and to the right when positive, and is calculated as:
$\Delta\varphi = \arctan\left(\frac{2w}{W}\tan\frac{\gamma}{2}\right)$
where $\gamma$ is the horizontal field angle of the photoelectric tracking observation equipment,
$\gamma = 2\arctan\frac{C_w}{2f}$
$C_w$ is the CCD target-surface width, f is the focal length, W is the picture width in pixels, and w is the pixel coordinate of the target measured from the picture center point.
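Similarly, steps 2-2-1 to 2-2-3 can be sketched in Python. The due-north distance, the arccos relation and the pixel-offset correction follow the formulas above; the quadrant disambiguation from the signs of the latitude and longitude differences is an added assumption of this example, since the text only treats the basic geometry.

```python
import math

EARTH_RADIUS_M = 6_371_000  # R_E

def azimuth_from_north(lon, lat, lon0, lat0, ground_dist):
    """Azimuth (radians, clockwise from true north) needed to point the device
    at the target (steps 2-2-1 / 2-2-2); ground_dist is D from step 2-1-1."""
    if ground_dist == 0:
        return 0.0
    d_north = abs(lat - lat0) * EARTH_RADIUS_M          # due-north distance d
    phi = math.acos(min(1.0, d_north / ground_dist))    # angle from the N-S axis
    # Quadrant resolution (assumption, not spelled out in the text):
    if lat < lat0:                   # target south of the device
        phi = math.pi - phi
    if lon < lon0:                   # target west of the device
        phi = 2 * math.pi - phi
    return phi

def azimuth_correction(w_offset_px, frame_width_px, sensor_width_mm, focal_mm):
    """Correction angle delta-phi from the horizontal pixel offset w
    (step 2-2-3): negative -> slew left, positive -> slew right."""
    gamma = 2 * math.atan(sensor_width_mm / (2 * focal_mm))  # horizontal field angle
    return math.atan(2 * w_offset_px / frame_width_px * math.tan(gamma / 2))
```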
In the invention, the multi-target cooperative guidance method in step 2-3 comprises the following steps: after receiving the track data of the target unmanned aerial vehicles, the data processing server processes and generates multiple paths of guidance data for the guided photoelectric tracking observation equipment at different positions according to the position and holder information of the guided photoelectric tracking observation equipment, issues the different guidance data to the corresponding guided photoelectric tracking observation equipment according to the nearest-optimal guidance principle, and controls the holder to rotate to track the target, so that the target is always positioned in the central range of the observation image; the nearest-optimal guidance method comprises the following steps:
step 2-3-1, calculating the spatial distances $L_1, L_2, \ldots, L_n$ between the unallocated target unmanned aerial vehicles and the photoelectric tracking observation devices in the idle state, where n is the number of unallocated target unmanned aerial vehicles multiplied by the number of photoelectric tracking observation devices in the idle state;
step 2-3-2, allocating each target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_a = \min\{L_1, L_2, \ldots, L_n\}$;
Step 2-3-3, if the spatial distance between a target unmanned aerial vehicle to be allocated and the photoelectric tracking observation device exceeds the maximum tracking distance, the allocation fails; if the spatial distance between an already allocated target unmanned aerial vehicle and its photoelectric tracking observation device exceeds the maximum tracking distance, the device is marked as idle and participates in allocation again;
and step 2-3-4, repeating steps 2-3-1 to 2-3-3 until no allocatable photoelectric tracking observation devices remain.
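A minimal sketch of the nearest-optimal allocation of steps 2-3-1 to 2-3-4, assuming each device exposes a position, a maximum tracking distance and a busy flag; the Device class, the distance_fn callback and all names are illustrative only, not interfaces defined by the patent.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

Position = Tuple[float, float, float]   # (lon, lat, alt)

@dataclass
class Device:
    name: str
    position: Position
    max_range_m: float
    busy: bool = False

def nearest_optimal_assign(targets: Dict[str, Position],
                           devices: list,
                           distance_fn: Callable[[Position, Position], float]):
    """Greedy allocation: repeatedly pair the globally closest unassigned
    target with an idle device until nothing assignable remains."""
    assignments = {}                     # target id -> device name
    unassigned = dict(targets)
    while unassigned:
        # step 2-3-1: spatial distances L1..Ln between unassigned targets and idle devices
        candidates = [(distance_fn(pos, dev.position), tid, dev)
                      for tid, pos in unassigned.items()
                      for dev in devices if not dev.busy]
        # step 2-3-3: a pair beyond the maximum tracking distance cannot be allocated
        candidates = [c for c in candidates if c[0] <= c[2].max_range_m]
        if not candidates:
            break                        # step 2-3-4: no allocatable device left
        # step 2-3-2: allocate along the shortest spatial distance La = min{...}
        _, tid, dev = min(candidates, key=lambda c: c[0])
        dev.busy = True
        assignments[tid] = dev.name
        unassigned.pop(tid)
    return assignments
```

When a tracked target later drifts beyond its device's maximum tracking distance (the second part of step 2-3-3), the device would simply be marked not busy again and the loop re-run on the next radar update.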
In the invention, the single-target multi-device guidance method in step 2-4 comprises: after the target unmanned aerial vehicle enters the optimal detection area of photoelectric tracking observation device No. 1, the data processing server issues guidance data to device No. 1 for target guidance; when, after a period of time, the target has moved into the optimal detection area of photoelectric tracking observation device No. 2, the data processing server automatically issues a second path of guidance data to device No. 2, and the two guided photoelectric tracking observation devices achieve whole-course tracking of the target through linked relay; when there are few aerial targets, or a particular target unmanned aerial vehicle requires special attention, the system simultaneously directs multiple photoelectric tracking observation devices to observe the target from multiple angles;
the single-target multi-device guiding method is suitable for the situation that only one target unmanned aerial vehicle exists or a certain target unmanned aerial vehicle needs to be focused on, and comprises the following steps:
step 2-4-1, calculating the spatial distances $L_1, L_2, \ldots, L_m$ between the target unmanned aerial vehicle and all photoelectric tracking observation devices, where m is the number of photoelectric tracking observation devices;
step 2-4-2, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_b = \min\{L_1, L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 1;
step 2-4-3, when the target unmanned aerial vehicle is about to fly out of the optimal detection area of photoelectric tracking observation device No. 1, i.e. $L_b > L$ (where L is the maximum detection distance of the photoelectric tracking observation device), recalculating the distances $L_2, \ldots, L_m$ between the target unmanned aerial vehicle and the photoelectric tracking observation devices other than device No. 1 (for convenience of description, assume $L_b = L_1$);
Step 2-4-4, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_c = \min\{L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 2, and repeating step 2-4-3 until the target unmanned aerial vehicle flies out of the observation areas of all photoelectric tracking observation devices.
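The relay of steps 2-4-1 to 2-4-4 can be expressed as a small hand-off rule evaluated on every radar update; it reuses the illustrative Device class and distance_fn from the previous sketch and is likewise only an assumed rendering of the described behavior, not the patent's implementation.

```python
def relay_handoff(target_pos, devices, distance_fn, current=None):
    """Single-target relay guidance: keep the current device while the target
    stays within its maximum detection distance L, otherwise hand over to the
    nearest of the remaining devices (steps 2-4-2 to 2-4-4).  Returns the
    device that should receive the guidance data, or None once the target has
    left every observation area."""
    if current is not None:
        if distance_fn(target_pos, current.position) <= current.max_range_m:
            return current                       # still inside device No. 1's area
        candidates = [d for d in devices if d is not current]   # step 2-4-3
    else:
        candidates = list(devices)               # steps 2-4-1 / 2-4-2
    in_range = [d for d in candidates
                if distance_fn(target_pos, d.position) <= d.max_range_m]
    if not in_range:
        return None
    # step 2-4-4: the device at the shortest spatial distance takes over
    return min(in_range, key=lambda d: distance_fn(target_pos, d.position))
```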
Beneficial effects:
In the distributed photoelectric target tracking system for anti-unmanned aerial vehicle use according to the invention, the multi-target tracking radar guides the photoelectric tracking observation devices to perform a secondary localization of the target unmanned aerial vehicle, making its position more accurate and tracking it in real time. The system supports multi-target cooperative guidance and single-target multi-device guidance, can simultaneously direct multiple photoelectric tracking observation devices to observe the target from multiple angles, provides more accurate target position information for striking the unmanned aerial vehicle, and forms a flexibly reconfigurable, three-dimensional unmanned aerial vehicle target observation network covering a specific protection area.
Drawings
The foregoing and/or other advantages of the invention will become further apparent from the following detailed description of the invention when taken in conjunction with the accompanying drawings.
FIG. 1 is a schematic diagram of a system architecture according to the present invention.
FIG. 2 is a flow chart of the present invention.
Fig. 3 is a diagram illustrating a deployment example of the photoelectric tracking observation device.
Fig. 4 is a schematic view of the pitch angle of the photoelectric tracking observation device.
FIG. 5 is a schematic view of the pitch angle correction of the photoelectric tracking observation device of the present invention.
Fig. 6 is a schematic view of the azimuth angle of the photoelectric tracking observation device of the present invention.
FIG. 7 is a schematic view of the azimuth angle calibration of the photoelectric tracking observation device of the present invention.
FIG. 8 is a diagram illustrating multi-target cooperative guiding according to the present invention.
FIG. 9 is a diagram of single-target multi-device guidance according to the present invention.
Detailed Description
As shown in fig. 1, the present invention mainly includes the following parts: the system comprises a multi-target tracking radar, a data processing server, a photoelectric tracking observation device, a video processing server, a video storage server, a situation display client and a video tracking client;
the multi-target tracking radar is used for acquiring the real-time three-dimensional position of the target unmanned aerial vehicle and transmitting the real-time three-dimensional position to the data processing server;
the data processing server is used for fusing and processing data of the multi-target tracking radar and the photoelectric tracking observation equipment and controlling the photoelectric tracking observation equipment to track the target unmanned aerial vehicle in real time;
the photoelectric tracking observation equipment is deployed around or in a protection area in a distributed mode by adopting a plurality of pieces of equipment, receives a control instruction sent by a data processing server, and adjusts an azimuth angle and a pitch angle of the equipment in real time;
the video processing server is used for carrying out real-time target identification on the unmanned aerial vehicle target in the video picture of the photoelectric tracking observation device and sending the target position in the video to the data processing server;
the video storage server provides functions of storing, inquiring, replaying and downloading historical videos based on an IP-SAN centralized storage mode;
the situation display client is used for displaying flight track data, a target sign and two-dimensional and three-dimensional map information of the target unmanned aerial vehicle;
the video tracking client is used for displaying real-time video pictures of the photoelectric tracking observation equipment.
As shown in fig. 2, a distributed photoelectric target tracking system for an anti-drone includes the following processing steps:
step a1, the multi-target tracking radar detects the real-time position (including longitude x, latitude y and height z) of the target unmanned aerial vehicle and transmits the position to the data processing server;
step a2, the data processing server receives the position information of the target unmanned aerial vehicle sent by the multi-target tracking radar in real time; from the longitude x, latitude y and altitude z of the target unmanned aerial vehicle and the manually calibrated longitude $x_0$, latitude $y_0$ and altitude $z_0$ of the photoelectric tracking observation equipment, it resolves the azimuth angle $\varphi$ and pitch angle $\theta$ of the photoelectric tracking observation equipment nearest to the target unmanned aerial vehicle and controls that equipment to adjust the orientation of its holder to track the target;
step a3, the video processing server identifies the unmanned aerial vehicle target in the real-time video picture returned by the photoelectric tracking observation equipment and performs video verification of the target indicated by the radar; if an unmanned aerial vehicle is identified, the target position in the video is sent to the data processing server, which corrects the unmanned aerial vehicle position so that the target unmanned aerial vehicle stays in the center of the video picture and is tracked continuously; if no unmanned aerial vehicle is identified, tracking is stopped.
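The overall radar-guide / video-verify / correct loop of steps a1 to a3 can be summarized structurally as follows. Every method called on the radar, data_server, video_server and device objects (latest_position, resolve_angles, detect_drone_in_frame, and so on) is a placeholder invented for this sketch; the patent does not define these interfaces.

```python
def tracking_cycle(radar, data_server, video_server, device):
    """One cycle of the radar-guided photoelectric tracking loop (a1-a3)."""
    # a1: the radar reports the target's real-time position (lon x, lat y, alt z)
    x, y, z = radar.latest_position()

    # a2: resolve azimuth and pitch from the target position and the manually
    #     calibrated device position, then slew the device's holder
    phi, theta = data_server.resolve_angles((x, y, z), device.calibrated_position)
    device.slew(azimuth=phi, pitch=theta)

    # a3: verify the radar indication in the returned video picture
    frame = device.grab_frame()
    box = video_server.detect_drone_in_frame(frame)
    if box is None:
        device.stop_tracking()        # no drone identified: stop tracking
        return False

    # position correction: keep the drone in the center of the picture
    d_phi, d_theta = data_server.pixel_offsets_to_angles(box, frame, device)
    device.slew(azimuth=phi + d_phi, pitch=theta + d_theta)
    return True
```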
As shown in fig. 3, in a typical deployment of the photoelectric tracking observation devices of the present invention, the solid rectangular region in the figure is the protection region and the dotted lines are the trackable detection regions of the photoelectric tracking observation devices; the devices can be deployed flexibly according to the extent of the region to be protected.
As shown in fig. 4, in the method for calculating the pitch angle of the photoelectric tracking observation device of the present invention, (x, y, z) in the diagram are the longitude, latitude and altitude of the target unmanned aerial vehicle detected by the radar, $(x_0, y_0, z_0)$ are the manually calibrated longitude, latitude and altitude of the photoelectric tracking observation device, $\theta$ is the pitch angle required for the photoelectric tracking observation device to observe the target unmanned aerial vehicle, and D is the ground distance from the point where the spatial position of the target unmanned aerial vehicle maps onto the ground to the photoelectric tracking observation device. The pitch angle of the photoelectric tracking observation device is calculated as follows:
step b1, the ground distance D can be calculated with the Haversine formula:
$\Delta\sigma = 2\arcsin\sqrt{\sin^2\frac{\Delta y}{2} + \cos y \cos y_0 \sin^2\frac{\Delta x}{2}}$
$D = \Delta\sigma \times R_E$
where the earth is assumed to be an ideal sphere, $R_E = 6371\,\mathrm{km}$ is the radius of the earth, $\Delta y = |y - y_0|$ is the absolute value of the latitude difference (in radians) between the two points, $\Delta x = |x - x_0|$ is the absolute value of the longitude difference (in radians) between the two points, and $\Delta\sigma$ is the central angle subtended at the earth's center by the two points $(x, y, z)$ and $(x_0, y_0, z_0)$;
step b2, relative to the earth, the ground distance D can be approximated as a straight line, and the trigonometric relationship gives:
$\theta = \arctan\frac{z - z_0}{D}$
step b3, the spatial distance L between the photoelectric tracking observation device and the target unmanned aerial vehicle is calculated as:
$L = \sqrt{D^2 + (z - z_0)^2}$
and step b4, collecting suitable focal length data for each distance segment to form a mapping table of the distance L and the focal length f for quick focusing; the focal length of the photoelectric tracking observation device is adjusted by querying the distance-focal-length mapping table with the spatial distance L.
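As a concrete reading of step b4 (and step 2-1-4 above), the distance-to-focal-length mapping can be held as a small lookup table; the segment boundaries and focal lengths below are made-up illustration values, not calibration data from the patent.

```python
import bisect

# Hypothetical calibration table: (upper bound of distance segment in m, focal length in mm)
FOCUS_TABLE = [(500, 25.0), (1000, 50.0), (2000, 90.0), (4000, 150.0), (8000, 300.0)]

def focal_length_for_distance(space_dist_m: float) -> float:
    """Return the pre-calibrated focal length f for the spatial distance L."""
    bounds = [upper for upper, _ in FOCUS_TABLE]
    idx = min(bisect.bisect_left(bounds, space_dist_m), len(FOCUS_TABLE) - 1)
    return FOCUS_TABLE[idx][1]
```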
As shown in fig. 5, in the method for correcting the pitch angle of the photoelectric tracking observation device of the present invention, $\beta$ is the vertical field angle of the photoelectric tracking observation device, H is the picture height in pixels, h is the pixel coordinate of the target measured from the picture center point, 2B is the actual scene height corresponding to the picture, and b is the actual height offset of the aircraft from the picture center point; the picture center point is taken as 0, upward is positive and downward is negative.
From the trigonometric relationship it can be derived that:
$\Delta\theta = \arctan\left(\frac{2h}{H}\tan\frac{\beta}{2}\right)$
where
$\beta = 2\arctan\frac{C_h}{2f}$
$C_h$ is the CCD target-surface height and f is the focal length; $\Delta\theta$ is adjusted downward when negative and upward when positive.
As shown in fig. 6, in the method for calculating the azimuth angle of the photoelectric tracking observation device of the present invention, altitude is not considered and the target unmanned aerial vehicle and the photoelectric tracking observation device are assumed to lie in the same plane. In the figure, (x, y) are the longitude and latitude of the target unmanned aerial vehicle detected by the radar, $(x_0, y_0)$ are the manually calibrated longitude and latitude of the photoelectric tracking observation device, $\varphi$ is the azimuth angle required for the photoelectric tracking observation device to observe the target unmanned aerial vehicle, D is the ground distance from the point where the spatial position of the target unmanned aerial vehicle maps onto the ground to the photoelectric tracking observation device, and N is true north, taken as the 0° azimuth. The azimuth angle of the photoelectric tracking observation device is calculated as follows:
step c1, establishing an auxiliary line and calculating d in fig. 6; within the actual observation range of the system, the Haversine formula (with the longitude difference set to zero) gives:
$d = \Delta y \times R_E$
step c2, the ground distances D and d can be approximated as straight lines relative to the earth, and the trigonometric relationship gives:
$\varphi = \arccos\frac{d}{D}$
as shown in fig. 7, the method for correcting the azimuth angle of the photoelectric tracking and observing device according to the embodiment of the present invention includes that γ is the horizontal field angle of the photoelectric tracking and observing device, W is the width of the picture pixel, W is the pixel coordinate from the center point of the picture, 2A is the actual scene width corresponding to the picture, a is the actual width distance from the airplane to the center point of the picture, the center point of the picture is 0 point, the right direction is positive, and the left direction is negative.
According to the trigonometric function relationship, it can be deduced that:
Figure BDA0003520549190000101
wherein,
Figure BDA0003520549190000102
Cwis the width of the CCD target surface, f is the focal length,
Figure BDA0003520549190000103
negative for left offset adjustment and positive for right offset adjustment.
As shown in fig. 8, in the multi-target cooperative guidance method of the present invention, after receiving the track data of the target unmanned aerial vehicles, the data processing server processes and generates multiple paths of guidance data for the guided photoelectric tracking observation devices at different positions according to their positions and holder information, issues the different guidance data to the corresponding guided photoelectric tracking observation devices according to the "nearest-optimal guidance" principle, and controls the holders to rotate to track the targets, so that each target is always located within the central range of the observation image.
The "nearest-optimal guidance" principle comprises the following steps:
step d1, calculating the spatial distances $L_1, L_2, \ldots, L_n$ between the target unmanned aerial vehicles and the photoelectric tracking observation devices in the idle state;
step d2, allocating each target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_a = \min\{L_1, L_2, \ldots, L_n\}$;
step d3, if the spatial distance between a target unmanned aerial vehicle to be allocated and the photoelectric tracking observation device exceeds the maximum tracking distance, the allocation fails; if the spatial distance between an already allocated target unmanned aerial vehicle and its photoelectric tracking observation device exceeds the maximum tracking distance, the device is marked as idle and participates in allocation again;
and step d4, repeating steps d1 to d3 until no allocatable photoelectric tracking observation devices remain.
As shown in fig. 9, in the single-target multi-device guidance method of the present invention, after the target unmanned aerial vehicle enters the optimal detection area of photoelectric tracking observation device No. 1, the data processing server issues guidance data to device No. 1 for target guidance; when, after a period of time, the target has moved into the optimal detection area of photoelectric tracking observation device No. 2, the data processing server automatically issues a second path of guidance data to device No. 2, and the two guided photoelectric tracking observation devices achieve whole-course tracking of the target through linked relay; when there are few aerial targets, or a particular target unmanned aerial vehicle requires special attention, the system can simultaneously direct multiple photoelectric tracking observation devices to observe the target from multiple angles.
The single-target multi-device guiding method is suitable for the situation that only one target unmanned aerial vehicle exists or a certain target unmanned aerial vehicle needs to be focused, and the single-target multi-device guiding method comprises the following steps:
step e1, calculating the spatial distances $L_1, L_2, \ldots, L_m$ between the target unmanned aerial vehicle and all photoelectric tracking observation devices;
step e2, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_b = \min\{L_1, L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 1;
step e3, when the target unmanned aerial vehicle is about to fly out of the optimal detection area of photoelectric tracking observation device No. 1, i.e. $L_b > L$ (where L is the maximum detection distance of the photoelectric tracking observation device), recalculating the distances $L_2, \ldots, L_m$ between the target unmanned aerial vehicle and the photoelectric tracking observation devices other than device No. 1 (for convenience of description, assume $L_b = L_1$);
and step e4, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_c = \min\{L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 2, and repeating the above steps until the target unmanned aerial vehicle flies out of the observation areas of all photoelectric tracking observation devices.
The invention provides a concept and a method for a distributed photoelectric target tracking system for anti-unmanned aerial vehicle use; there are many ways to implement the technical scheme, and the above description is only a preferred embodiment of the invention. It should be noted that a person skilled in the art can make several improvements and refinements without departing from the principle of the invention, and such improvements and refinements should also be regarded as falling within the protection scope of the invention. All components not specified in this embodiment can be realized with the prior art.

Claims (10)

1. A distributed photoelectric target tracking system for an anti-unmanned aerial vehicle, comprising: a multi-target tracking radar, a data processing server, a photoelectric tracking observation device, a video processing server, a video storage server, a situation display client and a video tracking client;
the multi-target tracking radar is used for acquiring the real-time three-dimensional position of the target unmanned aerial vehicle and transmitting the real-time three-dimensional position to the data processing server;
the data processing server is used for fusing and processing data of the multi-target tracking radar and the photoelectric tracking observation equipment and controlling the photoelectric tracking observation equipment to track the target unmanned aerial vehicle in real time;
the photoelectric tracking observation equipment is deployed around or in a protection area in a distributed mode by adopting a plurality of pieces of equipment, receives a control instruction sent by a data processing server, and adjusts an azimuth angle and a pitch angle of the equipment in real time;
the video processing server is used for carrying out real-time target identification on the unmanned aerial vehicle target in the video picture of the photoelectric tracking observation device and sending the target position in the video to the data processing server;
the video storage server provides functions of storing, inquiring, replaying and downloading historical videos based on an IP-SAN centralized storage mode;
the situation display client is used for displaying flight track data of the target unmanned aerial vehicle, a target sign and two-dimensional and three-dimensional map information;
the video tracking client is used for displaying real-time video pictures of the photoelectric tracking observation equipment.
2. The distributed photoelectric target tracking system for anti-drones according to claim 1, characterized in that the data processing server and the video processing server comprise:
the data processing server performs fusion processing on data of the multi-target tracking radar and the photoelectric tracking observation device, receives position information of the target unmanned aerial vehicle sent by the multi-target tracking radar in real time, calculates an azimuth angle and a pitch angle of the photoelectric tracking observation device closest to the target unmanned aerial vehicle according to the longitude, the latitude and the altitude of the target unmanned aerial vehicle and the longitude, the latitude and the altitude of the photoelectric tracking observation device, and controls the photoelectric tracking observation device to adjust the orientation of a holder to perform target tracking;
the video processing server identifies the unmanned aerial vehicle target in the real-time video picture returned by the photoelectric tracking observation equipment and performs video verification of the target indicated by the radar; if an unmanned aerial vehicle is identified, the target position in the video is sent to the data processing server, which corrects the unmanned aerial vehicle position so that the target unmanned aerial vehicle stays in the center of the video picture and is tracked continuously; if no unmanned aerial vehicle is identified, tracking is stopped.
3. The distributed photoelectric target tracking system for anti-drones according to claim 2, characterized in that the data processing server provides multi-target cooperative guidance:
after the data processing server receives the track data of the target unmanned aerial vehicle, according to the position of the guided photoelectric tracking observation device and the holder information, processing and generating multi-channel guide data aiming at the guided photoelectric tracking observation device at different positions, according to the nearest optimal guide principle, issuing different guide data to the corresponding guided photoelectric tracking observation device, and controlling the holder to rotate so as to track the target, so that the target is always positioned in the central range of the observation image.
4. The distributed photoelectric target tracking system for anti-drones according to claim 3, characterized in that said data processing server provides single-target multi-device guidance:
after the target unmanned aerial vehicle enters the optimal detection area of a guided photoelectric tracking observation device, the data processing server issues guidance data to that device for target guidance; when, after a period of time, the target unmanned aerial vehicle has moved into the optimal detection area of another guided photoelectric tracking observation device, the data processing server automatically issues a second path of guidance data to that other device, and the two guided photoelectric tracking observation devices achieve whole-course tracking of the target through linked relay; when there are few aerial targets, or a particular target unmanned aerial vehicle requires special attention, the system simultaneously directs multiple photoelectric tracking observation devices to observe it from multiple angles.
5. The distributed photoelectric target tracking system for the anti-drone according to claim 4, characterized in that the method for anti-drone tracking with the system comprises the following steps:
step 1, detecting a real-time position of a target Unmanned Aerial Vehicle (UAV) by a multi-target tracking radar, wherein the real-time position comprises longitude x, latitude y and height z, and transmitting the position to a data processing server;
step 2, the data processing server receives the position information of the target unmanned aerial vehicle sent by the multi-target tracking radar in real time; from the longitude x, latitude y and altitude z of the target unmanned aerial vehicle and the manually calibrated longitude $x_0$, latitude $y_0$ and altitude $z_0$ of the photoelectric tracking observation equipment, it resolves the azimuth angle $\varphi$ and pitch angle $\theta$ of the photoelectric tracking observation equipment nearest to the target unmanned aerial vehicle and controls that equipment to adjust the orientation of its holder to track the target; the target tracking comprises multi-target cooperative guidance and single-target multi-device guidance;
step 3, the video processing server identifies the unmanned aerial vehicle target in the real-time video picture returned by the photoelectric tracking observation equipment and performs video verification of the target indicated by the radar; if an unmanned aerial vehicle is identified, the target position in the video is sent to the data processing server, which corrects the unmanned aerial vehicle position so that the target unmanned aerial vehicle stays in the center of the video picture and is tracked continuously; if no unmanned aerial vehicle is identified, tracking is stopped.
6. The distributed photoelectric target tracking system for anti-drones according to claim 5, characterized in that step 2 comprises:
step 2-1, calculating a pitch angle of the photoelectric tracking observation equipment;
step 2-2, calculating the azimuth angle of the photoelectric tracking observation equipment;
step 2-3, multi-target cooperative guidance;
and step 2-4, single-target multi-device guidance.
7. The distributed photoelectric target tracking system for the anti-unmanned aerial vehicle as claimed in claim 6, wherein the photoelectric tracking observation device pitch angle θ in step 2-1 is calculated by:
step 2-1-1, the ground distance D from the point where the spatial position of the target unmanned aerial vehicle maps onto the ground to the photoelectric tracking observation device can be calculated with the Haversine formula:
$\Delta\sigma = 2\arcsin\sqrt{\sin^2\frac{\Delta y}{2} + \cos y \cos y_0 \sin^2\frac{\Delta x}{2}}$
$D = \Delta\sigma \times R_E$
where the earth is assumed to be an ideal sphere, $R_E = 6371\,\mathrm{km}$ is the radius of the earth, $\Delta y = |y - y_0|$ is the absolute value of the latitude difference (in radians) between the two points, $\Delta x = |x - x_0|$ is the absolute value of the longitude difference (in radians) between the two points, and $\Delta\sigma$ is the central angle subtended at the earth's center by the two points $(x, y, z)$ and $(x_0, y_0, z_0)$;
step 2-1-2, relative to the earth, the ground distance D can be approximated as a straight line, and the trigonometric relationship gives:
$\theta = \arctan\frac{z - z_0}{D}$
step 2-1-3, calculating the spatial distance L between the photoelectric tracking observation device and the target unmanned aerial vehicle:
$L = \sqrt{D^2 + (z - z_0)^2}$
step 2-1-4, collecting suitable focal length data for each distance segment to enable rapid focusing and form a mapping table of the distance L and the focal length f; the focal length of the photoelectric tracking observation device is adjusted by querying the distance-focal-length mapping table with the spatial distance L;
step 2-1-5, correcting the pitch angle of the photoelectric tracking observation equipment, as follows:
the pitch-angle correction angle $\Delta\theta$ of the photoelectric tracking observation equipment is adjusted downward when negative and upward when positive; from the trigonometric relationship it can be derived that:
$\Delta\theta = \arctan\left(\frac{2h}{H}\tan\frac{\beta}{2}\right)$
where $\beta$ is the vertical field angle of the photoelectric tracking observation equipment,
$\beta = 2\arctan\frac{C_h}{2f}$
$C_h$ is the target-surface height of the CCD (charge-coupled device) image sensor in the photoelectric tracking observation equipment, f is the focal length, H is the picture height in pixels, and h is the pixel coordinate of the target measured from the picture center point.
8. The distributed photoelectric target tracking system for anti-drones according to claim 7, wherein the azimuth angle $\varphi$ required for the photoelectric tracking observation device in step 2-2 to observe the target unmanned aerial vehicle is calculated as follows:
step 2-2-1, establishing an auxiliary line and calculating the due-north distance d between the photoelectric target tracking equipment and the target unmanned aerial vehicle; within the actual observation range of the system, the Haversine formula (with the longitude difference set to zero) gives:
$d = \Delta y \times R_E$
where $\pi$ is the circular constant (entering through the degree-to-radian conversion of the coordinates);
step 2-2-2, with respect to the earth, the ground distances D and d are approximated as straight lines, and the trigonometric relationship gives:
$\varphi = \arccos\frac{d}{D}$
step 2-2-3, correcting the azimuth angle of the photoelectric tracking observation equipment, as follows:
the azimuth correction angle $\Delta\varphi$ of the photoelectric tracking observation equipment is adjusted to the left when negative and to the right when positive, and is calculated as:
$\Delta\varphi = \arctan\left(\frac{2w}{W}\tan\frac{\gamma}{2}\right)$
where $\gamma$ is the horizontal field angle of the photoelectric tracking observation equipment,
$\gamma = 2\arctan\frac{C_w}{2f}$
$C_w$ is the CCD target-surface width, f is the focal length, W is the picture width in pixels, and w is the pixel coordinate of the target measured from the picture center point.
9. The distributed photoelectric target tracking system for the anti-unmanned aerial vehicle as claimed in claim 8, wherein the multi-target cooperative guidance method in step 2-3 comprises: after receiving the track data of the target unmanned aerial vehicles, the data processing server processes and generates multiple paths of guidance data for the guided photoelectric tracking observation equipment at different positions according to the position and holder information of the guided photoelectric tracking observation equipment, issues the different guidance data to the corresponding guided photoelectric tracking observation equipment according to the nearest-optimal guidance principle, and controls the holder to rotate to track the target, so that the target is always positioned in the central range of the observation image; the nearest-optimal guidance method comprises the following steps:
step 2-3-1, calculating the spatial distances $L_1, L_2, \ldots, L_n$ between the unallocated target unmanned aerial vehicles and the photoelectric tracking observation devices in the idle state, where n is the number of unallocated target unmanned aerial vehicles multiplied by the number of photoelectric tracking observation devices in the idle state;
step 2-3-2, allocating each target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_a = \min\{L_1, L_2, \ldots, L_n\}$;
Step 2-3-3, if the spatial distance between a target unmanned aerial vehicle to be allocated and the photoelectric tracking observation device exceeds the maximum tracking distance, the allocation fails; if the spatial distance between an already allocated target unmanned aerial vehicle and its photoelectric tracking observation device exceeds the maximum tracking distance, the device is marked as idle and participates in allocation again;
and step 2-3-4, repeating steps 2-3-1 to 2-3-3 until no allocatable photoelectric tracking observation devices remain.
10. The distributed photoelectric target tracking system for anti-drones according to claim 9, wherein the single-target multi-device guidance method in step 2-4 comprises: after the target unmanned aerial vehicle enters the optimal detection area of photoelectric tracking observation device No. 1, the data processing server issues guidance data to device No. 1 for target guidance; when, after a period of time, the target has moved into the optimal detection area of photoelectric tracking observation device No. 2, the data processing server automatically issues a second path of guidance data to device No. 2, and the two guided photoelectric tracking observation devices achieve whole-course tracking of the target through linked relay; when there are few aerial targets, or a particular target unmanned aerial vehicle requires special attention, the system simultaneously directs multiple photoelectric tracking observation devices to observe the target from multiple angles;
the single-target multi-device guiding method is suitable for the situation that only one target unmanned aerial vehicle exists or a certain target unmanned aerial vehicle needs to be focused on, and comprises the following steps:
step 2-4-1, calculating the spatial distances $L_1, L_2, \ldots, L_m$ between the target unmanned aerial vehicle and all photoelectric tracking observation devices, where m is the number of photoelectric tracking observation devices;
step 2-4-2, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_b = \min\{L_1, L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 1;
step 2-4-3, when the target unmanned aerial vehicle is about to fly out of the optimal detection area of photoelectric tracking observation device No. 1, i.e. $L_b > L$ (where L is the maximum detection distance of the photoelectric tracking observation device), recalculating the distances $L_2, \ldots, L_m$ between the target unmanned aerial vehicle and the photoelectric tracking observation devices other than device No. 1;
Step 2-4-4, allocating the target unmanned aerial vehicle to the photoelectric tracking observation device at the shortest spatial distance, $L_c = \min\{L_2, \ldots, L_m\}$, recorded as photoelectric tracking observation device No. 2, and repeating step 2-4-3 until the target unmanned aerial vehicle flies out of the observation areas of all photoelectric tracking observation devices.
CN202210176675.6A 2022-02-25 Distributed photoelectric target tracking system for anti-unmanned aerial vehicle Active CN114660588B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210176675.6A CN114660588B (en) 2022-02-25 Distributed photoelectric target tracking system for anti-unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210176675.6A CN114660588B (en) 2022-02-25 Distributed photoelectric target tracking system for anti-unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN114660588A true CN114660588A (en) 2022-06-24
CN114660588B CN114660588B (en) 2024-10-22

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110398720A (en) * 2019-08-21 2019-11-01 深圳耐杰电子技术有限公司 A kind of anti-unmanned plane detection tracking interference system and photoelectric follow-up working method
KR102210902B1 (en) * 2020-11-16 2021-02-02 주식회사 영국전자 Anti-drone integrated system that dynamically detects drones
CN114047505A (en) * 2021-10-22 2022-02-15 四川九洲空管科技有限责任公司 Target detection display control method and device supporting photoelectric cross positioning in radar failure scene

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王伟东: "基于核相关滤波的改进目标跟踪算法" [Improved target tracking algorithm based on kernel correlation filtering], 数字技术与应用 [Digital Technology and Application], 25 July 2021 (2021-07-25) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116165674A (en) * 2023-04-25 2023-05-26 北京融合汇控科技有限公司 Accurate positioning method for black flying unmanned aerial vehicle

Similar Documents

Publication Publication Date Title
KR102018892B1 (en) Method and apparatus for controlling take-off and landing of unmanned aerial vehicle
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN108614273B (en) Airborne dual-waveband photoelectric wide-area reconnaissance and tracking device and method
CN106526551B (en) A kind of radar antenna dynamic performance testing system and method
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
USRE40800E1 (en) GPS Airborne target geolocating method
EP2724204B1 (en) Method for acquiring images from arbitrary perspectives with uavs equipped with fixed imagers
CN110033480B (en) Aerial photography measurement-based airborne photoelectric system target motion vector estimation method
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN107885223A (en) Unmanned plane recovery guiding system based on laser
CN102902282B (en) Based on the geographic tracking method that optical axis overlaps with the axes of inertia
JP6138326B1 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, PROGRAM FOR CONTROLLING MOBILE BODY, CONTROL SYSTEM, AND INFORMATION PROCESSING DEVICE
CN106468552A (en) A kind of two-shipper crossing location method based on airborne photoelectric platform
CN106291535A (en) A kind of obstacle detector, robot and obstacle avoidance system
US20120232717A1 (en) Remote coordinate identifier system and method for aircraft
CN109597432B (en) Unmanned aerial vehicle take-off and landing monitoring method and system based on vehicle-mounted camera unit
CN110879617A (en) Infrared-guided unmanned aerial vehicle landing method and device
CN106143932A (en) A kind of unmanned plane recovery system based on laser-beam riding
CN106094876A (en) A kind of unmanned plane target locking system and method thereof
KR20140030610A (en) Surveillance method for using unmanned aerial vehicles and ground observation equipments
CN111176323A (en) Radar and infrared integrated unmanned aerial vehicle landing control method and device
Kong et al. A ground-based multi-sensor system for autonomous landing of a fixed wing UAV
Miller et al. Optical Flow as a navigation means for UAV
CN114660588A (en) Distributed photoelectric target tracking system for anti-unmanned aerial vehicle
CN114660588B (en) Distributed photoelectric target tracking system for anti-unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant