CN113721665A - Pan-tilt control method based on machine vision and applied to anti-low-slow small target - Google Patents

Pan-tilt control method based on machine vision and applied to anti-low-slow small target

Info

Publication number
CN113721665A
Authority
CN
China
Prior art keywords
target
pan
tracking mode
low
tilt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011280436.2A
Other languages
Chinese (zh)
Inventor
林德福 (Lin Defu)
李帆 (Li Fan)
王辉 (Wang Hui)
宋韬 (Song Tao)
吴则良 (Wu Zeliang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN202011280436.2A
Publication of CN113721665A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a pan-tilt control method based on machine vision and applied to anti-low-slow small targets. Through an automated control flow for the unmanned aerial vehicle and the pan-tilt, the UAV carries the pan-tilt to cruise within a preset range so as to detect intruding low-slow small targets in time; after a target is found, the pan-tilt rotates at a specially scheduled angular velocity, which reduces the possibility of the target leaving the field of view, keeps the target as close to the center of the field of view as possible, and enhances the tracking effect.

Description

Pan-tilt control method based on machine vision and applied to anti-low-slow small target
Technical Field
The invention relates to a control method for a pan-tilt carried on an unmanned aerial vehicle, and in particular to a pan-tilt control method based on machine vision and applied to anti-low-slow small targets.
Background
"Low-slow small" target (LSST) refers to small aircraft and airborne objects that have all or part of the characteristics of low altitude, ultra-low altitude flight (flight height below 1000 m), flight speed less than 200km/h, and are not easily found by radar. The low-slow small aircraft has the flying height of below 1000 meters, the flying speed per hour of less than 200 kilometers and the radar reflection area of less than 2 square meters, and the low-slow small target is widely used and rapidly developed due to the wide range of the low-slow small target (including small and medium sized airplanes, helicopters, gliders, hot air balloons, unmanned aerial vehicles and other general aviation equipment and aviation sports equipment) and the development of science and technology.
The development of low-slow small targets has raised the level of national economic development, but it is a double-edged sword: incidents involving such targets have risen markedly in recent years, the threat they pose to important facilities, key areas, and major public activities is increasingly prominent, and once misused by malicious actors they could cause unimaginable consequences. With the opening of low-altitude airspace in China, supervising and defending against low-slow small targets has become an urgent problem, and the accurate detection, interception, tracking, and striking of such targets is both important and pressing.
Low-slow small targets are difficult to detect and difficult to defend against. Existing interception approaches fall mainly into soft kill and hard kill. Soft kill weakens the combat capability of the target by jamming its communication link, its navigation and positioning system, or its detection equipment. Hard kill intervenes by dispatching helicopters to strike, using unmanned aerial vehicles to strike, or destroying ground stations. Because unmanned aerial vehicles offer strong battlefield awareness, high flexibility, low cost, and other advantages, striking a low-slow small target with an unmanned aerial vehicle is a highly cost-effective measure.
The precondition for any action against a low-slow small target is identifying and tracking the target in real time; in existing schemes, the target is identified and tracked by a camera carried on a pan-tilt.
For the above reasons, the present inventors have conducted intensive research on existing pan-tilt control methods in the hope of designing a machine-vision-based pan-tilt control method, applied to anti-low-slow small targets, that can solve the above problems.
Disclosure of Invention
In order to overcome the above problems, the inventors have made keen study and designed a pan-tilt control method based on machine vision and applied to anti-low-slow small targets. In this method, through an automated control flow for the unmanned aerial vehicle and the pan-tilt, the UAV carries the pan-tilt to cruise within a preset range so that intruding low-slow small targets are found in time. After a target is found, the pan-tilt rotates according to a specially scheduled angular velocity, which reduces the possibility of the target leaving the field of view, keeps the target as close to the center of the field of view as possible, and enhances the tracking effect, thereby completing the invention.
Specifically, the invention aims to provide a machine-vision-based pan-tilt control method applied to anti-low-slow small targets, which comprises the following steps:
step 1, an unmanned aerial vehicle carrying a pan-tilt hovers upon reaching a preset position and controls the pan-tilt and the camera on it to search for a target, entering a search mode;
step 2, in the search mode, the camera reads the captured images in real time and judges whether an image contains a target; when an image contains a target, it is judged whether to enter a tracking mode;
step 3, after entering the tracking mode, a control command is generated in real time according to the pixel deviation of the target in the image to adjust the rotational angular velocity of the pan-tilt,
wherein in the tracking mode, if the target is lost for a short time, the tracking mode is maintained; if the target is lost for a long time, the tracking mode is terminated, the pan-tilt is controlled to return to its initial angle state, and the unmanned aerial vehicle is controlled to return to the preset position.
In step 2, the tracking mode is entered when the target depth in the image is less than a set value and a plurality of consecutive frames of images all contain the target.
Preferably, the set value is 30 meters, and the plurality of frames is 5 frames or more.
In step 3, after entering the tracking mode, the rotational angular velocity of the pan-tilt is first adjusted with small commands, and after a certain time it is adjusted with larger commands.
In step 3, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is calculated according to the following formula (one), and the pan-tilt is controlled to rotate at the desired rotational angular velocity;

ω* = kp(t)·err + kd(t)·(d err/dt),  with kpmin ≤ kp(t) ≤ kpmax and kdmin ≤ kd(t) ≤ kdmax   (one)

where ω* denotes the desired rotational angular velocity of the pan-tilt; kpmax and kpmin denote the maximum and minimum values of the pixel-deviation term weight kp(t); t denotes time; err denotes the pixel deviation; kdmax and kdmin denote the maximum and minimum values of the pixel-deviation-rate weight kd(t); and d err/dt denotes the rate of change of the pixel deviation. The weights kp(t) and kd(t) grow with t from their minimum toward their maximum values.
Wherein, in step 3, after the target is lost in the tracking mode, the pan-tilt continues to be controlled with the control command corresponding to the frame preceding the loss of the target,
and if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle state.
After the target has been lost for 1 s, the unmanned aerial vehicle is controlled to return to the preset position.
When the unmanned aerial vehicle enters the tracking mode, a state estimate of the target is obtained by real-time calculation, and the unmanned aerial vehicle is controlled to track or pursue the target according to that state estimate.
The invention has the advantages that:
(1) according to the machine-vision-based pan-tilt control method applied to anti-low-slow small targets provided by the invention, the pan-tilt can be smoothly switched from a static state to a fast tracking state without losing the target to motion blur;
(2) according to the pan-tilt control method provided by the invention, the required navigation information can be conveniently obtained while a task is executed;
(3) according to the pan-tilt control method provided by the invention, the highly robust control law keeps the pan-tilt in a controllable state even when unexpected situations occur.
Drawings
FIG. 1 is a logic diagram of the machine-vision-based pan-tilt control method applied to anti-low-slow small targets according to a preferred embodiment of the invention;
FIG. 2 shows the normalized pixel deviations obtained in the example;
FIG. 3 shows a partial enlarged view of FIG. 2;
fig. 4 shows a comparison of an actual trajectory with an observed trajectory.
Detailed Description
The invention is explained in more detail below with reference to the figures and examples. The features and advantages of the present invention will become more apparent from the description.
The word "exemplary" is used exclusively herein to mean "serving as an example, embodiment, or illustration. Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The pan-tilt control method based on machine vision applied to anti-low-slow small targets provided by the invention, as shown in fig. 1, comprises the following steps:
Step 1, an unmanned aerial vehicle carrying a pan-tilt hovers upon reaching a preset position and controls the pan-tilt and the camera on it to search for a target, entering a search mode; the preset position may be a coordinate point uploaded to the unmanned aerial vehicle before take-off, or a coordinate point transmitted to the unmanned aerial vehicle in real time by the ground control station.
Step 2, in the search mode, the camera reads the captured images in real time, judges whether an image contains a target, and, when an image contains a target, judges whether to enter a tracking mode; in the search mode the camera takes pictures in real time at a preset frequency, for example 25 Hz, and searches for targets in each frame of image by means of image recognition, the possible appearances of targets having been taught to the image recognition system in advance by machine learning.
Step 3, generating a control command to adjust the rotation angular velocity of the pan-tilt according to the pixel deviation of the target in the image in real time after entering the tracking mode,
in the tracking mode, if the target is lost in a short time, the tracking mode is continuously kept, and if the target is lost for a long time, the tracking mode is stopped, the cradle head is controlled to recover to an initial angle state, and the unmanned aerial vehicle is controlled to recover to a preset position. When the target is just lost, the target is considered to be possible to be re-identified in a short time, so that for the cloud platform, the control instruction of the target just before the target is lost can be kept to ensure that the movement direction of the cloud platform is consistent with the movement direction of the target to the maximum extent, and conditions are provided for re-identifying the target; when the target is lost for a long time and cannot be identified again, the target is treated as a complete loss in case the unmanned aerial vehicle is in an unknown danger and is ready for re-mission.
In a preferred embodiment, in step 2, the tracking mode is entered when the depth of the target in the image is less than a set value and a plurality of consecutive frames of images all contain the target.
Preferably, the set value is 30 meters, and the plurality of frames is 5 frames or more. By setting these conditions for entering the tracking mode, the target will neither be blurred by excessive distance after tracking begins nor be frequently lost because it strays from the center of the field of view; the entry check is sketched below.
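As an illustration only, the following is a minimal sketch of this entry condition, assuming a hypothetical per-frame detection result that carries a depth estimate; the class and variable names are not from the patent:

    from collections import deque

    DEPTH_LIMIT_M = 30.0   # set value for the target depth (meters)
    REQUIRED_FRAMES = 5    # consecutive frames that must contain the target

    class TrackingModeGate:
        """Decides when to switch from the search mode to the tracking mode."""

        def __init__(self):
            # Sliding window over the most recent per-frame detection results.
            self.recent = deque(maxlen=REQUIRED_FRAMES)

        def update(self, target_found, depth_m):
            """Feed one frame's detection result; returns True when the last
            REQUIRED_FRAMES frames all contain the target and the current
            depth estimate is below DEPTH_LIMIT_M."""
            self.recent.append(target_found)
            window_full = len(self.recent) == REQUIRED_FRAMES
            all_contain = window_full and all(self.recent)
            depth_ok = depth_m is not None and depth_m < DEPTH_LIMIT_M
            return all_contain and depth_ok

The sliding window ensures that a single spurious detection cannot trigger the tracking mode; only a run of consecutive detections at close range does.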
In a preferred embodiment, in step 3, after entering the tracking mode, the rotational angular velocity of the pan-tilt is first adjusted with small commands, and after a certain time with larger commands. When the pan-tilt starts moving, the target is likely to be near the edge of the field of view; issuing a large control command at this moment would make the pan-tilt move too fast, the image would suffer motion blur, and the target could not be identified even though it is within the field of view, so the control commands issued in this first stage should not be too large. Once tracking has continued beyond a certain time, if only small control commands were kept, the pan-tilt would lack the maneuvering capability to follow a target making large movements and would lose it; at this stage the maneuvering capability of the pan-tilt must therefore be increased, and the control commands enlarged.
Preferably, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is calculated by the following formula (one), and the pan-tilt is controlled to rotate at the calculated angular velocity;

ω* = kp(t)·err + kd(t)·(d err/dt),  with kpmin ≤ kp(t) ≤ kpmax and kdmin ≤ kd(t) ≤ kdmax   (one)

where ω* denotes the desired rotational angular velocity of the pan-tilt; kpmax and kpmin denote the maximum and minimum values of the pixel-deviation term weight kp(t); t denotes the time after entering the tracking mode, i.e. timing starts upon entering the tracking mode; err denotes the pixel deviation; kdmax and kdmin denote the maximum and minimum values of the pixel-deviation-rate weight kd(t); and d err/dt denotes the rate of change of the pixel deviation, i.e. the derivative of err. Consistent with the two-stage adjustment above, kp(t) and kd(t) grow with t from their minimum toward their maximum values.
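A minimal sketch of such a time-scheduled PD law on the pixel deviation follows; the linear ramp over RAMP_TIME_S and all numeric gain values are illustrative assumptions, since the patent only requires small gains at first and larger gains after a certain time:

    import time

    def scheduled_gain(t, k_min, k_max, ramp_s):
        """Ramp a gain linearly from k_min at t = 0 to k_max at t = ramp_s
        (the linear schedule is an assumption; the patent only requires
        small gains first and larger gains after a certain time)."""
        frac = min(max(t / ramp_s, 0.0), 1.0)
        return k_min + (k_max - k_min) * frac

    class PanTiltTracker:
        # Illustrative gain bounds and ramp time, not values from the patent.
        K_P_MIN, K_P_MAX = 0.2, 1.5
        K_D_MIN, K_D_MAX = 0.05, 0.4
        RAMP_TIME_S = 2.0

        def __init__(self):
            self.t0 = time.monotonic()   # timing starts on entering tracking mode
            self.prev_err = 0.0
            self.prev_t = self.t0

        def desired_rate(self, err):
            """Formula (one): desired angular rate from the pixel deviation
            err (normalized to [-1, 1]) and its rate of change."""
            now = time.monotonic()
            t = now - self.t0
            dt = max(now - self.prev_t, 1e-3)   # guard against a zero interval
            d_err = (err - self.prev_err) / dt
            self.prev_err, self.prev_t = err, now
            kp = scheduled_gain(t, self.K_P_MIN, self.K_P_MAX, self.RAMP_TIME_S)
            kd = scheduled_gain(t, self.K_D_MIN, self.K_D_MAX, self.RAMP_TIME_S)
            return kp * err + kd * d_err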
In a preferred embodiment, in step 3, in the tracking mode, after the target is lost, the pan-tilt continues to be controlled with the control command corresponding to the frame preceding the loss, so that the target can still be recovered within 200 ms of the loss;
if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle state, the target being regarded as unrecoverable at that point.
Setting this time condition maximizes the patrol efficiency of the unmanned aerial vehicle and its camera and reduces the possibility of a low-slow small target penetrating the patrolled area.
Preferably, after the target has been lost for 1 s, the unmanned aerial vehicle is controlled to return to the preset position, where the next cruise operation can begin. This loss-handling logic is summarized in the sketch below.
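The sketch below uses the 200 ms and 1 s thresholds from the text; the function signature, the per-frame calling convention, and the way the held command is threaded through are assumptions:

    import enum

    class Mode(enum.Enum):
        SEARCH = enum.auto()
        TRACKING = enum.auto()

    HOLD_CMD_MS = 200      # keep the last command for this long after a loss
    RETURN_HOME_MS = 1000  # after this long, send the UAV back to the preset position

    def on_frame(mode, target_visible, ms_since_seen, last_cmd):
        """One tracking-mode decision per frame. Returns (new_mode, command,
        return_home): command is the angular-rate command to apply, or None
        for 'reset the pan-tilt to its initial angle'."""
        if mode is not Mode.TRACKING:
            return mode, None, False
        if target_visible:
            # Normal tracking; last_cmd is the freshly computed command.
            return Mode.TRACKING, last_cmd, False
        if ms_since_seen <= HOLD_CMD_MS:
            # Short loss: hold the command of the frame preceding the loss.
            return Mode.TRACKING, last_cmd, False
        # Loss longer than 200 ms: terminate tracking and reset the pan-tilt;
        # once the loss exceeds 1 s the UAV also returns to the preset position.
        return Mode.SEARCH, None, ms_since_seen >= RETURN_HOME_MS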
Preferably, when the unmanned aerial vehicle enters the tracking mode, a state estimate of the target is obtained by real-time calculation, and the unmanned aerial vehicle is controlled to track or pursue the target according to that estimate. The drone may carry equipment to strike or capture targets, so a target may also disappear from the field of view after being brought down or captured.
In a preferred embodiment, upon entering the tracking mode, the drone tracks or pursues the found low-slow small target with the pan-tilt and camera; the specific operation, such as tracking, striking, or capturing, can be selected in advance by a set instruction. The image recognition system of the drone extracts at least 4 feature points from the target in each frame of image and calculates the state estimate of the target from these feature points, the state estimate comprising the position, attitude, and velocity of the target. A feature point is a distinctive point on the target that is easy to identify; it can be selected according to the type and appearance of the target, for example the four motor positions of a quadrotor drone. The method specifically comprises the following steps:
step A, obtaining a rotation matrix from the pixel coordinates of the target feature points,
step B, obtaining the attitude of the target from the rotation matrix,
step C, obtaining the acceleration of the target from its attitude,
step D, obtaining the actual position and velocity of the target from its acceleration.
Preferably, in the step A, the rotation parameters of the target are obtained by the following formula (two):

R = R′ · rot(Z, α)   (two)

where rot(Z, α) denotes a rotation of the target about the Z axis by the angle α,

rot(Z, α) = [ c  −s  0
              s   c  0
              0   0  1 ],  c = cos α,  s = sin α;

R denotes the rotation matrix, i.e. the 3×3 rotation matrix that converts from the orthogonal coordinate system OaXaYaZa to the camera coordinate system OcXcYcZc, whose 9 parameters are also called the rotation parameters;

R′ denotes an arbitrary rotation matrix whose third column [r7 r8 r9]ᵀ equals the rotation axis Za and which satisfies the orthogonality constraint of a rotation matrix; r1 to r9 denote the elements of the 3×3 matrix R′;

the rotation axis Za = P_i0P_j0 / |P_i0P_j0|, where P_i0P_j0 denotes the vector from point P_i0 to point P_j0 and |P_i0P_j0| denotes the modulus of that vector;

by extracting the pixel coordinates of the 4 feature points of the target in each frame of image, the two points P_i0 and P_j0 can be solved, which determines the rotation axis Za in formula (two), i.e. [r7 r8 r9]ᵀ.
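As an illustration of step A, the following sketch normalizes the vector between the two solved points into the rotation axis Za and assembles R = R′·rot(Z, α). How P_i0 and P_j0 are recovered from the pixel coordinates of the 4 feature points (for example via a perspective-n-point solve) is assumed and left outside the sketch, as is the particular completion of Za to an orthonormal basis, since any valid choice of the first two columns of R′ is absorbed by the angle α:

    import numpy as np

    def rot_z(alpha):
        """rot(Z, alpha): rotation about the Z axis by the angle alpha."""
        c, s = np.cos(alpha), np.sin(alpha)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    def rotation_matrix(p_i0, p_j0, alpha):
        """Formula (two): R = R' . rot(Z, alpha), where the third column of
        R' is the rotation axis Za = vec(P_i0 P_j0) / |P_i0 P_j0|."""
        za = np.asarray(p_j0, float) - np.asarray(p_i0, float)
        za /= np.linalg.norm(za)
        # Complete Za to a right-handed orthonormal basis; the remaining
        # freedom in the first two columns is what rot(Z, alpha) parameterizes.
        helper = np.array([1.0, 0.0, 0.0])
        if abs(helper @ za) > 0.9:          # avoid a near-parallel cross product
            helper = np.array([0.0, 1.0, 0.0])
        xa = np.cross(helper, za)
        xa /= np.linalg.norm(xa)
        ya = np.cross(za, xa)
        r_prime = np.column_stack([xa, ya, za])   # orthogonal, det = +1
        return r_prime @ rot_z(alpha)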
In the step B, the attitude of the target is obtained by the following formula (three):

θ1 = −asin(R31),  θ2 = π − θ1,
ψi = atan2(R21 / cos θi, R11 / cos θi),  i = 1, 2,
φi = atan2(R32 / cos θi, R33 / cos θi),  i = 1, 2   (three)

where θ1 denotes the pitch angle of the target in the range (−90°, 90°); θ2 denotes the pitch angle of the target used when the pitch angle of the drone is greater than 90° or less than −90°; ψ1 and ψ2 denote the yaw angles of the target solved for pitch angles θ1 and θ2 respectively; φ1 and φ2 denote the roll angles of the target solved for pitch angles θ1 and θ2 respectively; R31, R32, and R33 denote the three elements of the third row of the rotation matrix R solved in formula (two); R21 denotes the first element of the second row of R, and R11 the first element of the first row of R; asin denotes the arcsine function, and atan2 the four-quadrant arctangent function.

The target attitude comprises the three angles between the target body coordinate system and the inertial coordinate system, namely the roll angle, the pitch angle, and the yaw angle, all of which are obtained by formula (three).
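A minimal sketch of the attitude extraction of formula (three), assuming the ZYX (yaw-pitch-roll) Euler convention that matches the matrix elements used; gimbal lock (cos θ ≈ 0) is not handled:

    import numpy as np

    def attitude_from_rotation(R):
        """Formula (three): extract (roll, pitch, yaw) from the rotation
        matrix R. Returns both solution branches; the second applies when
        the pitch angle lies outside (-90 deg, 90 deg)."""
        theta1 = -np.arcsin(R[2, 0])        # from R31; pitch in (-pi/2, pi/2)
        theta2 = np.pi - theta1             # alternative pitch branch
        solutions = []
        for theta in (theta1, theta2):
            c = np.cos(theta)
            psi = np.arctan2(R[1, 0] / c, R[0, 0] / c)   # yaw from R21, R11
            phi = np.arctan2(R[2, 1] / c, R[2, 2] / c)   # roll from R32, R33
            solutions.append((phi, theta, psi))
        return solutions   # [(roll1, pitch1, yaw1), (roll2, pitch2, yaw2)]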
In the step C, the acceleration of the target drone is obtained by the following formula (four):

a = [ax, ay, az]ᵀ   (four)

where a denotes the acceleration of the target;

ax denotes the acceleration component along the X axis of the inertial coordinate system,
ax = g·(cos φ·sin θ·cos ψ + sin φ·sin ψ) / (cos φ·cos θ);

ay denotes the acceleration component along the Y axis of the inertial coordinate system,
ay = g·(cos φ·sin θ·sin ψ − sin φ·cos ψ) / (cos φ·cos θ);

az denotes the acceleration component in the vertical direction, az = 0;

g denotes the gravitational acceleration; θ denotes the pitch angle, φ the roll angle, and ψ the yaw angle of the target drone solved in formula (three).
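Under the level-flight assumption of formula (four), the horizontal acceleration follows from the attitude alone. The sketch below uses the standard multirotor relation (thrust along the body z axis, with its vertical component balancing gravity), which is an assumption consistent with, but not quoted from, the patent:

    import numpy as np

    G = 9.81   # gravitational acceleration, m/s^2

    def accel_from_attitude(roll, pitch, yaw):
        """Formula (four): inertial-frame acceleration of a multirotor in
        level flight (az = 0), with thrust along the body z axis and the
        vertical thrust component balancing gravity."""
        cphi, sphi = np.cos(roll), np.sin(roll)
        cth, sth = np.cos(pitch), np.sin(pitch)
        cpsi, spsi = np.cos(yaw), np.sin(yaw)
        denom = cphi * cth                  # vertical fraction of the thrust
        ax = G * (cphi * sth * cpsi + sphi * spsi) / denom
        ay = G * (cphi * sth * spsi - sphi * cpsi) / denom
        return np.array([ax, ay, 0.0])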
Preferably, when tracking the target, the unmanned aerial vehicle controls itself to stay in a horizontal plane parallel to that of the target, and the target is assumed to fly stably in its horizontal plane.
In said step D, the actual position and velocity of the target are obtained by the following formula (five),

X̂(k|k−1) = A·X̂(k−1) + w(k−1)
X̂(k) = X̂(k|k−1) + γk·Kk·(Zk − H·X̂(k|k−1))   (five)

where Kk denotes the Kalman gain; γk denotes a binary random variable used to model intermittent measurements: γk = 1 if the target is detected in the k-th frame image, and γk = 0 if the target drone is not detected in the k-th frame image; wk denotes the process noise corresponding to the k-th frame image, and w(k−1) the process noise corresponding to the (k−1)-th frame image; X̂(k|k−1) denotes the state quantity of the k-th frame image predicted from the (k−1)-th frame image; X̂(k−1) denotes the estimated optimal state quantity X of the (k−1)-th frame image; X̂(k) denotes the estimated optimal state quantity X of the k-th frame image; Zk denotes the measurement quantity Z corresponding to the k-th frame image; A denotes the process matrix and H the observation matrix:

X = [p, v, a]ᵀ,
A = [ I3  h·I3  (h²/2)·I3
      0   I3    h·I3
      0   0     I3 ],
H = [ I3  0  0 ]

where p denotes the position of the target, v the velocity of the target, and a the acceleration of the target; h denotes the sampling period of the images (the preferred sampling frequency is 25 Hz, i.e. h = 0.04 s); and I3 denotes the 3×3 identity matrix.
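A minimal sketch of the constant-acceleration Kalman filter of formula (five) with intermittent measurements follows; the noise covariances Q and R are illustrative assumptions, and the gain Kk is computed from the usual covariance update, which formula (five) presupposes:

    import numpy as np

    h = 0.04                # image sampling period in seconds (25 Hz)
    I3 = np.eye(3)
    Z3 = np.zeros((3, 3))

    # Process matrix A and observation matrix H for the state X = [p, v, a].
    A = np.block([[I3, h * I3, 0.5 * h**2 * I3],
                  [Z3, I3,     h * I3],
                  [Z3, Z3,     I3]])
    H = np.block([[I3, Z3, Z3]])

    Q = np.eye(9) * 1e-3    # process-noise covariance (assumed)
    R = np.eye(3) * 1e-2    # measurement-noise covariance (assumed)

    def kf_step(x, P, z):
        """One step of formula (five). z is the measured target position in
        frame k, or None when the target was not detected (gamma_k = 0)."""
        x_pred = A @ x                       # predict
        P_pred = A @ P @ A.T + Q
        if z is None:                        # gamma_k = 0: no correction
            return x_pred, P_pred
        S = H @ P_pred @ H.T + R             # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain K_k
        x_new = x_pred + K @ (z - H @ x_pred)
        P_new = (np.eye(9) - K @ H) @ P_pred
        return x_new, P_new

When the detector misses the target in a frame, the filter simply propagates the prediction, which is exactly the role of γk = 0 in formula (five).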
By the above method, a target state estimate is obtained for every frame of image, so that the speed of the drone can be controlled accordingly: the distance between the drone and the target is kept within a certain range, for example within 30 meters, with the drone gradually approaching the target or holding the distance constant, which helps the pan-tilt and camera capture the target more clearly.
Examples
A low-slow small target was selected to move in a plane at a speed of 12 m/s, its trajectory shown by the solid line in fig. 4, and a drone carrying a pan-tilt and camera tracked it. After entering the search mode, the drone found the target; from 5 consecutive frames containing the target it judged the target depth to be 23 m and then entered the tracking mode, i.e. the drone entered the tracking mode at time 0. The tracking mode lasted for more than 6 seconds, during which the pan-tilt was controlled to rotate according to formula (one) above, with t measured from entry into the tracking mode. The target was lost for less than 200 ms during this period, and while it was lost the pan-tilt was controlled with the control command corresponding to the frame preceding the loss.
In the tracking mode, the drone controls itself to follow the target and maintain a fixed distance, namely the distance between the drone and the target at the moment of entering the tracking mode, i.e. 23 meters. Specifically, the drone obtains the position and velocity information of the target in real time through the following steps:
step A, extracting 4 feature points from the target in each frame of image, and obtaining a rotation matrix from the pixel coordinates of the feature points,
step B, obtaining the attitude of the target from the rotation matrix,
step C, obtaining the acceleration of the target from its attitude,
step D, obtaining the actual position and velocity of the target from its acceleration,
wherein the rotation matrix, the attitude, the acceleration, and the position and velocity estimates are obtained by formulas (two), (three), (four), and (five) respectively, with the same variable definitions as set out in the detailed description above; when tracking the target, the drone controls itself to stay in a horizontal plane parallel to that of the target, which is assumed to fly stably in its horizontal plane.
The target trajectory obtained by the above method during the first 5 seconds of the tracking mode was compared with the real target trajectory; the resulting trajectory deviations are shown in fig. 2 and fig. 3, and the observed target position trajectory is shown as the dotted line in fig. 4.
As can be seen from fig. 2 and fig. 3, from about 1 s after the pan-tilt and camera enter the tracking mode, the normalized pixel deviation of the target stays essentially within 0.1 (on a full scale of −1 to 1), and the situation in which the target cannot be identified because of excessive motion blur (which would show as a pixel deviation of 0) does not occur. In addition, when the target was lost mid-track, the pan-tilt continued to be controlled with the control command corresponding to the frame preceding the loss, so the target was recaptured within 200 ms and tracking continued.
As can be seen from fig. 4, the observed target trajectory substantially coincides with the real target trajectory; the deviation between the two trajectories is small, and the observed trajectory can be used to characterize the real trajectory.
The present invention has been described above with reference to preferred embodiments, which are merely exemplary and illustrative. On this basis, various substitutions and improvements may be made to the invention, and these all fall within the protection scope of the invention.

Claims (8)

1. A pan-tilt control method based on machine vision and applied to anti-low-slow small targets, characterized by comprising the following steps:
step 1, an unmanned aerial vehicle carrying a pan-tilt hovers upon reaching a preset position and controls the pan-tilt and the camera on it to search for a target, entering a search mode;
step 2, in the search mode, the camera reads the captured images in real time and judges whether an image contains a target; when an image contains a target, it is judged whether to enter a tracking mode;
step 3, after entering the tracking mode, a control command is generated in real time according to the pixel deviation of the target in the image to adjust the rotational angular velocity of the pan-tilt,
wherein in the tracking mode, if the target is lost for a short time, the tracking mode is maintained; if the target is lost for a long time, the tracking mode is terminated, the pan-tilt is controlled to return to its initial angle state, and the unmanned aerial vehicle is controlled to return to the preset position.
2. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 1,
in step 2, the tracking mode is entered when the target depth in the image is less than a set value and a plurality of consecutive frames of images all contain the target.
3. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 2,
the set value is 30 meters, and the plurality of consecutive frames is 5 frames or more.
4. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 1,
in step 3, after entering the tracking mode, the rotational angular velocity of the pan-tilt is first adjusted with small commands, and after a certain time the rotational angular velocity of the pan-tilt is adjusted with larger commands.
5. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 4,
in step 3, after entering the tracking mode, the desired rotational angular velocity of the pan-tilt is obtained according to the following formula (one), and the pan-tilt is controlled to rotate at the desired rotational angular velocity;

ω* = kp(t)·err + kd(t)·(d err/dt),  with kpmin ≤ kp(t) ≤ kpmax and kdmin ≤ kd(t) ≤ kdmax   (one)

wherein ω* denotes the desired rotational angular velocity of the pan-tilt, kpmax denotes the maximum value of the pixel-deviation term weight, kpmin denotes the minimum value of the pixel-deviation term weight, t denotes time, err denotes the pixel deviation, kdmax denotes the maximum value of the pixel-deviation-rate weight, kdmin denotes the minimum value of the pixel-deviation-rate weight, and d err/dt denotes the rate of change of the pixel deviation.
6. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 1,
in step 3, in the tracking mode, after the target is lost, the pan-tilt continues to be controlled with the control command corresponding to the frame preceding the loss of the target,
and if the target has not been recaptured 200 ms after being lost, the tracking mode is terminated and the pan-tilt is controlled to return to its initial angle state.
7. The machine vision-based pan-tilt control method applied to anti-low-slow small targets according to claim 1,
and after the target is lost for 1s, controlling the unmanned aerial vehicle to return to the preset position.
8. The machine vision-based pan-tilt control method applied to the anti-low-slow small target according to claim 1, characterized in that:
when the unmanned aerial vehicle enters a tracking mode, the state estimation of the target is obtained through real-time calculation, and the unmanned aerial vehicle is controlled to track or chase the target according to the state estimation.
CN202011280436.2A 2020-11-16 2020-11-16 Pan-tilt control method based on machine vision and applied to anti-low-slow small target Pending CN113721665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011280436.2A CN113721665A (en) 2020-11-16 2020-11-16 Pan-tilt control method based on machine vision and applied to anti-low-slow small target

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011280436.2A CN113721665A (en) 2020-11-16 2020-11-16 Pan-tilt control method based on machine vision and applied to anti-low-slow small target

Publications (1)

Publication Number Publication Date
CN113721665A true CN113721665A (en) 2021-11-30

Family

ID=78672358

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011280436.2A Pending CN113721665A (en) 2020-11-16 2020-11-16 Pan-tilt control method based on machine vision and applied to anti-low-slow small target

Country Status (1)

Country Link
CN (1) CN113721665A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090154768A1 (en) * 2007-12-18 2009-06-18 Robert Bosch Corporation Method of motion detection and autonomous motion tracking using dynamic sensitivity masks in a pan-tilt camera
US20130070105A1 (en) * 2011-09-15 2013-03-21 Kabushiki Kaisha Toshiba Tracking device, tracking method, and computer program product
CN109753076A (en) * 2017-11-03 2019-05-14 南京奇蛙智能科技有限公司 A kind of unmanned plane vision tracing implementing method
CN111656403A (en) * 2019-06-27 2020-09-11 深圳市大疆创新科技有限公司 Method and device for tracking target and computer storage medium
CN110322474A (en) * 2019-07-11 2019-10-11 史彩成 A kind of image motive target real-time detection method based on unmanned aerial vehicle platform
CN111932588A (en) * 2020-08-07 2020-11-13 浙江大学 Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIN Zhekui et al.: "Adaptive tracking control of the airborne pan-tilt in a small-UAV ground-target tracking system", Control Theory & Applications, vol. 27, no. 8, 15 August 2010 (2010-08-15), pages 1001-1006 *

Similar Documents

Publication Publication Date Title
CN108227751B (en) Landing method and system of unmanned aerial vehicle
US10187580B1 (en) Action camera system for unmanned aerial vehicle
CN111596693B (en) Ground target tracking control method and system for unmanned aerial vehicle based on pan-tilt camera
Barber et al. Vision-based target geo-location using a fixed-wing miniature air vehicle
Wang et al. Quadrotor autonomous approaching and landing on a vessel deck
CN108459618A (en) A kind of flight control system and method that unmanned plane automatically launches mobile platform
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
CN107741229A (en) A kind of carrier landing guidance method of photoelectricity/radar/inertia combination
CN110222581A (en) A kind of quadrotor drone visual target tracking method based on binocular camera
CN111932588A (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
EP3128386B1 (en) Method and device for tracking a moving target from an air vehicle
WO2017040254A1 (en) Mitigation of small unmanned aircraft systems threats
CN107656545A (en) A kind of automatic obstacle avoiding searched and rescued towards unmanned plane field and air navigation aid
US9221557B1 (en) UAV retrieval system and method
CN107463181A (en) A kind of quadrotor self-adoptive trace system based on AprilTag
US20180095469A1 (en) Autonomous system for shooting moving images from a drone, with target tracking and holding of the target shooting angle
CN105717933A (en) Unmanned aerial vehicle and unmanned aerial vehicle anti-collision method
CN112363528B (en) Unmanned aerial vehicle anti-interference cluster formation control method based on airborne vision
CN110132060A (en) A kind of method of the interception unmanned plane of view-based access control model navigation
CN111665870A (en) Trajectory tracking method and unmanned aerial vehicle
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
Lee et al. Autonomous target following with monocular camera on uas using recursive-ransac tracker
Morais et al. Trajectory and Guidance Mode for autonomously landing an UAV on a naval platform using a vision approach
Lin et al. Real-time 6DoF deck pose estimation and target tracking for landing an UAV in a cluttered shipboard environment using on-board vision
CN113721665A (en) Pan-tilt control method based on machine vision and applied to anti-low-slow small target

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination