CN112639652A - Target tracking method and device, movable platform and imaging platform - Google Patents

Target tracking method and device, movable platform and imaging platform

Info

Publication number
CN112639652A
CN112639652A
Authority
CN
China
Prior art keywords
angle
attitude
target
imaging device
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080004853.6A
Other languages
Chinese (zh)
Inventor
陆泽早
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112639652A

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/08 - Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808 - Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 - Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/10 - Simultaneous control of position or course in three dimensions
    • G05D 1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

A target tracking method, a target tracking apparatus, a computer-readable storage medium, a movable platform, and an imaging platform. The target tracking method includes: acquiring the current position and the current size of a target in a frame of an imaging device; acquiring a desired position of the target in the frame, and determining an attitude angle deviation of the imaging device according to the desired position and the current position; determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size; and controlling the imaging device to rotate through the attitude angle deviation at the attitude angular velocity, so that the target moves to the desired position in the frame.

Description

Target tracking method and device, movable platform and imaging platform
Technical Field
The present disclosure relates to the field of target tracking, and in particular, to a target tracking method and apparatus, a movable platform, and an imaging platform.
Background
A movable platform such as a drone may be used to perform tracking of the target. Such a movable platform typically comprises a movable carrier and an imaging device. The imaging device is typically mounted to a movable carrier by a carrier that can rotate the imaging device relative to the movable carrier.
In a general target tracking process, a user may select a target in the frame of an imaging device through a remote controller and designate a desired position of the target in the frame. In response to the user's selection and designation operations, the movable platform controls the movable carrier and the carrier to rotate, adjusting the attitude of the imaging device so that the target moves to the desired position in the frame of the imaging device.
Disclosure of Invention
The embodiment of the disclosure provides a target tracking method, which includes:
acquiring the current position and the current size of a target in a frame of an imaging device;
acquiring a desired position of the target in the frame, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;
determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;
and controlling the imaging device to rotate through the attitude angle deviation at the attitude angular velocity, so that the target moves to the desired position in the frame.
The embodiment of the present disclosure further provides a target tracking apparatus, including:
a memory for storing executable instructions;
a processor to execute the executable instructions stored in the memory to perform the following:
acquiring the current position and the current size of a target in a frame of an imaging device;
acquiring a desired position of the target in the frame, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;
determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;
and controlling the imaging device to rotate through the attitude angle deviation at the attitude angular velocity, so that the target moves to the desired position in the frame.
Embodiments of the present disclosure also provide a computer-readable storage medium storing executable instructions that, when executed by one or more processors, may cause the one or more processors to perform the above-described target tracking method.
The disclosed embodiment also provides a movable platform, including: a movable carrier, an imaging apparatus, and a carrier;
the imaging device is mounted directly to the movable carrier or, alternatively, mounted to the movable carrier via the carrier;
the movable carrier includes: the above-mentioned target tracking means;
the target tracking device may control the vehicle and/or the movable carrier to rotate to adjust a pose angle of the imaging apparatus.
The embodiment of the present disclosure further provides an imaging platform, including: a carrier and an imaging device; the carrier includes: the above-mentioned target tracking means;
the target tracking device may control the vehicle to rotate to adjust a pose angle of the imaging apparatus.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present disclosure; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a target tracking method according to an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of a movable platform according to an embodiment of the disclosure.
Fig. 3 shows a frame of the imaging device of the embodiment of the present disclosure.
Fig. 4 shows another frame of the imaging device of the embodiment of the present disclosure.
Fig. 5 is a flowchart for determining an attitude angular velocity corresponding to the attitude angular deviation.
FIG. 6 is a flow chart of determining at least one control parameter.
Fig. 7 is a flow chart for determining the field angle of an object.
Fig. 8 shows still another frame of the imaging device of the embodiment of the present disclosure.
FIG. 9 is a flow chart for determining an attitude angle of a line connecting a target and an imaging device.
Fig. 10 is an input/output curve of the attitude angle controller.
Fig. 11 is a flowchart for determining an attitude angular velocity corresponding to an attitude angular deviation.
FIG. 12 is a flow chart for determining a desired pose angle of an imaging device.
Fig. 13a, 13b and 13c show three scenarios for calculating the desired attitude angle in an analytic manner, respectively.
Fig. 14 shows a top view of the imaging apparatus.
Fig. 15 is a flowchart of a target tracking method according to another embodiment of the present disclosure.
FIG. 16 is a flow chart for determining a desired pose angle of an imaging device.
Fig. 17a and 17b show two scenarios for calculating the desired attitude angle in an iterative manner, respectively.
Fig. 18 is a flowchart of a target tracking method according to another embodiment of the present disclosure.
Fig. 19 is another flowchart of a target tracking method according to another embodiment of the present disclosure.
Fig. 20 is another input/output curve of the attitude angle controller.
Fig. 21 is a flowchart of a target tracking method according to yet another embodiment of the present disclosure.
Fig. 22 is a schematic diagram of a target tracking device according to an embodiment of the disclosure.
Detailed Description
In the target tracking process, the user may specify a desired position in the frame of the imaging device, and the target in the frame is brought to that desired position by adjusting the attitude of the imaging device. A typical target tracking method uses a controller with fixed parameters to control the attitude angular velocity of the imaging device; this approach often has the following defects:
First, the control of the attitude angle and attitude angular velocity lacks smoothness and stability.
With fixed control parameters, the attitude angular velocity is positively correlated with the focal length of the imaging device: when the focal length is small, the attitude angular velocity is too slow, and when the focal length is large, it is too fast. The attitude angular velocity is likewise positively correlated with the pitch angle of the target relative to the imaging device: when that pitch angle is small, the heading angular velocity of the imaging device is too slow, and when it is large, the heading angular velocity is too fast.
When the target occupies a large area of the frame of the imaging device, the imaging device's recognition of the target is unstable, resulting in large fluctuations in the attitude angular velocity.
When the pitch angle of the target relative to the imaging device is large (e.g., close to 90 degrees), large changes in the heading angle of the imaging device are easily produced, causing the imaging device to oscillate back and forth between different attitudes.
Second, the target is easily lost when the brightness of the environment in which the target is located is low, when the attitude angle becomes singular, when the target is occluded, or when the position of the target relative to the imaging device changes.
When the brightness of the environment in which the target is located is low, the exposure time of the imaging device is long, which easily causes image blur; the attitude angular velocity then becomes too high, and the target may even be lost.
When the pitch angle of the target with respect to the imaging device is large (e.g., close to 90 degrees), the desired attitude angle of the imaging device is prone to singularity, resulting in failure of target tracking.
When a target in the frame of the imaging device is occluded by other objects and then reappears in the frame, the desired attitude angle of the imaging device tends to change greatly, so that the attitude angular velocity is not smooth and the target may even be lost.
When the position of the target relative to the imaging device changes, for example when the target passes directly under or directly over the imaging device, the large change in the heading angle of the imaging device causes a severe lag in attitude angle control, and the target may even be lost.
The present disclosure provides a target tracking method, a target tracking apparatus, a computer-readable storage medium, a movable platform, and an imaging platform. They make the control of the attitude angle and attitude angular velocity smoother and more stable, avoid target tracking failure caused by singular attitude angles, and keep tracking the target when the target is occluded, when the brightness of the environment in which the target is located is low, or when the position of the target relative to the imaging device changes.
The technical solution of the present disclosure will be clearly and completely described below with reference to the embodiments and the drawings in the embodiments. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
An embodiment of the present disclosure provides a target tracking method, as shown in fig. 1, the target tracking method includes:
S101: acquiring the current position and the current size of a target in a frame of an imaging device;
S102: acquiring a desired position of the target in the frame, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;
S103: determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;
S104: controlling the imaging device to rotate through the attitude angle deviation at the attitude angular velocity, so that the target moves to the desired position in the frame.
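The four steps S101-S104 can be sketched, for a single axis, as one iteration of a control loop. This is a minimal illustrative sketch, not the patent's implementation; the pinhole pixel-to-angle mapping and the proportional law with a dead zone are assumptions.

```python
import math

def tracking_step(current_px, desired_px, focal_px, gain=2.0, dead_zone=0.01):
    """One single-axis iteration of S101-S104 (illustrative sketch).

    current_px / desired_px: target position in the frame (S101/S102 inputs)
    focal_px: focal length expressed in pixels
    Returns (attitude angle deviation in rad, commanded angular rate).
    """
    # S102: map the pixel offset in the frame to an attitude angle deviation
    deviation = math.atan((desired_px - current_px) / focal_px)
    # S103: angular velocity from the deviation (proportional, with dead zone)
    rate = 0.0 if abs(deviation) <= dead_zone else gain * deviation
    # S104 would then command the carrier/gimbal to rotate at `rate`
    return deviation, rate
```

Once the commanded rotation has brought the residual deviation inside the dead zone, the commanded rate drops to zero and the target rests at the desired position.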
The target tracking method of this embodiment can be applied to various platforms capable of tracking targets, such as a movable platform, an imaging platform, and the like. Fig. 2 shows a movable platform 100. The movable platform 100 includes: a movable carrier 110, a carrier 140, and an imaging device 130. The imaging device 130 is supported by the movable carrier 110, either directly or via the carrier 140. The imaging device 130 may be used to capture the target 160 within the viewing angle 170 of the imaging device 130. One or more targets 160 may be within the viewing angle 170. In fig. 2, although the movable carrier 110 is depicted as a drone, the present disclosure is not so limited, and any suitable type of movable carrier may be used, such as, but not limited to, an unmanned vehicle, an unmanned boat, a robot, and the like.
The movable carrier 110 may include a fuselage 105, and one or more propulsion units 150. The propulsion unit 150 may be configured such that the movable carrier 110 generates lift. Propulsion unit 150 may include a rotor. The movable carrier 110 is capable of flying in a three-dimensional space and is rotatable along at least one of a pitch axis, a yaw axis, and a roll axis. The body 105 of the movable carrier 110 may include within it: a flight controller, one or more processors, one or more memories, one or more sensors, one or more communication units.
The movable platform 100 may include one or more imaging devices 130. In some examples, the imaging device 130 may be a camera, a video camera. The imaging device 130 may be a visible light imaging device, an infrared imaging device, an ultraviolet imaging device, or a thermal imaging device. The imaging device 130 may achieve zooming by adjusting at least one of an optical zoom level, a digital zoom level, to adjust a size of a target in a frame of the imaging device 130.
Imaging device 130 may be mounted to carrier 140. Carrier 140 may allow imaging device 130 to rotate about at least one of a pitch axis, a yaw axis, and a roll axis. Carrier 140 may include a single-axis, dual-axis, or triple-axis gimbal.
The movable platform 100 may be controlled by a remote control 120. Remote control 120 may be in communication with at least one of movable carrier 110, carrier 140, and imaging device 130. The remote control 120 includes a display, which is used to display the frame of the imaging device 130. The remote control 120 also includes an input device, which may be used to receive input information from a user. The user's input information may include a location of the target 160 in the frame of the imaging device 130, such as a current location or a desired location of the target 160.
The following describes the target tracking method of this embodiment by taking the movable carrier as the unmanned aerial vehicle as an example.
The current position and the current size of the target in the frame of the imaging device are acquired through S101.
During flight, the imaging device images the surroundings of the drone and transmits the images in its frame to the drone's remote controller, and the display of the remote controller shows the images to the user. In one embodiment, when the user wishes to track an object of interest in the frame, the user may select the target through the input device of the remote controller, for example by clicking on or box-selecting the target in the display. In response to the user's target selection operation, the remote controller sends a corresponding instruction to the drone. After the drone receives the instruction, the imaging device identifies the target selected by the user. As shown in fig. 3, when the target is successfully identified, an identification frame outlining the contour of the target is displayed in the image of the imaging device; the identification frame indicates the current position and the current size of the target.
In another embodiment, the imaging device may also automatically select and identify the target to be tracked. For example, the user may set a set of target selection rules on the drone through the remote controller, and during flight the imaging device automatically selects the tracked target according to these rules. In one example, the target selection rules may be set according to parameters such as object class, object size, object color, and object distance. For example, object classes may include people, vehicles, animals, and the like; when a person, vehicle, or animal appears in the frame of the imaging device, it can be automatically selected as the tracked target and recognized. When target recognition succeeds, the target is marked with the identification frame.
The size of the target in the frame of the imaging device is determined mainly by two factors: the focal length of the imaging device and the distance to the target. As those skilled in the art will appreciate, when the focal length of the imaging device is short or the target is far from the imaging device, the target appears small in the frame; conversely, it appears large.
The identification frame may have any shape, for example a rectangular frame, a square frame, or a circular frame; this embodiment does not limit its specific shape.
The desired position of the target in the frame is acquired through S102, and the attitude angle deviation of the imaging device is determined according to the desired position and the current position.
In some cases, the target is not at the position in the frame that the user desires. The user can then adjust the position of the target in the frame by specifying, through the remote controller, a desired position to which the target should move. As shown in fig. 4, the user may click on the display of the remote controller, and the clicked location becomes the desired position of the target. In response to the user's click on the desired position, the remote controller sends a corresponding instruction to the drone. After receiving the instruction, the drone determines the attitude angle deviation of the imaging device according to the desired position and the current position of the target. The attitude angle deviation of the imaging device refers to its attitude angle deviation in a world coordinate system, which may be one or more of an inertial coordinate system, a terrestrial coordinate system, and a geographic coordinate system.
The attitude angle deviation of the imaging device may be determined in a variety of ways depending on the desired position and the current position, the specific way of determining the attitude angle deviation of the imaging device being described in detail later.
After the current position and the current size of the target are obtained, the attitude angular velocity corresponding to the attitude angular deviation is determined from the current position and the current size through S103.
In the present embodiment, the attitude angular velocity is related to the current position and the current size of the target. Different target current positions or current sizes correspond to different attitude angular velocities. During the target tracking process, the target may move momentarily or intermittently, which causes the current position or the current size of the target in the frame of the imaging device to change, and at this time, the attitude angular velocity corresponding to the attitude angular deviation also changes.
In this embodiment, determining the attitude angular velocity corresponding to the attitude angular deviation according to the current position and the current size, as shown in fig. 5, includes:
s501: determining at least one control parameter according to the current position and the current size;
s502: and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
The present embodiment determines the control parameter based on the current position and the current size. In some examples, the control parameter may be a control parameter of a controller for controlling the imaging device, such as an attitude angle controller. In some examples, the attitude angles include one or more of a heading angle, a pitch angle, and a roll angle; in some examples, they include the heading angle and the pitch angle. The attitude angle controller of the imaging device may include various types of heading angle and pitch angle controllers, such as a feedback controller, a predictive controller, and the like. In one example, the attitude angle controller of the imaging device may include a proportional (P), proportional-integral (PI), or proportional-integral-derivative (PID) controller for the heading angle, and a proportional (P), proportional-integral (PI), or proportional-integral-derivative (PID) controller for the pitch angle. In the description of this embodiment, unless otherwise specified, "attitude angle" refers to both the heading angle and the pitch angle.
In this embodiment, determining at least one control parameter according to the current position and the current size, as shown in fig. 6, includes:
s601: determining the field angle of the target according to the current position and the current size;
s602: determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
s603: and obtaining at least one control parameter according to at least one of the field angle and the attitude angle.
In the present embodiment, the field angle of the object refers to the range of the attitude angle of the object with respect to the imaging device. Determining the field angle of the target according to the current position and the current size, as shown in fig. 7, includes:
s701: acquiring a posture angle of an identification frame relative to the imaging equipment;
s702: and obtaining the angle of field according to the attitude angle of the identification frame relative to the imaging equipment.
The current field angle of the frame of the imaging device is acquired first. The current field angle may be determined from the focal length of the imaging device, as shown in the following formulas:
fov_zx = 2 · arctan(W / (2 · focal_length))

fov_zy = 2 · arctan(H / (2 · focal_length))

where fov_zx denotes the field angle in the row direction of the frame, fov_zy denotes the field angle in the column direction of the frame, focal_length denotes the current focal length of the imaging device, and W and H denote the width and height of the image sensor of the imaging device, respectively.
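As a numeric illustration of these field-angle formulas, the standard pinhole relation can be evaluated directly (a sketch; the function and parameter names are illustrative, not from the patent):

```python
import math

def field_angles(focal_length, sensor_w, sensor_h):
    """Field angles of the frame in the row (fov_zx) and column (fov_zy)
    directions, fov = 2 * atan(size / (2 * focal_length)).
    All three arguments must use the same unit (e.g., millimetres)."""
    fov_zx = 2 * math.atan(sensor_w / (2 * focal_length))
    fov_zy = 2 * math.atan(sensor_h / (2 * focal_length))
    return fov_zx, fov_zy
```

For example, a 36 mm x 24 mm sensor at an 18 mm focal length yields a row-direction field angle of exactly 2·arctan(1) = π/2, i.e., 90 degrees.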
The attitude angles of the edges of the identification frame with respect to the imaging device are then determined. In some examples, the identification frame may be a rectangular frame. As shown in fig. 8, the attitude angles of the rectangular frame with respect to the imaging device include the attitude angles, relative to the imaging device, of its edges in the row direction of the frame and of its edges in the column direction of the frame. The former include the heading angles of the left and right edges relative to the imaging device; the latter include the pitch angles of the upper and lower edges relative to the imaging device.
The heading angles of the left and right edges relative to the imaging device are then subtracted from one another to obtain the heading angle range of the target relative to the imaging device, and the pitch angles of the upper and lower edges are subtracted to obtain the pitch angle range of the target relative to the imaging device. The heading angle range and the pitch angle range together constitute the field angle of the target.
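The subtraction procedure above can be sketched as follows, assuming the identification frame is given by its edge coordinates in pixels (all names and the pinhole edge-angle mapping are illustrative assumptions):

```python
import math

def edge_angle(edge_px, center_px, focal_px):
    """Attitude angle of one identification-frame edge relative to the
    optical axis, via the pinhole model (focal length in pixels)."""
    return math.atan((edge_px - center_px) / focal_px)

def target_field_angle(left, top, right, bottom, frame_w, frame_h, focal_px):
    """Heading-angle and pitch-angle ranges of the target relative to the
    imaging device, obtained by subtracting the edge angles of the box."""
    cx, cy = frame_w / 2, frame_h / 2
    heading_range = edge_angle(right, cx, focal_px) - edge_angle(left, cx, focal_px)
    pitch_range = edge_angle(bottom, cy, focal_px) - edge_angle(top, cy, focal_px)
    return heading_range, pitch_range
```

The two returned ranges correspond to the heading angle range and the pitch angle range that together constitute the field angle of the target.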
In this embodiment, determining an attitude angle of a connection line between the target and the imaging device according to the current position, as shown in fig. 9, includes:
s901: and acquiring the pitch angle of the identification frame relative to the imaging equipment.
S902: determining the pitch angle of the target relative to the imaging equipment according to the pitch angle of the identification frame relative to the imaging equipment;
s903: acquiring a current pitch angle of the imaging equipment;
s904: and obtaining the pitch angle of a connecting line between the target and the imaging equipment according to the pitch angle of the target relative to the imaging equipment and the current pitch angle.
The process of acquiring the pitch angle of the identification frame with respect to the imaging device is the same as the process of acquiring the attitude angle of the identification frame with respect to the imaging device in S701. After the pitch angles of the upper and lower edges of the identification frame relative to the imaging device are obtained, the median of these two pitch angles is taken as the pitch angle of the target relative to the imaging device.
The current pitch angle of the imaging device refers to the pitch angle of the imaging device in the world coordinate system. Drones are generally provided with attitude sensors, such as inertial sensors; the vehicle is usually provided with an angle sensor. And the pitch angle of the imaging equipment in a world coordinate system can be obtained through the attitude sensor and the angle sensor.
For example, the line between the target and the imaging device may be the line connecting the optical center of the imaging device and the center of the target, that is, the center of the identification frame. The pitch angle of this line is expressed in the world coordinate system; the pitch angle of the target relative to the imaging device is expressed in the coordinate system of the imaging device, while the current pitch angle of the imaging device is expressed in the world coordinate system. Therefore, superposing the pitch angle of the target relative to the imaging device onto the current pitch angle of the imaging device yields the pitch angle of the line between the target and the imaging device.
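Steps S901-S904 can be sketched as follows (an illustrative sketch; the helper names and the pinhole edge-angle mapping are assumptions):

```python
import math

def target_pitch_relative(top_px, bottom_px, cy, focal_px):
    """S901-S902: the median of the pitch angles of the upper and lower edges
    of the identification frame gives the pitch of the target relative to
    the imaging device (cy = vertical frame center, focal length in pixels)."""
    top = math.atan((top_px - cy) / focal_px)
    bottom = math.atan((bottom_px - cy) / focal_px)
    return (top + bottom) / 2

def line_pitch(target_pitch_rel, device_pitch_world):
    """S903-S904: superpose the relative pitch onto the device's current
    world-frame pitch to obtain the pitch of the camera-to-target line."""
    return target_pitch_rel + device_pitch_world
```

A target centered vertically in the frame has zero relative pitch, so the line pitch equals the imaging device's own pitch.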
As described previously, the attitude angle controller of the imaging device may include various types of controllers. When a proportional controller is employed, its input/output curve is as shown in fig. 10, in which the horizontal axis represents the attitude angle deviation and the vertical axis represents the attitude angular velocity.
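A proportional controller with the qualitative shape of the Fig. 10 curve might look like the following sketch: zero output inside the dead zone, linear gain up to the gain-attenuation threshold, and reduced gain beyond it. The attenuation factor and the exact curve are assumptions, since the patent shows the curve only qualitatively.

```python
def attitude_rate(deviation, gain, dead_zone, atten_threshold, atten_gain=0.2):
    """Proportional attitude-angle controller with a dead zone and gain
    attenuation (illustrative; constants are assumptions)."""
    sign = 1.0 if deviation >= 0 else -1.0
    mag = abs(deviation)
    if mag <= dead_zone:
        # inside the dead zone: no rotation commanded
        return 0.0
    if mag <= atten_threshold:
        # linear region of the curve
        return sign * gain * (mag - dead_zone)
    # beyond the gain-attenuation threshold, the effective gain is reduced
    linear_part = gain * (atten_threshold - dead_zone)
    return sign * (linear_part + atten_gain * gain * (mag - atten_threshold))
```

Raising the gain-attenuation threshold widens the linear region, so larger deviations are still answered with a proportional (unattenuated) rate.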
In some examples, the control parameters in S502 include: the gain attenuation threshold in fig. 10, and the gain attenuation threshold is positively correlated with the field angle and/or the attitude angle.
When the gain attenuation threshold is a gain attenuation threshold for course angle control and is positively correlated with the field angle and the attitude angle, the field angle includes: a range of heading angles of the target relative to the imaging device, the attitude angles including: and (6) a pitch angle. That is, the gain-attenuation threshold is a gain-attenuation threshold for the course angle control, which is determined according to the field angle and the attitude angle.
When the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with a field angle, the field angle includes: a range of pitch angles of the target relative to the imaging device. That is, the gain attenuation threshold value is a gain attenuation threshold value with respect to the pitch angle control, which is determined in accordance with the angle of view.
For the gain attenuation threshold of the heading angle controller, the heading angle range of the target relative to the imaging device is first multiplied by a first scale factor. The product is then adjusted according to the pitch angle of the line between the target and the imaging device to obtain the gain attenuation threshold of the heading angle controller. The first scale factor can be determined according to how reliably the target is identified: when the identification frame in the frame is stable, the value of the first scale factor can be appropriately increased; when the identification frame is unstable, it can be appropriately reduced. In some examples, the first scale factor may be 1 or greater than 1. In some examples, adjusting the product according to the pitch angle of the line between the target and the imaging device may include dividing the product by a trigonometric function value of that pitch angle.
And multiplying the pitch angle range of the target relative to the imaging device by a second proportionality coefficient to obtain the gain attenuation threshold of the pitch angle controller. Similarly to the first scaling factor, the second scaling factor can also be determined based on the identification of the target. In some examples, the second scaling factor may be 1 or greater than 1.
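The two threshold computations above can be sketched as follows. This is a hedged illustration, not the patent's implementation: the function name, the choice of cosine as the "trigonometric function value", and the default scale factors are assumptions.

```python
import math

def gain_attenuation_thresholds(yaw_range_deg, pitch_range_deg,
                                line_pitch_deg, k1=1.0, k2=1.0):
    """Sketch of the gain attenuation thresholds described above.

    yaw_range_deg   -- course angle range of the target relative to
                       the imaging device
    pitch_range_deg -- pitch angle range of the target relative to
                       the imaging device
    line_pitch_deg  -- pitch angle of the line between the target
                       and the imaging device
    k1, k2          -- first and second scale factors (1 or greater,
                       per the text)

    Dividing by the cosine of the line pitch angle is an assumption;
    the text only names "a trigonometric function value of the pitch
    angle".
    """
    yaw_threshold = k1 * yaw_range_deg / math.cos(math.radians(line_pitch_deg))
    pitch_threshold = k2 * pitch_range_deg
    return yaw_threshold, pitch_threshold
```

With a cosine adjustment, a target seen at a 60-degree pitch doubles the course-angle attenuation threshold relative to a level view, which matches the intent that course control near a steep view should attenuate later.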
In some examples, the control parameters in S502 include the dead zone threshold in fig. 10. Determining at least one control parameter according to at least one of the field angle and the attitude angle includes: determining the dead zone threshold according to the attitude angle of the line between the target and the imaging device.
In some examples, the attitude angle of the line between the target and the imaging device refers to a pitch angle of the line between the target and the imaging device. For the dead zone threshold of the course angle controller, when the absolute value of the pitch angle is greater than the threshold, controlling the dead zone threshold to be positively correlated with the pitch angle; and when the absolute value of the pitch angle is smaller than or equal to the threshold value, setting the dead zone threshold value as a preset value.
In some examples, controlling the dead band threshold to positively correlate with pitch angle includes:
and subtracting the threshold from the absolute value of the pitch angle of the connecting line between the target and the imaging equipment, multiplying the difference value between the absolute value of the pitch angle and the threshold by a third proportionality coefficient, and adjusting the product result of the difference value and the third proportionality coefficient according to the pitch angle of the connecting line between the target and the imaging equipment to obtain the dead zone threshold of the course angle controller.
The threshold and the third scale factor can be determined according to actual requirements. For example, the threshold may be 70 to 85 degrees, e.g., 80 degrees. In some examples, the third scale factor may be 0.1 to 0.9, for example 0.6. In some examples, adjusting the product of the difference and the third scale factor may include dividing the product by a trigonometric function value of the pitch angle of the line between the target and the imaging device.
When the absolute value of the pitch angle of the line between the target and the imaging device is less than or equal to the threshold, the dead zone threshold of the course angle controller is set directly to a first preset value. The first preset value can also be determined according to actual requirements. In some examples, the first preset value may be 0.
For the dead zone threshold of the pitch angle controller, the dead zone threshold can be directly set to the second preset value. The second preset value can also be determined according to actual requirements, and the second preset value can be the same as or different from the first preset value. In some examples, the second preset value may include 0.
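The dead zone rules above (piecewise for the course angle controller, a preset for the pitch angle controller) can be sketched as follows; the example values (80 degrees, 0.6, presets of 0) come from the text, while the cosine adjustment and function shape are assumptions.

```python
import math

def dead_zone_thresholds(line_pitch_deg, angle_threshold_deg=80.0,
                         k3=0.6, first_preset=0.0, second_preset=0.0):
    """Sketch of the dead zone thresholds described above.

    line_pitch_deg -- pitch angle of the line between the target and
                      the imaging device
    The defaults follow the example values in the text (threshold of
    80 degrees, third scale factor 0.6, preset values of 0).  Division
    by the cosine of the pitch angle is an assumption; the text only
    names "a trigonometric function value".
    """
    abs_pitch = abs(line_pitch_deg)
    if abs_pitch > angle_threshold_deg:
        # positively correlated with the pitch angle beyond the threshold
        yaw_dead_zone = (k3 * (abs_pitch - angle_threshold_deg)
                         / math.cos(math.radians(line_pitch_deg)))
    else:
        yaw_dead_zone = first_preset
    # the pitch angle controller uses a preset dead zone directly
    pitch_dead_zone = second_preset
    return yaw_dead_zone, pitch_dead_zone
```

Near a 90-degree pitch the cosine shrinks and the course dead zone grows, which matches the motivation: the course angle is poorly defined when the target is almost directly below the device.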
After the at least one control parameter is determined through the above process, S502 obtains the attitude angular velocity according to the attitude angle deviation and the at least one control parameter. Specifically, the attitude angle deviation may be input to an attitude angle controller configured with the at least one control parameter, and the controller outputs the attitude angular velocity corresponding to the deviation. Take the proportional controller shown in fig. 10 as an example. After the gain attenuation thresholds and dead zone thresholds of the course angle controller and the pitch angle controller are determined: when the attitude angle deviation includes a course angle deviation, the course angle deviation can be input to the course angle controller, which has its gain attenuation threshold and dead zone threshold, to obtain the course angular velocity; when the attitude angle deviation includes a pitch angle deviation, the pitch angle deviation can be input to the pitch angle controller, which has its gain attenuation threshold and dead zone threshold, to obtain the pitch angular velocity; when the attitude angle deviation includes both, the course angle deviation and the pitch angle deviation can be input to the course angle controller and the pitch angle controller respectively to obtain the course angular velocity and the pitch angular velocity.
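A minimal sketch of an attitude angle controller combining a dead zone with gain attenuation, in the spirit of fig. 10. The exact curve of fig. 10 is not reproduced here, so the piecewise-linear shape and the attenuated slope beyond the threshold are assumptions.

```python
def attitude_rate(deviation, kp, dead_zone, attenuation_threshold,
                  attenuation_gain=0.2):
    """Hedged sketch of a proportional attitude angle controller with a
    dead zone and a gain attenuation threshold (cf. fig. 10).

    Assumed shape: zero output inside the dead zone, slope kp up to the
    gain attenuation threshold, then a reduced slope
    (attenuation_gain * kp) beyond it.
    """
    sign = 1.0 if deviation >= 0 else -1.0
    mag = abs(deviation)
    if mag <= dead_zone:
        return 0.0                      # inside the dead zone: no motion
    effective = mag - dead_zone
    linear_span = max(attenuation_threshold - dead_zone, 0.0)
    if effective <= linear_span:
        return sign * kp * effective    # proportional region
    # beyond the gain attenuation threshold the slope is attenuated
    return sign * (kp * linear_span
                   + attenuation_gain * kp * (effective - linear_span))
```

Because the thresholds are recomputed each frame from the field angle and attitude angle, the same deviation produces a gentler command when the target is small, far away, or nearly overhead.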
After the attitude angular velocity is obtained, the imaging device is controlled through S104 to rotate through the attitude angle deviation at the attitude angular velocity, so that the target reaches the desired position in the picture and the desired composition is achieved.
When the imaging device is mounted on the unmanned aerial vehicle through a carrier, the carrier may be controlled to rotate, the unmanned aerial vehicle may be controlled to rotate, or both may be controlled to rotate, so that the imaging device rotates through the course angle deviation at the course angular velocity and through the pitch angle deviation at the pitch angular velocity.
When the imaging device is mounted directly on the unmanned aerial vehicle, the unmanned aerial vehicle may be controlled to rotate so that the imaging device rotates through the course angle deviation at the course angular velocity and through the pitch angle deviation at the pitch angular velocity.
Therefore, the target tracking method of this embodiment can dynamically adjust the control parameters, including the gain attenuation threshold and the dead zone threshold, according to the current position and the current size of the target, and can adapt to targets at different distances, of different sizes and in different directions. Compared with a general target tracking method in which the control parameters are fixed, the control of the attitude angle and the attitude angular velocity is smoother and more stable, overcoming the lack of smoothness and stability in the prior art caused by the focal length of the imaging device, a large pitch angle of the target relative to the imaging device, and changes in target size.
The following describes a process of determining an attitude angular velocity corresponding to the attitude angular deviation based on the current position and the current size of the target.
As shown in fig. 11, determining the attitude angular velocity corresponding to the attitude angular deviation includes:
s1101: determining a desired pose angle of the imaging device from the desired position and the current position;
s1102: acquiring a current attitude angle of the imaging device;
s1103: determining the attitude angle deviation according to the desired attitude angle and the current attitude angle.
The current attitude angle of the imaging apparatus can be obtained with reference to the manner of S903. The current attitude angle includes a current heading angle and/or a current pitch angle, which respectively refer to a heading angle and a pitch angle of the imaging device in the world coordinate system. The attitude angle of the imaging equipment in the world coordinate system can be obtained through the attitude sensor and the angle sensor of the unmanned aerial vehicle.
After the expected attitude angle of the imaging device and the current attitude angle of the imaging device are obtained, the expected attitude angle and the current attitude angle are subtracted to obtain the attitude angle deviation.
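The subtraction in S1103 can be written as below; wrapping the result into (-180, 180] degrees is an added assumption (the text only says the angles are subtracted), so that the course angle controller always turns the short way around.

```python
def attitude_deviation(desired_deg, current_deg):
    """Subtract the current attitude angle from the desired attitude
    angle (S1103), in degrees.

    The wrap into (-180, 180] is an assumption not stated in the text;
    without it, a desired course of 10 degrees and a current course of
    350 degrees would command a 340-degree turn instead of 20 degrees.
    """
    dev = (desired_deg - current_deg) % 360.0
    if dev > 180.0:
        dev -= 360.0
    return dev
```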
The process of determining the desired pose angle of the imaging device is described below.
When the target is located in the frame of the imaging device and the imaging device is out of the target, determining a desired pose angle of the imaging device, as shown in fig. 12, includes:
s1201: determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
s1202: determining a pose angular deviation of the target relative to the imaging device from the desired position;
s1203: obtaining the desired attitude angle according to the attitude angle and the attitude angle deviation.
The attitude angle of the line between the target and the imaging device may include: a course angle and/or a pitch angle of a connecting line between the target and the imaging device; the desired attitude angles may include: a desired heading angle and/or a desired pitch angle; the attitude angle deviation may include: heading angle deviation and pitch angle deviation.
The course angle and the pitch angle of the line between the target and the imaging device may be determined with reference to S602 described above. When the target is located in the screen of the imaging apparatus and the imaging apparatus is out of the target, the desired position in S1202 refers to a desired position designated by the user through the remote controller. The desired heading angle and the desired pitch angle may be calculated analytically or iteratively.
Illustratively, the analytic approach calculates the desired attitude angle by solving an equation. When the pitch angle of the target relative to the imaging device is large, e.g. close to 90 degrees, the solution of the equation may be singular. The singular cases include: no valid solution, multiple valid solutions, a unique valid solution, infinitely many solutions, etc. This embodiment determines the desired attitude angle by analyzing these singular cases.
In some examples, as shown in fig. 13a, the pitch angle of the target relative to the imaging device is close to 90 degrees, a dashed arrow indicates the pitch angle of the line between the target and the imaging device, and the desired position specified by the user places the target in the lower part of the picture in the column direction. Calculating the desired attitude angle deviation analytically in this singular case yields two solutions: a first pitch angle deviation of the target relative to the imaging device and a second pitch angle deviation of the target relative to the imaging device, each indicated by a solid arrow between the dashed arrows in the figure. The course angle of the imaging device corresponding to the first pitch angle deviation differs from the desired course angle by 180 degrees, while the course angle corresponding to the second pitch angle deviation equals the desired course angle. The first pitch angle deviation is therefore an invalid solution, the second pitch angle deviation is the unique valid solution, and the second pitch angle deviation is taken as the pitch angle deviation obtained in S1202.
Similarly, as shown in fig. 13b, the pitch angle of the target relative to the imaging device is close to 90 degrees, a dashed arrow indicates the pitch angle of the line between the target and the imaging device, and the desired position specified by the user places the target in the upper part of the picture in the column direction. Calculating the desired attitude angle deviation analytically in this singular case again yields two solutions: a first pitch angle deviation and a second pitch angle deviation of the target relative to the imaging device, each indicated by a solid arrow between the dashed arrows in the figure. The course angle of the imaging device corresponding to the first pitch angle deviation differs from the desired course angle by 180 degrees, and the roll angle of the imaging device corresponding to the second pitch angle deviation is rotated by 180 degrees, so both solutions are invalid. In this case, a valid solution can be re-determined by gradually reducing the pitch angle deviation; this third pitch angle deviation of the target relative to the imaging device, indicated by a solid arrow in the figure, is taken as the pitch angle deviation obtained in S1202.
As shown in fig. 13c, the pitch angle of the target relative to the imaging device is close to 90 degrees and the desired position specified by the user places the target in the left part of the picture in the row direction. The analytic calculation of the desired attitude angle yields a first course angle deviation between the dashed box and the center of the picture. However, since the sum of the first course angle deviation and the pitch angle of the target relative to the imaging device is greater than 90 degrees, the first course angle deviation is an invalid solution. In this case, a valid solution, a second course angle deviation between the dashed box and the center of the picture, can be re-determined by gradually reducing the course angle deviation, and the second course angle deviation is taken as the course angle deviation obtained in S1202.
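The "gradually reduce the deviation until it becomes valid" fallback used in the cases of figs. 13b and 13c can be sketched generically. The shrink factor, the iteration cap, and the validity predicate below are all illustrative assumptions; the text specifies only that the deviation is reduced gradually until a valid solution is found.

```python
def reduce_until_valid(deviation_deg, is_valid, shrink=0.9, max_iter=100):
    """Sketch of the iterative fallback used when the analytic solution
    is singular: the attitude angle deviation is gradually reduced
    until a validity check passes.

    `is_valid` stands in for the figure-specific checks (e.g. the sum
    of the course angle deviation and the pitch angle must not exceed
    90 degrees); `shrink` and `max_iter` are illustrative parameters
    not given in the text.
    """
    dev = deviation_deg
    for _ in range(max_iter):
        if is_valid(dev):
            return dev
        dev *= shrink
    return 0.0  # fall back to no deviation if nothing valid is found
```

For the fig. 13c case, a caller might use `is_valid = lambda d: abs(d) + line_pitch_deg <= 90.0`, shrinking a 20-degree request until the constraint holds.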
Therefore, through this analysis of singular cases, the target tracking method of this embodiment can obtain a correct attitude angle for the imaging device regardless of the direction in which the target lies, placing the target at, or as close as possible to, the desired position specified by the user. This holds even when the pitch angle of the target relative to the imaging device is large, avoiding target tracking failures caused by singular cases of the desired attitude angle of the imaging device.
In prior-art target tracking methods, the difference between the current position and the desired position of the target is generally converted directly into the attitude angle deviation of the imaging device. The deviation obtained in this way can differ substantially from the actual attitude angle deviation. Consider the top-view picture of the imaging device shown in fig. 14, where the identification frame marks the current position of the target and the center of the picture is the desired position. In this case the course angle deviation obtained by the prior art is smaller than the actual course angle deviation, and the pitch angle deviation is larger than the actual pitch angle deviation, so the final position of the target inevitably deviates from the desired position. This embodiment instead determines the desired attitude angle of the imaging device from the desired position and the current position of the target, and then determines the attitude angle deviation from the desired attitude angle and the current attitude angle of the imaging device. For the top-view picture of fig. 14, the course angle deviation and pitch angle deviation obtained by this embodiment equal the actual deviations, with no error. Compared with the prior art, the obtained attitude angle deviation is more accurate, the target can be moved to the desired position more precisely, and the tracking precision and composition effect are improved.
The above description of the target tracking method in this embodiment is provided with reference to the proportional controller in fig. 10, and when the controller in this embodiment adopts other controllers, such as a proportional-integral controller, a proportional-integral-derivative controller, and the like, the above technical effects can be achieved as well.
It should be noted that all step numbers (e.g., S101, S102, S103, S104, etc.) in this embodiment are only for convenience of description and merely refer to particular steps; they do not limit the order of the steps. In fact, the steps S101 through S1203 described in this embodiment may be performed in any feasible order. For example, two steps with no dependency on each other may be executed sequentially in either order, or in parallel.
Another embodiment of the present disclosure provides a target tracking method. For brevity, the same or similar features of this embodiment as those of the previous embodiment are not repeated, and only the differences from the previous embodiment will be described below.
As shown in fig. 15, the target tracking method of this embodiment further includes:
s1501: determining another attitude angle deviation when the target in the picture of the imaging device moves outside the picture of the imaging device;
s1502: determining another attitude angular velocity corresponding to the another attitude angular deviation according to the current position and the current size;
s1503: controlling the imaging device to rotate the other attitude angular deviation at the other attitude angular velocity to cause the target to reappear in a picture of the imaging device.
When the target is tracked by the imaging device, the target may be occluded by other objects. For example, when the target is a moving vehicle on a road, the vehicle in the picture may be blocked by objects such as signboards and street lamps on both sides of the road. Since such a target is moving relative to the imaging device, the target may already have left the picture by the time the occluding object moves out of the way. The target tracking method of this embodiment can retrieve a target located outside the picture, so that the target reappears in the picture and tracking continues.
Similar to the previous embodiment, the present embodiment may determine a desired attitude angle of the imaging device and a current attitude angle of the imaging device, and then determine another attitude angle deviation according to the desired attitude angle and the current attitude angle.
Determining the desired pose angle of the imaging device, as shown in fig. 16, includes:
s1601: acquiring the position and the movement speed of the target relative to the imaging device at the last moment in the picture and the duration of the target outside the picture;
s1602: determining a predicted position of the target based on the position, the speed of movement, and the duration;
s1603: obtaining the desired attitude angle according to the predicted position. The predicted position includes: the position of the target under uniform motion and/or decelerating motion.
The position and the movement speed of the target relative to the imaging device at the last moment in the picture can be acquired in various ways. As previously described, the drone may include one or more sensors. In some examples, the sensors may include inertial sensors, satellite positioning modules. The position and the speed of the unmanned aerial vehicle in a world coordinate system, namely the position and the speed of the imaging device, can be obtained by the unmanned aerial vehicle through the inertial sensor and/or the satellite positioning module. The unmanned aerial vehicle can also obtain the position and the speed of the target in the world coordinate system in various ways. And obtaining the position and the movement speed of the target relative to the imaging device according to the position and the speed of the imaging device and the target in the world coordinate system.
The duration of the target outside the picture can then be determined from the current time and the last time, and the predicted position of the target can be determined from the position, the movement speed and the duration. The predicted position may include a first predicted position and a second predicted position. The first predicted position is the position the target would reach moving at a constant speed equal to the movement speed; the second predicted position is the position the target would reach decelerating from the movement speed as its initial speed.
The desired attitude angle may be determined analytically or iteratively. Illustratively, in the iterative manner, assume the imaging device is at a first desired attitude angle at which the first predicted position coincides with the desired position in the picture.
It is then determined whether the second predicted position is in the picture of the imaging device. As shown in fig. 17a, when the second predicted position is located in the picture, the first desired attitude angle is taken as the desired attitude angle. When the second predicted position is outside the picture, as shown by the solid line box in fig. 17b, the imaging device is controlled to change the first desired attitude angle so that the picture moves in the direction opposite to the target's movement, until the second predicted position enters the picture, as shown by the dashed box in fig. 17b; the first desired attitude angle at that moment is then taken as the desired attitude angle.
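The prediction step S1602 can be sketched as below, assuming uniform deceleration for the second predicted position; the deceleration value and the vector representation are assumptions not specified in the text.

```python
import math

def predicted_positions(last_pos, velocity, duration, decel=1.0):
    """Sketch of S1602: predict where the target may be after
    `duration` seconds outside the picture.

    last_pos, velocity -- the target's position (m) and velocity (m/s)
                          relative to the imaging device at the last
                          moment it was seen, as coordinate lists
    decel              -- an assumed uniform deceleration (m/s^2),
                          not specified in the text

    Returns the first predicted position (uniform motion) and the
    second predicted position (decelerating motion that stops once the
    speed reaches zero).
    """
    # first predicted position: constant velocity for the whole duration
    first = [p + v * duration for p, v in zip(last_pos, velocity)]

    # second predicted position: decelerate uniformly until stopping
    speed = math.hypot(*velocity)
    t_stop = speed / decel if decel > 0 else duration
    t = min(duration, t_stop)
    travelled = speed * t - 0.5 * decel * t * t
    if speed > 0:
        second = [p + v / speed * travelled
                  for p, v in zip(last_pos, velocity)]
    else:
        second = list(last_pos)
    return first, second
```

The first position anchors the trial desired attitude angle; the second bounds how little the target may have moved, which is why S1103 of fig. 17b keeps adjusting the angle until the second position is also inside the picture.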
After the desired attitude angle is obtained, another attitude angle deviation and another corresponding attitude angular velocity thereof may be determined, and the imaging device is controlled to rotate the other attitude angle deviation at the other attitude angular velocity so that the target reappears in the screen of the imaging device.
Therefore, this embodiment can retrieve a target outside the picture of the imaging device so that it reappears in the picture. The target retrieval capability is stronger, and the attitude angular velocity control of the imaging device is smoother and more stable, avoiding the prior-art defect in which a large attitude angle change during retrieval makes the attitude angular velocity unsmooth and may even lose the target.
Another embodiment of the present disclosure provides a target tracking method. For brevity, the same or similar features of this embodiment as those of the above embodiments are not repeated, and only the different contents from the above embodiments are described below.
As shown in fig. 18, the target tracking method of this embodiment further includes:
s1801: the imaging device identifies the target in the picture;
s1802: and determining the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target.
The process of the imaging device recognizing the target in the screen can be referred to the aforementioned S101. Determining the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target, including:
determining at least one control parameter according to the recognition result of the target;
and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
In this embodiment, the at least one control parameter includes a gain coefficient, which is used for attitude angle control and has a value range.
In the target tracking process, when the imaging device starts to identify the target, the gain coefficient is set to be the minimum value of the value range. When the target is located in the frame of the imaging device and the imaging device continues to recognize the target, the gain coefficient may be gradually increased from the minimum value until the maximum value of the value range is reached. If the target is located in the picture but the imaging device does not recognize the target during the increase of the gain coefficient or after the gain coefficient reaches the maximum value, or the target moves from the picture to the outside of the picture, the gain coefficient is decreased until the minimum value is reached. Wherein the process of increasing and decreasing the gain factor is for one or both of the heading angle controller, the pitch angle controller.
The value range of the gain coefficient can be determined according to the target tracking effect. In some examples, the value range may be 0.1 to 1. The rate of change of the gain coefficient while it is increasing or decreasing can also be determined according to the target tracking effect. In some examples, the rate of change may be 0.1 to 0.5 per second, for example 0.15 per second.
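The gain coefficient schedule described above can be sketched as a simple per-update ramp, using the example values from the text (range 0.1 to 1, rate 0.15 per second); the update form itself is an assumption.

```python
def step_gain(gain, target_recognized, dt,
              rate=0.15, g_min=0.1, g_max=1.0):
    """Sketch of the gain coefficient schedule described above.

    The gain ramps up toward the maximum of its value range while the
    target is continuously recognized, and ramps back down toward the
    minimum when recognition is lost or the target leaves the picture.
    Defaults use the example values from the text (rate 0.15/s, range
    0.1-1); the linear ramp is an assumed form.
    """
    if target_recognized:
        gain = min(gain + rate * dt, g_max)
    else:
        gain = max(gain - rate * dt, g_min)
    return gain
```

Starting tracking at the minimum gain and ramping up means the first commands after recognition (or re-acquisition after occlusion) are gentle, which is exactly the smoothing effect the text attributes to this parameter.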
The gain factor corresponds to a scaling factor of the attitude angle controller. After the gain coefficient is applied to the attitude angle controller, the input and output curves of the attitude angle controller are adjusted as a whole. For example, the input/output curve of the proportional controller shown in fig. 10 may be enlarged or reduced as a whole after the proportional controller is multiplied by a gain factor.
As shown in fig. 19, the target tracking method of this embodiment further includes:
s1901: determining an upper limit value of the attitude angular velocity;
s1902: when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate through the attitude angle deviation at the upper limit value of the angular velocity.
In some examples, the upper limit value is determined according to at least one of the following speed thresholds:
a first velocity threshold of the imaging device at the pose axis;
a second velocity threshold determined by a range of pose angles of the target relative to the imaging device and an exposure time of the imaging device;
a third speed threshold determined by the range of pose angles of the target with respect to the imaging device and the recognition period of the target by the imaging device.
The attitude axes may include a course axis and a pitch axis. The first speed threshold refers to the maximum angular velocity of the imaging device about the course and pitch axes. When the imaging device is mounted on the drone through a carrier, the maximum angular velocity is generally limited by the attitude angular velocity ranges of the carrier and the drone. When the imaging device is mounted directly on the drone, the maximum angular velocity is generally limited by the attitude angular velocity range of the drone.
The second speed threshold may be obtained by dividing the attitude angle range of the target relative to the imaging device by the exposure time and multiplying by a fourth scale factor. The imaging device may set the exposure time automatically according to the ambient brightness, or the exposure time may be set by the user: the user sets the exposure time through the remote controller, the remote controller sends it to the drone, and the drone sets the exposure time of the imaging device to the value set by the user. The fourth scale factor may be determined according to the attitude angle control effect of the imaging device. In some examples, the fourth scale factor may be 0.01 to 1, for example 0.1. Determining the upper limit of the attitude angular velocity according to the second speed threshold avoids blurring of the picture caused by an excessive attitude angular velocity, which would cause target identification to fail.
The third speed threshold is obtained by dividing the attitude angle range of the target relative to the imaging device by the recognition period and multiplying the quotient by a fifth scale factor. The recognition period of the target refers to the time the imaging device needs to recognize one target. The fifth scale factor may be determined according to the attitude angle control effect of the imaging device. In some examples, the fifth scale factor may be 0.01 to 1, for example 0.1. Determining the upper limit of the attitude angular velocity according to the third speed threshold avoids the target position in the picture changing too fast, which would cause target identification to fail.
In some examples, the minimum of the first speed threshold, the second speed threshold, and the third speed threshold may be used as the upper limit value of the attitude angular velocity. The present embodiment is not limited to this, and any one of the speed thresholds or the average value of the three speed thresholds may be used as the upper limit value of the attitude angular velocity.
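Taking the minimum of the three speed thresholds and clamping the commanded rate can be sketched as below; the function names and the use of the example scale factor 0.1 are assumptions.

```python
def rate_upper_limit(axis_max_rate, angle_range_deg,
                     exposure_time_s, recognition_period_s,
                     k4=0.1, k5=0.1):
    """Sketch of the attitude angular velocity upper limit: the minimum
    of the first, second and third speed thresholds described above.

    k4 and k5 are the fourth and fifth scale factors (example value
    0.1 from the text).  Rates are in degrees per second.
    """
    second = angle_range_deg / exposure_time_s * k4
    third = angle_range_deg / recognition_period_s * k5
    return min(axis_max_rate, second, third)

def clamp_rate(rate, upper_limit):
    """S1902: clamp the commanded attitude angular velocity to the
    upper limit (cf. the output saturation of fig. 20)."""
    return max(-upper_limit, min(rate, upper_limit))
```

In this sketch a narrow target (small angle range) or a slow recognizer (long recognition period) tightens the limit, so the picture never sweeps faster than the recognizer can follow.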
S1901 and S1902 of this embodiment apply to one or both of the course angle controller and the pitch angle controller. That is, the attitude axis may include a course axis, the attitude angle range may include a course angle range, and the attitude angular velocity may include a course angular velocity; and/or the attitude axis may include a pitch axis, the attitude angle range may include a pitch angle range, and the attitude angular velocity may include a pitch angular velocity.
By dynamically adjusting the gain coefficient according to the target recognition result, large changes in the attitude angle of the imaging device can be avoided. In particular, when the imaging device first starts recognizing the target, or when the target reappears in the picture after occlusion, an excessive attitude angular velocity can be effectively avoided, making the attitude control process smoother and more stable. By determining the upper limit of the attitude angular velocity and controlling the imaging device to rotate through the attitude angle deviation within that limit, as shown in fig. 20, the maximum output of the proportional controller shown in fig. 10 is effectively capped, so that the attitude angular velocity corresponding to the attitude angle deviation never exceeds the upper limit. This also avoids an excessive attitude angular velocity and makes the attitude control process smoother and more stable, overcoming the prior-art defect of losing the target due to low ambient brightness, occlusion of the target and similar causes.
Yet another embodiment of the present disclosure provides a target tracking method. For brevity, the same or similar features of this embodiment as those of the above embodiments are not repeated, and only the different contents from the above embodiments are described below.
As shown in fig. 21, the target tracking method of this embodiment further includes:
s2101: acquiring the distance and the movement speed of the target relative to the imaging device;
s2102: determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
s2103: determining a feedforward value of the attitude angular velocity according to the distance, the movement speed and the attitude angle.
The distance and the movement speed of the object relative to the imaging device can be obtained in various ways. As previously described, the drone may include one or more sensors. In some examples, the sensors may include inertial sensors, satellite positioning modules. The position and the speed of the unmanned aerial vehicle in a world coordinate system, namely the position and the speed of the imaging device, can be obtained by the unmanned aerial vehicle through the inertial sensor and/or the satellite positioning module. The unmanned aerial vehicle can also obtain the position and the speed of the target in the world coordinate system in various ways. And obtaining the distance and the movement speed of the target relative to the imaging device according to the positions and the speeds of the imaging device and the target in the world coordinate system.
For the process of determining the attitude angle of the connecting line between the target and the imaging device according to the current position, reference may be made to S602 in the above embodiment. The pitch angle of the connecting line between the target and the imaging device can be obtained through S2102.
After the distance and the movement speed of the target relative to the imaging device and the pitch angle of the connecting line between the target and the imaging device are obtained, the feedforward value of the attitude angular velocity can be determined according to the distance, the movement speed, and the pitch angle. The velocity of the target along an attitude axis of the imaging device is obtained from the movement speed, the feedforward value is obtained from this velocity, the distance, and the pitch angle, and the feedforward value is superimposed on the attitude angular velocity determined in step S103.
Specifically, the movement speed is first decomposed into a first velocity along the heading axis of the imaging device, a second velocity along the pitch axis of the imaging device, and a third velocity along the roll axis of the imaging device.
For the heading angle controller, the first velocity is converted into a heading angular velocity of the target relative to the imaging device according to the distance between the target and the imaging device, and the heading angular velocity feedforward is obtained according to the pitch angle of the connecting line between the target and the imaging device.
For the pitch angle controller, after the second velocity is obtained, it is converted into a pitch angular velocity of the target relative to the imaging device according to the distance between the target and the imaging device, and this pitch angular velocity is used as the pitch angular velocity feedforward.
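The feedforward computation described in the preceding paragraphs can be sketched as follows. This is one illustrative reading of the text, assuming the heading feedforward divides by the horizontal projection of the connecting line, d·cos(pitch), while the pitch feedforward divides by the distance d; the function and variable names are hypothetical:

```python
import math

def rate_feedforward(v_heading, v_pitch, distance, pitch_angle):
    """Convert the target's relative velocity into angular-rate feedforward.

    v_heading:   velocity component along the imaging device's heading axis (m/s)
    v_pitch:     velocity component along the imaging device's pitch axis (m/s)
    distance:    distance between the target and the imaging device (m)
    pitch_angle: pitch angle of the connecting line between target and device (rad)
    """
    # Heading motion acts on the horizontal projection of the connecting
    # line, whose length is distance * cos(pitch_angle).
    heading_rate_ff = v_heading / (distance * math.cos(pitch_angle))
    # Pitch motion acts directly on the connecting line of length `distance`.
    pitch_rate_ff = v_pitch / distance
    return heading_rate_ff, pitch_rate_ff
```

Note that as the pitch angle approaches 90° (target directly overhead), cos(pitch) shrinks and the heading feedforward grows, which matches the sharp heading changes described below.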
In a typical target tracking method, tracking lag is likely to occur when the position of the target relative to the imaging device changes. For example, when the target moves from one side of the imaging device to the other, passing directly below or directly above it, the heading angle of the imaging device changes sharply, so the attitude angle control may lag severely or the target may even be lost. In the present disclosure, the attitude angular velocity feedforward is applied to the attitude angular velocity controller according to the distance and movement speed of the target relative to the imaging device. The feedforward ensures the real-time performance of attitude control, improves the target tracking effect, and avoids the defect in the prior art that the target is lost because attitude tracking lags behind.
Still another embodiment of the present disclosure provides a target tracking apparatus, as shown in fig. 22, including:
a memory for storing executable instructions;
a processor to execute the executable instructions stored in the memory to perform the following:
acquiring the current position and the current size of a target in a picture of the imaging equipment;
acquiring an expected position of the target in the picture, and determining the attitude angle deviation of the imaging equipment according to the expected position and the current position;
determining an attitude angular velocity corresponding to the attitude angular deviation according to the current position and the current size;
controlling the imaging device to rotate the attitude angular deviation at the attitude angular velocity to bring the target to the desired position in the picture.
The attitude angle deviation includes: a heading angular deviation, the attitude angular velocity comprising: a course angular velocity; and/or; the attitude angle deviation includes: pitch angle deviation, the attitude angular velocity comprising: pitch angle rate.
The target tracking apparatus of this embodiment may perform the operations, steps, and processes described in any of the above embodiments.
In some examples, the imaging device is mounted to a movable carrier by a carrier; the processor is further configured to perform the following operations: controlling the carrier and/or the movable carrier to rotate so that the imaging equipment rotates by the course angle deviation at the course angle speed; and/or; controlling the carrier and/or the movable carrier to rotate such that the imaging device rotates the pitch angle offset at the pitch angle rate.
In some examples, the processor is further configured to: determining at least one control parameter according to the current position and the current size; and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
In some examples, the processor is further configured to: determining the field angle of the target according to the current position and the current size; determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position; and obtaining at least one control parameter according to at least one of the field angle and the attitude angle.
In some examples, the processor is further configured to: acquiring an attitude angle of an identification frame relative to the imaging device, wherein the identification frame is used for identifying the current position and the current size; and obtaining the field angle according to the attitude angle of the identification frame relative to the imaging device.
In some examples, the pose angle of the identification frame relative to the imaging device comprises: a heading angle of the identification frame relative to the imaging device, and the field angle comprises: a course angle range of the target relative to the imaging device; and/or; the pose angle of the identification frame relative to the imaging device comprises: a pitch angle of the identification frame relative to the imaging device, and the field angle comprises: a range of pitch angles of the target relative to the imaging device.
In some examples, the attitude angle includes: a pitch angle; the processor is further configured to perform the following operations: acquiring a pitch angle of an identification frame relative to the imaging device, wherein the identification frame is used for identifying the current position and the current size; determining the pitch angle of the target relative to the imaging equipment according to the pitch angle of the identification frame relative to the imaging equipment; acquiring a current pitch angle of the imaging equipment; and obtaining the pitch angle of a connecting line between the target and the imaging equipment according to the pitch angle of the target relative to the imaging equipment and the current pitch angle.
In some examples, at least one of the parameters includes: a gain decay threshold and/or a dead band threshold.
In some examples, the gain attenuation threshold is positively correlated with the field angle and/or the pose angle.
In some examples, when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field angle and the attitude angle, the field angle includes: a range of heading angles of the target relative to the imaging device, and the attitude angle includes: a pitch angle; and/or; when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field angle, the field angle includes: a range of pitch angles of the target relative to the imaging device.
In some examples, the processor is further configured to: and determining the dead zone threshold according to the attitude angle.
In some examples, the attitude angle includes: a pitch angle; the processor is further configured to perform the following operations: when the absolute value of the pitch angle is larger than a threshold value, controlling the dead zone threshold value to be positively correlated with the pitch angle; when the absolute value of the pitch angle is smaller than or equal to the threshold value, setting the dead zone threshold value as a preset value; wherein the dead zone threshold is a dead zone threshold for heading angle control.
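As an illustrative sketch of the dead-zone rule just described, the following assumes a linear positive correlation above the threshold; the function name, threshold, preset value, and scaling factor are all hypothetical:

```python
def heading_dead_zone(pitch_angle, pitch_threshold=0.8, preset=0.02, scale=0.1):
    """Dead zone threshold for heading angle control.

    When |pitch| exceeds the threshold (the target is nearly directly above
    or below the imaging device), the dead zone grows with the pitch angle
    to suppress heading jitter; otherwise a preset value is used.
    All angles in radians.
    """
    if abs(pitch_angle) > pitch_threshold:
        # positively correlated with the magnitude of the pitch angle
        return preset + scale * (abs(pitch_angle) - pitch_threshold)
    return preset
```

A wider dead zone near ±90° pitch keeps small heading-angle deviations, which become large and noisy in that geometry, from driving the heading controller.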
In some examples, the processor is further configured to: setting the dead zone threshold value as a preset value, wherein the dead zone threshold is a dead zone threshold for pitch angle control.
In some examples, the processor is further configured to: the imaging device identifies the target in the picture; and determining the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target.
In some examples, the processor is further configured to: determining at least one control parameter according to the recognition result of the target; and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
In some examples, at least one of the parameters includes: gain coefficients for course angle control and/or pitch angle control; the gain factor has a value range; the processor is further configured to perform the following operations: when the target is located in the picture and the target is identified, increasing the gain coefficient within the value range; and when the target is positioned in the picture but the target is not recognized or the target is positioned outside the picture, reducing the gain coefficient in the value range.
In some examples, the processor is further configured to: acquiring the distance and the movement speed of the target relative to the imaging device; determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position; and determining a feedforward value of the attitude angular velocity according to the distance, the motion velocity and the attitude angle.
In some examples, the processor is further configured to: obtaining the speed along the attitude axis of the imaging equipment according to the movement speed; and obtaining the feedforward value according to the speed, the distance and the attitude angle.
In some examples, the attitude angle includes: a pitch angle, the attitude angular velocity comprising: a pitch angle rate, the attitude axis comprising: a pitch axis; and/or; the attitude angle includes: a heading angle, the attitude angular velocity comprising: a heading angular velocity, the attitude axis comprising: a heading axis.
In some examples, the processor is further configured to: determining an upper limit value of the attitude angular velocity; and when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate the attitude angular deviation at the upper limit value of the angular velocity.
In some examples, the processor is further configured to: determining the upper limit value according to the following speed thresholds: a first velocity threshold of the imaging device at a pose axis, a second velocity threshold determined by a pose angular range of the target relative to the imaging device and an exposure time of the imaging device, a third velocity threshold determined by a pose angular range of the target relative to the imaging device and a recognition period of the target by the imaging device.
In some examples, the angular velocity upper limit value is a minimum of the first velocity threshold, the second velocity threshold, and the third velocity threshold.
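Under one reading of the three thresholds above (the names and units are hypothetical), the upper limit value can be computed as:

```python
def rate_upper_limit(axis_max_rate, angle_range, exposure_time, recog_period):
    """Upper limit of the attitude angular velocity (rad/s).

    axis_max_rate: first threshold - rate limit of the attitude axis itself
    angle_range:   attitude angle range of the target relative to the device (rad)
    exposure_time: exposure time of the imaging device (s)
    recog_period:  period of the imaging device's target recognition (s)
    """
    # second threshold: the picture should not sweep more than the target's
    # angular range within one exposure (limits motion blur)
    blur_limit = angle_range / exposure_time
    # third threshold: the target should not move beyond its angular range
    # between two recognitions (limits tracking loss)
    recog_limit = angle_range / recog_period
    return min(axis_max_rate, blur_limit, recog_limit)
```

Taking the minimum means whichever constraint is tightest (mechanics, exposure, or recognition rate) governs the commanded angular velocity.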
In some examples, the pose axis comprises: a heading axis, the attitude angle range including: a course angular range, the attitude angular velocity comprising: a course angular velocity; and/or; the attitude axis includes: a pitch axis, the range of attitude angles comprising: a pitch angle range, the attitude angular velocity comprising: pitch angle rate.
In some examples, the processor is further configured to: determining a desired pose angle of the imaging device from the desired position and the current position; acquiring a current attitude angle of the imaging device; and determining the attitude angle deviation according to the expected attitude angle and the current attitude angle.
In some examples, the processor is further configured to: determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position; determining a pose angular deviation of the target relative to the imaging device from the desired position; and obtaining the expected attitude angle according to the attitude angle and the attitude angle deviation.
In some examples, the current pose angle includes: a current heading angle, the desired attitude angle comprising: a desired heading angle, the attitude angle comprising: a heading angle, the attitude angle deviation comprising: course angle deviation; and/or; the current attitude angle includes: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the attitude angle comprising: a pitch angle, the attitude angle deviation comprising: and (5) pitch angle deviation.
In some examples, the processor is further configured to: determining another attitude angle deviation when the target in the picture of the imaging device moves outside the picture of the imaging device; determining another attitude angular velocity corresponding to the another attitude angular deviation according to the current position and the current size; controlling the imaging device to rotate the other attitude angular deviation at the other attitude angular velocity to cause the target to reappear in a picture of the imaging device.
In some examples, the processor is further configured to: determining a desired pose angle of the imaging device; acquiring a current attitude angle of the imaging device; determining the other attitude angle deviation from the desired attitude angle and the current attitude angle.
In some examples, the processor is further configured to: acquiring the position and the movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture; determining a predicted position of the target based on the position, the movement speed, and the duration; and obtaining the desired attitude angle according to the predicted position.
In some examples, the predicted location includes: the position of the target under uniform motion and/or deceleration motion.
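A minimal sketch of the position prediction just described, assuming (hypothetically) a single axis and a constant deceleration magnitude until the target stops; uniform motion is the special case of zero deceleration:

```python
def predict_position(last_pos, velocity, duration, decel=0.0):
    """Predict the target's position after it leaves the picture.

    last_pos: position at the last moment the target was in the picture (m)
    velocity: movement speed at that moment (m/s, signed)
    duration: time the target has been outside the picture (s)
    decel:    deceleration magnitude (m/s^2); 0 gives uniform motion
    """
    if decel <= 0.0:
        return last_pos + velocity * duration       # uniform motion
    t_stop = abs(velocity) / decel                  # time until the target stops
    t = min(duration, t_stop)                       # no motion after stopping
    sign = 1.0 if velocity >= 0 else -1.0
    # uniformly decelerated displacement up to the stopping time
    return last_pos + velocity * t - sign * 0.5 * decel * t * t
```

The desired attitude angle is then recomputed from this predicted position, so the imaging device turns toward where the target is expected to be rather than where it was last seen.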
In some examples, the current pose angle includes: a current heading angle, the desired attitude angle comprising: a desired heading angle, the another attitude angle deviation comprising: a heading angular deviation, the another attitude angular velocity comprising: a course angular velocity; and/or; the current attitude angle includes: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the another attitude angle deviation comprising: a pitch angle deviation, the another attitude angular velocity comprising: pitch angle rate.
Yet another embodiment of the present disclosure also provides a computer-readable storage medium storing executable instructions that, when executed by one or more processors, may cause the one or more processors to perform the target tracking method of the above-described embodiment.
A computer-readable storage medium may be, for example, any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
In addition, the computer program may include computer program code, for example organized into computer program modules. It should be noted that the division and number of the modules are not fixed, and those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation; when these program modules are executed by a computer (or a processor), the computer can perform the flow of the target tracking method described in the present disclosure and variations thereof.
Yet another embodiment of the present disclosure provides a movable platform, including: a movable carrier, an imaging apparatus, and a carrier. The imaging apparatus is mounted to the movable carrier or, alternatively, to the movable carrier by the carrier. The movable carrier includes: unmanned aerial vehicles, unmanned ships, or robots. The carrier includes: a head having at least one rotational degree of freedom.
The movable carrier comprises the object tracking means of the above embodiments. The target tracking device may control the vehicle and/or the movable carrier to rotate to adjust a pose angle of the imaging apparatus.
Yet another embodiment of the present disclosure further provides an imaging platform, including: a carrier and an imaging device; the carrier includes: the target tracking apparatus of the above embodiment. The target tracking apparatus may control the carrier to rotate to adjust the attitude angle of the imaging device. The carrier may be a handheld gimbal, and the imaging platform may be a gimbal camera including the handheld gimbal.
It will be apparent to those skilled in the art that, for convenience and brevity of description, the above division of functional modules is merely an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; features in the embodiments of the disclosure may be combined arbitrarily where no conflict arises; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present disclosure.

Claims (70)

1. A target tracking method, comprising:
acquiring the current position and the current size of a target in a picture of the imaging equipment;
acquiring an expected position of the target in the picture, and determining the attitude angle deviation of the imaging equipment according to the expected position and the current position;
determining an attitude angular velocity corresponding to the attitude angular deviation according to the current position and the current size;
controlling the imaging device to rotate the attitude angular deviation at the attitude angular velocity to bring the target to the desired position in the picture.
2. The target tracking method of claim 1, wherein the attitude angle deviation comprises: a heading angular deviation, the attitude angular velocity comprising: a course angular velocity;
and/or;
the attitude angle deviation includes: pitch angle deviation, the attitude angular velocity comprising: pitch angle rate.
3. The object tracking method of claim 2, wherein the imaging device is mounted to a movable carrier by a carrier;
the controlling the imaging device to rotate the attitude angular deviation at the attitude angular velocity includes:
controlling the carrier and/or the movable carrier to rotate so that the imaging equipment rotates by the course angle deviation at the course angle speed;
and/or;
controlling the carrier and/or the movable carrier to rotate such that the imaging device rotates the pitch angle offset at the pitch angle rate.
4. The method for tracking an object according to claim 1, wherein said determining an attitude angular velocity corresponding to the attitude angular deviation based on the current position and the current size comprises:
determining at least one control parameter according to the current position and the current size;
and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
5. The target tracking method of claim 4, wherein said determining at least one control parameter based on said current position and said current size comprises:
determining the field angle of the target according to the current position and the current size;
determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
and obtaining at least one control parameter according to at least one of the field angle and the attitude angle.
6. The target tracking method of claim 5, wherein said determining a field of view of the target from the current position and the current size comprises:
acquiring an attitude angle of an identification frame relative to the imaging device, wherein the identification frame is used for identifying the current position and the current size;
and obtaining the field angle according to the attitude angle of the identification frame relative to the imaging device.
7. The object tracking method of claim 6,
the pose angle of the identification frame relative to the imaging device comprises: a heading angle of the identification frame relative to the imaging device, and the field angle comprises: a course angle range of the target relative to the imaging device;
and/or;
the pose angle of the identification frame relative to the imaging device comprises: the identification frame is in a pitch angle relative to the imaging device, and the field angle comprises: a range of pitch angles of the target relative to the imaging device.
8. The target tracking method of claim 5, wherein the attitude angle comprises: a pitch angle;
the determining an attitude angle of a connecting line between the target and the imaging device according to the current position includes:
acquiring a pitch angle of an identification frame relative to the imaging device, wherein the identification frame is used for identifying the current position and the current size;
determining the pitch angle of the target relative to the imaging equipment according to the pitch angle of the identification frame relative to the imaging equipment;
acquiring a current pitch angle of the imaging equipment;
and obtaining the pitch angle of a connecting line between the target and the imaging equipment according to the pitch angle of the target relative to the imaging equipment and the current pitch angle.
9. The target tracking method of claim 5, wherein at least one of the parameters comprises: a gain decay threshold and/or a dead band threshold.
10. The object tracking method of claim 9, wherein the gain-attenuation threshold is positively correlated with the field angle and/or the pose angle.
11. The object tracking method of claim 10, wherein when the gain attenuation threshold is a gain attenuation threshold for course angle control and is positively correlated with the field angle and the attitude angle,
the field angle includes: a range of heading angles of the target relative to the imaging device, the attitude angle including: a pitch angle;
and/or;
when the gain attenuation threshold is a gain attenuation threshold with respect to pitch angle control, and is positively correlated with the field angle,
the field angle includes: a range of pitch angles of the target relative to the imaging device.
12. The target tracking method of claim 9, wherein said determining at least one control parameter from at least one of said field angle and said pose angle comprises:
and determining the dead zone threshold according to the attitude angle.
13. The target tracking method of claim 12, wherein the attitude angle comprises: a pitch angle;
the determining the dead zone threshold according to the attitude angle includes:
when the absolute value of the pitch angle is larger than a threshold value, controlling the dead zone threshold value to be positively correlated with the pitch angle;
when the absolute value of the pitch angle is smaller than or equal to the threshold value, setting the dead zone threshold value as a preset value;
wherein the dead zone threshold is a dead zone threshold for heading angle control.
14. The target tracking method of claim 9, wherein said determining at least one control parameter from at least one of said field angle and said pose angle comprises:
and setting the dead zone threshold value as a preset value, wherein the dead zone threshold is a dead zone threshold for pitch angle control.
15. The target tracking method of claim 1, further comprising:
the imaging device identifies the target in the picture;
and determining the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target.
16. The target tracking method according to claim 15, wherein the determining the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target comprises:
determining at least one control parameter according to the recognition result of the target;
and obtaining the attitude angular velocity according to the attitude angular deviation and at least one control parameter.
17. The target tracking method of claim 16, wherein at least one of the parameters comprises: gain coefficients for course angle control and/or pitch angle control; the gain factor has a value range;
the determining at least one control parameter according to the recognition result of the target includes:
when the target is located in the picture and the target is identified, increasing the gain coefficient within the value range;
and when the target is positioned in the picture but the target is not recognized or the target is positioned outside the picture, reducing the gain coefficient in the value range.
18. The target tracking method of claim 1, further comprising:
acquiring the distance and the movement speed of the target relative to the imaging device;
determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
and determining a feedforward value of the attitude angular velocity according to the distance, the motion velocity and the attitude angle.
19. The method of target tracking of claim 18 wherein said determining a feed forward value for said pose angular velocity from said distance, said motion velocity, and said pose angle comprises:
obtaining the speed along the attitude axis of the imaging equipment according to the movement speed;
and obtaining the feedforward value according to the speed, the distance and the attitude angle.
20. The object tracking method of claim 19,
the attitude angle includes: a pitch angle, the attitude angular velocity comprising: a pitch angle rate, the attitude axis comprising: a pitch axis;
and/or;
the attitude angle includes: a heading angle, the attitude angular velocity comprising: a heading angular velocity, the attitude axis comprising: a heading axis.
21. The target tracking method of claim 1, further comprising:
determining an upper limit value of the attitude angular velocity;
and when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate the attitude angular deviation at the upper limit value of the angular velocity.
22. The target tracking method of claim 21, wherein said determining an upper limit value for the attitude angular velocity comprises:
determining the upper limit value according to the following speed thresholds:
a first velocity threshold of the imaging device at a pose axis, a second velocity threshold determined by a pose angular range of the target relative to the imaging device and an exposure time of the imaging device, a third velocity threshold determined by a pose angular range of the target relative to the imaging device and a recognition period of the target by the imaging device.
23. The object tracking method of claim 22, wherein the angular velocity upper limit value is the smallest of the first velocity threshold value, the second velocity threshold value, and the third velocity threshold value.
24. The object tracking method of claim 22,
the attitude axis includes: a heading axis, the attitude angle range including: a course angular range, the attitude angular velocity comprising: a course angular velocity;
and/or;
the attitude axis includes: a pitch axis, the range of attitude angles comprising: a pitch angle range, the attitude angular velocity comprising: pitch angle rate.
25. The target tracking method of claim 1, wherein said determining a pose angular deviation of the imaging device from the desired position and the current position comprises:
determining a desired pose angle of the imaging device from the desired position and the current position;
acquiring a current attitude angle of the imaging device;
and determining the attitude angle deviation according to the expected attitude angle and the current attitude angle.
26. The target tracking method of claim 25 wherein said determining a desired pose angle of said imaging device from said desired position and said current position comprises:
determining an attitude angle of a connecting line between the target and the imaging equipment according to the current position;
determining a pose angular deviation of the target relative to the imaging device from the desired position;
and obtaining the expected attitude angle according to the attitude angle and the attitude angle deviation.
27. The object tracking method of claim 26,
the current attitude angle includes: a current heading angle, the desired attitude angle comprising: a desired heading angle, the attitude angle comprising: a heading angle, the attitude angle deviation comprising: course angle deviation;
and/or;
the current attitude angle includes: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the attitude angle comprising: a pitch angle, the attitude angle deviation comprising: and (5) pitch angle deviation.
28. The target tracking method of claim 1, further comprising:
determining another attitude angle deviation when the target in the picture of the imaging device moves outside the picture of the imaging device;
determining another attitude angular velocity corresponding to the another attitude angular deviation according to the current position and the current size;
controlling the imaging device to rotate the other attitude angular deviation at the other attitude angular velocity to cause the target to reappear in a picture of the imaging device.
29. The target tracking method of claim 28, wherein said determining another attitude angle deviation comprises:
determining a desired attitude angle of the imaging device;
acquiring a current attitude angle of the imaging device;
and determining the other attitude angle deviation according to the desired attitude angle and the current attitude angle.
30. The target tracking method of claim 29, wherein said determining a desired attitude angle of the imaging device comprises:
acquiring the position and movement speed of the target relative to the imaging device at the last moment the target appeared in the picture, and the duration for which the target has been outside the picture;
determining a predicted position of the target based on the position, the speed of movement, and the duration;
and obtaining the expected attitude angle according to the predicted position.
31. The target tracking method of claim 30, wherein the predicted position comprises: the position of the target under uniform motion and/or decelerating motion.
32. The target tracking method of claim 30, wherein:
the current attitude angle comprises: a current heading angle, the desired attitude angle comprising: a desired heading angle, the another attitude angle deviation comprising: a heading angle deviation, the another attitude angular velocity comprising: a heading angular velocity;
and/or;
the current attitude angle comprises: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the another attitude angle deviation comprising: a pitch angle deviation, the another attitude angular velocity comprising: a pitch angular velocity.
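Claims 30–31 predict the target's position from its last in-frame position, its movement speed, and the time spent outside the picture, under uniform and/or decelerating motion. One way such a predictor might look; the deceleration magnitude `decel` is an assumed tuning constant not specified by the claims:

```python
def predict_position(last_pos, last_vel, t_outside, decel=0.0):
    """Predict a 1-D position after t_outside seconds out of frame:
    uniform motion, or deceleration at `decel` until the target stops."""
    if decel <= 0.0:
        return last_pos + last_vel * t_outside  # uniform motion
    # Decelerating motion: the target slows to a stop after t_stop seconds.
    t_stop = abs(last_vel) / decel
    t = min(t_outside, t_stop)
    sign = 1.0 if last_vel >= 0 else -1.0
    return last_pos + last_vel * t - sign * 0.5 * decel * t * t
```

Running the predictor per axis gives the desired attitude angle of claim 30 once the predicted position is converted back to an angle.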
33. A target tracking device, comprising:
a memory for storing executable instructions;
a processor configured to execute the executable instructions stored in the memory to perform the following operations:
acquiring a current position and a current size of a target in a picture of an imaging device;
acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;
determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;
controlling the imaging device to rotate through the attitude angle deviation at the attitude angular velocity, so that the target reaches the desired position in the picture.
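The core loop of claim 33 — mapping an attitude angle deviation to an attitude angular velocity, then rotating through the deviation — can be illustrated by a proportional rate command with a dead zone and a rate limit. This is a sketch under assumed parameter names, not the patented control law:

```python
def attitude_rate_command(angle_deviation_rad, gain, dead_zone_rad, rate_limit):
    """Turn an attitude angle deviation into a rate command using a
    proportional gain, a dead zone, and a symmetric rate limit
    (all assumed tuning parameters)."""
    if abs(angle_deviation_rad) <= dead_zone_rad:
        return 0.0  # within the dead zone: hold attitude
    rate = gain * angle_deviation_rad
    # Saturate at the angular-velocity upper limit (cf. claims 53-56).
    return max(-rate_limit, min(rate_limit, rate))
```

The later claims refine each piece of this loop: the gain and thresholds come from the current position and size (claims 36–46), and the limit from claims 53–56.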
34. The target tracking device of claim 33, wherein:
the attitude angle deviation comprises: a heading angle deviation, the attitude angular velocity comprising: a heading angular velocity;
and/or;
the attitude angle deviation comprises: a pitch angle deviation, the attitude angular velocity comprising: a pitch angular velocity.
35. The target tracking device of claim 34, wherein the imaging device is mounted to a movable carrier via a carrier;
the processor is further configured to perform the following operations:
controlling the carrier and/or the movable carrier to rotate, so that the imaging device rotates through the heading angle deviation at the heading angular velocity;
and/or;
controlling the carrier and/or the movable carrier to rotate, so that the imaging device rotates through the pitch angle deviation at the pitch angular velocity.
36. The target tracking device of claim 33 wherein the processor is further configured to:
determining at least one control parameter according to the current position and the current size;
and obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.
37. The target tracking device of claim 36, wherein the processor is further configured to:
determining a field angle of the target according to the current position and the current size;
determining an attitude angle of a line connecting the target and the imaging device according to the current position;
and obtaining the at least one control parameter according to at least one of the field angle and the attitude angle.
38. The target tracking device of claim 37, wherein the processor is further configured to:
acquiring an attitude angle of an identification frame relative to the imaging device, wherein the identification frame identifies the current position and the current size;
and obtaining the field angle according to the attitude angle of the identification frame relative to the imaging device.
39. The target tracking device of claim 38, wherein:
the attitude angle of the identification frame relative to the imaging device comprises: a heading angle of the identification frame relative to the imaging device, the field angle comprising: a heading angle range of the target relative to the imaging device;
and/or;
the attitude angle of the identification frame relative to the imaging device comprises: a pitch angle of the identification frame relative to the imaging device, the field angle comprising: a pitch angle range of the target relative to the imaging device.
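Claims 38–39 derive the field angle — the heading- or pitch-angle range the target subtends — from the identification frame (the bounding box marking the current position and size). Under an assumed pinhole model with pixel coordinates measured from the image centre, the heading-angle range of a box might be computed as:

```python
import math

def bbox_angular_range(x_min_px, x_max_px, image_width_px, horizontal_fov_rad):
    """Angular extent subtended by a bounding box along one image axis.
    Pixel coordinates are offsets from the image centre (an assumption)."""
    focal_px = (image_width_px / 2) / math.tan(horizontal_fov_rad / 2)
    # Attitude angles of the two box edges relative to the imaging device.
    left = math.atan2(x_min_px, focal_px)
    right = math.atan2(x_max_px, focal_px)
    return right - left
```

Swapping the horizontal quantities for vertical ones yields the pitch-angle range of the second branch of claim 39.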
40. The target tracking device of claim 37, wherein the attitude angle comprises: a pitch angle;
the processor is further configured to perform the following operations:
acquiring a pitch angle of an identification frame relative to the imaging device, wherein the identification frame identifies the current position and the current size;
determining a pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device;
acquiring a current pitch angle of the imaging device;
and obtaining a pitch angle of a line connecting the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.
41. The target tracking device of claim 37, wherein the at least one control parameter comprises: a gain attenuation threshold and/or a dead zone threshold.
42. The target tracking device of claim 41, wherein the gain attenuation threshold is positively correlated with the field angle and/or the attitude angle.
43. The target tracking device of claim 42, wherein:
when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field angle and the attitude angle, the field angle comprises: a heading angle range of the target relative to the imaging device, the attitude angle comprising: a pitch angle;
and/or;
when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field angle, the field angle comprises: a pitch angle range of the target relative to the imaging device.
44. The target tracking device of claim 41 wherein the processor is further configured to:
and determining the dead zone threshold according to the attitude angle.
45. The target tracking device of claim 44 wherein the attitude angle comprises: a pitch angle;
the processor is further configured to perform the following operations:
when the absolute value of the pitch angle is greater than a threshold value, controlling the dead zone threshold to be positively correlated with the pitch angle;
when the absolute value of the pitch angle is less than or equal to the threshold value, setting the dead zone threshold to a preset value;
wherein the dead zone threshold is a dead zone threshold for heading angle control.
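Claims 44–45 make the heading-control dead zone grow with the pitch angle once its absolute value exceeds a threshold, and hold it at a preset value otherwise (intuitively, at steep pitch angles small heading errors correspond to tiny on-screen motion, so a wider dead zone avoids jitter). A sketch with an assumed linear growth law — the claims state only the positive correlation, not the exact mapping:

```python
def heading_dead_zone(pitch_rad, pitch_threshold_rad, preset_dead_zone):
    """Dead zone threshold for heading angle control (claim 45 sketch):
    preset below the pitch threshold, growing with |pitch| above it.
    The linear growth factor is an assumption."""
    if abs(pitch_rad) > pitch_threshold_rad:
        # Positively correlated with the pitch angle magnitude.
        return preset_dead_zone * (abs(pitch_rad) / pitch_threshold_rad)
    return preset_dead_zone
```

Per claim 46, the pitch-control dead zone stays at a preset value and needs no such scheduling.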
46. The target tracking device of claim 41 wherein the processor is further configured to:
and setting the dead zone threshold to a preset value, wherein the dead zone threshold is a dead zone threshold for pitch angle control.
47. The target tracking device of claim 33 wherein the processor is further configured to:
identifying the target in the picture of the imaging device;
and determining the attitude angular velocity corresponding to the attitude angle deviation according to a recognition result of the target.
48. The target tracking device of claim 47, wherein the processor is further configured to:
determining at least one control parameter according to the recognition result of the target;
and obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.
49. The target tracking device of claim 48, wherein the at least one control parameter comprises: a gain coefficient for heading angle control and/or pitch angle control, the gain coefficient having a value range;
the processor is further configured to perform the following operations:
when the target is located in the picture and the target is recognized, increasing the gain coefficient within the value range;
and when the target is located in the picture but is not recognized, or the target is located outside the picture, decreasing the gain coefficient within the value range.
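Claim 49's gain adaptation — raising the gain coefficient while the target is in frame and recognized, lowering it otherwise, always within a value range — reduces to a clamped step update. `step`, `lo`, and `hi` are assumed tuning constants:

```python
def update_gain(gain, recognized, in_frame, step, lo, hi):
    """Step the gain coefficient up on successful recognition, down
    otherwise, and clamp it to the value range [lo, hi]."""
    if in_frame and recognized:
        gain += step  # confident tracking: respond more aggressively
    else:
        gain -= step  # lost or unrecognized target: back off
    return max(lo, min(hi, gain))
```

Backing the gain off when recognition fails keeps stale detections from driving fast, wrong rotations.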
50. The target tracking device of claim 33 wherein the processor is further configured to:
acquiring a distance and a movement speed of the target relative to the imaging device;
determining an attitude angle of a line connecting the target and the imaging device according to the current position;
and determining a feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.
51. The target tracking device of claim 50 wherein the processor is further configured to:
obtaining a velocity component along an attitude axis of the imaging device according to the movement speed;
and obtaining the feedforward value according to the velocity component, the distance, and the attitude angle.
52. The target tracking device of claim 51, wherein:
the attitude angle comprises: a pitch angle, the attitude angular velocity comprising: a pitch angular velocity, the attitude axis comprising: a pitch axis;
and/or;
the attitude angle comprises: a heading angle, the attitude angular velocity comprising: a heading angular velocity, the attitude axis comprising: a heading axis.
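Claims 50–52 form a feedforward angular-velocity term from the target's distance, its velocity component along an attitude axis, and the attitude angle of the target–device line — essentially the line-of-sight rotation rate. A small sketch; the cosine projection used here is an assumption, since the claims do not give the exact formula:

```python
import math

def feedforward_angular_velocity(speed_along_axis, distance, attitude_angle_rad):
    """Feedforward rate sketch: the line of sight to a target moving at
    v perpendicular to it rotates at roughly v / d; the attitude angle
    scales the projection (assumed cosine)."""
    if distance <= 0:
        return 0.0  # no valid range measurement: no feedforward
    return speed_along_axis * math.cos(attitude_angle_rad) / distance
```

Adding this term to the feedback rate of claim 33 lets the device lead a moving target instead of only reacting to its on-screen error.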
53. The target tracking device of claim 33 wherein the processor is further configured to:
determining an upper limit value of the attitude angular velocity;
and when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate through the attitude angle deviation at the upper limit value of the angular velocity.
54. The target tracking device of claim 53, wherein the processor is further configured to:
determining the upper limit value according to the following velocity thresholds:
a first velocity threshold of the imaging device about an attitude axis; a second velocity threshold determined by an attitude angle range of the target relative to the imaging device and an exposure time of the imaging device; and a third velocity threshold determined by the attitude angle range of the target relative to the imaging device and a recognition period of the target by the imaging device.
55. The target tracking device of claim 54 wherein the upper angular velocity limit is the minimum of the first velocity threshold, the second velocity threshold, and the third velocity threshold.
56. The target tracking device of claim 54, wherein:
the attitude axis comprises: a heading axis, the attitude angle range comprising: a heading angle range, the attitude angular velocity comprising: a heading angular velocity;
and/or;
the attitude axis comprises: a pitch axis, the attitude angle range comprising: a pitch angle range, the attitude angular velocity comprising: a pitch angular velocity.
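Claims 53–56 bound the attitude angular velocity, and claim 55 fixes the bound as the minimum of three thresholds: a mechanical axis limit, a motion-blur bound (attitude angle range over exposure time), and a tracking bound (attitude angle range over recognition period). The parameter names below are illustrative:

```python
def angular_velocity_upper_limit(axis_limit, angle_range_rad,
                                 exposure_time_s, recognition_period_s):
    """Upper limit on the attitude angular velocity per claim 55: the
    minimum of the first, second, and third velocity thresholds."""
    blur_bound = angle_range_rad / exposure_time_s        # second threshold
    tracking_bound = angle_range_rad / recognition_period_s  # third threshold
    return min(axis_limit, blur_bound, tracking_bound)
```

The blur bound keeps the target from smearing across more than its own angular extent during one exposure; the tracking bound keeps it from leaving its previous identification frame between recognition cycles.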
57. The target tracking device of claim 33 wherein the processor is further configured to:
determining a desired attitude angle of the imaging device according to the desired position and the current position;
acquiring a current attitude angle of the imaging device;
and determining the attitude angle deviation according to the desired attitude angle and the current attitude angle.
58. The target tracking device of claim 57 wherein the processor is further configured to:
determining an attitude angle of a line connecting the target and the imaging device according to the current position;
determining an attitude angle deviation of the target relative to the imaging device according to the desired position;
and obtaining the desired attitude angle according to the attitude angle and the attitude angle deviation.
59. The target tracking device of claim 58, wherein:
the current attitude angle comprises: a current heading angle, the desired attitude angle comprising: a desired heading angle, the attitude angle comprising: a heading angle, the attitude angle deviation comprising: a heading angle deviation;
and/or;
the current attitude angle comprises: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the attitude angle comprising: a pitch angle, the attitude angle deviation comprising: a pitch angle deviation.
60. The target tracking device of claim 33 wherein the processor is further configured to:
determining another attitude angle deviation when the target in the picture of the imaging device moves outside the picture of the imaging device;
determining another attitude angular velocity corresponding to the another attitude angular deviation according to the current position and the current size;
controlling the imaging device to rotate through the other attitude angle deviation at the other attitude angular velocity, so that the target reappears in the picture of the imaging device.
61. The target tracking device of claim 60 wherein the processor is further configured to:
determining a desired attitude angle of the imaging device;
acquiring a current attitude angle of the imaging device;
determining the other attitude angle deviation from the desired attitude angle and the current attitude angle.
62. The target tracking device of claim 61, wherein the processor is further configured to:
acquiring the position and movement speed of the target relative to the imaging device at the last moment the target appeared in the picture, and the duration for which the target has been outside the picture;
determining a predicted position of the target according to the position, the movement speed, and the duration;
and obtaining the desired attitude angle according to the predicted position.
63. The target tracking device of claim 62, wherein the predicted position comprises: the position of the target under uniform motion and/or decelerating motion.
64. The target tracking device of claim 62, wherein:
the current attitude angle comprises: a current heading angle, the desired attitude angle comprising: a desired heading angle, the another attitude angle deviation comprising: a heading angle deviation, the another attitude angular velocity comprising: a heading angular velocity;
and/or;
the current attitude angle comprises: a current pitch angle, the desired attitude angle comprising: a desired pitch angle, the another attitude angle deviation comprising: a pitch angle deviation, the another attitude angular velocity comprising: a pitch angular velocity.
65. A computer-readable storage medium having stored thereon executable instructions that, when executed by one or more processors, may cause the one or more processors to perform the object tracking method of any one of claims 1 to 32.
66. A movable platform, comprising: a movable carrier, an imaging device, and a carrier;
the imaging device is mounted to the movable carrier directly, or is mounted to the movable carrier via the carrier;
the movable carrier comprises: the target tracking device of any one of claims 33-64;
the target tracking device is configured to control the carrier and/or the movable carrier to rotate so as to adjust an attitude angle of the imaging device.
67. The movable platform of claim 66, wherein the movable carrier comprises: an unmanned aerial vehicle, an unmanned ship, or a robot.
68. The movable platform of claim 66, wherein the carrier comprises: a gimbal having at least one rotational degree of freedom.
69. An imaging platform, comprising: a carrier and an imaging device; the carrier comprises: the target tracking device of any one of claims 33-64;
the target tracking device is configured to control the carrier to rotate so as to adjust an attitude angle of the imaging device.
70. The imaging platform of claim 69, wherein the carrier comprises: a handheld gimbal.
CN202080004853.6A 2020-05-07 2020-05-07 Target tracking method and device, movable platform and imaging platform Pending CN112639652A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/089009 WO2021223171A1 (en) 2020-05-07 2020-05-07 Target tracking method and apparatus, movable platform, and imaging platform

Publications (1)

Publication Number Publication Date
CN112639652A true CN112639652A (en) 2021-04-09

Family

ID=75291189

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080004853.6A Pending CN112639652A (en) 2020-05-07 2020-05-07 Target tracking method and device, movable platform and imaging platform

Country Status (2)

Country Link
CN (1) CN112639652A (en)
WO (1) WO2021223171A1 (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974208B (en) * 2023-09-22 2024-01-19 西北工业大学 Rotor unmanned aerial vehicle target hitting control method and system based on strapdown seeker
CN117649426B (en) * 2024-01-29 2024-04-09 中国科学院长春光学精密机械与物理研究所 Moving target tracking method for preventing shielding of landing gear of unmanned aerial vehicle


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202014011010U1 (en) * 2014-07-30 2017-05-31 SZ DJI Technology Co., Ltd. Target tracking systems
CN107148639A (en) * 2015-09-15 2017-09-08 深圳市大疆创新科技有限公司 It is determined that method and device, tracks of device and the system of the positional information of tracking target
CN107209854A (en) * 2015-09-15 2017-09-26 深圳市大疆创新科技有限公司 For the support system and method that smoothly target is followed
CN107087427A (en) * 2016-11-30 2017-08-22 深圳市大疆创新科技有限公司 Control method, device and the equipment and aircraft of aircraft
CN107749951A (en) * 2017-11-09 2018-03-02 睿魔智能科技(东莞)有限公司 A kind of visually-perceptible method and system for being used for unmanned photography
JP2018129063A (en) * 2018-03-14 2018-08-16 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method for controlling unmanned aircraft, unmanned aircraft, and system for controlling unmanned aircraft
CN110109482A (en) * 2019-06-14 2019-08-09 上海应用技术大学 Target Tracking System based on SSD neural network

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113301248A (en) * 2021-04-13 2021-08-24 中科创达软件股份有限公司 Shooting method, shooting device, electronic equipment and computer storage medium
CN113301248B (en) * 2021-04-13 2022-09-06 中科创达软件股份有限公司 Shooting method and device, electronic equipment and computer storage medium
WO2023123769A1 (en) * 2021-12-29 2023-07-06 国家电投集团贵州金元威宁能源股份有限公司 Control method and control apparatus for implementing target tracking for unmanned aerial vehicle
WO2024051330A1 (en) * 2022-09-07 2024-03-14 华为技术有限公司 Camera control method and related apparatus

Also Published As

Publication number Publication date
WO2021223171A1 (en) 2021-11-11

Similar Documents

Publication Publication Date Title
CN112639652A (en) Target tracking method and device, movable platform and imaging platform
US10928838B2 (en) Method and device of determining position of target, tracking device and tracking system
US11073389B2 (en) Hover control
US20210141378A1 (en) Imaging method and device, and unmanned aerial vehicle
US9924104B2 (en) Background-differential extraction device and background-differential extraction method
EP2791868B1 (en) System and method for processing multi-camera array images
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
WO2018214090A1 (en) Control method and apparatus, and pan-tilt
US20220033076A1 (en) System and method for tracking targets
CN113794840B (en) Video processing method, video processing equipment, unmanned aerial vehicle and video processing system
US20170242432A1 (en) Image processing for gesture-based control of an unmanned aerial vehicle
WO2019104641A1 (en) Unmanned aerial vehicle, control method therefor and recording medium
US20060017816A1 (en) General line of sight stabilization system
EP3742248A1 (en) Controlling a group of drones for image capture
Lauterbach et al. The Eins3D project—Instantaneous UAV-based 3D mapping for Search and Rescue applications
EP3380892B1 (en) Aerial photography camera system
EP2673591A1 (en) Image capturing
Karakostas et al. UAV cinematography constraints imposed by visual target tracking
CN110337668B (en) Image stability augmentation method and device
JP2023505987A (en) Calibration of camera on unmanned aerial vehicle using human joint
CN113950610A (en) Device control method, device and computer readable storage medium
Fragoso et al. Dynamically feasible motion planning for micro air vehicles using an egocylinder
GB2481027A (en) Image stabilising apparatus and method
WO2020237478A1 (en) Flight planning method and related device
WO2019205103A1 (en) Pan-tilt orientation correction method, pan-tilt orientation correction apparatus, pan-tilt, pan-tilt system, and unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination