WO2021223171A1 - Target tracking method and device, movable platform, and imaging platform - Google Patents

Target tracking method and device, movable platform, and imaging platform

Info

Publication number
WO2021223171A1
Authority
WO
WIPO (PCT)
Prior art keywords
angle
attitude
imaging device
target
angular velocity
Prior art date
Application number
PCT/CN2020/089009
Other languages
English (en)
French (fr)
Inventor
陆泽早
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/089009
Priority to CN202080004853.6A (published as CN112639652A)
Publication of WO2021223171A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808: Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present disclosure relates to the field of target tracking, and in particular to a target tracking method and device, a movable platform and an imaging platform.
  • Movable platforms such as drones can be used to perform tracking of targets.
  • a movable platform usually includes a movable carrier and an imaging device.
  • the imaging device is usually mounted on the movable carrier through a carrier (such as a gimbal), and the carrier can rotate the imaging device relative to the movable carrier.
  • the user can select the target in the image of the imaging device through the remote control, and specify the desired position of the target in the screen.
  • the movable platform controls the movable carrier and the carrier to rotate to adjust the posture of the imaging device, so that the target moves to a desired position in the image of the imaging device.
  • the embodiments of the present disclosure provide a target tracking method, including:
  • the imaging device is controlled to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.
  • the embodiment of the present disclosure also provides a target tracking device, including:
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations:
  • the imaging device is controlled to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.
  • the embodiments of the present disclosure also provide a computer-readable storage medium, which stores executable instructions.
  • when the executable instructions are executed by one or more processors, the one or more processors perform the foregoing target tracking method.
  • the embodiment of the present disclosure also provides a movable platform, including: a movable carrier, an imaging device, and a carrier;
  • the imaging device is installed on the movable carrier, or installed on the movable carrier through the carrier;
  • the movable carrier includes: the above-mentioned target tracking device;
  • the target tracking device can control the rotation of the carrier and/or the movable carrier to adjust the attitude angle of the imaging device.
  • the embodiment of the present disclosure also provides an imaging platform, including: a carrier and an imaging device; the carrier includes: the above-mentioned target tracking device;
  • the target tracking device can control the rotation of the carrier to adjust the attitude angle of the imaging device.
  • Fig. 1 is a flowchart of a target tracking method according to an embodiment of the disclosure.
  • Fig. 2 is a schematic structural diagram of a movable platform according to an embodiment of the disclosure.
  • Fig. 3 shows a screen of the imaging device of the embodiment of the present disclosure.
  • Fig. 4 shows another screen of the imaging device of the embodiment of the present disclosure.
  • Figure 5 is a flowchart for determining the attitude angular velocity corresponding to the attitude angular deviation.
  • Figure 6 is a flow chart for determining at least one control parameter.
  • Fig. 7 is a flowchart for determining the angle of view of the target.
  • FIG. 8 shows another screen of the imaging device of the embodiment of the present disclosure.
  • Fig. 9 is a flowchart for determining the attitude angle of the connection line between the target and the imaging device.
  • Figure 10 shows the input and output curves of the attitude angle controller.
  • Fig. 11 is a flowchart for determining the attitude angular velocity corresponding to the attitude angular deviation.
  • Fig. 12 is a flowchart for determining the desired attitude angle of the imaging device.
  • Fig. 13a, Fig. 13b and Fig. 13c respectively show three scenarios for calculating the desired attitude angle in an analytical manner.
  • Figure 14 shows a top view of the imaging device.
  • FIG. 15 is a flowchart of a target tracking method according to another embodiment of the present disclosure.
  • Fig. 16 is a flowchart for determining the desired attitude angle of the imaging device.
  • Figure 17a and Figure 17b respectively show two scenarios for calculating the desired attitude angle in an iterative manner.
  • FIG. 18 is a flowchart of a target tracking method according to another embodiment of the present disclosure.
  • FIG. 19 is another flowchart of the target tracking method according to another embodiment of the present disclosure.
  • Figure 20 is another input and output curve of the attitude angle controller.
  • FIG. 21 is a flowchart of a target tracking method according to another embodiment of the present disclosure.
  • FIG. 22 is a schematic diagram of a target tracking device according to an embodiment of the disclosure.
  • the user can specify a desired position in the image of the imaging device, and adjust the posture of the imaging device to make the target in the image be located at the desired position.
  • the general target tracking method usually uses a controller with fixed parameters to control the attitude angular velocity of the imaging device. This control method often has the following defects:
  • because the attitude angular velocity is positively correlated with the focal length of the imaging device, the attitude angular velocity is too slow when the focal length of the imaging device is small and too fast when the focal length is large.
  • the attitude angular velocity is also positively correlated with the pitch angle of the target relative to the imaging device. Therefore, when the pitch angle of the target relative to the imaging device is small, the heading angular velocity of the imaging device is too slow; when the pitch angle is large, the heading angular velocity is too fast.
  • when the pitch angle of the target relative to the imaging device is large (for example, close to 90 degrees), large changes in the heading angle of the imaging device occur easily, causing the imaging device to oscillate back and forth between different attitudes.
  • when the ambient brightness is low, the exposure time of the imaging device is longer, so an excessively fast attitude angular velocity easily blurs the picture and may even lose the target.
  • when the pitch angle of the target relative to the imaging device is relatively large (for example, close to 90 degrees), the desired attitude angle of the imaging device is prone to singularities, resulting in target tracking failure.
  • the present disclosure provides a target tracking method, a target tracking device, a computer-readable storage medium, a movable platform, and an imaging platform.
  • the present disclosure can make the control of the attitude angle and the attitude angular velocity smoother and more stable, and can avoid target tracking failure caused by singular situations of the attitude angle; it can also keep tracking the target when the target is blocked, when the brightness of the environment where the target is located is low, or when the position of the target relative to the imaging device changes.
  • An embodiment of the present disclosure provides a target tracking method. As shown in FIG. 1, the target tracking method includes:
  • S101: Acquire the current position and current size of the target in the picture of the imaging device;
  • S102: Acquire the desired position of the target in the picture, and determine the attitude angle deviation of the imaging device according to the desired position and the current position;
  • S103: Determine the attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;
  • S104: Control the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.
  • FIG. 2 shows a movable platform 100.
  • the movable platform 100 includes a movable carrier 110, a carrier 140 and an imaging device 130.
  • the imaging device 130 is supported by a movable carrier 110.
  • the imaging device 130 may be directly supported by the movable carrier 110 or may be supported by the movable carrier 110 via the carrier 140.
  • the imaging device 130 may be used to capture the target 160 within the angle of view 170 of the imaging device 130.
  • One or more targets 160 may be within the viewing angle 170 of the imaging device 130.
  • although the movable carrier 110 is depicted as a drone, the present disclosure is not limited to this, and any suitable type of movable carrier can be used, such as, but not limited to, drones, unmanned vehicles, unmanned ships, robots, etc.
  • the movable carrier 110 may include a body 105 and one or more propulsion units 150.
  • the propulsion unit 150 may be configured to generate lift for the movable carrier 110.
  • the propulsion unit 150 may include a rotor.
  • the movable carrier 110 can fly in a three-dimensional space and can rotate along at least one of a pitch axis, a yaw axis, and a roll axis.
  • the body 105 of the movable carrier 110 may include: a flight controller, one or more processors, one or more memories, one or more sensors, and one or more communication units.
  • the movable platform 100 may include one or more imaging devices 130.
  • the imaging device 130 may be a camera or a video camera.
  • the imaging device 130 may be a visible light imaging device, an infrared imaging device, an ultraviolet imaging device, or a thermal imaging device.
  • the imaging device 130 may achieve zooming by adjusting at least one of an optical zoom level and a digital zoom level to adjust the target size in the screen of the imaging device 130.
  • the imaging device 130 may be installed on the carrier 140.
  • the carrier 140 may allow the imaging device 130 to rotate about at least one of a pitch axis, a yaw axis, and a roll axis.
  • the carrier 140 may include a single-axis pan/tilt, a dual-axis pan/tilt, or a three-axis pan/tilt.
  • the movable platform 100 can be controlled by a remote controller 120.
  • the remote controller 120 may communicate with at least one of the movable carrier 110, the carrier 140, and the imaging device 130.
  • the remote controller 120 includes a display.
  • the display is used to display the image of the imaging device 130.
  • the remote controller 120 also includes an input device.
  • the input device can be used to receive user input information.
  • the user's input information may include the position of the target 160 in the screen of the imaging device 130.
  • the position of the target 160 may include the current position and the desired position of the target 160.
  • the following takes the case in which the movable carrier is an unmanned aerial vehicle as an example to describe the target tracking method of this embodiment.
  • the imaging device can image the surrounding environment of the drone, and transmit the image in the screen to the remote control of the drone.
  • the display of the remote control displays the picture of the imaging device to the user.
  • the target in the picture can be selected through the input device of the remote control. For example, the user can tap or box-select a target on the display to select it.
  • the remote controller sends an instruction corresponding to the target selection operation to the drone.
  • the imaging device recognizes the target selected by the user. As shown in FIG. 3, when the target is successfully recognized, an identification frame identifying the outline of the target is displayed on the screen of the imaging device, and the identification frame is used to identify the current position and current size of the target.
  • the imaging device can also automatically select and recognize the target that is desired to be tracked.
  • the user can set a set of target selection rules on the drone through the remote control.
  • the imaging device automatically selects the target to be tracked according to the target selection rules.
  • the target selection rule can be set according to the following parameters: target category, target size, target color, target distance, etc.
  • the target category may include people, vehicles, animals, and so on. When a person, vehicle, or animal appears in the picture of the imaging device, it can be automatically selected as the tracking target and recognized. When the target is successfully identified, it is marked by the identification frame.
  • the size of the target in the image of the imaging device can be determined by the following factors: the focal length of the imaging device and the target distance. Those skilled in the art can understand that when the focal length of the imaging device is small or the target is far away from the imaging device, the size of the target in the picture is smaller; conversely, the size of the target in the picture is larger.
  • the identification frame may include any shape of the frame, for example, a rectangular frame, a square frame, a circular frame, etc., and the specific shape thereof is not limited in this embodiment.
  • at this time, the target may not be at the position desired by the user in the picture.
  • the user can adjust the position of the target on the screen, and specify the desired position of the target on the screen through the remote control to move the target to the desired position.
  • the user can click on the display of the remote control, and the clicked position can be used as the desired position of the target.
  • the remote controller sends an instruction corresponding to the desired location tapping operation to the drone.
  • the drone determines the attitude angle deviation of the imaging device according to the expected position and the current position of the target.
  • the attitude angle deviation of the imaging device refers to the attitude angle deviation of the imaging device in the world coordinate system.
  • the world coordinate system can be one or more of the inertial coordinate system, the earth coordinate system, and the geographic coordinate system.
  • the attitude angular velocity corresponding to the attitude angle deviation is determined from the current position and the current size through S103.
  • the attitude angular velocity is related to the current position and current size of the target. Different target current positions or current sizes correspond to different attitude angular velocities.
  • the target may move momentarily or intermittently, causing the current position or current size of the target in the image of the imaging device to change. At this time, the attitude angular velocity corresponding to the attitude angle deviation also changes.
  • determining the attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size, as shown in FIG. 5, includes:
  • S501: Determine at least one control parameter according to the current position and the current size;
  • S502: Obtain the attitude angular velocity according to the attitude angle deviation and at least one of the control parameters.
  • the control parameter is determined according to the current position and the current size.
  • the control parameter may be a control parameter of a controller for controlling the imaging device.
  • the controller may be an attitude angle controller.
  • the attitude angle includes one or more of a heading angle, a pitch angle, and a roll angle.
  • the attitude angle may include a heading angle and a pitch angle.
  • the attitude angle controller of the imaging device may include various types of heading angle and pitch angle controllers, such as feedback controllers, predictive controllers, and so on.
  • for example, the attitude angle controller of the imaging device may include, for the heading angle, a proportional controller (P), a proportional integral controller (PI), or a proportional integral derivative controller (PID), and likewise, for the pitch angle, a proportional controller (P), a proportional integral controller (PI), or a proportional integral derivative controller (PID), etc.
  • the attitude angle refers to both the heading angle and the pitch angle.
  • determining at least one control parameter according to the current position and the current size, as shown in FIG. 6, includes:
  • S601: Determine the field of view angle of the target according to the current position and the current size;
  • S602: Determine the attitude angle of the connection line between the target and the imaging device according to the current position;
  • S603: Obtain at least one control parameter according to at least one of the field of view angle and the attitude angle.
  • the field of view angle of the target refers to the range of the attitude angle of the target relative to the imaging device. Determining the field of view angle of the target according to the current position and the current size, as shown in FIG. 7, includes:
  • S701: Obtain the attitude angle of the identification frame relative to the imaging device, where the identification frame is used to identify the current position and the current size;
  • S702: Obtain the field of view angle according to the attitude angle of the identification frame relative to the imaging device.
  • the current angle of view can be determined according to the focal length of the imaging device, as shown in the following formula:

    fov_zx = 2 · arctan(W / (2 · focal_length))
    fov_zy = 2 · arctan(H / (2 · focal_length))

  • where fov_zx represents the field angle in the row direction of the picture, fov_zy represents the field angle in the column direction of the picture, focal_length represents the current focal length of the imaging device, and W and H represent the width and height of the image sensor of the imaging device, respectively.
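As a minimal illustration of the formula above, the following Python sketch evaluates the pinhole field-of-view relation; the function name and the example sensor dimensions are illustrative assumptions, not values from the patent, and all lengths are assumed to share one unit.

```python
import math

def field_of_view(focal_length: float, sensor_w: float, sensor_h: float):
    """Field angles (radians) in the row and column directions of the
    picture, from the current focal length and the image sensor size.
    All three lengths must be in the same unit (e.g., millimeters)."""
    fov_zx = 2.0 * math.atan(sensor_w / (2.0 * focal_length))  # row direction
    fov_zy = 2.0 * math.atan(sensor_h / (2.0 * focal_length))  # column direction
    return fov_zx, fov_zy

# Example: a small 6.17 mm x 4.55 mm sensor at a 24 mm focal length.
fov_zx, fov_zy = field_of_view(24.0, 6.17, 4.55)
print(math.degrees(fov_zx), math.degrees(fov_zy))
```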
  • the identification frame may be a rectangular frame.
  • the posture angle of the rectangular frame relative to the imaging device includes the posture angle of the side edge in the row direction of the screen and the posture angle of the side edge in the column direction of the screen relative to the imaging device.
  • the attitude angle of the sides in the row direction of the picture relative to the imaging device includes: the heading angles of the left side and the right side relative to the imaging device; the attitude angle of the sides in the column direction of the picture relative to the imaging device includes: the pitch angles of the upper side and the lower side relative to the imaging device.
  • determining the attitude angle of the connection line between the target and the imaging device according to the current position, as shown in FIG. 9, includes:
  • S901: Acquire the pitch angle of the identification frame relative to the imaging device;
  • S902: Determine the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device;
  • S903: Acquire the current pitch angle of the imaging device;
  • S904: Obtain the pitch angle of the connection line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.
  • for the process of obtaining the pitch angle of the identification frame relative to the imaging device, refer to the process of obtaining the attitude angle of the identification frame relative to the imaging device in S701.
  • in S902, the median of the pitch angles of the upper and lower sides relative to the imaging device is taken as the pitch angle of the target relative to the imaging device.
  • the current pitch angle of the imaging device refers to the pitch angle of the imaging device in the world coordinate system.
  • UAVs are usually equipped with attitude sensors, such as inertial sensors; carriers such as gimbals are usually equipped with angle sensors.
  • the pitch angle of the imaging device in the world coordinate system can be obtained through the attitude sensor and the angle sensor.
  • the connection line between the target and the imaging device may be the line between the optical center of the imaging device and the center of the target, that is, the line between the optical center of the imaging device and the center of the identification frame.
  • the pitch angle of the line between the target and the imaging device refers to the pitch angle in the world coordinate system.
  • the pitch angle of the target relative to the imaging device is the pitch angle of the target in the imaging device coordinate system, and the current pitch angle of the imaging device refers to the current pitch angle of the imaging device in the world coordinate system. Therefore, by adding the pitch angle of the target relative to the imaging device to the current pitch angle of the imaging device, the pitch angle of the connection line between the target and the imaging device can be obtained.
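The composition described in S901-S904 can be sketched as below. This is a hedged illustration assuming all angles are in degrees; per S902 the target's pitch relative to the imaging device is taken as the midpoint of the pitch angles of the identification frame's upper and lower sides, and per S904 it is added to the camera's current world-frame pitch. Names are illustrative.

```python
def line_pitch_world(top_edge_pitch: float, bottom_edge_pitch: float,
                     camera_pitch: float) -> float:
    """Pitch angle (degrees, world frame) of the line between the target
    and the imaging device. top_edge_pitch / bottom_edge_pitch: pitch of
    the identification frame's upper/lower sides relative to the imaging
    device; camera_pitch: the device's current world-frame pitch."""
    target_pitch_rel = 0.5 * (top_edge_pitch + bottom_edge_pitch)  # S902
    return target_pitch_rel + camera_pitch                         # S904
```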
  • the attitude angle controller of the imaging device can include various types of controllers.
  • when a proportional controller is used, its input and output curves are shown in Figure 10, where the horizontal axis represents the attitude angle deviation and the vertical axis represents the attitude angular velocity.
  • the control parameters in S502 include: the gain attenuation threshold in FIG. 10; the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.
  • when the gain attenuation threshold is the gain attenuation threshold for heading angle control, it is positively correlated with both the field of view angle and the attitude angle, where the field of view angle includes: the heading angle range of the target relative to the imaging device, and the attitude angle includes: the pitch angle. That is, for heading angle control, the gain attenuation threshold is determined according to the field of view angle and the attitude angle.
  • when the gain attenuation threshold is the gain attenuation threshold for pitch angle control, the field of view angle includes: the pitch angle range of the target relative to the imaging device. That is, for pitch angle control, the gain attenuation threshold is determined according to the field of view angle.
  • for the gain attenuation threshold of the heading angle controller, the heading angle range of the target relative to the imaging device is first multiplied by the first proportional coefficient; then, the product is adjusted according to the pitch angle of the connection line between the target and the imaging device to obtain the gain attenuation threshold of the heading angle controller.
  • the first proportional coefficient can be determined according to the recognition situation of the target. When the identification frame in the picture is stable, the value of the first proportional coefficient can be appropriately increased; when the identification frame in the picture is unstable, the value can be appropriately decreased. In some examples, the first proportional coefficient may be 1 or greater.
  • adjusting the product according to the pitch angle of the connection line between the target and the imaging device may include: dividing the product by the trigonometric function value of the pitch angle of the connection line between the target and the imaging device.
  • for the gain attenuation threshold of the pitch angle controller, the pitch angle range of the target relative to the imaging device is multiplied by the second proportional coefficient to obtain the gain attenuation threshold of the pitch angle controller.
  • the second scale factor can also be determined according to the recognition situation of the target. In some examples, the second scale factor may be 1 or greater.
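A hedged sketch of the two gain attenuation thresholds just described. The description says only that the heading product is divided by "the trigonometric function value" of the line pitch; the cosine is assumed here because it makes the heading threshold grow as the pitch nears 90 degrees, consistent with the stated positive correlation. Function and parameter names are illustrative.

```python
import math

def gain_attenuation_thresholds(yaw_range: float, pitch_range: float,
                                line_pitch: float,
                                k1: float = 1.0, k2: float = 1.0):
    """Gain attenuation thresholds for the heading and pitch angle
    controllers. yaw_range / pitch_range: the target's heading / pitch
    angle range relative to the imaging device (radians); line_pitch:
    pitch of the target-to-device line (radians); k1 / k2: the first and
    second proportional coefficients (1 or greater per the examples)."""
    # Assumed adjustment: divide by cos(line_pitch), guarded near 90 deg.
    yaw_threshold = (yaw_range * k1) / max(math.cos(line_pitch), 1e-6)
    pitch_threshold = pitch_range * k2
    return yaw_threshold, pitch_threshold
```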
  • the control parameters in S502 also include: the dead zone threshold in FIG. 10. Determining at least one control parameter according to at least one of the field of view angle and the attitude angle includes: determining the dead zone threshold according to the attitude angle of the connection line between the target and the imaging device.
  • the attitude angle of the line between the target and the imaging device refers to the pitch angle of the line between the target and the imaging device.
  • for the dead zone threshold for heading angle control: when the absolute value of the pitch angle is greater than a threshold, the dead zone threshold is positively correlated with the pitch angle; when the absolute value of the pitch angle is less than or equal to the threshold, the dead zone threshold is set to a preset value.
  • controlling the dead zone threshold to be positively correlated with the pitch angle includes:
  • the threshold is subtracted from the absolute value of the pitch angle of the connection line between the target and the imaging device; the difference is multiplied by the third proportional coefficient; and the product is then adjusted according to the pitch angle of the connection line between the target and the imaging device to obtain the dead zone threshold of the heading angle controller.
  • the above threshold and the third proportional coefficient can be determined according to actual needs.
  • the threshold may be 70 degrees to 85 degrees; for example, the threshold may be 80 degrees.
  • the third scale factor may be 0.1-0.9, for example, the third scale factor may be 0.6.
  • adjusting the product of the difference and the third proportional coefficient may include: dividing the product by the trigonometric function value of the pitch angle of the connection line between the target and the imaging device.
  • when the absolute value of the pitch angle is less than or equal to the threshold, the dead zone threshold of the heading angle controller is directly set to the first preset value.
  • the first preset value can also be determined according to actual needs. In some examples, the first preset value may include zero.
  • for the dead zone threshold for pitch angle control, the dead zone threshold can be directly set to the second preset value.
  • the second preset value may also be determined according to actual requirements, and the second preset value may be the same as or different from the first preset value. In some examples, the second preset value may include zero.
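A minimal sketch of the heading-angle dead zone rule above, using the example values quoted (an 80-degree threshold, a third proportional coefficient of 0.6, a zero preset) and again assuming the adjusting trigonometric function is the cosine.

```python
import math

def heading_dead_zone(line_pitch: float, threshold: float = 80.0,
                      k3: float = 0.6, preset: float = 0.0) -> float:
    """Dead zone threshold (degrees) of the heading angle controller.
    line_pitch: pitch of the target-to-device line (degrees). Above the
    threshold the dead zone grows with the pitch; otherwise the preset
    (first preset value, e.g. zero) is used."""
    a = abs(line_pitch)
    if a > threshold:
        return (a - threshold) * k3 / max(math.cos(math.radians(a)), 1e-6)
    return preset
```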
  • S502 obtains the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.
  • the attitude angle deviation may be input to an attitude angle controller having at least one control parameter, and the attitude angle controller outputs the attitude angular velocity corresponding to the attitude angle deviation.
  • for the heading angle, the heading angle deviation can be input into the heading angle controller with the gain attenuation threshold and the dead zone threshold to obtain the heading angular velocity.
  • for the pitch angle, the pitch angle deviation can be input into the pitch angle controller with the gain attenuation threshold and the dead zone threshold to obtain the pitch angular velocity.
  • when the attitude angle deviation includes both the heading angle deviation and the pitch angle deviation, the heading angle deviation and the pitch angle deviation can be input into the above-mentioned heading angle controller and pitch angle controller respectively to obtain the heading angular velocity and the pitch angular velocity.
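Putting the two control parameters together, one axis of the Figure 10 style proportional controller might look like the sketch below. The exact shape of the curve beyond the gain attenuation threshold is not spelled out; a hard saturation is assumed here, as is the offsetting of the proportional segment by the dead zone.

```python
def attitude_rate(deviation: float, kp: float,
                  dead_zone: float, atten_threshold: float) -> float:
    """Attitude angular velocity for one axis from the attitude angle
    deviation: zero inside the dead zone, proportional (gain kp) between
    the dead zone and the gain attenuation threshold, and saturated
    (assumed) beyond it. Requires atten_threshold > dead_zone."""
    mag = abs(deviation)
    if mag <= dead_zone:
        return 0.0
    sign = 1.0 if deviation > 0.0 else -1.0
    effective = min(mag, atten_threshold) - dead_zone
    return sign * kp * effective
```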
  • through S104, the imaging device is controlled to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture, and the composition of the target is realized.
  • when the imaging device is installed on the UAV via a carrier, the carrier can be controlled to rotate, or the UAV can be controlled to rotate, or both the carrier and the UAV can be controlled to rotate, so that the imaging device rotates by the heading angle deviation at the heading angular velocity and by the pitch angle deviation at the pitch angular velocity.
  • when the imaging device is installed directly on the UAV, the rotation of the UAV can be controlled to make the imaging device rotate by the heading angle deviation at the heading angular velocity and by the pitch angle deviation at the pitch angular velocity.
  • the target tracking method of this embodiment can dynamically adjust control parameters, including the gain attenuation threshold and the dead zone threshold, according to the current position and current size of the target, and can adapt to targets of different distances, sizes, and directions.
  • compared with the prior art, in which the control parameters are set to fixed values, the control of the attitude angle and the attitude angular velocity is smoother and more stable, avoiding the lack of smoothness and stability in attitude angle and attitude angular velocity control that the prior art suffers when the focal length of the imaging device changes, the pitch angle of the target relative to the imaging device is large, or the target size changes.
  • the following describes the process of determining the attitude angular velocity corresponding to the attitude angular deviation according to the current position and current size of the target.
  • determining the attitude angular velocity corresponding to the attitude angular deviation includes:
  • S1101: Determine the desired attitude angle of the imaging device according to the desired position and the current position;
  • S1102: Acquire the current attitude angle of the imaging device;
  • S1103: Determine the attitude angle deviation according to the desired attitude angle and the current attitude angle.
  • the current attitude angle of the imaging device can be obtained by referring to the method of S903.
  • the current attitude angle includes the current heading angle and/or the current pitch angle, which respectively refer to the heading angle and the pitch angle of the imaging device in the world coordinate system.
  • the attitude angle of the imaging device in the world coordinate system can be obtained through the attitude sensor and angle sensor of the UAV.
  • the attitude angle deviation can be obtained by subtracting the current attitude angle from the desired attitude angle.
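A small note on "subtracting the two": heading angles wrap around, so the deviation is usually normalized into (-180, 180] degrees so that the imaging device turns the short way. The wrapping step below is a common convention, assumed rather than stated in this disclosure.

```python
def attitude_deviation(desired: float, current: float) -> float:
    """Attitude angle deviation (degrees) = desired minus current,
    wrapped into (-180, 180] so the rotation takes the short way."""
    d = (desired - current) % 360.0
    return d - 360.0 if d > 180.0 else d
```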
  • the following describes the process of determining the desired attitude angle of the imaging device.
  • determining the desired attitude angle of the imaging device, as shown in FIG. 12, includes:
  • S1201: Determine the attitude angle of the connection line between the target and the imaging device according to the current position;
  • S1202: Determine the attitude angle deviation of the target relative to the imaging device according to the desired position;
  • S1203: Determine the desired attitude angle according to the attitude angle of the connection line and the attitude angle deviation of the target relative to the imaging device.
  • the attitude angle of the connection line between the target and the imaging device may include: the heading angle and/or the pitch angle of the connection line; the desired attitude angle may include: the desired heading angle and/or the desired pitch angle; and the attitude angle deviation may include: the heading angle deviation and/or the pitch angle deviation.
  • the heading angle and the pitch angle of the connection line between the target and the imaging device can be determined with reference to the above S602.
  • the desired position in S1202 refers to the desired position designated by the user through the remote control.
  • the desired heading angle and desired pitch angle can be calculated analytically or iteratively.
  • the analytical method calculates the desired attitude angle by solving an equation.
  • when the pitch angle of the target relative to the imaging device is large, for example, close to 90 degrees, the solution of the equation may be singular. Singular situations include: no valid solution, multiple valid solutions, a unique valid solution, infinitely many solutions, etc.
  • in these cases, the desired attitude angle is determined by analyzing the singular situation.
  • the pitch angle of the target relative to the imaging device is close to 90 degrees
  • the dashed arrow 1 indicates the pitch angle of the connection line between the target and the imaging device
  • the desired position specified by the user is that the target is located at the center of the picture of the imaging device.
  • the desired attitude angle deviation is calculated analytically, and two solutions are obtained: the solid arrow between the dashed arrow 1 and the dashed arrow 2 indicates the first pitch angle deviation of the target relative to the imaging device, and the solid arrow between the dashed arrow 1 and the dashed arrow 3 indicates the second pitch angle deviation of the target relative to the imaging device.
  • since the heading angle of the imaging device corresponding to the first pitch angle deviation differs from the desired heading angle by 180 degrees, while the heading angle corresponding to the second pitch angle deviation equals the desired heading angle, the first pitch angle deviation is an invalid solution and the second pitch angle deviation is the unique valid solution; the second pitch angle deviation is therefore taken as the pitch angle deviation obtained in S1202.
  • the pitch angle of the target relative to the imaging device is close to 90 degrees.
  • the dashed arrow 1 indicates the pitch angle of the connection between the target and the imaging device.
  • the desired position specified by the user is that the target is located in the column direction of the picture of the imaging device.
  • the desired attitude angle deviation is again calculated analytically, and two solutions are obtained: the solid arrow between the dashed arrow 1 and the dashed arrow 2 indicates the first pitch angle deviation of the target relative to the imaging device, and the solid arrow between the dashed arrow 1 and the dashed arrow 3 indicates the second pitch angle deviation.
  • in this scenario, both solutions are invalid.
  • in this case, a valid solution can be re-determined by gradually reducing the pitch angle deviation. The valid solution is the third pitch angle deviation of the target relative to the imaging device, indicated by the solid arrow between the dashed arrow 1 and the dashed arrow 4; the third pitch angle deviation is taken as the pitch angle deviation obtained in S1202.
  • the pitch angle of the target relative to the imaging device is close to 90 degrees
  • the desired position specified by the user is that the target is located on the left in the row direction of the picture of the imaging device.
  • the desired heading angle deviation obtained by calculating the desired attitude angle analytically is the first heading angle deviation between the dashed frame 1 and the center of the picture. However, since the sum of the first heading angle deviation and the pitch angle of the target relative to the imaging device is greater than 90 degrees, the first heading angle deviation is an invalid solution. In this case, a valid solution can be re-determined by gradually reducing the heading angle deviation.
  • the valid solution is the second heading angle deviation between the dashed frame 2 and the center of the picture; the second heading angle deviation is taken as the heading angle deviation obtained in S1202.
  • the target tracking method of this embodiment guarantees that when the target is located in any direction of the imaging device, the correct attitude angle of the imaging device can be obtained through singularity analysis, so that the target is located at, or as close as possible to, the desired position specified by the user. Especially when the pitch angle of the target relative to the imaging device is large, the above effect can still be achieved, avoiding target tracking failure due to singular situations of the desired attitude angle of the imaging device.
  • the prior art target tracking method usually directly converts the difference between the current position of the target and the desired position into the attitude angle deviation of the imaging device. There is a large error between the attitude angle deviation obtained in this way and the actual attitude angle deviation.
  • in the figure, the identification frame indicates the current position of the target, and the center of the picture is the desired position of the target.
  • in this example, the heading angle deviation obtained by the prior art is smaller than the actual heading angle deviation, and the pitch angle deviation is greater than the actual pitch angle deviation. Therefore, with the prior art target tracking method, the final position of the target inevitably deviates from the desired position.
  • the desired attitude angle of the imaging device is first determined according to the desired position and current position of the target, and then the attitude angle deviation is determined according to the desired attitude angle and the current attitude angle of the imaging device.
  • the heading angle deviation and the pitch angle deviation obtained in this embodiment are equal to the actual heading angle deviation and pitch angle deviation, without error.
  • therefore, the obtained attitude angle deviation is more accurate, the target can be moved to the desired position more precisely, and the accuracy of target tracking and the composition effect are improved.
  • the target tracking method of this embodiment is described above in conjunction with the proportional controller in FIG. 10.
  • when the controller of this embodiment adopts another type of controller, such as a proportional integral controller or a proportional integral derivative controller, the above technical effects can also be achieved.
  • the sequence numbers of all the steps described in this embodiment are only for convenience of description; they are only used to refer to a certain step and do not limit the order of the steps.
  • the steps S101-S104 to S1201-S1203 described in this embodiment can be executed in any order. For example, two steps that do not depend on each other can be executed sequentially or in parallel, and the order of sequential execution is not limited.
  • Another embodiment of the present disclosure provides a target tracking method.
  • the same or similar features of this embodiment and the previous embodiment will not be repeated, and only the content that is different from the previous embodiment will be described below.
  • the target tracking method of this embodiment further includes:
  • when the imaging device tracks the target, the target may be blocked by other objects.
  • for example, when the target is a vehicle moving on a road, the vehicle in the picture may be blocked by road signs, street lights, and other objects on both sides of the road. Since this type of target moves relative to the imaging device, by the time the obstructing object moves out of the picture of the imaging device, the target may already be outside the picture.
  • the target tracking method of this embodiment can retrieve the target located outside the screen, make the target reappear in the screen, and continue to track the target.
  • this embodiment may first determine the desired attitude angle of the imaging device and the current attitude angle of the imaging device, and then determine another attitude angle deviation according to the desired attitude angle and the current attitude angle.
  • Determining the desired attitude angle of the imaging device includes:
  • S1601: Acquire the position and movement speed of the target relative to the imaging device at the last moment it was in the picture, and the duration for which the target has been outside the picture;
  • S1602: Determine the predicted position of the target according to the position, the movement speed, and the duration;
  • S1603: Determine the desired attitude angle of the imaging device according to the predicted position.
  • the predicted position includes: the position of the target under uniform motion and/or decelerated motion.
  • drones can include one or more sensors.
  • the sensors may include inertial sensors, satellite positioning modules.
  • the UAV can obtain the position and speed of the UAV in the world coordinate system through the inertial sensor and/or the satellite positioning module, that is, the position and speed of the imaging device.
  • the UAV can also obtain the position and speed of the target in the world coordinate system in a variety of ways. According to the position and speed of the imaging device and the target in the world coordinate system, the position and movement speed of the target relative to the imaging device can be obtained.
  • the predicted position may include: a first predicted position and a second predicted position.
  • the first predicted position is the position the target is assumed to reach by moving uniformly at the movement speed;
  • the second predicted position is the position the target is assumed to reach by decelerating, with the movement speed as the initial speed.
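A sketch of the two predicted positions under the stated assumptions: uniform motion for the first, and uniform deceleration from the movement speed for the second. The deceleration rate is an assumed, tunable parameter; positions and velocities are relative to the imaging device, and all names are illustrative.

```python
import math

def predicted_positions(last_pos, velocity, duration, decel=1.0):
    """last_pos, velocity: the target's position (m) and movement speed
    (m/s) relative to the imaging device at the last moment it was in
    the picture; duration: seconds the target has been outside the
    picture; decel: assumed deceleration (m/s^2) for the second case."""
    # First predicted position: uniform motion at `velocity`.
    first = tuple(p + v * duration for p, v in zip(last_pos, velocity))

    # Second predicted position: decelerate uniformly until stopping.
    speed = math.sqrt(sum(v * v for v in velocity))
    t = min(duration, speed / decel) if decel > 0 else duration
    dist = speed * t - 0.5 * decel * t * t   # distance covered so far
    scale = dist / speed if speed > 0 else 0.0
    second = tuple(p + v * scale for p, v in zip(last_pos, velocity))
    return first, second
```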
  • the desired attitude angle can be determined analytically or iteratively. Exemplarily, in the iterative manner, it is first assumed that the imaging device is at a first desired attitude angle, at which the first predicted position coincides with the desired position.
  • then the imaging device is controlled to change from the first desired attitude angle so that it moves in the direction opposite to the target's motion, until the second predicted position is located in the picture of the imaging device, as shown by the dashed box in FIG. 17b; the first desired attitude angle at that moment is taken as the desired attitude angle.
  • in this way, another attitude angle deviation and its corresponding attitude angular velocity can be determined, and the imaging device can be controlled to rotate by that attitude angle deviation at that attitude angular velocity, so that the target reappears in the picture of the imaging device.
  • this embodiment can retrieve a target that has moved out of the picture of the imaging device, so that the target reappears in the picture; the ability to retrieve the target is stronger, and the attitude angular velocity control of the imaging device is smoother and more stable, avoiding the defects in the prior art that the attitude angle of the imaging device changes greatly, the attitude angular velocity is not smooth, and the target may even be lost.
  • Another embodiment of the present disclosure provides a target tracking method.
  • the same or similar features of this embodiment and the above-mentioned embodiment will not be repeated here, and only the content that is different from the above-mentioned embodiment will be described below.
  • the target tracking method of this embodiment further includes:
  • S1801: The imaging device recognizes the target in the picture;
  • determining the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target includes: determining at least one control parameter according to the recognition result of the target; and obtaining the attitude angular velocity according to the attitude angle deviation and at least one of the control parameters.
  • in this embodiment, at least one of the control parameters includes: a gain coefficient for attitude angle control, and the gain coefficient has a value range.
  • initially, the gain coefficient is set to the minimum value of the value range.
  • when the target is located in the picture and is recognized, the gain coefficient can be gradually increased from the minimum value until the maximum value of the value range is reached. If, during the increase of the gain coefficient or after the gain coefficient reaches the maximum value, the target is located in the picture but the imaging device does not recognize it, or the target moves out of the picture, the gain coefficient is decreased until the minimum value is reached.
  • the above process of increasing and decreasing the gain coefficient is directed to one or both of the heading angle controller and the pitch angle controller.
  • the value range of the gain coefficient can be determined according to the effect of target tracking. In some examples, the value range may be 0.1-1. During the increase and decrease of the gain coefficient, the change speed of the gain coefficient can also be determined according to the effect of target tracking. In some examples, the change speed may be 0.1-0.5 per second; for example, it may be 0.15 per second.
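The ramp logic above reduces to a single clamped update per control step; the sketch below uses the example range (0.1 to 1) and change speed (0.15 per second) quoted in this embodiment. The linear ramp shape and the names are assumptions.

```python
def step_gain(gain: float, target_recognized: bool, dt: float,
              lo: float = 0.1, hi: float = 1.0, rate: float = 0.15) -> float:
    """One update of the attitude-control gain coefficient: ramp up while
    the target is in the picture and recognized, ramp down when it is not
    recognized or has left the picture. dt: control period in seconds."""
    delta = rate * dt if target_recognized else -rate * dt
    return min(hi, max(lo, gain + delta))
```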
  • the gain coefficient is equivalent to an extra proportional coefficient of the attitude angle controller. Applying the gain coefficient to the attitude angle controller is equivalent to scaling the controller's input and output curve as a whole. For example, after the proportional controller shown in Figure 10 is multiplied by the gain coefficient, its input and output curve is enlarged or reduced as a whole.
  • the target tracking method of this embodiment further includes: determining the upper limit value of the attitude angular velocity; and when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate by the attitude angle deviation at the upper limit value.
  • the upper limit value is determined according to at least one of the following speed thresholds:
  • the first speed threshold: the speed threshold of the imaging device on the attitude axis;
  • the second speed threshold: determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device;
  • the third speed threshold: determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.
  • the attitude axis may include: a yaw axis and a pitch axis.
  • the first speed threshold refers to the maximum angular velocity of the imaging device on the yaw axis and the pitch axis.
  • when the imaging device is mounted on the UAV via a carrier, the maximum angular velocity is usually limited by the attitude angular velocity ranges of both the carrier and the UAV.
  • when the imaging device is mounted directly on the UAV, the maximum angular velocity is usually limited by the attitude angular velocity range of the UAV.
  • the imaging device can automatically set the exposure time according to the ambient brightness, or the exposure time can also be set by the user.
  • the user can set the exposure time through the remote control, and the remote control sends the exposure time to the drone, and the drone sets the exposure time of the imaging device to the exposure time set by the user.
  • for the second speed threshold, the attitude angle range of the target relative to the imaging device is divided by the exposure time and multiplied by the fourth scale factor. The fourth scale factor can be determined according to the attitude angle control effect of the imaging device. In some examples, the fourth scale factor may be 0.01-1; for example, it may be 0.1. Determining the upper limit value of the attitude angular velocity according to the second speed threshold prevents an excessively fast attitude angular velocity from causing motion blur in the picture of the imaging device and target recognition failure.
  • for the third speed threshold, the attitude angle range of the target relative to the imaging device is divided by the recognition period and multiplied by the fifth scale factor.
  • the target recognition cycle refers to the time required for the imaging device to recognize a target.
  • the fifth scale factor can be determined according to the attitude angle control effect of the imaging device. In some examples, the fifth scale factor may be 0.01-1, for example, the fifth scale factor may be 0.1. Determining the upper limit value of the attitude angular velocity according to the third velocity threshold can prevent the excessively fast attitude angular velocity of the imaging device from causing the target position in the picture to change too quickly, causing the target recognition to fail.
  • the minimum value of the first speed threshold, the second speed threshold, and the third speed threshold may be used as the upper limit value of the attitude angular velocity.
  • This embodiment is not limited to this, and any one of the speed thresholds or the average of the three speed thresholds may be used as the upper limit of the attitude angular velocity.
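A minimal sketch combining the three speed thresholds, taking their minimum as the upper limit (the preferred form above). The second threshold's formula is inferred by analogy with the third (angle range divided by exposure time, times the fourth scale factor), and both scale factors use the 0.1 example values; names are illustrative.

```python
def rate_upper_limit(axis_max_rate: float, angle_range: float,
                     exposure_time: float, recognition_period: float,
                     k4: float = 0.1, k5: float = 0.1) -> float:
    """Upper limit of the attitude angular velocity on one axis.
    axis_max_rate: first threshold (rad/s); angle_range: the target's
    attitude angle range relative to the imaging device (rad);
    exposure_time / recognition_period in seconds."""
    second = angle_range / exposure_time * k4       # anti-motion-blur
    third = angle_range / recognition_period * k5   # anti-recognition-loss
    return min(axis_max_rate, second, third)
```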
  • the attitude axis may include: the heading axis, in which case the attitude angle range includes: the heading angle range and the attitude angular velocity includes: the heading angular velocity; and/or the attitude axis may include: the pitch axis, in which case the attitude angle range includes: the pitch angle range and the attitude angular velocity includes: the pitch angular velocity.
  • Another embodiment of the present disclosure provides a target tracking method.
  • the same or similar features of this embodiment and the above-mentioned embodiment will not be repeated here, and only the content that is different from the above-mentioned embodiment will be described below.
  • the target tracking method of this embodiment further includes:
  • S2101: Acquire the distance and movement speed of the target relative to the imaging device;
  • S2102: Determine the attitude angle of the connection line between the target and the imaging device according to the current position;
  • S2103: Determine the feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.
  • drones can include one or more sensors.
  • the sensors may include inertial sensors, satellite positioning modules.
  • the UAV can obtain the position and speed of the UAV in the world coordinate system through the inertial sensor and/or the satellite positioning module, that is, the position and speed of the imaging device.
  • the UAV can also obtain the position and speed of the target in the world coordinate system in a variety of ways. According to the position and speed of the imaging device and the target in the world coordinate system, the distance and movement speed of the target relative to the imaging device can be obtained.
  • the feedforward value of the attitude angular velocity can be determined according to the distance, movement speed and pitch angle.
  • exemplarily, the speed of the target along each attitude axis of the imaging device can be obtained from the movement speed; the feedforward value is then obtained from that speed, the distance, and the pitch angle, and the feedforward value is superimposed on the attitude angular velocity determined in S103.
  • the movement speed is first decomposed into a first speed along the yaw axis of the imaging device, a second speed along the pitch axis of the imaging device, and a third speed along the roll axis of the imaging device.
  • the first speed is converted into the heading angular velocity of the target relative to the imaging device according to the distance of the target relative to the imaging device, and the heading angular velocity feedforward is then obtained according to the pitch angle of the connection line between the target and the imaging device.
  • similarly, the second speed is converted into the pitch angular velocity of the target relative to the imaging device according to the distance of the target relative to the imaging device, and this pitch angular velocity is used as the pitch angular velocity feedforward of the pitch angle controller.
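A hedged sketch of the two feedforward terms: the small-angle conversion v/distance, and the division of the heading term by the cosine of the line pitch, are assumptions consistent with, but not spelled out in, the description above.

```python
import math

def rate_feedforward(v_yaw_axis: float, v_pitch_axis: float,
                     distance: float, line_pitch: float):
    """Feedforward angular velocities (rad/s). v_yaw_axis / v_pitch_axis:
    target speed components along the imaging device's yaw and pitch
    axes (m/s); distance: target range (m); line_pitch: pitch of the
    target-to-device line (radians)."""
    yaw_ff = (v_yaw_axis / distance) / max(math.cos(line_pitch), 1e-6)
    pitch_ff = v_pitch_axis / distance
    return yaw_ff, pitch_ff
```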
  • in this embodiment, the attitude angular velocity feedforward is provided to the attitude angle controller according to the distance and movement speed of the target relative to the imaging device.
  • through the attitude angular velocity feedforward, real-time attitude control is ensured, the target tracking effect is improved, and the prior art defect that the target is lost due to attitude tracking lag is avoided.
  • Yet another embodiment of the present disclosure also provides a target tracking device, as shown in FIG. 22, including:
  • Memory used to store executable instructions
  • the processor is configured to execute the executable instructions stored in the memory to perform the following operations:
  • the imaging device is controlled to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.
  • the attitude angle deviation includes a heading angle deviation, and the attitude angular velocity includes: a heading angular velocity; and/or; the attitude angle deviation includes a pitch angle deviation, and the attitude angular velocity includes a pitch angle velocity.
  • the target tracking device of this embodiment can execute the operations, steps, and processes described in any of the foregoing embodiments.
  • the imaging device is mounted on the movable carrier through a carrier; the processor is further configured to perform the following operations: controlling the rotation of the carrier and/or the movable carrier so that the imaging device rotates by the heading angle deviation at the heading angular velocity; and/or controlling the rotation of the carrier and/or the movable carrier so that the imaging device rotates by the pitch angle deviation at the pitch angular velocity.
  • the processor is further configured to perform the following operations: determine at least one control parameter according to the current position and the current size; obtain the attitude according to the attitude angle deviation and at least one of the control parameters Angular velocity.
  • the processor is further configured to perform the following operations: determine the field of view angle of the target according to the current position and the current size; determine the attitude angle of the connection line between the target and the imaging device according to the current position; and obtain at least one of the control parameters according to at least one of the field of view angle and the attitude angle.
  • the processor is further configured to perform the following operations: obtain the attitude angle of the identification frame relative to the imaging device, where the identification frame is used to identify the current position and the current size; and obtain the field of view angle according to the attitude angle of the identification frame relative to the imaging device.
  • the attitude angle of the identification frame relative to the imaging device includes: the heading angle of the identification frame relative to the imaging device, and the field of view angle includes: the heading angle range of the target relative to the imaging device; and/or the attitude angle of the identification frame relative to the imaging device includes: the pitch angle of the identification frame relative to the imaging device, and the field of view angle includes: the pitch angle range of the target relative to the imaging device.
  • the attitude angle includes: a pitch angle; the processor is further configured to perform the following operations: obtain the pitch angle of the identification frame relative to the imaging device, where the identification frame is used to identify the current position and the current size; determine the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device; obtain the current pitch angle of the imaging device; and obtain the pitch angle of the connection line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.
  • at least one of the control parameters includes: a gain attenuation threshold and/or a dead zone threshold.
  • the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.
  • when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field of view angle and the attitude angle, the field of view angle includes: the heading angle range of the target relative to the imaging device, and the attitude angle includes: a pitch angle;
  • when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field of view angle, the field of view angle includes: the pitch angle range of the target relative to the imaging device.
  • the processor is further configured to perform the following operations: determine the dead zone threshold according to the attitude angle.
  • the attitude angle includes: a pitch angle; the processor is further configured to perform the following operations: when the absolute value of the pitch angle is greater than a threshold, control the dead zone threshold to be positively correlated with the pitch angle; when the absolute value of the pitch angle is less than or equal to the threshold, set the dead zone threshold to a preset value; wherein the dead zone threshold is a dead zone threshold for heading angle control.
  • the processor is further configured to perform the following operations: set the dead zone threshold to a preset value, and the dead zone threshold is a dead zone threshold for pitch angle control.
  • the processor is further configured to perform the following operations: the imaging device recognizes the target in the screen; and determines the attitude angular velocity corresponding to the attitude angular deviation according to the recognition result of the target.
  • the processor is further configured to perform the following operations: determine at least one control parameter according to the recognition result of the target; and obtain the attitude angular velocity according to the attitude angular deviation and at least one of the control parameters.
  • At least one of the parameters includes: a gain coefficient for heading angle control and/or pitch angle control; the gain coefficient has a value range; the processor is further configured to perform the following operations: when the target is located in the screen and the target is recognized, increase the gain coefficient within the value range; when the target is located in the screen but the target is not recognized, or when the target is located outside the screen, decrease the gain coefficient within the value range.
  • the processor is further configured to perform the following operations: obtain the distance and movement speed of the target relative to the imaging device; determine the attitude angle of the line between the target and the imaging device according to the current position; determine the feedforward value of the attitude angular velocity according to the distance, the movement speed and the attitude angle.
  • the processor is further configured to perform the following operations: obtain the speed along the attitude axis of the imaging device according to the movement speed; obtain the feedforward value according to the speed, the distance, and the attitude angle.
  • the attitude angle includes: a pitch angle, the attitude angular velocity includes: a pitch angular velocity, and the attitude axis includes: a pitch axis; and/or; the attitude angle includes: a heading angle, the attitude angular velocity includes: a heading angular velocity, and the attitude axis includes: a heading axis.
  • the processor is further configured to perform the following operations: determine the upper limit value of the attitude angular velocity; when the attitude angular velocity exceeds the upper limit value, control the imaging device to rotate the attitude angle deviation at the angular velocity upper limit value.
  • the processor is further configured to perform the following operations: determine the upper limit value according to the following speed thresholds: a first speed threshold of the imaging device on the attitude axis, a second speed threshold determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device, and a third speed threshold determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.
  • the angular velocity upper limit value is the smallest one of the first speed threshold, the second speed threshold, and the third speed threshold.
  • the attitude axis includes: a heading axis, the attitude angle range includes: a heading angle range, the attitude angular velocity includes: a heading angular velocity; and/or; the attitude axis includes: a pitch axis, the attitude The angular range includes the pitch angle range, and the attitude angular velocity includes the pitch angular velocity.
  • the processor is further configured to perform the following operations: determine a desired attitude angle of the imaging device according to the desired position and the current position; obtain the current attitude angle of the imaging device; determine the attitude angle deviation according to the desired attitude angle and the current attitude angle.
  • the processor is further configured to perform the following operations: determine the attitude angle of the line between the target and the imaging device according to the current position; determine the attitude angle deviation of the target relative to the imaging device according to the desired position; obtain the desired attitude angle according to the attitude angle and the attitude angle deviation.
  • the current attitude angle includes: a current heading angle, the desired attitude angle includes: a desired heading angle, the attitude angle includes: a heading angle, and the attitude angle deviation includes: a heading angle deviation; and/or; the current attitude angle includes: a current pitch angle, the desired attitude angle includes: a desired pitch angle, the attitude angle includes: a pitch angle, and the attitude angle deviation includes: a pitch angle deviation.
  • the processor is further configured to perform the following operations: when the target in the picture of the imaging device moves out of the picture of the imaging device, determine another attitude angle deviation; determine another attitude angular velocity corresponding to the another attitude angle deviation according to the current position and the current size; control the imaging device to rotate the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.
  • the processor is further configured to perform the following operations: determine the desired attitude angle of the imaging device; obtain the current attitude angle of the imaging device; determine the another attitude angle deviation according to the desired attitude angle and the current attitude angle.
  • the processor is further configured to perform the following operations: obtain the position and movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture; determine the predicted position of the target according to the position, the movement speed and the duration; obtain the desired attitude angle according to the predicted position.
  • the predicted position includes: the position of the target under a uniform motion and/or under a decelerating motion.
  • the current attitude angle includes: a current heading angle, the desired attitude angle includes: a desired heading angle, the another attitude angle deviation includes: a heading angle deviation, and the another attitude angular velocity includes: a heading angular velocity; and/or; the current attitude angle includes: a current pitch angle, the desired attitude angle includes: a desired pitch angle, the another attitude angle deviation includes: a pitch angle deviation, and the another attitude angular velocity includes: a pitch angular velocity.
  • Another embodiment of the present disclosure further provides a computer-readable storage medium, which stores executable instructions; when the executable instructions are executed by one or more processors, the one or more processors are caused to execute the target tracking method of the above-mentioned embodiments.
  • the computer-readable storage medium may be any medium that can contain, store, convey, propagate, or transmit instructions.
  • a readable storage medium may include, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium.
  • Specific examples of readable storage media include: magnetic storage devices, such as magnetic tape or hard disk (HDD); optical storage devices, such as optical disks (CD-ROM); memories, such as random access memory (RAM) or flash memory; and/or wired/wireless communication links.
  • the computer program may be configured to have, for example, computer program code including computer program modules. It should be noted that the division manner and number of modules are not fixed; those skilled in the art may use appropriate program modules or combinations of program modules according to the actual situation. When these combinations of program modules are executed by a computer (or processor), the computer can execute the flow of the simulation method of the drone described in the present disclosure and its variants.
  • Yet another embodiment of the present disclosure also provides a movable platform, including: a movable carrier, an imaging device, and a carrier.
  • the imaging device is installed on the movable carrier, or installed on the movable carrier through the carrier.
  • the movable carrier includes: an unmanned aerial vehicle, an unmanned vehicle, an unmanned ship, or a robot.
  • the carrier includes: a pan-tilt with at least one rotational degree of freedom.
  • the movable carrier includes the target tracking device of the above-mentioned embodiment.
  • the target tracking device can control the rotation of the carrier and/or the movable carrier to adjust the attitude angle of the imaging device.
  • Yet another embodiment of the present disclosure further provides an imaging platform, including: a carrier and an imaging device; the carrier includes: the target tracking device of the above-mentioned embodiment.
  • the target tracking device can control the rotation of the carrier to adjust the attitude angle of the imaging device.
  • the imaging platform may be one including a handheld gimbal or a gimbal camera.


Abstract

A target tracking method, a target tracking apparatus, a computer-readable storage medium, a movable platform, and an imaging platform. The target tracking method includes: acquiring the current position and current size of a target in the picture of an imaging device; acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position; determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size; and controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

Description

Target tracking method and apparatus, movable platform, and imaging platform

Technical Field

The present disclosure relates to the field of target tracking, and in particular to a target tracking method and apparatus, a movable platform, and an imaging platform.
Background

A movable platform such as an unmanned aerial vehicle (UAV) can be used to track a target. Such a movable platform usually includes a movable carrier and an imaging device. The imaging device is usually mounted on the movable carrier through a carrier, and the carrier allows the imaging device to rotate relative to the movable carrier.

In a typical target tracking process, a user can select a target in the picture of the imaging device through a remote controller and specify a desired position of the target in the picture. In response to the user's selection and specification operations, the movable platform controls the movable carrier and the carrier to rotate so as to adjust the attitude of the imaging device, thereby moving the target to the desired position in the picture of the imaging device.
Summary

An embodiment of the present disclosure provides a target tracking method, including:

acquiring the current position and current size of a target in the picture of an imaging device;

acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;

determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;

controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

An embodiment of the present disclosure further provides a target tracking apparatus, including:

a memory for storing executable instructions; and

a processor for executing the executable instructions stored in the memory to perform the following operations:

acquiring the current position and current size of a target in the picture of an imaging device;

acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;

determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;

controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

An embodiment of the present disclosure further provides a computer-readable storage medium storing executable instructions which, when executed by one or more processors, cause the one or more processors to perform the above target tracking method.

An embodiment of the present disclosure further provides a movable platform, including a movable carrier, an imaging device, and a carrier;

the imaging device is mounted on the movable carrier directly, or is mounted on the movable carrier through the carrier;

the movable carrier includes the above target tracking apparatus;

the target tracking apparatus can control the carrier and/or the movable carrier to rotate so as to adjust the attitude angle of the imaging device.

An embodiment of the present disclosure further provides an imaging platform, including a carrier and an imaging device; the carrier includes the above target tracking apparatus;

the target tracking apparatus can control the carrier to rotate so as to adjust the attitude angle of the imaging device.
Brief Description of the Drawings

To explain the technical solutions in the embodiments of the present disclosure more clearly, the drawings used in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.

Fig. 1 is a flowchart of a target tracking method according to an embodiment of the present disclosure.

Fig. 2 is a schematic structural diagram of a movable platform according to an embodiment of the present disclosure.

Fig. 3 shows a picture of an imaging device according to an embodiment of the present disclosure.

Fig. 4 shows another picture of the imaging device according to an embodiment of the present disclosure.

Fig. 5 is a flowchart of determining the attitude angular velocity corresponding to the attitude angle deviation.

Fig. 6 is a flowchart of determining at least one control parameter.

Fig. 7 is a flowchart of determining the field of view angle of the target.

Fig. 8 shows yet another picture of the imaging device according to an embodiment of the present disclosure.

Fig. 9 is a flowchart of determining the attitude angle of the line between the target and the imaging device.

Fig. 10 is an input-output curve of an attitude angle controller.

Fig. 11 is a flowchart of determining the attitude angle deviation and the corresponding attitude angular velocity.

Fig. 12 is a flowchart of determining the desired attitude angle of the imaging device.

Fig. 13a, Fig. 13b, and Fig. 13c respectively show three scenarios of calculating the desired attitude angle analytically.

Fig. 14 shows a top-view picture of the imaging device.

Fig. 15 is a flowchart of a target tracking method according to another embodiment of the present disclosure.

Fig. 16 is a flowchart of determining the desired attitude angle of the imaging device.

Fig. 17a and Fig. 17b respectively show two scenarios of calculating the desired attitude angle iteratively.

Fig. 18 is a flowchart of a target tracking method according to another embodiment of the present disclosure.

Fig. 19 is another flowchart of a target tracking method according to another embodiment of the present disclosure.

Fig. 20 is another input-output curve of the attitude angle controller.

Fig. 21 is a flowchart of a target tracking method according to yet another embodiment of the present disclosure.

Fig. 22 is a schematic diagram of a target tracking apparatus according to an embodiment of the present disclosure.
Detailed Description

During target tracking, a user may specify a desired position in the picture of the imaging device, and the attitude of the imaging device is adjusted so that the target in the picture is located at that desired position. Typical target tracking methods usually control the attitude angular velocity of the imaging device with a controller whose parameters are fixed. This control approach often has the following defects:

First, the control of the attitude angle and the attitude angular velocity lacks smoothness and stability.

Since the attitude angular velocity is positively correlated with the focal length of the imaging device, the attitude angular velocity is too slow when the focal length is small and too fast when the focal length is large. The attitude angular velocity is also positively correlated with the pitch angle of the target relative to the imaging device; therefore, when that pitch angle is small, the heading angular velocity of the imaging device is too slow, and when it is large, the heading angular velocity is too fast.

When the size of the target in the picture of the imaging device is large, the recognition of the target by the imaging device is unstable, causing large fluctuations in the attitude angular velocity.

When the pitch angle of the target relative to the imaging device is large (for example, close to 90 degrees), large changes in the heading angle of the imaging device are easily triggered, causing the imaging device to oscillate back and forth between different attitudes.

Second, the target is easily lost when the brightness of the target's environment is low, when the attitude angle becomes singular, when the target is occluded, or when the position of the target relative to the imaging device changes.

When the brightness of the target's environment is low, the exposure time of the imaging device is long, which easily blurs the picture, makes the attitude angular velocity too fast, and may even cause the target to be lost.

When the pitch angle of the target relative to the imaging device is large (for example, close to 90 degrees), the desired attitude angle of the imaging device is prone to singular cases, causing target tracking to fail.

When the target in the picture of the imaging device is occluded by another object and then reappears in the picture, the desired attitude angle of the imaging device tends to change greatly, making the attitude angular velocity unsmooth and possibly even losing the target.

When the position of the target relative to the imaging device changes, for example when the target passes directly below or directly above the imaging device, the heading angle of the imaging device changes greatly, the control of the attitude angle lags severely, and the target may even be lost.

The present disclosure provides a target tracking method, a target tracking apparatus, a computer-readable storage medium, a movable platform, and an imaging platform. The present disclosure can make the control of the attitude angle and the attitude angular velocity smoother and more stable, can avoid target tracking failure caused by singular cases of the attitude angle, and can maintain tracking of the target when the target is occluded, when the brightness of the target's environment is low, and when the position of the target relative to the imaging device changes.

The technical solutions of the present disclosure are described clearly and completely below with reference to the embodiments and the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present disclosure rather than all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
An embodiment of the present disclosure provides a target tracking method. As shown in Fig. 1, the target tracking method includes:

S101: acquiring the current position and current size of a target in the picture of an imaging device;

S102: acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;

S103: determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;

S104: controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.
The target tracking method of this embodiment can be applied to movable platforms, imaging platforms, and other platforms capable of tracking a target. Fig. 2 shows a movable platform 100. The movable platform 100 includes a movable carrier 110, a carrier 140, and an imaging device 130. The imaging device 130 is supported by the movable carrier 110, either directly or via the carrier 140. The imaging device 130 can be used to capture a target 160 within the viewing angle 170 of the imaging device 130; one or more targets 160 may be within the viewing angle 170. Although the movable carrier 110 is depicted in Fig. 2 as a UAV, the present disclosure is not limited thereto; any suitable type of movable carrier may be used, such as but not limited to a UAV, an unmanned vehicle, an unmanned ship, or a robot.

The movable carrier 110 may include a body 105 and one or more propulsion units 150. The propulsion units 150 may be configured to generate lift for the movable carrier 110 and may include rotors. The movable carrier 110 can fly in three-dimensional space and can rotate about at least one of a pitch axis, a yaw axis, and a roll axis. The body 105 of the movable carrier 110 may contain a flight controller, one or more processors, one or more memories, one or more sensors, and one or more communication units.

The movable platform 100 may include one or more imaging devices 130. In some examples, the imaging device 130 may be a camera or a video camera, and may be a visible-light, infrared, ultraviolet, or thermal imaging device. The imaging device 130 can zoom by adjusting at least one of an optical zoom level and a digital zoom level, so as to adjust the size of the target in the picture of the imaging device 130.

The imaging device 130 may be mounted on the carrier 140. The carrier 140 may allow the imaging device 130 to rotate about at least one of the pitch axis, the yaw axis, and the roll axis, and may include a single-axis, dual-axis, or three-axis gimbal.

The movable platform 100 can be controlled through a remote controller 120. The remote controller 120 may communicate with at least one of the movable carrier 110, the carrier 140, and the imaging device 130. The remote controller 120 includes a display for showing the picture of the imaging device 130, and an input device for receiving the user's input information. The user's input information may include the position of the target 160 in the picture of the imaging device 130, including the current position and the desired position of the target 160.
The target tracking method of this embodiment is described below by taking a UAV as the movable carrier.

The current position and current size of the target in the picture of the imaging device are acquired in S101.

While the UAV is flying, the imaging device can image the surroundings of the UAV and transmit the images in its picture to the remote controller of the UAV, whose display shows them to the user. In one implementation, when the user wants to track a target of interest in the picture, the user can select that target through the input device of the remote controller, for example by tapping or box-selecting the target on the display. In response to the user's target selection operation, the remote controller sends a corresponding instruction to the UAV. After the UAV receives the instruction, the imaging device recognizes the target selected by the user. As shown in Fig. 3, after the target is recognized successfully, an identification frame marking the outline of the target is displayed in the picture of the imaging device; the identification frame identifies the current position and current size of the target.

In another implementation, the imaging device can also automatically select and recognize the target to be tracked. For example, the user can set a group of target selection rules on the UAV through the remote controller, and while the UAV is flying, the imaging device automatically selects the tracked target according to these rules. In one example, the target selection rules can be set according to parameters such as target category, target size, target color, and target distance. For example, target categories may include people, vehicles, and animals; when a person, vehicle, or animal appears in the picture of the imaging device, it can automatically be selected as the tracking target and recognized. After the target is recognized successfully, it is marked with the identification frame.

The size of the target in the picture of the imaging device is determined by factors such as the focal length of the imaging device and the target distance. Those skilled in the art will understand that when the focal length of the imaging device is small or the target is far from the imaging device, the target appears small in the picture; conversely, the target appears large.

The identification frame may be a frame of any shape, for example a rectangular frame, a square frame, or a circular frame; this embodiment does not limit its specific shape.
The desired position of the target in the picture is acquired in S102, and the attitude angle deviation of the imaging device is determined according to the desired position and the current position.

In some cases, the target is not at the position the user desires in the picture. In this case, the user can adjust the position of the target in the picture by specifying, through the remote controller, a desired position of the target in the picture, so that the target is moved to the desired position. As shown in Fig. 4, the user may tap on the display of the remote controller, and the tapped position can serve as the desired position of the target. In response to the user's desired-position tap operation, the remote controller sends a corresponding instruction to the UAV. After receiving the instruction, the UAV determines the attitude angle deviation of the imaging device according to the desired position and the current position of the target. The attitude angle deviation of the imaging device refers to the attitude angle deviation of the imaging device in the world coordinate system. The world coordinate system may be one or more of an inertial coordinate system, an earth coordinate system, and a geographic coordinate system.

The attitude angle deviation of the imaging device can be determined from the desired position and the current position in various ways; the specific way of determining it will be detailed later.
After the current position and current size of the target are obtained, the attitude angular velocity corresponding to the attitude angle deviation is determined from the current position and the current size in S103.

In this embodiment, the attitude angular velocity is related to the current position and current size of the target: different current positions or current sizes of the target correspond to different attitude angular velocities. During target tracking, the target may move constantly or intermittently, causing the current position or current size of the target in the picture of the imaging device to change; the attitude angular velocity corresponding to the attitude angle deviation then changes accordingly.

In this embodiment, determining the attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size, as shown in Fig. 5, includes:

S501: determining at least one control parameter according to the current position and the current size;

S502: obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

This embodiment determines the control parameter according to the current position and the current size. In some examples, the control parameter may be a parameter of a controller for controlling the imaging device, and the controller may be an attitude angle controller. In some examples, the attitude angle includes one or more of a heading angle, a pitch angle, and a roll angle; in some examples, the attitude angle may include the heading angle and the pitch angle. The attitude angle controller of the imaging device may include various types of controllers for the heading angle and the pitch angle, such as feedback controllers and predictive controllers. In one example, the attitude angle controller of the imaging device may include a proportional controller (P), a proportional-integral controller (PI), or a proportional-integral-derivative controller (PID) for the heading angle, and a proportional controller (P), a proportional-integral controller (PI), or a proportional-integral-derivative controller (PID) for the pitch angle. In the description of this embodiment, unless otherwise stated, the attitude angle refers to both the heading angle and the pitch angle.
In this embodiment, determining at least one control parameter according to the current position and the current size, as shown in Fig. 6, includes:

S601: determining the field of view angle of the target according to the current position and the current size;

S602: determining the attitude angle of the line between the target and the imaging device according to the current position;

S603: obtaining at least one control parameter according to at least one of the field of view angle and the attitude angle.

In this embodiment, the field of view angle of the target refers to the attitude angle range of the target relative to the imaging device. Determining the field of view angle of the target according to the current position and the current size, as shown in Fig. 7, includes:

S701: obtaining the attitude angle of the identification frame relative to the imaging device;

S702: obtaining the field of view angle according to the attitude angle of the identification frame relative to the imaging device.
First, the current field of view angle of the picture of the imaging device is obtained. The current field of view angle can be determined from the focal length of the imaging device, as shown in the following formulas:

fov_zx = 2 * arctan( W / (2 * focal_length) )

fov_zy = 2 * arctan( H / (2 * focal_length) )

where fov_zx is the field of view angle in the row direction of the picture, fov_zy is the field of view angle in the column direction of the picture, focal_length is the current focal length of the imaging device, and W and H are the width and height of the image sensor of the imaging device, respectively.
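By way of illustration only, the relationship above can be written as a minimal Python sketch. The pinhole-camera model and the function name are illustrative assumptions, not part of the original disclosure:

```python
import math

def picture_fov(focal_length: float, sensor_w: float, sensor_h: float):
    """Field of view of the picture under a simple pinhole model.

    focal_length, sensor_w, sensor_h must share one unit (e.g. mm).
    Returns (fov_zx, fov_zy) in radians: the row- and column-direction
    field of view angles of the picture.
    """
    fov_zx = 2.0 * math.atan(sensor_w / (2.0 * focal_length))
    fov_zy = 2.0 * math.atan(sensor_h / (2.0 * focal_length))
    return fov_zx, fov_zy
```

For example, a 36 mm x 24 mm sensor at a 50 mm focal length gives roughly 39.6 degrees by 27.0 degrees, which matches the expected behavior: a longer focal length yields a narrower field of view.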
Then the attitude angles of the edges of the identification frame relative to the imaging device are determined. In some examples, the identification frame may be a rectangular frame. As shown in Fig. 8, the attitude angles of the rectangular frame relative to the imaging device include the attitude angles, relative to the imaging device, of its sides in the row direction of the picture and of its sides in the column direction of the picture. The attitude angles of the sides in the row direction include the heading angles of the left side and the right side relative to the imaging device; the attitude angles of the sides in the column direction include the pitch angles of the upper side and the lower side relative to the imaging device.

Then the heading angles of the left side and the right side relative to the imaging device are subtracted from each other to obtain the heading angle range of the target relative to the imaging device, and the pitch angles of the upper side and the lower side relative to the imaging device are subtracted from each other to obtain the pitch angle range of the target relative to the imaging device. The heading angle range and the pitch angle range of the target relative to the imaging device are taken as the field of view angle of the target.
In this embodiment, determining the attitude angle of the line between the target and the imaging device according to the current position, as shown in Fig. 9, includes:

S901: obtaining the pitch angle of the identification frame relative to the imaging device;

S902: determining the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device;

S903: obtaining the current pitch angle of the imaging device;

S904: obtaining the pitch angle of the line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.

For the process of obtaining the pitch angle of the identification frame relative to the imaging device, refer to the process of obtaining the attitude angle of the identification frame relative to the imaging device in S701 above. After the pitch angles of the upper side and the lower side of the identification frame relative to the imaging device are obtained, the median of these two pitch angles is taken as the pitch angle of the target relative to the imaging device.

The current pitch angle of the imaging device refers to the pitch angle of the imaging device in the world coordinate system. A UAV is usually provided with an attitude sensor, such as an inertial sensor, and the carrier is usually provided with an angle sensor. The pitch angle of the imaging device in the world coordinate system can be obtained through the attitude sensor and the angle sensor.

Exemplarily, the line between the target and the imaging device may be the line between the optical center of the imaging device and the center of the target, i.e. the line between the optical center of the imaging device and the center of the identification frame. The pitch angle of the line between the target and the imaging device refers to the pitch angle in the world coordinate system. The pitch angle of the target relative to the imaging device is the pitch angle of the target in the coordinate system of the imaging device, while the current pitch angle of the imaging device refers to the current pitch angle of the imaging device in the world coordinate system. Therefore, superimposing the pitch angle of the target relative to the imaging device on the current pitch angle of the imaging device yields the pitch angle of the line between the target and the imaging device.
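The computations of S701 through S904 can be illustrated with the following sketch. It assumes a hypothetical linear pixel-to-angle mapping (px_to_angle), which is only an approximation away from the image center; all function and variable names are illustrative:

```python
def px_to_angle(px: float, resolution: float, fov: float) -> float:
    """Approximate angle of a pixel row/column from the optical axis.

    Linear mapping between pixel offset and angle; adequate near the
    image center, increasingly approximate toward the edges.
    """
    return (px / resolution - 0.5) * fov

def target_angles(box, img_w, img_h, fov_zx, fov_zy, camera_pitch):
    """box = (left, top, right, bottom) of the identification frame, px.

    Returns (yaw_range, pitch_range, line_pitch): the target's heading
    and pitch angle ranges relative to the imaging device, and the
    world-frame pitch angle of the camera-target line.
    """
    left, top, right, bottom = box
    yaw_l = px_to_angle(left, img_w, fov_zx)
    yaw_r = px_to_angle(right, img_w, fov_zx)
    pitch_t = px_to_angle(top, img_h, fov_zy)
    pitch_b = px_to_angle(bottom, img_h, fov_zy)
    yaw_range = abs(yaw_r - yaw_l)            # heading angle range (S701/S702)
    pitch_range = abs(pitch_b - pitch_t)      # pitch angle range
    target_pitch = 0.5 * (pitch_t + pitch_b)  # median of the two edges (S902)
    # S904: superimpose on the camera's world-frame pitch. Sign
    # conventions (image y grows downward) are ignored in this sketch.
    line_pitch = camera_pitch + target_pitch
    return yaw_range, pitch_range, line_pitch
```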
As mentioned above, the attitude angle controller of the imaging device may include various types of controllers. When a proportional controller is used, its input-output curve is shown in Fig. 10, where the horizontal axis represents the attitude angle deviation and the vertical axis represents the attitude angular velocity.

In some examples, the control parameter in S502 includes the gain attenuation threshold in Fig. 10, and the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.

When the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field of view angle and the attitude angle, the field of view angle includes the heading angle range of the target relative to the imaging device, and the attitude angle includes the pitch angle. That is, when the gain attenuation threshold is for heading angle control, it is determined according to the field of view angle and the attitude angle.

When the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field of view angle, the field of view angle includes the pitch angle range of the target relative to the imaging device. That is, when the gain attenuation threshold is for pitch angle control, it is determined according to the field of view angle.

For the gain attenuation threshold of the heading angle controller, the heading angle range of the target relative to the imaging device is first multiplied by a first proportional coefficient. The product of the heading angle range and the first proportional coefficient is then adjusted according to the pitch angle of the line between the target and the imaging device, yielding the gain attenuation threshold of the heading angle controller. The first proportional coefficient can be determined according to how well the target is recognized: when the identification frame in the picture is stable, the value of the first proportional coefficient can be increased appropriately; when it is unstable, the value can be decreased appropriately. In some examples, the first proportional coefficient may be 1 or greater than 1. In some examples, adjusting the product according to the pitch angle of the line between the target and the imaging device may include dividing the product by a trigonometric function value of that pitch angle.

Multiplying the pitch angle range of the target relative to the imaging device by a second proportional coefficient yields the gain attenuation threshold of the pitch angle controller. Like the first proportional coefficient, the second proportional coefficient can also be determined according to how well the target is recognized. In some examples, the second proportional coefficient may be 1 or greater than 1.
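As an illustrative sketch of the two thresholds: the original text does not name the trigonometric function, so cos() is assumed here, and the coefficient defaults are only examples:

```python
import math

def gain_attenuation_thresholds(yaw_range, pitch_range, line_pitch,
                                k1=1.0, k2=1.0):
    """Gain attenuation thresholds for the heading and pitch controllers.

    yaw_range / pitch_range: target angle ranges (rad) from the
    identification frame; line_pitch: pitch of the camera-target line
    (rad); k1 / k2: first / second proportional coefficients (>= 1 when
    the identification frame is stable). cos() stands in for the
    unspecified 'trigonometric function value'.
    """
    # Heading: scaled range, enlarged as the line approaches vertical.
    yaw_threshold = (yaw_range * k1) / max(math.cos(line_pitch), 1e-6)
    # Pitch: scaled range only.
    pitch_threshold = pitch_range * k2
    return yaw_threshold, pitch_threshold
```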
In some examples, the control parameter in S502 includes the dead zone threshold in Fig. 10. Determining at least one control parameter according to at least one of the field of view angle and the attitude angle includes determining the dead zone threshold according to the attitude angle of the line between the target and the imaging device.

In some examples, the attitude angle of the line between the target and the imaging device refers to the pitch angle of that line. For the dead zone threshold of the heading angle controller, when the absolute value of the pitch angle is greater than a threshold, the dead zone threshold is controlled to be positively correlated with the pitch angle; when the absolute value of the pitch angle is less than or equal to the threshold, the dead zone threshold is set to a preset value.

In some examples, controlling the dead zone threshold to be positively correlated with the pitch angle includes: subtracting the threshold from the absolute value of the pitch angle of the line between the target and the imaging device, multiplying the difference between the absolute value and the threshold by a third proportional coefficient, and then adjusting the product of the difference and the third proportional coefficient according to the pitch angle of the line between the target and the imaging device, yielding the dead zone threshold of the heading angle controller.

The above threshold and the third proportional coefficient can be determined according to actual requirements. For example, the threshold may be 70 to 85 degrees, or the threshold may be 80 degrees. In some examples, the third proportional coefficient may be 0.1 to 0.9; for example, the third proportional coefficient may be 0.6. In some examples, adjusting the product of the difference and the third proportional coefficient may include dividing the product by a trigonometric function value of the pitch angle of the line between the target and the imaging device.

When the absolute value of the pitch angle of the line between the target and the imaging device is less than or equal to the threshold, the dead zone threshold of the heading angle controller is directly set to a first preset value. The first preset value can also be determined according to actual requirements; in some examples, the first preset value may include 0.

For the dead zone threshold of the pitch angle controller, it can be directly set to a second preset value. The second preset value can also be determined according to actual requirements and may be the same as or different from the first preset value; in some examples, the second preset value may include 0.
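The dead zone schedule described above can be sketched as follows. As in the previous sketch, cos() is assumed for the unspecified trigonometric function, and the 80-degree threshold and 0.6 coefficient are just the example values given above:

```python
import math

def heading_dead_zone(line_pitch, angle_threshold=math.radians(80.0),
                      k3=0.6, preset=0.0):
    """Dead zone threshold for heading angle control.

    Grows with the pitch of the camera-target line once its absolute
    value exceeds angle_threshold; otherwise falls back to a preset.
    """
    if abs(line_pitch) > angle_threshold:
        margin = abs(line_pitch) - angle_threshold
        return (margin * k3) / max(math.cos(line_pitch), 1e-6)
    return preset  # first preset value, e.g. 0

def pitch_dead_zone(preset=0.0):
    """The pitch-angle dead zone is simply a second preset value."""
    return preset
```

Widening the heading dead zone near a vertical line of sight suppresses the large, noisy heading commands that otherwise arise when the target is almost directly above or below the imaging device.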
After at least one control parameter is determined through the above process, the attitude angular velocity is obtained in S502 according to the attitude angle deviation and the at least one control parameter. Specifically, the attitude angle deviation can be input into an attitude angle controller configured with the at least one control parameter, and the attitude angle controller outputs the attitude angular velocity corresponding to the attitude angle deviation. For example, for the proportional controller shown in Fig. 10, after the gain attenuation thresholds and dead zone thresholds of the heading angle controller and the pitch angle controller are determined: when the attitude angle deviation includes a heading angle deviation, the heading angle deviation can be input into the heading angle controller configured with the gain attenuation threshold and the dead zone threshold to obtain the heading angular velocity; when the attitude angle deviation includes a pitch angle deviation, the pitch angle deviation can be input into the pitch angle controller configured with the gain attenuation threshold and the dead zone threshold to obtain the pitch angular velocity; and when the attitude angle deviation includes both a heading angle deviation and a pitch angle deviation, the two deviations can be input into the heading angle controller and the pitch angle controller, respectively, to obtain the heading angular velocity and the pitch angular velocity.
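One plausible reading of the Fig. 10 curve, as a sketch only: zero output inside the dead zone, linear gain up to the gain attenuation threshold, and saturated output beyond it. The exact shape of the attenuated segment in Fig. 10 may differ from this interpretation:

```python
def p_controller(deviation, gain, dead_zone, attenuation_threshold):
    """Proportional attitude controller with dead zone and attenuation.

    deviation: attitude angle deviation (rad); gain: proportional gain
    (1/s); returns the commanded attitude angular velocity (rad/s).
    """
    mag = abs(deviation)
    if mag <= dead_zone:
        return 0.0                      # inside the dead zone: no motion
    sign = 1.0 if deviation > 0 else -1.0
    # Linear up to the attenuation threshold, constant beyond it.
    effective = min(mag, attenuation_threshold) - dead_zone
    return sign * gain * effective
```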
After the attitude angular velocity is obtained, the imaging device is controlled in S104 to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture, achieving the desired composition of the target.

When the imaging device is mounted on the UAV through a carrier, the carrier can be controlled to rotate, or the UAV can be controlled to rotate, or both the carrier and the UAV can be controlled to rotate, so that the imaging device rotates by the heading angle deviation at the heading angular velocity and by the pitch angle deviation at the pitch angular velocity.

When the imaging device is mounted directly on the UAV, the UAV can be controlled to rotate so that the imaging device rotates by the heading angle deviation at the heading angular velocity and by the pitch angle deviation at the pitch angular velocity.
It can thus be seen that the target tracking method of this embodiment can dynamically adjust control parameters, including the gain attenuation threshold and the dead zone threshold, according to the current position and current size of the target, and can adapt to targets at different distances, of different sizes, and in different directions. Compared with typical target tracking methods that set the control parameters to fixed values, the control of the attitude angle and the attitude angular velocity is smoother and more stable, avoiding the prior-art defect that this control lacks smoothness and stability due to the focal length of the imaging device, a large pitch angle of the target relative to the imaging device, or changes in the target size.

The process of determining the attitude angle deviation of the imaging device, and the attitude angular velocity corresponding to it, is described below.

As shown in Fig. 11, determining the attitude angle deviation includes:

S1101: determining a desired attitude angle of the imaging device according to the desired position and the current position;

S1102: obtaining the current attitude angle of the imaging device;

S1103: determining the attitude angle deviation according to the desired attitude angle and the current attitude angle.

The current attitude angle of the imaging device can be obtained in the manner of S903. The current attitude angle includes the current heading angle and/or the current pitch angle, which refer to the heading angle and the pitch angle of the imaging device in the world coordinate system, respectively. The attitude angle of the imaging device in the world coordinate system can be obtained through the attitude sensor and the angle sensor of the UAV.

After the desired attitude angle of the imaging device and the current attitude angle of the imaging device are obtained, subtracting one from the other yields the attitude angle deviation.
The process of determining the desired attitude angle of the imaging device is described below.

When the target is in the picture of the imaging device and the imaging device recognizes the target, determining the desired attitude angle of the imaging device, as shown in Fig. 12, includes:

S1201: determining the attitude angle of the line between the target and the imaging device according to the current position;

S1202: determining the attitude angle deviation of the target relative to the imaging device according to the desired position;

S1203: obtaining the desired attitude angle according to the attitude angle and the attitude angle deviation.

The attitude angle of the line between the target and the imaging device may include the heading angle and/or the pitch angle of that line; the desired attitude angle may include a desired heading angle and/or a desired pitch angle; the attitude angle deviation may include a heading angle deviation and a pitch angle deviation.

The heading angle and the pitch angle of the line between the target and the imaging device can be determined with reference to S602 above. When the target is in the picture of the imaging device and the imaging device recognizes the target, the desired position in S1202 refers to the desired position specified by the user through the remote controller. The desired heading angle and the desired pitch angle can be calculated analytically or iteratively.
Exemplarily, the analytic approach calculates the desired attitude angle by solving equations. When the pitch angle of the target relative to the imaging device is large, for example close to 90 degrees, the solution of the equations becomes singular. Singular cases include: no valid solution, multiple valid solutions, a unique valid solution, and infinitely many solutions. In this embodiment, the desired attitude angle is determined by analyzing the singular cases.

In some examples, as shown in Fig. 13a, the pitch angle of the target relative to the imaging device is close to 90 degrees; dashed arrow 1 represents the pitch angle of the line between the target and the imaging device, and the desired position specified by the user is that the target is in the lower part of the picture in the column direction. Calculating the desired attitude angle deviation analytically yields a singular case with two solutions: a first pitch angle deviation of the target relative to the imaging device, represented by the solid arrow between dashed arrows 1 and 2, and a second pitch angle deviation of the target relative to the imaging device, represented by the solid arrow between dashed arrows 1 and 3. Since the heading angle of the imaging device corresponding to the first pitch angle deviation differs from the desired heading angle of the imaging device by 180 degrees, while the heading angle corresponding to the second pitch angle deviation equals the desired heading angle, the first pitch angle deviation is an invalid solution and the second pitch angle deviation is the unique valid solution; the second pitch angle deviation is taken as the pitch angle deviation obtained in S1202.

As shown in Fig. 13b, the pitch angle of the target relative to the imaging device is close to 90 degrees; dashed arrow 1 represents the pitch angle of the line between the target and the imaging device, and the desired position specified by the user is that the target is in the upper part of the picture in the column direction. Calculating the desired attitude angle deviation analytically yields a singular case with two solutions: a first pitch angle deviation of the target relative to the imaging device, represented by the solid arrow between dashed arrows 1 and 2, and a second pitch angle deviation of the target relative to the imaging device, represented by the solid arrow between dashed arrows 1 and 3. Since the heading angle of the imaging device corresponding to the first pitch angle deviation differs from the desired heading angle by 180 degrees, and the roll angle of the imaging device corresponding to the second pitch angle deviation is rotated by 180 degrees, both solutions are invalid. In this case, a valid solution can be re-determined by gradually reducing the pitch angle deviation; this valid solution is a third pitch angle deviation of the target relative to the imaging device, represented by the solid arrow between dashed arrows 1 and 4, and the third pitch angle deviation is taken as the pitch angle deviation obtained in S1202.

As shown in Fig. 13c, the pitch angle of the target relative to the imaging device is close to 90 degrees, and the desired position specified by the user is that the target is in the left part of the picture in the row direction. The desired heading angle deviation obtained by calculating the desired attitude angle analytically is the first heading angle deviation between dashed frame 1 and the picture center. However, since the sum of the first heading angle deviation and the pitch angle of the target relative to the imaging device is greater than 90 degrees, the first heading angle deviation is an invalid solution. In this case, a valid solution can be re-determined by gradually reducing the heading angle deviation; this valid solution is the second heading angle deviation between dashed frame 2 and the picture center, and the second heading angle deviation is taken as the heading angle deviation obtained in S1202.

It can thus be seen that, through singularity analysis, the target tracking method of this embodiment can obtain a correct attitude angle of the imaging device no matter in which direction the target lies relative to the imaging device, and can place the target at, or as close as possible to, the desired position specified by the user. This effect is achieved even when the pitch angle of the target relative to the imaging device is large, avoiding target tracking failure caused by singular cases of the desired attitude angle of the imaging device.
Prior-art target tracking methods usually convert the difference between the current position and the desired position of the target directly into the attitude angle deviation of the imaging device. The attitude angle deviation obtained in this way differs considerably from the actual attitude angle deviation. Consider the top-view picture of the imaging device shown in Fig. 14, and assume that the identification frame is at the current position of the target and the picture center is the desired position of the target. In this case, the heading angle deviation obtained by the prior art is smaller than the actual heading angle deviation, and the pitch angle deviation is larger than the actual pitch angle deviation. Consequently, with prior-art target tracking methods, the final position of the target inevitably deviates from the desired position. In this embodiment, by contrast, the desired attitude angle of the imaging device is first determined from the desired position and the current position of the target, and the attitude angle deviation is then determined from the desired attitude angle and the current attitude angle of the imaging device. For the top-view picture in Fig. 14, the heading angle deviation and the pitch angle deviation obtained by this embodiment are both equal to the actual deviations, with no error. Compared with prior-art target tracking methods, the obtained attitude angle deviation is more accurate, the target can be moved to the desired position more precisely, and the tracking accuracy and the composition effect are improved.

The target tracking method of this embodiment has been described above with reference to the proportional controller of Fig. 10; when other controllers, such as a proportional-integral controller or a proportional-integral-derivative controller, are used in this embodiment, the same technical effects can be achieved.

It should be noted that all the step numbers described in this embodiment (for example, S101, S102, S103, S104) are merely for convenience of description; they only refer to a certain step and do not limit the order of the steps. In fact, the steps S101-S104 through S1201-S1203 described in this embodiment can be performed in any order. For example, two steps that do not depend on each other may be performed one after the other in either order, or in parallel.
Another embodiment of the present disclosure provides a target tracking method. For brevity, features of this embodiment that are the same as or similar to those of the previous embodiment are not repeated; only the differences from the previous embodiment are described below.

The target tracking method of this embodiment, as shown in Fig. 15, further includes:

S1501: when the target in the picture of the imaging device moves out of the picture of the imaging device, determining another attitude angle deviation;

S1502: determining another attitude angular velocity corresponding to the another attitude angle deviation according to the current position and the current size;

S1503: controlling the imaging device to rotate by the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.
While the imaging device is tracking the target, the target may be occluded by other objects. For example, when the target is a vehicle moving on a road, the vehicle in the picture may be occluded by road signs, street lamps, and other objects beside the road. Since such a target moves relative to the imaging device, by the time the occluding object moves out of the picture of the imaging device, the target may already be outside the picture. The target tracking method of this embodiment can retrieve a target outside the picture, make the target reappear in the picture, and continue tracking the target.

Similar to the previous embodiment, this embodiment may first determine the desired attitude angle of the imaging device and the current attitude angle of the imaging device, and then determine the another attitude angle deviation according to the desired attitude angle and the current attitude angle.

Determining the desired attitude angle of the imaging device, as shown in Fig. 16, includes:

S1601: obtaining the position and movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture;

S1602: determining a predicted position of the target according to the position, the movement speed, and the duration;

S1603: obtaining the desired attitude angle according to the predicted position. The predicted position includes the position of the target under uniform motion and/or under decelerating motion.
The position and movement speed of the target relative to the imaging device at the last moment it was in the picture can be obtained in various ways. As mentioned above, the UAV may include one or more sensors; in some examples, the sensors may include an inertial sensor and a satellite positioning module. The UAV can obtain its own position and velocity in the world coordinate system, i.e. the position and velocity of the imaging device, through the inertial sensor and/or the satellite positioning module. The UAV can also obtain the position and velocity of the target in the world coordinate system in various ways. From the positions and velocities of the imaging device and the target in the world coordinate system, the position and movement speed of the target relative to the imaging device can be obtained.

The duration for which the target has been outside the picture can then be determined from the current moment and the last moment, and the predicted position of the target is determined according to the position, the movement speed, and the duration. The predicted position may include a first predicted position and a second predicted position, where the first predicted position is the position the target would reach moving uniformly at that movement speed, and the second predicted position is the position the target would reach decelerating with that movement speed as the initial speed.
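By way of illustration, the two predicted positions can be sketched as follows. The deceleration value is an assumption of this sketch; the original does not specify one:

```python
import math

def predicted_positions(last_pos, last_vel, dt, decel=1.0):
    """Predicted target positions after dt seconds outside the picture.

    last_pos, last_vel: 2-D position (m) and velocity (m/s) relative to
    the imaging device at the last in-picture moment; decel: assumed
    deceleration magnitude (m/s^2). Returns (uniform_pos, decel_pos).
    """
    # First predicted position: uniform motion at the last velocity.
    p_uniform = (last_pos[0] + last_vel[0] * dt,
                 last_pos[1] + last_vel[1] * dt)
    # Second predicted position: decelerating to a stop at 'decel'.
    speed = math.hypot(last_vel[0], last_vel[1])
    t_stop = min(dt, speed / decel) if decel > 0 else dt
    travel = speed * t_stop - 0.5 * decel * t_stop ** 2
    scale = travel / speed if speed > 1e-6 else 0.0
    p_decel = (last_pos[0] + last_vel[0] * scale,
               last_pos[1] + last_vel[1] * scale)
    return p_uniform, p_decel
```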
The desired attitude angle can be determined analytically or iteratively. Exemplarily, in the iterative approach, assume that the imaging device is at a first desired attitude angle such that, when the imaging device is at the first desired attitude angle, the first predicted position coincides with the desired position.

It is then judged whether the second predicted position is in the picture of the imaging device at this attitude. As shown in Fig. 17a, when the second predicted position is in the picture of the imaging device, the first desired attitude angle is taken as the desired attitude angle. When the second predicted position is not in the picture but outside the picture of the imaging device, as shown by the solid frame in Fig. 17b, the imaging device is controlled to change the first desired attitude angle, moving the imaging device in the direction opposite to the target's motion, until the second predicted position is in the picture of the imaging device, as shown by the dashed frame in Fig. 17b; the first desired attitude angle at that point is taken as the desired attitude angle.

After the desired attitude angle is obtained, the another attitude angle deviation and the corresponding another attitude angular velocity can be determined, and the imaging device is controlled to rotate by the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.

It can thus be seen that this embodiment can retrieve a target that has moved out of the picture of the imaging device and make the target reappear in the picture; the target retrieval capability is stronger, and the control of the attitude angular velocity of the imaging device is smoother and more stable, avoiding the prior-art defects of large attitude angle changes, an unsmooth attitude angular velocity, and even target loss when retrieving the target.
Another embodiment of the present disclosure provides a target tracking method. For brevity, features of this embodiment that are the same as or similar to those of the above embodiments are not repeated; only the differences from the above embodiments are described below.

The target tracking method of this embodiment, as shown in Fig. 18, further includes:

S1801: recognizing, by the imaging device, the target in the picture;

S1802: determining the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target.

For the process of the imaging device recognizing the target in the picture, refer to S101 above. Determining the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target includes:

determining at least one control parameter according to the recognition result of the target;

obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.
In this embodiment, the at least one parameter includes a gain coefficient of the attitude angle control, and the gain coefficient has a value range.

During target tracking, when the imaging device starts to recognize the target, the gain coefficient is set to the minimum of the value range. When the target is in the picture of the imaging device and the imaging device keeps recognizing the target, the gain coefficient can be gradually increased from the minimum until it reaches the maximum of the value range. If, while the gain coefficient is increasing, or after the gain coefficient has reached the maximum, the target is in the picture but the imaging device fails to recognize the target, or the target moves from inside the picture to outside it, the gain coefficient is decreased until it reaches the minimum. The above increasing and decreasing of the gain coefficient applies to one or both of the heading angle controller and the pitch angle controller.

The value range of the gain coefficient can be determined according to the target tracking effect; in some examples, the value range may be 0.1 to 1. During the increase and decrease of the gain coefficient, the rate of change of the gain coefficient can also be determined according to the target tracking effect; in some examples, the rate of change may be 0.1 to 0.5 per second, for example 0.15 per second.
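A sketch of this gain schedule, under the example value range and rate of change given above (all names illustrative):

```python
def update_gain(gain, target_in_picture, target_recognized,
                rate=0.15, dt=0.05, g_min=0.1, g_max=1.0):
    """One control-period update of the attitude-control gain coefficient.

    Ramps up while the target is in the picture and recognized,
    ramps down otherwise; clamped to [g_min, g_max]. rate is the
    change rate in 1/second and dt the control period in seconds.
    """
    step = rate * dt
    if target_in_picture and target_recognized:
        gain += step
    else:
        gain -= step
    return min(max(gain, g_min), g_max)
```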
The gain coefficient is effectively a proportional coefficient of the attitude angle controller. Applying the gain coefficient to the attitude angle controller amounts to scaling the whole input-output curve of the controller; for example, after the proportional controller of Fig. 10 is multiplied by the gain coefficient, its input-output curve is enlarged or shrunk as a whole.
The target tracking method of this embodiment, as shown in Fig. 19, further includes:

S1901: determining an upper limit value of the attitude angular velocity;

S1902: when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate by the attitude angle deviation at the angular velocity upper limit value.

In some examples, the upper limit value is determined according to at least one of the following speed thresholds:

a first speed threshold of the imaging device on the attitude axis;

a second speed threshold determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device;

a third speed threshold determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.
The attitude axis may include a heading axis and a pitch axis. The first speed threshold refers to the maximum angular velocity of the imaging device about the heading axis and the pitch axis. When the imaging device is mounted on the UAV through a carrier, the maximum angular velocity is usually limited by the attitude angular velocity ranges of the carrier and the UAV; when the imaging device is mounted directly on the UAV, the maximum angular velocity is usually limited by the attitude angular velocity range of the UAV.

Dividing the attitude angle range of the target relative to the imaging device by the exposure time and multiplying by a fourth proportional coefficient yields the second speed threshold. The imaging device may set the exposure time automatically according to the ambient brightness, or the exposure time may be set by the user: the user sets the exposure time through the remote controller, the remote controller sends the exposure time to the UAV, and the UAV sets the exposure time of the imaging device to the value set by the user. The fourth proportional coefficient can be determined according to the attitude angle control effect of the imaging device; in some examples, the fourth proportional coefficient may be 0.01 to 1, for example 0.1. Determining the upper limit of the attitude angular velocity from the second speed threshold prevents the attitude angular velocity of the imaging device from being so fast that motion blur appears in the picture and target recognition fails.

Dividing the attitude angle range of the target relative to the imaging device by the recognition period and multiplying by a fifth proportional coefficient yields the third speed threshold. The recognition period of the target refers to the time the imaging device needs to recognize one target. The fifth proportional coefficient can be determined according to the attitude angle control effect of the imaging device; in some examples, the fifth proportional coefficient may be 0.01 to 1, for example 0.1. Determining the upper limit of the attitude angular velocity from the third speed threshold prevents the attitude angular velocity from being so fast that the position of the target in the picture changes too quickly and target recognition fails.

In some examples, the smallest of the first speed threshold, the second speed threshold, and the third speed threshold can be taken as the upper limit value of the attitude angular velocity. This embodiment is not limited thereto; any one of the thresholds, or the average of the three thresholds, may also be taken as the upper limit value of the attitude angular velocity.
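The three thresholds and the final limit can be sketched as follows, using the example coefficient values of 0.1 given above:

```python
def angular_velocity_limit(v_axis_max, angle_range, exposure_time,
                           recognition_period, k4=0.1, k5=0.1):
    """Upper limit of the attitude angular velocity (rad/s).

    v_axis_max: platform/gimbal rate limit on the axis;
    angle_range: target attitude angle range on that axis (rad);
    k4 / k5: fourth / fifth proportional coefficients.
    """
    v1 = v_axis_max                               # first threshold
    v2 = (angle_range / exposure_time) * k4       # motion-blur limit
    v3 = (angle_range / recognition_period) * k5  # recognition limit
    return min(v1, v2, v3)

def clamp_rate(rate, limit):
    """Apply the limit while preserving the sign of the command."""
    return max(-limit, min(limit, rate))
```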
S1901 and S1902 of this embodiment apply to one or both of the heading angle controller and the pitch angle controller. That is, the attitude axis may include a heading axis, the attitude angle range a heading angle range, and the attitude angular velocity a heading angular velocity; and/or the attitude axis may include a pitch axis, the attitude angle range a pitch angle range, and the attitude angular velocity a pitch angular velocity.

Dynamically adjusting the gain coefficient according to the recognition result of the target can prevent large changes in the attitude angle of the imaging device. Especially when the imaging device starts to recognize the target, or when the target reappears in the picture after being occluded, this can effectively prevent the attitude angular velocity of the imaging device from being too fast, and makes the attitude control process of the imaging device smoother and more stable. Determining the upper limit value of the attitude angular velocity and controlling the imaging device to rotate by the attitude angle deviation within that limit, as shown in Fig. 20, amounts to limiting the maximum output of the proportional controller of Fig. 10, so that the attitude angular velocity corresponding to the attitude angle deviation never exceeds the upper limit value; this likewise prevents the attitude angular velocity of the imaging device from being too fast, makes the attitude control process smoother and more stable, and avoids the prior-art defect of losing the target due to low ambient brightness, occlusion of the target, and other causes.
Yet another embodiment of the present disclosure provides a target tracking method. For brevity, features of this embodiment that are the same as or similar to those of the above embodiments are not repeated; only the differences from the above embodiments are described below.

The target tracking method of this embodiment, as shown in Fig. 21, further includes:

S2101: obtaining the distance and movement speed of the target relative to the imaging device;

S2102: determining the attitude angle of the line between the target and the imaging device according to the current position;

S2103: determining a feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.

The distance and movement speed of the target relative to the imaging device can be obtained in various ways. As mentioned above, the UAV may include one or more sensors; in some examples, the sensors may include an inertial sensor and a satellite positioning module. The UAV can obtain its own position and velocity in the world coordinate system, i.e. the position and velocity of the imaging device, through the inertial sensor and/or the satellite positioning module. The UAV can also obtain the position and velocity of the target in the world coordinate system in various ways. From the positions and velocities of the imaging device and the target in the world coordinate system, the distance and movement speed of the target relative to the imaging device can be obtained.

For the process of determining the attitude angle of the line between the target and the imaging device according to the current position, refer to S602 of the above embodiments. The pitch angle of the line between the target and the imaging device can be obtained through S2102.

After the distance and movement speed of the target relative to the imaging device and the pitch angle of the line between the target and the imaging device are obtained, the feedforward value of the attitude angular velocity can be determined according to the distance, the movement speed, and the pitch angle. Specifically, the speed of the target along the attitude axes of the imaging device can be obtained from the movement speed, the feedforward value is then obtained from this speed, the distance, and the pitch angle, and the feedforward value is superimposed on the attitude angular velocity determined in S103.
Specifically, the movement speed is first decomposed into a first speed along the heading axis of the imaging device, a second speed along the pitch axis of the imaging device, and a third speed along the roll axis of the imaging device.

For the heading angle controller, the first speed is converted into the heading angular velocity of the target relative to the imaging device according to the distance of the target relative to the imaging device, and the heading angular velocity feedforward is obtained from this heading angular velocity according to the pitch angle of the line between the target and the imaging device.

For the pitch angle controller, after the second speed is obtained, the second speed is converted into the pitch angular velocity of the target relative to the imaging device according to the distance of the target relative to the imaging device, and this pitch angular velocity is taken as the pitch angular velocity feedforward.
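A sketch of the two feedforward terms follows. Dividing the heading channel by cos() of the line pitch mirrors the adjustments described earlier and is an assumption of this sketch; physically, the horizontal distance to the target shrinks as the line of sight approaches vertical, so the required heading rate grows:

```python
import math

def rate_feedforward(v_target, distance, line_pitch):
    """Feedforward values for the heading and pitch angular velocities.

    v_target = (v_heading_axis, v_pitch_axis): target speed components
    along the imaging device's heading and pitch axes (m/s);
    distance: camera-target distance (m); line_pitch: pitch of the
    camera-target line (rad). Returns (yaw_ff, pitch_ff) in rad/s.
    """
    yaw_rate = v_target[0] / distance                    # first speed -> rate
    yaw_ff = yaw_rate / max(math.cos(line_pitch), 1e-6)  # heading feedforward
    pitch_ff = v_target[1] / distance                    # pitch feedforward
    return yaw_ff, pitch_ff
```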
With typical target tracking methods, tracking tends to lag when the position of the target relative to the imaging device changes. For example, when the target passes directly below or directly above the imaging device and moves from one side of the imaging device to the other, the heading angle of the imaging device changes greatly, the attitude angle control lags severely, and the target may even be lost. This embodiment can apply an attitude angular velocity feedforward to the attitude angle controller according to the distance and movement speed of the target relative to the imaging device; the attitude angular velocity feedforward ensures real-time attitude control, improves the target tracking effect, and avoids the prior-art defect of losing the target due to lagging attitude tracking.
Yet another embodiment of the present disclosure further provides a target tracking apparatus, as shown in Fig. 22, including:

a memory for storing executable instructions; and

a processor for executing the executable instructions stored in the memory to perform the following operations:

acquiring the current position and current size of a target in the picture of an imaging device;

acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position;

determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size;

controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

The attitude angle deviation includes a heading angle deviation and the attitude angular velocity includes a heading angular velocity; and/or the attitude angle deviation includes a pitch angle deviation and the attitude angular velocity includes a pitch angular velocity.

The target tracking apparatus of this embodiment can perform the operations, steps, and processes described in any of the above embodiments.
In some examples, the imaging device is mounted on a movable carrier through a carrier, and the processor is further configured to: control the carrier and/or the movable carrier to rotate so that the imaging device rotates by the heading angle deviation at the heading angular velocity; and/or control the carrier and/or the movable carrier to rotate so that the imaging device rotates by the pitch angle deviation at the pitch angular velocity.

In some examples, the processor is further configured to: determine at least one control parameter according to the current position and the current size; and obtain the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

In some examples, the processor is further configured to: determine the field of view angle of the target according to the current position and the current size; determine the attitude angle of the line between the target and the imaging device according to the current position; and obtain at least one control parameter according to at least one of the field of view angle and the attitude angle.

In some examples, the processor is further configured to: obtain the attitude angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; and obtain the field of view angle according to the attitude angle of the identification frame relative to the imaging device.

In some examples, the attitude angle of the identification frame relative to the imaging device includes the heading angle of the identification frame relative to the imaging device, and the field of view angle includes the heading angle range of the target relative to the imaging device; and/or the attitude angle of the identification frame relative to the imaging device includes the pitch angle of the identification frame relative to the imaging device, and the field of view angle includes the pitch angle range of the target relative to the imaging device.

In some examples, the attitude angle includes a pitch angle, and the processor is further configured to: obtain the pitch angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; determine the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device; obtain the current pitch angle of the imaging device; and obtain the pitch angle of the line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.

In some examples, the at least one parameter includes a gain attenuation threshold and/or a dead zone threshold.

In some examples, the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.

In some examples, when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field of view angle and the attitude angle, the field of view angle includes the heading angle range of the target relative to the imaging device and the attitude angle includes a pitch angle; and/or when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field of view angle, the field of view angle includes the pitch angle range of the target relative to the imaging device.

In some examples, the processor is further configured to: determine the dead zone threshold according to the attitude angle.

In some examples, the attitude angle includes a pitch angle, and the processor is further configured to: when the absolute value of the pitch angle is greater than a threshold, control the dead zone threshold to be positively correlated with the pitch angle; and when the absolute value of the pitch angle is less than or equal to the threshold, set the dead zone threshold to a preset value, the dead zone threshold being a dead zone threshold for heading angle control.

In some examples, the processor is further configured to: set the dead zone threshold to a preset value, the dead zone threshold being a dead zone threshold for pitch angle control.

In some examples, the processor is further configured to: cause the imaging device to recognize the target in the picture; and determine the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target.

In some examples, the processor is further configured to: determine at least one control parameter according to the recognition result of the target; and obtain the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

In some examples, the at least one parameter includes a gain coefficient for heading angle control and/or pitch angle control, the gain coefficient having a value range; and the processor is further configured to: increase the gain coefficient within the value range when the target is in the picture and the target is recognized; and decrease the gain coefficient within the value range when the target is in the picture but the target is not recognized, or when the target is outside the picture.

In some examples, the processor is further configured to: obtain the distance and movement speed of the target relative to the imaging device; determine the attitude angle of the line between the target and the imaging device according to the current position; and determine a feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.

In some examples, the processor is further configured to: obtain the speed along the attitude axis of the imaging device according to the movement speed; and obtain the feedforward value according to the speed, the distance, and the attitude angle.

In some examples, the attitude angle includes a pitch angle, the attitude angular velocity includes a pitch angular velocity, and the attitude axis includes a pitch axis; and/or the attitude angle includes a heading angle, the attitude angular velocity includes a heading angular velocity, and the attitude axis includes a heading axis.

In some examples, the processor is further configured to: determine an upper limit value of the attitude angular velocity; and when the attitude angular velocity exceeds the upper limit value, control the imaging device to rotate by the attitude angle deviation at the angular velocity upper limit value.

In some examples, the processor is further configured to determine the upper limit value according to the following speed thresholds: a first speed threshold of the imaging device on the attitude axis; a second speed threshold determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device; and a third speed threshold determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.

In some examples, the angular velocity upper limit value is the smallest of the first speed threshold, the second speed threshold, and the third speed threshold.

In some examples, the attitude axis includes a heading axis, the attitude angle range includes a heading angle range, and the attitude angular velocity includes a heading angular velocity; and/or the attitude axis includes a pitch axis, the attitude angle range includes a pitch angle range, and the attitude angular velocity includes a pitch angular velocity.

In some examples, the processor is further configured to: determine a desired attitude angle of the imaging device according to the desired position and the current position; obtain the current attitude angle of the imaging device; and determine the attitude angle deviation according to the desired attitude angle and the current attitude angle.

In some examples, the processor is further configured to: determine the attitude angle of the line between the target and the imaging device according to the current position; determine the attitude angle deviation of the target relative to the imaging device according to the desired position; and obtain the desired attitude angle according to the attitude angle and the attitude angle deviation.

In some examples, the current attitude angle includes a current heading angle, the desired attitude angle includes a desired heading angle, the attitude angle includes a heading angle, and the attitude angle deviation includes a heading angle deviation; and/or the current attitude angle includes a current pitch angle, the desired attitude angle includes a desired pitch angle, the attitude angle includes a pitch angle, and the attitude angle deviation includes a pitch angle deviation.

In some examples, the processor is further configured to: when the target in the picture of the imaging device moves out of the picture of the imaging device, determine another attitude angle deviation; determine another attitude angular velocity corresponding to the another attitude angle deviation according to the current position and the current size; and control the imaging device to rotate by the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.

In some examples, the processor is further configured to: determine a desired attitude angle of the imaging device; obtain the current attitude angle of the imaging device; and determine the another attitude angle deviation according to the desired attitude angle and the current attitude angle.

In some examples, the processor is further configured to: obtain the position and movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture; determine a predicted position of the target according to the position, the movement speed, and the duration; and obtain the desired attitude angle according to the predicted position.

In some examples, the predicted position includes the position of the target under uniform motion and/or under decelerating motion.

In some examples, the current attitude angle includes a current heading angle, the desired attitude angle includes a desired heading angle, the another attitude angle deviation includes a heading angle deviation, and the another attitude angular velocity includes a heading angular velocity; and/or the current attitude angle includes a current pitch angle, the desired attitude angle includes a desired pitch angle, the another attitude angle deviation includes a pitch angle deviation, and the another attitude angular velocity includes a pitch angular velocity.
Yet another embodiment of the present disclosure further provides a computer-readable storage medium storing executable instructions which, when executed by one or more processors, cause the one or more processors to perform the target tracking method of the above embodiments.

The computer-readable storage medium may be, for example, any medium that can contain, store, convey, propagate, or transmit instructions. For example, the readable storage medium may include, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of readable storage media include: magnetic storage devices, such as magnetic tape or hard disk (HDD); optical storage devices, such as optical disks (CD-ROM); memories, such as random access memory (RAM) or flash memory; and/or wired/wireless communication links.

In addition, a computer program may be configured to have computer program code including, for example, computer program modules. It should be noted that the division manner and number of modules are not fixed; those skilled in the art may use appropriate program modules or combinations of program modules according to the actual situation. When these combinations of program modules are executed by a computer (or processor), the computer can execute the flow of the simulation method of the drone described in the present disclosure and its variants.
Yet another embodiment of the present disclosure further provides a movable platform, including a movable carrier, an imaging device, and a carrier. The imaging device is mounted on the movable carrier directly, or is mounted on the movable carrier through the carrier. The movable carrier includes a UAV, an unmanned vehicle, an unmanned ship, or a robot. The carrier includes a gimbal with at least one rotational degree of freedom.

The movable carrier includes the target tracking apparatus of the above embodiments. The target tracking apparatus can control the carrier and/or the movable carrier to rotate so as to adjust the attitude angle of the imaging device.

Yet another embodiment of the present disclosure further provides an imaging platform, including a carrier and an imaging device; the carrier includes the target tracking apparatus of the above embodiments. The target tracking apparatus can control the carrier to rotate so as to adjust the attitude angle of the imaging device. The imaging platform may be one including a handheld gimbal or a gimbal camera.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In practical applications, the above functions can be allocated to different functional modules as required, i.e. the internal structure of the apparatus can be divided into different functional modules to complete all or part of the functions described above. For the specific working process of the apparatus described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.

Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or replace some or all of the technical features with equivalents; where there is no conflict, the features in the embodiments of the present disclosure can be combined arbitrarily. Such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present disclosure.

Claims (70)

  1. A target tracking method, characterized by comprising: acquiring the current position and current size of a target in the picture of an imaging device; acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position; determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size; and controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

  2. The target tracking method according to claim 1, wherein the attitude angle deviation comprises a heading angle deviation and the attitude angular velocity comprises a heading angular velocity; and/or the attitude angle deviation comprises a pitch angle deviation and the attitude angular velocity comprises a pitch angular velocity.

  3. The target tracking method according to claim 2, wherein the imaging device is mounted on a movable carrier through a carrier; and controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity comprises: controlling the carrier and/or the movable carrier to rotate so that the imaging device rotates by the heading angle deviation at the heading angular velocity; and/or controlling the carrier and/or the movable carrier to rotate so that the imaging device rotates by the pitch angle deviation at the pitch angular velocity.
  4. The target tracking method according to claim 1, wherein determining the attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size comprises: determining at least one control parameter according to the current position and the current size; and obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

  5. The target tracking method according to claim 4, wherein determining at least one control parameter according to the current position and the current size comprises: determining the field of view angle of the target according to the current position and the current size; determining the attitude angle of the line between the target and the imaging device according to the current position; and obtaining at least one control parameter according to at least one of the field of view angle and the attitude angle.

  6. The target tracking method according to claim 5, wherein determining the field of view angle of the target according to the current position and the current size comprises: obtaining the attitude angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; and obtaining the field of view angle according to the attitude angle of the identification frame relative to the imaging device.

  7. The target tracking method according to claim 6, wherein the attitude angle of the identification frame relative to the imaging device comprises the heading angle of the identification frame relative to the imaging device, and the field of view angle comprises the heading angle range of the target relative to the imaging device; and/or the attitude angle of the identification frame relative to the imaging device comprises the pitch angle of the identification frame relative to the imaging device, and the field of view angle comprises the pitch angle range of the target relative to the imaging device.

  8. The target tracking method according to claim 5, wherein the attitude angle comprises a pitch angle; and determining the attitude angle of the line between the target and the imaging device according to the current position comprises: obtaining the pitch angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; determining the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device; obtaining the current pitch angle of the imaging device; and obtaining the pitch angle of the line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.
  9. The target tracking method according to claim 5, wherein the at least one parameter comprises a gain attenuation threshold and/or a dead zone threshold.

  10. The target tracking method according to claim 9, wherein the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.

  11. The target tracking method according to claim 10, wherein when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field of view angle and the attitude angle, the field of view angle comprises the heading angle range of the target relative to the imaging device and the attitude angle comprises a pitch angle; and/or when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field of view angle, the field of view angle comprises the pitch angle range of the target relative to the imaging device.

  12. The target tracking method according to claim 9, wherein determining at least one control parameter according to at least one of the field of view angle and the attitude angle comprises: determining the dead zone threshold according to the attitude angle.

  13. The target tracking method according to claim 12, wherein the attitude angle comprises a pitch angle; and determining the dead zone threshold according to the attitude angle comprises: when the absolute value of the pitch angle is greater than a threshold, controlling the dead zone threshold to be positively correlated with the pitch angle; and when the absolute value of the pitch angle is less than or equal to the threshold, setting the dead zone threshold to a preset value; wherein the dead zone threshold is a dead zone threshold for heading angle control.

  14. The target tracking method according to claim 9, wherein determining at least one control parameter according to at least one of the field of view angle and the attitude angle comprises: setting the dead zone threshold to a preset value, the dead zone threshold being a dead zone threshold for pitch angle control.
  15. The target tracking method according to claim 1, characterized by further comprising: recognizing, by the imaging device, the target in the picture; and determining the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target.

  16. The target tracking method according to claim 15, wherein determining the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target comprises: determining at least one control parameter according to the recognition result of the target; and obtaining the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

  17. The target tracking method according to claim 16, wherein the at least one parameter comprises a gain coefficient for heading angle control and/or pitch angle control, the gain coefficient having a value range; and determining at least one control parameter according to the recognition result of the target comprises: increasing the gain coefficient within the value range when the target is in the picture and the target is recognized; and decreasing the gain coefficient within the value range when the target is in the picture but the target is not recognized, or when the target is outside the picture.

  18. The target tracking method according to claim 1, characterized by further comprising: obtaining the distance and movement speed of the target relative to the imaging device; determining the attitude angle of the line between the target and the imaging device according to the current position; and determining a feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.

  19. The target tracking method according to claim 18, wherein determining the feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle comprises: obtaining the speed along the attitude axis of the imaging device according to the movement speed; and obtaining the feedforward value according to the speed, the distance, and the attitude angle.

  20. The target tracking method according to claim 19, wherein the attitude angle comprises a pitch angle, the attitude angular velocity comprises a pitch angular velocity, and the attitude axis comprises a pitch axis; and/or the attitude angle comprises a heading angle, the attitude angular velocity comprises a heading angular velocity, and the attitude axis comprises a heading axis.
  21. The target tracking method according to claim 1, characterized by further comprising: determining an upper limit value of the attitude angular velocity; and when the attitude angular velocity exceeds the upper limit value, controlling the imaging device to rotate by the attitude angle deviation at the angular velocity upper limit value.

  22. The target tracking method according to claim 21, wherein determining the upper limit value of the attitude angular velocity comprises determining the upper limit value according to the following speed thresholds: a first speed threshold of the imaging device on the attitude axis; a second speed threshold determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device; and a third speed threshold determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.

  23. The target tracking method according to claim 22, wherein the angular velocity upper limit value is the smallest of the first speed threshold, the second speed threshold, and the third speed threshold.

  24. The target tracking method according to claim 22, wherein the attitude axis comprises a heading axis, the attitude angle range comprises a heading angle range, and the attitude angular velocity comprises a heading angular velocity; and/or the attitude axis comprises a pitch axis, the attitude angle range comprises a pitch angle range, and the attitude angular velocity comprises a pitch angular velocity.

  25. The target tracking method according to claim 1, wherein determining the attitude angle deviation of the imaging device according to the desired position and the current position comprises: determining a desired attitude angle of the imaging device according to the desired position and the current position; obtaining the current attitude angle of the imaging device; and determining the attitude angle deviation according to the desired attitude angle and the current attitude angle.

  26. The target tracking method according to claim 25, wherein determining the desired attitude angle of the imaging device according to the desired position and the current position comprises: determining the attitude angle of the line between the target and the imaging device according to the current position; determining the attitude angle deviation of the target relative to the imaging device according to the desired position; and obtaining the desired attitude angle according to the attitude angle and the attitude angle deviation.

  27. The target tracking method according to claim 26, wherein the current attitude angle comprises a current heading angle, the desired attitude angle comprises a desired heading angle, the attitude angle comprises a heading angle, and the attitude angle deviation comprises a heading angle deviation; and/or the current attitude angle comprises a current pitch angle, the desired attitude angle comprises a desired pitch angle, the attitude angle comprises a pitch angle, and the attitude angle deviation comprises a pitch angle deviation.
  28. The target tracking method according to claim 1, characterized by further comprising: when the target in the picture of the imaging device moves out of the picture of the imaging device, determining another attitude angle deviation; determining another attitude angular velocity corresponding to the another attitude angle deviation according to the current position and the current size; and controlling the imaging device to rotate by the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.

  29. The target tracking method according to claim 28, wherein determining another attitude angle deviation comprises: determining a desired attitude angle of the imaging device; obtaining the current attitude angle of the imaging device; and determining the another attitude angle deviation according to the desired attitude angle and the current attitude angle.

  30. The target tracking method according to claim 29, wherein determining the desired attitude angle of the imaging device comprises: obtaining the position and movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture; determining a predicted position of the target according to the position, the movement speed, and the duration; and obtaining the desired attitude angle according to the predicted position.

  31. The target tracking method according to claim 30, wherein the predicted position comprises the position of the target under uniform motion and/or under decelerating motion.

  32. The target tracking method according to claim 30, wherein the current attitude angle comprises a current heading angle, the desired attitude angle comprises a desired heading angle, the another attitude angle deviation comprises a heading angle deviation, and the another attitude angular velocity comprises a heading angular velocity; and/or the current attitude angle comprises a current pitch angle, the desired attitude angle comprises a desired pitch angle, the another attitude angle deviation comprises a pitch angle deviation, and the another attitude angular velocity comprises a pitch angular velocity.
  33. A target tracking apparatus, characterized by comprising: a memory for storing executable instructions; and a processor for executing the executable instructions stored in the memory to perform the following operations: acquiring the current position and current size of a target in the picture of an imaging device; acquiring a desired position of the target in the picture, and determining an attitude angle deviation of the imaging device according to the desired position and the current position; determining an attitude angular velocity corresponding to the attitude angle deviation according to the current position and the current size; and controlling the imaging device to rotate by the attitude angle deviation at the attitude angular velocity, so that the target is at the desired position in the picture.

  34. The target tracking apparatus according to claim 33, wherein the attitude angle deviation comprises a heading angle deviation and the attitude angular velocity comprises a heading angular velocity; and/or the attitude angle deviation comprises a pitch angle deviation and the attitude angular velocity comprises a pitch angular velocity.

  35. The target tracking apparatus according to claim 34, wherein the imaging device is mounted on a movable carrier through a carrier; and the processor is further configured to: control the carrier and/or the movable carrier to rotate so that the imaging device rotates by the heading angle deviation at the heading angular velocity; and/or control the carrier and/or the movable carrier to rotate so that the imaging device rotates by the pitch angle deviation at the pitch angular velocity.

  36. The target tracking apparatus according to claim 33, wherein the processor is further configured to: determine at least one control parameter according to the current position and the current size; and obtain the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

  37. The target tracking apparatus according to claim 36, wherein the processor is further configured to: determine the field of view angle of the target according to the current position and the current size; determine the attitude angle of the line between the target and the imaging device according to the current position; and obtain at least one control parameter according to at least one of the field of view angle and the attitude angle.

  38. The target tracking apparatus according to claim 37, wherein the processor is further configured to: obtain the attitude angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; and obtain the field of view angle according to the attitude angle of the identification frame relative to the imaging device.
  39. The target tracking apparatus according to claim 38, wherein the attitude angle of the identification frame relative to the imaging device comprises the heading angle of the identification frame relative to the imaging device, and the field of view angle comprises the heading angle range of the target relative to the imaging device; and/or the attitude angle of the identification frame relative to the imaging device comprises the pitch angle of the identification frame relative to the imaging device, and the field of view angle comprises the pitch angle range of the target relative to the imaging device.

  40. The target tracking apparatus according to claim 37, wherein the attitude angle comprises a pitch angle; and the processor is further configured to: obtain the pitch angle of an identification frame relative to the imaging device, the identification frame identifying the current position and the current size; determine the pitch angle of the target relative to the imaging device according to the pitch angle of the identification frame relative to the imaging device; obtain the current pitch angle of the imaging device; and obtain the pitch angle of the line between the target and the imaging device according to the pitch angle of the target relative to the imaging device and the current pitch angle.

  41. The target tracking apparatus according to claim 37, wherein the at least one parameter comprises a gain attenuation threshold and/or a dead zone threshold.

  42. The target tracking apparatus according to claim 41, wherein the gain attenuation threshold is positively correlated with the field of view angle and/or the attitude angle.

  43. The target tracking apparatus according to claim 42, wherein when the gain attenuation threshold is a gain attenuation threshold for heading angle control and is positively correlated with the field of view angle and the attitude angle, the field of view angle comprises the heading angle range of the target relative to the imaging device and the attitude angle comprises a pitch angle; and/or when the gain attenuation threshold is a gain attenuation threshold for pitch angle control and is positively correlated with the field of view angle, the field of view angle comprises the pitch angle range of the target relative to the imaging device.

  44. The target tracking apparatus according to claim 41, wherein the processor is further configured to: determine the dead zone threshold according to the attitude angle.

  45. The target tracking apparatus according to claim 44, wherein the attitude angle comprises a pitch angle; and the processor is further configured to: when the absolute value of the pitch angle is greater than a threshold, control the dead zone threshold to be positively correlated with the pitch angle; and when the absolute value of the pitch angle is less than or equal to the threshold, set the dead zone threshold to a preset value; wherein the dead zone threshold is a dead zone threshold for heading angle control.

  46. The target tracking apparatus according to claim 41, wherein the processor is further configured to: set the dead zone threshold to a preset value, the dead zone threshold being a dead zone threshold for pitch angle control.
  47. The target tracking apparatus according to claim 33, wherein the processor is further configured to: cause the imaging device to recognize the target in the picture; and determine the attitude angular velocity corresponding to the attitude angle deviation according to the recognition result of the target.

  48. The target tracking apparatus according to claim 47, wherein the processor is further configured to: determine at least one control parameter according to the recognition result of the target; and obtain the attitude angular velocity according to the attitude angle deviation and the at least one control parameter.

  49. The target tracking apparatus according to claim 48, wherein the at least one parameter comprises a gain coefficient for heading angle control and/or pitch angle control, the gain coefficient having a value range; and the processor is further configured to: increase the gain coefficient within the value range when the target is in the picture and the target is recognized; and decrease the gain coefficient within the value range when the target is in the picture but the target is not recognized, or when the target is outside the picture.

  50. The target tracking apparatus according to claim 33, wherein the processor is further configured to: obtain the distance and movement speed of the target relative to the imaging device; determine the attitude angle of the line between the target and the imaging device according to the current position; and determine a feedforward value of the attitude angular velocity according to the distance, the movement speed, and the attitude angle.

  51. The target tracking apparatus according to claim 50, wherein the processor is further configured to: obtain the speed along the attitude axis of the imaging device according to the movement speed; and obtain the feedforward value according to the speed, the distance, and the attitude angle.

  52. The target tracking apparatus according to claim 51, wherein the attitude angle comprises a pitch angle, the attitude angular velocity comprises a pitch angular velocity, and the attitude axis comprises a pitch axis; and/or the attitude angle comprises a heading angle, the attitude angular velocity comprises a heading angular velocity, and the attitude axis comprises a heading axis.

  53. The target tracking apparatus according to claim 33, wherein the processor is further configured to: determine an upper limit value of the attitude angular velocity; and when the attitude angular velocity exceeds the upper limit value, control the imaging device to rotate by the attitude angle deviation at the angular velocity upper limit value.

  54. The target tracking apparatus according to claim 53, wherein the processor is further configured to determine the upper limit value according to the following speed thresholds: a first speed threshold of the imaging device on the attitude axis; a second speed threshold determined by the attitude angle range of the target relative to the imaging device and the exposure time of the imaging device; and a third speed threshold determined by the attitude angle range of the target relative to the imaging device and the recognition period of the imaging device for the target.

  55. The target tracking apparatus according to claim 54, wherein the angular velocity upper limit value is the smallest of the first speed threshold, the second speed threshold, and the third speed threshold.

  56. The target tracking apparatus according to claim 54, wherein the attitude axis comprises a heading axis, the attitude angle range comprises a heading angle range, and the attitude angular velocity comprises a heading angular velocity; and/or the attitude axis comprises a pitch axis, the attitude angle range comprises a pitch angle range, and the attitude angular velocity comprises a pitch angular velocity.
  57. The target tracking apparatus according to claim 33, wherein the processor is further configured to: determine a desired attitude angle of the imaging device according to the desired position and the current position; obtain the current attitude angle of the imaging device; and determine the attitude angle deviation according to the desired attitude angle and the current attitude angle.

  58. The target tracking apparatus according to claim 57, wherein the processor is further configured to: determine the attitude angle of the line between the target and the imaging device according to the current position; determine the attitude angle deviation of the target relative to the imaging device according to the desired position; and obtain the desired attitude angle according to the attitude angle and the attitude angle deviation.

  59. The target tracking apparatus according to claim 58, wherein the current attitude angle comprises a current heading angle, the desired attitude angle comprises a desired heading angle, the attitude angle comprises a heading angle, and the attitude angle deviation comprises a heading angle deviation; and/or the current attitude angle comprises a current pitch angle, the desired attitude angle comprises a desired pitch angle, the attitude angle comprises a pitch angle, and the attitude angle deviation comprises a pitch angle deviation.

  60. The target tracking apparatus according to claim 33, wherein the processor is further configured to: when the target in the picture of the imaging device moves out of the picture of the imaging device, determine another attitude angle deviation; determine another attitude angular velocity corresponding to the another attitude angle deviation according to the current position and the current size; and control the imaging device to rotate by the another attitude angle deviation at the another attitude angular velocity, so that the target reappears in the picture of the imaging device.

  61. The target tracking apparatus according to claim 60, wherein the processor is further configured to: determine a desired attitude angle of the imaging device; obtain the current attitude angle of the imaging device; and determine the another attitude angle deviation according to the desired attitude angle and the current attitude angle.

  62. The target tracking apparatus according to claim 61, wherein the processor is further configured to: obtain the position and movement speed of the target relative to the imaging device at the last moment the target was in the picture, and the duration for which the target has been outside the picture; determine a predicted position of the target according to the position, the movement speed, and the duration; and obtain the desired attitude angle according to the predicted position.

  63. The target tracking apparatus according to claim 62, wherein the predicted position comprises the position of the target under uniform motion and/or under decelerating motion.

  64. The target tracking apparatus according to claim 62, wherein the current attitude angle comprises a current heading angle, the desired attitude angle comprises a desired heading angle, the another attitude angle deviation comprises a heading angle deviation, and the another attitude angular velocity comprises a heading angular velocity; and/or the current attitude angle comprises a current pitch angle, the desired attitude angle comprises a desired pitch angle, the another attitude angle deviation comprises a pitch angle deviation, and the another attitude angular velocity comprises a pitch angular velocity.
  65. A computer-readable storage medium, characterized in that it stores executable instructions which, when executed by one or more processors, cause the one or more processors to perform the target tracking method according to any one of claims 1 to 32.

  66. A movable platform, characterized by comprising: a movable carrier, an imaging device, and a carrier; wherein the imaging device is mounted on the movable carrier directly, or is mounted on the movable carrier through the carrier; the movable carrier comprises the target tracking apparatus according to any one of claims 33 to 64; and the target tracking apparatus can control the carrier and/or the movable carrier to rotate so as to adjust the attitude angle of the imaging device.

  67. The movable platform according to claim 66, wherein the movable carrier comprises a UAV, an unmanned vehicle, an unmanned ship, or a robot.

  68. The movable platform according to claim 66, wherein the carrier comprises a gimbal with at least one rotational degree of freedom.

  69. An imaging platform, characterized by comprising: a carrier and an imaging device; wherein the carrier comprises the target tracking apparatus according to any one of claims 33 to 64; and the target tracking apparatus can control the carrier to rotate so as to adjust the attitude angle of the imaging device.

  70. The imaging platform according to claim 69, wherein the carrier comprises a handheld gimbal.
PCT/CN2020/089009 2020-05-07 2020-05-07 Target tracking method and apparatus, movable platform, and imaging platform WO2021223171A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/089009 WO2021223171A1 (zh) 2020-05-07 2020-05-07 Target tracking method and apparatus, movable platform, and imaging platform
CN202080004853.6A CN112639652A (zh) 2020-05-07 2020-05-07 Target tracking method and apparatus, movable platform, and imaging platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/089009 WO2021223171A1 (zh) 2020-05-07 2020-05-07 Target tracking method and apparatus, movable platform, and imaging platform

Publications (1)

Publication Number Publication Date
WO2021223171A1 true WO2021223171A1 (zh) 2021-11-11

Family ID: 75291189

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/089009 WO2021223171A1 (zh) 2020-05-07 2020-05-07 目标跟踪方法和装置、可移动平台以及成像平台

Country Status (2)

Country Link
CN (1) CN112639652A (zh)
WO (1) WO2021223171A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974208A (zh) * 2023-09-22 2023-10-31 西北工业大学 Rotor UAV target strike control method and system based on a strapdown seeker
CN117649426A (zh) * 2024-01-29 2024-03-05 中国科学院长春光学精密机械与物理研究所 Moving target tracking method resistant to occlusion by a UAV landing gear

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113301248B (zh) * 2021-04-13 2022-09-06 中科创达软件股份有限公司 Shooting method and apparatus, electronic device, and computer storage medium
CN114371720B (zh) * 2021-12-29 2023-09-29 国家电投集团贵州金元威宁能源股份有限公司 Control method and control apparatus for a UAV to track a target
CN117714883A (zh) * 2022-09-07 2024-03-15 华为技术有限公司 Camera control method and related apparatus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107087427A (zh) * 2016-11-30 2017-08-22 深圳市大疆创新科技有限公司 Aircraft control method, apparatus and device, and aircraft
CN107209854A (zh) * 2015-09-15 2017-09-26 深圳市大疆创新科技有限公司 Systems and methods for supporting smooth target following
CN107749951A (zh) * 2017-11-09 2018-03-02 睿魔智能科技(东莞)有限公司 Visual perception method and system for unmanned photography
CN110109482A (zh) * 2019-06-14 2019-08-09 上海应用技术大学 Target tracking system based on an SSD neural network

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE202014011010U1 (de) * 2014-07-30 2017-05-31 SZ DJI Technology Co., Ltd. Systems for target tracking
JP6849272B2 (ja) * 2018-03-14 2021-03-24 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method for controlling an unmanned aerial vehicle, unmanned aerial vehicle, and system for controlling an unmanned aerial vehicle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107209854A (zh) * 2015-09-15 2017-09-26 深圳市大疆创新科技有限公司 Systems and methods for supporting smooth target following
CN107087427A (zh) * 2016-11-30 2017-08-22 深圳市大疆创新科技有限公司 Aircraft control method, apparatus and device, and aircraft
CN107749951A (zh) * 2017-11-09 2018-03-02 睿魔智能科技(东莞)有限公司 Visual perception method and system for unmanned photography
CN110109482A (zh) * 2019-06-14 2019-08-09 上海应用技术大学 Target tracking system based on an SSD neural network

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116974208A (zh) * 2023-09-22 2023-10-31 西北工业大学 Rotor UAV target strike control method and system based on a strapdown seeker
CN116974208B (zh) * 2023-09-22 2024-01-19 西北工业大学 Rotor UAV target strike control method and system based on a strapdown seeker
CN117649426A (zh) * 2024-01-29 2024-03-05 中国科学院长春光学精密机械与物理研究所 Moving target tracking method resistant to occlusion by a UAV landing gear
CN117649426B (zh) * 2024-01-29 2024-04-09 中国科学院长春光学精密机械与物理研究所 Moving target tracking method resistant to occlusion by a UAV landing gear

Also Published As

Publication number Publication date
CN112639652A (zh) 2021-04-09


Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 20934868; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase. Ref country code: DE

122 Ep: pct application non-entry in european phase. Ref document number: 20934868; Country of ref document: EP; Kind code of ref document: A1