WO2022027596A1 - Control method and device for a movable platform, and computer-readable storage medium

Control method and device for a movable platform, and computer-readable storage medium

Info

Publication number
WO2022027596A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel area
movable platform
obstacle
posture
loss function
Application number
PCT/CN2020/107825
Other languages
English (en)
Chinese (zh)
Inventor
许中研
钱杰
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/107825 priority Critical patent/WO2022027596A1/fr
Priority to CN202080035658.XA priority patent/CN113906360A/zh
Publication of WO2022027596A1 publication Critical patent/WO2022027596A1/fr


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • the present disclosure relates to the field of control technology, and in particular to a control method, a control device, a computer-readable storage medium, and a movable platform.
  • the target tracking function usually involves the path planning of the movable platform and the attitude adjustment of the camera.
  • the movable platform can plan a moving path in real time, and move along the moving path to follow the target.
  • the photographing device carried by the movable platform photographs the target according to preset rules, so that the tracked target is presented in the image of the photographing device.
  • the present disclosure provides a control method for a movable platform, the movable platform includes a photographing device, and the control method includes:
  • a path of movement of the movable platform around the obstacle is determined based at least in part on the location information of the photographed target and the location information of the obstacle.
  • the present disclosure also provides a control method for a movable platform, the movable platform includes a photographing device, and the control method includes:
  • the posture of the photographing device is adjusted according to the relative positional relationship between the first pixel area and the second pixel area.
  • the present disclosure also provides a control method for a movable platform, the control method comprising:
  • a moving path of the movable platform around the obstacle and the no-fly zone is determined.
  • the present disclosure also provides a control device for a movable platform, the movable platform includes a photographing device, and the control device includes:
  • a path of movement of the movable platform around the obstacle is determined based at least in part on the location information of the photographed target and the location information of the obstacle.
  • the present disclosure also provides a control device for a movable platform, the movable platform includes a photographing device, and the control device includes:
  • the posture of the photographing device is adjusted according to the relative positional relationship between the first pixel area and the second pixel area.
  • the present disclosure also provides a control device for a movable platform, the control device comprising:
  • a moving path of the movable platform around the obstacle and the no-fly zone is determined.
  • the present disclosure also provides a computer-readable storage medium storing executable instructions which, when executed by one or more processors, cause the one or more processors to execute the above control method.
  • the present disclosure also provides a movable platform, comprising: a photographing device, and a movable carrier; the movable carrier includes: the above-mentioned control device.
  • FIG. 1 is a flowchart of a control method for a movable platform provided by an embodiment of the present disclosure.
  • FIG. 2 is a schematic diagram of an application scenario of an unmanned aerial vehicle provided by an embodiment of the present disclosure.
  • FIG. 3 is a schematic diagram of determining a movement path according to position information of a shooting target and position information of an obstacle according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of determining a movement path according to a distance between a waypoint and an obstacle according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic diagram of determining a movement path according to path smoothness according to an embodiment of the present disclosure.
  • FIG. 6 shows a first posture adjustment strategy provided by an embodiment of the present disclosure when the first pixel area and the second pixel area do not overlap.
  • FIG. 7 shows an example of a second posture adjustment strategy in which the first pixel area is located in the second pixel area provided by an embodiment of the present disclosure.
  • FIG. 8 shows another example of a second posture adjustment strategy in which the first pixel area is located in the second pixel area provided by an embodiment of the present disclosure.
  • FIG. 9 shows a third attitude adjustment strategy for a linear second pixel region provided by an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of determining a movement path according to location information of a shooting target and location information of a no-fly zone provided by an embodiment of the present disclosure.
  • FIG. 11 is a flowchart of a control method of a movable platform according to another embodiment of the present disclosure.
  • FIG. 12 is a flowchart of a control method of a movable platform according to still another embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram of determining a movement path according to the position information of an obstacle and the position information of a no-fly zone according to an embodiment of the present disclosure.
  • FIG. 14 is a schematic structural diagram of a control device of a movable platform provided by an embodiment of the present disclosure.
  • FIG. 15 is a schematic structural diagram of a movable platform provided by an embodiment of the present disclosure.
  • the movable platform can move along the moving path and use the photographing device to track the target during the motion.
  • when planning the moving path, usually only the obstacles in the environment where the movable platform is located are considered; other factors that may affect the shooting effect and safety are basically not considered, which affects the shooting effect of target tracking and the safety of the movable platform.
  • in composition, the photographing device usually pays attention only to the tracked target; the composition process lacks attention to the environment in which the target is located and the objects in that environment, which also affects the shooting effect of target tracking.
  • the present disclosure provides a control method, a control device, a computer-readable storage medium, and a movable platform, which acquire the position information of the photographing target of the photographing device and the position information of obstacles in the environment where the movable platform is located, and determine the moving path of the movable platform around the obstacle based at least in part on the position information of the shooting target and the position information of the obstacle, so as to improve the shooting effect of target tracking.
  • An embodiment of the present disclosure provides a control method for a movable platform.
  • the movable platform includes a photographing device.
  • the control method includes:
  • S101: Acquire position information of a photographing target of the photographing device;
  • S102: Acquire position information of obstacles in the environment where the movable platform is located;
  • S103: Determine a movement path of the movable platform around the obstacle based at least in part on the position information of the photographed target and the position information of the obstacle.
  • the control method of this embodiment can be applied to various movable platforms.
  • the movable platform may be, for example, an aerial movable platform.
  • Aerial movable platforms may include, but are not limited to, unmanned aerial vehicles, fixed-wing aircraft, rotary-wing aircraft, and the like.
  • the movable platform can also be, for example, a ground movable platform.
  • Ground movable platforms may include, but are not limited to, unmanned vehicles, robots, manned vehicles, and the like.
  • the movable platform can also be, for example, a handheld device or a mobile device.
  • Handheld devices may include, but are not limited to, handheld gimbals and PTZ cameras; mobile devices may include, but are not limited to, remote controls, smartphones/mobile phones, tablets, laptops, desktops, media content players, video game stations/systems, virtual reality systems, augmented reality systems, wearable devices, and the like.
  • the control method of this embodiment is described below by taking a movable platform such as an unmanned aerial vehicle as an example.
  • the drone 100 includes: a drone body 110 , a gimbal 140 and a photographing device 130 .
  • the drone body 110 may include a main body 105 and one or more propulsion units 150 .
  • Propulsion unit 150 may be configured to generate lift for drone 100 .
  • Propulsion unit 150 may include rotors.
  • the drone 100 can fly in three-dimensional space, and can rotate about at least one of a pitch axis, a yaw axis, and a roll axis.
  • Unmanned aerial vehicle 110 may include one or more sensors. These sensors include, for example, image sensors, distance sensors, height sensors, position sensors, and the like. As an example of an image sensor, the drone 100 may include one or more cameras 130 . In this embodiment, the photographing device 130 can be a visible light camera, an infrared camera, an ultraviolet camera, etc., and the photographing device 130 can photograph the target 160 within its field of view 170 .
  • the photographing device 130 is supported by the unmanned aerial vehicle 110 .
  • the photographing device 130 may be directly supported by the UAV 110 , or may be supported by the UAV 110 via the carrier 140 .
  • the photographing device 130 may be installed on the carrier 140 .
  • the carrier 140 may allow the photographing device 130 to rotate about at least one of a pitch axis, a yaw axis, and a roll axis to adjust the orientation of the photographing device 130 .
  • The carrier 140 may include a single-axis gimbal, a dual-axis gimbal, or a three-axis gimbal.
  • the drone 100 can be controlled by the remote controller 120 .
  • the remote controller 120 may communicate with at least one of the drone body 110 , the gimbal 140 , and the photographing device 130 .
  • the remote control 120 includes a display.
  • the display is used to display images captured by the photographing device 130 .
  • the remote control 120 also includes an input device. An input device may be used to receive input from a user.
  • the position information of the photographing target of the photographing device is acquired through S101 .
  • the shooting device can shoot various shooting targets in the surrounding environment.
  • the shooting target includes not only the object that the drone wants to track, but may include any object in the surrounding environment.
  • the object that the drone wants to track may be referred to as a following target in this embodiment, and the following target usually refers to a movable object.
  • following targets may include: people, animals, vehicles (air, ground, surface, underwater), and the like.
  • the person may be the user operating the drone itself, or someone other than the user; the vehicle may include: a motor vehicle, a non-motor vehicle, an unmanned aerial vehicle, a manned aerial vehicle, a watercraft, and the like.
  • the shooting target may also include a background target
  • the background target may include any object other than the following target.
  • Background objects often reflect the scene in which the following objects are located.
  • the background objects may include mountains, rivers, lakes, forests, seas, beaches, grasslands, various buildings, horizons, etc., and may also include sky, sun, moon, clouds, morning glow, sunset glow, and the like.
  • the captured image can be sent to the remote control through the communication link between the remote control and the drone body, the gimbal, or the photographing device, and the remote control displays the captured image on its display.
  • the photographing apparatus may send the photographed image to the remote controller in real time for the user to view, or may also store the photographed image locally in the photographing apparatus, and then send the image to the remote control for display.
  • the user may manually select a follow target in the image.
  • the user may select one or more objects displayed in the image as the following target through the input device.
  • the input device may be a button, a joystick, or a knob of the remote control; when the display of the remote control is a touch screen, the user can also select a follow target through the display.
  • the UAV will track the following target during the next flight.
  • the user may not need to manually select the follow target, but the follow target may be automatically selected by one of the drone, gimbal, camera, and remote control.
  • the remote controller may select one or more objects in the image as the following target according to a preset rule.
  • the preset rules can be: color, size, shape, etc. That is, when an object in the image conforms to a preset color, size, or shape, the object is identified as a following target.
  • a user confirmation step may also be added.
  • the following target automatically selected by the remote control is used as a candidate target and displayed on the display, and the user performs a confirmation operation through the input device. Only after the user confirms one or more candidate targets are those candidate targets used as following targets.
  • the position information of the shooting target can be obtained by various means, and the position information can be the position coordinates relative to a certain coordinate system.
  • the location information of the shooting target can be obtained through the three-dimensional reconstruction model.
  • the three-dimensional reconstruction model can reflect the correspondence between the pixels in the image of the photographing device and the spatial points corresponding to the pixels.
  • the 3D reconstruction model can take the following form:
  • Z_C * [u, v, 1]^T = A * [R T] * [X_W, Y_W, Z_W, 1]^T
  • (u, v) represents the coordinates of a pixel of the image in the pixel coordinate system;
  • (X_W, Y_W, Z_W) represents the coordinates of the spatial point corresponding to the pixel (u, v) in the UAV coordinate system;
  • Z_C represents the distance between the spatial point corresponding to the pixel (u, v) and the optical center of the camera;
  • A represents the internal parameter matrix of the camera;
  • [R T] represents the external parameter matrix of the camera.
  • the pixels corresponding to the following target in the image captured by the photographing device can be substituted into the above-mentioned three-dimensional reconstruction model, so as to obtain the coordinates of the following target in the UAV coordinate system.
  • Z C can be obtained in various ways.
  • the drone or the photographing device can be equipped with a distance sensor, and the distance between the following target and the optical center of the photographing device can be measured by the distance sensor; A and [R T] can be obtained by calibration.
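  • as an illustrative sketch only (the patent does not provide code), the back-projection implied by the model above can be computed as follows; it assumes Z_C is the depth of the point along the camera's optical axis, and the calibration values shown are placeholders:

```python
import numpy as np

def pixel_to_uav_frame(u, v, z_c, A, R, T):
    """Back-project pixel (u, v) with known depth z_c using
    z_c * [u, v, 1]^T = A [R T] [X_W, Y_W, Z_W, 1]^T,
    solved for the point's coordinates in the UAV coordinate system."""
    pixel_h = np.array([u, v, 1.0])
    x_cam = z_c * np.linalg.inv(A) @ pixel_h  # point in the camera frame
    return np.linalg.inv(R) @ (x_cam - T)     # point in the UAV frame

# Placeholder calibration values for illustration only.
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, T = np.eye(3), np.zeros(3)  # camera aligned with, and centered in, the UAV frame
print(pixel_to_uav_frame(350, 260, z_c=12.0, A=A, R=R, T=T))
```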
  • the position information obtained above is coordinates relative to the UAV coordinate system; of course, this embodiment is not limited thereto.
  • the UAV is usually equipped with a position sensor and an attitude sensor. Through the position sensor and the attitude sensor, the transformation relationship between the UAV coordinate system and the geodetic coordinate system can be obtained, so that the coordinates of the shooting target in the geodetic coordinate system can be obtained.
  • the position information of the background target can be obtained by means similar to the following target.
  • the drone, the camera or the remote control may store map data, and use the map data to determine the location information of the shooting target.
  • the map data records geographic information of various areas, and these areas include at least the activity area of the drone.
  • the geographic information may include: the type, name, location coordinates, etc. of each object in the activity area.
  • the position information of obstacles in the environment where the movable platform is located is obtained through S102.
  • the UAV can detect obstacles through image sensors, radar, etc., or directly obtain obstacles and their location information through remote devices (such as cloud servers).
  • various objects in the surrounding environment can be detected, and one of the drone, gimbal, photographing device, and remote control recognizes each object; if the recognition result shows that an object is an obstacle, the object is marked as an obstacle.
  • This embodiment does not limit the method for identifying obstacles, and various image recognition methods can be used to identify obstacles, such as but not limited to methods based on feature detection, feature matching, and methods based on artificial neural networks.
  • the process of acquiring the location information of the obstacle may be basically similar to the process of acquiring the location information of the photographing target in S101. That is, the position information of the obstacle can be obtained through the three-dimensional reconstruction model, and the position information of the obstacle can also be obtained through the query of the map data.
  • the obstacle may be, for example, an object with obvious features, such as a utility pole, street light, outdoor sign, outdoor billboard, and the like.
  • the obstacle may not be significantly different from the background target; some objects may be background targets in some cases and obstacles in others. Which they are may depend on the positions of the following target, the background target, and the obstacle in the image, or at least on their relative positions in the image.
  • take trees as an example: if there are many trees in the surrounding environment, it may indicate that the drone is in a forest environment, and the trees can then be considered background objects. Trees can be considered obstacles if only one or a few trees appear in the image and they are close to the drone.
  • as for buildings, if some buildings are far from the drone, or farther from the drone than the following target, these buildings can be considered background targets.
  • the moving path of the UAV around the obstacle can be determined through S103.
  • Determining the movement path of the UAV may include: establishing a loss function based at least in part on the position information of the photographed target and the position information of the obstacle, and minimizing the loss function to determine the movement path of the UAV around the obstacle.
  • the moving path of the UAV can be generated in real time during the target tracking process, and the moving path is usually composed of a series of waypoints, and the UAV flies along the series of waypoints.
  • the loss function is used to evaluate the cost of certain factors of a moving path, yielding a quantified cost value.
  • the loss function may include a first loss function as shown below:
  • Cost = cost_1(target, obstacle)
  • Cost represents the loss function
  • cost_1 represents the first loss function
  • target represents the location information of the shooting target
  • obstacle represents the location information of the obstacle.
  • if the connection line between a waypoint and the shooting target does not pass through the area where the obstacle is located, the obstacle does not appear between the shooting target and the photographing device, so the shooting target will not be blocked by the obstacle in the captured image; conversely, if the connection line between the waypoint and the shooting target passes through the area where the obstacle is located, the obstacle appears between the shooting target and the photographing device, and the shooting target will be blocked by the obstacle in the captured image. That is, the principle of the first loss function is to determine whether there is an obstacle between each waypoint and the shooting target.
  • the fewer waypoints whose connection line to the target passes through an obstacle, the smaller the cost value of the first loss function, so such a path is chosen as far as possible. That is to say, when the UAV flies along the moving path determined according to the first loss function, the photographed target in the captured image is blocked by obstacles as little as possible, or even not at all. As shown in Figure 3, in the dashed path on the left, a considerable number of the connection lines between the waypoints and the background target pass through the obstacle, while in the solid path on the right almost no waypoint has a connection line passing through the obstacle, so the first loss function value of the solid path is smaller.
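  • as a hedged sketch of this principle (the patent does not specify the computation), the first loss can be approximated by counting the waypoints whose line of sight to the target intersects an obstacle, modeled here as a sphere:

```python
import numpy as np

def segment_hits_sphere(p0, p1, center, radius):
    """True if the segment from p0 to p1 intersects a spherical obstacle."""
    d, f = p1 - p0, p0 - center
    a, b, c = d @ d, 2.0 * (f @ d), f @ f - radius ** 2
    if a == 0.0:                       # degenerate segment: a single point
        return c <= 0.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False                   # the line misses the sphere entirely
    t1 = (-b - disc ** 0.5) / (2.0 * a)
    t2 = (-b + disc ** 0.5) / (2.0 * a)
    return 0.0 <= t1 <= 1.0 or 0.0 <= t2 <= 1.0

def cost_1(waypoints, target, obstacles):
    """First loss: number of waypoints whose view of the target is blocked.
    `obstacles` is a list of (center, radius) spheres; points are np arrays."""
    return sum(
        any(segment_hits_sphere(wp, target, ctr, r) for ctr, r in obstacles)
        for wp in waypoints
    )
```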
  • the first loss function is a loss function that reflects the shooting effect.
  • Figure 3 takes the background target as an example, and the process is similar for the following target. That is, the present application can also keep the following target in the photographed image from being blocked by obstacles while the drone is flying, thereby ensuring the shooting effect.
  • when both the following target and the background target appear in the image, in some examples only the cost value of the following target or of the background target may be considered when determining the movement path; in other examples, the costs of both the following target and the background target need to be considered.
  • in the latter case, the following target and the background target can be regarded as the same type of target: the more waypoints whose connection line to a target passes through an obstacle, the greater the cost value of the first loss function, and such a path is avoided as far as possible; the fewer such waypoints, the smaller the cost value, and such a path is preferred.
  • the loss function may further include a second loss function as follows:
  • Cost = cost_2(distance)
  • cost_2 represents the second loss function
  • distance represents the distance between the path point and the obstacle in the moving path. The larger the distance between the waypoint and the obstacle, the smaller the value of the second loss function.
  • the second loss function is a loss function reflecting the safety of the path, and the safety of the path is used to characterize whether the distance between the path point and the obstacle satisfies the safety index.
  • the distance between the drone and the obstacle cannot be too close, so as to ensure that the drone does not collide with the obstacle and improve the flight safety of the drone.
  • according to the second loss function, the greater the distance between the waypoints of a path and the obstacle, the more likely the path is to be selected as the movement path of the UAV; conversely, the smaller that distance, the less likely the path is to be selected.
  • as shown in Fig. 4, the waypoints of the solid path are close to the obstacle; if the UAV flies along the solid path, it may collide with the obstacle. Compared with the solid path, the waypoints of the dashed path are farther from the obstacle; if the UAV flies along the dashed path, a collision with the obstacle is essentially impossible. Without considering other factors, or when the influence of other factors is basically the same, the dashed path can be used as the movement path of the UAV.
  • the loss function further includes a third loss function as follows:
  • Cost = cost_3(curv)
  • cost_3 represents the third loss function
  • curv represents the curvature of the moving path and/or the rate of change of the curvature, and the smaller the rate of change of the curvature and/or the curvature, the smaller the value of the third loss function.
  • the third loss function is a loss function reflecting the smoothness of the path; the smoothness of the path characterizes the degree of bending of the path, which can usually be represented by the curvature and/or the rate of change of the curvature of the path. If the path is less curved, the path smoothness is better; if the path is more curved, the path smoothness is poorer. As shown in Fig. 5, the dashed path has more bends than the solid path, so the smoothness of the solid path is better than that of the dashed path. Without considering other factors, or when the influence of other factors is basically the same, the solid path is more likely to be used as the movement path of the UAV. Through the third loss function, the planned movement path can be made smoother and the bending in the path minimized, which reduces the flight mileage of the UAV and the severity of UAV attitude changes, saving flight time and energy consumption.
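  • one illustrative way to quantify the curvature term, assumed here and not taken from the patent, is the discrete (Menger) curvature over each triple of consecutive waypoints:

```python
import numpy as np

def menger_curvature(p0, p1, p2):
    """Curvature of the circle through three consecutive 3D waypoints:
    kappa = 4 * triangle_area / (|p0-p1| * |p1-p2| * |p0-p2|)."""
    a = np.linalg.norm(p1 - p0)
    b = np.linalg.norm(p2 - p1)
    c = np.linalg.norm(p2 - p0)
    if a * b * c == 0.0:
        return 0.0                       # coincident points: treat as straight
    area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0))
    return 4.0 * area / (a * b * c)

def cost_3(waypoints):
    """Third loss: total discrete curvature along the path (smoother is smaller)."""
    return sum(menger_curvature(waypoints[i - 1], waypoints[i], waypoints[i + 1])
               for i in range(1, len(waypoints) - 1))
```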
  • the loss function further includes a fourth loss function as follows:
  • Cost = cost_4(range)
  • cost_4 represents the fourth loss function
  • range represents the distance between the path point on the moving path and the following target, and the closer the distance is to the preset tracking distance, the smaller the value of the fourth loss function.
  • the fourth loss function is a loss function that reflects the distance between the photographing device and the following target.
  • the distance between the photographing device and the following target is usually subject to certain constraints: it generally cannot be too close or too far, and should be within a preset tracking distance range; otherwise the size and/or position of the pixel area corresponding to the tracked target will be affected.
  • the preset tracking distance may be set by the user through the remote controller, or automatically set by at least one of the drone, the photographing device and the remote controller. Through the fourth loss function, the distance between the planned path point and the following target is as close as possible to the preset tracking distance, so as to ensure the shooting effect of target tracking.
  • the first, second, third and fourth loss functions are described above, but it does not mean that the loss function can only include one of these loss functions.
  • the loss function may comprise at least one of the first, second, third and fourth loss functions.
  • the loss function can look like this:
  • Cost = a*cost_1(target, obstacle) + b*cost_2(distance) + c*cost_3(curv) + d*cost_4(range)
  • a, b, c, d represent the weighting coefficients of the first, second, third and fourth loss functions, respectively.
  • the four weighting coefficients a, b, c, and d can be set separately. The higher the weighting coefficient of a loss function, the greater the proportion of the factor corresponding to that loss function when determining the moving path; the lower the weighting coefficient, the smaller that proportion.
  • the user may manually set the weighting coefficient through the remote controller, or at least one of the drone, the photographing device and the remote controller may automatically set the above-mentioned weighting coefficient.
  • the weighting coefficient a of the first loss function can be set larger, and the other weighting coefficients can be set smaller. In this way, when planning the moving path, priority is given to ensuring that the shooting target is not blocked by obstacles, and the other factors are considered on that basis.
  • the loss function may also be a weighted sum of the first loss function and any one or two of the second, third, and fourth loss functions.
  • for example, the loss function can be a weighted sum of all four loss functions, as in the formula above; a minimal sketch of this combination follows.
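  • a minimal sketch of the weighted combination, reusing cost_1 and cost_3 from the sketches above; cost_2 and cost_4 below are illustrative stand-ins for the patent's distance-based losses, and the candidate path with the minimum total cost is selected:

```python
import numpy as np

def cost_2(waypoints, obstacles):
    """Safety stand-in: grows as waypoints approach obstacle surfaces."""
    total = 0.0
    for wp in waypoints:
        clearance = min(np.linalg.norm(wp - ctr) - r for ctr, r in obstacles)
        total += 1.0 / max(clearance, 1e-3)
    return total

def cost_4(waypoints, target, preset_range):
    """Tracking-distance stand-in: deviation from the preset tracking distance."""
    return sum(abs(np.linalg.norm(wp - target) - preset_range) for wp in waypoints)

def total_cost(waypoints, target, obstacles, preset_range,
               a=1.0, b=1.0, c=1.0, d=1.0):
    """Cost = a*cost_1 + b*cost_2 + c*cost_3 + d*cost_4 (cost_1/cost_3 above)."""
    return (a * cost_1(waypoints, target, obstacles)
            + b * cost_2(waypoints, obstacles)
            + c * cost_3(waypoints)
            + d * cost_4(waypoints, target, preset_range))

# best = min(candidate_paths, key=lambda p: total_cost(p, target, obstacles, 10.0))
```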
  • the control method of this embodiment takes the relationship between the shooting target and the obstacle into consideration when determining the moving path, establishes a loss function based on whether the shooting target would be blocked by the obstacle, and plans the moving path accordingly. Therefore, when the UAV tracks the target along the moving path, the shooting target is prevented from being blocked by obstacles as much as possible, or even completely, thereby improving the shooting effect.
  • other factors, such as the safety and smoothness of the path and the distance between the waypoints and the following target, can also be considered when determining the moving path, so that the planned path performs well in all of these respects.
  • the control method of this embodiment further includes: determining the environment type of the environment where the drone is located, and determining the movement parameters of the drone according to the environment type.
  • since the following target is usually in a moving state, it may appear in various scenes, or switch between various scenes.
  • the environment in which the drone is located will also change accordingly. Background objects in the scene can reflect the type of environment the drone is in.
  • the environment types may include obstacle-dense types and obstacle-sparse types.
  • the "dense” and “sparse” are used to characterize the density of obstacles in the environment, and the degree of density will affect the flight state of the UAV to a certain extent. That is to say, in an environment with sparse obstacles and in an environment with dense obstacles, the UAV should adopt different flight states to ensure the shooting effect and the safety of the UAV.
  • the environment type of the environment in which the drone is located is determined according to the background target in the image captured by the photographing device.
  • background objects may include mountains, rivers, lakes, forests, seas, beaches, grasslands, various buildings, horizons, etc., and may also include sky, sun, moon, clouds, morning glow, sunset glow, and the like.
  • at least one of the type, position, number, size, and relative positional relationship of the background targets can be used to determine the type of environment in which the drone is located. For example, if the background targets are lakes, seas, beaches, grasslands, or the sky, the environment in which the drone is located can be considered obstacle-sparse; if the background target is a forest, factors such as the number and size of the trees need to be judged further.
  • when the number of trees in the image is limited and their size is large, it can be considered that the UAV is inside the forest and the environment is of the obstacle-dense type; when the number of trees in the image is large and their size is small, it can be considered that the UAV is outside the forest and the environment is of the obstacle-sparse type. For buildings, the way of judging the environment type is similar to the case of forests.
  • the environment type of the environment where the drone is located is determined according to the detection data of the detection device carried by the drone.
  • the detection device can be any of a variety of sensors carried by the UAV that perceive the external environment and produce detection data, such as, but not limited to, image sensors, distance sensors, and altitude sensors. By analyzing the detection data of the sensors, it can be determined whether the environment type is the obstacle-dense type or the obstacle-sparse type.
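  • as an illustrative heuristic only (the thresholds below are assumptions, not values from the patent), the environment type could be classified from the density of detected obstacles:

```python
import numpy as np

def classify_environment(obstacle_positions, uav_position,
                         radius=30.0, dense_count=5):
    """Label the environment 'obstacle_dense' when at least `dense_count`
    detected obstacles lie within `radius` meters of the UAV."""
    uav = np.asarray(uav_position)
    near = sum(1 for p in obstacle_positions
               if np.linalg.norm(np.asarray(p) - uav) < radius)
    return "obstacle_dense" if near >= dense_count else "obstacle_sparse"
```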
  • the flight state of the UAV is usually characterized by movement parameters.
  • the movement parameters of the UAV include the speed of the UAV. If the environment type is the obstacle-dense type, the speed of the UAV is determined to be a first speed; if the environment type is the obstacle-sparse type, the speed of the UAV is determined to be a second speed, and the second speed is greater than the first speed. That is to say, when the drone is in an environment with dense obstacles, such as forests or buildings, the flying speed of the drone should be low, so as to avoid collisions with obstacles, ensure the safety of the drone, maintain the stability of the photographing device, and improve the shooting effect.
  • conversely, in an environment with sparse obstacles, the flying speed of the UAV can be higher, which gives the UAV greater flight flexibility.
  • the movement parameters of the UAV also include the angular velocity of the photographing device. If the environment type is a dense obstacle type, the angular velocity of the photographing device is determined to be the first angular velocity; if the environment type is a sparse obstacle type, it is determined The angular velocity of the photographing device is the second angular velocity, and the second angular velocity is greater than the first angular velocity.
  • the angular velocity of the photographing device may include an angular velocity around at least one of a pan axis, a roll axis, and a pitch axis, and may reflect the attitude change rate of the photographing device.
  • likewise, in an obstacle-dense environment the angular velocity of the photographing device should be small, which ensures the stability of the photographing device and improves the shooting effect; in an obstacle-sparse environment the angular velocity of the photographing device can be larger, providing greater flexibility for the flight of the UAV.
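  • a minimal sketch of mapping the environment type to movement parameters; the numeric limits are placeholders, since the patent only requires that the second (sparse-environment) values exceed the first (dense-environment) ones:

```python
from dataclasses import dataclass

@dataclass
class MotionLimits:
    max_speed: float        # UAV speed, m/s
    max_gimbal_rate: float  # angular velocity of the photographing device, deg/s

LIMITS = {
    "obstacle_dense":  MotionLimits(max_speed=3.0,  max_gimbal_rate=30.0),   # first speed / first angular velocity
    "obstacle_sparse": MotionLimits(max_speed=12.0, max_gimbal_rate=90.0),   # second speed / second angular velocity
}

def movement_parameters(environment_type: str) -> MotionLimits:
    return LIMITS[environment_type]
```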
  • the control method of this embodiment further includes: adjusting the posture of the photographing device according to the first pixel area and/or the second pixel area.
  • the first pixel area is the pixel area corresponding to the following target in the image captured by the photographing device; the second pixel area is the pixel area corresponding to the background target in the image captured by the photographing device.
  • the photographing apparatus of this embodiment not only considers the position of the following target, but also takes the position of the background target into consideration.
  • the posture of the photographing device may be determined according to the position of the pixel area corresponding to the following target, the position of the pixel area corresponding to the background target, or the relative positions of the pixel areas corresponding to the following target and the background target. In some cases, the above process may be commonly referred to as composition.
  • the posture of the photographing device may be adjusted according to the relative positional relationship between the first pixel area and the second pixel area.
  • the relative positional relationship may refer to the distance between the first pixel area and the second pixel area, the pixel coordinate ranges of the two, the sizes of the two, and the like. If the relative positional relationship is the first positional relationship, the posture of the photographing device is adjusted based on the first posture adjustment strategy; if the relative positional relationship is the second positional relationship, the posture of the photographing device is adjusted based on the second posture adjustment strategy; the second posture adjustment strategy is different from the first posture adjustment strategy.
  • the relative positional relationship may include: whether the first pixel area and the second pixel area overlap.
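  • a hedged sketch of this dispatch, representing both pixel areas as axis-aligned bounding boxes (an assumed representation; the patent does not fix one):

```python
def boxes_overlap(a, b):
    """Boxes are (x_min, y_min, x_max, y_max) in pixel coordinates."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def box_inside(inner, outer):
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and inner[2] <= outer[2] and inner[3] <= outer[3])

def choose_strategy(first_area, second_area):
    if not boxes_overlap(first_area, second_area):
        return "first_strategy"   # center the image between the two areas
    if box_inside(first_area, second_area):
        return "second_strategy"  # place the following target at a designated position
    return "other_strategy"       # further cases described in the text
```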
  • the first posture adjustment strategy includes: positioning the center of the image between the first pixel area and the second pixel area.
  • the attitude adjustment amount of the camera can be calculated by a formula of the following form, where x(center) and y(center) denote the coordinates of the image center:
  • Error(yaw) = a*x(target) + b*x(background) - x(center); Error(pitch) = a*y(target) + b*y(background) - y(center)
  • pitch represents the pitch angle
  • yaw represents the heading angle
  • Error() represents the attitude adjustment amount of the shooting device
  • x(target), y(target) respectively represent the position coordinates of the following target in the predetermined coordinate system
  • x(background), y(background) respectively represents the position coordinates of the background target in the same predetermined coordinate system
  • a and b are weighting coefficients indicating the respective weights given to keeping the following target and the background target at the center of the picture; the image center computed in this way is located between the background target and the following target.
  • the weighting coefficients represent the relative proportions given to the positions of the following target and the background target.
  • the predetermined coordinate system may be a pixel coordinate system or an image coordinate system.
  • the user may manually set the weighting coefficient through the remote controller, or at least one of the drone, the photographing device and the remote controller may automatically set the above-mentioned weighting coefficient.
  • the weighting coefficient a may be set larger than the weighting coefficient b.
  • the weighting coefficient b may be set larger than the weighting coefficient a.
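  • since the exact formula is not reproduced here, the sketch below assumes the adjustment drives the weighted combination of the two pixel positions toward the image center, matching the stated outcome that the image center lies between the two targets; the conversion from pixel error to gimbal angles via the focal lengths fx, fy is an added assumption:

```python
import math

def composition_error(target_xy, background_xy, image_size, fx, fy, a=0.6, b=0.4):
    """First strategy sketch: drive the weighted mean of the following target
    and the background target toward the image center (a + b = 1 assumed)."""
    w, h = image_size
    x = a * target_xy[0] + b * background_xy[0]
    y = a * target_xy[1] + b * background_xy[1]
    yaw_error = math.atan2(x - w / 2.0, fx)    # rotation about the heading axis
    pitch_error = math.atan2(y - h / 2.0, fy)  # rotation about the pitch axis
    return yaw_error, pitch_error
```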
  • the image center is located between the first pixel area and the second pixel area, but the embodiment is not limited thereto.
  • Other designated positions of the image may also be located between the first pixel area and the second pixel area.
  • the other designated positions may be positions manually set by the user through the remote controller, or automatically set by at least one of the drone, the camera, and the remote controller.
  • the second posture adjustment strategy may include: positioning the first pixel area at a designated position in the image.
  • the second posture adjustment strategy may include: making the distance between the edge pixels of the second pixel area and the edge of the image smaller than a preset distance threshold.
  • the attitude adjustment amount of the photographing device can be calculated by a formula of the following form, where x(center) and y(center) denote the coordinates of the image center:
  • Error(yaw) = x(target) - x(center); Error(pitch) = y(target) - y(center)
  • the camera rotates the attitude adjustment amount obtained by the above formula in the heading and pitch directions respectively, and the first pixel area corresponding to the following target can be located at the center of the image.
  • the photographing device may have photographed part of the edge of the background object, and at this time, the second pixel area does not occupy the entire image.
  • the distance between the edge pixels of the second pixel area and the edge of the image can be made smaller than the preset distance threshold, that is, as many background targets as possible are made to appear in the image.
  • the attitude adjustment amount of the camera can be calculated by a further formula involving the image height and width, where:
  • h and w represent the height and width of the image, respectively, and the meanings of the remaining symbols are shown in formulas (1) and (2).
  • the background object does not occupy the entire image.
  • the attitude adjustment amount is calculated by the above formula, so that while the first pixel area corresponding to the following target is located in the image, the distance between the edge pixels of the second pixel area corresponding to the background target and the edge of the image is smaller than the preset distance threshold.
  • the preset distance threshold may be manually set and adjusted by a user through a remote control, or automatically set or adjusted by at least one of the drone, the photographing device and the remote control.
  • although the first positional relationship and the second positional relationship, as well as the corresponding first and second attitude adjustment strategies, are described above, this is only an exemplary illustration, and the embodiment is not limited thereto.
  • the relative positional relationship between the first pixel area and the second pixel area can take various forms, each corresponding to a different attitude adjustment strategy.
  • the posture of the photographing device may also be adjusted according to the shape of the second pixel area. If the shape of the second pixel area is of the first type, the posture of the photographing device is adjusted based on the third posture adjustment strategy; if the shape of the second pixel area is of the second type, the posture of the photographing device is adjusted based on the fourth posture adjustment strategy; the fourth posture adjustment strategy is different from the third posture adjustment strategy.
  • the first type is a line type
  • the third posture adjustment strategy includes: passing the second pixel area through a designated position in the image, and the first pixel area coincides with the designated position in at least one direction of the image.
  • for example, when the background target is the horizon, the corresponding second pixel area is a straight line or a curve.
  • the horizon can generally be the boundary between the sky and the sea, grassland, or desert.
  • making the second pixel area pass through the designated position in the image means that the camera rotates by the pitch angle adjustment amount about the pitch axis, so that the horizon passes through the center of the image; making the first pixel area coincide with the designated position in at least one direction of the image means that the camera rotates by the heading angle adjustment amount about the heading axis, so that the center of the following target coincides with the center of the image in the heading direction.
  • if the second pixel area is a curve, the midpoint of the curve along the horizontal direction of the image is made to pass through the center of the image.
  • in this way, the horizon is located at the vertical center of the image and the following target at the horizontal center, so the following target is presented in front of the horizon and the background on both sides, which conforms to aesthetic habits and improves the shooting effect of the photographing device.
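  • an illustrative rendering of this third strategy; the focal lengths fx, fy and the pixel-to-angle conversion are assumptions, not taken from the patent:

```python
import math

def horizon_composition_error(horizon_y, target_x, image_size, fx, fy):
    """Line-shaped background (e.g. the horizon): pitch until the horizon
    passes through the image center, yaw until the following target is
    centered horizontally."""
    w, h = image_size
    pitch_error = math.atan2(horizon_y - h / 2.0, fy)  # rotate about the pitch axis
    yaw_error = math.atan2(target_x - w / 2.0, fx)     # rotate about the heading axis
    return pitch_error, yaw_error
```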
  • the second type is an area (planar) shape;
  • the fourth posture adjustment strategy includes: if the first pixel area and the second pixel area do not overlap, adjusting the posture of the photographing device so that the center of the image is located between the first pixel area and the second pixel area; if the first pixel area is located within the second pixel area, adjusting the posture of the photographing device so that the first pixel area is located at a designated position in the image, and/or so that the distance between the edge pixels of the second pixel area and the edge of the image is smaller than the preset distance threshold.
  • for an area-shaped second pixel area, the attitude adjustment method can be determined according to whether the first pixel area and the second pixel area overlap. In some examples, when the first pixel area and the second pixel area do not overlap, adjusting the posture of the photographing device so that the center of the image is located between the first pixel area and the second pixel area is similar to the aforementioned handling of the non-overlapping case based on the relative positional relationship between the two areas.
  • when the first pixel area is located within the second pixel area, adjusting the posture of the photographing device so that the first pixel area is located at a designated position in the image, and/or so that the distance between the edge pixels of the second pixel area and the edge of the image is smaller than the preset distance threshold, is similar to the aforementioned handling of the case where the first pixel area is located within the second pixel area.
  • the pose of the camera may also be adjusted according to the type of background object. If the type of the background target is the first background type, the posture of the photographing device is adjusted based on the fifth posture adjustment strategy; if the type of the background target is the second background type, the posture of the photographing device is adjusted based on the sixth posture adjustment strategy.
  • the sixth posture adjustment strategy is different from the fifth posture adjustment strategy.
  • the first background type may be a surface type, and the fifth posture adjustment strategy may include: positioning the first pixel area in the upper half of the image and the second pixel area in the lower half of the image; the second background type may be a non-surface type, and the sixth posture adjustment strategy includes: positioning the first pixel area in the lower half of the image and the second pixel area in the upper half of the image.
  • the surface types may include background targets rooted in the ground, such as mountains, rivers, lakes, forests, seas, beaches, grasslands, and buildings; the non-surface types may include background targets such as the sky, sun, moon, and clouds.
  • for example, when the background type is the surface type, the following target may be located above the background target in some cases.
  • if the background targets are grasslands, seas, or lakes, and birds or drones are the following targets, these following targets usually move above the background targets; in this case, the first pixel area can be located in the upper half of the image and the second pixel area in the lower half of the image, which better matches the user's viewing habits and helps improve the shooting effect.
  • the present application can flexibly determine the positions of the following target and the background target in the image by adjusting the weighting coefficient, which brings more flexibility and convenience to the composition of the photographing device.
  • the above describes the process of determining the moving path of the movable platform around the obstacle according to the position information of the shooting target and the position information of the obstacle.
  • the obstacles mentioned above may be utility poles, street lights, outdoor signs, outdoor billboards, and in some cases background targets. Such obstacles can be regarded as obstacles in the usual sense, that is, solid obstacles. However, in some cases, considering only such obstacles when determining the moving path may not be enough, and may affect the target tracking performance of the UAV to a certain extent, especially the flight safety of the UAV and the stability of the photographing device.
  • therefore, the no-fly zone in the environment where the UAV is located can also be treated as an obstacle, and a movement path of the UAV that bypasses both the obstacle and the no-fly zone is determined according to the position information of the shooting target and the position information of the no-fly zone.
  • the no-fly zone can be regarded as a kind of virtual obstacle, which can be implemented in multiple ways.
  • the no-fly zone is determined by an electronic fence device.
  • electronic fence devices, also known as geo-fencing devices, are provided at a location and define one or more electronic fence boundaries that enclose the no-fly zone and control the movement of drones within the electronic fence boundaries.
  • the electronic fence device can be fixed at a certain location, or it can be easily movable and/or portable.
  • the electronic fence device can be any type of device, e.g., a visual identification device, an audio identification device, or a radio identification device.
  • the visual identification device can be a device that can be sensed by the optical sensor of the drone;
  • the audio identification device can be a device that can be sensed by the audio collection device of the drone;
  • the radio identification device can include a mobile terminal, a desktop computer, a portable computer, etc.; it can also be another drone or a docking station for a drone.
  • An electronic fence device is usually associated with one or more sets of flight controls, which are used to indicate the control content of a no-fly zone, such as prohibiting drones from flying, restricting drones from flying under certain conditions, and so on.
  • there may be different flight controls for different drones, different users operating the drones, and different electronic fence devices.
  • the location information of no-fly zones around the drone can be obtained by querying on the basis of the drone's own location information.
  • the UAV can store map data locally, and the map data includes the location information of the no-fly zones within various geographical ranges. After the UAV obtains its own position information through the position sensor, it can obtain the position information of the no-fly zone within the preset range around the UAV by querying the locally stored map data. Alternatively, the UAV may not store the map data locally, but obtain the map data through a remote device (for example, a cloud server), and query the obtained map data to obtain the information of the no-fly zone within the preset range around the UAV. location information.
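  • a hedged sketch of this lookup; the circular-zone map format and the search radius are assumptions for illustration:

```python
import numpy as np

def nearby_no_fly_zones(uav_xy, zones, search_radius=500.0):
    """Return the no-fly zones within `search_radius` meters of the drone.
    `zones` is assumed to be a list of (center_xy, radius) circles taken from
    locally stored map data or fetched from a remote (e.g. cloud) service."""
    uav = np.asarray(uav_xy)
    return [(c, r) for c, r in zones
            if np.linalg.norm(np.asarray(c) - uav) < search_radius + r]
```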
  • the manner of determining the movement path according to the position information of the photographed target and the position information of the no-fly zone is similar to the foregoing manner of determining the movement path according to the position information of the photographed target and the position information of obstacles.
  • the solid line with arrows indicates the movement path. If there is a no-fly zone around the UAV, the waypoints of the movement path are determined using the position information of the shooting target together with the position information of the no-fly zone, or of both the no-fly zone and the obstacle. Because the no-fly zone is taken into account, the area between the no-fly zone and the obstacle can be regarded as a passable area, and the planned movement path passes through this passable area without entering the no-fly zone.
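  • as an illustrative extension of the earlier loss sketches, the no-fly zone can enter the path cost as a virtual obstacle with a prohibitive penalty:

```python
import numpy as np

def cost_no_fly(waypoints, zones):
    """Hard penalty for any waypoint inside a no-fly zone, modeled as a
    circle (center_xy, radius) on the horizontal plane; waypoints are np arrays."""
    penalty = 0.0
    for wp in waypoints:
        for center, radius in zones:
            if np.linalg.norm(wp[:2] - np.asarray(center)) < radius:
                penalty += 1e6   # effectively forbids paths entering the zone
    return penalty

# Added to the weighted loss from before, e.g.:
# Cost = a*cost_1 + b*cost_2 + c*cost_3 + d*cost_4 + cost_no_fly(waypoints, zones)
```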
  • when determining the moving path for target tracking, the traditional method usually considers only the influence of obstacles and ignores the influence of the no-fly zone, so the UAV often accidentally enters the no-fly zone when flying along the moving path, causing the drone to violate the relevant flight controls and bringing a series of safety problems. In some cases, corresponding actions are taken when the drone enters a no-fly zone or comes very close to one; these actions generally include an immediate landing or a sudden change of flight direction, and flight parameters such as the speed and attitude of the drone then change drastically, which is very detrimental to the smoothness of the drone's flight and the stability of the photographing device.
  • in this embodiment, the moving path is determined from the location information of the photographed target and the location information of the no-fly zone, so that when the drone flies along the moving path and there is a no-fly zone nearby, the drone can treat it in the same way as an obstacle:
  • it decelerates in advance or changes its flight direction in advance to bypass the no-fly zone, thereby avoiding entering the no-fly zone and improving flight safety.
  • moreover, the flight parameters change more smoothly, which improves the smoothness of the drone's flight and the stability of the photographing device.
  • Another embodiment of the present disclosure provides a control method for a movable platform, where the movable platform includes a photographing device. As shown in FIG. 11 , the control method includes:
  • S1101: Acquire a first pixel area corresponding to a following target and a second pixel area corresponding to a background target in the image captured by the photographing device;
  • S1102: Adjust the posture of the photographing device according to the relative positional relationship between the first pixel area and the second pixel area.
  • the movable platform of this embodiment is similar to the movable platform of the previous embodiment.
  • the control method of this embodiment will be described below by taking a movable platform such as an unmanned aerial vehicle as an example.
  • the posture of the photographing device may be determined according to the relative positions of the first pixel area corresponding to the following target and the second pixel area corresponding to the background target. Part of the steps, operations, and flow of the control method in this embodiment may be similar to the corresponding parts in the previous embodiment.
  • the relative positional relationship may refer to the distance between the first pixel area and the second pixel area, the pixel coordinate range of the two, the size of the two, and the like.
  • in S1102, the posture of the photographing device is adjusted according to the relative positional relationship between the first pixel area and the second pixel area:
  • if the relative positional relationship is the first positional relationship, the posture of the photographing device is adjusted based on the first posture adjustment strategy;
  • if the relative positional relationship is the second positional relationship, the posture of the photographing device is adjusted based on the second posture adjustment strategy; the second posture adjustment strategy is different from the first posture adjustment strategy.
  • the relative positional relationship may include: whether the first pixel area and the second pixel area overlap.
  • the first posture adjustment strategy includes: positioning the center of the image between the first pixel area and the second pixel area.
  • the attitude adjustment amount of the camera can be calculated by a formula of the following form, where x(center) and y(center) denote the coordinates of the image center:
  • Error(yaw) = a*x(target) + b*x(background) - x(center); Error(pitch) = a*y(target) + b*y(background) - y(center)
  • pitch represents the pitch angle
  • yaw represents the heading angle
  • Error() represents the attitude adjustment amount of the shooting device
  • x(target), y(target) respectively represent the position coordinates of the following target in the predetermined coordinate system
  • x(background), y(background) respectively represents the position coordinates of the background target in the same predetermined coordinate system
  • a and b are weighting coefficients indicating the respective weights given to keeping the following target and the background target at the center of the picture; the image center computed in this way is located between the background target and the following target.
  • the weighting coefficient represents the proportion of the position of the following target and the background target.
  • the predetermined coordinate system may be a pixel coordinate system or an image coordinate system.
  • the user may manually set the weighting coefficient through the remote controller, or at least one of the drone, the photographing device and the remote controller may automatically set the above-mentioned weighting coefficient.
• In some cases, the weighting coefficient a may be set larger than the weighting coefficient b; in other cases, the weighting coefficient b may be set larger than the weighting coefficient a.
• In the above example, the image center is located between the first pixel area and the second pixel area, but the embodiment is not limited thereto; another designated position of the image may instead be located between the first pixel area and the second pixel area.
  • the other designated positions may be positions manually set by the user through the remote controller, or automatically set by at least one of the drone, the camera, and the remote controller.
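• A minimal numeric sketch of formulas (1) and (2), assuming pixel coordinates with a fixed image-center reference and normalized weights a + b = 1; the function name, the default weight values, and the sign convention for the error are assumptions, not taken from the original.

```python
def weighted_center_error(target_xy, background_xy, image_center_xy, a=0.6, b=0.4):
    """Attitude adjustment per formulas (1) and (2): drive the image center
    toward the point between the following target and the background target
    weighted by a and b (assumed here to satisfy a + b = 1)."""
    x_t, y_t = target_xy
    x_b, y_b = background_xy
    x_c, y_c = image_center_xy
    error_yaw = a * x_t + b * x_b - x_c      # formula (1)
    error_pitch = a * y_t + b * y_b - y_c    # formula (2)
    return error_yaw, error_pitch

# Example: target at (800, 300), background at (200, 500) in a 1280x720 image.
print(weighted_center_error((800, 300), (200, 500), (640, 360)))
```
• Driving both error components to zero through the gimbal's heading and pitch control then places the image center (or another designated position, if the reference point is changed) between the following target and the background target.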
  • the second posture adjustment strategy may include: positioning the first pixel area at a designated position in the image.
  • the second posture adjustment strategy may include: making the distance between the edge pixels of the second pixel area and the edge of the image smaller than a preset distance threshold.
• Under the second posture adjustment strategy, the attitude adjustment amount of the photographing device can be calculated by the following formulas:
• Error(yaw) = x(target) − x(center)    (3)
• Error(pitch) = y(target) − y(center)    (4)
• When the photographing device rotates by the attitude adjustment amounts obtained from the above formulas in the heading and pitch directions respectively, the first pixel area corresponding to the following target is located at the center of the image.
• In some cases, the photographing device may have captured only part of the edge of the background object, and at this time the second pixel area does not occupy the entire image.
• The distance between the edge pixels of the second pixel area and the edge of the image can then be made smaller than the preset distance threshold, that is, as much of the background target as possible is made to appear in the image.
• In that case, the attitude adjustment amount of the photographing device can be calculated by the following formulas:
• Error(yaw) = x(target) − w    (5)
• Error(pitch) = y(target) − h    (6)
• where h and w represent the height and width of the image respectively, and the meanings of the remaining symbols are as in formulas (1) and (2).
• For example, the second pixel area corresponding to the background object is roughly triangular in shape, is located in the lower-left area of the image, and does not occupy the entire image.
• The following target is above and to the right of the background target in the image.
• The attitude adjustment amount is calculated by the above formulas, so that the photographing device rotates as far as possible to the left in the heading direction and to the right in the pitch direction; the first pixel area corresponding to the following target is then located in the upper-right corner of the image, while the distance between the edge pixels of the second pixel area corresponding to the background target and the edge of the image is less than the preset distance threshold.
• The edge pixels of the second pixel area may be the pixels on the hypotenuse of the triangular second pixel area.
• The edges of the image can be the top and right edges of the image, which allows as much of the background target as possible to appear in the image.
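• A hedged sketch of the second posture adjustment strategy under the same assumed coordinate convention: centering the following target follows formulas (3) and (4), while the corner-placement variant follows the reconstruction of formulas (5) and (6) above; all names and the corner convention are assumptions.

```python
def center_target_error(target_xy, image_center_xy):
    # Formulas (3) and (4): the error vanishes when the following target
    # sits exactly at the image center (or another designated position).
    x_t, y_t = target_xy
    x_c, y_c = image_center_xy
    return x_t - x_c, y_t - y_c

def corner_target_error(target_xy, corner_xy):
    # Reconstruction of formulas (5) and (6): the error vanishes when the
    # following target sits at the chosen image corner, e.g. (w, h) for the
    # upper-right corner when the origin is at the lower-left of the image.
    x_t, y_t = target_xy
    x_c, y_c = corner_xy
    return x_t - x_c, y_t - y_c
```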
  • the preset distance threshold may be manually set and adjusted by a user through a remote control, or automatically set or adjusted by at least one of the drone, the photographing device and the remote control.
• Although the first positional relationship and the second positional relationship, together with the corresponding first posture adjustment strategy and second posture adjustment strategy, are described above, this is only an exemplary illustration, and the embodiment is not limited thereto.
• The relative positional relationship between the first pixel area and the second pixel area can take various forms, each corresponding to a different posture adjustment strategy.
• In some embodiments, if the shape of the second pixel area is of a second type, the posture of the photographing device is adjusted according to the positions of the first pixel area and the second pixel area; the second type may be a line type.
• In this case, the step of adjusting the posture of the photographing device according to the positions of the first pixel area and the second pixel area includes: causing the second pixel area to pass through a designated position in the image, and causing the first pixel area to coincide with the designated position in at least one direction of the image.
• When the background target is the horizon, the corresponding second pixel area is a straight line or a curve; the horizon can usually be the boundary between the sky and the sea, grassland, or desert.
• Here, making the second pixel area pass through the designated position in the image means that the photographing device rotates by the pitch-angle adjustment amount about the pitch axis so that the horizon passes through the center of the image; making the first pixel area coincide with the designated position in at least one direction of the image means that the photographing device rotates by the heading-angle adjustment amount about the heading axis so that the center of the following target coincides with the center of the image in the heading direction.
• If the second pixel area is a curve, the midpoint of the curve along the horizontal direction of the image is made to pass through the center of the image.
• In this way the horizon is located at the vertical center of the image and the following target is located at the horizontal center of the image, so the following target is presented against the horizon and the background on both sides, which conforms to aesthetic habits and improves the shooting effect of the photographing device.
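• For the line-type (horizon) case, the two decoupled adjustments described above can be sketched as follows, assuming pixel coordinates with the origin at the top-left of the image; `horizon_midpoint_y` stands for the vertical coordinate at which the detected horizon (or, for a curved horizon, its midpoint along the horizontal direction of the image) crosses the frame, and the function name is hypothetical.

```python
def horizon_composition_error(target_x, horizon_midpoint_y, image_w, image_h):
    """Pitch drives the horizon to the vertical center of the image;
    yaw drives the following target to the horizontal center."""
    error_pitch = horizon_midpoint_y - image_h / 2.0
    error_yaw = target_x - image_w / 2.0
    return error_yaw, error_pitch
```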
  • Yet another embodiment of the present disclosure provides a control method for a movable platform, where the movable platform includes a photographing device. As shown in FIG. 12 , the control method includes:
• S1201: Acquire position information of obstacles in the environment where the movable platform is located;
• S1202: Acquire position information of a no-fly zone in the environment where the movable platform is located;
• S1203: Determine, according to the position information of the obstacles and the position information of the no-fly zone, a moving path along which the movable platform circumvents the obstacles and the no-fly zone.
  • the movable platform of this embodiment is similar to the movable platform of the above-mentioned embodiment.
  • the control method of this embodiment will be described below by taking a movable platform such as an unmanned aerial vehicle as an example.
• As described in the foregoing embodiments, the moving path along which the movable platform circumvents an obstacle can be determined according to the position information of the obstacle.
• The obstacles can be utility poles, street lights, outdoor signs, outdoor billboards, and, in some cases, background targets. However, considering only obstacles when determining the moving path may not be enough and may affect the flight safety of the UAV to a certain extent.
• Therefore, in this embodiment the no-fly zone in the environment where the UAV is located is also used as a factor for determining the moving path: the moving path along which the UAV bypasses both the obstacle and the no-fly zone is determined according to the position information of the obstacle and the position information of the no-fly zone.
  • the no-fly zone may be implemented in various manners.
  • the no-fly zone is determined by an electronic fence device.
• Electronic fence devices, also known as geo-fencing devices, are provided at a location and define one or more electronic fence boundaries that enclose the no-fly zone and control the movement of drones relative to the electronic fence boundaries.
  • the electronic fence device can be fixed at a certain location, or it can be easily movable and/or portable.
• The electronic fence device can be any type of device, e.g., a visual identification device, an audio identification device, or a radio identification device.
• The visual identification device can be a device that can be sensed by an optical sensor of the drone; the audio identification device can be a device that can be sensed by an audio collection device of the drone; and the radio identification device can include a mobile terminal, a desktop computer, a portable computer, etc., and can also be another drone or a docking station for a drone.
• An electronic fence device is usually associated with one or more sets of flight regulations, which indicate the control content of the no-fly zone, such as prohibiting drones from flying or restricting drones from flying under certain conditions.
• Different drones, different users operating drones, and different electronic fence devices may be subject to different flight regulations.
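• As one hedged illustration of how a waypoint might be tested against an electronic fence boundary, the standard ray-casting point-in-polygon test below treats the boundary as a two-dimensional polygon of (x, y) vertices; the patent does not specify the boundary representation, so the polygon form is an assumption.

```python
def inside_fence(point, boundary):
    """Return True if `point` lies inside the polygon `boundary`,
    given as a list of (x, y) vertices in order (ray-casting test)."""
    x, y = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        # Count crossings of a horizontal ray cast rightward from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```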
• In some embodiments, the position information of the no-fly zones around the drone can be obtained by searching based on the drone's own position information.
• That is, the position of the no-fly zone can be made available to the drone itself.
• For example, the UAV can store map data locally, the map data including the position information of no-fly zones within various geographic ranges. After the UAV obtains its own position information through a position sensor, it can obtain the position information of the no-fly zones within a preset range around itself by querying the locally stored map data. Alternatively, the UAV may not store the map data locally but instead obtain the map data from a remote device (for example, a cloud server) and query the obtained map data for the position information of the no-fly zones within the preset range around the UAV.
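• A minimal sketch of the local-map query described above; the record layout of a no-fly zone (a `center` point and a `radius_m`) and the planar distance helper are assumptions introduced only for illustration.

```python
import math

def distance_m(p, q):
    # Euclidean distance in a local planar frame, in meters.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def nearby_no_fly_zones(uav_position, map_data, search_radius_m=2000.0):
    """Return the no-fly zones from locally stored (or remotely fetched)
    map data whose boundary falls within a preset range around the UAV."""
    return [zone for zone in map_data
            if distance_m(uav_position, zone["center"]) - zone["radius_m"]
               <= search_radius_m]
```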
• Determining the moving path of the UAV may include: establishing a loss function based at least in part on the position information of the obstacle and the position information of the no-fly zone, and minimizing the loss function to determine the moving path along which the UAV circumvents the obstacle and the no-fly zone.
  • the moving path of the UAV can be generated in real time during the target tracking process, and the moving path is usually composed of a series of waypoints, and the UAV flies along the series of waypoints.
• The loss function is used to evaluate the cost of certain factors of a moving path, yielding a quantified cost value.
  • the loss function may include a fifth loss function as shown below:
• Cost = cost_5(obstacle, no_fly_zone)    (7)
• where Cost represents the loss function, cost_5 represents the fifth loss function, obstacle represents the distance between a waypoint in the moving path and the obstacle, and no_fly_zone represents the distance between a waypoint in the moving path and the no-fly zone.
• The fifth loss function reflects the safety of the path, where the safety of the path characterizes whether the distance between a waypoint and the obstacle or the no-fly zone satisfies a safety index.
• The smaller the value of the fifth loss function, the more likely the path is to be selected as the moving path of the UAV; conversely, the larger the value, the less likely the path is to be selected.
• As shown in FIG. 13, the solid line with arrows indicates the moving path of the drone. If there are both a no-fly zone and obstacles around the UAV, the waypoints of the moving path are determined based on the position information of both the obstacles and the no-fly zone when the path is planned. Because the no-fly zone is taken into account, the area between the no-fly zone and the obstacles can be regarded as a passable area, and the planned moving path passes through this passable area without entering the no-fly zone.
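• The patent states only the monotonic behavior of the fifth loss function, not its exact form, so the reciprocal-clearance cost below is an assumption; the sketch shows how candidate paths could be scored and the safest one selected by minimizing the summed waypoint cost.

```python
def cost_5(dist_to_obstacle, dist_to_no_fly_zone, eps=1e-6):
    # Larger clearance to both the obstacle and the no-fly zone yields a
    # smaller cost, matching the stated behavior of formula (7).
    return 1.0 / (dist_to_obstacle + eps) + 1.0 / (dist_to_no_fly_zone + eps)

def safest_path(candidate_paths, dist_obstacle, dist_no_fly):
    """Pick the candidate path whose summed waypoint cost is smallest.
    `dist_obstacle` and `dist_no_fly` map a waypoint to its clearance."""
    return min(candidate_paths,
               key=lambda path: sum(cost_5(dist_obstacle(p), dist_no_fly(p))
                                    for p in path))
```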
• When determining the moving path for target tracking, the traditional method usually considers only the influence of obstacles and ignores the no-fly zone, so a UAV flying along the planned path often strays into the no-fly zone, which causes the operator and the aircraft to violate the applicable flight regulations and brings a series of safety problems. In some cases, corrective actions are taken when the drone enters a no-fly zone or comes very close to one; these actions generally include an immediate landing or a sudden change of the drone's flight direction, so flight parameters such as speed and attitude change drastically, which is very detrimental to the smoothness of the drone's flight and the stability of the photographing device.
• In this embodiment, by contrast, the moving path can be determined according to the position information of the obstacle and the position information of the no-fly zone, so that when the drone flies along the moving path and a no-fly zone lies nearby, the drone can treat it in the same way as an obstacle: it decelerates in advance or changes its flight direction in advance to bypass the no-fly zone, thereby avoiding entry into the no-fly zone and improving flight safety.
• The change of flight parameters is accordingly smoother, which improves the smoothness of the drone's flight and the stability of the photographing device.
  • Yet another embodiment of the present disclosure further provides a control device for a movable platform.
  • the movable platform includes a photographing device.
  • the control device includes:
• a memory storing executable instructions; and a processor configured to execute the executable instructions to perform the following operations: acquiring position information of a photographing target of the photographing device; acquiring position information of an obstacle in the environment where the movable platform is located; and determining, based at least in part on the position information of the photographing target and the position information of the obstacle, a moving path along which the movable platform circumvents the obstacle.
  • the control device of this embodiment can basically perform various operations corresponding to the steps of the control method of the above-mentioned embodiment.
  • the processor is further configured to perform the following operations: establish a loss function based at least in part on the location information of the photographed target and the location information of the obstacle; minimize the loss function to determine the possible The mobile platform circumvents the movement path of the obstacle.
• In some embodiments, the loss function includes a first loss function: the fewer the waypoints on the moving path whose connecting line to the shooting target passes through the area where the obstacle is located, the smaller the value of the first loss function.
  • the loss function includes a second loss function, and the greater the distance between the path point on the moving path and the obstacle, the smaller the value of the second loss function.
  • the loss function includes a third loss function, and the smaller the curvature and/or the rate of change of the curvature of the moving path, the smaller the value of the third loss function.
• In some embodiments, the shooting target includes a following target, and the loss function includes a fourth loss function.
  • the processor is further configured to perform the following operations: determine an environment type of the environment in which the movable platform is located; and determine movement parameters of the movable platform according to the environment type.
  • the environment types include obstacle-dense types and obstacle-sparse types.
  • the processor is further configured to perform the following operation: determine an environment type of the environment in which the movable platform is located according to a background object in the image captured by the photographing device.
  • the processor is further configured to perform the following operation: according to the detection data of the detection device carried on the movable platform, determine the environment type of the environment where the movable platform is located.
• the processor is further configured to perform the following operations: if the environment type is the obstacle-dense type, determining the speed of the movable platform to be a first speed; if the environment type is the obstacle-sparse type, determining the speed of the movable platform to be a second speed, the second speed being greater than the first speed.
  • the processor is further configured to perform the following operations: if the environment type is a type with dense obstacles, determine the angular velocity of the photographing device to be the first angular velocity; if the environment type is a type with sparse obstacles , then it is determined that the angular velocity of the photographing device is a second angular velocity, and the second angular velocity is greater than the first angular velocity.
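• A compact sketch of the environment-type dispatch described in the two preceding operations; the concrete speed and angular-velocity values are placeholders, not values from the original.

```python
def movement_parameters(environment_type):
    # Obstacle-dense environments get the smaller first speed and first
    # angular velocity; obstacle-sparse environments get the larger second
    # speed and second angular velocity.
    if environment_type == "obstacle_dense":
        return {"speed_m_s": 2.0, "angular_velocity_deg_s": 30.0}
    return {"speed_m_s": 8.0, "angular_velocity_deg_s": 90.0}
```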
  • the shooting target includes a following target and a background target.
• the processor is further configured to perform the following operations: adjusting the posture of the photographing device according to the first pixel area and/or the second pixel area; wherein the first pixel area is the pixel area corresponding to the following target in the image captured by the photographing device, and the second pixel area is the pixel area corresponding to the background target in the image captured by the photographing device.
  • the processor is further configured to perform the following operation: adjust the posture of the photographing device according to the relative positional relationship between the first pixel area and the second pixel area.
  • the processor is further configured to perform the following operations: if the relative positional relationship is a first positional relationship, adjust the posture of the photographing device based on a first posture adjustment strategy; if the relative positional relationship is For the second positional relationship, the posture of the photographing device is adjusted based on a second posture adjustment strategy; the second posture adjustment strategy is different from the first posture adjustment strategy.
• In some embodiments, the first positional relationship is that the first pixel area and the second pixel area do not coincide, and the first posture adjustment strategy includes: positioning the center of the image between the first pixel area and the second pixel area. The second positional relationship is that the first pixel area is located within the second pixel area, and the second posture adjustment strategy includes: positioning the first pixel area at a specified position in the image; and/or making the distance between the edge pixels of the second pixel area and the edge of the image smaller than a preset distance threshold.
  • the processor is further configured to perform the following operation: adjust the posture of the photographing device according to the shape of the second pixel area.
• the processor is further configured to perform the following operations: if the shape of the second pixel area is of a first type, adjusting the posture of the photographing device based on a third posture adjustment strategy; if the shape of the second pixel area is of a second type, adjusting the posture of the photographing device based on a fourth posture adjustment strategy, the fourth posture adjustment strategy being different from the third posture adjustment strategy.
• the first type is a line type, and the third posture adjustment strategy includes: causing the second pixel area to pass through a designated position in the image, with the first pixel area coinciding with the designated position in at least one direction of the image.
• the second type is a face type, and the fourth posture adjustment strategy includes: if the first pixel area and the second pixel area do not overlap, adjusting the posture of the photographing device so that the center of the image is located between the first pixel area and the second pixel area; if the first pixel area is located within the second pixel area, adjusting the posture of the photographing device so that the first pixel area is located at a specified position in the image and/or the distance between the edge pixels of the second pixel area and the edge of the image is smaller than a preset distance threshold.
  • a no-fly zone in the environment in which the movable platform is located is used as the obstacle.
• the processor is further configured to perform the following operation: determining, according to the position information of the photographing target and the position information of the no-fly zone, a moving path along which the movable platform circumvents the obstacle and the no-fly zone.
  • Another embodiment of the present disclosure further provides a control device for a movable platform, where the movable platform includes a photographing device.
  • the control device includes:
• a memory storing executable instructions; and a processor configured to execute the executable instructions to perform the following operations: acquiring a first pixel area corresponding to a following target and a second pixel area corresponding to a background target in the image captured by the photographing device; and adjusting the posture of the photographing device according to the relative positional relationship between the first pixel area and the second pixel area.
  • the control device of this embodiment can basically perform various operations corresponding to the steps of the control method of the above-mentioned embodiment.
• the processor is further configured to perform the following operation: if the shape of the second pixel area is of a first type, performing the step of adjusting the posture of the photographing device according to the relative positional relationship between the first pixel area and the second pixel area.
  • the first type includes: face type.
  • the processor is further configured to perform the following operations: if the relative positional relationship is a first positional relationship, adjust the posture of the photographing device based on a first posture adjustment strategy; if the relative positional relationship is For the second positional relationship, the posture of the photographing device is adjusted based on a second posture adjustment strategy; the second posture adjustment strategy is different from the first posture adjustment strategy.
• In some embodiments, the first positional relationship is that the first pixel area and the second pixel area do not coincide, and the first posture adjustment strategy includes: positioning the center of the image between the first pixel area and the second pixel area. The second positional relationship is that the first pixel area is located within the second pixel area, and the second posture adjustment strategy includes: positioning the first pixel area at a specified position in the image; and/or making the distance between the edge pixels of the second pixel area and the edge of the image smaller than a preset distance threshold.
• In some embodiments, if the shape of the second pixel area is of a second type, the posture of the photographing device is adjusted according to the positions of the first pixel area and the second pixel area; the second type includes: line type.
• the processor is further configured to: cause the second pixel area to pass through a designated position in the image, with the first pixel area coinciding with the designated position in at least one direction of the image.
  • Another embodiment of the present disclosure further provides a control device for a movable platform, where the movable platform includes a photographing device.
  • the control device includes:
• a memory storing executable instructions; and
• a processor for executing the executable instructions stored in the memory to perform the following operations: acquiring position information of obstacles in the environment where the movable platform is located; acquiring position information of a no-fly zone in the environment where the movable platform is located; and determining, according to the position information of the obstacles and the position information of the no-fly zone, a moving path along which the movable platform circumvents the obstacles and the no-fly zone.
  • the control device of this embodiment can basically perform various operations corresponding to the steps of the control method of the above-mentioned embodiment.
• the processor is further configured to: establish a loss function based at least in part on the position information of the no-fly zone and the position information of the obstacle; and minimize the loss function to determine the moving path along which the movable platform circumvents the obstacle and the no-fly zone.
• the loss function includes a fifth loss function, and the greater the distance between a waypoint on the moving path and the obstacle and the no-fly zone, the smaller the value of the fifth loss function.
  • the loss function includes a second loss function, and the smaller the curvature and/or the rate of change of the curvature of the moving path, the smaller the value of the second loss function.
• the no-fly zone is determined by an electronic fence and/or flight regulation information.
• the processor is further configured to obtain the position information of the no-fly zone from at least one of a local storage of the movable platform and a remote device that can communicate with the movable platform.
• the processor is further configured to measure the position information of the obstacle via sensors of the movable platform, and/or to obtain the position information of the obstacle from a remote device that can communicate with the movable platform.
• Still another embodiment of the present disclosure provides a computer-readable storage medium storing executable instructions that, when executed by one or more processors, cause the one or more processors to execute the control methods described in the above embodiments.
  • a computer-readable storage medium can be any medium that can contain, store, communicate, propagate, or transmit instructions.
  • a readable storage medium may include, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus, device, or propagation medium.
• Specific examples of readable storage media include: magnetic storage devices, such as magnetic tapes or hard disks (HDDs); optical storage devices, such as compact discs (CD-ROMs); memories, such as random access memory (RAM) or flash memory; and/or a wired/wireless communication link.
  • Still another embodiment of the present disclosure further provides a movable platform, as shown in FIG. 15 , including: a photographing device, and a movable carrier; the movable carrier includes: the control device of the above-mentioned embodiment.
  • the movable carrier includes a drone, an unmanned vehicle, an unmanned boat, or a robot.

Abstract

The present invention relates to a control method and device for a movable platform, a computer-readable storage medium, and the movable platform. The movable platform comprises a photographing device. The control method comprises: obtaining position information of a photographing target of the photographing device; obtaining position information of an obstacle in the environment in which the movable platform is located; and determining, at least in part according to the position information of the photographing target and the position information of the obstacle, a moving path along which the movable platform bypasses the obstacle.
PCT/CN2020/107825 2020-08-07 2020-08-07 Procédé et dispositif de commande pour une plate-forme mobile, et support de stockage lisible par ordinateur WO2022027596A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/107825 WO2022027596A1 (fr) 2020-08-07 2020-08-07 Procédé et dispositif de commande pour une plate-forme mobile, et support de stockage lisible par ordinateur
CN202080035658.XA CN113906360A (zh) 2020-08-07 2020-08-07 可移动平台的控制方法、装置、计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/107825 WO2022027596A1 (fr) 2020-08-07 2020-08-07 Procédé et dispositif de commande pour une plate-forme mobile, et support de stockage lisible par ordinateur

Publications (1)

Publication Number Publication Date
WO2022027596A1 true WO2022027596A1 (fr) 2022-02-10

Family

ID=79186965

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/107825 WO2022027596A1 (fr) 2020-08-07 2020-08-07 Procédé et dispositif de commande pour une plate-forme mobile, et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN113906360A (fr)
WO (1) WO2022027596A1 (fr)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115291607B (zh) * 2022-08-02 2023-03-14 柳州城市职业学院 复杂未知水域内无人驾驶船路径自主规划生成方法及系统
CN115755981A (zh) * 2022-12-12 2023-03-07 浙江大学 一种交互式的无人机自主航拍方法及装置


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106741028B (zh) * 2016-12-05 2018-11-02 四川西部动力机器人科技有限公司 一种机场智能行李车
CN108496134A (zh) * 2017-05-31 2018-09-04 深圳市大疆创新科技有限公司 无人机返航路径规划方法及装置
CN108702448B (zh) * 2017-09-27 2021-04-09 深圳市大疆创新科技有限公司 无人机图像采集方法及无人机、计算机可读存储介质
CN107992052B (zh) * 2017-12-27 2020-10-16 纳恩博(北京)科技有限公司 目标跟踪方法及装置、移动设备及存储介质
CN108519773B (zh) * 2018-03-07 2020-01-14 西安交通大学 一种结构化环境下无人驾驶车辆的路径规划方法
CN108931991A (zh) * 2018-08-30 2018-12-04 王瑾琨 移动载体自动跟随方法及具有自动跟随避障功能移动载体
CN109343528A (zh) * 2018-10-30 2019-02-15 杭州电子科技大学 一种节能的无人机路径规划避障方法
CN109739267A (zh) * 2018-12-21 2019-05-10 北京智行者科技有限公司 跟随路径的确定方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016041110A1 (fr) * 2014-09-15 2016-03-24 深圳市大疆创新科技有限公司 Procédé de commande de vol des aéronefs et dispositif associé
CN106325290A (zh) * 2016-09-30 2017-01-11 北京奇虎科技有限公司 一种基于无人机的监控系统及设备
WO2018072063A1 (fr) * 2016-10-17 2018-04-26 深圳市大疆创新科技有限公司 Procédé et appareil de commande de vol d'aéronef, et aéronef
CN106981073A (zh) * 2017-03-31 2017-07-25 中南大学 一种基于无人机的地面运动目标实时跟踪方法及系统
CN109493371A (zh) * 2018-11-29 2019-03-19 中国计量大学 一种基于视觉的四旋翼无人机行人跟踪方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230259145A1 (en) * 2022-02-15 2023-08-17 Skydio, Inc. Enhanced Unmanned Aerial Vehicle Flight With Situational Awareness For Moving Vessels
CN114924585A (zh) * 2022-05-19 2022-08-19 广东工业大学 基于视觉的旋翼无人机在崎岖地表的安全降落方法及系统
CN116468351A (zh) * 2023-06-16 2023-07-21 深圳市磅旗科技智能发展有限公司 一种基于大数据的智慧物流管理方法
CN116468351B (zh) * 2023-06-16 2023-11-07 深圳市磅旗科技智能发展有限公司 一种基于大数据的智慧物流管理方法、系统以及存储介质

Also Published As

Publication number Publication date
CN113906360A (zh) 2022-01-07

Similar Documents

Publication Publication Date Title
WO2022027596A1 (fr) Procédé et dispositif de commande pour une plate-forme mobile, et support de stockage lisible par ordinateur
US11797009B2 (en) Unmanned aerial image capture platform
US11242144B2 (en) Aerial vehicle smart landing
US20240062663A1 (en) User Interaction With An Autonomous Unmanned Aerial Vehicle
EP3459238B1 (fr) Système et procédé de capture d'image et d'emplacement tenant compte des besoins
CN103822635B (zh) 基于视觉信息的无人机飞行中空间位置实时计算方法
CN113038016B (zh) 无人机图像采集方法及无人机
CN108476288A (zh) 拍摄控制方法及装置
CN109923589A (zh) 构建和更新高程地图
US20210112194A1 (en) Method and device for taking group photo
CN205453893U (zh) 无人机
US11912407B1 (en) Unmanned vehicle morphing
US20220027038A1 (en) Interactive virtual interface
WO2021047502A1 (fr) Procédé et appareil d'estimation d'état cible et véhicule aérien sans pilote
CN108981706B (zh) 无人机航拍路径生成方法、装置、计算机设备和存储介质
US20200064133A1 (en) Information processing device, aerial photography route generation method, aerial photography route generation system, program, and storage medium
CN106973221A (zh) 基于美学评价的无人机摄像方法和系统
CN108235815A (zh) 摄像控制装置、摄像装置、摄像系统、移动体、摄像控制方法及程序
CN111444786A (zh) 基于无人机群的人群疏散方法、装置、系统及存储介质
WO2021056139A1 (fr) Procédé et dispositif d'acquisition de position d'atterrissage, véhicule aérien sans pilote, système et support de stockage
CA3069813C (fr) Capture, connexion et utilisation de donnees d'interieur de batiment a partir de dispositifs mobiles
WO2020225979A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, programme et système de traitement d'informations
CN110262567A (zh) 一种路径中继点空间生成方法、装置和无人机
CN112154389A (zh) 终端设备及其数据处理方法、无人机及其控制方法
WO2022205210A1 (fr) Procédé et appareil de photographie, support de stockage lisible par ordinateur, et dispositif terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20948162

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20948162

Country of ref document: EP

Kind code of ref document: A1