WO2023173330A1 - Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium - Google Patents

Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium

Info

Publication number
WO2023173330A1
WO2023173330A1 · PCT/CN2022/081230 · CN2022081230W
Authority
WO
WIPO (PCT)
Prior art keywords
drone
target
detected object
description information
safe speed
Prior art date
Application number
PCT/CN2022/081230
Other languages
French (fr)
Chinese (zh)
Inventor
颜江
李罗川
高成强
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2022/081230
Publication of WO2023173330A1



Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10: Simultaneous control of position or course in three dimensions

Definitions

  • the present application relates to the technical field of drones, and in particular to a flight control method and device for a drone, a drone and a storage medium.
  • This application provides a UAV flight control method, device, UAV and storage medium, which can improve control efficiency and reduce labor costs to a certain extent.
  • embodiments of the present application provide a flight control method for a UAV, which method includes:
  • the drone is controlled to change its flight direction to bypass the detected object.
  • embodiments of the present application provide a flight control device for a drone, the device including a memory and a processor;
  • the memory is used to store program code
  • the processor calls the program code, and when the program code is executed, is used to perform the following operations:
  • the drone is controlled to change its flight direction to bypass the detected object.
  • embodiments of the present application provide a drone, which is used to implement the above method.
  • embodiments of the present application provide a computer-readable storage medium, including instructions that, when run on a computer, cause the computer to perform the above method.
  • embodiments of the present application provide a computer program product containing instructions, which when the instructions are run on a computer, cause the computer to perform the above method.
  • the description information of the target object in the space is obtained, and the target flight direction of the UAV is obtained. Based on the sensor mounted on the drone, the description information of the detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the target flight direction until it approaches the detected object. If the description information of the detected object does not match the description information of the target object, the drone is controlled to change its flight direction to bypass the detected object.
  • control efficiency can be improved to a certain extent and labor costs can be reduced.
  • it can avoid collisions between the drone and the detected object to ensure the safety of the drone's flight, while ensuring that the drone can get close to the target object in space, thus taking into account flight safety and scene requirements.
  • the manual control of drone flight is limited by personal experience, and sometimes it is impossible to reasonably control the movement of the drone.
  • the drone is automatically controlled to avoid the detected object or approach the detected object based on the detection results. To a certain extent, the movement of the drone can be more reasonably controlled for the detected object in space.
  • Figure 1 is a step flow chart of a UAV flight control method provided by an embodiment of the present application
  • Figure 2 is a schematic diagram of speed change provided by the embodiment of the present application.
  • Figure 3 is a schematic diagram of a query provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of a function curve provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of a collision query channel provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a preset range provided by an embodiment of the present application.
  • Figure 7 is a system flow diagram involved in a flight control method provided by an embodiment of the present application.
  • Figure 8 is a block diagram of a flight control device for a drone provided by an embodiment of the present application.
  • Figure 9 is a block diagram of a computing processing device provided by an embodiment of the present application.
  • Figure 10 is a block diagram of a portable or fixed storage unit provided by an embodiment of the present application.
  • an exemplary application scenario involved in the embodiment of the present application is described.
  • the drone needs to inspect specific objects in space to perform specified operations.
  • the drone needs to be close to specific objects in space to complete specified operations.
  • the drone needs to avoid objects in the space. This leads to the need to take into account safety in flight control while ensuring that the drone can get close to designated objects.
  • embodiments of the present application provide a flight control method for a drone.
  • Figure 1 is a step flow chart of a UAV flight control method provided by an embodiment of the present application. As shown in Figure 1, the method may include:
  • Step 101 Obtain description information of the target object in the space.
  • Step 102 Obtain the target flight direction of the drone.
  • the space may refer to the space where the drone is located, the target object may be specified in advance, and the target object may be a designated object in the above application scenario.
  • the description information of the target object may be information that can be used to indicate the target object, such as image information, position distribution information, etc. of the target object. By obtaining the description information of the target object, the drone can accurately learn the objects that need to be approached during this flight.
  • the target flight direction may refer to the current flight direction of the drone.
  • the current flight direction of the drone can be detected based on the direction sensor mounted on the drone, thereby obtaining the target flight direction.
  • other methods can also be used to obtain the target flight direction, and this application does not limit this.
  • Step 103 Based on the sensor mounted on the drone, detect the description information of the detected object in the space indicated by the target flight direction.
  • the space indicated by the target flight direction may be a space within a preset range in the target flight direction.
  • the preset range can be set according to actual needs.
  • the description information of the detected object may be information that can be used to indicate the detected object, such as image information, location information, etc. of the detected object.
  • the detected object may be an object falling into the space indicated by the target flight direction.
  • the space indicated by the target flight direction can be monitored, based on object-detecting sensors mounted on the drone such as infrared sensors and ultrasonic sensors, to detect whether objects appear in that space.
  • if an object appears in that space, it can be determined as a detected object.
  • the image of the detected object can be collected based on the mounted visual sensor as description information of the detected object.
  • the position information of the detected object can be collected as description information of the detected object.
  • Step 104 If the description information of the detected object matches the description information of the target object, control the drone to decelerate along the target flight direction to approach the detected object.
  • Step 105 If the description information of the detected object does not match the description information of the target object, control the drone to change its flight direction to bypass the detected object.
  • if the description information of the detected object matches the description information of the target object, it can be understood that the currently detected object is the target object. If the description information of the detected object does not match the description information of the target object, it can be understood that the currently detected object is not the target object.
  • taking the description information as image information as an example, after the image information of the target object is obtained, it can be compared with the collected image information of the detected object using a convolutional neural network (CNN) carried by the drone, so as to determine whether the two match.
  • the position information of the detected object can be further determined and associated with the object position distribution information in space, so that the drone can locate the detected object based on the associated object position distribution information, thereby approaching or bypassing the detected object.
  • the object position distribution information can be a map of the current space, and the position information of the detected object can be the position of the detected object in the three-dimensional navigation coordinate system.
  • if the currently detected object is the target object, the drone can be controlled to decelerate along the target flight direction until it is close to the detected object, so as to facilitate operating on the target. At the same time, decelerating until close to the detected object can avoid excessive speed to a certain extent, thereby ensuring the stability of the drone during the approach. If the currently detected object is not the target, the drone can be directly controlled to change its flight direction to bypass the detected object, thus ensuring flight safety while avoiding unnecessary deceleration and approach operations.
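  • As a rough illustration of the decision just described, the following Python sketch maps one detection cycle to a control action. The string-equality match is a hypothetical stand-in for the actual description-information matching (for example, the CNN-based image comparison), and all names are illustrative:

```python
def flight_action(detected_desc, target_desc):
    """One decision cycle of the flight control method (illustrative).

    If the detected object's description matches the target object's
    description, the drone decelerates along the target flight direction
    to approach it; otherwise it changes direction to bypass the object.
    """
    if detected_desc == target_desc:  # stand-in for the real matching step
        return "decelerate_and_approach"
    return "bypass"
```

  • In the actual method the comparison would be performed by a matching model rather than string equality; only the branching structure is taken from the description above.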
  • the flight control method provided by the embodiment of the present application can be applied to a drone, that is, the drone automatically controls itself to approach or bypass the detected object.
  • it can also be applied to the control device of a drone, that is, the control device automatically controls the controlled drone to approach or bypass the detected object.
  • the embodiments of this application do not limit this.
  • the flight control method of a UAV obtains the description information of the target object in space and obtains the target flight direction of the UAV. Based on the sensor mounted on the drone, the description information of the detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the target flight direction until it approaches the detected object. If the description information of the detected object does not match the description information of the target object, the drone is controlled to change its flight direction to bypass the detected object.
  • control efficiency can be improved to a certain extent and labor costs can be reduced.
  • it can avoid collisions between the drone and the detected object to ensure the safety of the drone's flight, while ensuring that the drone can get close to the target object in space, thus taking into account flight safety and scene requirements.
  • the manual control of drone flight is limited by personal experience, and sometimes it is impossible to reasonably control the movement of the drone.
  • the drone is automatically controlled to avoid the detected object or approach the detected object based on the detection results. To a certain extent, the movement of the drone can be more reasonably controlled for the detected object in space.
  • the following operations may be further performed.
  • the designated stopping distance may be pre-specified according to actual needs, and controlling the drone to stop flying may mean controlling the drone's flight speed to reduce to 0 at the designated stopping distance from the detected object.
  • by controlling the drone to decelerate to the designated stopping distance from the detected object, it is possible to ensure that the drone can operate on the target object while preventing the drone from getting too close to it, thereby ensuring the safety of drone operations.
  • after the drone stops flying, it can be controlled to perform a target operation on the detected object, thereby meeting the operation requirements for the target object.
  • the target operation can be specified in advance according to actual needs. For example, performing the target operation may be observing the target object to determine whether it is in a preset state, and transmitting the observation results back to the user equipment of the monitoring personnel. For example, in a wind power generation scenario, it is possible to observe whether the wind power generation equipment is operating normally and send the observation results back to the monitoring personnel's equipment, so that when the equipment is not operating normally, the monitoring personnel can perform maintenance on it in a timely manner.
  • the target operation can also be to collect images of the target object, spray maintenance fluid on the target object, and so on.
  • the designated stopping distance can be set based on one or more of the following factors: the density of objects in the space; the description information of the target object; and the operation type of the drone.
  • the specified stopping distance can be preset before the UAV performs this mission, or set according to these factors during the flight.
  • the designated stopping distance is set based on factors such as the density of objects in the space, the description information of the target object, and the operation type of the drone, which can make the set distance more suitable for this flight scenario to a certain extent, thereby ensuring that the designated stopping distance is reasonable.
  • the user can be provided with a selectable range, and the value selected by the user from the selectable range is set to the specified stopping distance.
  • the end value of the selectable range can be automatically adjusted based on one or more factors including the density of objects in the space, the description information of the target object, and the type of operation of the drone. Specifically, the greater the density of objects (that is, the denser the objects in the space), the more complex the target shape represented by the description information, and the more difficult the drone's operation type, the larger the upper end value of the selectable range can be. The designated stopping distance set by the user can then be larger, thereby ensuring the safety of the UAV when performing target operations on objects detected in the space.
  • the lower end of the selectable range can be no less than the preset safe distance. In this way, the problem of the designated stopping distance being set too small and the UAV getting too close to the target can be avoided.
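  • A minimal sketch of how such a selectable range could be derived is shown below. The normalised factor inputs, the linear scaling, and all constants (a 2 m safe distance and a 5 m base upper end) are assumptions for illustration, not values from the application:

```python
def stopping_distance_range(obstacle_density, shape_complexity, task_difficulty,
                            min_safe_dist=2.0, base_max=5.0):
    """Selectable range (metres) for the designated stopping distance.

    The upper end grows with object density, target-shape complexity and
    operation difficulty (each assumed normalised to [0, 1]); the lower end
    never drops below the preset safe distance.
    """
    upper = base_max * (1.0 + obstacle_density + shape_complexity + task_difficulty)
    return (min_safe_dist, upper)
```

  • The user would then pick a value from the returned interval; the monotone dependence on the three factors mirrors the qualitative rule stated above.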
  • the above-mentioned operation of controlling the drone to decelerate along the target flight direction to approach the detected object may specifically include the following steps:
  • Step S21 Determine the target safe speed based on local map information and/or single-frame depth image; the target safe speed is positively related to the current distance between the drone and the detected object.
  • the local map information can be used to characterize the local map of the surrounding environment of the drone
  • the depth image can be a depth image of the current surrounding environment collected by the drone.
  • the depth image can be obtained from the image collector of the drone to the surrounding environment.
  • Step S22 Control the drone to move along the target flight direction and decelerate at the target safe speed until it approaches the detected object.
  • the target safe speed is positively related to the current distance between the UAV and the detected object. That is, the smaller the current distance between the UAV and the detected object, the smaller the determined target safe speed. As the drone gets closer to the detected object, the target safe speed is continuously updated and becomes smaller and smaller. In this way, while the UAV is controlled to decelerate at the target safe speed until it approaches the detected object, it gradually decelerates, ensuring that the speed changes smoothly during the approach and guaranteeing the flight safety of the UAV.
  • the above-mentioned operation of determining the target safe speed based on local map information and/or single-frame depth images may specifically include:
  • Step S31 Calculate the first safe speed of the drone based on the local map information.
  • the local map information may be generated based on multi-frame depth images of the current surrounding environment collected by the drone. In this way, the local map information can more accurately represent the local map of the drone's current surrounding environment, and to a certain extent the determined safe speed can be more suitable for the current surrounding environment, thereby improving the speed planning effect.
  • Step S32 Calculate the second safe speed of the drone based on the single-frame depth image.
  • Step S33 Determine the target safety speed based on the first safety speed and the second safety speed.
  • the target safe speed may be positively correlated with the first safe speed and the second safe speed.
  • the average value of the first safe speed and the second safe speed may be used as the target safe speed.
  • the minimum value of the first safety speed and the second safety speed may be determined as the target safety speed. In this way, by selecting the minimum value as the target safe speed, the problem of excessive target safe speed can be avoided as much as possible while setting a reasonable safe speed.
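  • The minimum-value combination of the two planned speeds can be sketched as follows (a toy Python version; the handling of a missing speed, for the case where only one planner is active, is an assumption):

```python
def target_safe_speed(map_safe_spd=None, depth_safe_spd=None):
    """Combine the first (local-map) and second (depth-image) safe speeds.

    Taking the minimum of the available speeds avoids an over-optimistic
    target safe speed; if only one planner ran, its speed is used directly.
    """
    speeds = [s for s in (map_safe_spd, depth_safe_spd) if s is not None]
    return min(speeds)
```

  • Averaging the two speeds, mentioned above as an alternative, would be a one-line change; the minimum is the more conservative choice.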
  • speed planning is performed based on both local map information and a single-frame depth image: the first safe speed is determined from the local map information, the second safe speed from the single-frame depth image, and the target safe speed from the first and second safe speeds.
  • the local map has strong anti-noise ability and can "remember" observed areas, giving it a certain degree of blind-spot perception, while the single-frame depth image can represent distance and makes up for the local map's limited range. Speed planning based on both therefore makes the determined target safe speed more reasonable to a certain extent. It should be noted that speed planning can also be performed based only on local map information, or only on a single-frame depth image, which reduces the computation required for speed planning to a certain extent.
  • the above-mentioned local map information may be generated based on the multi-frame depth images when the distance between the drone and the detected object reaches a preset mapping distance, and the preset mapping distance is positively related to the specified stopping distance of the drone.
  • the current distance between the UAV and the detected object can be continuously detected and compared with the preset mapping distance. If the current distance is equal to the preset mapping distance, local map information can be generated based on the collected multi-frame depth images; for example, the collected multi-frame depth images can be fused to generate the local map information.
  • the preset mapping distance can be greater than the specified stopping distance.
  • the set preset mapping distance can be positively related to the specified stopping distance. This ensures that there is sufficient distance between the position where the drone starts mapping and the designated stopping position, and avoids the poor speed-planning results that occur when speed planning based on the established local map information only begins once the drone is already relatively close to the detected object.
  • local map information can be used for short-range fine-grained speed planning, and a single frame depth image can be used for long-range coarse-grained speed planning.
  • the default value can be used as the first safe speed.
  • after the preset mapping distance is reached, the first safe speed is determined based on the local map information. Because the first safe speed determined from local map information is often smaller than the second safe speed, once the local map information is established the drone is controlled to fly at the smaller first safe speed as the target safe speed. In other words, after reaching the preset mapping distance, the drone's speed will drop suddenly.
  • Figure 2 is a schematic diagram of speed changes provided by an embodiment of the present application. As shown in Figure 2, the horizontal axis represents the distance to the detected object, and the vertical axis represents the speed value. Line 1 represents the first safe speed, line 2 represents the second safe speed, and line 3 represents the flight direction. It can be seen that when the preset mapping distance is not reached, the first safe speed is the default value and is greater than the second safe speed.
  • the first safe speed determined based on the local map information is smaller than the second safe speed. It should be noted that after the drone performs the target operation, it can continue flying and move away from the currently detected object. As the distance grows, the speed can become higher and higher, and because the distance between the drone and the detected object no longer meets the preset mapping distance once it has moved away, the first safe speed can return to the default value.
  • the multi-frame depth images used to establish local map information are obtained by sensors mounted on the drone in different directions.
  • multiple frames of depth images acquired in different directions are used to establish local map information, so that the established local map information can represent the map information in each direction, so that the drone can omnidirectionally perceive and detect objects based on the local map information.
  • for example, a visual sensor may be provided in the first direction of the drone.
  • the depth image of the current surrounding environment in the first direction may be collected based on the visual sensor, and the drone may be controlled to turn toward the second direction to obtain a depth image in a direction where no visual sensor is provided.
  • the second direction is different from the first direction, so that a drone without omnidirectional sensing can also obtain multi-frame depth images in different directions, thereby sensing the detected object omnidirectionally.
  • the drone may be equipped with visual sensors only in the front and rear directions. After taking off, the drone can be controlled to rotate by a certain yaw angle (Yaw) around its own center to collect depth images in the second direction.
  • the attitude of the drone will change during flight, and the drone can be controlled to collect depth images based on the visual sensor after the attitude changes. In this way, depth images in different orientations are obtained through continuous accumulation.
  • the above-mentioned operation of calculating the first safe speed of the drone based on the local map information may specifically include:
  • Step S41 Determine the current distance between the drone and the detected object based on the local map information.
  • trajectory generation can be performed based on the current flight direction and local map information to generate a query trajectory.
  • a collision query is performed based on the query trajectory to determine the current distance between the drone and the detected object.
  • the current flight direction is used to provide prior information: the query is performed in whichever direction the drone flies, and other irrelevant directions are not involved in the calculation. In this way, computing resource consumption can be further reduced.
  • the method of generating query trajectories can be selected according to actual needs, and this application does not limit this. For example, it can be generated based on a differential flat model or a method based on convex optimization.
  • the current location of the drone's center point can be used as the starting query point, and the next trajectory sampling step can be calculated based on the distance, represented in the local map, between the query point and the nearest object, together with the current collision query radius.
  • the local map used in the collision query can be any environment map available for query, such as an occupancy map or an ESDF (Euclidean signed distance field) map.
  • the next query point is determined based on the trajectory sampling step, and the next trajectory sampling step is calculated based on the distance between that query point and the nearest object and the current collision query radius, until the query reaches the designated stopping distance of the drone from the detected object. Further, the current distance between the drone and the detected object is determined based on the accumulated trajectory sampling steps; for example, the sum of the trajectory sampling steps is determined as the current distance.
  • FIG. 3 is a query schematic diagram provided by an embodiment of the present application. As shown in Figure 3, if a uniform step size were used, or the ESDF value were directly used as the next trajectory sampling step, the next query point would be point 2, when in fact it should be point 1. Therefore, non-uniform steps are used in the embodiments of the present application, and calculating the steps in the above manner makes them more reasonable, thus avoiding the problem of the queried distance being longer than the actual distance.
  • checked_dist can be the distance to point 4 in Figure 3, and (r - checked_esdf) represents the distance from point 3 to point 4. By the correction real_dist = checked_dist - (r - checked_esdf), real_dist can be made more reasonable, preventing the drone from exceeding the target position.
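  • The non-uniform stepping described above resembles sphere marching through a distance field. The sketch below is a hedged reconstruction: `esdf` is any callable returning the distance from a point to the nearest object, and the step rule, end-of-march correction, and parameter names are illustrative rather than taken verbatim from the application:

```python
def collision_query_distance(esdf, start, direction, radius, max_dist=50.0, eps=1e-3):
    """March along the flight direction through a distance-field map.

    Each step is the clearance at the current query point minus the collision
    radius (never below a small epsilon), so the march cannot skip past an
    obstacle the way a uniform step could. The accumulated distance is then
    corrected by (radius - final_clearance) so it does not overshoot the
    object surface.
    """
    travelled = 0.0
    x, y, z = start
    dx, dy, dz = direction  # assumed to be a unit vector
    while travelled < max_dist:
        clearance = esdf((x, y, z))
        if clearance <= radius:  # query sphere touches an object: correct and stop
            return max(travelled - (radius - clearance), 0.0)
        step = max(clearance - radius, eps)  # non-uniform, collision-safe step
        x, y, z = x + dx * step, y + dy * step, z + dz * step
        travelled += step
    return max_dist
```

  • For a flat obstacle 10 m ahead and a 0.5 m query radius, the march stops 9.5 m out, i.e. exactly one radius short of the surface.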
  • Step S42 Calculate the first safe speed of the UAV based on the current distance and the designated stopping distance of the UAV; the first safe speed is positively related to the current distance difference, which is the difference between the current distance and the designated stopping distance.
  • the difference between the current distance and the designated stopping distance can be used as the input of the preset function, and the output of the preset function can be used as the first safe speed.
  • map_safe_spd represents the first safe speed
  • safe_dist represents the specified stopping distance
  • map_safe_spd = f(real_dist - safe_dist), where f is the preset function.
  • the current distance between the drone and the detected object is determined based on local map information.
  • based on the current distance and the specified stopping distance of the drone, the first safe speed of the drone is calculated; the first safe speed is positively related to the current distance difference, and the current distance difference is the difference between the current distance and the specified stopping distance.
  • local map information has stronger anti-noise ability, so the rationality of the first safe speed can be ensured to a certain extent.
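  • A toy version of map_safe_spd = f(real_dist - safe_dist) with a linear, capped f is shown below. The particular gain and cap are assumptions; the application only requires that the speed be positively related to the distance difference:

```python
def first_safe_speed(real_dist, safe_dist, gain=0.8, v_max=5.0):
    """map_safe_spd = f(real_dist - safe_dist), sketched with a linear f.

    The speed is zero at the designated stopping distance, grows with the
    remaining gap, and is capped at v_max. gain and v_max are illustrative.
    """
    gap = max(real_dist - safe_dist, 0.0)
    return min(gain * gap, v_max)
```

  • Any monotonically increasing f with f(0) = 0 would satisfy the stated positive relation; the linear form is just the simplest choice.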
  • the above-mentioned operation of calculating the second safe speed of the drone based on the single-frame depth image may specifically include:
  • Step S51 Based on the number of pixels falling into each depth interval in the single-frame depth image and the total number of pixels in the single-frame depth image, determine the proportion of the number of pixels corresponding to each depth interval.
  • the single-frame depth image can be continuously collected and updated by the drone in the flight direction as it approaches the target.
  • the depth histogram can be calculated based on the single-frame depth image by determining the depth interval to which the depth value represented by each pixel belongs, thereby obtaining the depth interval into which each pixel falls.
  • single-frame depth map speed planning can also be performed by directly calculating the closest point cloud, and the embodiments of the present application do not limit this.
  • the ratio of the number of pixels falling within the depth interval to the total number of pixels in a single frame depth map can be calculated to obtain the proportion of the number of pixels corresponding to the depth interval.
  • Step S52 Determine the weight value corresponding to each depth interval according to the proportion of the number of pixels corresponding to each depth interval; the weight value corresponding to the depth interval is positively correlated with the proportion of the number of pixels.
  • Step S53 Determine the second safe speed based on the speed value and weight value corresponding to each depth interval; the speed value corresponding to the depth interval is positively correlated with the depth value represented by the depth interval.
  • when the drone is closer to the target, the proportion of pixels falling into the intervals with smaller depth values is greater. Therefore, by setting the speed value corresponding to a depth interval to be positively related to the depth value it represents, and the weight value corresponding to a depth interval to be positively related to its proportion of the number of pixels, the lower speed values receive larger weight values as the drone gets closer to the target, which to a certain extent makes the second safe speed gradually decrease as the drone approaches the target.
  • the speed value corresponding to each depth interval may be preset.
  • the velocity values corresponding to the depth range represented by each depth interval can be as shown in the following table:
  • p0, p1, p2, p3, p4, p5, p6, p7, p8 and p9 represent the proportions of the number of pixels corresponding to the respective depth intervals. Furthermore, the pixel-number proportion can be used as the original weight of a depth interval, and the original weight can be adjusted to obtain the weight value corresponding to each depth interval.
  • depth_safe_spd = Pnew · V, where depth_safe_spd represents the second safe speed, V represents the speed-value vector corresponding to the second row in the above table, and Pnew represents the adjusted weight vector corresponding to the third row in the above table.
  • that is, the sum of the products of the adjusted weight value and the speed value of each depth interval can be calculated to obtain the second safe speed.
  • the sum of the adjusted weight values corresponding to the depth intervals may be equal to 1, and the weight value of each depth interval may determine the proportion of that interval's speed value in the final second safe speed. Therefore, calculating the sum of the products of the adjusted weight values and the speed values of the depth intervals can be equivalent to performing a weighted average with the adjusted weights.
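The weighted-sum calculation depth_safe_spd = Pnew · V can be sketched as follows. The ten speed values are hypothetical (the patent's table is not reproduced in this text), with intervals 0 and 1 taken as the far, high-speed intervals:

```python
# Hypothetical speed values (m/s) for ten depth intervals; indices 0/1 are
# the "high-speed" (far) intervals, 2-9 the "low-speed" (near) ones.
V = [10.0, 8.0, 6.0, 5.0, 4.0, 3.0, 2.0, 1.5, 1.0, 0.5]

def second_safe_speed(p_new, speed_values):
    """depth_safe_spd = Pnew . V: the sum of products of the adjusted weight
    and the speed value of each depth interval (a weighted average when the
    adjusted weights sum to 1)."""
    return sum(w * v for w, v in zip(p_new, speed_values))

p_far = [0.5, 0.5] + [0.0] * 8    # pixels mostly in far (deep) intervals
p_near = [0.0] * 8 + [0.5, 0.5]   # pixels mostly in near (shallow) intervals
print(second_safe_speed(p_far, V))   # 9.0  -> fly fast while still far away
print(second_safe_speed(p_near, V))  # 0.75 -> slow down when close
```

The two weight vectors illustrate the intended behaviour: as the pixel mass shifts toward the shallow intervals, the weighted average, and hence the second safe speed, drops.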
  • Pnew can include ph,new and pl,new, where the value of h can be 0 or 1 and ph represents the weight corresponding to the high-speed intervals, while the value of l can range from 2 to 9 and pl represents the weight corresponding to the low-speed intervals.
  • the adjusted weight value can be calculated based on the following formula:
  • ph,new and pl,new can be calculated based on the following formulas:
  • FIG. 4 is a schematic diagram of a function curve provided by an embodiment of the present application.
  • the smaller the original weight, the larger the compression rate can be, that is, the smaller the adjusted weight.
  • ph can be compressed according to the function shown in FIG. 4, so that the adjusted ph is smaller and the adjusted pl is correspondingly larger. That is, the smaller the original weight of an interval, the lower its credibility, and the greater the credibility assigned to the other intervals, thereby biasing the speed mapping accordingly.
  • the mapping function f(x) corresponding to f(ph) can be obtained through fourth-order curve fitting, and f(x) can be written as A·X, where A can be understood as the row vector of fitted coefficients and X as the vector of powers of x.
  • the values of the coefficients a1 through a5 can be set according to actual needs. For example, a1 through a5 can be -47.63, 44.2, -9.96, 0.9055 and -0.01439, respectively.
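With the example coefficients above (read here as a1 through a5 of a quartic polynomial, an interpretation based on the five values quoted), the compression mapping can be evaluated directly:

```python
def compress_weight(x, coeffs=(-47.63, 44.2, -9.96, 0.9055, -0.01439)):
    """Quartic fit f(x) = a1*x**4 + a2*x**3 + a3*x**2 + a4*x + a5, using the
    example coefficients quoted in the text (interpreted as a1..a5)."""
    a1, a2, a3, a4, a5 = coeffs
    return a1 * x**4 + a2 * x**3 + a3 * x**2 + a4 * x + a5

# Smaller original weights are compressed more strongly:
for x in (0.1, 0.3, 0.5):
    print(x, "->", round(compress_weight(x), 3))
# 0.1 -> 0.016, 0.3 -> 0.168, 0.5 -> 0.496
```

Note how a small original weight (0.1) maps to a much smaller adjusted value, while a weight near 0.5 is left almost unchanged, consistent with the "smaller weight, larger compression" behaviour described above.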
  • the pixel-number proportion of each depth interval is counted from a single-frame depth image, and the weight value corresponding to each depth interval is determined based on that proportion.
  • the second safe speed can then be determined based on the weight value and speed value corresponding to each depth interval. In this way, since no coordinate-conversion operations on the point cloud are required, speed planning can be implemented from the depth information alone, which saves computing resources to a certain extent.
  • the target safe speed can be decomposed.
  • the collision query can be performed by separating vertical and horizontal channels.
  • FIG. 5 is a channel schematic diagram of a collision query provided by an embodiment of the present application. As shown in FIG. 5 , collision query can be performed on the horizontal channel 01 and the vertical channel 02 respectively.
  • the second safe speed determined from the single-frame depth map can also be decomposed into a second safe speed in the vertical direction and a second safe speed in the horizontal direction, from which the target safe speed in the vertical direction and the target safe speed in the horizontal direction can be obtained respectively. The target safe speed in the horizontal direction can then be decomposed according to the direction of movement, finally yielding the target safe speed in the x, y and z dimensions.
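The decomposition into vertical and horizontal channels and then into x/y/z components can be sketched as follows, assuming a simple planar heading angle for the direction of movement:

```python
import math

def decompose_target_speed(v_horizontal, v_vertical, heading_rad):
    """Split the horizontal target safe speed along the motion heading into
    x/y components; the vertical channel contributes the z component."""
    vx = v_horizontal * math.cos(heading_rad)
    vy = v_horizontal * math.sin(heading_rad)
    return vx, vy, v_vertical

vx, vy, vz = decompose_target_speed(2.0, 0.5, math.radians(30.0))
print(round(vx, 3), round(vy, 3), vz)  # 1.732 1.0 0.5
```

This keeps the vertical and horizontal collision queries separate, with the final commanded velocity assembled per axis only at the end.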
  • the detected objects may include objects located in a space within a preset range indicated by the target flight direction, and the space within the preset range moves with the movement of the UAV in space.
  • the space within the preset range indicated by the target flight direction is the space within the preset range in the target flight direction.
  • the space in the preset range can be adjacent to the nose of the drone, and as the drone continues to fly forward in the target flight direction, the space in the preset range also changes continuously.
  • if an object falls into the space within the preset range, the object can be determined to be the detected object. In this way, objects that interfere with the drone's flight can be accurately identified as detected objects, so that in subsequent operations intelligent behavioral decisions can be made based on whether the detected object is the target object, that is, whether to approach the detected object or to detour around it.
  • the size of the space in the preset range is set with horizontal and vertical boundary sizes based on the shape of the drone.
  • the horizontal and vertical boundary sizes of the preset-range space correspond to the shape of the drone, so that the space can better match the drone's shape characteristics. This can ensure, to a certain extent, that objects interfering with the drone's flight are detected, thereby ensuring the drone's safety and avoiding missed targets.
  • the boundary size of the space in the preset range in the corresponding direction can be set according to the maximum size of the UAV in each direction, so that the space in the preset range can cover the UAV.
  • the boundary size of the space in the preset range in each direction can be set to be larger than the maximum size of the drone in each direction, so as to avoid missing objects that may interfere with the flight of the drone as much as possible.
  • the larger the maximum size of the UAV in a given direction, the larger the boundary size of the preset-range space in the corresponding direction can be.
  • the boundary size of the space in the preset range in the vertical direction is smaller than the boundary size in the horizontal direction.
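A minimal membership test for the preset-range space, assuming a box-shaped query region with hypothetical half-sizes in which the vertical boundary is smaller than the horizontal one, as described above:

```python
def in_preset_range(offset, horizontal_half, vertical_half):
    """Check whether an object falls inside the preset-range space in front of
    the drone. The half-sizes are hypothetical illustrative values, with the
    vertical boundary deliberately smaller than the horizontal one."""
    dy, dz = offset  # lateral and vertical offsets from the flight axis (m)
    return abs(dy) <= horizontal_half and abs(dz) <= vertical_half

# Horizontal half-size 1.0 m, vertical half-size 0.4 m:
print(in_preset_range((0.8, 0.2), 1.0, 0.4))  # True  -> candidate detected object
print(in_preset_range((0.8, 0.6), 1.0, 0.4))  # False -> e.g. the ground below
```

The second call illustrates the near-ground behaviour discussed below: a surface 0.6 m beneath the flight axis lies outside the 0.4 m vertical tolerance and is therefore not treated as a detected object.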
  • the drone may be a multi-rotor drone.
  • the size of the blade plane of this type of UAV is larger, that is, the size of the UAV in the horizontal direction is larger.
  • the vertical size of such UAVs is often smaller than the plane size of the blades.
  • in this way, objects that may interfere with the drone's flight can be detected within the preset range of space. At the same time, objects that have little impact on flight safety can, to a certain extent, be excluded from the preset range, further reducing the computational overhead of the UAV's perception of objects in space.
  • the boundary size in the vertical direction of the space of the preset range is set to be smaller than the boundary size in the horizontal direction.
  • this boundary size can avoid the problem of the ground being identified as a detected object when the drone is flying at a low height close to the ground, which would otherwise affect the drone's flight efficiency.
  • the target object in this operation task is not the ground
  • the distance between the drone and the ground is less than a
  • the drone will not respond normally to the user's stick operation.
  • the ground does not actually pose a threat to the drone.
  • since the vertical boundary size of the preset-range space is smaller than the horizontal boundary size, as long as the distance between the drone and the ground is larger than the vertical boundary size, the ground will not be recognized as an obstacle. Even when the distance between the drone and the ground is less than a, the drone can still respond normally to the user's stick operation, thus adapting to stick-controlled flight scenarios near the ground.
  • the boundary size in the vertical direction can also be set to be equal to the boundary size in the horizontal direction.
  • if an object in the surrounding environment is a specified object, the object is determined to be the detected object only when it falls into the preset-range space and the proportion of the object within the preset-range space is greater than a preset ratio.
  • the specified object may include the ground, and the preset ratio may be set according to actual needs. In this way, the ground is not directly identified as an obstacle that would interfere with the drone's flight, and a higher tolerance is set for the ground.
  • the processing of the ground can be made more reasonable, thereby ensuring safety while improving the response sensitivity of the UAV in near-ground scenarios.
  • the overall configuration of the drone is closer to a disc shape. Therefore, rather than making the preset-range space spherical, the query range in the horizontal direction of the preset-range space can be made closer to a disc to avoid being overly conservative.
  • FIG. 6 is a schematic diagram of a preset range provided by an embodiment of the present application. As shown in FIG. 6, a plane rectangular coordinate system can be constructed in the normal plane tangential to the collision-point trajectory, where the vertical axis of the coordinate system can be parallel to the z-axis of the navigation coordinate system and the horizontal axis can be parallel to the horizontal plane of the navigation coordinate system. The area between line X and line Y can be obtained by translating the horizontal axis up and down according to the upper and lower tolerances.
  • the object query process can proceed sequentially from left to right, as shown by the interlaced small circles in FIG. 6. If an object is found in the corridor area formed between line X and line Y, it is considered that a detected object exists; if no object is found in that corridor area, it is considered that there is no detected object.
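The left-to-right corridor query can be sketched as follows; the object names, positions, and query spacing are purely illustrative:

```python
def first_object_in_corridor(query_xs, objects, lower, upper, step=0.5):
    """Scan query positions left to right (like the interlaced circles in
    FIG. 6) and return the first object found in the corridor between the
    upper and lower tolerance lines. Geometry here is illustrative only."""
    for x in query_xs:
        for name, (ox, oz) in objects.items():
            if abs(ox - x) < step and lower <= oz <= upper:
                return name
    return None  # nothing inside the corridor -> no detected object

objects = {"bird": (2.0, 3.0), "pylon": (4.0, 0.1)}  # (horizontal, vertical) m
print(first_object_in_corridor(range(8), objects, lower=-0.5, upper=0.5))
# "pylon": the bird lies above the corridor's upper tolerance
```

Returning as soon as the first in-corridor object is found matches the sequential left-to-right query: later positions need not be examined once a detected object exists.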
  • the above target flight direction can be determined through the following steps:
  • Step S61 Receive a flight control instruction sent by the first control device of the drone; the flight control instruction is generated based on the user's control operation of the control stick of the first control device.
  • Step S62 Determine the flight direction determined in response to the flight control instruction as the target flight direction.
  • the first control device may be a remote controller of the drone, and the control operation may be a lever operation on a control rod of the first control device.
  • the first control device can generate a flight control instruction indicating direction A, and the drone can receive the flight control instruction and, in response, determine direction A indicated by the instruction as the target flight direction.
  • upon receiving a flight control instruction generated based on the user's control operation of the control stick of the first control device, if the detected object in the target flight direction is the target object, the drone can continue flying in the target flight direction indicated by the stick to approach the target.
  • if the detected object is not the target object, the drone does not follow the target flight direction indicated by the stick, but changes its flight direction to bypass the detected object. In this way, the drone can respond to flight control instructions while ensuring flight safety.
  • the above operation of obtaining the description information of the target object in the space may specifically include: receiving the description information of the target object sent by the second control device of the drone, where the description information of the target object is determined by the second control device based on the object selected by the user. In this way, the user can select the target object for the current mission as needed via the second control device, so that the drone's behavior decisions for detected objects better match the user's actual needs.
  • the second control device may be the same device as the first control device.
  • the second control device may also be a different device from the first control device.
  • the second control device may be an electronic device used by the user.
  • the second control device can display a scene view to the user, and the scene view can provide multiple candidate objects.
  • the candidate object selected by the touch operation is determined as the target object, and the description information of the selected candidate object is determined as the description information of the target object.
  • the scene view can be a 3D map of the established working area, an orthophoto, etc.
  • the description information may be extracted based on images of the selected candidate object collected in historical operations.
  • in this flight control method, the description information of the target object in the space can be obtained; the planned trajectory of the UAV can be obtained; and, based on the sensor mounted on the UAV, the description information of the detected object in the space touched by the planned trajectory can be detected. If the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the planned trajectory until it approaches the target object.
  • if the description information of the detected object does not match the description information of the target object, the planned trajectory is adjusted so that the UAV detours around the detected object along the adjusted planned trajectory.
  • the implementation of each step in the flight control method may refer to the implementation of the same or similar steps mentioned above.
  • the planned trajectory in the flight control method can be the trajectory that the UAV is currently flying to follow, and the planned trajectory can be the trajectory in the current flight direction.
  • the above planned trajectory can be a plan automatically planned by the drone based on its own perception of the surrounding environment. It can also be a trajectory planned in response to external control instructions.
  • the space touched by the planned trajectory may be a preset range of space touched by the planned trajectory.
  • the operation of obtaining the planned trajectory of the UAV may include: obtaining a flight direction instruction, and obtaining the planned trajectory of the UAV according to the flight direction instruction.
  • obtaining the flight direction instruction may be receiving the flight direction instruction sent by the control device.
  • it can also be an automatically generated flight direction instruction based on its own perception of the surrounding environment.
  • the flight trajectory indicated by the flight direction instruction can be determined as the planned trajectory of the UAV.
  • the flight control method of the UAV provided by the embodiments of the present application automatically detects whether the description information of the detected object in the space touched by the planned trajectory matches the description information of the target object, and automatically controls the drone, based on the detection result, to avoid or approach the detected object. This can improve control efficiency and reduce labor costs to a certain extent. At the same time, it can avoid collisions between the drone and the detected object to ensure flight safety, while ensuring that the drone can get close to the target object in space, thus taking into account both flight safety and scene requirements.
  • FIG. 7 is a system flow diagram involved in a flight control method provided by an embodiment of the present application.
  • target detection and tracking can be performed on a two-dimensional target (i.e., a detected object) to determine whether the detected object is the target object.
  • the obstacle avoidance decision-making module confirms the position information of the two-dimensional target in the three-dimensional navigation coordinate system based on the target tracking results and the position and attitude of the aircraft and the gimbal. When the UAV flies toward the two-dimensional target and the two-dimensional target is the target object, the speed planning module performs the next operation.
  • the speed planning module can determine the target safe speed and control the underlying flight control system of the drone based on the target safe speed to control the drone to move along the target flight direction and decelerate at the target safe speed until it approaches the detected object.
  • the determined target safe speed can also be combined with the environmental speed limit, so as to avoid the target safe speed exceeding the environmental speed limit.
  • the obstacle bypass module can perform the next operation to control the drone to bypass the detected object based on the underlying flight control system.
  • the drone will avoid obstacles based on detected objects.
  • the drone needs to be able to approach the inspection target for inspection instead of bypassing the inspection target.
  • the user only needs to select the inspection targets of the current inspection task.
  • the drone can then distinguish between inspection targets and non-inspection targets among the detected objects in the space, automatically switch its response mode for each detected object, and realize intelligent obstacle-avoidance behavior decisions.
  • the local map and single-frame depth image are used to implement smooth speed planning to park near the target more safely.
  • a detour is performed to avoid the detected object to ensure flight safety.
  • FIG. 8 is a block diagram of a UAV flight control device provided by an embodiment of the present application.
  • the device may include: a memory 301 and a processor 302.
  • the memory 301 is used to store program codes
  • the processor 302 calls the program code, and when the program code is executed, is used to perform the following operations:
  • the drone is controlled to change its flight direction to bypass the detected object.
  • the flight control device of the unmanned aerial vehicle obtains the description information of the target object in the space and obtains the target flight direction of the unmanned aerial vehicle. Based on the sensor mounted on the drone, the description information of the detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the target flight direction until it approaches the detected object. If the description information of the detected object does not match the description information of the target object, the drone is controlled to change its flight direction to bypass the detected object.
  • control efficiency can be improved to a certain extent and labor costs can be reduced.
  • it can avoid collisions between the drone and the detected object to ensure the safety of the drone's flight, while ensuring that the drone can get close to the target object in space, thus taking into account flight safety and scene requirements.
  • the manual control of drone flight is limited by personal experience, and sometimes it is impossible to reasonably control the movement of the drone.
  • the drone is automatically controlled to avoid the detected object or approach the detected object based on the detection results. To a certain extent, the movement of the drone can be more reasonably controlled for the detected object in space.
  • when controlling the drone to decelerate along the target flight direction to approach the detected object, the processor 302 is also configured to execute:
  • when the distance between the drone and the detected object is not greater than the specified stopping distance, the drone is controlled to stop flying.
  • processor 302 is also used to execute:
  • after the UAV stops flying, the UAV is controlled to perform a target operation on the detected object.
  • the specified stopping distance is set based on one or more of the following factors:
  • processor 302 is specifically configured to execute:
  • the target safe speed is positively related to the current distance between the drone and the detected object;
  • the local map information is generated based on multi-frame depth images of the current surrounding environment collected by the drone.
  • the multi-frame depth images are obtained by sensors mounted on the drone in different directions.
  • processor 302 is also specifically configured to execute:
  • the target safe speed is determined based on the first safe speed and the second safe speed.
  • processor 302 is also specifically configured to execute:
  • the first safe speed of the UAV is calculated; the first safe speed is positively correlated with the current distance difference, where the current distance difference is the difference between the current distance and the specified stopping distance.
  • processor 302 is also specifically configured to execute:
  • the second safe speed is determined based on the speed value and weight value corresponding to each depth interval; the speed value corresponding to the depth interval is positively related to the depth value represented by the depth interval.
  • processor 302 is also specifically configured to execute:
  • the minimum value of the first safety speed and the second safety speed is determined as the target safety speed.
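A minimal sketch of combining the two safe speeds, taking their minimum as the target safe speed; the linear gain and the speed cap used in the first safe speed are illustrative assumptions, not values from the patent:

```python
def first_safe_speed(current_distance, stop_distance, gain=0.5, v_max=5.0):
    """First safe speed: positively correlated with the difference between the
    current distance and the specified stopping distance. The linear gain and
    the cap are illustrative assumptions."""
    return max(0.0, min(v_max, gain * (current_distance - stop_distance)))

def target_safe_speed(v_first, v_second):
    """The minimum of the first and second safe speeds is taken as the
    target safe speed."""
    return min(v_first, v_second)

v1 = first_safe_speed(current_distance=8.0, stop_distance=2.0)  # 3.0 m/s
print(target_safe_speed(v1, 2.5))  # 2.5 -> the single-frame limit dominates
```

Taking the minimum means whichever source of information (local map or single-frame depth image) is more conservative at the moment governs the commanded speed.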
  • the local map information is generated based on the multi-frame depth images when the distance between the drone and the detected object reaches a preset mapping distance, and the preset mapping distance is directly related to the specified stopping distance of the drone.
  • the detected object includes an object located in a preset range of space indicated by the target flight direction, and the preset range of space moves with the movement of the UAV in space.
  • the size of the space in the preset range is set with horizontal and vertical boundary sizes based on the shape of the drone.
  • the boundary size of the space in the preset range in the vertical direction is smaller than the boundary size in the horizontal direction.
  • processor 302 is also used to execute:
  • the flight direction determined in response to the flight control instruction is determined as the target flight direction.
  • processor 302 is also specifically configured to execute:
  • embodiments of the present application also provide a computer-readable storage medium, which stores a computer program that, when run on a computer, causes the computer to perform each step in the above method and achieve the same technical effects; to avoid repetition, these effects are not described again here.
  • embodiments of the present application also provide a computer program product containing instructions, which when the instructions are run on a computer, cause the computer to execute the above method.
  • embodiments of the present application also provide an unmanned aerial vehicle, which is used to implement each step in the above method and can achieve the same technical effect. To avoid duplication, it will not be described again here.
  • the device embodiments described above are only illustrative.
  • the units described as separate components may or may not be physically separated.
  • the components shown as units may or may not be physical units; that is, they may be located in one location or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment. Persons of ordinary skill in the art can understand and implement the method without creative effort.
  • Various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor may be used in practice to implement some or all functions of some or all components in the computing processing device according to embodiments of the present application.
  • the present application may also be implemented as an apparatus or device program (eg, computer program and computer program product) for performing part or all of the methods described herein.
  • Such a program implementing the present application may be stored on a computer-readable medium, or may be in the form of one or more signals. Such signals may be downloaded from an Internet website, or provided on a carrier signal, or in any other form.
  • FIG. 9 is a block diagram of a computing processing device provided by an embodiment of the present application; the diagram illustrates a computing processing device that can implement the method according to the present application.
  • the computing processing device conventionally includes a processor 410 and a computer program product or computer-readable medium in the form of memory 420 .
  • Memory 420 may be electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 420 has a storage space 430 for program codes for executing any method steps in the above-described methods.
  • the storage space 430 for program codes may include individual program codes respectively used to implement various steps in the above method.
  • These program codes can be read from or written into one or more computer program products.
  • These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such computer program products are typically portable or fixed storage units as described with reference to FIG. 10 .
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 420 in the computing processing device of FIG. 9 .
  • the program code may, for example, be compressed in a suitable form.
  • the storage unit includes computer-readable code, i.e., code that can be read by a processor such as the processor 410; when executed by a computing processing device, the code causes the computing processing device to perform each step of the methods described above.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” does not exclude the presence of elements or steps not listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the application may be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
  • the use of the words first, second, third, etc. does not indicate any order. These words can be interpreted as names.

Abstract

A flight control method and apparatus for an unmanned aerial vehicle, an unmanned aerial vehicle, and a storage medium. The method comprises: obtaining description information of a target object in the space; obtaining a target flight direction of an unmanned aerial vehicle; detecting, on the basis of a sensor carried by the unmanned aerial vehicle, description information of a detected object in the space indicated by the target flight direction; if the description information of the detected object matches the description information of the target object, controlling the unmanned aerial vehicle to perform a decelerating motion along the target flight direction to approach the detected object; and if the description information of the detected object does not match the description information of the target object, controlling the unmanned aerial vehicle to change the flight direction to bypass the detected object. Thus, whether the description information of the detected object matches the description information of the target object is automatically detected, and the unmanned aerial vehicle is automatically controlled, on the basis of a detection result, to avoid or approach the detected object, so that to a certain extent, the control efficiency can be improved and the labor cost can be reduced.

Description

无人机的飞行控制方法、装置、无人机以及存储介质UAV flight control method, device, UAV and storage medium 技术领域Technical field
本申请涉及无人机技术领域,特别是涉及一种无人机的飞行控制方法、装置、无人机以及存储介质。The present application relates to the technical field of drones, and in particular to a flight control method and device for a drone, a drone and a storage medium.
背景技术Background technique
目前,随着无人机技术的不断发展,无人机得到了越来越多的应用。在使用无人机的过程中,如何控制无人机安全飞行成为日益关注的问题。At present, with the continuous development of drone technology, drones are being used more and more. In the process of using drones, how to control drones to fly safely has become an issue of increasing concern.
在先技术中,在控制无人机飞行时,往往是用户基于自身对周围环境的判断,手动控制无人机进行飞行,以确保飞行安全。这种方式中,效率较低且人工成本较高。In the prior art, when controlling the flight of a drone, the user often manually controls the drone to fly based on his or her own judgment of the surrounding environment to ensure flight safety. In this way, efficiency is lower and labor costs are higher.
发明内容Contents of the invention
本申请提供一种无人机的飞行控制方法、装置、无人机以及存储介质,一定程度上可以提高控制效率,降低人工成本。This application provides a UAV flight control method, device, UAV and storage medium, which can improve control efficiency and reduce labor costs to a certain extent.
第一方面,本申请实施例提供了一种无人机的飞行控制方法,该方法包括:In a first aspect, embodiments of the present application provide a flight control method for a UAV, which method includes:
获取空间中的目标物的描述信息;Obtain description information of target objects in space;
获取无人机的目标飞行方向;Obtain the target flight direction of the drone;
基于所述无人机搭载的传感器,检测所述目标飞行方向指示的空间中的被探测物体的描述信息;Based on the sensor mounted on the drone, detect the description information of the detected object in the space indicated by the target flight direction;
若所述被探测物体的描述信息与所述目标物的描述信息匹配,则控制所述无人机沿着所述目标飞行方向减速运动至接近所述被探测物体;If the description information of the detected object matches the description information of the target object, control the drone to decelerate along the target flight direction to approach the detected object;
若所述被探测物体的描述信息与所述目标物体的描述信息不匹配,则控制所述无人机改变飞行方向以绕行所述被探测物体。If the description information of the detected object does not match the description information of the target object, the drone is controlled to change its flight direction to bypass the detected object.
In a second aspect, embodiments of the present application provide a flight control apparatus for a UAV, the apparatus including a memory and a processor;
the memory being configured to store program code; and
the processor being configured to call the program code and, when the program code is executed, to perform the following operations:
obtaining description information of a target object in a space;
obtaining a target flight direction of the UAV;
detecting, based on a sensor mounted on the UAV, description information of a detected object in the space indicated by the target flight direction;
if the description information of the detected object matches the description information of the target object, controlling the UAV to decelerate along the target flight direction until it approaches the detected object; and
if the description information of the detected object does not match the description information of the target object, controlling the UAV to change its flight direction to detour around the detected object.
In a third aspect, embodiments of the present application provide a UAV configured to implement the above method.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium including instructions that, when run on a computer, cause the computer to perform the above method.
In a fifth aspect, embodiments of the present application provide a computer program product containing instructions that, when run on a computer, cause the computer to perform the above method.
In the embodiments of the present application, description information of a target object in a space is obtained, and a target flight direction of the UAV is obtained. Based on a sensor mounted on the UAV, description information of a detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches the description information of the target object, the UAV is controlled to decelerate along the target flight direction until it approaches the detected object. If the description information of the detected object does not match the description information of the target object, the UAV is controlled to change its flight direction to detour around the detected object. In this way, by automatically checking whether the description information of the detected object matches that of the target object, and automatically controlling the UAV to either avoid or approach the detected object based on the result, control efficiency can be improved and labor costs reduced to a certain extent. At the same time, collisions between the UAV and the detected object can be avoided, ensuring flight safety while still allowing the UAV to get close to the target object in the space, thereby balancing flight safety with scenario requirements.
Moreover, when a UAV is flown manually, the operator's personal experience is a limiting factor, and the UAV's motion sometimes cannot be controlled appropriately. In the embodiments of the present application, the UAV is automatically controlled to avoid or approach the detected object based on the detection result, so that, to a certain extent, its motion relative to detected objects in the space can be controlled more appropriately.
Description of the Drawings
Figure 1 is a flowchart of the steps of a UAV flight control method provided by an embodiment of the present application;
Figure 2 is a schematic diagram of a speed change provided by an embodiment of the present application;
Figure 3 is a schematic diagram of a query provided by an embodiment of the present application;
Figure 4 is a schematic diagram of a function curve provided by an embodiment of the present application;
Figure 5 is a schematic diagram of a collision-query channel provided by an embodiment of the present application;
Figure 6 is a schematic diagram of a preset range provided by an embodiment of the present application;
Figure 7 is a system flow diagram involved in a flight control method provided by an embodiment of the present application;
Figure 8 is a block diagram of a UAV flight control apparatus provided by an embodiment of the present application;
Figure 9 is a block diagram of a computing and processing device provided by an embodiment of the present application;
Figure 10 is a block diagram of a portable or fixed storage unit provided by an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.
First, an exemplary application scenario of the embodiments of the present application is described. In this scenario, a UAV needs to inspect a specific object in a space in order to perform a specified operation; accordingly, during flight the UAV needs to get close to that specific object to complete the operation. However, there are often many objects distributed in the space, sometimes in complex arrangements, and to ensure flight safety the UAV must avoid them. Flight control therefore needs to balance safety with the requirement that the UAV be able to approach the specified object.
To this end, embodiments of the present application provide a flight control method for a UAV.
Figure 1 is a flowchart of the steps of a UAV flight control method provided by an embodiment of the present application. As shown in Figure 1, the method may include:
Step 101: obtain description information of a target object in a space.
Step 102: obtain a target flight direction of the UAV.
In the embodiments of the present application, the space may refer to the space in which the UAV is located, and the target object may be specified in advance; it may be the specified object in the application scenario above. The description information of the target object may be information that can be used to identify the target object, for example, image information of the target object, position distribution information, and so on. By obtaining the description information of the target object, the UAV can accurately learn which object it needs to approach during the current flight. Further, the target flight direction may refer to the current flight direction of the UAV. For example, the current flight direction may be detected based on a direction sensor mounted on the UAV, thereby obtaining the target flight direction. Of course, the target flight direction may also be obtained in other ways, which is not limited in the present application.
Step 103: based on a sensor mounted on the UAV, detect description information of a detected object in the space indicated by the target flight direction.
In the embodiments of the present application, the space indicated by the target flight direction may be the space within a preset range along the target flight direction, where the preset range can be set according to actual needs. The description information of the detected object may be information that can be used to identify the detected object, for example, its image information, position information, and so on. The detected object may be an object that falls within the space indicated by the target flight direction. For example, sensors capable of detecting objects, such as infrared sensors or ultrasonic sensors mounted on the UAV, may scan the space indicated by the target flight direction to determine whether an object appears in it; any object that appears in that space may be determined to be a detected object. Further, an image of the detected object may be captured by an onboard visual sensor and used as its description information, or the position information of the detected object may be collected and used as its description information.
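As an illustrative sketch of the "preset range along the target flight direction" check described above, the following function tests whether a detected point lies inside a truncated cone aligned with the flight direction. The cone geometry, half-angle, and range limit are assumptions chosen for illustration; the embodiments do not prescribe a particular shape for the preset range.

```python
import math

def in_flight_corridor(drone_pos, flight_dir, obj_pos,
                       max_range=20.0, half_angle_deg=30.0):
    """True if obj_pos lies inside an assumed detection region along the
    target flight direction: a cone of the given half-angle truncated at
    max_range. Positions are 3-D tuples in the navigation frame; the
    cone shape and both parameters are illustrative assumptions."""
    dx = [o - p for o, p in zip(obj_pos, drone_pos)]
    dist = math.sqrt(sum(c * c for c in dx))
    if dist == 0.0 or dist > max_range:
        return False
    dir_norm = math.sqrt(sum(c * c for c in flight_dir))
    # cosine of the angle between the flight direction and the object bearing
    cos_angle = sum(a * b for a, b in zip(dx, flight_dir)) / (dist * dir_norm)
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

An object 10 m dead ahead passes the check, while one directly overhead or beyond the range limit does not.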
Step 104: if the description information of the detected object matches the description information of the target object, control the UAV to decelerate along the target flight direction until it approaches the detected object.
Step 105: if the description information of the detected object does not match the description information of the target object, control the UAV to change its flight direction to detour around the detected object.
Here, the description information of the detected object matching that of the target object can be understood as meaning that the currently detected object is the target object; a mismatch means that it is not. Specifically, taking image information as an example of the description information, after the image information of the target object is obtained, it can be combined with the captured image information of the detected object, and a detection and tracking module based on a convolutional neural network (CNN) carried by the UAV can determine whether the detected object is the target object. In a specific implementation, the position information of the detected object may be further determined and associated with the object position distribution information of the space, so that the UAV can locate the detected object based on the associated distribution information and then approach or detour around it. The object position distribution information may be a map of the current space, and the position information of the detected object may be its position in a three-dimensional navigation coordinate system.
If the currently detected object is the target object, there is a target object in the current flight direction, so there is no need to change direction: the UAV can be controlled to decelerate along the target flight direction until it approaches the detected object, which facilitates subsequent operations on the target. At the same time, decelerating while approaching avoids excessive speed to a certain extent, ensuring the stability of the UAV during the approach. If the currently detected object is not the target object, the UAV can be directly controlled to change its flight direction and detour around it, ensuring flight safety while avoiding an unnecessary deceleration-and-approach maneuver. It should be noted that the flight control method provided by the embodiments of the present application may be applied to a UAV itself, i.e., the UAV automatically controls itself to approach or detour around the detected object, or to a control device of the UAV, i.e., the control device automatically controls the controlled UAV to approach or detour around the detected object; the embodiments of the present application place no limit on this.
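The decision between steps 104 and 105 can be sketched as a single function. The plain equality check below stands in for the CNN-based detection and tracking module mentioned above; it is only a placeholder so the sketch is self-contained.

```python
def plan_action(detected_info, target_info, match_fn=None):
    """Return the flight action of step 104/105 for one detected object."""
    if match_fn is None:
        # Placeholder matcher: the embodiments describe a CNN-based
        # detection/tracking module; plain equality is used here only
        # to keep the sketch runnable.
        match_fn = lambda a, b: a == b
    if match_fn(detected_info, target_info):
        return "decelerate_approach"  # step 104: close in on the target
    return "detour"                   # step 105: change direction, go around
```

In practice `match_fn` would wrap the onboard matching module and compare richer description information (images, positions) rather than raw values.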
To sum up, the UAV flight control method provided by the embodiments of the present application obtains description information of a target object in a space and obtains the target flight direction of the UAV. Based on a sensor mounted on the UAV, description information of a detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches that of the target object, the UAV is controlled to decelerate along the target flight direction until it approaches the detected object; if it does not match, the UAV is controlled to change its flight direction to detour around the detected object. In this way, by automatically checking whether the two sets of description information match and automatically controlling the UAV to avoid or approach the detected object based on the result, control efficiency can be improved and labor costs reduced to a certain extent. At the same time, collisions between the UAV and the detected object can be avoided, ensuring flight safety while still allowing the UAV to get close to the target object in the space, thereby balancing flight safety with scenario requirements.
Moreover, when a UAV is flown manually, the operator's personal experience is a limiting factor, and the UAV's motion sometimes cannot be controlled appropriately. In the embodiments of the present application, the UAV is automatically controlled to avoid or approach the detected object based on the detection result, so that, to a certain extent, its motion relative to detected objects in the space can be controlled more appropriately.
Optionally, in the embodiments of the present application, when controlling the UAV to decelerate along the target flight direction until it approaches the detected object, i.e., when the currently detected object is the target object, the following operation may further be performed: when the distance between the UAV and the detected object is not greater than a specified stopping distance, control the UAV to stop flying. That is, while the UAV decelerates along the target flight direction toward the detected object, the current distance between the two can be continuously measured and compared with the specified stopping distance. When the current distance equals the specified stopping distance, the UAV can be controlled to stop flying, so that it decelerates along the target flight direction and comes to rest at the specified stopping distance from the detected object. In this way, there is no need to manually adjust the distance between the UAV and the target object; by automatically measuring the distance and stopping flight at the specified stopping distance, the distance to the target can be controlled automatically and precisely while further saving labor costs. The specified stopping distance may be set in advance according to actual needs, and controlling the UAV to stop flying may mean reducing its flight speed to zero at the specified stopping distance from the detected object. In the embodiments of the present application, controlling the UAV to decelerate to the specified stopping distance from the detected object ensures, to a certain extent, that the UAV can operate on the target object while preventing it from getting too close, thereby ensuring safety during the operation.
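The deceleration-and-stop behaviour described above can be illustrated with a minimal simulation. The proportional speed law, gains, and time step below are assumptions for illustration only, not part of the described method; the point is that the commanded speed falls with the remaining gap and flight stops once the separation is no greater than the specified stopping distance.

```python
def approach_and_stop(distance, stop_distance=2.0, dt=0.1,
                      gain=0.8, min_speed=0.05):
    """Simulate decelerating toward the detected object and stopping
    once the separation is no greater than the specified stopping
    distance. Commanded speed is proportional to the remaining gap
    (so it shrinks as the UAV closes in), floored at min_speed so the
    simulation terminates. Returns the final separation in metres."""
    while distance > stop_distance:
        speed = max(gain * (distance - stop_distance), min_speed)
        distance -= speed * dt
    return distance
```

Starting 10 m away with a 2 m stopping distance, the simulated UAV comes to rest just at (within a few millimetres of) the 2 m mark.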
Further, in the embodiments of the present application, after the UAV stops flying, it may be controlled to perform a target operation on the detected object, thereby meeting the operational requirements for the target object. The target operation can be specified in advance according to actual needs. For example, performing the target operation may involve observing the target object to determine whether it is in a preset state and transmitting the observation result back to the user equipment of monitoring personnel. For example, in a wind power generation scenario, it is possible to observe whether wind power equipment is operating normally and send back an observation result indicating its status, so that monitoring personnel can perform maintenance in time when the equipment is not operating normally. Alternatively, the target operation may be capturing images of the target object, spraying maintenance fluid on it, and so on.
In one implementation, the specified stopping distance may be set based on one or more of the following factors: the density of objects in the space; the description information of the target object; and the operation type of the UAV. The specified stopping distance may be preset before the UAV performs the current mission, or set during flight according to these factors. In the embodiments of the present application, setting the specified stopping distance based on these factors makes it better suited to the current flight scenario to a certain extent, thereby ensuring that the chosen stopping distance is reasonable. In practice, the user may be provided with a selectable range, and the value the user selects from this range is set as the specified stopping distance. The endpoint of the selectable range can be adjusted automatically based on one or more of the factors above. Specifically, the denser the objects in the space, the more complex the shape of the target object as represented by its description information, and the more difficult the UAV's operation type, the larger the endpoint of the selectable range can be, and hence the larger the stopping distance the user can set, thereby ensuring the safety of the UAV when performing target operations on detected objects in the space. It should be noted that the minimum value of the selectable range may be constrained to be no less than a preset safe distance; this avoids the problem of the specified stopping distance being set too small and the UAV getting too close to the target object.
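One possible sketch of such a selectable range is given below: the upper endpoint grows with the most demanding of the three factors, and the lower endpoint never drops below the preset safe distance. The normalised 0-to-1 scoring of the factors, the 1.5 m safe distance, and the 10 m scale are all assumptions introduced for illustration; the embodiments do not specify a formula.

```python
def stop_distance_range(density, shape_complexity, task_difficulty,
                        min_safe_distance=1.5):
    """Selectable (lower, upper) range for the specified stopping
    distance. Inputs are assumed normalised 0..1 scores for object
    density, target shape complexity and operation difficulty. The
    upper endpoint grows with the most demanding factor; the lower
    endpoint is pinned at the preset safe distance."""
    upper = min_safe_distance + 10.0 * max(density, shape_complexity,
                                           task_difficulty)
    return (min_safe_distance, upper)
```

A user-facing interface would then only accept stopping-distance values inside the returned range.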
Optionally, the above operation of controlling the UAV to decelerate along the target flight direction until it approaches the detected object may specifically include the following steps:
Step S21: determine a target safe speed based on local map information and/or a single-frame depth image, where the target safe speed is positively correlated with the current distance between the UAV and the detected object.
The local map information may be used to represent a local map of the UAV's surrounding environment, and the depth image may be a depth image of the current surroundings captured by the UAV, in which the pixel values are the depths from the UAV's image collector to points in the surrounding environment.
Step S22: control the UAV to move along the target flight direction and decelerate at the target safe speed until it approaches the detected object.
In the embodiments of the present application, the target safe speed is positively correlated with the current distance between the UAV and the detected object; that is, the smaller the current distance, the smaller the determined target safe speed. As the UAV keeps approaching the detected object, the target safe speed can be continuously updated and becomes smaller and smaller. In this way, while the UAV is controlled to decelerate at the target safe speed toward the detected object, it slows down gradually, ensuring that the speed changes smoothly during the approach and thus ensuring the flight safety of the UAV.
Optionally, in one implementation, the above operation of determining the target safe speed based on local map information and/or a single-frame depth image may specifically include:
Step S31: calculate a first safe speed of the UAV based on the local map information.
In one implementation, the local map information may be generated from multiple frames of depth images of the current surroundings captured by the UAV. Generating the local map information from multiple depth frames allows it to represent the local map of the UAV's current surroundings relatively accurately, which to a certain extent makes the determined safe speed better suited to the current environment and thus improves the speed-planning result.
Step S32: calculate a second safe speed of the UAV based on the single-frame depth image.
Step S33: determine the target safe speed based on the first safe speed and the second safe speed.
In this step, the target safe speed may be positively correlated with the first and second safe speeds. For example, the mean of the first and second safe speeds may be used as the target safe speed; alternatively, the minimum of the two may be used. By selecting the minimum as the target safe speed, a reasonable safe speed can be set while avoiding, as far as possible, the problem of the target safe speed being too large.
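Step S33 as described, combining the two safe speeds by either the mean or the minimum, can be written directly; taking the minimum is the conservative choice that avoids an overly large target safe speed.

```python
def target_safe_speed(v_local_map, v_depth_image, use_min=True):
    """Combine the first safe speed (from the local map) and the second
    safe speed (from the single-frame depth image) into the target safe
    speed. The described method allows either the minimum of the two
    (conservative) or their mean."""
    if use_min:
        return min(v_local_map, v_depth_image)
    return 0.5 * (v_local_map + v_depth_image)
```

With a map-derived speed of 3 m/s and a depth-derived speed of 5 m/s, the minimum rule yields 3 m/s and the mean rule 4 m/s.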
In the embodiments of the present application, speed planning is performed based on both the local map information and the single-frame depth image: the first and second safe speeds are determined from them respectively, and the target safe speed is determined from the two. The local map is robust to noise and has a "memory" of previously observed regions, giving a degree of blind-spot awareness, while the single-frame depth image represents distance and compensates for the local map's limited range. Combining the local map information with the single-frame depth image for speed planning therefore makes the determined target safe speed more reasonable to a certain extent. It should be noted that speed planning may also be performed based only on the local map information, or only on the single-frame depth image, which reduces the computational load of speed planning to a certain extent.
Moreover, compared with braking directly, in the embodiments of the present application, when automatically switching to braking mode, i.e., when controlling the UAV to stop at the specified stopping distance from the detected object, smooth speed planning that combines the local map information with the single-frame depth image enables the UAV, to a certain extent, to decelerate smoothly to a position close to the detected object, so that the distance between the stopped UAV and the detected object can be controlled precisely.
Further, the above local map information may specifically be generated from the multiple depth frames when the distance between the UAV and the detected object reaches a preset mapping distance, where the preset mapping distance is positively correlated with the specified stopping distance of the UAV. Specifically, while the UAV decelerates along the target flight direction toward the detected object, the current distance between the two can be continuously measured and compared with the preset mapping distance; when they are equal, the local map information can be generated from the collected depth frames, for example by fusing them. The preset mapping distance may be greater than the specified stopping distance, and making it positively correlated with the specified stopping distance ensures that there is sufficient distance between the position where the UAV starts mapping and the point at the specified stopping distance from the detected object. This avoids the problem of speed planning based on the established local map only beginning at positions close to the specified stopping distance, which would make the speed planning less effective.
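A minimal sketch of the switch described above: before the UAV comes within the preset mapping distance (itself set greater than, and positively correlated with, the specified stopping distance) the first safe speed keeps a default value; once within it, the local map exists and the first safe speed becomes a finer-grained, distance-dependent value. The linear speed law, the scaling factor, and all constants are illustrative assumptions.

```python
def preset_mapping_distance(stop_distance, factor=5.0):
    """Preset mapping distance, set greater than and positively
    correlated with the specified stopping distance (factor assumed)."""
    return factor * stop_distance

def first_safe_speed(distance, mapping_distance,
                     default_speed=8.0, gain=0.6, stop_distance=2.0):
    """First safe speed: the default value until the UAV comes within
    the preset mapping distance (no local map yet), then a finer-grained
    value that shrinks with the remaining gap to the stopping point."""
    if distance > mapping_distance:
        return default_speed  # local map not built yet
    return max(gain * (distance - stop_distance), 0.0)
```

For a 2 m stopping distance the mapping distance here would be 10 m: at 15 m the speed stays at its 8 m/s default, at 8 m it drops to the map-based value, and at or inside the stopping distance it reaches zero.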
It should be noted that, in the embodiments of the present application, the local map information may be used for short-range, fine-grained speed planning, while the single-frame depth image is used for long-range, coarse-grained speed planning. Before the preset mapping distance is reached, a default value can be used as the first safe speed. After the local map information has been built, the first safe speed is determined from the local map information. Because the first safe speed determined from the local map information is usually smaller than the second safe speed, once the preset mapping distance is reached and the local map information has been built, the drone is controlled to fly with the smaller first safe speed as the target safe speed. In other words, the drone's speed drops abruptly when the preset mapping distance is reached. By making the preset mapping distance positively correlated with the specified stopping distance, the embodiments of the present application also avoid such a sudden speed drop occurring at a position close to the specified stopping distance from the target, so the drone's stopping position can be controlled more precisely. Exemplarily, Figure 2 is a schematic diagram of speed changes provided by an embodiment of the present application. As shown in Figure 2, the horizontal axis represents the distance to the detected object and the vertical axis represents the speed value. Line 1 represents the first safe speed, line 2 the second safe speed, and line 3 the flight direction. It can be seen that before the preset mapping distance is reached, the first safe speed takes the default value and is greater than the second safe speed; once the preset mapping distance is reached, the first safe speed determined from the local map information is smaller than the second safe speed. It should be noted that after the drone performs the target operation, it may continue flying away from the currently detected object. As that distance grows, the allowed speed can increase; and because the distance to the detected object then no longer satisfies the preset mapping distance, the first safe speed can return to the default value.
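The speed-selection behavior described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the default value, the function names and the numeric default speed are assumptions.

```python
def target_safe_speed(dist_to_object: float,
                      preset_mapping_dist: float,
                      second_safe_spd: float,
                      first_safe_spd_from_map) -> float:
    """Pick the target safe speed.

    Before the preset mapping distance is reached, the first safe speed is a
    default value (larger than the second safe speed); once local mapping has
    started, it is computed from the local map information and is typically
    smaller. The target safe speed is the minimum of the two.
    """
    DEFAULT_FIRST_SAFE_SPD = 10.0  # m/s, illustrative default value
    if dist_to_object > preset_mapping_dist:
        first_safe_spd = DEFAULT_FIRST_SAFE_SPD
    else:
        first_safe_spd = first_safe_spd_from_map()  # from local map info
    return min(first_safe_spd, second_safe_spd)
```

Far from the object the (larger) default first safe speed never binds, so the second safe speed governs; once mapping starts, the smaller map-based first safe speed takes over, producing the speed drop shown in Figure 2.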
Optionally, the multi-frame depth images used to build the local map information are acquired by a sensor mounted on the drone in different orientations. Building the local map information from depth images acquired in different orientations allows the built map to represent the environment in every direction, so that the drone can perceive its distance to the detected object omnidirectionally based on the local map information. Exemplarily, a vision sensor may be provided in a first direction on the drone. When collecting multi-frame depth images of the current surroundings, depth images in the first direction can be captured by this vision sensor, and the drone can be controlled to rotate toward a second direction, different from the first, to capture depth images in directions not covered by a vision sensor. In this way, even a drone without omnidirectional sensing can obtain multi-frame depth images in different orientations and thus perceive the detected object omnidirectionally. For example, a drone carrying vision sensors only in the fore-and-aft direction can be controlled, after takeoff, to rotate through several yaw angles (Yaw) about its own center to collect depth images in the second direction. Moreover, since the drone's attitude changes during flight, it can be controlled to capture depth images with the vision sensor after each attitude change. In this way, depth images in different orientations are accumulated over time.
Optionally, the above operation of calculating the first safe speed of the drone based on the local map information may specifically include:
Step S41: determining the current distance between the drone and the detected object based on the local map information.
In this step, a query trajectory can be generated from the current flight direction together with the local map information, and a collision query can then be performed along this trajectory to determine the current distance between the drone and the detected object. This avoids operating directly on the point cloud and thus saves computing resources. The current flight direction also provides prior information: the query is performed only in the direction in which the drone is flying, and other, irrelevant directions do not participate in the computation, further reducing resource consumption. The specific way of generating the query trajectory can be chosen according to actual needs and is not limited by this application; exemplarily, it may be generated based on a differentially flat model or on a convex-optimization-based method.
When performing a collision query along the query trajectory, the current position of the drone's center point can be used as the initial query point, and the next trajectory sampling step can be computed from the distance, as represented in the local map, between the query point and the nearest nearby object, together with the current collision-query radius. The local map used for the collision query can be any environment map that supports such queries, for example an occupancy map, an ESDF (Euclidean signed distance field) map, and so on.
The next query point is determined from the trajectory sampling step, and the following trajectory sampling step is recomputed from the distance between the new query point and the nearest nearby object together with the current collision-query radius, until the drone reaches the specified stopping distance from the detected object. Further, the current distance between the drone and the detected object is determined from the accumulated trajectory sampling steps, for example by taking the sum of the sampling steps as the current distance.
Exemplarily, let esdf denote the distance between the current query point and the nearest nearby object, r the current collision-query radius, and step the next trajectory sampling step:
Figure PCTCN2022081230-appb-000001
In this way, the computed step becomes more reasonable. For example, Figure 3 is a query schematic diagram provided by an embodiment of the present application. As shown in Figure 3, with a uniform step size that directly uses esdf as the next trajectory sampling step, the next query point would be point 2, whereas it should actually be point 1. The embodiments of the present application therefore use a non-uniform step computed in the above manner, which makes the computed step more reasonable and avoids measuring a distance longer than the actual one. Further, the point at the specified stopping distance from the detected object is taken as the target position and treated as an obstacle. Because the step is non-uniform, a single step may land right in front of the target position, causing the drone to overshoot it; the hatched part in Figure 3 represents the portion by which the drone would exceed the specified stopping distance. The detected current distance can therefore be corrected. Exemplarily, let real_dist denote the corrected current distance, checked_dist the detected current distance, and checked_esdf the detected distance to the nearest obstacle; the correction can use the formula real_dist = checked_dist - (r - checked_esdf), where checked_dist can be the distance from the labeled point to point 4 in Figure 3 and (r - checked_esdf) is the distance from point 3 to point 4. Through this correction, real_dist becomes more reasonable and the drone is prevented from overshooting the target position.
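The stepping-and-correction procedure can be sketched as below. Note that the patent's step formula is shown only as an image (Figure PCTCN2022081230-appb-000001) and is not reproduced here; the form used in this sketch, advancing by the clearance minus the query radius with a small clamp so the march terminates, is an assumption chosen to be consistent with the stated correction formula.

```python
def march_distance(esdf_at, r: float, max_dist: float,
                   step_min: float = 0.2) -> float:
    """Sphere-march along the query trajectory (here parameterised by arc
    length) until the collision-query sphere of radius r overlaps the
    nearest obstacle, then apply the patent's correction
    real_dist = checked_dist - (r - checked_esdf)."""
    checked_dist = 0.0
    while checked_dist < max_dist:
        esdf = esdf_at(checked_dist)  # clearance at the current query point
        if esdf < r:                  # query sphere already overlaps an obstacle
            return checked_dist - (r - esdf)  # corrected current distance
        # Non-uniform step (assumed form): advance by the clearance minus the
        # query radius, clamped so the march always makes progress.
        checked_dist += max(esdf - r, step_min)
    return max_dist
```

For an obstacle 10 m ahead and a query radius of 1 m, the corrected distance is 9 m, i.e., the distance the drone's center can travel before its radius-r sphere touches the obstacle.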
Step S42: calculating the first safe speed of the drone based on the current distance and the drone's specified stopping distance, where the first safe speed is positively correlated with the current distance difference, the current distance difference being the difference between the current distance and the specified stopping distance.
Specifically, the difference between the current distance and the specified stopping distance can be used as the input of a preset function, and the output of the preset function can be taken as the first safe speed. Exemplarily, with map_safe_spd denoting the first safe speed and safe_dist the specified stopping distance: map_safe_spd = f(real_dist - safe_dist).
In the embodiments of the present application, the current distance between the drone and the detected object is determined from the local map information, and the first safe speed is computed from this current distance and the drone's specified stopping distance; the first safe speed is positively correlated with the current distance difference, i.e., the difference between the current distance and the specified stopping distance. As a result, the closer the drone gets, the smaller the planned first safe speed becomes, yielding smooth speed planning. Moreover, local map information is more robust to noise, which to a certain extent ensures that the first safe speed is reasonable.
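The mapping from distance difference to first safe speed can be sketched as follows. The patent leaves the preset function f unspecified beyond being increasing in the distance difference; the clipped linear form, the gain and the speed cap below are purely illustrative assumptions.

```python
def first_safe_speed(real_dist: float, safe_dist: float,
                     gain: float = 2.0, v_max: float = 10.0) -> float:
    """map_safe_spd = f(real_dist - safe_dist), with f chosen here as a
    clipped linear map: zero at or inside the specified stopping distance,
    growing with the distance difference, saturating at v_max."""
    diff = max(real_dist - safe_dist, 0.0)  # no forward speed at/inside stop distance
    return min(gain * diff, v_max)
```

Any monotone f with f(0) = 0 gives the same qualitative behavior: the planned speed shrinks smoothly to zero as the drone approaches the stopping distance.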
Optionally, the above operation of calculating the second safe speed of the drone based on the single-frame depth image may specifically include:
Step S51: determining, based on the number of pixels in the single-frame depth image that fall into each depth interval and the total number of pixels in the single-frame depth image, the pixel-count proportion corresponding to each depth interval.
Here, the single-frame depth image can be continuously captured and updated in the flight direction while the drone approaches the target. Specifically, a depth-map speed-planning module can compute a depth histogram from the single-frame depth image by determining, for each pixel, the depth interval to which the depth value represented by that pixel belongs, yielding the interval into which each pixel falls. Of course, in practical applications, single-frame depth-map speed planning could instead directly compute the nearest point cloud; the embodiments of the present application place no limitation on this. For any depth interval, the ratio of the number of pixels falling into that interval to the total number of pixels in the single-frame depth image can be computed to obtain the pixel-count proportion of that interval.
Step S52: determining a weight value corresponding to each depth interval according to the pixel-count proportion of that interval, the weight value of a depth interval being positively correlated with its pixel-count proportion.
Step S53: determining the second safe speed based on the speed value and weight value corresponding to each depth interval, the speed value of a depth interval being positively correlated with the depth value that the interval represents.
The closer the drone gets to the target, the larger the pixel-count proportion of the depth intervals with small depth values becomes. Therefore, by making each interval's speed value positively correlated with the depth value it represents, and each interval's weight positively correlated with its pixel-count proportion, the weights attached to low speed values grow as the drone approaches the target, so that, to a certain extent, the second safe speed gradually decreases during the approach.
The speed value corresponding to each depth interval can be preset. Exemplarily, the speed values corresponding to the depth ranges represented by the depth intervals can be as shown in the following table:
Figure PCTCN2022081230-appb-000002
Here, p0, p1, p2, p3, p4, p5, p6, p7, p8 and p9 denote the pixel-count proportions of the respective depth intervals. Further, the pixel-count proportions can serve as the original weights of the depth intervals, and these original weights can be adjusted to obtain the weight value of each depth interval.
With depth_safe_spd denoting the second safe speed: depth_safe_spd = Pnew * V, where V is the vector corresponding to the second row of the table above and Pnew is the adjusted vector corresponding to its third row. That is, the second safe speed can be obtained as the sum over all depth intervals of the product of the adjusted weight value and the speed value. It should be noted that the adjusted weight values may sum to 1, and each interval's weight value then determines the share of its speed value in the resulting second safe speed; summing the products of adjusted weight values and speed values is thus equivalent to a weighted average using the adjusted weights.
Let pnew denote an interval's adjusted weight value; pnew includes ph,new and pl,new. Here h takes the values 0 and 1, and ph denotes the weight of the high-speed intervals; l takes the values 2 through 9, and pl denotes the weight of the low-speed intervals. The adjusted weight values can be computed from the following formulas:
Ph,new = ph,new * Ph / ph, Pl,new = pl,new * Pl / pl; here P denotes a vector and p a scalar. That is, for all high-speed intervals, the ratio of the adjusted weight to the original weight gives a scaling factor, and multiplying each high-speed interval's original weight value by this factor gives its adjusted weight value. Likewise, for all low-speed intervals, the ratio of the adjusted weight to the original weight gives a scaling factor, and multiplying each low-speed interval's original weight value by this factor gives its adjusted weight value.
Further, ph,new and pl,new can be computed from the following formulas:
ph,new = f(ph), pl,new = 1 - ph,new;
In the specific adjustment, the weight of the interval group with the smaller original weight can be compressed, thereby increasing the weights of the other intervals. Figure 4 is a schematic diagram of a function curve provided by an embodiment of the present application. As shown in Figure 4, the smaller the pixel-count proportion, the larger the compression, i.e., the smaller the adjusted weight. Assuming ph < pl, ph can be compressed according to the function shown in Figure 4, making the adjusted ph smaller and the adjusted pl larger. In other words, the interval group with the smaller original weight is given lower credibility and the other intervals higher credibility, biasing the speed mapping accordingly.
The mapping function f(x) corresponding to f(ph) can be obtained by fourth-order curve fitting; f(x) can equal AX, where A can be [a1, a2, a3, a4, a5] and X = [x^4, x^3, x^2, x, 1], i.e., f(x) = a1*x^4 + a2*x^3 + a3*x^2 + a4*x + a5 (a fourth-order fit includes a constant term, consistent with the five example values given below). The values of a1 through a5 can be set according to actual needs; for example, a1, a2, a3, a4 and a5 can be -47.63, 44.2, -9.96, 0.9055 and -0.01439, respectively.
In the embodiments of the present application, the pixel-count proportion of each depth interval is computed from the single-frame depth image, the weight value of each interval is determined from its pixel-count proportion, and the second safe speed can then be determined from the intervals' weight values and speed values. Because no coordinate transformation of the point cloud is required and only the depth information is needed to perform speed planning, computing resources are saved to a certain extent.
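Steps S51 through S53 can be sketched as below. The quartic coefficients follow the example values in the text (read as including a constant term); the interval edges, the per-interval speed table V, the two-interval high-speed group (the text's h in {0, 1}) and the clipping of f to [0, 1] are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

# Fitted quartic from the text's example, highest power first, incl. constant.
A = np.array([-47.63, 44.2, -9.96, 0.9055, -0.01439])

def compress(p: float) -> float:
    """f(p) as in Figure 4: compresses small proportions toward zero."""
    return float(np.clip(np.polyval(A, p), 0.0, 1.0))

def second_safe_speed(depth: np.ndarray, edges: np.ndarray, V: np.ndarray,
                      n_high: int = 2) -> float:
    # S51: pixel-count proportion of each depth interval
    counts, _ = np.histogram(depth, bins=edges)
    p = counts / depth.size
    # S52: compress the high-speed group's total weight, give the remainder
    # to the low-speed group, and rescale the weights within each group
    ph, pl = p[:n_high].sum(), p[n_high:].sum()
    ph_new = compress(ph)
    pl_new = 1.0 - ph_new
    w = np.empty_like(p)
    w[:n_high] = p[:n_high] * (ph_new / ph) if ph > 0 else 0.0
    w[n_high:] = p[n_high:] * (pl_new / pl) if pl > 0 else 0.0
    # S53: weighted average of the per-interval speed values
    return float(w @ V)
```

Because the adjusted weights sum to 1, the result stays within the range spanned by V, and the dominant interval group's speed values dominate the average.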
It should be noted that after the final target safe speed is obtained from the local-map speed-planning result and the single-frame depth-map speed-planning result as safe_spd = min(map_safe_spd, depth_safe_spd), the target safe speed can be decomposed. Specifically, the collision query can be performed with separate vertical and horizontal channels. Exemplarily, Figure 5 is a channel schematic diagram of a collision query provided by an embodiment of the present application; as shown in Figure 5, the collision query can be performed on the horizontal channel 01 and the vertical channel 02, respectively. Correspondingly, the second safe speed determined from the single-frame depth map can also be decomposed into a second safe speed in the vertical direction and a second safe speed in the horizontal direction, from which the target safe speeds in the vertical and horizontal directions are obtained; the horizontal target safe speed can then be decomposed according to the direction of motion, finally yielding the target safe speed in the three dimensions x, y and z.
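The combination and decomposition can be sketched as follows. The patent only states safe_spd = min(map_safe_spd, depth_safe_spd) and that the horizontal and vertical channels are handled separately; applying the local-map speed to both channels and the straight-line split of the horizontal speed are assumptions made for illustration.

```python
import math

def target_speed_xyz(map_safe_spd: float,
                     depth_safe_spd_h: float, depth_safe_spd_v: float,
                     motion_dir_xy: tuple) -> tuple:
    """Combine the two planners per channel, then split the horizontal
    target safe speed along the motion direction to get x/y/z components."""
    v_h = min(map_safe_spd, depth_safe_spd_h)  # horizontal target safe speed
    v_v = min(map_safe_spd, depth_safe_spd_v)  # vertical target safe speed
    dx, dy = motion_dir_xy
    norm = math.hypot(dx, dy) or 1.0           # avoid division by zero
    return (v_h * dx / norm, v_h * dy / norm, v_v)
```

With separate channels, a tight vertical clearance can cap climb/descent speed without slowing horizontal motion, and vice versa.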
Optionally, the detected object may include an object located in a space of a preset range indicated by the target flight direction, the space of the preset range moving along with the drone's motion through space. The space of the preset range indicated by the target flight direction is the space within the preset range along the target flight direction. Exemplarily, the space of the preset range can be adjacent to the drone's nose, and as the drone keeps flying forward along the target flight direction, this space changes accordingly. Correspondingly, when an object in the current surroundings falls into the space of the preset range indicated by the target flight direction, that object can be determined to be the detected object. In this way, objects that would interfere with the drone's flight can be accurately identified as detected objects, so that in subsequent operations intelligent behavioral decisions can be made according to whether the detected object is the target, namely approaching the detected object or detouring around it.
Optionally, the size of the space of the preset range has its horizontal and vertical boundary dimensions set according to the drone's outer shape. Setting the horizontal and vertical boundary dimensions of the space of the preset range to match the drone's shape makes the space better fit the drone's form, ensuring to a certain extent that every object that could interfere with the drone's flight can be detected, thereby guaranteeing flight safety and avoiding missed targets.
Specifically, the boundary dimension of the space of the preset range in each direction can be set according to the drone's maximum dimension in the corresponding direction, so that the space of the preset range can cover the drone. In one implementation, the boundary dimension of the space in each direction can be set larger than the drone's maximum dimension in that direction, so as to avoid, as far as possible, missing objects that could interfere with the drone's flight. The larger the drone's maximum dimension in a given direction, the larger the boundary dimension of the space in the corresponding direction can be.
Optionally, the vertical boundary dimension of the space of the preset range is smaller than its horizontal boundary dimension. In one embodiment, the drone can be a multi-rotor drone. To improve power performance, the rotor-disc plane of such a drone is relatively large, i.e., the drone's horizontal dimension is large; meanwhile, to improve its maneuverability in the horizontal plane, the vertical dimension of such a drone is usually smaller than the rotor-disc dimension. In the embodiments of the present application, making the vertical boundary dimension of the space of the preset range smaller than the horizontal one allows the space to be used more efficiently for searching for detected objects, ensures that objects interfering with the drone's flight can be detected within the space, and at the same time largely excludes objects in regions that have little effect on flight safety from falling into the space, further reducing the drone's computational cost for spatial object perception.
Further, compared with making the vertical boundary dimension equal to the horizontal one, setting the vertical boundary dimension of the space of the preset range smaller than the horizontal one prevents the ground from being identified as a detected object when the drone flies at a low altitude close to the ground, which would otherwise reduce the drone's flight efficiency.
Suppose both the vertical and horizontal boundary dimensions are a, and the target of the current task is not the ground. Then, whenever the distance between the drone and the ground is smaller than a, the ground would be identified as a detected object, and since the target of this task is not the ground, the detected object would be identified as an obstacle. Correspondingly, if the drone were then commanded to fly closer to the ground, for example when the user pushes the control stick of the first control device, the drone would not respond normally to the stick input, even though in this case the ground actually poses no threat to the drone. In the embodiments of the present application, because the vertical boundary dimension of the space of the preset range is smaller than the horizontal one, the ground is not identified as an obstacle as long as the distance between the drone and the ground is not smaller than the vertical boundary dimension. Even when the distance between the drone and the ground is less than a, the drone can still respond normally to the user's stick input, thereby accommodating near-ground stick-control scenarios.
It should be noted that the vertical boundary dimension can also be set equal to the horizontal one. Correspondingly, in that case, if an object in the surroundings is a designated object, it is determined to be a detected object only when it falls into the space of the preset range and its proportion within that space is greater than a preset ratio. The designated object can include the ground, and the preset ratio can be set according to actual needs. This avoids directly treating the ground as an obstacle that would interfere with the drone's flight, giving the ground a higher tolerance. In the embodiments of the present application, setting a more reasonable ground-filtering strategy makes the handling of the ground more sensible, improving the drone's responsiveness in near-ground scenarios while ensuring safety.
In one embodiment, the drone's physical configuration is closer to a flat disc. Accordingly, while the space of the preset range can be spherical, its horizontal query range can be made closer to a disc to avoid being overly conservative. Figure 6 is a schematic diagram of a preset range provided by an embodiment of the present application. As shown in Figure 6, a planar rectangular coordinate system can be constructed in the normal plane of the tangent to the collision-point trajectory, with the vertical axis of this coordinate system parallel to the z-axis of the navigation coordinate frame and the horizontal axis parallel to the horizontal plane of the navigation coordinate frame. The region between line X and line Y can be obtained by translating the horizontal axis up and down according to the upper and lower tolerances; since the vertical boundary dimension is smaller, line Y lies closer to the horizontal axis. The object query can proceed from left to right as indicated by the staggered small circles in Figure 6. If an object is found within the corridor formed between line X and line Y, a detected object is deemed present; if no object is found within that corridor, no detected object is deemed present. In this way, the ground can be excluded to a certain extent while passability above the drone is also improved.
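A membership test for the corridor described above can be sketched as follows. The rectangular-corridor simplification and the coordinate convention (x along the target flight direction, y horizontal, z vertical, all in a frame attached to the drone) are assumptions; the patent's query region is defined by Figure 6 rather than by a formula.

```python
def in_detection_corridor(p_local: tuple,
                          horiz_bound: float, vert_bound: float) -> bool:
    """Return True if a point in the corridor frame falls inside the
    preset-range space, using a smaller vertical than horizontal tolerance
    (the region between lines X and Y in Figure 6)."""
    x, y, z = p_local
    # Only points ahead of the drone, within the horizontal corridor width
    # and the (tighter) vertical tolerance, count as detected objects.
    return x >= 0 and abs(y) <= horiz_bound and abs(z) <= vert_bound
```

With vert_bound < horiz_bound, ground points more than vert_bound below the drone fall outside the corridor, which is how low-altitude flight avoids flagging the ground as an obstacle.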
Optionally, the above target flight direction can be determined through the following steps:
Step S61: receiving a flight control instruction sent by the first control device of the drone, the flight control instruction being generated based on the user's control operation on the control stick of the first control device.
Step S62: determining the flight direction determined in response to the flight control instruction as the target flight direction.
Here, the first control device can be the drone's remote controller, and the control operation can be a stick input on the first control device's control stick. Suppose the user pushes the stick toward direction A: the first control device can then generate a flight control instruction indicating direction A, and the drone can receive this instruction and, in response to it, determine direction A as the target flight direction. Correspondingly, upon receiving a flight control instruction generated from the user's stick input on the first control device, if the detected object in the target flight direction is the target, the drone can continue flying along the stick-indicated target flight direction to approach the target; if the detected object in the target flight direction is not the target, the drone does not follow the stick-indicated direction but instead changes its flight direction to detour around the detected object. This ensures that the drone responds to flight control instructions only when flight safety can be guaranteed.
Optionally, the above operation of obtaining the description information of the target object in the space may specifically include: receiving the description information of the target object sent by a second control device of the drone, where the description information of the target object is determined by the second control device based on an object selected by the user. In this way, the user can select the target object for the current mission as needed via the second control device, so that the drone's behavior decisions regarding detected objects better match the user's actual needs. Specifically, the second control device may be the same device as the first control device, or a different device; for example, the second control device may be an electronic device used by the user. In practical applications, the second control device may present a scene view to the user, and the scene view may provide multiple candidate objects. When a touch operation by the user on a candidate object is detected, the candidate object selected by the touch operation is determined as the target object, and the description information of the selected candidate object is determined as the description information of the target object. The scene view may be a previously built 3D map of the operation area, an orthophoto, and so on. When obtaining the description information of the selected candidate object, the description information may be extracted based on images of that candidate object collected in historical operations.
Optionally, in another flight control method for a drone, the description information of a target object in the space may be obtained; a planned trajectory of the drone may be obtained; based on a sensor mounted on the drone, the description information of a detected object in the space touched by the planned trajectory is detected; if the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the planned trajectory until it approaches the detected object; if the description information of the detected object does not match the description information of the target object, the planned trajectory is adjusted so that the drone bypasses the detected object along the adjusted planned trajectory. The implementation of each step in this flight control method may refer to the implementation of the same or similar steps described above. The planned trajectory in this flight control method may be the trajectory that the drone is currently to follow in flight, and it may be a trajectory in the current flight direction. The planned trajectory may be planned automatically by the drone based on its own perception of the surrounding environment, or it may be a trajectory planned in response to an external control instruction. The space touched by the planned trajectory may be a preset range of space touched along the planned trajectory.
Exemplarily, in one implementation, the operation of obtaining the planned trajectory of the drone may specifically include: obtaining a flight direction instruction, and obtaining the planned trajectory of the drone according to the flight direction instruction. Obtaining the flight direction instruction may mean receiving a flight direction instruction sent by a control device, or it may be a flight direction instruction generated automatically based on the drone's own perception of the surrounding environment. Accordingly, the flight trajectory indicated by the flight direction instruction can be determined as the planned trajectory of the drone.
In summary, the flight control method for a drone provided by the embodiments of the present application automatically detects whether the description information of a detected object in the space touched by the planned trajectory matches the description information of the target object, and automatically controls the drone to avoid or approach the detected object based on the detection result, which can improve control efficiency and reduce labor costs to a certain extent. At the same time, collisions between the drone and the detected object can be avoided to ensure flight safety, while the drone is still able to approach the target object in the space, thereby accommodating both flight safety and scene requirements.
Figure 7 is a system flow diagram of a flight control method provided by an embodiment of the present application. As shown in Figure 7, target detection and tracking can be performed on a two-dimensional target (that is, a detected object) to determine whether the detected object is the target object. Then, based on data such as the target tracking result and the poses of the aircraft and the gimbal, the obstacle avoidance decision module determines the position of the two-dimensional target in the three-dimensional navigation coordinate system; when the drone is flying toward the two-dimensional target and the two-dimensional target is the target object, the speed planning module performs the next operation. The speed planning module can determine a target safe speed and, based on the target safe speed, drive the underlying flight control system of the drone, so that the drone moves along the target flight direction and decelerates at the target safe speed until it approaches the detected object. When performing speed planning, an environmental speed limit can also be taken into account, so that the determined target safe speed does not exceed the environmental speed limit. Otherwise, the obstacle bypass module performs the next operation, controlling the drone via the underlying flight control system to bypass the detected object.
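The speed-planning step shown in Figure 7, which fuses the planner's safe-speed estimates and clamps the result to the environmental speed limit, might be sketched as follows. The names `v_map`, `v_depth`, and `env_limit` are assumptions, not terms used by the disclosure:

```python
def plan_speed(v_map, v_depth, env_limit):
    """Combine a map-based and a depth-based safe-speed estimate, then clamp
    the result to the environmental speed limit so the commanded speed never
    exceeds it. Taking the minimum is the conservative fusion choice."""
    v_target = min(v_map, v_depth)   # conservative fusion of the two estimates
    return min(v_target, env_limit)  # respect the environmental speed limit
```

For example, with a map-based estimate of 5 m/s, a depth-based estimate of 3 m/s, and an environmental limit of 4 m/s, the commanded speed would be 3 m/s.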
During flight, a drone avoids obstacles in response to detected objects. In a scenario where a drone performs an inspection task, however, the drone needs to approach the inspection target for inspection rather than bypass it. In the embodiments of the present application, the user only needs to frame-select the inspection target of the current inspection task; during the flight of the drone, inspection targets and non-inspection targets in the space are distinguished based on the detection results for detected objects, the response mode for detected objects is switched automatically, and intelligent obstacle avoidance behavior decisions are achieved. When the detected object is the inspection target, smooth speed planning is implemented using a local map and a single-frame depth image, so that the drone stops near the target object more safely. When the detected object is not the inspection target, a detour is performed to avoid the detected object, thereby ensuring flight safety.
Figure 8 is a block diagram of a flight control apparatus for a drone provided by an embodiment of the present application. The apparatus may include a memory 301 and a processor 302.
The memory 301 is configured to store program code.
The processor 302 calls the program code and, when the program code is executed, is configured to perform the following operations:
obtaining description information of a target object in a space;
obtaining a target flight direction of the drone;
detecting, based on a sensor mounted on the drone, description information of a detected object in the space indicated by the target flight direction;
if the description information of the detected object matches the description information of the target object, controlling the drone to decelerate along the target flight direction until it approaches the detected object;
if the description information of the detected object does not match the description information of the target object, controlling the drone to change its flight direction to bypass the detected object.
In summary, the flight control apparatus for a drone provided by the embodiments of the present application obtains description information of a target object in a space and obtains a target flight direction of the drone. Based on a sensor mounted on the drone, description information of a detected object in the space indicated by the target flight direction is detected. If the description information of the detected object matches the description information of the target object, the drone is controlled to decelerate along the target flight direction until it approaches the detected object. If the description information of the detected object does not match the description information of the target object, the drone is controlled to change its flight direction to bypass the detected object. In this way, by automatically detecting whether the description information of the detected object matches that of the target object, and automatically controlling the drone to avoid or approach the detected object based on the detection result, control efficiency can be improved and labor costs reduced to a certain extent. At the same time, collisions between the drone and the detected object can be avoided to ensure flight safety, while the drone is still able to approach the target object in the space, thereby accommodating both flight safety and scene requirements.
Moreover, when a drone's flight is controlled manually, the operator is limited by personal experience and sometimes cannot control the drone's motion reasonably. In the embodiments of the present application, the drone is automatically controlled to avoid or approach the detected object based on the detection result, so that, to a certain extent, the drone's motion with respect to detected objects in the space can be controlled more reasonably.
Optionally, in the case of controlling the drone to decelerate along the target flight direction until it approaches the detected object, the processor 302 is further configured to perform:
when the distance between the drone and the detected object is not greater than a specified stopping distance, controlling the drone to stop flying.
Optionally, the processor 302 is further configured to perform:
after the drone stops flying, controlling the drone to perform a target operation on the detected object.
Optionally, the specified stopping distance is set based on one or more of the following factors:
the density of objects in the space;
the description information of the target object;
the operation type of the drone.
Optionally, the processor 302 is specifically configured to perform:
determining a target safe speed based on local map information and/or a single-frame depth image, where the target safe speed is positively correlated with the current distance between the drone and the detected object;
controlling the drone to decelerate along the target flight direction at the target safe speed until it approaches the detected object.
Optionally, the local map information is generated based on multiple frames of depth images of the current surrounding environment collected by the drone.
Optionally, the multiple frames of depth images are acquired by the sensor mounted on the drone in different orientations.
Optionally, the processor 302 is further specifically configured to perform:
calculating a first safe speed of the drone based on the local map information;
calculating a second safe speed of the drone based on the single-frame depth image;
determining the target safe speed based on the first safe speed and the second safe speed.
Optionally, the processor 302 is further specifically configured to perform:
determining the current distance between the drone and the detected object based on the local map information;
calculating the first safe speed of the drone based on the current distance and the specified stopping distance of the drone, where the first safe speed is positively correlated with a current distance difference, and the current distance difference is the difference between the current distance and the specified stopping distance.
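A minimal sketch of the first-safe-speed rule described above, assuming a linear relation between the speed and the distance margin; the disclosure only requires positive correlation, so the gain `k` and the cap `v_max` are illustrative assumptions:

```python
def first_safe_speed(current_distance, stop_distance, k=0.5, v_max=10.0):
    """First safe speed from the local-map distance: grows with the margin
    between the current distance and the specified stopping distance, so the
    drone slows smoothly and reaches zero speed at the stopping distance."""
    margin = max(0.0, current_distance - stop_distance)  # no forward speed at or inside the stop distance
    return min(v_max, k * margin)                        # linear growth, capped at a maximum speed
```

With `k = 0.5` and a 2 m stopping distance, a drone 10 m from the object would be limited to 4 m/s, and the limit falls to zero as the stopping distance is reached.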
Optionally, the processor 302 is further specifically configured to perform:
determining, based on the number of pixels falling into each depth interval in the single-frame depth image and the total number of pixels of the single-frame depth image, a pixel-count proportion corresponding to each depth interval;
determining, according to the pixel-count proportion corresponding to each depth interval, a weight value corresponding to each depth interval, where the weight value corresponding to a depth interval is positively correlated with the pixel-count proportion;
determining the second safe speed based on a speed value and the weight value corresponding to each depth interval, where the speed value corresponding to a depth interval is positively correlated with the depth value represented by the depth interval.
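The depth-histogram computation of the second safe speed might look like the sketch below. The disclosure requires only that the weight grow with the pixel-count proportion and that the per-interval speed grow with depth; using the proportion itself as the weight, the interval midpoint as the representative depth, and a linear speed law are illustrative assumptions:

```python
def second_safe_speed(depth_pixels, bin_edges, k=0.5):
    """Second safe speed from a single depth frame.

    Histogram the pixel depths into intervals, use each interval's pixel-count
    fraction as its weight (positively correlated with the proportion, as
    required), assign each interval a speed proportional to its midpoint depth
    (positively correlated with depth), and return the weighted sum.
    """
    n = len(depth_pixels)
    counts = [0] * (len(bin_edges) - 1)
    for d in depth_pixels:
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= d < bin_edges[i + 1]:
                counts[i] += 1
                break
    speed = 0.0
    for i, c in enumerate(counts):
        weight = c / n                                      # pixel-count proportion of this interval
        mid_depth = (bin_edges[i] + bin_edges[i + 1]) / 2   # representative depth of the interval
        speed += weight * (k * mid_depth)                   # interval speed grows with depth
    return speed
```

Intuitively, a frame dominated by near pixels yields a low second safe speed, while a frame dominated by far pixels yields a higher one.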
Optionally, the processor 302 is further specifically configured to perform:
determining the minimum of the first safe speed and the second safe speed as the target safe speed.
Optionally, the local map information is generated based on the multiple frames of depth images when the distance between the drone and the detected object reaches a preset mapping distance, and the preset mapping distance is positively correlated with the specified stopping distance of the drone.
Optionally, the detected object includes an object located in a preset range of space indicated by the target flight direction, and the preset range of space moves as the drone moves through space.
Optionally, the horizontal and vertical boundary dimensions of the preset range of space are set based on the shape of the drone.
Optionally, the boundary dimension of the preset range of space in the vertical direction is smaller than its boundary dimension in the horizontal direction.
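A membership test for this moving detection volume might be sketched as a box-shaped corridor ahead of the drone, with a vertical half-extent smaller than the horizontal one. The concrete extents below are illustrative assumptions derived from an airframe size, and the flight direction is assumed to lie along +x for brevity:

```python
def in_detection_volume(point, drone_pos, length=20.0,
                        half_width=1.5, half_height=0.8):
    """Check whether a 3-D point lies in the box-shaped corridor extending
    `length` metres ahead of the drone. half_height < half_width, matching
    the requirement that the vertical boundary dimension be smaller than the
    horizontal one. The corridor moves with drone_pos, so it tracks the
    drone's motion through space."""
    dx = point[0] - drone_pos[0]   # along the flight direction (assumed +x)
    dy = point[1] - drone_pos[1]   # horizontal cross-track offset
    dz = point[2] - drone_pos[2]   # vertical offset
    return 0.0 <= dx <= length and abs(dy) <= half_width and abs(dz) <= half_height
```

A narrower vertical extent reflects that objects well above or below the flight path need not trigger deceleration or bypass.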
Optionally, the processor 302 is further configured to perform:
receiving a flight control instruction sent by a first control device of the drone, where the flight control instruction is generated based on a user's control operation on a control stick of the first control device;
determining the flight direction determined in response to the flight control instruction as the target flight direction.
Optionally, the processor 302 is further specifically configured to perform:
receiving the description information of the target object sent by a second control device of the drone, where the description information of the target object is determined by the second control device based on an object selected by the user.
The operations performed by the above apparatus are similar to the corresponding steps in the above method and can achieve the same technical effects; to avoid repetition, they are not described again here.
Further, an embodiment of the present application also provides a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to perform each step of the above method, achieving the same technical effects; to avoid repetition, details are not described again here.
Further, an embodiment of the present application also provides a computer program product containing instructions which, when run on a computer, cause the computer to execute the above method.
Further, an embodiment of the present application also provides a drone configured to implement each step of the above method, achieving the same technical effects; to avoid repetition, details are not described again here.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solutions of the embodiments. Persons of ordinary skill in the art can understand and implement them without creative effort.
The component embodiments of the present application may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor may be used in practice to implement some or all of the functions of some or all of the components of a computing processing device according to embodiments of the present application. The present application may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the present application may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, Figure 9 is a block diagram of a computing processing device provided by an embodiment of the present application; it illustrates a computing processing device that can implement the method according to the present application. The computing processing device conventionally includes a processor 410 and a computer program product or computer-readable medium in the form of a memory 420. The memory 420 may be electronic memory such as flash memory, EEPROM (electrically erasable programmable read-only memory), EPROM, a hard disk, or ROM. The memory 420 has a storage space 430 for program code for performing any of the method steps described above. For example, the storage space 430 for program code may include individual pieces of program code respectively used to implement various steps of the above method. The program code can be read from or written to one or more computer program products. These computer program products include program code carriers such as hard disks, compact discs (CDs), memory cards, or floppy disks. Such a computer program product is typically a portable or fixed storage unit as described with reference to Figure 10. The storage unit may have storage segments, storage spaces, and the like arranged similarly to the memory 420 in the computing processing device of Figure 9. The program code may, for example, be compressed in an appropriate form. Typically, the storage unit includes computer-readable code, that is, code that can be read by a processor such as the processor 410, which, when run by a computing processing device, causes the computing processing device to perform each step of the methods described above.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may be referred to one another. References herein to "one embodiment", "an embodiment", or "one or more embodiments" mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. In addition, note that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
In the description provided here, numerous specific details are set forth. However, it is understood that the embodiments of the present application may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure the understanding of this description.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present application may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any order; these words may be interpreted as names.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (37)

  1. A flight control method for a drone, wherein the method comprises:
    obtaining description information of a target object in a space;
    obtaining a target flight direction of the drone;
    detecting, based on a sensor mounted on the drone, description information of a detected object in the space indicated by the target flight direction;
    if the description information of the detected object matches the description information of the target object, controlling the drone to decelerate along the target flight direction until it approaches the detected object;
    if the description information of the detected object does not match the description information of the target object, controlling the drone to change its flight direction to bypass the detected object.
  2. The method according to claim 1, wherein, in the case of controlling the drone to decelerate along the target flight direction until it approaches the detected object, the method further comprises:
    when the distance between the drone and the detected object is not greater than a specified stopping distance, controlling the drone to stop flying.
  3. The method according to claim 2, wherein the method further comprises:
    after the drone stops flying, controlling the drone to perform a target operation on the detected object.
  4. The method according to claim 2 or 3, wherein the specified stopping distance is set based on one or more of the following factors:
    the density of objects in the space;
    the description information of the target object;
    the operation type of the drone.
  5. The method according to any one of claims 1-3, wherein controlling the drone to decelerate along the target flight direction until it approaches the detected object comprises:
    determining a target safe speed based on local map information and/or a single-frame depth image, wherein the target safe speed is positively correlated with the current distance between the drone and the detected object;
    controlling the drone to decelerate along the target flight direction at the target safe speed until it approaches the detected object.
  6. The method according to claim 5, wherein the local map information is generated based on multiple frames of depth images of the current surrounding environment collected by the drone.
  7. The method according to claim 6, wherein the multiple frames of depth images are acquired by the sensor mounted on the drone in different orientations.
  8. The method according to claim 5, wherein determining the target safe speed based on local map information and/or a single-frame depth image comprises:
    calculating a first safe speed of the drone based on the local map information;
    calculating a second safe speed of the drone based on the single-frame depth image;
    determining the target safe speed based on the first safe speed and the second safe speed.
  9. The method according to claim 8, wherein calculating the first safe speed of the drone based on the local map information comprises:
    determining the current distance between the drone and the detected object based on the local map information;
    calculating the first safe speed of the drone based on the current distance and a specified stopping distance of the drone, wherein the first safe speed is positively correlated with a current distance difference, and the current distance difference is the difference between the current distance and the specified stopping distance.
  10. The method according to claim 8, wherein calculating the second safe speed of the drone based on the single-frame depth image comprises:
    determining, based on the number of pixels falling into each depth interval in the single-frame depth image and the total number of pixels in the single-frame depth image, the pixel proportion corresponding to each depth interval;
    determining a weight value corresponding to each depth interval according to the pixel proportion corresponding to that depth interval, wherein the weight value corresponding to a depth interval is positively correlated with its pixel proportion; and
    determining the second safe speed based on the speed value and the weight value corresponding to each depth interval, wherein the speed value corresponding to a depth interval is positively correlated with the depth value represented by that depth interval.
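One way to read claim 10 is as a weighted average over a depth histogram: each interval's weight is its share of the image's pixels, and its speed value grows with the depth it represents. In the sketch below the bin edges, the use of interval midpoints, and the `speed_per_meter` gain are all illustrative assumptions:

```python
import numpy as np

def second_safe_speed(depth_image: np.ndarray,
                      bin_edges=(0.0, 2.0, 5.0, 10.0, 20.0),
                      speed_per_meter: float = 0.4) -> float:
    """Second safe speed from a single-frame depth image (claim 10 sketch).

    weight_i  = pixel proportion of depth interval i (positively correlated
                with pixel count, as claimed)
    speed_i   = value growing with the depth the interval represents
                (here: proportional to the interval midpoint)
    """
    depths = depth_image.ravel()
    total = depths.size
    counts, edges = np.histogram(depths, bins=np.asarray(bin_edges))
    weights = counts / total                    # pixel proportion per interval
    midpoints = (edges[:-1] + edges[1:]) / 2.0  # depth value representing each interval
    speeds = speed_per_meter * midpoints        # speed positively correlated with depth
    return float(np.dot(weights, speeds))
```

A frame dominated by near pixels thus yields a low speed, while a frame dominated by far pixels yields a higher one.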
  11. The method according to claim 8, wherein determining the target safe speed based on the first safe speed and the second safe speed comprises:
    determining the minimum of the first safe speed and the second safe speed as the target safe speed.
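Claim 11 combines the two estimates by taking their minimum, so whichever pipeline (local map or single-frame depth image) sees the closer hazard governs the drone's speed:

```python
def target_safe_speed(v_map: float, v_depth: float) -> float:
    """Claim 11: the more conservative of the two safe-speed estimates wins.

    v_map   -- first safe speed, from local map information
    v_depth -- second safe speed, from the single-frame depth image
    """
    return min(v_map, v_depth)
```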
  12. The method according to claim 6, wherein the local map information is generated based on the multi-frame depth images when the distance between the drone and the detected object reaches a preset mapping distance, the preset mapping distance being positively correlated with the specified stopping distance of the drone.
  13. The method according to any one of claims 1-3, wherein the detected object comprises an object located in a space of a preset range indicated by the target flight direction, the space of the preset range moving along with the movement of the drone in space.
  14. The method according to claim 13, wherein the horizontal and vertical boundary dimensions of the space of the preset range are set based on the physical dimensions of the drone.
  15. The method according to claim 13, wherein the boundary dimension of the space of the preset range in the vertical direction is smaller than its boundary dimension in the horizontal direction.
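Claims 13-15 describe a detection volume attached to the drone: it travels with the vehicle, is sized from the airframe, and is flatter vertically than it is wide. A geometric sketch, assuming level flight and an axis-aligned box with illustrative dimensions:

```python
from dataclasses import dataclass

@dataclass
class DetectionVolume:
    """Box ahead of the drone along the target flight direction (claims 13-15).

    The vertical half-extent is smaller than the horizontal one, per claim 15.
    All dimensions here are illustrative, not claimed values.
    """
    length: float = 8.0        # extent along the flight direction (m)
    half_width: float = 1.5    # horizontal half-extent, sized from the airframe
    half_height: float = 0.8   # vertical half-extent, smaller than half_width

    def contains(self, drone_pos, heading_xy, point) -> bool:
        """True if `point` lies inside the box attached to `drone_pos`.

        `heading_xy` is a unit vector of the target flight direction in the
        horizontal plane (level flight assumed for simplicity).
        """
        dx = point[0] - drone_pos[0]
        dy = point[1] - drone_pos[1]
        dz = point[2] - drone_pos[2]
        forward = dx * heading_xy[0] + dy * heading_xy[1]   # along-track offset
        lateral = -dx * heading_xy[1] + dy * heading_xy[0]  # cross-track offset
        return (0.0 <= forward <= self.length
                and abs(lateral) <= self.half_width
                and abs(dz) <= self.half_height)
```

Because `contains` is evaluated against the drone's current position and heading, the volume automatically moves with the drone, as claim 13 requires.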
  16. The method according to any one of claims 1-3, further comprising:
    receiving a flight control instruction sent by a first control device of the drone, the flight control instruction being generated based on a user's control operation on a control stick of the first control device; and
    determining the flight direction determined in response to the flight control instruction as the target flight direction.
  17. The method according to any one of claims 1-3, wherein obtaining the description information of the target object in space comprises:
    receiving the description information of the target object sent by a second control device of the drone, wherein the description information of the target object is determined by the second control device based on an object selected by the user.
  18. A flight control apparatus for a drone, wherein the apparatus comprises a memory and a processor;
    the memory is configured to store program code; and
    the processor invokes the program code and, when the program code is executed, is configured to perform the following operations:
    obtaining description information of a target object in space;
    obtaining a target flight direction of the drone;
    detecting, based on a sensor mounted on the drone, description information of a detected object in the space indicated by the target flight direction;
    if the description information of the detected object matches the description information of the target object, controlling the drone to decelerate along the target flight direction until it approaches the detected object; and
    if the description information of the detected object does not match the description information of the target object, controlling the drone to change its flight direction to detour around the detected object.
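The branch at the heart of claims 18 and 19 can be sketched as a small decision function. The claims leave the matching criterion open, so the equality check on a `category` field below is a placeholder assumption, as are the dictionary keys:

```python
from enum import Enum, auto

class Action(Enum):
    DECELERATE_AND_APPROACH = auto()
    DETOUR = auto()
    STOP = auto()

def decide(detected_desc: dict, target_desc: dict,
           distance: float, stop_distance: float) -> Action:
    """Core control branch of claims 18-19 (sketch).

    Mismatched description information -> detour around the object;
    matched and within the stopping distance -> stop;
    matched and still far -> keep decelerating toward the object.
    """
    if detected_desc.get("category") != target_desc.get("category"):
        return Action.DETOUR
    if distance <= stop_distance:
        return Action.STOP
    return Action.DECELERATE_AND_APPROACH
```

In a real controller the `DECELERATE_AND_APPROACH` branch would feed the target safe speed of claim 22 into the speed loop.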
  19. The apparatus according to claim 18, wherein, when controlling the drone to decelerate along the target flight direction until it approaches the detected object, the processor is further configured to perform:
    controlling the drone to stop flying when the distance between the drone and the detected object is not greater than the specified stopping distance.
  20. The apparatus according to claim 19, wherein the processor is further configured to perform:
    after the drone stops flying, controlling the drone to perform a target operation on the detected object.
  21. The apparatus according to claim 19 or 20, wherein the specified stopping distance is set based on one or more of the following factors:
    the density of objects in the space;
    the description information of the target object; and
    the operation type of the drone.
  22. The apparatus according to any one of claims 18-20, wherein the processor is specifically configured to perform:
    determining a target safe speed based on local map information and/or a single-frame depth image, the target safe speed being positively correlated with the current distance between the drone and the detected object; and
    controlling the drone to decelerate along the target flight direction at the target safe speed until it approaches the detected object.
  23. The apparatus according to claim 22, wherein the local map information is generated based on multi-frame depth images of the current surrounding environment collected by the drone.
  24. The apparatus according to claim 23, wherein the multi-frame depth images are obtained by the sensor mounted on the drone in different orientations.
  25. The apparatus according to claim 22, wherein the processor is further specifically configured to perform:
    calculating a first safe speed of the drone based on the local map information;
    calculating a second safe speed of the drone based on the single-frame depth image; and
    determining the target safe speed based on the first safe speed and the second safe speed.
  26. The apparatus according to claim 25, wherein the processor is further specifically configured to perform:
    determining a current distance between the drone and the detected object based on the local map information; and
    calculating the first safe speed of the drone based on the current distance and a specified stopping distance of the drone, wherein the first safe speed is positively correlated with a current distance difference, the current distance difference being the difference between the current distance and the specified stopping distance.
  27. The apparatus according to claim 25, wherein the processor is further specifically configured to perform:
    determining, based on the number of pixels falling into each depth interval in the single-frame depth image and the total number of pixels in the single-frame depth image, the pixel proportion corresponding to each depth interval;
    determining a weight value corresponding to each depth interval according to the pixel proportion corresponding to that depth interval, wherein the weight value corresponding to a depth interval is positively correlated with its pixel proportion; and
    determining the second safe speed based on the speed value and the weight value corresponding to each depth interval, wherein the speed value corresponding to a depth interval is positively correlated with the depth value represented by that depth interval.
  28. The apparatus according to claim 25, wherein the processor is further specifically configured to perform:
    determining the minimum of the first safe speed and the second safe speed as the target safe speed.
  29. The apparatus according to claim 23, wherein the local map information is generated based on the multi-frame depth images when the distance between the drone and the detected object reaches a preset mapping distance, the preset mapping distance being positively correlated with the specified stopping distance of the drone.
  30. The apparatus according to any one of claims 18-20, wherein the detected object comprises an object located in a space of a preset range indicated by the target flight direction, the space of the preset range moving along with the movement of the drone in space.
  31. The apparatus according to claim 30, wherein the horizontal and vertical boundary dimensions of the space of the preset range are set based on the physical dimensions of the drone.
  32. The apparatus according to claim 30, wherein the boundary dimension of the space of the preset range in the vertical direction is smaller than its boundary dimension in the horizontal direction.
  33. The apparatus according to any one of claims 18-20, wherein the processor is further configured to perform:
    receiving a flight control instruction sent by a first control device of the drone, the flight control instruction being generated based on a user's control operation on a control stick of the first control device; and
    determining the flight direction determined in response to the flight control instruction as the target flight direction.
  34. The apparatus according to any one of claims 18-20, wherein the processor is further specifically configured to perform:
    receiving the description information of the target object sent by a second control device of the drone, wherein the description information of the target object is determined by the second control device based on an object selected by the user.
  35. A drone, wherein the drone is configured to implement the method according to any one of claims 1-17.
  36. A computer-readable storage medium comprising instructions that, when run on a computer, cause the computer to execute the method according to any one of claims 1-17.
  37. A computer program product containing instructions that, when run on a computer, cause the computer to execute the method according to any one of claims 1-17.
PCT/CN2022/081230 2022-03-16 2022-03-16 Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium WO2023173330A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/081230 WO2023173330A1 (en) 2022-03-16 2022-03-16 Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium


Publications (1)

Publication Number Publication Date
WO2023173330A1 true WO2023173330A1 (en) 2023-09-21

Family

ID=88022104

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/081230 WO2023173330A1 (en) 2022-03-16 2022-03-16 Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium

Country Status (1)

Country Link
WO (1) WO2023173330A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110262568A * 2019-07-19 2019-09-20 Shenzhen Autel Intelligent Aviation Technology Co., Ltd. UAV obstacle avoidance method and device based on target tracking, and UAV
CN113141468A * 2021-05-24 2021-07-20 Vivo Mobile Communication (Hangzhou) Co., Ltd. Focusing method and device, and electronic device
CN113222868A * 2021-04-25 2021-08-06 Beijing University of Posts and Telecommunications Image synthesis method and device
CN214002041U * 2020-10-29 2021-08-20 Huaneng Yangjiang Wind Power Co., Ltd. Unmanned aerial vehicle for anemometer tower inspection
CN113625737A * 2021-08-13 2021-11-09 Haichuang Feilong (Fujian) Technology Co., Ltd. Unmanned aerial vehicle device for detection and scoring
CN113696889A * 2021-08-18 2021-11-26 Beijing Sankuai Online Technology Co., Ltd. Unmanned device control method and device based on safe distance


Similar Documents

Publication Publication Date Title
US11242144B2 (en) Aerial vehicle smart landing
CN109144097B (en) Obstacle or ground recognition and flight control method, device, equipment and medium
US20240069572A1 (en) Aerial Vehicle Touchdown Detection
JP7456537B2 (en) Aircraft control device, aircraft control method, and program
WO2020181719A1 (en) Unmanned aerial vehicle control method, unmanned aerial vehicle, and system
US11892845B2 (en) System and method for mission planning and flight automation for unmanned aircraft
WO2018179404A1 (en) Information processing device, information processing method, and information processing program
WO2020082364A1 (en) Unmanned aerial vehicle control method and device, unmanned aerial vehicle, and computer readable storage medium
CN115933754A (en) Electric power inspection unmanned aerial vehicle obstacle avoidance method based on millimeter wave radar and binocular vision
KR20220129218A (en) Speed control method of unmanned vehicle to awareness the flight situation about an obstacle, and, unmanned vehicle the performed the method
CN113934207A (en) Automatic obstacle avoidance navigation system of mobile robot and navigation method thereof
Chen et al. A review of autonomous obstacle avoidance technology for multi-rotor UAVs
CN106155082A (en) UAV bionic intelligent obstacle avoidance method based on optical flow
WO2023173330A1 (en) Flight control method and apparatus for unmanned aerial vehicle, unmanned aerial vehicle, and storage medium
CN106598065A (en) Binocular-ultrasonic fusion obstacle avoidance control method for unmanned aerial vehicles
CN213814412U (en) Dual-gimbal unmanned aerial vehicle
WO2021217346A1 (en) Information processing method, information processing apparatus, and moveable device
CN111736622B (en) Unmanned aerial vehicle obstacle avoidance method and system based on combination of binocular vision and IMU
CN111615677B (en) Unmanned aerial vehicle safety landing method and device, unmanned aerial vehicle and medium
CN112947426A (en) Cleaning robot motion control system and method based on multi-sensing fusion
WO2023082283A1 (en) Obstacle avoidance method and apparatus for movable platform, movable platform, and storage medium
CN114265423B (en) Unmanned aerial vehicle mobile platform landing method and system based on rotating frame detection and positioning
TWI809727B (en) Method for searching a path by using a three-dimensional reconstructed map
CN215954140U (en) Automatic obstacle avoidance navigation system of mobile robot
JP7044147B2 (en) Information processing equipment, information processing methods, and information processing programs

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22931367

Country of ref document: EP

Kind code of ref document: A1