WO2019015158A1 - Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents

Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle Download PDF

Info

Publication number
WO2019015158A1
WO2019015158A1 (PCT/CN2017/108022)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
area
flight
pixel
region
Prior art date
Application number
PCT/CN2017/108022
Other languages
French (fr)
Chinese (zh)
Inventor
王晓曼
Original Assignee
歌尔科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔科技有限公司
Publication of WO2019015158A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft

Definitions

  • The invention belongs to the field of electronic technology, and in particular relates to an obstacle avoidance method for a drone and to a drone.
  • With the rapid development of drone technology, drones have been widely used in aerial photography, cargo transportation, resource exploration, and route surveying. Since no dedicated operator controls the drone during flight, the drone must fly autonomously along a planned route, yet obstacles encountered in actual flight are difficult to anticipate when the route is planned.
  • A drone can avoid obstacles encountered in flight by means of a video-based autonomous obstacle avoidance method.
  • In this method, a camera carried by the drone captures images of the surrounding environment, and the images are analyzed to determine whether an obstacle ahead prevents the drone from flying through; if so, an avoidance maneuver is performed so that the drone can fly safely.
  • In the prior art, the video-based autonomous obstacle avoidance method hovers when an obstacle is encountered and cannot continue flying; it can only transmit the image back to the ground station, wait for the flight route to be recalculated, and then perform the obstacle avoidance flight, so the drone's flight efficiency is low.
  • In view of this, the present invention provides a drone obstacle avoidance method and a drone, which solve the technical problem of low drone flight efficiency and enable the drone to autonomously avoid obstacles by flying around them, greatly improving flight efficiency.
  • the present invention provides a method for obstacle avoidance of a drone, comprising:
  • Clustering the pixel points in the stereo disparity map to obtain a plurality of cluster regions includes:
  • marking the pixel points in the neighborhood of the target pixel whose pixel difference from the target pixel is within a preset range as belonging to the same cluster region as the target pixel.
  • Taking an unmarked pixel point as the target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determining whether the pixel difference between each pixel point and the target pixel point is within a preset range includes:
  • Acquiring the stereo disparity map of the current environment image through the binocular camera comprises:
  • Calculating, based on each cluster region, the flight distance from the environment region corresponding to each cluster region to the drone includes:
  • The position coordinate calculation formula is:
  • X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
  • where (X, Y, Z) is the position coordinate of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, T_x represents the baseline of the binocular camera, f denotes the focal length of the binocular camera, (x, y) denotes the position coordinates of the target pixel point in any cluster region, d denotes the disparity corresponding to that target pixel, and (c_x, c_y) is the imaging origin of the binocular camera after correction.
  • Selecting any one fly-around region from the environment regions whose flight distance is greater than the distance threshold comprises:
  • the candidate region is taken as the fly-around region;
  • an environment region with the farthest flight distance is selected, from the environment regions whose flight distance is greater than the distance threshold and excluding the previous candidate region, as the new candidate region, and the step of determining whether the candidate region satisfies the flight condition continues.
  • the method further comprises:
  • the drone is controlled to fly according to the original route.
  • the method further comprises:
  • the drone is controlled to hover.
  • Calculating the rotation angle by which the drone turns toward the selected fly-around region and controlling the drone to fly around according to the rotation angle comprises:
  • The invention also provides a drone, comprising:
  • An acquisition module configured to acquire a stereo disparity map of the current environment image by using a binocular camera
  • a clustering module configured to cluster pixel points in the stereo disparity map to obtain a plurality of cluster regions
  • a flight distance calculation module configured to calculate a flight distance of the environmental region corresponding to each cluster region to the drone based on each cluster region;
  • a first determining module configured to determine whether the flight distance between each environment region and the drone is greater than a distance threshold;
  • a first selection module configured to select any one fly-around region from the environment regions whose flight distance is greater than the distance threshold;
  • a flying module configured to calculate the rotation angle by which the drone turns toward the selected fly-around region, and to control the drone to fly around according to the rotation angle.
  • the clustering module comprises:
  • a second determining unit configured to take any unmarked pixel point as a target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determine whether the pixel difference between each pixel point and the target pixel point is within a preset range;
  • a marking unit configured to mark the pixel points in the neighborhood of the target pixel whose pixel difference from the target pixel is within the preset range as belonging to the same cluster region as the target pixel.
  • the second determining unit is specifically configured to:
  • the first acquiring module is specifically configured to:
  • the flight distance calculation module is specifically configured to:
  • The position coordinate calculation formula is:
  • X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
  • where (X, Y, Z) is the position coordinate of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, T_x represents the baseline of the binocular camera, f denotes the focal length of the binocular camera, (x, y) denotes the position coordinates of the target pixel point in any cluster region, d denotes the disparity corresponding to that target pixel, and (c_x, c_y) is the imaging origin of the binocular camera after correction.
  • the first selection module includes: a second selection unit, a third determination unit, a determination unit, and a third selection unit;
  • the second selecting unit is configured to, if the flight distance between the environment region corresponding to the drone's flight direction and the drone is less than the distance threshold, select, from the environment regions whose flight distance is greater than the distance threshold, the environment region with the farthest flight distance as a candidate region;
  • the third determining unit is configured to determine whether the candidate region satisfies the flight condition; if yes, trigger the determining unit; if not, trigger the third selecting unit;
  • the determining unit is configured to take the candidate region as the fly-around region;
  • the third selecting unit is configured to select, from the environment regions whose flight distance is greater than the distance threshold and excluding the previous candidate region, the environment region with the farthest flight distance as the new candidate region, and to return to the step of determining whether the candidate region satisfies the flight condition.
  • the drone further comprises:
  • a first control module configured to control the drone to fly according to the original route if the flight distance between the environment region corresponding to the drone's flight direction and the drone is greater than the distance threshold.
  • the drone further comprises:
  • a second control module configured to control the drone to hover if none of the candidate regions meet the flight condition.
  • the flying module is specifically configured to:
  • the present invention can obtain the following technical effects:
  • The invention provides a drone obstacle avoidance method and a drone. A stereo disparity map of the current environment image is acquired through a binocular camera, and the pixel points in the stereo disparity map are clustered to obtain a plurality of cluster regions. Based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone is calculated. By determining whether the flight distance between each environment region and the drone is greater than a distance threshold, the drone selects any one fly-around region from the environment regions whose flight distance is greater than the distance threshold, calculates the rotation angle by which it turns toward that fly-around region, and is controlled to fly around according to the rotation angle.
  • By determining the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance exceeds the distance threshold, and flying around according to the rotation angle toward that region, the invention enables the drone to avoid obstacles autonomously and greatly improves flight efficiency.
  • FIG. 1 is a flow chart of an embodiment of a method for obstacle avoidance of a drone according to an embodiment of the present invention
  • FIG. 2 is a flow chart of another embodiment of a method for obstacle avoidance of a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic structural view of an embodiment of a drone according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural view of another embodiment of a drone according to an embodiment of the present invention.
  • the autonomous flight of the drone requires automatic obstacle avoidance to ensure the flight safety of the drone.
  • Existing automatic obstacle avoidance means include video-based obstacle avoidance, but the current video-based autonomous obstacle avoidance method hovers when an obstacle is encountered and cannot continue flying; it can only return the image to the ground station and wait for the flight route to be recalculated before performing obstacle avoidance flight. This imposes many restrictions on use and results in low flight efficiency of the drone.
  • In the present invention, a stereo disparity map of the current environment image is acquired through a binocular camera, and the pixel points in the stereo disparity map are clustered to obtain a plurality of cluster regions. Based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone is calculated. Then, by determining whether the flight distance between each environment region and the drone is greater than a distance threshold, the drone selects any one fly-around region from the environment regions whose flight distance is greater than the distance threshold, and the drone is controlled to fly around according to the rotation angle toward that region.
  • By determining the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance exceeds the distance threshold, and flying around according to the rotation angle toward that region, the drone avoids obstacles autonomously, which greatly improves flight efficiency.
  • FIG. 1 is a flowchart of an embodiment of a method for avoiding obstacles of a drone according to an embodiment of the present invention.
  • the method may include:
  • The binocular camera consists of a left camera and a right camera mounted at the front of the drone, and is used to capture images of the environment around the drone's flight path.
  • the binocular camera simulates the principle of human eye vision.
  • Two environment images are acquired by the left and right cameras. Since the environment images captured by the binocular camera are distorted, the acquired images must be corrected, and a stereo matching algorithm is then used to compute the stereo disparity map.
  • The stereo matching algorithm mainly exploits the correspondence between the two environment images acquired by the binocular camera and obtains a disparity map according to the triangulation principle; once the disparity information is obtained, the depth and three-dimensional information of the original environment image can easily be derived from the projection model, yielding the stereo disparity map.
  • The pixel points may be clustered according to the gray value of each pixel in the stereo disparity map: pixel points with similar gray values are grouped into one cluster region, so that the pixels in the stereo disparity map are divided into a plurality of cluster regions.
  • The environment regions may be sorted from largest to smallest according to their distance to the drone, giving an ordering of the environment regions by flight distance to the drone.
  • the acquiring a stereo disparity map of the current environment image by the binocular camera includes:
  • The formula of the Q matrix may be:
  • Q = [[1, 0, 0, −c_x], [0, 1, 0, −c_y], [0, 0, 0, f], [0, 0, −1/T_x, (c_x − c′_x)/T_x]]
  • where T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (c_x, c_y), (c′_x, c′_y) are respectively the imaging origins of the two corrected images of the binocular camera.
  • The imaging origin refers to the intersection of the optical axis of the binocular camera with the plane of the acquired image, and usually lies at the center of the image.
  • The imaging origins of the two environment images acquired by the binocular camera can be obtained by calibrating the binocular camera. Since the acquired environment images require image correction, the two corrected environment images share the same imaging origin (c_x, c_y); in this way the ideal case is obtained in which the two environment images are perfectly aligned in parallel.
  • The two environment images acquired by the two cameras may be image-corrected using the intrinsic and extrinsic parameters of the binocular camera so that the image planes of the two corrected environment images are perfectly aligned in parallel; a stereo matching algorithm can then conveniently be applied to the two corrected images to obtain the stereo disparity map of the current environment image.
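  • A minimal sketch of this calibration-and-rectification step, assuming OpenCV's stereo API is used (the calibration values below are placeholders, not parameters from the patent):

```python
import cv2
import numpy as np

# Placeholder calibration results; in practice K1, D1, K2, D2, R, T come from
# calibrating the binocular camera beforehand.
K1 = K2 = np.array([[700.0, 0.0, 320.0],
                    [0.0, 700.0, 240.0],
                    [0.0, 0.0, 1.0]])
D1 = D2 = np.zeros(5)
R = np.eye(3)                          # rotation between the two cameras
T = np.array([[0.12], [0.0], [0.0]])   # baseline Tx = 0.12 m along the x axis
size = (640, 480)

# stereoRectify returns, among other things, the Q reprojection matrix built
# from the focal length, the baseline and the imaging-origin offsets.
R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)

map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)

def rectify(left, right):
    """Row-align the two camera images so they can be stereo matched."""
    return (cv2.remap(left, map1x, map1y, cv2.INTER_LINEAR),
            cv2.remap(right, map2x, map2y, cv2.INTER_LINEAR))
```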
  • The stereo matching algorithm may calculate the disparity d of each pixel in the corrected images acquired by the binocular camera according to the triangulation principle, thereby obtaining the disparity image; once the disparity information is obtained, the depth and three-dimensional information of the original environment image are calculated according to the projection model, yielding the stereo disparity map.
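  • A short sketch of the matching step itself, assuming OpenCV's semi-global block matcher; the block size and disparity range are illustrative values, not ones specified in the patent:

```python
import cv2
import numpy as np

matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)

def disparity_map(rect_left, rect_right):
    """Compute a per-pixel disparity map from a rectified image pair."""
    gray_l = cv2.cvtColor(rect_left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(rect_right, cv2.COLOR_BGR2GRAY)
    # StereoSGBM returns fixed-point disparities scaled by 16
    return matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0
```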
  • The position coordinate calculation formula can be: X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
  • where (X, Y, Z) is the position coordinate of the environment region in the three-dimensional coordinate system, Z may represent the flight distance from the environment region corresponding to each cluster region to the drone, (x, y) represents the position coordinates of the target pixel point in any cluster region, and d represents the disparity corresponding to the target pixel in that cluster region.
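  • A worked example of this back-projection, assuming the standard rectified-stereo relations Z = f·T_x/d, X = (x − c_x)·Z/f, Y = (y − c_y)·Z/f; the numeric values are purely illustrative. OpenCV's cv2.reprojectImageTo3D(disparity, Q) performs the same computation densely for the whole disparity map:

```python
def pixel_to_3d(x, y, d, f, Tx, cx, cy):
    """Back-project pixel (x, y) with disparity d into camera coordinates.
    Z, the depth along the optical axis, is taken as the flight distance
    from the drone to the environment area containing the pixel."""
    Z = f * Tx / d
    X = (x - cx) * Z / f
    Y = (y - cy) * Z / f
    return X, Y, Z

# e.g. f = 700 px, baseline Tx = 0.12 m, imaging origin (320, 240), disparity 21 px
print(pixel_to_3d(400, 250, 21.0, 700.0, 0.12, 320.0, 240.0))  # -> (~0.46, ~0.06, 4.0)
```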
  • The distance threshold may be set according to the performance parameters of the drone in actual flight. For example, if the drone can hover when it is three meters away from an obstacle ahead and thereby avoid hitting it, a safe distance of three meters can be set and used as the distance threshold to ensure the drone's safe flight.
  • Calculating the rotation angle by which the drone turns toward the selected fly-around region and controlling the drone to fly around according to the rotation angle comprises:
  • The rotation angle can be calculated from the position coordinates (X, Y, Z), in the three-dimensional coordinate system, of the environment region corresponding to the fly-around region, and the drone is then controlled to fly around according to that rotation angle.
  • The fly-around region requiring the smallest rotation angle can preferentially be selected for the fly-around.
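  • One plausible way to obtain the rotation angle from the fly-around region's coordinates (X, Y, Z) is the horizontal bearing of that point; this is an illustrative reading, not necessarily the patent's exact formula:

```python
import math

def rotation_angle_deg(X, Z):
    """Yaw angle the drone must turn so its flight direction (the +Z axis of the
    camera frame) points at a target whose camera coordinates are (X, Y, Z);
    positive values mean turning toward +X (to the right)."""
    return math.degrees(math.atan2(X, Z))

print(rotation_angle_deg(0.46, 4.0))   # about 6.6 degrees to the right
```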
  • By determining the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, a fly-around region whose distance exceeds the distance threshold is selected and the fly-around is performed according to the rotation angle toward that region; the environment region with the smallest rotation angle among those exceeding the distance threshold can preferentially be used as the fly-around region. This not only ensures that the flight path does not change greatly, but also enables the drone to avoid obstacles autonomously, greatly improving flight efficiency.
  • Clustering the pixel points in the stereo disparity map to obtain multiple cluster regions includes:
  • marking the pixel points in the neighborhood of the target pixel whose pixel difference from the target pixel is within a preset range as belonging to the same cluster region as the target pixel.
  • Any unmarked pixel point in the stereo disparity map is taken as the target pixel point, and the cluster region of the target pixel point is marked as region 0; the position coordinate of the target pixel point is (i, j) and its gray value is d_0.
  • Taking an unmarked pixel point as the target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determining whether the pixel difference between each pixel point and the target pixel point is within the preset range may proceed as follows.
  • The first pixel point adjacent to the target pixel point, at position coordinate (i, j−1), is selected first, and its gray value d_1 is obtained.
  • The preset range of the gray value is set to D, and it is judged whether the difference between d_1 and d_0 falls within D; if so, the first pixel point is marked as belonging to region 0.
  • Pixel points in the three directions around the target pixel point, including the left and right directions, are examined in turn and assigned to region 0 in this way; after the clustering of region 0 is completed, any pixel that is still unmarked is selected as a new target pixel point, marked as region 1, and the clustering of region 1 is completed according to the same process as region 0.
  • Proceeding in this way until every pixel has been marked, the clustering is completed and a plurality of cluster regions is obtained.
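  • A compact sketch of this gray-value clustering; it uses a 4-neighbour flood fill for simplicity, whereas the patent walks specific directions from the target pixel, but the result is the same style of region labelling (the threshold D is illustrative):

```python
import numpy as np
from collections import deque

def cluster_disparity(disp, D=2.0):
    """Label connected pixels whose gray (disparity) value stays within D of the
    region's seed pixel; returns the label image and the number of regions."""
    h, w = disp.shape
    labels = np.full((h, w), -1, dtype=np.int32)
    next_label = 0
    for i in range(h):
        for j in range(w):
            if labels[i, j] != -1:
                continue
            seed = float(disp[i, j])          # gray value d0 of the target pixel
            labels[i, j] = next_label
            queue = deque([(i, j)])
            while queue:
                r, c = queue.popleft()
                for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if (0 <= nr < h and 0 <= nc < w and labels[nr, nc] == -1
                            and abs(float(disp[nr, nc]) - seed) <= D):
                        labels[nr, nc] = next_label
                        queue.append((nr, nc))
            next_label += 1
    return labels, next_label
```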
  • FIG. 2 is a flowchart of another embodiment of a method for obstacle avoidance of a drone according to an embodiment of the present invention.
  • the method may include:
  • 201 Obtain a stereo disparity map of a current environment image by using a binocular camera
  • The environment regions may be sorted according to their distance to the drone, obtaining an ordering of the environment regions by flight distance.
  • the method may further include:
  • the drone is controlled to fly according to the original route.
  • When the flight distance between the environment region corresponding to the drone's flight direction and the drone is less than the distance threshold, the drone cannot continue flying in its initial direction and must select an environment region whose flight distance is greater than the distance threshold to fly around. According to the ordering of the flight distances of the environment regions, the environment region with the largest flight distance may preferentially be selected as the candidate region for the fly-around; alternatively, the environment region whose flight distance exceeds the threshold and that lies closest to the drone's flight direction may be selected as the candidate region.
  • the flight condition may be determined according to the actual size of the candidate area.
  • For example, if the drone is 1 meter wide and 0.5 meters high and the actual size of the candidate region is larger than the size of the drone, the drone can pass through safely and the candidate region satisfies the flight condition; if the actual size of the candidate region is smaller than the size of the drone, the drone cannot fly through the candidate region, and the flight condition is not met.
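  • A hedged sketch of such a size check: the candidate region's physical extent is estimated from its pixel extent and its distance Z with the pinhole relation (metres ≈ pixels · Z / f); the safety margin and the sizing rule are assumptions for illustration, not requirements stated in the patent:

```python
def satisfies_flight_condition(px_width, px_height, Z, f,
                               drone_width=1.0, drone_height=0.5, margin=1.2):
    """Return True if the candidate region looks large enough for the drone to pass."""
    width_m = px_width * Z / f
    height_m = px_height * Z / f
    return width_m > drone_width * margin and height_m > drone_height * margin

# a 300 x 150 px region seen 4 m away with a 700 px focal length -> about 1.7 m x 0.9 m
print(satisfies_flight_condition(300, 150, 4.0, 700.0))   # True
```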
  • the candidate area is used as a flying area.
  • Step 208: from the environment regions whose flight distance is greater than the distance threshold, excluding the current candidate region, select the environment region with the farthest flight distance as the new candidate region, and return to step 206 to continue the corresponding operations.
  • That is, according to the ordering of the environment regions by flight distance, the environment region with the largest flight distance among the environment regions other than the current candidate region is selected as the new candidate region, and it is determined again whether this candidate region satisfies the flight condition.
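  • The selection logic of steps 204-209 can be summarised as the following decision routine; the flight-condition callback and the example distances are hypothetical:

```python
def choose_action(distances, forward_region, flight_ok, dist_threshold=3.0):
    """Decide the next action from per-region flight distances.
    distances: dict mapping region id -> flight distance to the drone (metres)
    forward_region: id of the region lying in the current flight direction
    flight_ok: callable(region_id) -> bool implementing the flight condition
    Returns ('keep_route', None), ('fly_around', region_id) or ('hover', None)."""
    if distances[forward_region] > dist_threshold:
        return ('keep_route', None)
    # try the reachable regions farthest-first, as in steps 205-208
    for region in sorted((r for r in distances if distances[r] > dist_threshold),
                         key=distances.get, reverse=True):
        if flight_ok(region):
            return ('fly_around', region)
    return ('hover', None)

print(choose_action({'A': 2.1, 'B': 7.5, 'C': 5.0}, 'A', lambda r: r != 'B'))
# -> ('fly_around', 'C'): the forward region A is too close, B fails the flight
#    condition, so the drone flies around toward region C
```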
  • After determining whether the flight distance between each environment region and the drone is greater than the distance threshold, the method may further include:
  • the drone is controlled to hover.
  • If no candidate region satisfies the flight condition, the drone is controlled to hover and the surrounding environment image is sent to the ground station so that the flight route can be recalculated; after the flight route sent back by the ground station is received, the drone follows the flight instructions to avoid the obstacle.
  • The operations of steps 201-203 in this embodiment are the same as the operations of steps 101-103 in the embodiment of FIG. 1.
  • The operation of step 209 is the same as the operation of step 106 in the embodiment of FIG. 1, and details are not described herein again.
  • In this embodiment, the drone first determines whether the flight distance of the environment region corresponding to its flight direction is greater than the distance threshold; if it is, the drone continues flying along the original route; if it is not, the environment region whose flight distance is greater than the distance threshold and farthest from the drone is preferentially selected as the candidate region, and it is determined whether the candidate region satisfies the flight condition. In this way the drone can find an optimal flight path according to the surrounding environment and autonomously avoid obstacles by flying around them, greatly improving flight efficiency.
  • FIG. 3 is a schematic structural diagram of an embodiment of a drone according to an embodiment of the present invention.
  • the drone may include:
  • the obtaining module 301 is configured to acquire a stereo disparity map of the current environment image by using the binocular camera.
  • The binocular camera consists of a left camera and a right camera mounted at the front of the drone, and is used to capture images of the environment around the drone's flight path.
  • The binocular camera acquires two environment images by simulating the principle of human vision; since the environment images captured by the binocular camera are distorted, they must be corrected, and a stereo matching algorithm is then used to compute the stereo disparity map.
  • The stereo matching algorithm mainly exploits the correspondence between the pair of environment images acquired by the binocular camera and obtains a disparity map according to the triangulation principle; once the disparity information is obtained, the depth and three-dimensional information of the original environment image can easily be derived from the projection model, yielding the stereo disparity map.
  • the clustering module 302 is configured to cluster the pixels in the stereo disparity map to obtain a plurality of cluster regions.
  • The pixel points may be clustered according to the gray value of each pixel in the stereo disparity map: pixel points with similar gray values are grouped into one cluster region, so that the pixels in the stereo disparity map are divided into a plurality of cluster regions.
  • the flight distance calculation module 303 is configured to calculate, according to each cluster area, a flight distance of the environment area corresponding to each cluster area to the drone.
  • The environment regions may be sorted from largest to smallest according to their distance to the drone, giving an ordering of the environment regions by flight distance to the drone.
  • the acquiring module 301 may be specifically configured to:
  • In the Q matrix, T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (c_x, c_y) is the imaging origin of the binocular camera after correction.
  • the imaging origin refers to the intersection of the optical axis of the binocular camera and the acquired image plane, usually at the center of the image.
  • The imaging origins of the two environment images acquired by the binocular camera can be obtained by calibrating the binocular camera. Since the acquired environment images require image correction, the two corrected environment images share the same imaging origin (c_x, c_y); in this way the ideal case is obtained in which the two environment images are perfectly aligned in parallel.
  • The two environment images acquired by the two cameras may be image-corrected using the intrinsic and extrinsic parameters of the binocular camera so that the image planes of the two corrected environment images are perfectly aligned in parallel; a stereo matching algorithm can then conveniently be applied to the two corrected images to obtain the stereo disparity map of the current environment image.
  • the stereo matching algorithm may calculate the disparity d of each pixel in the environment image acquired by the binocular camera according to the triangulation principle, so that the parallax image can be obtained.
  • the depth information and the three-dimensional information of the original environment image can be obtained according to the projection model, thereby obtaining a stereo disparity map.
  • the flight distance calculation module 303 can be specifically configured to:
  • the position coordinate calculation formula can be:
  • X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
  • where (X, Y, Z) is the position coordinate of the environment region in the three-dimensional coordinate system, Z may represent the flight distance from the environment region corresponding to each cluster region to the drone, (x, y) represents the position coordinates of the target pixel point in any cluster region, and d represents the disparity corresponding to the target pixel in that cluster region.
  • the first determining module 304 is configured to determine whether a flight distance of each environment area and the drone is greater than a distance threshold.
  • a first selection module 305, configured to select any one fly-around region from the environment regions whose flight distance is greater than a distance threshold;
  • The distance threshold may be set according to the performance parameters of the drone in actual flight. For example, if the drone can hover when it is three meters away from an obstacle ahead and thereby avoid hitting it, a safe distance of three meters can be set and used as the distance threshold to ensure the drone's safe flight.
  • the flying module 306 is configured to calculate the rotation angle by which the drone turns toward the selected fly-around region, and to control the drone to fly around according to the rotation angle.
  • the flying module 306 can be specifically configured to:
  • The rotation angle can be calculated from the position coordinates (X, Y, Z), in the three-dimensional coordinate system, of the environment region corresponding to the fly-around region, and the drone is then controlled to fly around according to that rotation angle.
  • The fly-around region requiring the smallest rotation angle can preferentially be selected for the fly-around.
  • By determining the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, the drone selects a fly-around region whose distance exceeds the distance threshold and performs the fly-around according to the rotation angle toward that region; the environment region with the smallest rotation angle among those exceeding the distance threshold can preferentially be used as the fly-around region. This not only ensures that the flight path does not change greatly, but also enables the drone to avoid obstacles autonomously, greatly improving flight efficiency.
  • the clustering module may include:
  • a second determining unit configured to use any pixel point that is not marked as a target pixel point, and sequentially determine, from the adjacent pixel points of the target pixel point, whether the pixel difference between each pixel point and the target pixel point is Within the preset range;
  • a marking unit configured to mark the pixel points in the neighborhood of the target pixel whose pixel difference from the target pixel is within the preset range as belonging to the same cluster region as the target pixel.
  • Any unmarked pixel point in the stereo disparity map is taken as the target pixel point, and the cluster region of the target pixel point is marked as region 0; the position coordinate of the target pixel point is (i, j) and its gray value is d_0.
  • the second determining unit may be specifically configured to:
  • To sequentially determine, starting from the adjacent pixel points of the target pixel point, whether the pixel difference between each pixel point and the target pixel point is within the preset range, the first pixel point adjacent to the target pixel point, at position coordinate (i, j−1), may be selected first, and its gray value d_1 obtained.
  • The preset range of the gray value is set to D, and it is judged whether the difference between d_1 and d_0 falls within D; if so, the first pixel point is marked as belonging to region 0.
  • Pixel points in the three directions around the target pixel point, including the left and right directions, are examined in turn and assigned to region 0 in this way; after the clustering of region 0 is completed, any pixel that is still unmarked is selected as a new target pixel point, marked as region 1, and the clustering of region 1 is completed according to the same process as region 0.
  • Proceeding in this way until every pixel has been marked, the clustering is completed and a plurality of cluster regions is obtained.
  • FIG. 4 is a schematic structural diagram of another embodiment of a drone according to an embodiment of the present invention, and the drone may include:
  • the obtaining module 401 is configured to acquire a stereo disparity map of the current environment image by using a binocular camera;
  • the clustering module 402 is configured to cluster the pixels in the stereo disparity map to obtain a plurality of cluster regions.
  • the flight distance calculation module 403 is configured to calculate, according to each cluster area, a flight distance of the environment area corresponding to each cluster area to the drone.
  • the first determining module 404 is configured to determine whether a flight distance of each environmental area and the drone is greater than a distance threshold.
  • The environment regions may be sorted according to their distance to the drone, obtaining an ordering of the environment regions by flight distance.
  • the method may further include:
  • the first control module is configured to control the drone to fly according to the original route if the flight distance between the environment region corresponding to the drone's flight direction and the drone is greater than the distance threshold.
  • a first selection module 405, configured to select any one fly-around region from the environment regions whose flight distance is greater than a distance threshold;
  • the first selection module 405 can include:
  • The second selecting unit 411 is configured to, if the flight distance between the environment region corresponding to the drone's flight direction and the drone is less than the distance threshold, select, from the environment regions whose flight distance is greater than the distance threshold, the environment region with the farthest flight distance as the candidate region.
  • When the flight distance between the environment region corresponding to the drone's flight direction and the drone is less than the distance threshold, the drone cannot continue flying in its initial direction and must select an environment region whose flight distance is greater than the distance threshold to fly around. According to the ordering of the flight distances of the environment regions, the environment region with the largest flight distance may preferentially be selected as the candidate region for the fly-around; alternatively, the environment region whose flight distance exceeds the threshold and that lies closest to the drone's flight direction may be selected as the candidate region.
  • the third determining unit 412 is configured to determine whether the candidate area meets a flight condition; if yes, trigger the determining unit 413; if not, trigger the third selecting unit 414.
  • the flight condition may be determined according to the actual size of the candidate area.
  • For example, if the drone is 1 meter wide and 0.5 meters high and the actual size of the candidate region is larger than the size of the drone, the drone can pass through safely and the candidate region satisfies the flight condition; if the actual size of the candidate region is smaller than the size of the drone, the drone cannot fly through the candidate region, and the flight condition is not met.
  • the determining unit 413 is configured to use the candidate area as a flying area.
  • The third selecting unit 414 is configured to select, from the environment regions whose flight distance is greater than the distance threshold and excluding the current candidate region, the environment region with the farthest flight distance as the new candidate region, and to return to step 206 to continue the corresponding operations.
  • That is, according to the ordering of the environment regions by flight distance, the environment region with the largest flight distance among the environment regions other than the current candidate region is selected as the new candidate region, and it is determined again whether this candidate region satisfies the flight condition.
  • the flying module 406 is configured to calculate the rotation angle by which the drone turns toward the selected fly-around region, and to control the drone to fly around according to the rotation angle.
  • the method may further include:
  • a second control module configured to control the drone to hover if none of the candidate regions meet the flight condition.
  • If no candidate region satisfies the flight condition, the drone is controlled to hover and the surrounding environment image is sent to the ground station so that the flight route can be recalculated; after the flight route sent back by the ground station is received, the drone follows the flight instructions to avoid the obstacle.
  • The acquisition module 401 is the same as the acquisition module 301 in the embodiment of FIG. 3, the clustering module 402 is the same as the clustering module 302, the flight distance calculation module 403 is the same as the flight distance calculation module 303, and the flying module 406 is the same as the flying module 306 in the embodiment of FIG. 3; details are not described herein again.
  • In this embodiment, the drone determines whether the flight distance of the environment region corresponding to its flight direction is greater than the distance threshold; if it is, the drone continues flying along the original route; if it is not, the environment region whose flight distance is greater than the distance threshold and farthest from the drone is preferentially selected as the candidate region, and it is determined whether the candidate region satisfies the flight condition. In this way the drone can find the optimal flight path to fly around and autonomously avoid obstacles, greatly improving flight efficiency.
  • a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
  • the memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer readable medium, such as read only memory (ROM) or flash memory.
  • Memory is an example of a computer readable medium.
  • Computer readable media includes both permanent and non-persistent, removable and non-removable media.
  • Information storage can be implemented by any method or technology.
  • the information can be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission media that can be used to store information accessible by a computing device.
  • Computer readable media, as used herein, does not include transitory media such as modulated data signals and carrier waves.
  • If a first device is coupled to a second device, the first device may be directly electrically coupled to the second device, or indirectly electrically coupled to the second device through other devices or coupling means.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

An obstacle avoidance method for an unmanned aerial vehicle, and an unmanned aerial vehicle. The obstacle avoidance method comprises: acquiring, by means of a binocular camera, a stereoscopic parallax map of a current environment image (101), and clustering pixel points in the stereoscopic parallax map to obtain a plurality of clustering regions (102); based on each clustering region, calculating a flight distance from the environment region corresponding to each clustering region to the unmanned aerial vehicle (103); determining whether the flight distance from each environment region to the unmanned aerial vehicle is greater than a distance threshold value (104), so that the unmanned aerial vehicle selects any fly-around region from the environment regions whose flight distance is greater than the distance threshold value (105); and calculating a rotation angle for the unmanned aerial vehicle to turn toward the fly-around region, and controlling the unmanned aerial vehicle so that it flies around according to the rotation angle (106). Autonomous obstacle avoidance and flying around of an unmanned aerial vehicle are thereby realised, improving flight efficiency.

Description

UAV obstacle avoidance method and UAV
Technical Field
The invention belongs to the field of electronic technology, and in particular relates to an obstacle avoidance method for a drone and to a drone.
Background
With the rapid development of drone technology, drones have been widely used in aerial photography, cargo transportation, resource exploration, and route surveying. Since no dedicated operator controls the drone during flight, the drone must fly autonomously along a planned route, yet obstacles encountered in actual flight are difficult to anticipate when the route is planned.
A drone can avoid obstacles encountered in flight by means of a video-based autonomous obstacle avoidance method. In this method, a camera carried by the drone captures images of the surrounding environment, and the images are analyzed to determine whether an obstacle ahead prevents the drone from flying through; if so, an avoidance maneuver is performed so that the drone can fly safely.
In the prior art, the video-based autonomous obstacle avoidance method hovers when an obstacle is encountered and cannot continue flying; it can only transmit the image back to the ground station, wait for the flight route to be recalculated, and then perform the obstacle avoidance flight, so the drone's flight efficiency is low.
Summary of the Invention
In view of this, the present invention provides a drone obstacle avoidance method and a drone, which solve the technical problem of low drone flight efficiency and enable the drone to autonomously avoid obstacles by flying around them, greatly improving flight efficiency.
In order to solve the above technical problem, the present invention provides an obstacle avoidance method for a drone, comprising:
acquiring a stereo disparity map of the current environment image through a binocular camera;
clustering the pixel points in the stereo disparity map to obtain a plurality of cluster regions;
calculating, based on each cluster region, the flight distance from the environment region corresponding to each cluster region to the drone;
determining whether the flight distance between each environment region and the drone is greater than a distance threshold;
selecting any one fly-around region from the environment regions whose flight distance is greater than the distance threshold;
calculating the rotation angle by which the drone turns toward the selected fly-around region, and controlling the drone to fly around according to the rotation angle.
Preferably, clustering the pixel points in the stereo disparity map to obtain a plurality of cluster regions includes:
taking any unmarked pixel point as a target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determining whether the pixel difference between each pixel point and the target pixel point is within a preset range;
marking the pixel points in the neighborhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
Preferably, taking any unmarked pixel point as a target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determining whether the pixel difference between each pixel point and the target pixel point is within a preset range includes:
taking any unmarked pixel point as the target pixel point and, starting from each adjacent pixel point of the target pixel point, sequentially determining whether the pixel differences between consecutively adjacent pixel points and the target pixel point are within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
Preferably, acquiring the stereo disparity map of the current environment image through the binocular camera includes:
calibrating the binocular camera to obtain a Q matrix composed of the focal length, baseline, and origin offset of the binocular camera;
performing image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain corrected images;
performing stereo matching calculation on the corrected images to obtain the stereo disparity map.
Calculating, based on each cluster region, the flight distance from the environment region corresponding to each cluster region to the drone includes:
determining the position coordinates of the target pixel point in each cluster region;
based on the target pixel point of each cluster region, calculating the position coordinates of the environment region corresponding to each cluster region in the three-dimensional coordinate system according to the following position coordinate formula:
X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, T_x represents the baseline of the binocular camera, f denotes the focal length of the binocular camera, (x, y) denotes the position coordinates of the target pixel point in any cluster region, d denotes the disparity corresponding to that target pixel, and (c_x, c_y) is the imaging origin of the binocular camera after correction.
Preferably, selecting any one fly-around region from the environment regions whose flight distance is greater than the distance threshold includes:
if the flight distance between the environment region corresponding to the drone's flight direction and the drone is less than the distance threshold, selecting, from the environment regions whose flight distance is greater than the distance threshold, the environment region with the farthest flight distance as a candidate region;
determining whether the candidate region satisfies the flight condition;
if yes, taking the candidate region as the fly-around region;
if not, selecting, from the environment regions whose flight distance is greater than the distance threshold and excluding the previous candidate region, the environment region with the farthest flight distance as the new candidate region, and returning to the step of determining whether the candidate region satisfies the flight condition.
Preferably, the method further comprises:
if the flight distance between the environment region corresponding to the drone's flight direction and the drone is greater than the distance threshold, controlling the drone to fly according to the original route.
Preferably, the method further comprises:
if none of the candidate regions satisfies the flight condition, controlling the drone to hover.
Preferably, calculating the rotation angle by which the drone turns toward the selected fly-around region and controlling the drone to fly around according to the rotation angle includes:
calculating, according to the position coordinates corresponding to the fly-around region, the angle by which the drone rotates toward the fly-around region, and controlling the drone to fly around according to the rotation angle.
The invention also provides a drone, comprising:
an acquisition module, configured to acquire a stereo disparity map of the current environment image through a binocular camera;
a clustering module, configured to cluster the pixel points in the stereo disparity map to obtain a plurality of cluster regions;
a flight distance calculation module, configured to calculate, based on each cluster region, the flight distance from the environment region corresponding to each cluster region to the drone;
a first determining module, configured to determine whether the flight distance between each environment region and the drone is greater than a distance threshold;
a first selection module, configured to select any one fly-around region from the environment regions whose flight distance is greater than the distance threshold;
a flying module, configured to calculate the rotation angle by which the drone turns toward the selected fly-around region, and to control the drone to fly around according to the rotation angle.
Preferably, the clustering module comprises:
a second determining unit, configured to take any unmarked pixel point as a target pixel point and, starting from the adjacent pixel points of the target pixel point, sequentially determine whether the pixel difference between each pixel point and the target pixel point is within a preset range;
a marking unit, configured to mark the pixel points in the neighborhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
Preferably, the second determining unit is specifically configured to:
take any unmarked pixel point as the target pixel point and, starting from each adjacent pixel point of the target pixel point, sequentially determine whether the pixel differences between consecutively adjacent pixel points and the target pixel point are within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
Preferably, the first acquisition module is specifically configured to:
calibrate the binocular camera to obtain a Q matrix composed of the focal length, baseline, and origin offset of the binocular camera;
perform image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain corrected images;
perform stereo matching calculation on the corrected images to obtain the stereo disparity map.
The flight distance calculation module is specifically configured to:
determine the position coordinates of the target pixel point in each cluster region;
based on the target pixel point of each cluster region, calculate the position coordinates of the environment region corresponding to each cluster region in the three-dimensional coordinate system according to the following position coordinate formula:
X = (x − c_x) · T_x / d,  Y = (y − c_y) · T_x / d,  Z = f · T_x / d
where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, T_x represents the baseline of the binocular camera, f denotes the focal length of the binocular camera, (x, y) denotes the position coordinates of the target pixel point in any cluster region, d denotes the disparity corresponding to that target pixel, and (c_x, c_y) is the imaging origin of the binocular camera after correction.
优选地,所述第一选择模块包括:第二选择单元、第三判断单元、确定单元、第三选择单元;Preferably, the first selection module includes: a second selection unit, a third determination unit, a determination unit, and a third selection unit;
所述第二选择单元,用于如果所述无人机的飞行方向对应的环境区域与所述无人机的飞行距离小于距离阈值,从飞行距离大于距离阈值的环境区域中选择飞行距离最远的一个环境 区域作为候选区域;The second selecting unit is configured to select a farthest flight distance from an environment region where the flight distance is greater than the distance threshold if the flight distance corresponding to the flight direction of the drone is less than the distance threshold An environment The area is a candidate area;
所述第三判断单元,用于判断所述候选区域是否满足飞行条件;如果是,触发所述确定单元;如果否,触发所述第三选择单元;The third determining unit is configured to determine whether the candidate area meets a flight condition; if yes, trigger the determining unit; if not, trigger the third selecting unit;
所述确定单元,用于将所述候选区域作为绕飞区域;The determining unit is configured to use the candidate area as a flying area;
所述第三选择单元,用于从飞行距离大于距离阈值且不包括所述候选区域的环境区域中选择飞行距离最远的一个环境区域作为候选区域,并返回判断所述候选区域是否满足飞行条件的步骤继续执行。The third selection unit is configured to select, as a candidate region, an environment region whose flight distance is the farthest from the environment region where the flight distance is greater than the distance threshold and does not include the candidate region, and return to determine whether the candidate region meets the flight condition. The steps continue.
优选地,还包括:Preferably, the method further comprises:
第一控制模块,用于如果所述无人机的飞行方向对应的环境区域与所述无人机的飞行距离大于距离阈值,控制所述无人机按照原始路线飞行。The first control module is configured to control the drone to fly according to the original route if the flight area corresponding to the flight direction of the drone and the flight distance of the drone are greater than a distance threshold.
优选地,还包括:Preferably, the method further comprises:
第二控制模块,用于如果任一个候选区域均不满足所述飞行条件,控制所述无人机悬停。And a second control module, configured to control the drone to hover if none of the candidate regions meet the flight condition.
Preferably, the fly-around module is specifically configured to:
calculate, according to the position coordinates corresponding to the fly-around region, the rotation angle of the drone toward the fly-around region, and control the drone to fly around according to the rotation angle.
Compared with the prior art, the present invention can achieve the following technical effects:
The present invention provides a drone obstacle avoidance method and a drone. A stereo disparity map of the current environment image is acquired through a binocular camera, and the pixel points in the stereo disparity map are clustered to obtain a plurality of cluster regions. Based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone is calculated. By judging whether the flight distance between the drone and each environment region is greater than a distance threshold, the drone selects a fly-around region from the environment regions whose flight distance is greater than the distance threshold, calculates the rotation angle of the drone toward the selected fly-around region, and is controlled to fly around according to the rotation angle. By judging the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance is greater than the distance threshold, and flying around according to the rotation angle toward that region, the present invention enables the drone to avoid obstacles autonomously, which greatly improves flight efficiency.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a flowchart of an embodiment of a drone obstacle avoidance method according to an embodiment of the present invention;
FIG. 2 is a flowchart of another embodiment of a drone obstacle avoidance method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an embodiment of a drone according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of another embodiment of a drone according to an embodiment of the present invention.
DETAILED DESCRIPTION OF THE EMBODIMENTS
The embodiments of the present invention are described in detail below with reference to the accompanying drawings and embodiments, so that how the present invention applies technical means to solve the technical problem and achieve the technical effects can be fully understood and implemented accordingly.
With the rapid development of electronic technology, drones have been widely used in fields such as aerial photography, cargo transportation, resource exploration, and route survey. Autonomous flight of a drone requires an automatic obstacle avoidance function to ensure flight safety. Existing automatic obstacle avoidance means include video-based obstacle avoidance, but the current video-based autonomous obstacle avoidance method hovers when an obstacle is encountered and cannot continue flying; the image can only be transmitted back to the ground station to recalculate the flight route before obstacle avoidance flight resumes. This imposes many limitations in use and results in low flight efficiency of the drone.
To solve the technical problem of low drone flight efficiency, the inventors have arrived at the technical solution of the present invention through a series of studies. In the present invention, a stereo disparity map of the current environment image is acquired through a binocular camera, and the pixel points in the stereo disparity map are clustered to obtain a plurality of cluster regions. Based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone is calculated. Then, by judging whether the flight distance between the drone and each environment region is greater than a distance threshold, the drone selects a fly-around region from the environment regions whose flight distance is greater than the distance threshold, calculates the rotation angle toward the selected fly-around region, and is controlled to fly around according to the rotation angle. By judging the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance is greater than the distance threshold, and flying around according to the rotation angle toward that region, the present invention enables the drone to avoid obstacles autonomously, which greatly improves flight efficiency.
The technical solution of the present invention is described in detail below with reference to the accompanying drawings.
FIG. 1 is a flowchart of an embodiment of a drone obstacle avoidance method according to an embodiment of the present invention. The method may include:
101: Acquire a stereo disparity map of the current environment image through a binocular camera.
The binocular camera consists of a left camera and a right camera located at the front end of the drone and is used to capture images of the environment around the flight route of the drone. The binocular camera imitates the principle of human binocular vision and acquires two environment images through the left and right cameras respectively. Because the environment images captured by the binocular camera are distorted, the captured environment images need to be rectified, and a stereo matching algorithm is then used to compute the stereo disparity map. The stereo matching algorithm mainly establishes the correspondence between the two environment images acquired by the binocular camera and obtains a disparity map according to the triangulation principle; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can easily be obtained according to the projection model, so that the stereo disparity map is computed.
102: Cluster the pixel points in the stereo disparity map to obtain a plurality of cluster regions.
After the stereo disparity map of the surrounding environment image is obtained, the pixel points can be clustered according to the gray value of each pixel point in the stereo disparity map, and pixel points with similar gray values are grouped into one cluster region, so that the pixel points in the stereo disparity map are divided into a plurality of cluster regions.
103: Based on each cluster region, calculate the flight distance from the environment region corresponding to that cluster region to the drone.
Optionally, after the flight distance from the environment region corresponding to each cluster region to the drone is calculated, the environment regions may also be sorted from largest to smallest according to their distance to the drone, so as to obtain an ordering of the environment regions by flight distance to the drone.
Optionally, as a further embodiment, acquiring the stereo disparity map of the current environment image through the binocular camera includes:
calibrating the binocular camera to obtain a Q matrix composed of the focal length, the baseline, and the origin offsets of the binocular camera;
performing image rectification on the current environment image acquired by the binocular camera based on the Q matrix to obtain rectified images, where the current environment image acquired by the binocular camera includes two images, so two rectified images are obtained correspondingly; and
performing a stereo matching calculation on the two rectified images to obtain the stereo disparity map, where the formula of the Q matrix may be:
        | 1    0    0       -cx               |
        | 0    1    0       -cy               |
    Q = | 0    0    0        f                |
        | 0    0    1/Tx    -(cx - c'x)/Tx    |
where Tx represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (cx, cy) and (c'x, c'y) are the two imaging origins of the binocular camera after rectification, the two rectified imaging origins being identical, that is, cx = c'x and cy = c'y.
Here, the imaging origin refers to the intersection of the optical axis of the binocular camera with the captured image plane, and is usually located at the center of the image. The imaging origins of the two environment images captured by the binocular camera can be obtained through binocular camera calibration. Since the captured environment images need to be rectified, and the imaging origins of the two rectified environment images are both (cx, cy), the ideal configuration in which the two environment images are perfectly parallel and row-aligned can be obtained.
Optionally, after the intrinsic and extrinsic parameters of the binocular camera are obtained through binocular camera calibration, the two environment images captured by the two cameras can be rectified using these intrinsic and extrinsic parameters, so that the two rectified image planes are perfectly parallel and row-aligned; a stereo matching algorithm is then applied to the two rectified images to obtain the stereo disparity map of the current environment image. Optionally, the stereo matching algorithm may, according to the triangulation principle, compute the disparity d of each pixel point in the rectified images acquired by the binocular camera, so that a disparity image is obtained; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can be obtained according to the projection model, so that the stereo disparity map is computed.
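By way of a non-limiting illustration, the following Python sketch shows how such rectification and stereo matching might be carried out. The use of OpenCV, semi-global block matching, and the placeholder inputs (left_img, right_img, the calibration matrices K1, D1, K2, D2, R, T) are assumptions of this example; the patent does not prescribe a particular library or matching algorithm.

```python
import cv2
import numpy as np

def compute_disparity(left_img, right_img, K1, D1, K2, D2, R, T, image_size):
    """Rectify a stereo pair and compute a disparity map (illustrative sketch)."""
    # Stereo rectification: R1/R2 are rectifying rotations, P1/P2 projection
    # matrices, Q is the 4x4 disparity-to-depth reprojection matrix.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)

    map1x, map1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
    rect_left = cv2.remap(left_img, map1x, map1y, cv2.INTER_LINEAR)
    rect_right = cv2.remap(right_img, map2x, map2y, cv2.INTER_LINEAR)

    # Semi-global block matching on the rectified (row-aligned) pair.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(rect_left, rect_right).astype(np.float32) / 16.0
    return disparity, Q
```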
Calculating, based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone includes:
determining the position coordinates of the target pixel point in each cluster region; and
calculating, based on the target pixel point of each cluster region, the position coordinates of the environment region corresponding to each cluster region in the three-dimensional coordinate system according to the following position coordinate formula.
The position coordinate formula may be:
X = (x - cx)·Tx/d,  Y = (y - cy)·Tx/d,  Z = f·Tx/d
where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z may represent the flight distance from the environment region corresponding to each cluster region to the drone, (x, y) represents the position coordinates of the target pixel point in any given cluster region, and d represents the disparity corresponding to the target pixel in that cluster region.
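A minimal sketch of this reprojection follows, using the notation defined above (Tx the baseline, f the focal length, (cx, cy) the rectified imaging origin). How the target pixel of a cluster region is chosen is left open by the description, so passing its coordinates in explicitly is an assumption of this example.

```python
def region_position(target_px, disparity, f, Tx, cx, cy):
    """Reproject a cluster region's target pixel to 3D and return (X, Y, Z).

    Z is the flight distance from the corresponding environment region to the drone.
    """
    x, y = target_px
    d = float(disparity[y, x])
    if d <= 0:                      # invalid or infinite-distance match
        return None
    X = (x - cx) * Tx / d
    Y = (y - cy) * Tx / d
    Z = f * Tx / d
    return X, Y, Z
```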
104: Judge whether the flight distance between the drone and each environment region is greater than a distance threshold.
105: Select a fly-around region from the environment regions whose flight distance is greater than the distance threshold.
Optionally, the distance threshold may be set according to the performance parameters of the drone in actual flight. For example, if the drone can hover when it is three meters away from an obstacle ahead and is thereby guaranteed not to hit the obstacle, three meters may be set as the safe distance of the drone, and this safe distance is used as the distance threshold to ensure safe flight of the drone.
106: Calculate the rotation angle of the drone toward the selected fly-around region, and control the drone to fly around according to the rotation angle.
Optionally, in some embodiments, calculating the rotation angle of the drone toward the selected fly-around region and controlling the drone to fly around according to the rotation angle includes:
calculating, according to the position coordinates corresponding to the fly-around region, the rotation angle of the drone toward the fly-around region, and controlling the drone to fly around according to the rotation angle.
After the drone selects a fly-around region, it needs to turn and fly toward that region, so the rotation angle of the drone toward the selected fly-around region needs to be calculated. This rotation angle can be calculated from the position coordinates (X, Y, Z) of the environment region corresponding to the fly-around region in the three-dimensional coordinate system, and the drone is then controlled to fly around according to this rotation angle.
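One plausible way to obtain such an angle from (X, Y, Z) is sketched below. The exact geometry is not spelled out in the description, so the camera-centered axis convention and the use of atan2 over the lateral/forward components (yaw) and the vertical component (pitch) are assumptions of this example.

```python
import math

def rotation_angles(X, Y, Z):
    """Yaw and pitch (in degrees) needed to point the drone at the fly-around region.

    Assumes a camera-centered frame: X to the right, Y downward, Z forward.
    """
    yaw = math.degrees(math.atan2(X, Z))      # positive: turn right
    pitch = math.degrees(math.atan2(-Y, Z))   # positive: climb
    return yaw, pitch
```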
Optionally, in order to avoid changing the flight route of the drone too much, when the drone can pass through more than one fly-around region, the fly-around region with the smallest rotation angle may be selected preferentially for the fly-around.
In this embodiment, by judging the distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance is greater than the distance threshold, flying around according to the rotation angle toward that region, and preferentially selecting, as the fly-around region, the environment region with the smallest rotation angle among those greater than the distance threshold, the flight route is kept from changing significantly and autonomous obstacle avoidance of the drone is achieved, which greatly improves flight efficiency.
Optionally, as a further embodiment, clustering the pixel points in the stereo disparity map to obtain the plurality of cluster regions includes:
taking any unmarked pixel point as a target pixel point and, starting from the pixel points adjacent to the target pixel point, judging in turn whether the pixel difference between each pixel point and the target pixel point is within a preset range; and
marking the pixel points in the neighborhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
Optionally, after any pixel point in the stereo disparity map is clustered, the cluster region to which that pixel point belongs is marked.
For example, any unmarked pixel point in the stereo disparity map may first be selected as the target pixel point, the cluster region of this target pixel point is marked as region 0, the position coordinates of this target pixel point are (i, j), and its gray value is d0.
Optionally, as a further embodiment, taking any unmarked pixel point as the target pixel point and, starting from the pixel points adjacent to the target pixel point, judging in turn whether the pixel difference between each pixel point and the target pixel point is within the preset range includes:
taking any unmarked pixel point as the target pixel point and, starting from each pixel point adjacent to the target pixel point, judging in turn whether the pixel difference between each successive adjacent pixel point and the target pixel point is within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
Optionally, starting from the pixel points adjacent to the target pixel point and judging in turn whether the pixel difference between each pixel point and the target pixel point is within the preset range may proceed as follows. First, the first pixel point at the adjacent position (i, j-1) above the target pixel point is selected, and its gray value d1 is obtained. The preset range of the gray value difference is set to D, and it is judged whether |d0 - d1| is within the preset range; if it is, the first pixel point is marked as belonging to region 0. Then it is judged in turn whether the pixel points adjacent to the first pixel point belong to region 0, until a pixel point that does not belong to region 0 is found in that direction, at which point the judgment in that direction stops. Following the above process, the pixel points in the three directions below, to the left of, and to the right of the target pixel point are examined in turn to decide whether they belong to region 0. After the clustering of region 0 is completed, another unmarked pixel point is selected as a new target pixel point, marked as region 1, and the clustering of region 1 is completed following the same procedure as for region 0. This continues until every pixel point in the stereo disparity map has been marked with a cluster region, at which point the clustering is complete and a plurality of cluster regions are obtained.
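A compact sketch of this region-growing clustering on a disparity (gray-value) image is given below. The four-neighborhood scan and the breadth-first queue are implementation choices of the example, not requirements of the description; what the sketch preserves is the comparison of each candidate pixel against the seed (target) pixel's gray value within the preset range D.

```python
from collections import deque
import numpy as np

def cluster_disparity(disp, D):
    """Label connected regions whose gray values differ from the seed by at most D.

    disp: 2-D array of disparity/gray values; D: preset range. Returns a label map
    where -1 means unlabeled and 0, 1, 2, ... are cluster regions.
    """
    h, w = disp.shape
    labels = np.full((h, w), -1, dtype=np.int32)
    region = 0
    for sj in range(h):
        for si in range(w):
            if labels[sj, si] != -1:
                continue
            d0 = float(disp[sj, si])          # seed (target pixel point)
            labels[sj, si] = region
            queue = deque([(sj, si)])
            while queue:
                j, i = queue.popleft()
                # Up, down, left, right neighbours of the current pixel.
                for nj, ni in ((j - 1, i), (j + 1, i), (j, i - 1), (j, i + 1)):
                    if 0 <= nj < h and 0 <= ni < w and labels[nj, ni] == -1 \
                            and abs(float(disp[nj, ni]) - d0) <= D:
                        labels[nj, ni] = region
                        queue.append((nj, ni))
            region += 1
    return labels
```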
FIG. 2 is a flowchart of another embodiment of a drone obstacle avoidance method according to an embodiment of the present invention. The method may include:
201: Acquire a stereo disparity map of the current environment image through a binocular camera.
202: Cluster the pixel points in the stereo disparity map to obtain a plurality of cluster regions.
203: Based on each cluster region, calculate the flight distance from the environment region corresponding to that cluster region to the drone.
204: Judge whether the flight distance between the drone and each environment region is greater than a distance threshold.
Optionally, when judging whether the flight distance between the drone and each environment region is greater than the distance threshold, the environment regions may be sorted from largest to smallest by their flight distance to the drone, so as to obtain a sequence of environment regions ordered by flight distance.
Optionally, it may first be judged whether the flight distance between the drone and the environment region corresponding to the flight direction of the drone is greater than the distance threshold.
Optionally, as a further embodiment, after judging whether the flight distance between the drone and each environment region is greater than the distance threshold, the method may further include:
if the flight distance between the drone and the environment region corresponding to the flight direction of the drone is greater than the distance threshold, controlling the drone to fly along the original route.
205: If the flight distance between the drone and the environment region corresponding to the flight direction of the drone is less than the distance threshold, select the environment region with the farthest flight distance, from among the environment regions whose flight distance is greater than the distance threshold, as a candidate region.
When the flight distance between the drone and the environment region corresponding to its flight direction is less than the distance threshold, the drone cannot continue flying in the initial flight direction and needs to select a surrounding environment region whose flight distance is greater than the distance threshold for the fly-around. Following the ordering of the flight distances of the environment regions, the environment region with the largest flight distance may be selected preferentially as the candidate region for the fly-around. Alternatively, the environment region closest to the flight direction of the drone among those whose flight distance is greater than the distance threshold may be selected as the candidate region.
206: Judge whether the candidate region satisfies a flight condition; if so, go to step 207; if not, go to step 208.
Optionally, the flight condition may be judged according to the actual size of the candidate region. For example, if the drone is 1 meter wide and 0.5 meters high and the actual size of the candidate region is larger than the size of the drone, the drone is guaranteed to pass safely and the candidate region satisfies the flight condition; if the actual size of the candidate region is smaller than the size of the drone, the drone cannot fly through the candidate region, so the flight condition is not satisfied.
207: Take the candidate region as the fly-around region.
208: Select, from among the environment regions whose flight distance is greater than the distance threshold and excluding the candidate region, the environment region with the farthest flight distance as a new candidate region, and return to step 206 to continue.
Optionally, if the candidate region does not satisfy the flight condition, then, again following the ordering of the flight distances of the environment regions, the candidate region with the largest flight distance among the environment regions other than that candidate region is selected as the new candidate region, and whether the new candidate region satisfies the flight condition is judged again.
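The selection logic of steps 204 to 208 can be sketched as follows. Representing each environment region as a (flight distance, region) pair and checking the flight condition through a fits_through callable are assumptions made purely for illustration.

```python
def choose_fly_around_region(regions, heading_region, dist_threshold, fits_through):
    """Pick a fly-around region per steps 204-208, or None if the drone should hover.

    regions: list of (flight_distance, region) pairs; heading_region: the pair for
    the region in the current flight direction; fits_through(region): True if the
    region satisfies the flight condition (e.g. larger than the drone's dimensions).
    """
    if heading_region[0] > dist_threshold:
        return "keep_original_route"          # step 204 passes: no detour needed
    # Candidate regions: distance above threshold, ordered farthest first (step 205).
    candidates = sorted((r for r in regions if r[0] > dist_threshold),
                        key=lambda r: r[0], reverse=True)
    for _, region in candidates:
        if fits_through(region):              # flight condition (step 206)
            return region                     # fly-around region (step 207)
    return None                               # no candidate qualifies: hover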
209: Calculate the rotation angle of the drone toward the selected fly-around region, and control the drone to fly around according to the rotation angle.
Optionally, as a further embodiment, after judging whether the flight distance between the drone and each environment region is greater than the distance threshold, the method may further include:
if none of the candidate regions satisfies the flight condition, controlling the drone to hover.
Optionally, if none of the candidate regions satisfies the flight condition, this indicates that the drone cannot continue flying; the drone is then controlled to hover, the surrounding environment image is sent to the ground station so that the flight route can be recalculated, and the drone waits to receive the flight route sent by the ground station before performing obstacle avoidance flight according to the flight instructions.
In this embodiment, the operations of steps 201 to 203 are the same as those of steps 101 to 103 in the embodiment of FIG. 1, and the operation of step 209 is the same as that of step 106 in the embodiment of FIG. 1, so they are not repeated here.
In this embodiment, it is first judged whether the distance of the environment region corresponding to the flight direction of the drone is greater than the distance threshold; if it is, the drone continues to fly along the original route; if it is not, the farthest environment region whose flight distance is greater than the distance threshold is selected preferentially as the candidate region, and the fly-around is performed once the candidate region is judged to satisfy the flight condition. In this way, the drone can find an optimal flight path for the fly-around according to the surrounding environment, achieving autonomous obstacle avoidance of the drone and greatly improving flight efficiency.
FIG. 3 is a schematic structural diagram of an embodiment of a drone according to an embodiment of the present invention. The drone may include:
an acquisition module 301, configured to acquire a stereo disparity map of the current environment image through a binocular camera.
The binocular camera consists of a left camera and a right camera located at the front end of the drone and is used to capture images of the environment around the flight route of the drone. The binocular camera imitates the principle of human binocular vision to acquire two environment images. The environment images captured by the binocular camera are distorted, so a stereo matching algorithm needs to be used to compute the stereo disparity map. The stereo matching algorithm mainly establishes the correspondence between the pair of environment images acquired by the binocular camera and obtains a disparity map according to the triangulation principle; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can easily be obtained according to the projection model, so that the stereo disparity map is computed.
a clustering module 302, configured to cluster the pixel points in the stereo disparity map to obtain a plurality of cluster regions.
After the stereo disparity map of the surrounding environment image is obtained, the pixel points can be clustered according to the gray value of each pixel point in the stereo disparity map, and pixel points with similar gray values are grouped into one cluster region, so that the pixel points in the stereo disparity map are divided into a plurality of cluster regions.
a flight distance calculation module 303, configured to calculate, based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone.
Optionally, after the flight distance from the environment region corresponding to each cluster region to the drone is calculated, the environment regions may also be sorted from largest to smallest according to their distance to the drone, so as to obtain an ordering of the environment regions by flight distance to the drone.
Optionally, as a further embodiment, the acquisition module 301 may be specifically configured to:
calibrate the binocular camera to obtain a Q matrix composed of the focal length, the baseline, and the origin offsets of the binocular camera;
perform image rectification on the current environment image acquired by the binocular camera based on the Q matrix to obtain rectified images, where the current environment image acquired by the binocular camera includes two images, so two rectified images are obtained after rectification; and
perform a stereo matching calculation on the two rectified images to obtain the stereo disparity map,
where the formula of the Q matrix may be:
        | 1    0    0       -cx               |
        | 0    1    0       -cy               |
    Q = | 0    0    0        f                |
        | 0    0    1/Tx    -(cx - c'x)/Tx    |
where Tx represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (cx, cy) and (c'x, c'y) are the imaging origins of the rectified images of the left and right cameras of the binocular camera respectively, the two rectified imaging origins being identical, that is, cx = c'x and cy = c'y.
Here, the imaging origin refers to the intersection of the optical axis of the binocular camera with the captured image plane, and is usually located at the center of the image. The imaging origins of the two environment images captured by the binocular camera can be obtained through binocular camera calibration. Since the captured environment images need to be rectified, and the imaging origins of the two rectified environment images are both (cx, cy), the ideal configuration in which the two environment images are perfectly parallel and row-aligned can be obtained.
Optionally, after the intrinsic and extrinsic parameters of the binocular camera are obtained through binocular camera calibration, the two environment images captured by the two cameras can be rectified using these intrinsic and extrinsic parameters, so that the two rectified image planes are perfectly parallel and row-aligned; a stereo matching algorithm is then applied to the two rectified images to obtain the stereo disparity map of the current environment image. Optionally, the stereo matching algorithm may, according to the triangulation principle, compute the disparity d of each pixel point in the environment images acquired by the binocular camera, so that a disparity image is obtained. After the disparity information is obtained, the depth information and three-dimensional information of the original environment image can be obtained according to the projection model, so that the stereo disparity map is computed.
The flight distance calculation module 303 may be specifically configured to:
determine the position coordinates of the target pixel point in each cluster region; and
calculate, based on the target pixel point of each cluster region, the position coordinates of the environment region corresponding to each cluster region in the three-dimensional coordinate system according to the following position coordinate formula.
The position coordinate formula may be:
X = (x - cx)·Tx/d,  Y = (y - cy)·Tx/d,  Z = f·Tx/d
where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z may represent the flight distance from the environment region corresponding to each cluster region to the drone, (x, y) represents the position coordinates of the target pixel point in any given cluster region, and d represents the disparity corresponding to the target pixel in that cluster region.
a first judgment module 304, configured to judge whether the flight distance between the drone and each environment region is greater than a distance threshold.
a first selection module 305, configured to select a fly-around region from the environment regions whose flight distance is greater than the distance threshold.
Optionally, the distance threshold may be set according to the performance parameters of the drone in actual flight. For example, if the drone can hover when it is three meters away from an obstacle ahead and is thereby guaranteed not to hit the obstacle, three meters may be set as the safe distance of the drone, and this safe distance is used as the distance threshold to ensure safe flight of the drone.
a fly-around module 306, configured to calculate the rotation angle of the drone toward the selected fly-around region, and control the drone to fly around according to the rotation angle.
Optionally, in some embodiments, the fly-around module 306 may be specifically configured to:
calculate, according to the position coordinates corresponding to the fly-around region, the rotation angle of the drone toward the fly-around region, and control the drone to fly around according to the rotation angle.
After the drone selects a fly-around region, it needs to turn and fly toward that region, so the rotation angle of the drone toward the selected fly-around region needs to be calculated. This rotation angle can be calculated from the position coordinates (X, Y, Z) of the environment region corresponding to the fly-around region in the three-dimensional coordinate system, and the drone is then controlled to fly around according to this rotation angle.
Optionally, in order to avoid changing the flight route of the drone too much, when the drone can pass through more than one fly-around region, the fly-around region with the smallest rotation angle may be selected preferentially for the fly-around.
In this embodiment, by judging the flight distance from the environment region corresponding to each cluster region in the stereo disparity map to the drone, selecting a fly-around region whose distance is greater than the distance threshold, flying around according to the rotation angle toward that region, and preferentially selecting, as the fly-around region, the environment region with the smallest rotation angle among those greater than the distance threshold, the flight route is kept from changing significantly and autonomous obstacle avoidance of the drone is achieved, which greatly improves flight efficiency.
Optionally, as a further embodiment, the clustering module may include:
a second judgment unit, configured to take any unmarked pixel point as a target pixel point and, starting from the pixel points adjacent to the target pixel point, judge in turn whether the pixel difference between each pixel point and the target pixel point is within a preset range; and
a marking unit, configured to mark the pixel points in the neighborhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
Optionally, after any pixel point in the stereo disparity map is clustered, the cluster region to which that pixel point belongs is marked.
For example, any unmarked pixel point in the stereo disparity map may first be selected as the target pixel point, the cluster region of this target pixel point is marked as region 0, the position coordinates of this target pixel point are (i, j), and its gray value is d0.
Optionally, as a further embodiment, the second judgment unit may be specifically configured to:
take any unmarked pixel point as the target pixel point and, starting from each pixel point adjacent to the target pixel point, judge in turn whether the pixel difference between each successive adjacent pixel point and the target pixel point is within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
Optionally, starting from the pixel points adjacent to the target pixel point and judging in turn whether the pixel difference between each pixel point and the target pixel point is within the preset range may proceed as follows. First, the first pixel point at the adjacent position (i, j-1) above the target pixel point is selected, and its gray value d1 is obtained. The preset range of the gray value difference is set to D, and it is judged whether |d0 - d1| is within the preset range; if it is, the first pixel point is marked as belonging to region 0. Then it is judged in turn whether the pixel points adjacent to the first pixel point belong to region 0, until a pixel point that does not belong to region 0 is found in that direction, at which point the judgment in that direction stops. Following the above process, the pixel points in the three directions below, to the left of, and to the right of the target pixel point are examined in turn to decide whether they belong to region 0. After the clustering of region 0 is completed, another unmarked pixel point is selected as a new target pixel point, marked as region 1, and the clustering of region 1 is completed following the same procedure as for region 0. This continues until every pixel point in the stereo disparity map has been marked with a cluster region, at which point the clustering is complete and a plurality of cluster regions are obtained.
FIG. 4 is a schematic structural diagram of another embodiment of a drone according to an embodiment of the present invention. The drone may include:
an acquisition module 401, configured to acquire a stereo disparity map of the current environment image through a binocular camera;
a clustering module 402, configured to cluster the pixel points in the stereo disparity map to obtain a plurality of cluster regions;
a flight distance calculation module 403, configured to calculate, based on each cluster region, the flight distance from the environment region corresponding to that cluster region to the drone; and
a first judgment module 404, configured to judge whether the flight distance between the drone and each environment region is greater than a distance threshold.
Optionally, when judging whether the flight distance between the drone and each environment region is greater than the distance threshold, the environment regions may be sorted from largest to smallest by their flight distance to the drone, so as to obtain a sequence of environment regions ordered by flight distance.
Optionally, it may first be judged whether the flight distance between the drone and the environment region corresponding to the flight direction of the drone is greater than the distance threshold.
Optionally, as a further embodiment, downstream of the first judgment module 404 the drone may further include:
a first control module, configured to control the drone to fly along the original route if the flight distance between the drone and the environment region corresponding to the flight direction of the drone is greater than the distance threshold.
a first selection module 405, configured to select a fly-around region from the environment regions whose flight distance is greater than the distance threshold.
The first selection module 405 may include:
a second selection unit 411, configured to, if the flight distance between the drone and the environment region corresponding to the flight direction of the drone is less than the distance threshold, select the environment region with the farthest flight distance, from among the environment regions whose flight distance is greater than the distance threshold, as a candidate region.
When the flight distance between the drone and the environment region corresponding to its flight direction is less than the distance threshold, the drone cannot continue flying in the initial flight direction and needs to select a surrounding environment region whose flight distance is greater than the distance threshold for the fly-around. Following the ordering of the flight distances of the environment regions, the environment region with the largest flight distance may be selected preferentially as the candidate region for the fly-around. Alternatively, the environment region closest to the flight direction of the drone among those whose flight distance is greater than the distance threshold may be selected as the candidate region.
a third judgment unit 412, configured to judge whether the candidate region satisfies a flight condition; if so, trigger the determination unit 413; if not, trigger the third selection unit 414.
Optionally, the flight condition may be judged according to the actual size of the candidate region. For example, if the drone is 1 meter wide and 0.5 meters high and the actual size of the candidate region is larger than the size of the drone, the drone is guaranteed to pass safely and the candidate region satisfies the flight condition; if the actual size of the candidate region is smaller than the size of the drone, the drone cannot fly through the candidate region, so the flight condition is not satisfied.
a determination unit 413, configured to take the candidate region as the fly-around region.
a third selection unit 414, configured to select, from among the environment regions whose flight distance is greater than the distance threshold and excluding the candidate region, the environment region with the farthest flight distance as a new candidate region, and trigger the third judgment unit 412 to judge the new candidate region again.
Optionally, if the candidate region does not satisfy the flight condition, then, again following the ordering of the flight distances of the environment regions, the candidate region with the largest flight distance among the environment regions other than that candidate region is selected as the new candidate region, and whether the new candidate region satisfies the flight condition is judged again.
a fly-around module 406, configured to calculate the rotation angle of the drone toward the selected fly-around region, and control the drone to fly around according to the rotation angle.
Optionally, as a further embodiment, downstream of the first judgment module 404 the drone may further include:
a second control module, configured to control the drone to hover if none of the candidate regions satisfies the flight condition.
Optionally, if none of the candidate regions satisfies the flight condition, this indicates that the drone cannot continue flying; the drone is then controlled to hover, the surrounding environment image is sent to the ground station so that the flight route can be recalculated, and the drone waits to receive the flight route sent by the ground station before performing obstacle avoidance flight according to the flight instructions.
In this embodiment, the acquisition module 401 is the same as the acquisition module 301 in the embodiment of FIG. 3, the clustering module 402 is the same as the clustering module 302 in the embodiment of FIG. 3, the flight distance calculation module 403 is the same as the flight distance calculation module 303 in the embodiment of FIG. 3, and the fly-around module 406 is the same as the fly-around module 306 in the embodiment of FIG. 3, so they are not repeated here.
In this embodiment, it is first judged whether the distance of the environment region corresponding to the flight direction of the drone is greater than the distance threshold; if it is, the drone continues to fly along the original route; if it is not, the farthest environment region whose flight distance is greater than the distance threshold is selected preferentially as the candidate region, and the fly-around is performed once the candidate region is judged to satisfy the flight condition. In this way, the drone can find an optimal flight path for the fly-around according to the surrounding environment, achieving autonomous obstacle avoidance of the drone and greatly improving flight efficiency.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, a network interface, and memory.
The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in the form of computer-readable media, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media, such as modulated data signals and carrier waves.
Certain terms are used throughout the description and claims to refer to particular components. Those skilled in the art will appreciate that hardware manufacturers may refer to the same component by different names. This specification and the claims do not distinguish components by differences in name, but by differences in function. The term "comprising", as used throughout the specification and claims, is an open-ended term and should be interpreted as "including but not limited to". "Substantially" means that, within an acceptable error range, a person skilled in the art can solve the technical problem within a certain error range and substantially achieve the technical effect. In addition, the term "coupled" as used herein includes any direct and indirect means of electrical coupling. Therefore, if a first device is described as being coupled to a second device, the first device may be directly electrically coupled to the second device, or indirectly electrically coupled to the second device through other devices or coupling means. The following description sets forth preferred embodiments for implementing the present invention; however, the description is intended to illustrate the general principles of the present invention and is not intended to limit the scope of the present invention. The scope of protection of the present invention is defined by the appended claims.
It should also be noted that the terms "include", "comprise", or any other variation thereof are intended to cover a non-exclusive inclusion, so that an article or system that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such an article or system. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the article or system that includes the element.
The above description shows and describes several preferred embodiments of the present invention. However, as stated above, it should be understood that the present invention is not limited to the forms disclosed herein, should not be regarded as excluding other embodiments, and may be used in various other combinations, modifications, and environments; it may be modified, within the scope of the inventive concept described herein, through the above teachings or through the skill or knowledge of the related art. Modifications and changes made by those skilled in the art that do not depart from the spirit and scope of the present invention shall all fall within the scope of protection of the appended claims of the present invention.

Claims (16)

  1. A method for obstacle avoidance of a drone, characterized in that it comprises:
    acquiring a stereo disparity map of a current environment image through a binocular camera;
    clustering pixel points in the stereo disparity map to obtain a plurality of cluster regions;
    calculating, based on each cluster region, a flight distance from an environment region corresponding to that cluster region to the drone;
    judging whether the flight distance between the drone and each environment region is greater than a distance threshold;
    selecting a fly-around region from the environment regions whose flight distance is greater than the distance threshold; and
    calculating a rotation angle of the drone toward the selected fly-around region, and controlling the drone to fly around according to the rotation angle.
  2. The method according to claim 1, characterized in that clustering the pixel points in the stereo disparity map to obtain the plurality of cluster regions comprises:
    taking any unmarked pixel point as a target pixel point and, starting from pixel points adjacent to the target pixel point, judging in turn whether a pixel difference between each pixel point and the target pixel point is within a preset range; and
    marking the pixel points in a neighborhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
  3. The method according to claim 2, characterized in that taking any unmarked pixel point as the target pixel point and, starting from the pixel points adjacent to the target pixel point, judging in turn whether the pixel difference between each pixel point and the target pixel point is within the preset range comprises:
    taking any unmarked pixel point as the target pixel point and, starting from each pixel point adjacent to the target pixel point, judging in turn whether the pixel difference between each successive adjacent pixel point and the target pixel point is within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
  4. The method according to claim 1, wherein obtaining the stereo disparity map of the current environment image through the binocular camera comprises:
    calibrating the binocular camera to obtain a Q matrix composed of the focal length, baseline, and origin offset of the binocular camera;
    performing image rectification on the current environment image acquired by the binocular camera based on the Q matrix to obtain a rectified image;
    performing a stereo matching calculation on the rectified image to obtain the stereo disparity map;
    and wherein calculating, based on each cluster region, the flight distance from the environment region corresponding to each cluster region to the drone comprises:
    determining the position coordinates of a target pixel point in each cluster region;
    calculating, based on the target pixel point of each cluster region, the position coordinates of the environment region corresponding to each cluster region in a three-dimensional coordinate system according to the following position coordinate formula;
    the position coordinate calculation formula being:

        X = (x - cx) · Tx / d
        Y = (y - cy) · Tx / d
        Z = f · Tx / d
    where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, Tx represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in any cluster region, d represents the disparity corresponding to the target pixel point in that cluster region, and (cx, cy) is the imaging origin of the binocular camera after rectification.
  5. The method according to claim 1, wherein selecting any one detour region from among the environment regions whose flight distance is greater than the distance threshold comprises:
    if the flight distance between the environment region corresponding to the flight direction of the drone and the drone is less than the distance threshold, selecting, from the environment regions whose flight distance is greater than the distance threshold, the environment region with the greatest flight distance as a candidate region;
    determining whether the candidate region satisfies a flight condition;
    if so, taking the candidate region as the detour region;
    if not, selecting, from the environment regions whose flight distance is greater than the distance threshold and which exclude the candidate region, the environment region with the greatest flight distance as the new candidate region, and returning to the step of determining whether the candidate region satisfies the flight condition.
  6. The method according to claim 5, further comprising:
    if the flight distance between the environment region corresponding to the flight direction of the drone and the drone is greater than the distance threshold, controlling the drone to fly along its original route.
  7. The method according to claim 5, further comprising:
    if none of the candidate regions satisfies the flight condition, controlling the drone to hover.
  8. The method according to claim 4, wherein calculating the rotation angle by which the drone turns toward the selected detour region and controlling the drone to perform the detour flight according to the rotation angle comprises:
    calculating, according to the position coordinates corresponding to the detour region, the rotation angle of the drone toward the detour region, and controlling the drone to perform the detour flight according to the rotation angle.
  9. A drone, characterized in that it comprises:
    an acquisition module, configured to obtain a stereo disparity map of a current environment image through a binocular camera;
    a clustering module, configured to cluster pixel points in the stereo disparity map to obtain a plurality of cluster regions;
    a flight distance calculation module, configured to calculate, based on each cluster region, a flight distance from the environment region corresponding to each cluster region to the drone;
    a first determination module, configured to determine whether the flight distance between each environment region and the drone is greater than a distance threshold;
    a first selection module, configured to select any one detour region from among the environment regions whose flight distance is greater than the distance threshold;
    a detour module, configured to calculate a rotation angle by which the drone turns toward the selected detour region, and to control the drone to perform the detour flight according to the rotation angle.
  10. The drone according to claim 9, wherein the clustering module comprises:
    a second determination unit, configured to take any unmarked pixel point as a target pixel point and, starting from the pixel points adjacent to the target pixel point, determine in turn whether the pixel difference between each pixel point and the target pixel point is within a preset range;
    a marking unit, configured to mark the pixel points in the neighbourhood of the target pixel point whose pixel difference from the target pixel point is within the preset range as belonging to the same cluster region as the target pixel point.
  11. The drone according to claim 10, wherein the second determination unit is specifically configured to:
    take any unmarked pixel point as the target pixel point and, starting from each pixel point adjacent to the target pixel point, determine in turn whether the pixel difference between each successive adjacent pixel point and the target pixel point is within the preset range, until the pixel difference between some pixel point and the target pixel point is no longer within the preset range.
  12. The drone according to claim 9, wherein the first acquisition module is specifically configured to:
    calibrate the binocular camera to obtain a Q matrix composed of the focal length, baseline, and origin offset of the binocular camera;
    perform image rectification on the current environment image acquired by the binocular camera based on the Q matrix to obtain a rectified image;
    perform a stereo matching calculation on the rectified image to obtain the stereo disparity map;
    and wherein the flight distance calculation module is specifically configured to:
    determine the position coordinates of a target pixel point in each cluster region;
    calculate, based on the target pixel point of each cluster region, the position coordinates of the environment region corresponding to each cluster region in a three-dimensional coordinate system according to the following position coordinate formula;
    the position coordinate calculation formula being:

        X = (x - cx) · Tx / d
        Y = (y - cy) · Tx / d
        Z = f · Tx / d
    where (X, Y, Z) are the position coordinates of the environment region in the three-dimensional coordinate system, Z represents the flight distance from the environment region corresponding to each cluster region to the drone, Tx represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in any cluster region, d represents the disparity corresponding to the target pixel point in that cluster region, and (cx, cy) is the imaging origin of the binocular camera after rectification.
  13. The drone according to claim 9, wherein the first selection module comprises: a second selection unit, a third determination unit, a determining unit, and a third selection unit;
    the second selection unit being configured to, if the flight distance between the environment region corresponding to the flight direction of the drone and the drone is less than the distance threshold, select, from the environment regions whose flight distance is greater than the distance threshold, the environment region with the greatest flight distance as a candidate region;
    the third determination unit being configured to determine whether the candidate region satisfies a flight condition and, if so, to trigger the determining unit, or, if not, to trigger the third selection unit;
    the determining unit being configured to take the candidate region as the detour region;
    the third selection unit being configured to select, from the environment regions whose flight distance is greater than the distance threshold and which exclude the candidate region, the environment region with the greatest flight distance as the new candidate region, and to return to the step of determining whether the candidate region satisfies the flight condition.
  14. The drone according to claim 13, further comprising:
    a first control module, configured to control the drone to fly along its original route if the flight distance between the environment region corresponding to the flight direction of the drone and the drone is greater than the distance threshold.
  15. The drone according to claim 13, further comprising:
    a second control module, configured to control the drone to hover if none of the candidate regions satisfies the flight condition.
  16. The drone according to claim 12, wherein the detour module is specifically configured to:
    calculate, according to the position coordinates corresponding to the detour region, the rotation angle of the drone toward the detour region, and control the drone to perform the detour flight according to the rotation angle.
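
The disparity-map steps of claims 1 and 4 (calibration-derived Q matrix, image rectification, stereo matching) can be illustrated with a minimal sketch. Using OpenCV (cv2) and its SGBM matcher is an assumption of this sketch, not a requirement of the application, and the calibration inputs K1, D1, K2, D2, R, T are hypothetical placeholders standing in for the drone's actual camera calibration.

    import cv2
    import numpy as np

    def disparity_map(left_img, right_img, K1, D1, K2, D2, R, T, image_size):
        """Rectify a stereo pair and compute its disparity map in pixels."""
        # Stereo rectification also yields the 4x4 reprojection matrix Q,
        # which encodes the focal length, baseline and rectified principal point.
        R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, image_size, R, T)
        m1x, m1y = cv2.initUndistortRectifyMap(K1, D1, R1, P1, image_size, cv2.CV_32FC1)
        m2x, m2y = cv2.initUndistortRectifyMap(K2, D2, R2, P2, image_size, cv2.CV_32FC1)
        left_rect = cv2.remap(left_img, m1x, m1y, cv2.INTER_LINEAR)
        right_rect = cv2.remap(right_img, m2x, m2y, cv2.INTER_LINEAR)
        # Semi-global block matching on the rectified pair; SGBM returns
        # fixed-point disparities scaled by 16, hence the division.
        sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
        disparity = sgbm.compute(left_rect, right_rect).astype(np.float32) / 16.0
        return disparity, Q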
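
The pixel clustering of claims 2 and 3 reads as region growing: an unmarked pixel is taken as the seed, and neighbouring pixels whose disparity differs from the seed by no more than a preset range join its cluster, growth stopping once a neighbour falls outside that range. A sketch under that reading follows; the 4-neighbourhood and the threshold value are illustrative assumptions.

    from collections import deque
    import numpy as np

    def cluster_disparity(disparity, max_diff=2.0):
        """Label connected regions of similar disparity; 0 means unlabeled."""
        h, w = disparity.shape
        labels = np.zeros((h, w), dtype=np.int32)
        current = 0
        for sy in range(h):
            for sx in range(w):
                if labels[sy, sx]:
                    continue
                current += 1                       # start a new cluster at this seed
                seed = float(disparity[sy, sx])
                labels[sy, sx] = current
                queue = deque([(sy, sx)])
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if (0 <= ny < h and 0 <= nx < w and not labels[ny, nx]
                                and abs(float(disparity[ny, nx]) - seed) <= max_diff):
                            labels[ny, nx] = current
                            queue.append((ny, nx))
        return labels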
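
For the position-coordinate formula of claims 4 and 12, each cluster's target pixel is mapped to (X, Y, Z), with Z taken as the flight distance to that environment region. The sketch below applies Z = f · Tx / d, X = (x - cx) · Tx / d, Y = (y - cy) · Tx / d; choosing the cluster's largest-disparity (nearest) pixel as the target pixel is an assumption made here, since the claims do not fix how the target pixel is selected.

    import numpy as np

    def region_positions(disparity, labels, f, Tx, cx, cy):
        """Return {label: (X, Y, Z)}, with Z the flight distance to the region."""
        positions = {}
        for label in np.unique(labels):
            if label == 0:
                continue
            ys, xs = np.nonzero(labels == label)
            ds = disparity[ys, xs]
            valid = ds > 0                              # skip pixels with no valid disparity
            if not np.any(valid):
                continue
            i = int(np.argmax(np.where(valid, ds, 0.0)))  # nearest point of the region
            x, y, d = float(xs[i]), float(ys[i]), float(ds[i])
            Z = f * Tx / d                              # flight distance
            X = (x - cx) * Tx / d
            Y = (y - cy) * Tx / d
            positions[label] = (X, Y, Z)
        return positions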
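
Claims 5 to 8 then pick the detour region and the rotation angle. A sketch of that selection loop follows; treating the flight condition as a caller-supplied predicate and computing the yaw angle with atan2 from the region's (X, Z) coordinates are assumptions of this sketch, since the application does not spell either out at this level of detail.

    import math

    def choose_detour(positions, ahead_label, threshold, flight_ok):
        """Return ('forward' | 'detour' | 'hover', yaw angle in radians or None)."""
        # Claim 6: the region straight ahead is far enough, keep the original route.
        if positions[ahead_label][2] > threshold:
            return "forward", None
        # Claims 5 and 7: consider the remaining regions from farthest to nearest.
        candidates = sorted(
            (p for lbl, p in positions.items()
             if lbl != ahead_label and p[2] > threshold),
            key=lambda p: p[2], reverse=True)
        for X, Y, Z in candidates:
            if flight_ok(X, Y, Z):
                # Claim 8: rotation angle toward the detour region, from its
                # position coordinates relative to the camera axis.
                return "detour", math.atan2(X, Z)
        return "hover", None   # claim 7: no candidate satisfies the flight condition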
PCT/CN2017/108022 2017-07-21 2017-10-27 Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle WO2019015158A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710601150.1A CN107329490B (en) 2017-07-21 2017-07-21 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
CN201710601150.1 2017-07-21

Publications (1)

Publication Number Publication Date
WO2019015158A1 true WO2019015158A1 (en) 2019-01-24

Family

ID=60200465

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/108022 WO2019015158A1 (en) 2017-07-21 2017-10-27 Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN107329490B (en)
WO (1) WO2019015158A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109634309A (en) * 2019-02-21 2019-04-16 南京晓庄学院 A kind of aircraft automatic obstacle avoiding system, method and aircraft
CN113376658A (en) * 2021-05-08 2021-09-10 广东电网有限责任公司广州供电局 Unmanned aerial vehicle autonomous obstacle avoidance method and system based on single line laser radar
CN113554666A (en) * 2021-07-22 2021-10-26 南京航空航天大学 Device and method for extracting aircraft target candidate region in airborne optical image
CN114879729A (en) * 2022-05-16 2022-08-09 西北工业大学 Unmanned aerial vehicle autonomous obstacle avoidance method based on obstacle contour detection algorithm
CN117170411A (en) * 2023-11-02 2023-12-05 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle
CN117437563A (en) * 2023-12-13 2024-01-23 黑龙江惠达科技股份有限公司 Plant protection unmanned aerial vehicle dotting method, device and equipment based on binocular vision

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107958461A (en) * 2017-11-14 2018-04-24 中国航空工业集团公司西安飞机设计研究所 A kind of carrier aircraft method for tracking target based on binocular vision
CN107977985B (en) * 2017-11-29 2021-02-09 上海拓攻机器人有限公司 Unmanned aerial vehicle hovering method and device, unmanned aerial vehicle and storage medium
WO2019126930A1 (en) * 2017-12-25 2019-07-04 深圳市道通智能航空技术有限公司 Method and apparatus for measuring distance, and unmanned aerial vehicle
WO2019134142A1 (en) * 2018-01-05 2019-07-11 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method, unmanned aerial vehicle system, and control device
WO2019144291A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Flight control method, apparatus, and machine-readable storage medium
CN110231832B (en) * 2018-03-05 2022-09-06 北京京东乾石科技有限公司 Obstacle avoidance method and obstacle avoidance device for unmanned aerial vehicle
CN108497988A (en) * 2018-04-11 2018-09-07 重庆第二师范学院 A kind of embedded high-altitude cleaning glass window machine people's control system
CN108844538B (en) * 2018-05-07 2021-01-19 中国人民解放军国防科技大学 Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
CN111326023B (en) * 2018-12-13 2022-03-29 丰翼科技(深圳)有限公司 Unmanned aerial vehicle route early warning method, device, equipment and storage medium
US20220153411A1 (en) * 2019-03-25 2022-05-19 Sony Group Corporation Moving body, control method thereof, and program
CN110187720B (en) * 2019-06-03 2022-09-27 深圳铂石空间科技有限公司 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment
CN112101374B (en) * 2020-08-01 2022-05-24 西南交通大学 Unmanned aerial vehicle obstacle detection method based on SURF feature detection and ISODATA clustering algorithm
CN112729312A (en) * 2020-12-25 2021-04-30 云南电网有限责任公司昆明供电局 Unmanned aerial vehicle inspection method for high-voltage chamber of transformer substation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120296497A1 (en) * 2011-05-18 2012-11-22 Hon Hai Precision Industry Co., Ltd. Unmanned aerial vehicle and method for controlling the unmanned aerial vehicle
TW201534512A (en) * 2014-03-06 2015-09-16 Univ Nat Changhua Education Control method about obstacle avoidance and navigation by binocular images
CN105787447A (en) * 2016-02-26 2016-07-20 深圳市道通智能航空技术有限公司 Method and system of unmanned plane omnibearing obstacle avoidance based on binocular vision
CN106444837A (en) * 2016-10-17 2017-02-22 北京理工大学 Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
CN106708084A (en) * 2016-11-24 2017-05-24 中国科学院自动化研究所 Method for automatically detecting and avoiding obstacles for unmanned aerial vehicle under complicated environments

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101720047B (en) * 2009-11-03 2011-12-21 上海大学 Method for acquiring range image by stereo matching of multi-aperture photographing based on color segmentation
CN101989302B (en) * 2010-10-22 2012-11-28 西安交通大学 Multilayer bitmap color feature-based image retrieval method
EP2987001A4 (en) * 2013-04-16 2017-01-11 Bae Systems Australia Limited Landing system for an aircraft
CN104463183B (en) * 2013-09-13 2017-10-10 株式会社理光 Cluster centre choosing method and system
CN105225241B (en) * 2015-09-25 2017-09-15 广州极飞科技有限公司 The acquisition methods and unmanned plane of unmanned plane depth image
CN106933243A (en) * 2015-12-30 2017-07-07 湖南基石信息技术有限公司 A kind of unmanned plane Real Time Obstacle Avoiding system and method based on binocular vision
CN105718895A (en) * 2016-01-22 2016-06-29 张健敏 Unmanned aerial vehicle based on visual characteristics
CN106774386B (en) * 2016-12-06 2019-08-13 杭州灵目科技有限公司 Unmanned plane vision guided navigation landing system based on multiple dimensioned marker
CN106909877B (en) * 2016-12-13 2020-04-14 浙江大学 Visual simultaneous mapping and positioning method based on dotted line comprehensive characteristics
CN106774421B (en) * 2017-02-10 2020-03-10 郑州云海信息技术有限公司 Unmanned aerial vehicle trajectory planning system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120296497A1 (en) * 2011-05-18 2012-11-22 Hon Hai Precision Industry Co., Ltd. Unmanned aerial vehicle and method for controlling the unmanned aerial vehicle
TW201534512A (en) * 2014-03-06 2015-09-16 Univ Nat Changhua Education Control method about obstacle avoidance and navigation by binocular images
CN105787447A (en) * 2016-02-26 2016-07-20 深圳市道通智能航空技术有限公司 Method and system of unmanned plane omnibearing obstacle avoidance based on binocular vision
CN106444837A (en) * 2016-10-17 2017-02-22 北京理工大学 Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
CN106708084A (en) * 2016-11-24 2017-05-24 中国科学院自动化研究所 Method for automatically detecting and avoiding obstacles for unmanned aerial vehicle under complicated environments

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SU, DONG: "Navigation and Obstacle Avoidance for Miniature UAV Based on Binocular Stereo Vision", 2014 MASTER'S DISSERTATION OF XIDIAN UNIVERSITY, 31 December 2014 (2014-12-31) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109634309A (en) * 2019-02-21 2019-04-16 南京晓庄学院 A kind of aircraft automatic obstacle avoiding system, method and aircraft
CN109634309B (en) * 2019-02-21 2024-03-26 南京晓庄学院 Autonomous obstacle avoidance system and method for aircraft and aircraft
CN113376658A (en) * 2021-05-08 2021-09-10 广东电网有限责任公司广州供电局 Unmanned aerial vehicle autonomous obstacle avoidance method and system based on single line laser radar
CN113554666A (en) * 2021-07-22 2021-10-26 南京航空航天大学 Device and method for extracting aircraft target candidate region in airborne optical image
CN114879729A (en) * 2022-05-16 2022-08-09 西北工业大学 Unmanned aerial vehicle autonomous obstacle avoidance method based on obstacle contour detection algorithm
CN117170411A (en) * 2023-11-02 2023-12-05 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle
CN117170411B (en) * 2023-11-02 2024-02-02 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle
CN117437563A (en) * 2023-12-13 2024-01-23 黑龙江惠达科技股份有限公司 Plant protection unmanned aerial vehicle dotting method, device and equipment based on binocular vision
CN117437563B (en) * 2023-12-13 2024-03-15 黑龙江惠达科技股份有限公司 Plant protection unmanned aerial vehicle dotting method, device and equipment based on binocular vision

Also Published As

Publication number Publication date
CN107329490B (en) 2020-10-09
CN107329490A (en) 2017-11-07

Similar Documents

Publication Publication Date Title
WO2019015158A1 (en) Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle
US10534967B2 (en) Fish measurement station keeping
US20170305546A1 (en) Autonomous navigation method and system, and map modeling method and system
WO2019161813A1 (en) Dynamic scene three-dimensional reconstruction method, apparatus and system, server, and medium
US11073389B2 (en) Hover control
CN106529495B (en) Obstacle detection method and device for aircraft
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
WO2021035731A1 (en) Control method and apparatus for unmanned aerial vehicle, and computer readable storage medium
WO2018210078A1 (en) Distance measurement method for unmanned aerial vehicle, and unmanned aerial vehicle
WO2019119328A1 (en) Vision-based positioning method and aerial vehicle
WO2019076304A1 (en) Binocular camera-based visual slam method for unmanned aerial vehicles, unmanned aerial vehicle, and storage medium
WO2018145291A1 (en) System and method for real-time location tracking of drone
KR101896654B1 (en) Image processing system using drone and method of the same
Eynard et al. Real time UAV altitude, attitude and motion estimation from hybrid stereovision
WO2019126930A1 (en) Method and apparatus for measuring distance, and unmanned aerial vehicle
WO2019061064A1 (en) Image processing method and device
WO2021056139A1 (en) Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium
US11100667B2 (en) Systems and methods for generating annotations of structured, static objects in aerial imagery using geometric transfer learning and probabilistic localization
WO2020114433A1 (en) Depth perception method and apparatus, and depth perception device
WO2020237478A1 (en) Flight planning method and related device
WO2021056144A1 (en) Method and apparatus for controlling return of movable platform, and movable platform
CN114648639B (en) Target vehicle detection method, system and device
Dubey et al. Droan-disparity-space representation for obstacle avoidance: Enabling wire mapping & avoidance
Liu et al. Hybrid real-time stereo visual odometry for unmanned aerial vehicles
Gee et al. A dedicated lightweight binocular stereo system for real-time depth-map generation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17918274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17918274

Country of ref document: EP

Kind code of ref document: A1