CN107329490B - Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle

Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle

Info

Publication number: CN107329490B
Application number: CN201710601150.1A
Authority: CN (China)
Legal status: Active
Prior art keywords: unmanned aerial vehicle; area; pixel point; clustering
Other languages: Chinese (zh)
Other versions: CN107329490A (en)
Inventor: 王晓曼
Current Assignee: Goertek Techology Co Ltd
Original Assignee: Goertek Techology Co Ltd
Application filed by Goertek Techology Co Ltd
Priority application: CN201710601150.1A
Related international application: PCT/CN2017/108022 (WO2019015158A1)
Publication of application: CN107329490A
Application granted; publication of grant: CN107329490B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an unmanned aerial vehicle obstacle avoidance method and an unmanned aerial vehicle. The method comprises the following steps: acquiring a stereo disparity map of the current environment image through a binocular camera, and clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas; calculating, based on each clustering area, the flight distance from the environment area corresponding to that clustering area to the unmanned aerial vehicle; judging whether the flight distance from each environment area to the unmanned aerial vehicle is greater than a distance threshold, so that the unmanned aerial vehicle selects a fly-around area from the environment areas whose flight distance is greater than the distance threshold; and calculating the rotation angle by which the unmanned aerial vehicle turns toward the selected fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle. The invention realizes autonomous obstacle avoidance and fly-around of the unmanned aerial vehicle and improves the flight efficiency.

Description

Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
Technical Field
The invention belongs to the technical field of electronics, and particularly relates to an unmanned aerial vehicle obstacle avoidance method and an unmanned aerial vehicle.
Background
With the rapid development of unmanned aerial vehicle technology, unmanned aerial vehicles are widely applied in fields such as aerial photography, cargo carrying, resource exploration and route exploration. Since no operator controls the unmanned aerial vehicle during flight, it must fly autonomously along a planned route; however, obstacles encountered in actual flight are difficult to anticipate when the route is planned.
An unmanned aerial vehicle can avoid obstacles in flight based on a video-based autonomous obstacle avoidance method. In this method, a camera carried by the unmanned aerial vehicle captures images of the surrounding environment, the captured images are analyzed to judge whether the unmanned aerial vehicle can fly past an obstacle, and an obstacle avoidance action is executed for the obstacle, so that the unmanned aerial vehicle is ensured to fly safely.
In the prior art, however, the video-based autonomous obstacle avoidance method makes the unmanned aerial vehicle hover when it meets an obstacle: it cannot continue flying and can only transmit an image back to the ground station so that the flight route is recalculated before obstacle avoidance flight resumes, so the flight efficiency of the unmanned aerial vehicle is low.
Disclosure of Invention
In view of the above, the invention provides an unmanned aerial vehicle obstacle avoidance method and an unmanned aerial vehicle, which solve the technical problem of low flight efficiency of the unmanned aerial vehicle, realize autonomous obstacle avoidance and fly-around of the unmanned aerial vehicle, and greatly improve flight efficiency.
In order to solve the technical problem, the invention provides an unmanned aerial vehicle obstacle avoidance method, which comprises the following steps:
acquiring a stereo disparity map of a current environment image through a binocular camera;
clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas;
calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area;
judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value;
selecting any one of the fly-around areas from the environmental areas having the flight distance greater than the distance threshold;
and calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
Preferably, the clustering the pixel points in the stereo disparity map to obtain a plurality of clustering regions includes:
taking any unmarked pixel point as a target pixel point, and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range;
and marking the pixel points in the adjacent area of the target pixel point whose pixel difference value with respect to the target pixel point is within the preset range as pixel points belonging to the same clustering area as the target pixel point.
Preferably, the taking any unmarked pixel point as a target pixel point, and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range includes:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value of each successively adjacent pixel point with respect to the target pixel point is within the preset range, until the pixel difference value between some pixel point and the target pixel point is not within the preset range.
Preferably, the acquiring the stereoscopic disparity map of the current environment image by the binocular camera includes:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
performing image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image;
performing stereo matching calculation on the corrected image to obtain the stereo disparity map;
the calculating, based on each clustering region, a flight distance from an environment region corresponding to each clustering region to the unmanned aerial vehicle includes:
determining the position coordinates of target pixel points in each clustering region;
calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula is as follows:
$$X = \frac{(x - c_x)\, T_x}{d}, \qquad Y = \frac{(y - c_y)\, T_x}{d}, \qquad Z = \frac{f\, T_x}{d}$$
wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to the clustering area to the unmanned aerial vehicle, T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in the clustering area, d represents the disparity corresponding to the target pixel point in the clustering area, and (c_x, c_y) is the corrected imaging origin of the binocular camera.
Preferably, the selecting any one of the fly-around areas from the environmental areas having the flight distance greater than the distance threshold value includes:
if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than a distance threshold, selecting an environment area with the longest flight distance from the environment areas with the flight distance larger than the distance threshold as a candidate area;
judging whether the candidate area meets the flight condition;
if so, taking the candidate area as a fly-around area;
if not, selecting one environmental area with the largest flight distance from the environmental areas with the flight distance larger than the distance threshold and not including the candidate area as the candidate area, and returning to the step of judging whether the candidate area meets the flight condition to continue the execution.
Preferably, the method further comprises the following steps:
and if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold, controlling the unmanned aerial vehicle to fly along the original route.
Preferably, the method further comprises the following steps:
and if no candidate area satisfies the flight condition, controlling the unmanned aerial vehicle to hover.
Preferably, the calculating a rotation angle at which the unmanned aerial vehicle turns to any one of the fly-around areas, and controlling the unmanned aerial vehicle to fly around according to the rotation angle includes:
calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle by which the unmanned aerial vehicle turns toward the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
The invention also provides an unmanned aerial vehicle, which comprises:
the acquisition module is used for acquiring a stereo disparity map of the current environment image through a binocular camera;
the clustering module is used for clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas;
the flight distance calculation module is used for calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area;
the first judgment module is used for judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value;
the first selection module is used for selecting any one of the fly-around areas from the environment areas with the flight distance larger than the distance threshold;
and the fly-around module is used for calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
Preferably, the clustering module comprises:
the second judgment unit is used for taking any unmarked pixel point as a target pixel point and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range;
and the marking unit is used for marking the pixel points, which are in the adjacent region of the target pixel point and have the pixel difference value with the target pixel point within the preset range, as the pixel points which belong to the same clustering region with the target pixel point.
Preferably, the second judging unit is specifically configured to:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value of each successively adjacent pixel point with respect to the target pixel point is within the preset range, until the pixel difference value between some pixel point and the target pixel point is not within the preset range.
Preferably, the acquisition module is specifically configured to:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
performing image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image;
performing stereo matching calculation on the corrected image to obtain the stereo disparity map;
the flight distance calculation module is specifically configured to:
determining the position coordinates of target pixel points in each clustering region;
calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula is as follows:
$$X = \frac{(x - c_x)\, T_x}{d}, \qquad Y = \frac{(y - c_y)\, T_x}{d}, \qquad Z = \frac{f\, T_x}{d}$$
wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to the clustering area to the unmanned aerial vehicle, T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in the clustering area, d represents the disparity corresponding to the target pixel point in the clustering area, and (c_x, c_y) is the corrected imaging origin of the binocular camera.
Preferably, the first selection module comprises: the device comprises a second selection unit, a third judgment unit, a determination unit and a third selection unit;
the second selection unit is configured to select, if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than a distance threshold, an environment area with the longest flight distance from among the environment areas with the flight distance greater than the distance threshold as a candidate area;
the third judging unit is used for judging whether the candidate area meets the flight condition; if yes, triggering the determining unit; if not, triggering the third selection unit;
the determining unit is used for taking the candidate area as a fly-around area;
and the third selecting unit is used for selecting an environment area with the farthest flight distance from environment areas with the flight distance larger than a distance threshold value and without the candidate area as a candidate area, and returning to the step of judging whether the candidate area meets the flight condition to continue the execution.
Preferably, the method further comprises the following steps:
the first control module is used for controlling the unmanned aerial vehicle to fly along the original route if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold.
Preferably, the method further comprises the following steps:
and the second control module is used for controlling the unmanned aerial vehicle to hover if no candidate area satisfies the flight condition.
Preferably, the fly-around module is specifically configured to:
calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle by which the unmanned aerial vehicle turns toward the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
Compared with the prior art, the invention can obtain the following technical effects:
the invention provides an unmanned aerial vehicle obstacle avoidance method and an unmanned aerial vehicle, wherein a stereo disparity map of a current environment image is acquired through a binocular camera, and pixel points in the stereo disparity map are clustered to obtain a plurality of clustering areas. And calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area. Through judging each environmental zone with whether unmanned aerial vehicle's flying distance is greater than the distance threshold value, make unmanned aerial vehicle select any around flying the region from the environmental zone that flying distance is greater than the distance threshold value, and calculate unmanned aerial vehicle turns to any around flying the rotation angle in region, and control unmanned aerial vehicle follows rotation angle winds. According to the invention, the flight distance from the environment area corresponding to each clustering area in the stereo disparity map to the unmanned aerial vehicle is judged, the flying-around area larger than the distance threshold is selected, and flying-around is carried out according to the rotation angle of turning to the flying-around area, so that the unmanned aerial vehicle can autonomously avoid obstacles and fly around, and the flight efficiency is greatly improved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a flowchart of an embodiment of an unmanned aerial vehicle obstacle avoidance method according to an embodiment of the present invention;
Fig. 2 is a flowchart of another embodiment of an obstacle avoidance method for an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an embodiment of an unmanned aerial vehicle according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of another embodiment of the unmanned aerial vehicle according to the embodiment of the present invention.
Detailed Description
The following detailed description of the embodiments of the present invention will be provided with reference to the accompanying drawings and examples, so that how to implement the embodiments of the present invention by using technical means to solve the technical problems and achieve the technical effects can be fully understood and implemented.
With the rapid development of electronic technology, unmanned aerial vehicles are widely applied in fields such as aerial photography, cargo carrying, resource exploration and route exploration. Autonomous flight of an unmanned aerial vehicle requires an automatic obstacle avoidance function to guarantee flight safety. Existing automatic obstacle avoidance means include video-based obstacle avoidance, but in the existing video-based autonomous obstacle avoidance method the unmanned aerial vehicle hovers when it meets an obstacle, cannot continue flying, and can only transmit an image back to the ground station so that the flight route is recalculated before obstacle avoidance flight resumes; this places many limitations on use, and the flight efficiency of the unmanned aerial vehicle is low.
In order to solve the technical problem of low flight efficiency of the unmanned aerial vehicle, the inventor arrived at the technical scheme of the invention through a series of studies. In the invention, a stereo disparity map of the current environment image is acquired through a binocular camera, and the pixel points in the stereo disparity map are clustered to obtain a plurality of clustering areas. Based on each clustering area, the flight distance from the environment area corresponding to that clustering area to the unmanned aerial vehicle is calculated. Then, by judging whether the flight distance from each environment area to the unmanned aerial vehicle is greater than a distance threshold, the unmanned aerial vehicle selects a fly-around area from the environment areas whose flight distance is greater than the distance threshold, the rotation angle by which the unmanned aerial vehicle turns toward the fly-around area is calculated, and the unmanned aerial vehicle is controlled to fly around according to the rotation angle. In this way the unmanned aerial vehicle can autonomously avoid obstacles and fly around, and the flight efficiency is greatly improved.
The technical solution of the present invention will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an embodiment of an unmanned aerial vehicle obstacle avoidance method according to an embodiment of the present invention, where the method may include:
101: and acquiring a stereoscopic disparity map of the current environment image through a binocular camera.
The binocular camera is formed by left and right cameras located at the front end of the unmanned aerial vehicle, and is used for shooting environment images around the flight path of the unmanned aerial vehicle. The binocular camera simulates the principle of human binocular vision: two environment images are acquired by the left and right cameras respectively. Because the environment images acquired by the binocular camera are distorted, they must be corrected, and a stereo disparity map is then obtained by calculation using a stereo matching algorithm. The stereo matching algorithm mainly obtains a disparity map, according to the triangulation principle, from the correspondence between the two environment images acquired by the binocular camera; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can easily be obtained according to the projection model, so that the stereo disparity map is obtained by calculation.
102: and clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas.
After the stereo disparity map of the surrounding environment image is obtained, the pixel points can be clustered according to the gray value of each pixel point in the stereo disparity map: pixel points with similar gray values are grouped into one clustering area, so that the pixel points in the stereo disparity map are divided into a plurality of clustering areas.
103: and calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area.
Optionally, after the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle is calculated, the environment areas can be sorted from largest to smallest by their distance to the unmanned aerial vehicle, obtaining an ordering of the environment areas by flight distance.
Optionally, as another embodiment, the acquiring the stereoscopic disparity map of the current environment image by using a binocular camera includes:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
performing image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image; the binocular camera acquires two current environment images, and accordingly two corrected images are acquired correspondingly.
Performing stereo matching calculation on the two corrected images to obtain the stereo disparity map; wherein, the formula of the Q matrix may be:
$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x \\ 0 & 1 & 0 & -c_y \\ 0 & 0 & 0 & f \\ 0 & 0 & -1/T_x & (c_x - c'_x)/T_x \end{bmatrix}$$
wherein T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (c_x, c_y), (c'_x, c'_y) are the corrected imaging origins of the two cameras of the binocular camera; the two corrected imaging origins are the same, i.e. c_x = c'_x and c_y = c'_y.
The imaging origin is the intersection point of a camera's optical axis and the acquired image plane, and is usually located at the center of the image. The imaging origins of the two environment images collected by the binocular camera can be obtained by calibrating the binocular camera. Because the collected environment images must be corrected, the imaging origins of the two corrected environment images are the same point (c_x, c_y), so the ideal configuration in which the two environment images are perfectly parallel and aligned can be obtained.
Optionally, after the internal and external parameters of the binocular camera are obtained through calibration, the two environment images collected by the two cameras can be corrected using these parameters so that the planes of the two corrected environment images are aligned perfectly in parallel; the two corrected images can then be stereo-matched by a stereo matching algorithm to obtain the stereo disparity map of the current environment image. Optionally, the stereo matching algorithm may calculate the disparity d of each pixel point in the corrected images according to the triangulation principle to obtain a disparity image; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can be obtained according to the projection model, so that the stereo disparity map is calculated.
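As a concrete illustration of this acquisition step, the following Python sketch strings together calibration outputs, rectification and stereo matching with OpenCV. It is a minimal sketch under stated assumptions, not the patented implementation: the intrinsics and extrinsics (K1, D1, K2, D2, R, T) are assumed to come from a prior cv2.stereoCalibrate run, and the SGBM parameters are illustrative.

```python
import cv2
import numpy as np

def compute_disparity(left_img, right_img, K1, D1, K2, D2, R, T):
    """Rectify a binocular pair and compute its disparity map.

    K1/D1 and K2/D2 are the intrinsics/distortion of the left and right
    cameras, and R, T the rotation/translation between them (assumed to
    come from a prior stereo calibration).
    """
    h, w = left_img.shape[:2]
    # stereoRectify also returns the reprojection matrix Q, built from
    # the focal length, baseline and principal-point (origin) offsets.
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, (w, h), R, T)
    m1 = cv2.initUndistortRectifyMap(K1, D1, R1, P1, (w, h), cv2.CV_32FC1)
    m2 = cv2.initUndistortRectifyMap(K2, D2, R2, P2, (w, h), cv2.CV_32FC1)
    left_r = cv2.remap(left_img, m1[0], m1[1], cv2.INTER_LINEAR)
    right_r = cv2.remap(right_img, m2[0], m2[1], cv2.INTER_LINEAR)
    # Semi-global block matching on the rectified, row-aligned pair;
    # OpenCV returns fixed-point disparities scaled by 16.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=5)
    disparity = matcher.compute(left_r, right_r).astype(np.float32) / 16.0
    return disparity, Q
```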
The calculating, based on each clustering region, a flight distance from an environment region corresponding to each clustering region to the unmanned aerial vehicle includes:
determining the position coordinates of target pixel points in each clustering region;
calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula may be:
$$X = \frac{(x - c_x)\, T_x}{d}, \qquad Y = \frac{(y - c_y)\, T_x}{d}, \qquad Z = \frac{f\, T_x}{d}$$
wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to the clustering area to the drone, (x, y) represents the position coordinates of the target pixel point in the clustering area, and d represents the disparity corresponding to the target pixel point in the clustering area.
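For c_x = c'_x the reprojection reduces to the three scalar expressions above, which the following sketch applies to the representative (target) pixel of one clustering area. The function name and argument layout are assumptions for illustration only.

```python
def cluster_position(x, y, d, f, Tx, cx, cy):
    """Map a target pixel (x, y) with disparity d to the stereo frame:
    X = (x - cx) * Tx / d, Y = (y - cy) * Tx / d, Z = f * Tx / d.
    Z is the flight distance from the environment area to the drone.
    """
    if d <= 0:
        return None  # disparity must be positive for a valid depth
    return ((x - cx) * Tx / d,
            (y - cy) * Tx / d,
            f * Tx / d)
```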
104: and judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value.
105: selecting any one of the fly-around areas from the environmental areas having the flight distance greater than the distance threshold;
optionally, the distance threshold may be set according to the performance parameters of the unmanned aerial vehicle in actual flight. For example, if the unmanned aerial vehicle can hover when it is three meters away from an obstacle ahead, which guarantees that it will not collide with the obstacle, then three meters may be set as the safety distance and used as the distance threshold, so as to guarantee the safe flight of the unmanned aerial vehicle.
106: and calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
Optionally, in some embodiments, the calculating a rotation angle at which the drone turns to any one of the fly-around areas, and controlling the drone to fly around according to the rotation angle includes:
calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle by which the unmanned aerial vehicle turns toward the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
After the fly-around area is selected, the unmanned aerial vehicle needs to turn and fly toward the fly-around area, so the rotation angle by which the unmanned aerial vehicle turns toward the selected fly-around area must be calculated. The rotation angle can be calculated from the position coordinates (X, Y, Z) of the environment area corresponding to the fly-around area in the stereo coordinate system, and the unmanned aerial vehicle is then controlled to fly around according to the rotation angle.
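As a sketch of this angle computation, assuming the stereo frame has Z pointing along the current flight direction and X pointing to the drone's right (a convention the text does not fix), the horizontal rotation toward the fly-around area is a yaw angle:

```python
import math

def rotation_angle(area_position):
    """Yaw angle (degrees) that turns the drone toward the fly-around
    area located at (X, Y, Z) in the stereo coordinate system.
    Positive values mean a turn to the right under the assumed axes."""
    X, _, Z = area_position
    return math.degrees(math.atan2(X, Z))
```

A vertical component could be handled analogously with atan2(Y, Z) if the drone is also allowed to climb or descend around the obstacle.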
Optionally, in order to avoid changing the flight path of the unmanned aerial vehicle too much, when more than one fly-around area would let the unmanned aerial vehicle pass, the fly-around area requiring the smallest rotation angle may be selected preferentially for the fly-around.
In this embodiment, the flight distance from the environment area corresponding to each clustering area in the stereo disparity map to the unmanned aerial vehicle is judged, a fly-around area whose distance is greater than the distance threshold is selected, and the fly-around is performed according to the rotation angle of turning toward that area. Since the environment area that is greater than the distance threshold and requires the smallest rotation angle can be selected preferentially as the fly-around area, the flight route is not changed greatly, the unmanned aerial vehicle autonomously avoids obstacles and flies around, and the flight efficiency is greatly improved.
Optionally, as another embodiment, the clustering the pixel points in the stereo disparity map to obtain a plurality of clustering regions includes:
taking any unmarked pixel point as a target pixel point, and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range;
and marking the pixel points in the adjacent area of the target pixel point whose pixel difference value with respect to the target pixel point is within the preset range as pixel points belonging to the same clustering area as the target pixel point.
Optionally, after any pixel point in the stereo disparity map is clustered, a clustering region where the any pixel point is located is marked.
For example, first, any unmarked pixel point in the stereo disparity map may be selected as the target pixel point, the clustering area of this pixel point is marked as area 0, the position coordinates of the target pixel point are (i, j), and its gray value is d_0.
Optionally, as another embodiment, the taking any unmarked pixel point as a target pixel point, and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range includes:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value of each successively adjacent pixel point with respect to the target pixel point is within the preset range, until the pixel difference value between some pixel point and the target pixel point is not within the preset range.
Optionally, sequentially judging, starting from the pixel points adjacent to the target pixel point, whether the pixel difference value of each pixel point with respect to the target pixel point is within the preset range may proceed as follows. First, the first pixel point at the adjacent position (i, j-1) above the target pixel point is selected and its gray value d_1 is obtained. With the preset range of the gray value set to D, it is judged whether |d_0 - d_1| is within the preset range; if so, the first pixel point is also marked as area 0. Then it is sequentially judged whether the pixel points adjacent to the first pixel point belong to area 0, until a pixel point that does not belong to area 0 is found in that direction, at which point the judgment in that direction stops. Whether the pixel points in the downward, left and right directions of the target pixel point belong to area 0 is judged in the same way. After the clustering of area 0 is completed, any unmarked pixel point is selected again as the target pixel point, marked as area 1, and the clustering of area 1 is completed according to the clustering process of area 0. This continues by analogy until every pixel point in the stereo disparity map has been marked with a clustering area, which indicates that the clustering is finished, thereby obtaining a plurality of clustering areas.
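The directional walk described above is one way to grow regions; a breadth-first flood fill gives the same marking behaviour and is easier to write compactly. The NumPy-based sketch below, with the preset range D as a parameter, is an illustrative implementation rather than the patent's exact procedure:

```python
from collections import deque
import numpy as np

def cluster_disparity(disp, D):
    """Region-growing clustering of a disparity map: a pixel joins the
    seed's region when its gray value differs from the seed's by at
    most D. Returns an integer label map covering every pixel.
    The breadth-first flood fill is an implementation choice, not
    mandated by the text above."""
    h, w = disp.shape
    labels = np.full((h, w), -1, dtype=np.int32)  # -1 = not yet marked
    region = 0
    for i in range(h):
        for j in range(w):
            if labels[i, j] != -1:
                continue
            d0 = disp[i, j]          # gray value of the seed (target) pixel
            labels[i, j] = region
            queue = deque([(i, j)])
            while queue:
                y, x = queue.popleft()
                # 4-neighbourhood: up, down, left, right
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w
                            and labels[ny, nx] == -1
                            and abs(disp[ny, nx] - d0) <= D):
                        labels[ny, nx] = region
                        queue.append((ny, nx))
            region += 1
    return labels
```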
Fig. 2 is a flowchart of another embodiment of an obstacle avoidance method for an unmanned aerial vehicle according to an embodiment of the present invention, where the method may include:
201: acquiring a stereo disparity map of a current environment image through a binocular camera;
202: clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas;
203: and calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area.
204: and judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value.
Optionally, when judging whether the flight distance between each environment area and the unmanned aerial vehicle is greater than the distance threshold, the environment areas may be sorted by their flight distance to the unmanned aerial vehicle from largest to smallest, obtaining a sequence of environment areas ordered by flight distance.
Optionally, it may be determined first whether a flying distance between an environmental area corresponding to a flying direction of the drone and the drone is greater than a distance threshold.
Optionally, as another embodiment, after determining whether the flight distance between each environmental area and the drone is greater than a distance threshold, the method may further include:
and if the environmental area corresponding to the flight direction of the unmanned aerial vehicle and the flight distance of the unmanned aerial vehicle are greater than the distance threshold value, controlling the unmanned aerial vehicle to fly according to the original route.
205: if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than the distance threshold, selecting the environment area with the farthest flight distance from the environment areas whose flight distance is greater than the distance threshold as the candidate area.
When the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than the distance threshold, the unmanned aerial vehicle cannot continue flying in its initial flight direction, and an environment area whose flight distance is greater than the distance threshold must be selected for the fly-around. According to the ordering of the flight distances of the environment areas, the environment area with the largest flight distance is preferentially selected as the candidate area for the fly-around. Of course, the environment area whose flight distance is greater than the distance threshold and whose direction is closest to the flight direction of the unmanned aerial vehicle may also be selected as the candidate area.
206: judging whether the candidate area meets the flight condition; if yes, go to step 207; if not, step 208 is performed.
Alternatively, the flight condition may be determined based on the actual size of the candidate area. For example, suppose the unmanned aerial vehicle is 1 meter wide and 0.5 meter high. If the actual size of the candidate area is greater than the size of the unmanned aerial vehicle, the unmanned aerial vehicle can be guaranteed to pass through safely, and the candidate area satisfies the flight condition; if the actual size of the candidate area is smaller than the size of the unmanned aerial vehicle, the unmanned aerial vehicle cannot fly through the candidate area, so the flight condition is not satisfied.
207: and taking the candidate area as a fly-around area.
208: and selecting one environmental area with the largest flying distance from the environmental areas with the flying distance larger than the distance threshold and not including the candidate area as the candidate area, and returning to the step 206 to continue to execute the corresponding operation.
Optionally, if the candidate area does not satisfy the flight condition, the environment area with the largest flight distance among the environment areas excluding the current candidate area is selected as the new candidate area according to the ordering of the flight distances, and whether the new candidate area satisfies the flight condition is judged again.
209: and calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
Optionally, as another embodiment, after determining whether the flight distance between each environmental area and the drone is greater than a distance threshold, the method may further include:
and if no candidate area satisfies the flight condition, controlling the unmanned aerial vehicle to hover.
Optionally, if no candidate area satisfies the flight condition, which indicates that the unmanned aerial vehicle cannot continue flying, the unmanned aerial vehicle is controlled to hover, the surrounding environment images are sent to the ground station so that the flight route is recalculated, and the unmanned aerial vehicle waits to receive the flight route sent by the ground station before avoiding obstacles and flying around according to the flight instruction.
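Taken together, steps 204 to 208 plus the hover fallback amount to a sort-and-filter loop over the environment areas. The Python sketch below assumes each area is summarized by its flight distance and an estimated physical width and height; the field names and drone dimensions are invented for this illustration, and the flight condition is the size comparison described under step 206:

```python
def select_fly_around_area(areas, dist_threshold, drone_w=1.0, drone_h=0.5):
    """Pick a fly-around area per steps 204-208, or None to hover.

    Each area is a dict with "Z" (flight distance to the drone) and
    "width_m"/"height_m" (estimated physical size); these field names
    and the drone dimensions are assumptions for this sketch.
    """
    # Steps 204-205: keep areas beyond the safety distance, farthest first.
    candidates = sorted((a for a in areas if a["Z"] > dist_threshold),
                        key=lambda a: a["Z"], reverse=True)
    for cand in candidates:
        # Step 206: flight condition, the opening must be larger than
        # the drone so it can pass through safely.
        if cand["width_m"] > drone_w and cand["height_m"] > drone_h:
            return cand  # step 207: this is the fly-around area
        # Step 208: otherwise fall through to the next-farthest candidate.
    return None  # no candidate satisfies the flight condition: hover
```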
In this embodiment, operations in steps 201 to 203 are the same as operations in steps 101 to 103 in the embodiment of fig. 1, and operations in step 209 are the same as operations in step 106 in the embodiment of fig. 1, and are not repeated herein.
In this embodiment, whether the flight distance of the environment area corresponding to the flight direction of the unmanned aerial vehicle is greater than the distance threshold is judged first; if it is, the unmanned aerial vehicle continues flying along the original route. If it is smaller than the distance threshold, the environment area whose flight distance is greater than the distance threshold and farthest away is preferentially selected as the candidate area, and the candidate area is used for the fly-around once it is judged to satisfy the flight condition. In this way the unmanned aerial vehicle can find the optimal flight path for the fly-around according to the surrounding environment, autonomously avoid obstacles and fly around, and the flight efficiency is greatly improved.
Fig. 3 is a schematic structural diagram of an embodiment of an unmanned aerial vehicle according to an embodiment of the present invention, where the unmanned aerial vehicle may include:
the acquiring module 301 is configured to acquire a stereoscopic disparity map of a current environment image through a binocular camera.
The binocular camera is formed by left and right cameras located at the front end of the unmanned aerial vehicle, and is used for shooting environment images around the flight path of the unmanned aerial vehicle. The binocular camera simulates the principle of human binocular vision to obtain two environment images; because the environment images acquired by the binocular camera are distorted, a stereo disparity map must be obtained by calculation using a stereo matching algorithm. The stereo matching algorithm mainly obtains a disparity map, according to the triangulation principle, from the correspondence between the pair of environment images acquired by the binocular camera; after the disparity information is obtained, the depth information and three-dimensional information of the original environment image can easily be obtained according to the projection model, so that the stereo disparity map is obtained by calculation.
A clustering module 302, configured to cluster the pixel points in the stereo disparity map to obtain multiple clustering regions.
After the stereo disparity map of the surrounding environment image is obtained, the pixel points can be clustered according to the gray value of each pixel point in the stereo disparity map: pixel points with similar gray values are grouped into one clustering area, so that the pixel points in the stereo disparity map are divided into a plurality of clustering areas.
And the flying distance calculating module 303 is configured to calculate, based on each clustering region, a flying distance from the environment region corresponding to each clustering region to the unmanned aerial vehicle.
Optionally, after the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle is calculated, the environment areas can be sorted from largest to smallest by their distance to the unmanned aerial vehicle, obtaining an ordering of the environment areas by flight distance.
Optionally, as another embodiment, the obtaining module 301 may be specifically configured to:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
performing image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image; the binocular camera obtains two current environment images, and therefore two corrected images are obtained after correction.
Performing stereo matching calculation on the two corrected images to obtain the stereo disparity map;
wherein, the formula of the Q matrix may be:
$$Q = \begin{bmatrix} 1 & 0 & 0 & -c_x \\ 0 & 1 & 0 & -c_y \\ 0 & 0 & 0 & f \\ 0 & 0 & -1/T_x & (c_x - c'_x)/T_x \end{bmatrix}$$
wherein T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, and (c_x, c_y), (c'_x, c'_y) are the corrected imaging origins of the left and right cameras of the binocular camera; the two corrected imaging origins are the same, i.e. c_x = c'_x and c_y = c'_y.
The imaging origin is the intersection point of a camera's optical axis and the acquired image plane, and is usually located at the center of the image. The imaging origins of the two environment images collected by the binocular camera can be obtained by calibrating the binocular camera. Because the collected environment images must be corrected, the imaging origins of the two corrected environment images are the same point (c_x, c_y), so the ideal configuration in which the two environment images are perfectly parallel and aligned can be obtained.
Optionally, after the internal and external parameters of the binocular camera are obtained through calibration, the two environment images collected by the two cameras can be corrected using these parameters so that the planes of the two corrected environment images are aligned perfectly in parallel; the two corrected images can then be stereo-matched by a stereo matching algorithm to obtain the stereo disparity map of the current environment image. Optionally, the stereo matching algorithm may calculate the disparity d of each pixel point in the environment images according to the triangulation principle to obtain a disparity image. After the disparity information is obtained, the depth information and three-dimensional information of the original environment image can be obtained according to the projection model, so that the stereo disparity map is calculated.
The flying distance calculating module 303 may specifically be configured to:
determining the position coordinates of target pixel points in each clustering region;
calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula may be:
$$X = \frac{(x - c_x)\, T_x}{d}, \qquad Y = \frac{(y - c_y)\, T_x}{d}, \qquad Z = \frac{f\, T_x}{d}$$
wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to the clustering area to the drone, (x, y) represents the position coordinates of the target pixel point in the clustering area, and d represents the disparity corresponding to the target pixel point in the clustering area.
A first determining module 304, configured to determine whether a flight distance between each environmental area and the drone is greater than a distance threshold.
A first selection module 305 for selecting any one of the fly-around regions from the environmental regions having a flight distance greater than a distance threshold;
optionally, the distance threshold may be set according to the performance parameters of the unmanned aerial vehicle in actual flight. For example, if the unmanned aerial vehicle can hover when it is three meters away from an obstacle ahead, which guarantees that it will not collide with the obstacle, then three meters may be set as the safety distance and used as the distance threshold, so as to guarantee the safe flight of the unmanned aerial vehicle.
And the flying-around module 306 is configured to calculate a rotation angle at which the unmanned aerial vehicle turns to any one of the flying-around areas, and control the unmanned aerial vehicle to fly around according to the rotation angle.
Optionally, in some embodiments, the fly-around module 306 may be specifically configured to:
calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle by which the unmanned aerial vehicle turns toward the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
After the fly-around area is selected, the unmanned aerial vehicle needs to turn and fly toward the fly-around area, so the rotation angle by which the unmanned aerial vehicle turns toward the selected fly-around area must be calculated. The rotation angle can be calculated from the position coordinates (X, Y, Z) of the environment area corresponding to the fly-around area in the stereo coordinate system, and the unmanned aerial vehicle is then controlled to fly around according to the rotation angle.
Optionally, in order to avoid changing the flight path of the unmanned aerial vehicle too much, when more than one fly-around area would let the unmanned aerial vehicle pass, the fly-around area requiring the smallest rotation angle may be selected preferentially for the fly-around.
In this embodiment, by judging the flight distance from the environment area corresponding to each clustering area in the stereo disparity map to the unmanned aerial vehicle, a fly-around area whose distance is greater than the distance threshold is selected and the fly-around is performed according to the rotation angle of turning toward that area. Since the environment area that is greater than the distance threshold and requires the smallest rotation angle can be selected preferentially as the fly-around area, the flight route is not changed greatly, the unmanned aerial vehicle autonomously avoids obstacles and flies around, and the flight efficiency is greatly improved.
Optionally, as another embodiment, the clustering module may include:
the second judgment unit is used for taking any unmarked pixel point as a target pixel point and, starting from the pixel points adjacent to the target pixel point, sequentially judging whether the pixel difference value between each pixel point and the target pixel point is within a preset range;
and the marking unit is used for marking the pixel points, which are in the adjacent region of the target pixel point and have the pixel difference value with the target pixel point within the preset range, as the pixel points which belong to the same clustering region with the target pixel point.
Optionally, after any pixel point in the stereo disparity map is clustered, a clustering region where the any pixel point is located is marked.
For example, first, any unmarked pixel point in the stereo disparity map may be selected as a target pixel point, and the clustering region where the target pixel point marks the pixel point is region 0, and the position coordinate of the target pixel point is (i, j), and the gray value is d0
Optionally, as another embodiment, the second determining unit may be specifically configured to:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value of each successively adjacent pixel point with respect to the target pixel point is within the preset range, until the pixel difference value between some pixel point and the target pixel point is not within the preset range.
Optionally, sequentially judging, starting from the pixel points adjacent to the target pixel point, whether the pixel difference value of each pixel point with respect to the target pixel point is within the preset range may proceed as follows. First, the first pixel point at the adjacent position (i, j-1) above the target pixel point is selected and its gray value d_1 is obtained. With the preset range of the gray value set to D, it is judged whether |d_0 - d_1| is within the preset range; if so, the first pixel point is also marked as area 0. Then it is sequentially judged whether the pixel points adjacent to the first pixel point belong to area 0, and the judgment in that direction stops once a pixel point that does not belong to area 0 is found in that direction. Whether the pixel points in the downward, left and right directions of the target pixel point belong to area 0 is judged in the same way. After the clustering of area 0 is completed, any unmarked pixel point is selected again as the target pixel point, marked as area 1, and the clustering of area 1 is completed according to the clustering process of area 0. This continues by analogy until every pixel point in the stereo disparity map has been marked with a clustering area, which indicates that the clustering is finished, thereby obtaining a plurality of clustering areas.
Fig. 4 is a schematic structural diagram of another embodiment of the unmanned aerial vehicle according to the embodiment of the present invention, where the unmanned aerial vehicle may include:
an obtaining module 401, configured to obtain a stereo disparity map of a current environment image through a binocular camera;
a clustering module 402, configured to cluster pixel points in the stereo disparity map to obtain multiple clustering regions;
a flying distance calculating module 403, configured to calculate, based on each clustering region, a flying distance from the environment region corresponding to each clustering region to the unmanned aerial vehicle.
A first determining module 404, configured to determine whether a flight distance between each environmental area and the drone is greater than a distance threshold.
Optionally, when judging whether the flight distance between each environment area and the unmanned aerial vehicle is greater than the distance threshold, the environment areas may be sorted by their flight distance to the unmanned aerial vehicle from largest to smallest, obtaining a sequence of environment areas ordered by flight distance.
Optionally, it may be determined first whether a flying distance between an environmental area corresponding to a flying direction of the drone and the drone is greater than a distance threshold.
Optionally, as another embodiment, after the first determining module 404, the method may further include:
the first control module is used for controlling the unmanned aerial vehicle to fly along the original route if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold.
A first selection module 405, configured to select any one of the fly-around areas from the environment areas having the flight distance greater than the distance threshold;
the first selection module 405 may include:
a second selecting unit 411, configured to select, if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than the distance threshold, the environment area with the farthest flight distance from the environment areas whose flight distance is greater than the distance threshold as the candidate area.
When the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than the distance threshold, the unmanned aerial vehicle cannot continue flying in its initial flight direction and must select an environment area whose flight distance is greater than the distance threshold to fly around. According to the ordering of the flight distances corresponding to the environment areas, the environment area with the largest flight distance is preferentially selected as the candidate area for the detour. Of course, among the environment areas whose flight distance is greater than the distance threshold, the one closest to the flight direction of the unmanned aerial vehicle may also be selected as the candidate area to fly around.
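A minimal sketch of this candidate selection follows, covering both strategies just described. It assumes each environment area is represented as a dict carrying its flight distance and its angular offset from the current flight direction; this data layout and the function name are illustrative, not from the patent.

```python
def pick_candidate(areas, distance_threshold, prefer="farthest"):
    """Pick a candidate fly-around area among the passable ones.

    `areas` is assumed to be a list of dicts such as
    {"distance": 12.4, "bearing": -0.3}, where `bearing` is the
    angular offset (radians) from the current flight direction.
    """
    passable = [a for a in areas if a["distance"] > distance_threshold]
    if not passable:
        return None  # no passable area: the drone should hover
    if prefer == "farthest":
        # Preferred strategy: the largest flight distance.
        return max(passable, key=lambda a: a["distance"])
    # Alternative mentioned above: closest to the flight direction.
    return min(passable, key=lambda a: abs(a["bearing"]))
```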
A third judging unit 412, configured to judge whether the candidate region satisfies the flight condition; if so, trigger the determining unit 413; if not, trigger the third selecting unit 414.
Alternatively, the flight condition may be determined based on the actual size of the candidate region. For example, suppose the unmanned aerial vehicle is 1 meter wide and 0.5 meters high. If the actual size of the candidate region is larger than the size of the unmanned aerial vehicle, the unmanned aerial vehicle can pass through safely and the candidate region satisfies the flight condition; if the actual size of the candidate region is smaller than the size of the unmanned aerial vehicle, the unmanned aerial vehicle cannot fly through it, and the flight condition is not satisfied.
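A sketch of such a size-based flight condition is shown below, assuming the candidate region's real-world width and height have already been recovered from its coordinates. The safety margin parameter is an added assumption; the patent itself only compares raw sizes.

```python
def satisfies_flight_condition(region_width, region_height,
                               drone_width=1.0, drone_height=0.5,
                               margin=0.0):
    """True when the region is large enough for the drone to pass.

    All sizes are in metres; the default drone size matches the
    1 m x 0.5 m example above, and `margin` is an illustrative
    extra clearance beyond the drone's physical size.
    """
    return (region_width > drone_width + margin
            and region_height > drone_height + margin)
```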
The determining unit 413 is configured to determine the candidate region as a fly-around region.
The third selecting unit 414 is configured to select, as the new candidate area, the environment area with the largest flight distance from among the environment areas whose flight distance is greater than the distance threshold, excluding the current candidate area, and to return to step 206 to continue the corresponding operation.
Optionally, if the candidate region does not satisfy the flight condition, the environment region with the largest flight distance among the environment regions other than the current candidate region is selected as the new candidate region, according to the ordering of the flight distances corresponding to the environment regions, and whether the new candidate region satisfies the flight condition is judged again.
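Taken together, units 411 to 414 amount to walking the passable areas in descending order of flight distance until one passes the flight condition. A compact sketch of that loop follows, reusing the illustrative helpers and data layout assumed above.

```python
def select_fly_around_area(areas, distance_threshold, fits):
    """Return the farthest passable area that satisfies the flight
    condition; `fits` is a predicate (such as a wrapper around
    satisfies_flight_condition) applied to an area. Returns None
    when every candidate fails.
    """
    passable = [a for a in areas if a["distance"] > distance_threshold]
    for candidate in sorted(passable, key=lambda a: a["distance"], reverse=True):
        if fits(candidate):
            return candidate
    return None  # no candidate passes: hover and consult the ground station
```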
A fly-around module 406, configured to calculate the rotation angle at which the unmanned aerial vehicle turns to the selected fly-around area, and to control the unmanned aerial vehicle to fly around according to the rotation angle.
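The patent only states that the rotation angle is computed from the position coordinates of the fly-around area. One plausible reading, shown here purely as an assumption, is to take the yaw from the area's camera-frame X and Z coordinates.

```python
import math

def rotation_angle_deg(X, Z):
    """Yaw, in degrees, to turn the drone toward a fly-around area
    located at camera-frame coordinates (X, Z); positive values mean
    a turn toward positive X. This mapping is an illustrative
    assumption, not a computation specified by the patent.
    """
    return math.degrees(math.atan2(X, Z))
```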
Optionally, as another embodiment, after the first determining module 404, the unmanned aerial vehicle may further include:
the second control module, used for controlling the unmanned aerial vehicle to hover if none of the candidate areas satisfies the flight condition.
Optionally, if no candidate area satisfies the flight condition, indicating that the unmanned aerial vehicle cannot continue flying, the unmanned aerial vehicle is controlled to hover and the surrounding environment image is sent to the ground station so that the flight route can be recalculated; after the flight route sent by the ground station is received, the unmanned aerial vehicle avoids obstacles and flies around according to that route.
In this embodiment, the obtaining module 401 is the same as the obtaining module 301 in the embodiment of fig. 3, the clustering module 402 is the same as the clustering module 302 in the embodiment of fig. 3, the flying distance calculating module 403 is the same as the flying distance calculating module 303 in the embodiment of fig. 3, and the flying-around module 406 is the same as the flying-around module 306 in the embodiment of fig. 3, which is not repeated herein.
In this embodiment, whether the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold is judged first; if it is, the unmanned aerial vehicle continues to fly along the original route. If it is smaller than the distance threshold, the environment area whose flight distance is greater than the distance threshold and farthest from the unmanned aerial vehicle is preferentially selected as the candidate area, and the candidate area is used for the fly-around once it is judged to satisfy the flight condition. The unmanned aerial vehicle can thus find the optimal flight path around obstacles according to its surroundings, achieving autonomous obstacle avoidance and fly-around and greatly improving flight efficiency.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
As used in the specification and in the claims, certain terms are used to refer to particular components. As one skilled in the art will appreciate, manufacturers may refer to a component by different names. This specification and the claims do not intend to distinguish between components that differ in name but not in function. In the following description and in the claims, the terms "include" and "comprise" are used in an open-ended fashion and should thus be interpreted to mean "including, but not limited to". "Substantially" means within an acceptable error range: a person skilled in the art can solve the technical problem within a certain error range and substantially achieve the technical effect. Furthermore, the term "coupled" is intended to encompass any direct or indirect electrical coupling; thus, if a first device is coupled to a second device, the connection may be a direct electrical coupling or an indirect electrical coupling via other devices and couplings. The following description is of the preferred embodiments for carrying out the invention and is made for the purpose of illustrating its general principles, not for limiting its scope. The scope of the present invention is defined by the appended claims.
It is also noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such an article or system. Without further limitation, an element preceded by the phrase "comprising a …" does not exclude the presence of other like elements in an article or system that includes the element.
The foregoing description shows and describes several preferred embodiments of the invention, but, as stated above, it is to be understood that the invention is not limited to the forms disclosed herein and is not to be construed as excluding other embodiments; it is capable of use in various other combinations, modifications, and environments, and of changes within the scope of the inventive concept described herein, commensurate with the above teachings or with the skill or knowledge of the relevant art. Modifications and variations effected by those skilled in the art that do not depart from the spirit and scope of the invention fall within the protection of the appended claims.

Claims (14)

1. An unmanned aerial vehicle obstacle avoidance method is characterized by comprising the following steps:
acquiring a stereo disparity map of a current environment image through a binocular camera;
clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas;
calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area;
determining the position coordinates of target pixel points in each clustering region;
calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula is as follows:
$$X = \frac{(x - c_x)\,T_x}{d}, \qquad Y = \frac{(y - c_y)\,T_x}{d}, \qquad Z = \frac{f\,T_x}{d}$$

wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle, T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in any one clustering area, d represents the disparity corresponding to the target pixel point in any one clustering area, and (c_x, c_y) is the corrected imaging origin of the binocular camera;
judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value;
if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than a distance threshold, selecting an environment area with the farthest flight distance from among the environment areas whose flight distance is greater than the distance threshold as a candidate area;
judging whether the candidate area meets the flight condition;
if so, taking the candidate area as a fly-around area;
if not, selecting, as the new candidate area, an environment area with the farthest flight distance from among the environment areas whose flight distance is greater than the distance threshold, excluding the current candidate area, and returning to the step of judging whether the candidate area satisfies the flight condition to continue execution;
and calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
2. The method according to claim 1, wherein the clustering the pixels in the stereo disparity map to obtain a plurality of clustering regions comprises:
taking any unmarked pixel point as a target pixel point, and sequentially judging whether the pixel difference value of each pixel point and the target pixel point is within a preset range from the adjacent pixel point of the target pixel point;
and marking the pixel points in the adjacent area of the target pixel point whose pixel difference value with the target pixel point is within the preset range as pixel points belonging to the same clustering area as the target pixel point.
3. The method of claim 2, wherein the step of sequentially determining whether the pixel difference value between each pixel point and the target pixel point is within a preset range from an adjacent pixel point of the target pixel point by using any unmarked pixel point as the target pixel point comprises:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value between each successively adjacent pixel point and the target pixel point is within the preset range, until a pixel point whose pixel difference value with the target pixel point is not within the preset range is found.
4. The method of claim 1, wherein the acquiring the stereoscopic disparity map of the current environment image through the binocular camera comprises:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
and carrying out image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image.
5. The method of claim 1, further comprising:
and if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold, controlling the unmanned aerial vehicle to fly according to the original route.
6. The method of claim 1, further comprising:
and if none of the candidate areas satisfies the flight condition, controlling the unmanned aerial vehicle to hover.
7. The method according to claim 4, wherein the calculating a rotation angle for the drone to turn to any one of the fly-around areas and controlling the drone to fly around according to the rotation angle comprises:
and calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle at which the unmanned aerial vehicle turns to the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
8. An unmanned aerial vehicle, comprising:
the acquisition module is used for acquiring a stereoscopic parallax image of the current environment image through a binocular camera;
the clustering module is used for clustering the pixel points in the stereo disparity map to obtain a plurality of clustering areas;
the flight distance calculation module is used for calculating the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle based on each clustering area, and is specifically used for determining the position coordinates of target pixel points in each clustering area; calculating and obtaining the position coordinates of the environment area corresponding to each clustering area in a stereo coordinate system according to the following position coordinate formula based on the target pixel point of each clustering area;
the position coordinate calculation formula is as follows:
$$X = \frac{(x - c_x)\,T_x}{d}, \qquad Y = \frac{(y - c_y)\,T_x}{d}, \qquad Z = \frac{f\,T_x}{d}$$

wherein (X, Y, Z) are the position coordinates of the environment area in the stereo coordinate system, Z represents the flight distance from the environment area corresponding to each clustering area to the unmanned aerial vehicle, T_x represents the baseline of the binocular camera, f represents the focal length of the binocular camera, (x, y) represents the position coordinates of the target pixel point in any one clustering area, d represents the disparity corresponding to the target pixel point in any one clustering area, and (c_x, c_y) is the corrected imaging origin of the binocular camera;
the first judgment module is used for judging whether the flying distance between each environment area and the unmanned aerial vehicle is greater than a distance threshold value;
the first selection module comprises: the device comprises a second selection unit, a third judgment unit, a determination unit and a third selection unit;
the second selection unit is configured to select, if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is smaller than a distance threshold, an environment area with the longest flight distance from among the environment areas with the flight distance greater than the distance threshold as a candidate area;
the third judging unit is used for judging whether the candidate area meets the flight condition; if yes, triggering the determining unit; if not, triggering the third selection unit;
the determining unit is used for taking the candidate area as a fly-around area;
the third selecting unit is used for selecting an environment area with the farthest flight distance from environment areas with the flight distance larger than a distance threshold value and excluding the candidate area as a candidate area, and returning to the step of judging whether the candidate area meets the flight condition to continue the execution;
and the fly-around module is used for calculating a rotation angle of the unmanned aerial vehicle steering to any one of the fly-around areas and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
9. The drone of claim 8, wherein the clustering module comprises:
the second judgment unit is used for taking any unmarked pixel point as a target pixel point and sequentially judging whether the pixel difference value of each pixel point and the target pixel point is within a preset range from the adjacent pixel point of the target pixel point;
and the marking unit is used for marking the pixel points, which are in the adjacent region of the target pixel point and have the pixel difference value with the target pixel point within the preset range, as the pixel points which belong to the same clustering region with the target pixel point.
10. The unmanned aerial vehicle of claim 9, wherein the second determination unit is specifically configured to:
taking any unmarked pixel point as a target pixel point, and, starting from each pixel point adjacent to the target pixel point, sequentially judging whether the pixel difference value between each successively adjacent pixel point and the target pixel point is within the preset range, until a pixel point whose pixel difference value with the target pixel point is not within the preset range is found.
11. An unmanned aerial vehicle as defined in claim 8, wherein the acquisition module is specifically configured to:
calibrating the binocular camera to obtain a Q matrix formed by the focal length, the base line and the origin offset of the binocular camera;
and carrying out image correction on the current environment image acquired by the binocular camera based on the Q matrix to obtain a corrected image.
12. The drone of claim 8, further comprising:
the first control module, used for controlling the unmanned aerial vehicle to fly according to the original route if the flight distance between the environment area corresponding to the flight direction of the unmanned aerial vehicle and the unmanned aerial vehicle is greater than the distance threshold.
13. The drone of claim 8, further comprising:
the second control module, used for controlling the unmanned aerial vehicle to hover if none of the candidate areas satisfies the flight condition.
14. A drone according to claim 11, characterised in that the fly-around module is particularly adapted to:
and calculating, according to the position coordinates corresponding to the fly-around area, the rotation angle at which the unmanned aerial vehicle turns to the fly-around area, and controlling the unmanned aerial vehicle to fly around according to the rotation angle.
CN201710601150.1A 2017-07-21 2017-07-21 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle Active CN107329490B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710601150.1A CN107329490B (en) 2017-07-21 2017-07-21 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
PCT/CN2017/108022 WO2019015158A1 (en) 2017-07-21 2017-10-27 Obstacle avoidance method for unmanned aerial vehicle, and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710601150.1A CN107329490B (en) 2017-07-21 2017-07-21 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN107329490A CN107329490A (en) 2017-11-07
CN107329490B (en) 2020-10-09

Family

ID=60200465

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710601150.1A Active CN107329490B (en) 2017-07-21 2017-07-21 Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN107329490B (en)
WO (1) WO2019015158A1 (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107958461A (en) * 2017-11-14 2018-04-24 中国航空工业集团公司西安飞机设计研究所 A kind of carrier aircraft method for tracking target based on binocular vision
CN107977985B (en) * 2017-11-29 2021-02-09 上海拓攻机器人有限公司 Unmanned aerial vehicle hovering method and device, unmanned aerial vehicle and storage medium
EP3531375B1 (en) * 2017-12-25 2021-08-18 Autel Robotics Co., Ltd. Method and apparatus for measuring distance, and unmanned aerial vehicle
CN110612497B (en) * 2018-01-05 2022-10-25 深圳市大疆创新科技有限公司 Control method of unmanned aerial vehicle, unmanned aerial vehicle system and control equipment
WO2019144291A1 (en) * 2018-01-23 2019-08-01 深圳市大疆创新科技有限公司 Flight control method, apparatus, and machine-readable storage medium
CN110231832B (en) * 2018-03-05 2022-09-06 北京京东乾石科技有限公司 Obstacle avoidance method and obstacle avoidance device for unmanned aerial vehicle
CN108497988A (en) * 2018-04-11 2018-09-07 重庆第二师范学院 A kind of embedded high-altitude cleaning glass window machine people's control system
CN108844538B (en) * 2018-05-07 2021-01-19 中国人民解放军国防科技大学 Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
CN111326023B (en) * 2018-12-13 2022-03-29 丰翼科技(深圳)有限公司 Unmanned aerial vehicle route early warning method, device, equipment and storage medium
CN109634309B (en) * 2019-02-21 2024-03-26 南京晓庄学院 Autonomous obstacle avoidance system and method for aircraft and aircraft
US20220153411A1 (en) * 2019-03-25 2022-05-19 Sony Group Corporation Moving body, control method thereof, and program
CN110187720B (en) * 2019-06-03 2022-09-27 深圳铂石空间科技有限公司 Unmanned aerial vehicle guiding method, device, system, medium and electronic equipment
CN112101374B (en) * 2020-08-01 2022-05-24 西南交通大学 Unmanned aerial vehicle obstacle detection method based on SURF feature detection and ISODATA clustering algorithm
CN112729312A (en) * 2020-12-25 2021-04-30 云南电网有限责任公司昆明供电局 Unmanned aerial vehicle inspection method for high-voltage chamber of transformer substation
CN113376658A (en) * 2021-05-08 2021-09-10 广东电网有限责任公司广州供电局 Unmanned aerial vehicle autonomous obstacle avoidance method and system based on single line laser radar
CN113554666A (en) * 2021-07-22 2021-10-26 南京航空航天大学 Device and method for extracting aircraft target candidate region in airborne optical image
CN114879729A (en) * 2022-05-16 2022-08-09 西北工业大学 Unmanned aerial vehicle autonomous obstacle avoidance method based on obstacle contour detection algorithm
CN117170411B (en) * 2023-11-02 2024-02-02 山东环维游乐设备有限公司 Vision assistance-based auxiliary obstacle avoidance method for racing unmanned aerial vehicle
CN117437563B (en) * 2023-12-13 2024-03-15 黑龙江惠达科技股份有限公司 Plant protection unmanned aerial vehicle dotting method, device and equipment based on binocular vision

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014169354A1 (en) * 2013-04-16 2014-10-23 Bae Systems Australia Limited Landing system for an aircraft
CN105225241A (en) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 The acquisition methods of unmanned plane depth image and unmanned plane
CN106774421A (en) * 2017-02-10 2017-05-31 郑州云海信息技术有限公司 A kind of unmanned plane Trajectory Planning System
CN106774386A (en) * 2016-12-06 2017-05-31 杭州灵目科技有限公司 Unmanned plane vision guided navigation landing system based on multiple dimensioned marker

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101720047B (en) * 2009-11-03 2011-12-21 上海大学 Method for acquiring range image by stereo matching of multi-aperture photographing based on color segmentation
CN101989302B (en) * 2010-10-22 2012-11-28 西安交通大学 Multilayer bitmap color feature-based image retrieval method
TW201248347A (en) * 2011-05-18 2012-12-01 Hon Hai Prec Ind Co Ltd System and method for controlling unmanned aerial vehicle
CN104463183B (en) * 2013-09-13 2017-10-10 株式会社理光 Cluster centre choosing method and system
TWI532619B (en) * 2014-03-06 2016-05-11 Univ Nat Changhua Education Dual Image Obstacle Avoidance Path Planning Navigation Control Method
CN106933243A (en) * 2015-12-30 2017-07-07 湖南基石信息技术有限公司 A kind of unmanned plane Real Time Obstacle Avoiding system and method based on binocular vision
CN105718895A (en) * 2016-01-22 2016-06-29 张健敏 Unmanned aerial vehicle based on visual characteristics
CN105787447A (en) * 2016-02-26 2016-07-20 深圳市道通智能航空技术有限公司 Method and system of unmanned plane omnibearing obstacle avoidance based on binocular vision
CN106444837A (en) * 2016-10-17 2017-02-22 北京理工大学 Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
CN106708084B (en) * 2016-11-24 2019-08-02 中国科学院自动化研究所 The automatic detection of obstacles of unmanned plane and barrier-avoiding method under complex environment
CN106909877B (en) * 2016-12-13 2020-04-14 浙江大学 Visual simultaneous mapping and positioning method based on dotted line comprehensive characteristics

Also Published As

Publication number Publication date
WO2019015158A1 (en) 2019-01-24
CN107329490A (en) 2017-11-07

Similar Documents

Publication Publication Date Title
CN107329490B (en) Unmanned aerial vehicle obstacle avoidance method and unmanned aerial vehicle
US20210279444A1 (en) Systems and methods for depth map sampling
US9981742B2 (en) Autonomous navigation method and system, and map modeling method and system
CN110988912B (en) Road target and distance detection method, system and device for automatic driving vehicle
CN106529495B (en) Obstacle detection method and device for aircraft
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
US10630962B2 (en) Systems and methods for object location
US11151741B2 (en) System and method for obstacle avoidance
WO2019113966A1 (en) Obstacle avoidance method and device, and unmanned aerial vehicle
WO2019076304A1 (en) Binocular camera-based visual slam method for unmanned aerial vehicles, unmanned aerial vehicle, and storage medium
WO2018120040A1 (en) Obstacle detection method and device
CN106444837A (en) Obstacle avoiding method and obstacle avoiding system for unmanned aerial vehicle
JP2019011971A (en) Estimation system and automobile
CN108235815B (en) Imaging control device, imaging system, moving object, imaging control method, and medium
JP2022531625A (en) Detection method, device, electronic device and storage medium
CN112232275B (en) Obstacle detection method, system, equipment and storage medium based on binocular recognition
CN111338382A (en) Unmanned aerial vehicle path planning method guided by safety situation
WO2021195939A1 (en) Calibrating method for external parameters of binocular photographing device, movable platform and system
CN105844692A (en) Binocular stereoscopic vision based 3D reconstruction device, method, system and UAV
WO2021056139A1 (en) Method and device for acquiring landing position, unmanned aerial vehicle, system, and storage medium
WO2020237478A1 (en) Flight planning method and related device
CN114529800A (en) Obstacle avoidance method, system, device and medium for rotor unmanned aerial vehicle
US20210156710A1 (en) Map processing method, device, and computer-readable storage medium
WO2021056144A1 (en) Method and apparatus for controlling return of movable platform, and movable platform
CN114648639B (en) Target vehicle detection method, system and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant