CN112947570B - Unmanned aerial vehicle obstacle avoidance method and device and storage medium - Google Patents

Unmanned aerial vehicle obstacle avoidance method and device and storage medium

Info

Publication number
CN112947570B
CN112947570B
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
looking
path
environment map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110262530.3A
Other languages
Chinese (zh)
Other versions
CN112947570A (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Zhendi Intelligent Technology Co Ltd
Original Assignee
Suzhou Zhendi Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Zhendi Intelligent Technology Co Ltd filed Critical Suzhou Zhendi Intelligent Technology Co Ltd
Priority to CN202110262530.3A priority Critical patent/CN112947570B/en
Publication of CN112947570A publication Critical patent/CN112947570A/en
Application granted granted Critical
Publication of CN112947570B publication Critical patent/CN112947570B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Abstract

The embodiments of the application provide a method, an apparatus and a storage medium for obstacle avoidance by an unmanned aerial vehicle. The method includes: acquiring a forward-looking image captured by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is located below the body of the unmanned aerial vehicle; building an environment map from the forward-looking image, wherein the environment map represents the positions in the real environment of the objects in the forward-looking image; obtaining a safe path from the environment map, wherein each position point on the safe path lies in an obstacle-free area; generating a desired heading from the safe path; and controlling the unmanned aerial vehicle to fly and avoid obstacles according to the desired heading. The burden on the unmanned aerial vehicle can thereby be reduced while omnidirectional obstacle avoidance is achieved.

Description

Unmanned aerial vehicle obstacle avoidance method and device and storage medium
Technical Field
The embodiment of the application relates to the field of unmanned aerial vehicles, in particular to a method and a device for avoiding obstacles by an unmanned aerial vehicle and a storage medium.
Background
In the related art, when following a target such as a person or a vehicle, the heading of the unmanned aerial vehicle and the yaw angle of the pan-tilt head are adjusted so that the target stays at the center of the field of view, and the motion trajectory of the unmanned aerial vehicle then follows the motion trajectory of the target. Obstacles still need to be avoided during the following process. At present, omnidirectional obstacle avoidance is accomplished by increasing the number of binocular cameras pointing in different directions on the unmanned aerial vehicle, but this method increases the burden on the processor of the unmanned aerial vehicle and also increases the weight of the unmanned aerial vehicle.
Therefore, how to reduce the burden on the unmanned aerial vehicle while achieving omnidirectional obstacle avoidance has become an urgent problem to be solved.
Disclosure of Invention
The embodiments of the application provide a method, an apparatus and a storage medium for obstacle avoidance by an unmanned aerial vehicle. At least some embodiments of the application can generate a desired heading for the unmanned aerial vehicle, thereby reducing the burden on the unmanned aerial vehicle while avoiding obstacles omnidirectionally.
In a first aspect, a method for avoiding obstacles for an unmanned aerial vehicle includes: acquiring a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle; establishing an environment map from the forward-looking image, wherein the environment map represents the location of objects in the forward-looking image in a real environment; acquiring a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles; generating an expected course according to the safety path; and controlling the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected course.
Therefore, with the obstacle avoidance method of the embodiments of the application, the desired heading of the unmanned aerial vehicle can be obtained, and the unmanned aerial vehicle is controlled to fly and avoid obstacles according to the desired heading. Imaging is performed with only a single forward-looking camera, so the burden on the unmanned aerial vehicle is reduced, and the reduced number of cameras also saves cost.
With reference to the first aspect, in an embodiment, before the acquiring a forward-looking image captured by a forward-looking camera of a drone, the method further includes: acquiring the speed direction of the unmanned aerial vehicle; and adjusting the direction of the unmanned aerial vehicle head to be consistent with the speed direction.
Therefore, by keeping the nose direction of the unmanned aerial vehicle consistent with the speed direction, the embodiments of the application allow the unmanned aerial vehicle to maintain its flight direction during flight.
With reference to the first aspect, in an embodiment, the building an environment map according to the forward-looking image includes: and performing coordinate conversion on the pixel position and the depth information of each pixel in the forward-looking image by taking the unmanned aerial vehicle as an origin to obtain the environment map.
Therefore, the embodiment of the application converts the forward-looking image into the environment map so as to generate the safe path and the expected heading on the environment map, thereby enabling the unmanned aerial vehicle to fly in an obstacle avoidance manner.
With reference to the first aspect, in an embodiment, the generating a desired heading according to the safe path includes: optimizing the track of the safety path to obtain a path curve function; performing derivation on the path curve function to obtain a speed curve function; and obtaining the expected heading according to the speed vector in the speed curve function.
Therefore, the unmanned aerial vehicle can fly according to the expected course by generating the expected course, so that the purpose of obstacle avoidance flying is achieved.
With reference to the first aspect, in an embodiment, the obtaining a safe path according to the environment map includes: predicting a plurality of position points of the unmanned aerial vehicle at a plurality of future moments according to a motion strategy and a motion formula corresponding to the motion strategy, wherein the motion strategy comprises: acceleration, uniform speed or deceleration; under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values; selecting a position point corresponding to the loss function value with the minimum value from the candidate loss function values as a track point on the safe path; the steps in the above embodiment are repeatedly performed until the termination condition is satisfied, resulting in a plurality of trace points.
With reference to the first aspect, in an embodiment, the obtaining the desired heading according to a velocity vector in the velocity curve function includes: calculating an included angle between a first component and a second component in the velocity vector; and taking the included angle as the expected heading.
With reference to the first aspect, in one embodiment, the forward-looking image is obtained by controlling the pan-tilt head to rotate as follows: and controlling the pan-tilt head to rotate according to the position difference between the target frame and the center of the forward-looking camera.
Therefore, the embodiments of the application can control the rotation of the pan-tilt head through the position difference between the target frame and the center of the forward-looking camera, so that the target is kept at the center of the field of view of the unmanned aerial vehicle.
In a second aspect, an apparatus for unmanned aerial vehicle obstacle avoidance, the apparatus comprising: the acquisition module is configured to acquire a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle; an establishing module configured to establish an environment map from the forward-looking image, wherein the environment map represents a location of an object in the forward-looking image in a real environment; the obtaining module is configured to obtain a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles; a generating module configured to generate a desired heading from the secure path; and the control module is configured to control the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected heading.
With reference to the second aspect, in one embodiment, the obtaining means is configured to obtain a speed direction of the drone; and adjusting the direction of the unmanned aerial vehicle head to be consistent with the speed direction.
With reference to the second aspect, in an embodiment, the establishing module is configured to perform coordinate transformation on the pixel position and the depth information of each pixel in the forward-looking image with the unmanned aerial vehicle as an origin to obtain the environment map.
With reference to the second aspect, in an embodiment, the generating module is configured to perform trajectory optimization on the safety path to obtain a path curve function; carrying out derivation on the path curve function to obtain a speed curve function; and obtaining the expected course according to the speed vector in the speed curve function.
With reference to the second aspect, in an embodiment, the obtaining module is configured to predict a plurality of location points of the drone at a plurality of future times according to a motion policy and a motion formula corresponding to the motion policy, wherein the motion policy includes: acceleration, uniform speed or deceleration; under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values; selecting a position point corresponding to the loss function value with the minimum value from the plurality of candidate loss function values as a track point on the safe path; the steps in the above embodiment are repeatedly performed until the termination condition is satisfied, resulting in a plurality of trace points.
With reference to the second aspect, in one embodiment, the generating module is configured to calculate an angle between a first component and a second component in the velocity vector; and taking the included angle as the expected heading.
With reference to the second aspect, in one embodiment, the forward-looking image is obtained by controlling the pan-tilt head to rotate as follows: and controlling the pan-tilt head to rotate according to the position difference between the target frame and the center of the forward-looking camera.
In a third aspect, an electronic device comprises: a processor, a memory and a bus, the processor being connected to the memory via the bus, the memory storing computer readable instructions for implementing the method according to the first aspect and any of all embodiments thereof when the computer readable instructions are executed by the processor.
In a fourth aspect, a computer-readable storage medium has stored thereon a computer program which, when executed by a server, implements the method as described in any of the first aspect and all embodiments thereof.
Drawings
Fig. 1 is a flowchart of a method for obstacle avoidance of an unmanned aerial vehicle according to an embodiment of the present disclosure;
fig. 2 is a flow of a following method for an unmanned aerial vehicle according to an embodiment of the present application;
fig. 3 is a flow of an unmanned aerial vehicle obstacle avoidance method shown in the embodiment of the present application;
fig. 4 is an internal structural diagram of an unmanned aerial vehicle obstacle avoidance device shown in the embodiment of the present application;
fig. 5 is an internal structure diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
The method steps in the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The application can be applied to scenarios in which an unmanned aerial vehicle avoids obstacles during flight. As the usage scenarios of unmanned aerial vehicles keep increasing, obstacle avoidance in flight has always been a research focus in the field of unmanned aerial vehicles. The inventor of the application found that omnidirectional obstacle avoidance is currently accomplished by increasing the number of binocular cameras pointing in different directions on the unmanned aerial vehicle, but this method increases the burden on the processor of the unmanned aerial vehicle and also increases the weight of the unmanned aerial vehicle. With the obstacle avoidance method of the embodiments of the application, the desired heading of the unmanned aerial vehicle can be obtained, and the unmanned aerial vehicle is controlled to fly and avoid obstacles according to the desired heading. Imaging is performed with only a single forward-looking camera, so the burden on the unmanned aerial vehicle is reduced, and the reduced number of cameras also saves cost.
In view of the foregoing problems, an embodiment of the present application provides a method, an apparatus, and a storage medium for obstacle avoidance of an unmanned aerial vehicle, where the method includes: acquiring a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle; establishing an environment map from the forward-looking image, wherein the environment map represents the location of objects in the forward-looking image in a real environment; acquiring a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles; generating an expected course according to the safety path; and controlling the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected course, so that the burden of the unmanned aerial vehicle can be reduced, and the omnidirectional obstacle avoidance can be realized.
Fig. 1 illustrates a method for obstacle avoidance by an unmanned aerial vehicle, which is described below. As shown in fig. 1:
and S110, acquiring a forward-looking image shot by the forward-looking camera of the unmanned aerial vehicle.
In one embodiment, a forward-looking image captured by a forward-looking camera of a drone is acquired, wherein the forward-looking camera is located below a fuselage of the drone.
In one embodiment, before the acquiring the forward-looking image captured by the forward-looking camera of the drone, the method further comprises: acquiring the speed direction of the unmanned aerial vehicle; and adjusting the direction of the unmanned aerial vehicle head to be consistent with the speed direction.
Without adding any sensor, the pan-tilt head and the forward-looking camera are arranged below the body of the unmanned aerial vehicle, so that the pan-tilt head can rotate 360 degrees in the horizontal direction. The angle of the pan-tilt head is adjusted so that it always locks onto the target, and the heading of the aircraft is adjusted in real time to be consistent with the flight direction of the unmanned aerial vehicle. The forward-looking camera then captures a forward-looking image of the unmanned aerial vehicle, and the processor of the unmanned aerial vehicle obtains the forward-looking image for the subsequent obstacle avoidance method.
In one embodiment, the forward-looking image is obtained by controlling the pan-tilt head to rotate as follows: the pan-tilt head is rotated according to the position difference between the target frame and the center of the forward-looking camera.
The center of the forward-looking camera is the center of the forward-looking image. The resolution of the forward-looking image is 640 × 480, so the center point is (320, 240). The target frame is described by four values (x, y, w, h), where (x, y) are the pixel coordinates of the target frame in the image and (w, h) are the width and height of the target frame, respectively.
Therefore, the embodiments of the application can control the rotation of the pan-tilt head through the position difference between the target frame and the center of the forward-looking camera, ensuring that the target stays at the center of the field of view of the unmanned aerial vehicle.
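For illustration only, a minimal Python sketch of this control step follows; the proportional gains, the function name and the assumption that (x, y) is the top-left corner of the target frame are not taken from the patent.

```python
# Minimal sketch of pan-tilt control from the target-frame offset (illustrative only).
IMAGE_CENTER = (320, 240)      # center of the 640 x 480 forward-looking image
K_YAW, K_PITCH = 0.05, 0.05    # assumed proportional gains, degrees per pixel

def pan_tilt_rates(target_box):
    """target_box = (x, y, w, h); (x, y) assumed to be the top-left pixel of the frame."""
    x, y, w, h = target_box
    err_x = (x + w / 2.0) - IMAGE_CENTER[0]   # horizontal offset of the target center
    err_y = (y + h / 2.0) - IMAGE_CENTER[1]   # vertical offset of the target center
    # Rotate the pan-tilt head to drive the offset to zero, keeping the target
    # at the center of the field of view.
    return -K_YAW * err_x, -K_PITCH * err_y
```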
The step of acquiring the forward-looking image shot by the forward-looking camera of the unmanned aerial vehicle is described above, and the step of establishing the environment map according to the forward-looking image is described below.
And S120, establishing an environment map according to the forward-looking image.
In one embodiment, an environment map is created from the forward-looking image, wherein the environment map represents the position of objects in the forward-looking image in the real environment.
In one embodiment, said creating an environment map from said forward looking image comprises: and performing coordinate conversion on the pixel position and the depth information of each pixel in the forward-looking image by taking the unmanned aerial vehicle as an origin to obtain the environment map.
The forward-looking image is a forward binocular depth map. The coordinate system of the unmanned aerial vehicle takes the unmanned aerial vehicle as the origin, with geographic north as the X axis and geographic east as the Y axis. Each pixel of the forward-looking image contains position information and depth information, represented as (x, y, depth), where x is the abscissa of the pixel in the forward-looking image, y is the ordinate of the pixel in the forward-looking image, (x, y) is the position of the pixel in the coordinate system, and depth is the distance from the pixel to the forward-looking camera. By coordinate conversion of each pixel position together with its distance to the forward-looking camera, the pixel coordinates are converted into the positions of the corresponding objects in the real world, yielding the environment map. The environment map is represented by a 20 × 5 grid, each cell of which represents a 0.3 m square; in actual operation, a cell containing an obstacle is marked as 1 and a cell without an obstacle is marked as 0.
Therefore, the embodiment of the application converts the forward-looking image into the environment map so as to generate the safe path and the expected heading on the environment map, thereby enabling the unmanned aerial vehicle to fly in an obstacle avoidance manner.
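A rough sketch of this conversion is given below; the pinhole back-projection, the camera intrinsics and the assumption that the camera looks along the X axis are simplifications introduced here and are not the patent's exact procedure.

```python
import numpy as np

# Illustrative sketch: mark occupied cells of an environment map from a depth image.
FX, CX = 400.0, 320.0      # assumed focal length and principal point (pixels)
CELL = 0.3                 # cell size in metres, as in the description

def build_environment_map(depth, rows=20, cols=5):
    """depth: H x W array of distances (m) from each pixel to the forward-looking camera."""
    grid = np.zeros((rows, cols), dtype=np.uint8)   # 0 = free, 1 = obstacle
    h, w = depth.shape
    for v in range(h):
        for u in range(w):
            d = float(depth[v, u])
            if not np.isfinite(d) or d <= 0.0:
                continue
            x = d                          # forward distance, drone at the origin
            y = (u - CX) * d / FX          # lateral offset toward geographic east
            i = int(x / CELL)
            j = int(y / CELL + cols / 2)   # shift so the drone sits mid-grid laterally
            if 0 <= i < rows and 0 <= j < cols:
                grid[i, j] = 1             # cell containing an obstacle part
    return grid
```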
The above describes an embodiment of creating an environment map from a forward-looking image, and the following describes an embodiment of obtaining a safe path from the environment map obtained as described above.
And S130, acquiring a safe path according to the environment map.
In one embodiment, obtaining a safe path from an environment map includes: predicting a plurality of position points of the unmanned aerial vehicle at a plurality of future moments according to a motion strategy and a motion formula corresponding to the motion strategy, wherein the motion strategy comprises: acceleration, uniform speed or deceleration; under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values; selecting a position point corresponding to the loss function value with the minimum value from the plurality of candidate loss function values as a track point on the safe path; and repeating the steps until the termination condition is met to obtain a plurality of track points.
In the embodiments of the application, the path search uses a hybrid A* algorithm. Based on the motion formula corresponding to the motion strategy, p1 = p0 + vt·t + 0.5·a·t², a plurality of position points of the unmanned aerial vehicle at a plurality of future times are predicted, where p1 is the position at the next moment after a time interval t, p0 is the position at the current moment, vt is the speed at the current moment, a is the acceleration under the different motion states, and t is the time step. When the motion strategy is acceleration, the maximum forward acceleration, the current position of the unmanned aerial vehicle, the current speed of the unmanned aerial vehicle and the time step t are used to calculate a plurality of position points spaced by the step t; for example, with a time step of 2 seconds, the predicted position points are the positions of the unmanned aerial vehicle at the 4th, 6th and 8th seconds. When the motion strategy is deceleration, the negative maximum acceleration, the current position of the unmanned aerial vehicle, the current speed of the unmanned aerial vehicle and the time step t are used to calculate a plurality of position points spaced by the step t. When the motion strategy is uniform speed, the acceleration is taken as 0, and the current position, the current speed and the time step t are used to calculate a plurality of position points spaced by the step t. Position points spaced by the step t are thus obtained under the different motion strategies.
After the position points at the next moment corresponding to the acceleration, uniform-speed and deceleration motion strategies are obtained, it is judged whether each position point p1 meets the preset conditions, namely that the point is not inside an obstacle, the speed does not exceed the maximum speed of the unmanned aerial vehicle, and the point is not in an already searched area. If these conditions are met simultaneously, the loss function is calculated for every p1 that meets the preset conditions; the loss function consists of two parts, the energy loss from the point to the starting point and the predicted energy loss from the point to the end point. The p1 with the minimum loss function value is selected, and the above steps are then repeated until the termination condition is met, yielding a plurality of track points. The termination condition is that the distance of p1 from the current position of the unmanned aerial vehicle reaches the set distance, or the number of loop iterations exceeds 200.
The set distance is generally equal to or less than the map range size, and when the map range is 20m × 20m, the set distance is equal to or less than 20m; when the range of the map is 50m × 50m, the set distance is 50m or less, and the embodiment of the present application is not limited to this.
As another embodiment, the next position point of the drone at the future time is predicted according to a motion strategy and a motion formula corresponding to the motion strategy, where the motion strategy includes: acceleration, uniform speed or deceleration; under the condition that the next position point is confirmed to meet the preset condition, calculating a loss function of the next position point to obtain a loss function value, and selecting the position point corresponding to the minimum loss function value as a track point on the safety path; and repeating the steps until the termination condition is met to obtain a plurality of track points.
The path search uses a hybrid A* algorithm. Based on the motion formula corresponding to the motion strategy, p1 = p0 + vt·t + 0.5·a·t², the position point of the unmanned aerial vehicle at the next time step is predicted, where p1 is the position at the next moment after a time interval t, p0 is the position at the current moment, vt is the speed at the current moment, a is the acceleration under the different motion states, and t is the time step. When the motion strategy is acceleration, the maximum forward acceleration, the current position of the unmanned aerial vehicle, the current speed of the unmanned aerial vehicle and the time step t are used to predict the next position point spaced by the step t; for example, with a time step of 2 seconds, the predicted next position point is the position of the unmanned aerial vehicle at the 4th second. When the motion strategy is deceleration, the negative maximum acceleration, the current position, the current speed and the time step t are used to predict the next position point spaced by the step t. When the motion strategy is uniform speed, the acceleration is taken as 0, and the current position, the current speed and the time step t are used to predict the next position point spaced by the step t. The next position point spaced by the step t is thus obtained under the different motion strategies.
After the position point at the next moment corresponding to the acceleration, uniform-speed or deceleration motion strategy is obtained, it is judged whether the position point p1 meets the preset conditions, namely that the point is not inside an obstacle, the speed does not exceed the maximum speed of the unmanned aerial vehicle, and the point is not in an already searched area. If these conditions are met simultaneously, the loss function is calculated for every p1 that meets the preset conditions; the loss function consists of the energy loss from the point to the starting point and the predicted energy loss from the point to the end point. The p1 with the minimum loss function value is selected, and the above steps are then repeated until the termination condition is met, yielding a plurality of track points. The termination condition is that the distance of p1 from the current position of the unmanned aerial vehicle reaches the set distance, or the number of loop iterations exceeds 200.
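The fragment below is a heavily simplified, illustrative sketch of this expansion-and-selection loop; the maximum acceleration, the speed limit, the distance-based loss terms and the is_free() helper are assumptions rather than the patent's hybrid A* implementation.

```python
import math

A_MAX, V_MAX, STEP = 2.0, 10.0, 2.0   # assumed max acceleration (m/s^2), max speed (m/s), time step (s)

def predict(p, v, a, t=STEP):
    """Motion formula p1 = p0 + v*t + 0.5*a*t**2, applied per axis."""
    return tuple(pi + vi * t + 0.5 * ai * t * t for pi, vi, ai in zip(p, v, a))

def search_safe_path(p0, v0, goal, is_free, set_distance=20.0, max_iters=200):
    """is_free(p) should return True when p lies in an obstacle-free cell of the environment map."""
    path, visited = [], set()
    p, v = p0, v0
    for _ in range(max_iters):
        candidates = []
        for ax in (A_MAX, 0.0, -A_MAX):              # acceleration, uniform speed, deceleration
            a = (ax, 0.0)
            p1 = predict(p, v, a)
            v1 = tuple(vi + ai * STEP for vi, ai in zip(v, a))
            # Preset conditions: not inside an obstacle, within the speed limit,
            # and not in an already searched area.
            if is_free(p1) and math.hypot(*v1) <= V_MAX and p1 not in visited:
                # Loss = energy spent from the start + predicted energy to the goal,
                # approximated here by straight-line distances.
                loss = math.dist(p0, p1) + math.dist(p1, goal)
                candidates.append((loss, p1, v1))
        if not candidates:
            break
        _, p, v = min(candidates)                    # keep the minimum-loss point
        visited.add(p)
        path.append(p)
        if math.dist(p0, p) >= set_distance:         # termination: set distance reached
            break
    return path
```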
The above describes how the safe path is obtained from the environment map; the following describes how the desired heading is generated from the obtained safe path.
And S140, generating an expected heading according to the safe path.
In one embodiment, generating the desired heading according to the safe path includes: performing trajectory optimization on the safe path to obtain a path curve function; differentiating the path curve function to obtain a speed curve function; and obtaining the desired heading according to the speed vector in the speed curve function.
Trajectory optimization is performed on the safe path obtained in step S130 to obtain the path curve as a function of time, x(t) = p5·t⁵ + p4·t⁴ + p3·t³ + p2·t² + p1·t + p0, where x(t) is the path curve in the direction of the X axis as a function of time, t is the time, and p0, p1, p2, p3, p4 and p5 are the high-order polynomial coefficients to be optimized. The path curve as a function of time is obtained after optimization. The optimization constraints include the position, speed and acceleration of the starting point, the position, speed and acceleration of the end point, and the positions of the intermediate path points; the variable p is converted into the positions of the path points through a matrix. The number of path points is greater than 5, and the whole path is formed by combining multiple segment functions.
The speed curve function is obtained by differentiating the optimized path curve, and the speed vector is then obtained from the speed curve function.
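For illustration, a small sketch of evaluating one such quintic segment and its derivative follows; the coefficient values are placeholders rather than results of the optimization described above.

```python
# Illustrative sketch: one quintic path segment x(t) = p5*t^5 + ... + p1*t + p0
# and its time derivative, which gives the speed curve for that axis.
coeffs = [0.0, 1.0, 0.5, -0.1, 0.01, -0.001]   # placeholder p0 .. p5

def path_x(t, p=coeffs):
    return sum(pk * t ** k for k, pk in enumerate(p))

def speed_x(t, p=coeffs):
    # d/dt of the quintic: p1 + 2*p2*t + 3*p3*t^2 + 4*p4*t^3 + 5*p5*t^4
    return sum(k * pk * t ** (k - 1) for k, pk in enumerate(p) if k > 0)

# Evaluating the same form for each axis at a given time and stacking the
# derivatives yields the speed vector (vx, vy, vz) used for the desired heading.
vx = speed_x(2.0)
```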
In one embodiment, the obtaining the desired heading from the velocity vector in the velocity profile function includes: calculating an included angle between a first component and a second component in the velocity vector; and taking the included angle as the expected heading.
The desired heading is obtained from the speed curve by taking the included angle between the first component and the second component of the speed vector as the desired heading. As an example, if the velocity vector is (vx, vy, vz), the desired heading is yaw_sp = atan(vy/vx).
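A minimal sketch of this computation follows; using atan2 instead of the plain atan written in the description is an assumption made here so that the result stays well defined when vx is zero or negative.

```python
import math

def desired_heading(vx, vy):
    # Angle between the first (vx) and second (vy) components of the velocity vector;
    # the description writes this as yaw_sp = atan(vy / vx).
    return math.atan2(vy, vx)

yaw_sp = desired_heading(3.0, 1.5)   # heading in radians
```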
The above describes a specific embodiment of generating a desired heading according to a safe path, and the following describes a specific embodiment of controlling an unmanned aerial vehicle to fly in an obstacle avoidance manner according to the desired heading.
And S150, controlling the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected course.
And the unmanned aerial vehicle carries out obstacle avoidance flight according to the expected course obtained by the method.
Therefore, with the obstacle avoidance method of the embodiments of the application, the desired heading of the unmanned aerial vehicle can be obtained, and the unmanned aerial vehicle is controlled to fly and avoid obstacles according to the desired heading. Imaging is performed with only one forward-looking camera, so the burden on the unmanned aerial vehicle is reduced, and the reduced number of cameras also saves cost.
A method for obstacle avoidance by an unmanned aerial vehicle is described above; a specific embodiment of the method is described below with reference to fig. 2 and fig. 3.
As shown in fig. 2, fig. 2 shows the following process of the unmanned aerial vehicle in an implementation of the application. The unmanned aerial vehicle first acquires a target frame 210 and then calculates the deviation 220 from the center point. The center of the forward-looking camera is the center of the forward-looking image; the resolution of the forward-looking image is 640 × 480, so the center point is (320, 240). The target frame is described by four values (x, y, w, h), where (x, y) are the pixel coordinates of the target frame in the image and (w, h) are its width and height, respectively. The pan-tilt head is controlled to rotate so as to eliminate the position difference between the pixel coordinates of the target frame and the center coordinates of the forward-looking camera, thereby ensuring that the target stays at the center of the field of view of the unmanned aerial vehicle and achieving pan-tilt following 230.
As shown in fig. 3, fig. 3 shows the obstacle avoidance process of the unmanned aerial vehicle in the embodiment of the application. The speed direction 310 of the unmanned aerial vehicle is first obtained; it is calculated by the navigation module, and the velocity vector itself contains both the speed magnitude and the speed direction. The sensors include an accelerometer, a gyroscope, a GPS, a forward-looking camera, a downward-looking camera and the like. The heading 320 is adjusted according to the obtained speed direction, so that the nose direction of the unmanned aerial vehicle is consistent with its speed direction. After the unmanned aerial vehicle has adjusted its heading, a forward-looking image 330 is obtained and an environment map 340 is built from it. The forward-looking image is a forward binocular depth map. The coordinate system of the unmanned aerial vehicle takes the unmanned aerial vehicle as the origin, with geographic north as the X axis and geographic east as the Y axis. Each pixel of the forward-looking image contains position information and depth information, represented as (x, y, depth), where x is the abscissa of the pixel in the forward-looking image, y is the ordinate, (x, y) is the position of the pixel in the coordinate system, and depth is the distance from the pixel to the forward-looking camera. By coordinate conversion of each pixel position together with its distance to the forward-looking camera, the pixel coordinates are converted into the positions of the corresponding objects in the real world, yielding the environment map. The environment map is represented by a 20 × 5 grid, each cell of which represents a 0.3 m square; in actual operation, a cell containing an obstacle is marked as 1 and a cell without an obstacle is marked as 0.
After the environment map has been built, the safe path 350 is planned. In the embodiment of the application, a hybrid A* algorithm is used to search the path. According to the motion formula corresponding to the motion strategy, p1 = p0 + vt·t + 0.5·a·t², a plurality of position points of the unmanned aerial vehicle at a plurality of future times are predicted, where p1 is the position at the next moment after a time interval t, p0 is the position at the current moment, vt is the speed at the current moment, a is the acceleration under the different motion states, and t is the time step. With the current motion strategy being acceleration, the maximum forward acceleration, the current position of the unmanned aerial vehicle, the current speed of the unmanned aerial vehicle and a time step of 2 seconds are used to predict the positions of the unmanned aerial vehicle at the 4th, 6th and 8th seconds, so that position points spaced 2 seconds apart are obtained under the different motion strategies.
After the position points at the next moment corresponding to the acceleration motion strategy are obtained, it is judged whether each position point p1 meets the preset conditions, namely that the point is not inside an obstacle, the speed does not exceed the maximum speed of the unmanned aerial vehicle, and the point is not in an already searched area. If these conditions are met simultaneously, the loss function is calculated for every p1 that meets the preset conditions; the loss function consists of the energy loss from the point to the starting point and the predicted energy loss from the point to the end point. The p1 with the minimum loss function value is selected, and the above steps are then repeated until the termination condition is met, yielding a plurality of track points. The termination condition is that the distance of p1 from the current position of the unmanned aerial vehicle reaches 20 m, or the number of loop iterations exceeds 200.
After the safe path has been planned, the speed curve 360 starts to be acquired. Trajectory optimization is performed on the obtained safe path to obtain the path curve as a function of time:
x(t) = p5·t⁵ + p4·t⁴ + p3·t³ + p2·t² + p1·t + p0
The speed curve function is obtained by differentiating the optimized path curve, and the speed vector is then obtained from the speed curve function.
After the speed curve has been obtained, the unmanned aerial vehicle generates the desired heading 370. The angle between the first component and the second component of the velocity vector is taken as the desired heading: with the velocity vector (vx, vy, vz), the desired heading is yaw_sp = atan(vy/vx). The unmanned aerial vehicle then performs obstacle avoidance flight according to the desired heading obtained in this way.
The foregoing describes a specific embodiment of a method for avoiding an obstacle for an unmanned aerial vehicle, and the following describes an apparatus for avoiding an obstacle for an unmanned aerial vehicle.
As shown in fig. 4, an apparatus 400 for avoiding obstacles for an unmanned aerial vehicle includes: an acquisition module 410, a setup module 420, a generation module 430, and a control module 440.
In one embodiment, an apparatus for obstacle avoidance for a drone, the apparatus comprising: the acquisition module is configured to acquire a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle; an establishing module configured to establish an environment map from the forward-looking image, wherein the environment map represents a location of an object in the forward-looking image in a real environment; the obtaining module is configured to obtain a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles; a generating module configured to generate a desired heading from the secure path; and the control module is configured to control the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected heading.
In one embodiment, the obtaining means is configured to obtain a speed direction of the drone; and adjusting the direction of the unmanned aerial vehicle head to be consistent with the speed direction.
In one embodiment, the establishing module is configured to perform coordinate transformation on the pixel position and the depth information of each pixel in the forward-looking image with the unmanned aerial vehicle as an origin to obtain the environment map.
In one embodiment, the generating module is configured to perform trajectory optimization on the safety path to obtain a path curve function; carrying out derivation on the path curve function to obtain a speed curve function; and obtaining the expected heading according to the speed vector in the speed curve function.
In one embodiment, the obtaining module is configured to predict a plurality of location points of the drone at a plurality of time instants in the future according to a motion policy and a motion formula corresponding to the motion policy, wherein the motion policy includes: acceleration, uniform speed or deceleration; under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values; selecting a position point corresponding to the loss function value with the minimum value from the candidate loss function values as a track point on the safe path; the steps in the above embodiment are repeated until the termination condition is met, resulting in multiple trace points.
In one embodiment, the generating module is configured to calculate an angle between a first component and a second component in the velocity vector; and taking the included angle as the expected heading.
In one embodiment, the forward-looking image is obtained by controlling the pan-tilt head to rotate as follows: and controlling the pan-tilt head to rotate according to the position difference between the target frame and the center of the forward-looking camera.
In the embodiments of the present application, the modules shown in fig. 4 can implement each process in the method embodiments of fig. 1, fig. 2 and fig. 3. The operations and/or functions of the respective modules in fig. 4 implement the corresponding flows in the method embodiments of fig. 1, 2 and 3. Reference may be made to the description of the above method embodiments; a detailed description is omitted here to avoid redundancy.
As shown in fig. 5, an embodiment of the present application provides an electronic device 500, including: a processor 510, a memory 520 and a bus 530, the processor being connected to the memory via the bus, the memory storing computer readable instructions for implementing the method according to any one of the above embodiments when the computer readable instructions are executed by the processor, and in particular, refer to the description of the above method embodiments, and the detailed description is omitted here as appropriate to avoid redundancy.
Wherein the bus is used for realizing direct connection communication of the components. The processor in the embodiments of the present application may be an integrated circuit chip having signal processing capability. The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The various methods, steps, and logic blocks disclosed in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The Memory may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read Only Memory (PROM), an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), and the like. The memory stores computer readable instructions that, when executed by the processor, perform the methods described in the embodiments above.
It will be appreciated that the configuration shown in fig. 5 is merely illustrative and may include more or fewer components than shown in fig. 5 or have a different configuration than shown in fig. 5. The components shown in fig. 5 may be implemented in hardware, software, or a combination thereof.
Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a server, the method in any of the foregoing embodiments is implemented, which can be specifically referred to the description in the foregoing method embodiments, and in order to avoid repetition, detailed description is appropriately omitted here.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A method for obstacle avoidance of an unmanned aerial vehicle is characterized by comprising the following steps:
acquiring a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle;
establishing an environment map from the forward-looking image, wherein the environment map represents the location of objects in the forward-looking image in a real environment;
acquiring a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles;
generating an expected course according to the safe path;
controlling the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected course;
wherein, the obtaining of the safe path according to the environment map comprises:
repeatedly executing the following steps until the termination condition is met to obtain a plurality of track points;
predicting a plurality of location points of the drone at a plurality of times in the future according to a motion policy and a motion formula corresponding to the motion policy, wherein the motion policy comprises: acceleration, uniform speed or deceleration;
under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values; and
selecting a position point corresponding to the loss function value with the minimum value from the plurality of candidate loss function values as a track point on the safe path.
2. The method of claim 1, wherein prior to said obtaining a forward looking image taken by a forward looking camera of the drone, the method further comprises:
acquiring the speed direction of the unmanned aerial vehicle;
and adjusting the direction of the unmanned aerial vehicle head to be consistent with the speed direction.
3. The method of claim 1, wherein said building an environment map from said forward looking images comprises:
and performing coordinate conversion on the pixel position and the depth information of each pixel in the forward-looking image by taking the unmanned aerial vehicle as an origin to obtain the environment map.
4. The method of claim 1, wherein generating a desired heading from the safe path comprises:
carrying out trajectory optimization on the safe path to obtain a path curve function;
carrying out derivation on the path curve function to obtain a speed curve function;
and obtaining the expected heading according to the speed vector in the speed curve function.
5. The method of claim 4, wherein obtaining the desired heading from the velocity vector in the velocity profile function comprises:
calculating an included angle between a first component and a second component in the velocity vector;
and taking the included angle as the expected heading.
6. The method according to claim 1, wherein the forward-looking image is obtained by controlling pan-tilt rotation as follows:
and controlling the rotation of the holder according to the position difference between the target frame and the center of the front-looking camera.
7. An apparatus for obstacle avoidance of an unmanned aerial vehicle, characterized in that the apparatus comprises:
the acquisition module is configured to acquire a forward-looking image shot by a forward-looking camera of the unmanned aerial vehicle, wherein the forward-looking camera is positioned below a body of the unmanned aerial vehicle;
an establishing module configured to establish an environment map from the forward-looking image, wherein the environment map represents a location of an object in the forward-looking image in a real environment;
the obtaining module is configured to obtain a safe path according to the environment map, wherein each position point on the safe path is in an area without obstacles;
a generating module configured to generate a desired heading from the safe path;
the control module is configured to control the unmanned aerial vehicle to carry out obstacle avoidance flight according to the expected heading;
wherein the acquisition module is further configured to:
repeating the following steps until the termination condition is met to obtain a plurality of track points;
predicting a plurality of position points of the unmanned aerial vehicle at a plurality of future moments according to a motion strategy and a motion formula corresponding to the motion strategy, wherein the motion strategy comprises: acceleration, uniform speed or deceleration;
under the condition that at least part of the position points in the plurality of position points are confirmed to meet preset conditions, calculating loss functions of the at least part of the position points, obtaining loss function values corresponding to each point in the at least part of the position points, and obtaining a plurality of candidate loss function values;
and selecting a position point corresponding to the loss function value with the minimum value from the plurality of candidate loss function values as a track point on the safe path.
8. An electronic device, comprising: a processor, a memory and a bus, the processor being connected to the memory through the bus, the memory storing computer readable instructions for implementing the method of any one of claims 1 to 6 when the computer readable instructions are executed by the processor.
9. A computer-readable storage medium, having stored thereon a computer program which, when executed by a server, implements the method of any one of claims 1 to 6.
CN202110262530.3A 2021-03-10 2021-03-10 Unmanned aerial vehicle obstacle avoidance method and device and storage medium Active CN112947570B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110262530.3A CN112947570B (en) 2021-03-10 2021-03-10 Unmanned aerial vehicle obstacle avoidance method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110262530.3A CN112947570B (en) 2021-03-10 2021-03-10 Unmanned aerial vehicle obstacle avoidance method and device and storage medium

Publications (2)

Publication Number Publication Date
CN112947570A (en) 2021-06-11
CN112947570B (en) 2022-11-11

Family

ID=76229182

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110262530.3A Active CN112947570B (en) 2021-03-10 2021-03-10 Unmanned aerial vehicle obstacle avoidance method and device and storage medium

Country Status (1)

Country Link
CN (1) CN112947570B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115861859A (en) * 2023-02-20 2023-03-28 中国科学院东北地理与农业生态研究所 Slope farmland environment monitoring method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108227738B (en) * 2017-12-28 2019-07-19 湖北电鹰科技有限公司 A kind of unmanned plane barrier-avoiding method and system
CN109634304B (en) * 2018-12-13 2022-07-15 中国科学院自动化研究所南京人工智能芯片创新研究院 Unmanned aerial vehicle flight path planning method and device and storage medium
CN109407705A (en) * 2018-12-14 2019-03-01 厦门理工学院 A kind of method, apparatus, equipment and the storage medium of unmanned plane avoiding barrier
CN110262568B (en) * 2019-07-19 2021-10-22 深圳市道通智能航空技术股份有限公司 Unmanned aerial vehicle obstacle avoidance method and device based on target tracking and unmanned aerial vehicle
CN112286228A (en) * 2020-12-01 2021-01-29 深圳高度创新技术有限公司 Unmanned aerial vehicle three-dimensional visual obstacle avoidance method and system

Also Published As

Publication number Publication date
CN112947570A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
US11863857B2 (en) Photographing control method, apparatus, and control device
CN111932588B (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
CN114679540A (en) Shooting method and unmanned aerial vehicle
US20180022472A1 (en) Autonomous system for taking moving images from a drone, with target tracking and improved target location
US20230215024A1 (en) Position estimation method and apparatus for tracking target, and unmanned aerial vehicle
CN105717933A (en) Unmanned aerial vehicle and unmanned aerial vehicle anti-collision method
US11320269B2 (en) Information processing apparatus, information processing method, and information processing program
US20210325503A1 (en) Relay point generation method and apparatus, and unmanned aerial vehicle
CN106054924A (en) Unmanned aerial vehicle accompanying method, unmanned aerial vehicle accompanying device and unmanned aerial vehicle accompanying system
CN111813148B (en) Unmanned aerial vehicle landing method, system, equipment and storage medium
CN114723955A (en) Image processing method, device, equipment and computer readable storage medium
CN112947570B (en) Unmanned aerial vehicle obstacle avoidance method and device and storage medium
CN113875222B (en) Shooting control method and device, unmanned aerial vehicle and computer readable storage medium
Wang et al. A visual navigation framework for the aerial recovery of UAVs
CN113741495B (en) Unmanned aerial vehicle attitude adjustment method and device, computer equipment and storage medium
CN111665870A (en) Trajectory tracking method and unmanned aerial vehicle
CN110879607A (en) Offshore wind power blade detection method based on multi-unmanned aerial vehicle formation cooperative detection
CN117058209B (en) Method for calculating depth information of visual image of aerocar based on three-dimensional map
CN105930766A (en) Unmanned plane
Zhou et al. Vision-based navigation of uav with continuous action space using deep reinforcement learning
Qin et al. Visual-based tracking and control algorithm design for quadcopter UAV
CN108733076B (en) Method and device for grabbing target object by unmanned aerial vehicle and electronic equipment
CN106845394A (en) A kind of photoelectronic reconnaissance platform and its target tracking method
CN116339321A (en) Global information driven distributed multi-robot reinforcement learning formation surrounding method based on 5G communication
WO2021238743A1 (en) Flight control method and apparatus for unmanned aerial vehicle, and unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant