CN113900443A - Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision - Google Patents


Info

Publication number
CN113900443A
CN113900443A (application number CN202111142544.8A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
early warning
probability
obstacle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111142544.8A
Other languages
Chinese (zh)
Other versions
CN113900443B (en)
Inventor
周丽华
吴帆
方素平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN202111142544.8A priority Critical patent/CN113900443B/en
Publication of CN113900443A publication Critical patent/CN113900443A/en
Application granted granted Critical
Publication of CN113900443B publication Critical patent/CN113900443B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The invention discloses an unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision. The early warning method comprises the following steps: firstly, segmenting the outer contour line of an obstacle, projecting and transforming it onto a normalized imaging plane, selecting the four points with the minimum and maximum horizontal and vertical pixel coordinates on the normalized contour as the four feature points of the obstacle, and determining the relative position relationship between the unmanned aerial vehicle and the obstacle; calculating the position coordinates of the original images of the four feature points in the body coordinate system of the unmanned aerial vehicle; determining the value range of the true heading angle, then determining a safe distance, and calculating the high-probability collision region of the unmanned aerial vehicle; judging whether the two feature line segments have points in the high-probability collision region, calculating the center-distance value, and computing the early warning probability; and warning the unmanned aerial vehicle according to the early warning probability. The method can reduce the probability of colliding with an obstacle during flight, can serve as a reference for automatic cruise of unmanned aerial vehicles, and fills a gap in the prior art.

Description

Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision
Technical Field
The invention relates to a visual ranging method in the technical field of visual ranging, in particular to an unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision, and further relates to an early warning device.
Background
Current unmanned aerial vehicle obstacle avoidance systems generally follow one of two schemes: distance sensing or visual sensing. Distance-sensing schemes mount distance sensors facing forward or in four directions; their working field of view is small, and they cannot reliably predict and warn about special surfaces, thin rod-shaped objects, and the like. Non-simplified binocular vision schemes have a high computational cost and are not suitable when real-time requirements are strict. State information such as the spatial flight trajectory of the drone can be obtained by combining onboard sensors such as a GPS, a barometer, a gyroscope and an accelerometer, but this does not solve the problem of locating surrounding objects. Anti-collision early warning systems based on satellite imagery and trajectory prediction cannot handle obstacle height information or small ground obstacles.
Disclosure of Invention
In order to solve the technical problem that the existing unmanned aerial vehicle obstacle avoidance system cannot comprehensively and accurately position surrounding objects, the invention provides an unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision.
The invention is realized by adopting the following technical scheme: an unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision comprises the following steps:
s1: firstly, segmenting an outer contour line of the same obstacle from a video image shot by an unmanned aerial vehicle, projecting and transforming the outer contour line to a normalized imaging plane to obtain a normalized outer contour line, selecting four points with minimum and maximum horizontal and vertical coordinates of pixels in the normalized outer contour line as four feature points of the obstacle, and finally determining the relative position relationship between the unmanned aerial vehicle and the obstacle, the probability of collision of linear braking and the priority of bypassing the obstacle leftwards, rightwards, upwards and downwards according to the four feature points;
s2: taking tangent points of the four feature point image points and a viewing cone in the left/right/up/down directions as the structured representation of the barrier, substituting one-dimensional information of the four feature point image points into a binocular vision model, and calculating the position coordinates of the four feature point original images in a body coordinate system of the unmanned aerial vehicle;
s3: firstly, determining the value range of a real value of a course angle of the unmanned aerial vehicle, then determining the navigation safety distance of the unmanned aerial vehicle, and finally calculating the probable collision area of the unmanned aerial vehicle according to the value range of the real value of the course angle and the navigation safety distance;
s4: judging whether two feature line segments formed by the four feature points have points falling in the high-probability collision region or not according to the position coordinates of the four feature point original images in the fuselage coordinate system, calculating a central distance value of a point which is in the high-probability collision region and is closest to the circle center of the high-probability collision region, and finally calculating an early warning probability according to the central distance value; and
s5: and according to the early warning probability, early warning is carried out on the unmanned aerial vehicle.
According to the invention, the position of the obstacle shot by the mobile camera relative to the fuselage is calculated through the simplified binocular vision model, the collision probability and the avoiding direction are predicted by combining the approximate collision area of the unmanned aerial vehicle, the obstacle avoidance prompting and accurate early warning functions are provided for the safe flight of the unmanned aerial vehicle, the technical problem that the unmanned aerial vehicle obstacle avoidance system cannot comprehensively and accurately position the surrounding objects is solved, the phenomena of collision of the obstacle and the like in the navigation process of the unmanned aerial vehicle can be prevented, and the technical effect of reducing the damage probability of the unmanned aerial vehicle is achieved.
As a further improvement of the above scheme, in step S5, a low-risk early warning probability and a high-risk early warning probability are preset according to actual flight data of the unmanned aerial vehicle, and then the early warning probabilities are compared with the low-risk early warning probability and the high-risk early warning probability, respectively, so as to obtain corresponding early warning levels according to a preset scheme.
As a further improvement of the above solution, in step S1, the outer contour line is an outer contour line obtained by segmenting the same obstacle in the t-1 frame from the t-th frame image of the video image through front and background separation, and the normalized imaging plane is an imaging plane having an optical axis parallel to the forward direction of the drone and a column direction parallel to the direction of the rotation axis of the rotor of the drone.
As a further improvement of the above scheme, the four feature points are defined as feature points l, r, u, and d, and the three-dimensional directions of the fuselage coordinate system are X, Y, Z; the characteristic points are imaged in the airborne video frame at the time t-1 and the time t respectively, and the pixel coordinates of the image points and the track estimated values of the unmanned aerial vehicle at the time t-1 and the time t are estimated; the relationship of the position coordinates of the feature point primary image in the machine body coordinate system is as follows:
(the two relations are given in the original as formula images and are not reproduced here)
wherein m and n each stand for one of x, y, z, and p denotes the feature point; Δx′_{t-1}, Δy′_{t-1}, Δz′_{t-1} are the body-frame coordinates of the displacement from time t-1 to time t; u_{p1}, u_{p2} are the displacements of the image points relative to the principal point of the imaging plane; and f is the camera image-plane distance (focal length, in pixel units).
As a further improvement of the above scheme, in step S3, Δθ_t = ξ·(θ_{t-1,measured} − θ_{t-1,calculated}) is taken as the estimated limit error value of the yaw heading angle at time t, and the corresponding expression for the pitch heading angle (given in the original as a formula image) as its estimated limit error value, where ξ is an error coefficient.
As a further improvement of the above scheme, in step S3, the maximum braking deceleration of the vehicle is defined as a_max and the drone speed as v_t; the minimum safe running distance L_min of the vehicle is calculated as:
L_min = v_t^2 / (2·a_max)
Defining the braking reaction time as T_0, the reaction distance L_0 is:
L_0 = v_t·T_0
The driving safety distance is:
L_s = η·(L_0 + L_min)
wherein L_s is the safe driving distance and η is a safety factor.
As a further improvement of the above scheme, the value range of the true heading angle is determined from the difference between the direct observation given by the drone's gyroscope at time t-1 and the displacement-based value calculated from the GPS and the drone's barometer; wherein, in the body coordinate system, sectors are defined on the XZ plane and on the YZ plane with center (0,0), radius L_s, symmetric about the Z_t axis, with angular extent 2Δθ_t (and the corresponding pitch-angle extent, given in the original as a formula image); these sectors are the horizontal and vertical high-probability collision zones.
As a further improvement of the above scheme, the positions of the original image points L, R, U, D of the feature points in the body coordinate system-t are defined as L(x_L, y_L, z_L), R(x_R, y_R, z_R), U(x_U, y_U, z_U), D(x_D, y_D, z_D); in step S4, the method for calculating the early warning probability comprises the following steps:
if the high-probability collision region does not intersect the line segment LR, the collision probability of the unmanned aerial vehicle is 0;
if the high-probability collision region intersects LR, the center distance Ro of the intersection point closest to the sector center is calculated, and the early warning probability P is then computed from Ro (both expressions are given in the original as formula images and are not reproduced here).
As a further improvement of the above solution, in step S5, a low-risk early warning probability P_0 and a high-risk early warning probability P_1 are defined;
if P ≤ P_0, no collision prompt is given and normal flight is indicated;
if P_0 < P ≤ P_1, a low-level risk collision prompt is given, cautious flight is indicated, and an obstacle avoidance flight direction is suggested: the system simulates a detour trajectory from the current speed and the maximum steering acceleration and issues an advance obstacle avoidance suggestion according to how close the obstacle feature line segment is to that trajectory;
if P > P_1, a high-level risk collision prompt is given; emergency braking is prompted when the protection switch is off, and the aircraft is controlled to brake immediately when the protection switch is on.
The invention also provides an early warning device, which applies any one of the above unmanned aerial vehicle obstacle avoidance early warning methods based on binocular vision, and the method comprises the following steps:
the camera shooting unit is used for continuously imaging a range right in front of the unmanned aerial vehicle to obtain observation data of the front running state of the unmanned aerial vehicle;
the logic synthesis unit is used for firstly segmenting an outer contour line of the same obstacle from a video image shot by an unmanned aerial vehicle, projecting and transforming the outer contour line to a normalized imaging plane to obtain a normalized outer contour line, then selecting four points with minimum and maximum horizontal and vertical coordinates of pixels in the normalized outer contour line as four feature points of the obstacle, and finally determining the relative position relationship between the unmanned aerial vehicle and the obstacle, the probability of collision of linear braking and the priority of bypassing the obstacle leftwards, rightwards, upwards and downwards according to the four feature points; the logic synthesis unit is also used for substituting one-dimensional information of the four feature point image points into a binocular vision model by taking tangent points of the view cones in four directions of left/right/up/down as the structural representation of the barrier, and calculating the position coordinates of the four feature point original images in the body coordinate system of the unmanned aerial vehicle; the logic synthesis unit is further used for determining a value range of a real heading angle value of the unmanned aerial vehicle, determining a safe navigation distance of the unmanned aerial vehicle, and calculating an approximate probability collision area of the unmanned aerial vehicle according to the value range of the real heading angle value and the safe navigation distance; the logic synthesis unit is further used for judging whether two feature line segments formed by the four feature points are in the high-probability collision region or not according to the position coordinates of the four feature point primary images in the fuselage coordinate system, calculating a circle center distance value of a point which is in the high-probability collision region and is closest to the circle center of the high-probability collision region, and finally calculating an early warning probability according to the circle center distance value; and
and the early warning prompting unit is used for early warning the unmanned aerial vehicle according to the early warning probability.
The unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision have the following beneficial effects:
1. The binocular-vision-based unmanned aerial vehicle obstacle avoidance early warning method combines the flight state parameters of the drone monitored by the onboard sensors, determines the position of small obstacles through image feature extraction and projection transformation, analyzes the possibility of a collision, and provides an accurate early warning function for safe drone flight.
2. The method can reduce the probability of the drone colliding with an obstacle during flight, can serve as a reference for automatic cruise of drones, and fills a gap in the prior art.
3. The method facilitates the relative judgment of obstacles and improves judgment accuracy; the drone danger region is calculated from the vehicle safety distance and the predicted heading-angle error, giving high judgment accuracy.
4. The beneficial effects of the early warning device are the same as those of the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision, and are not repeated here.
Drawings
Fig. 1 is a flowchart of an unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention;
fig. 2 is a schematic diagram of a process of preprocessing barrier feature points in the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention; extracting the outline of an original image (a), normalizing and projecting (c), selecting characteristic points (d), and performing one-dimensional processing (e);
fig. 3 is a schematic diagram of a process of shooting an obstacle by a camera during fixed-height flight in the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention;
fig. 4 is a simplified diagram of an object point projection on an XZ plane in the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention (a camera at time t-1 has normalized with reference to a posture at time t);
fig. 5 is a schematic diagram illustrating a relationship between an object point P and an image point thereof and a motion trajectory of an unmanned aerial vehicle in the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention;
fig. 6 is a schematic diagram of approximate probability collision area estimation and collision probability calculation in the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1 of the present invention;
fig. 7 is a schematic view of the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision according to the embodiment 1 of the present invention, along the left/right/up/down obstacle avoidance flight direction;
fig. 8 is a block diagram of the early warning apparatus in embodiment 2 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example 1
Referring to fig. 1 to 7, the present embodiment provides an unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision. The early warning method comprises the steps of firstly extracting historical values and current values of flight state data from a storage unit, and determining the relation between a fuselage coordinate system and a geodetic coordinate system. Meanwhile, the pixel coordinate values of the feature points of the obstacles are calculated from the corresponding frame (t) of the front video camera of the unmanned aerial vehicle. And then, combining the pixel coordinates of the characteristic points in the last frame (t-1), and calculating and solving the orientation of the obstacle relative to the fuselage through a binocular vision model. And calculating the safe distance of the unmanned aerial vehicle according to the current flight data of the unmanned aerial vehicle. And calculating the dangerous area and the collision probability of the unmanned aerial vehicle according to the safe distance, the direction angle measurement value, the direction angle error and the position of the obstacle, and issuing early warning information. In the embodiment, the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision is mainly realized by the following steps, specifically steps S1-S5.
Step S1: the method comprises the steps of firstly segmenting outer contour lines of the same obstacle from a video image shot by an unmanned aerial vehicle, projecting and transforming the outer contour lines to a normalized imaging plane to obtain normalized outer contour lines, then selecting four points with minimum and maximum horizontal and vertical coordinates of pixels in the normalized outer contour lines as four feature points of the obstacle, and finally determining the relative position relation between the unmanned aerial vehicle and the obstacle, the collision probability of linear braking and the priority of bypassing the obstacle leftwards, rightwards, upwards and downwards according to the four feature points.
The steps above define the feature points of the obstacle. To obtain feature points that represent the spatial geometric characteristics of the obstacle, the outer contour line of the same obstacle in frame t-1 is first segmented from the t-th frame of the video by a foreground/background separation algorithm, and the contour is projected and transformed onto a normalized imaging plane whose optical axis is parallel to the forward direction of the drone and whose column direction is parallel to the rotation axis of the rotor, giving the processed outer contour line. Then, to keep the computation real-time, the model is simplified: only the four points l(u_l, v_l), r(u_r, v_r), u(u_u, v_u), d(u_d, v_d) with the minimum and maximum horizontal and vertical pixel coordinates in the contour are selected as feature points of the obstacle. The changes of the original image points of l, r, u, d between time t-Δt and time t in the depth direction and the direction of interest are ignored, and the relative position relationship between the obstacle and the drone, the probability of collision under straight-line braking, and the priority of bypassing the obstacle by deflecting left/right/up/down are discussed only in two two-dimensional planes (the planes in which the drone performs pitching motion and yawing motion). Fig. 2 illustrates the image preprocessing process, and fig. 3 illustrates the two-dimensional treatment and the feasibility of normalized projection for two images of the same obstacle taken before and after camera motion during fixed-height flight.
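As an illustration only, the following sketch shows how the contour segmentation, normalized projection and extreme-point selection of this preprocessing could be implemented, assuming OpenCV is used for the foreground/background separation; the function and variable names (obstacle_feature_points, back_sub, R_norm) are illustrative and are not taken from the patent.

import cv2
import numpy as np

def obstacle_feature_points(frame, back_sub, K, R_norm):
    # frame    : current video frame (BGR image)
    # back_sub : cv2 background subtractor used for foreground/background separation
    # K        : 3x3 camera intrinsic matrix
    # R_norm   : rotation aligning the optical axis with the drone's forward direction
    #            (normalized imaging plane); assumed known from the gimbal attitude
    mask = back_sub.apply(frame)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(np.float64)
    # Re-project the contour onto the normalized imaging plane (pure-rotation homography).
    H = K @ R_norm @ np.linalg.inv(K)
    pts = cv2.perspectiveTransform(contour.reshape(-1, 1, 2), H).reshape(-1, 2)
    # Four extreme points: min/max horizontal (u) and vertical (v) pixel coordinates.
    l = pts[np.argmin(pts[:, 0])]   # leftmost
    r = pts[np.argmax(pts[:, 0])]   # rightmost
    u = pts[np.argmin(pts[:, 1])]   # topmost
    d = pts[np.argmax(pts[:, 1])]   # bottommost
    return l, r, u, d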
Step S2: and (3) taking tangent points of the view cones in the left/right/up/down directions as the structured representation of the obstacle, substituting the one-dimensional information of the image points of the four characteristic points into a binocular vision model, and calculating the position coordinates of the original images of the four characteristic points in a body coordinate system of the unmanned aerial vehicle. In this embodiment, up, down, left, right four-way translations are used to avoid obstacles, so the tangency points with the viewing cone in four directions are chosen as the structured representation of the obstacle, which simplification requires that the obstacle size is one order of magnitude smaller than the depth distance or that the obstacle satisfies the planar surface assumption. Four feature points are defined as feature points l, r, u and d respectively, and the three-dimensional directions of the fuselage coordinate system are X, Y, Z respectively. The characteristic points are imaged in the airborne video frame at the time t-1 and the time t respectively, and the pixel coordinates of the image points of the characteristic points and the track estimated values of the unmanned aerial vehicle at the time t-1 and the time t are estimated; the relationship of the position coordinates of the characteristic point primary image in the fuselage coordinate system is as follows:
(the two relations are given in the original as formula images and are not reproduced here)
wherein m and n each stand for one of x, y, z, and p denotes the feature point; Δx′_{t-1}, Δy′_{t-1}, Δz′_{t-1} are the body-frame coordinates of the displacement from time t-1 to time t; u_{p1}, u_{p2} are the displacements of the image points relative to the principal point of the imaging plane; and f is the camera image-plane distance (focal length, in pixel units).
Fig. 4 shows a simplified process of three-dimensional space projection, that is, when only the XZ direction is considered, the central projection model is still satisfied between the object images, so that only one-dimensional information of the image points of the feature points is used to substitute for the binocular vision model for calculation. The static point P is imaged in the airborne video frame at the moment t-1 and the moment t respectively, and the displacement of the point P at the moment t relative to the optical center of the camera or the geometric center of the unmanned aerial vehicle and the coordinate value (the airframe coordinate system-t) of the point P at the moment t can be calculated according to the pixel coordinate of the image point and the track estimation value of the unmanned aerial vehicle at the moment t-1 and the moment t. The geometrical positional relationship in the XZ direction is shown in FIG. 5 (the same applies to the YZ direction).
Fig. 5 uses the body coordinate system at time t. Δx′_{t-1}, Δy′_{t-1}, Δz′_{t-1} are the body-frame (system-t) coordinates of the displacement from time t-1 to time t, u_{p1}, u_{p2} are the displacements (in pixel lengths) of the image points relative to the principal point of the imaging plane, and f is the camera image-plane distance (in pixel lengths). The relationship between the body-frame-t coordinates of point P and the pixel coordinates of its image points p_1, p_2 follows directly (the two equations are given in the original as formula images and are not reproduced here).
the x and z coordinates of the point P in the machine body coordinate system-t can be obtained through the formula, and the y and z coordinates of the point P can be obtained by performing related calculation on a YZ plane in the same way.
By applying the above calculation to the one-dimensional feature points l, r, u, d, the positions of the points L, R, U, D (i.e. the two planes by which the obstacle is abstracted) in the body coordinate system-t are obtained: L(x_L, y_L, z_L), R(x_R, y_R, z_R), U(x_U, y_U, z_U), D(x_D, y_D, z_D).
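To make the geometry concrete, the following is a minimal sketch of the forward-motion triangulation implied by fig. 4 and fig. 5, assuming the standard pinhole relation u = f·(lateral coordinate)/(depth) in each plane; since the patent's exact expressions are given only as formula images, the sign conventions and the helper name triangulate_1d are assumptions.

def triangulate_1d(u_p1, u_p2, delta_lat, delta_z, f):
    # u_p1, u_p2 : image-point offsets from the principal point at time t-1 and time t (pixels)
    # delta_lat  : lateral component (x or y) of the displacement from t-1 to t, body-t frame
    # delta_z    : forward component of that displacement
    # f          : camera image-plane distance (pixels)
    # Assumed model: u_p2 = f*lat/z and u_p1 = f*(lat + delta_lat)/(z + delta_z).
    denom = u_p1 - u_p2
    if abs(denom) < 1e-9:
        raise ValueError("no parallax between the two frames; depth is unobservable")
    z = (f * delta_lat - u_p1 * delta_z) / denom   # depth along the body Z axis
    lat = u_p2 * z / f                             # lateral coordinate (x in XZ, y in YZ)
    return lat, z

# For example, x_L, z_L = triangulate_1d(ul1, ul2, dx, dz, f) would give point L from the
# XZ plane, and y_U, z_U = triangulate_1d(vu1, vu2, dy, dz, f) point U from the YZ plane.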
S3: firstly, the value range of the true heading angle of the unmanned aerial vehicle is determined, then the safe navigation distance is determined, and finally the high-probability collision region of the unmanned aerial vehicle is calculated from these two quantities. In order to calculate the region in which a collision may still occur even if the drone brakes immediately, this embodiment first determines the range of the true value of the heading angle. It can be estimated from the difference between the direct observation given by the gyroscope at time t-1 and the value calculated from the GPS and barometer displacements (note: converted into the body coordinate system). Specifically, Δθ_t = ξ·(θ_{t-1,measured} − θ_{t-1,calculated}) is taken as the estimated limit error value of the yaw heading angle at time t, and the corresponding expression for the pitch heading angle (given in the original as a formula image) as its estimated limit error value, where ξ is an error coefficient whose value can be chosen empirically, e.g. ξ = 3.
Secondly, a safe distance needs to be determined. The estimated flight speed of the drone at time t is v_t, and the maximum braking deceleration a_max can be obtained from the drone's performance parameters. The minimum safe flying distance L_min is:
L_min = v_t^2 / (2·a_max)
Assuming a human braking reaction time of T_0, the reaction distance L_0 is:
L_0 = v_t·T_0
The safety distance L_s is:
L_s = η·(L_0 + L_min)
where η is a safety factor chosen empirically, e.g. η = 1.2.
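A minimal sketch of this safety-distance computation; the braking-distance term is the standard v^2/(2a) stopping distance implied by the quantities defined above, and the default reaction time below is an illustrative placeholder, not a value from the patent.

def safety_distance(v_t, a_max, T_0=0.5, eta=1.2):
    # v_t   : current drone speed (m/s)
    # a_max : maximum braking deceleration (m/s^2)
    # T_0   : braking reaction time (s); illustrative default
    # eta   : safety factor (the text suggests 1.2 as an example)
    L_min = v_t ** 2 / (2.0 * a_max)   # minimum braking distance
    L_0 = v_t * T_0                    # reaction distance
    return eta * (L_0 + L_min)         # safe navigation distance L_s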
From the heading-angle limit errors (Δθ_t for yaw and the corresponding pitch limit error) and the safety distance L_s, the high-probability collision region can be calculated: in the body coordinate system-t, the sector with apex (0,0), radius L_s, symmetric about the Z_t axis and with angular extent 2Δθ_t on the XZ plane (and the corresponding pitch-angle extent on the YZ plane) is the horizontal (respectively vertical) high-probability collision region, as shown in fig. 6.
S4: according to the position coordinates of the four feature-point original images in the body coordinate system, it is judged whether the two feature line segments formed by the four feature points have points falling in the high-probability collision region, the center-distance value of the point that falls in the region and is closest to the sector center is calculated, and finally the early warning probability is calculated from that center distance. In this embodiment, from the body-frame coordinate values of the obstacle feature points L(x_L, z_L), R(x_R, z_R), U(y_U, z_U), D(y_D, z_D), it can be determined whether any point of the feature line segments LR, UD lies inside the high-probability collision region, together with the point M(x_M, z_M) or N(y_N, z_N) closest to the sector center, from which the collision probability is calculated.
If the high-probability collision region does not intersect the line segment LR, the collision probability of the unmanned aerial vehicle is 0.
If the high-probability collision region intersects LR, the center distance Ro of the intersection point closest to the sector center is calculated, and the early warning probability P is then computed from Ro. (Both expressions are given in the original as formula images and are not reproduced here.)
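The sketch below shows one way to implement this step, reusing in_collision_sector from the previous sketch. The patent gives the expressions for Ro and P only as formula images, so the sampling-based distance search and the linear mapping P = 1 - Ro/L_s are illustrative assumptions, not the patent's exact formulas.

import math

def closest_in_sector_distance(p1, p2, L_s, delta_theta, n_samples=200):
    # Smallest distance from the sector apex (0, 0) to any point of segment p1-p2 that
    # lies inside the collision sector; None if the segment misses the sector.
    # Dense sampling is used for simplicity; an exact segment/sector intersection
    # could be substituted without changing the idea.
    best = None
    for i in range(n_samples + 1):
        t = i / n_samples
        x = p1[0] + t * (p2[0] - p1[0])
        z = p1[1] + t * (p2[1] - p1[1])
        if in_collision_sector(x, z, L_s, delta_theta):
            rho = math.hypot(x, z)
            best = rho if best is None else min(best, rho)
    return best

def warning_probability(L, R, L_s, delta_theta):
    # Early warning probability from feature segment LR (XZ-plane case);
    # assumes P falls off linearly with the center distance Ro.
    Ro = closest_in_sector_distance(L, R, L_s, delta_theta)
    if Ro is None:
        return 0.0                      # segment outside the sector: no collision risk
    return max(0.0, 1.0 - Ro / L_s)     # illustrative linear mapping, not the patent's formula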
S5: the unmanned aerial vehicle is warned according to the early warning probability. In this embodiment, a low-risk early warning probability P_0 and a high-risk early warning probability P_1 are preset according to the actual flight data of the drone, and the computed early warning probability is compared with them to obtain the corresponding warning level according to a preset scheme.
If P ≤ P_0, no collision prompt is given and normal flight is indicated.
If P_0 < P ≤ P_1, a low-level risk collision prompt is given, cautious flight is indicated, and the obstacle avoidance flight direction shown in fig. 7 is recommended: the system simulates a detour trajectory from the current speed and the maximum steering acceleration and issues an advance obstacle avoidance suggestion according to how close the obstacle feature line segment is to that trajectory.
If P > P_1, a high-level risk collision prompt is given; emergency braking is prompted when the protection switch is off, and the aircraft is controlled to brake immediately when the protection switch is on.
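A small sketch of the warning-level decision; the thresholds P_0 and P_1 are preset from actual flight data as described above, and the default values below are illustrative placeholders only.

def warning_level(P, P0=0.3, P1=0.7):
    # Map the early warning probability P to a warning level.
    if P <= P0:
        return "normal flight"
    if P <= P1:
        return "low-level risk: fly cautiously, suggest an avoidance direction"
    return "high-level risk: prompt or execute emergency braking"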
In summary, compared with existing unmanned aerial vehicle obstacle avoidance early warning technology, the binocular-vision-based obstacle avoidance early warning method of this embodiment has the following advantages:
1. The method combines the flight state parameters of the drone monitored by the onboard sensors, determines the position of small obstacles through image feature extraction and projection transformation, analyzes the possibility of a collision, and provides an accurate early warning function for safe drone flight.
2. The method can reduce the probability of the drone colliding with an obstacle during flight, can serve as a reference for automatic cruise of drones, and fills a gap in the prior art.
3. The method facilitates the relative judgment of obstacles and improves judgment accuracy; the drone danger region is calculated from the vehicle safety distance and the predicted heading-angle error, giving high judgment accuracy.
Example 2
Referring to fig. 8, the present embodiment provides an early warning device, which is applied to the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in embodiment 1, and specifically includes a camera unit, a control unit, a wireless communication module, a storage unit, a logic synthesis unit, and an early warning prompt module. The camera shooting unit realizes continuous imaging of the range right ahead of the unmanned aerial vehicle, the control unit realizes a control algorithm and interacts data with each unit, the wireless communication module realizes the communication function between the airborne warning device and the ground receiving station, the storage unit stores historical data of target variables, the logic integration unit realizes flight state calculation, video ranging calculation and unmanned aerial vehicle flight warning probability calculation, and the warning prompt module realizes warning.
The camera shooting unit is used for continuously imaging the range right ahead of the unmanned aerial vehicle to obtain the observation data of the front driving state of the unmanned aerial vehicle. The logic synthesis unit is used for segmenting an outer contour line of the same obstacle from a video image shot by the unmanned aerial vehicle, projecting and converting the outer contour line to a normalized imaging plane to obtain a normalized outer contour line, selecting four points with minimum horizontal coordinates and maximum vertical coordinates of pixels in the normalized outer contour line as four feature points of the obstacle, and finally determining the relative position relation between the unmanned aerial vehicle and the obstacle, the probability of collision of linear braking and the priority of bypassing the obstacle in a left/right/up/down deviation mode according to the four feature points. And the logic synthesis unit is also used for substituting the one-dimensional information of the image points of the four characteristic points into a binocular vision model through taking tangent points of the view cones in the left/right/up/down directions as the structural representation of the obstacle, and calculating the position coordinates of the original images of the four characteristic points in the body coordinate system of the unmanned aerial vehicle. The logic synthesis unit is also used for firstly determining the value range of the real value of the course angle of the unmanned aerial vehicle, then determining the safe navigation distance of the unmanned aerial vehicle, and finally calculating the probable collision area of the unmanned aerial vehicle according to the value range of the real value of the course angle and the safe navigation distance.
The logic synthesis unit is also used for judging, according to the position coordinates of the four feature-point original images in the fuselage coordinate system, whether the two feature line segments formed by the four feature points have points falling in the high-probability collision region, calculating the center-distance value of the point that falls in the region and is closest to its center, and finally calculating the early warning probability from that center distance. The early warning prompting unit is used for warning the unmanned aerial vehicle according to the early warning probability. The wireless communication module is mainly used for receiving and transmitting wireless signals; it can communicate with a traffic command center and can also exchange signals with alarm equipment, a mobile phone and the like, which is not described herein in detail.
Example 3
The present embodiments provide a computer terminal comprising a memory, a processor, and a computer program stored on the memory and executable on the processor. And when the processor executes the program, the steps of the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision in the embodiment 1 are realized.
When the unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision is applied, the unmanned aerial vehicle obstacle avoidance early warning method can be applied in a software mode, if the unmanned aerial vehicle obstacle avoidance early warning method is designed into an independently running program, the program is installed on a computer terminal, and the computer terminal can be a computer, a smart phone, a control system and other Internet of things equipment. The unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision can also be designed into an embedded running program and installed on a computer terminal, such as a single chip microcomputer.
Example 4
The present embodiment provides a computer-readable storage medium having a computer program stored thereon. When the program is executed by the processor, the steps of the binocular vision based unmanned aerial vehicle obstacle avoidance early warning method of embodiment 1 are realized. When applied, the method can be embodied in software, for example as a program carried on a computer-readable storage medium such as a USB flash drive; the drive can also be designed as a USB key (U-shield) carrying a program that starts the whole method when triggered externally.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. An unmanned aerial vehicle obstacle avoidance early warning method based on binocular vision is characterized by comprising the following steps:
s1: firstly, segmenting an outer contour line of the same obstacle from a video image shot by an unmanned aerial vehicle, projecting and transforming the outer contour line to a normalized imaging plane to obtain a normalized outer contour line, selecting four points with minimum and maximum horizontal and vertical coordinates of pixels in the normalized outer contour line as four feature points of the obstacle, and finally determining the relative position relationship between the unmanned aerial vehicle and the obstacle, the probability of collision of linear braking and the priority of bypassing the obstacle leftwards, rightwards, upwards and downwards according to the four feature points;
s2: taking tangent points of the four feature point image points and a viewing cone in the left/right/up/down directions as the structured representation of the barrier, substituting one-dimensional information of the four feature point image points into a binocular vision model, and calculating the position coordinates of the four feature point original images in a body coordinate system of the unmanned aerial vehicle;
s3: firstly, determining the value range of a real value of a course angle of the unmanned aerial vehicle, then determining the navigation safety distance of the unmanned aerial vehicle, and finally calculating the probable collision area of the unmanned aerial vehicle according to the value range of the real value of the course angle and the navigation safety distance;
s4: judging whether two feature line segments formed by the four feature points have points falling in the high-probability collision region or not according to the position coordinates of the four feature point original images in the fuselage coordinate system, calculating a central distance value of a point which is in the high-probability collision region and is closest to the circle center of the high-probability collision region, and finally calculating an early warning probability according to the central distance value; and
s5: and according to the early warning probability, early warning is carried out on the unmanned aerial vehicle.
2. The binocular vision-based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 1, wherein in step S5, a low risk early warning probability and a high risk early warning probability are preset according to actual flight data of the unmanned aerial vehicle, and then the early warning probabilities are respectively compared with the low risk early warning probability and the high risk early warning probability to obtain corresponding early warning levels according to a preset scheme.
3. The binocular vision based unmanned aerial vehicle obstacle avoidance early warning method according to claim 1, wherein in the step S1, the outer contour line is an outer contour line of the same obstacle in a t-1 frame segmented from a t-th frame image of the video image by front and background separation, and the normalized imaging plane is an imaging plane having an optical axis parallel to a forward direction of the unmanned aerial vehicle and a column direction parallel to a rotation axis direction of a rotor of the unmanned aerial vehicle.
4. The binocular vision-based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 3, wherein the four feature points are defined as feature points l, r, u and d respectively, and the three-dimensional directions of a fuselage coordinate system are X, Y, Z respectively; the characteristic points are imaged in the airborne video frame at the time t-1 and the time t respectively, and the pixel coordinates of the image points and the track estimated values of the unmanned aerial vehicle at the time t-1 and the time t are estimated; the relationship of the position coordinates of the feature point primary image in the machine body coordinate system is as follows:
(the two relations are given in the original as formula images and are not reproduced here)
wherein m and n each stand for one of x, y, z, and p denotes the feature point; Δx′_{t-1}, Δy′_{t-1}, Δz′_{t-1} are the body-frame coordinates of the displacement from time t-1 to time t; u_{p1}, u_{p2} are the displacements of the image points relative to the principal point of the imaging plane; and f is the camera image-plane distance (focal length, in pixel units).
5. The binocular vision based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 4, wherein in step S3, Δθ_t = ξ·(θ_{t-1,measured} − θ_{t-1,calculated}) is taken as the estimated limit error value of the yaw heading angle at time t, and the corresponding expression for the pitch heading angle (given in the original as a formula image) as its estimated limit error value, where ξ is an error coefficient.
6. The binocular vision based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 5, wherein in step S3, the maximum braking deceleration of the vehicle is defined as a_max and the drone speed as v_t; the minimum safe running distance L_min of the vehicle is calculated as:
L_min = v_t^2 / (2·a_max)
Defining the braking reaction time as T_0, the reaction distance L_0 is:
L_0 = v_t·T_0
The driving safety distance is:
L_s = η·(L_0 + L_min)
wherein L_s is the safe driving distance and η is the safety factor.
7. The binocular vision-based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 6, wherein the value range of the true heading angle is determined from the difference between the direct observation given by the drone's gyroscope at time t-1 and the displacement-based value calculated from the GPS and the drone's barometer; wherein, in the body coordinate system, sectors are defined on the XZ plane and on the YZ plane with center (0,0), radius L_s, symmetric about the Z_t axis, with angular extent 2Δθ_t (and the corresponding pitch-angle extent, given in the original as a formula image); these sectors are the horizontal and vertical high-probability collision zones.
8. The binocular vision based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 1, wherein the positions of the original image points L, R, U, D in the body coordinate system-t are defined as L(x_L, y_L, z_L), R(x_R, y_R, z_R), U(x_U, y_U, z_U), D(x_D, y_D, z_D); in step S4, the method for calculating the early warning probability comprises the following steps:
if the high-probability collision region does not intersect the line segment LR, the collision probability of the unmanned aerial vehicle is 0;
if the high-probability collision region intersects LR, the center distance Ro of the intersection point closest to the sector center is calculated, and the early warning probability P is then computed from Ro (both expressions are given in the original as formula images and are not reproduced here).
9. The binocular vision based unmanned aerial vehicle obstacle avoidance early warning method as claimed in claim 1, wherein in step S5, a low-risk early warning probability P_0 and a high-risk early warning probability P_1 are defined;
if P ≤ P_0, no collision prompt is given and normal flight is indicated;
if P_0 < P ≤ P_1, a low-level risk collision prompt is given, cautious flight is indicated, and an obstacle avoidance flight direction is suggested: the system simulates a detour trajectory from the current speed and the maximum steering acceleration and issues an advance obstacle avoidance suggestion according to how close the obstacle feature line segment is to that trajectory;
if P > P_1, a high-level risk collision prompt is given; emergency braking is prompted when the protection switch is off, and the aircraft is controlled to brake immediately when the protection switch is on.
10. An early warning device applying the binocular vision based unmanned aerial vehicle obstacle avoidance early warning method according to any one of claims 1 to 9, the early warning device comprising:
the camera shooting unit is used for continuously imaging a range right in front of the unmanned aerial vehicle to obtain observation data of the front running state of the unmanned aerial vehicle;
the logic synthesis unit is used for firstly segmenting an outer contour line of the same obstacle from a video image shot by an unmanned aerial vehicle, projecting and transforming the outer contour line to a normalized imaging plane to obtain a normalized outer contour line, then selecting four points with minimum and maximum horizontal and vertical coordinates of pixels in the normalized outer contour line as four feature points of the obstacle, and finally determining the relative position relationship between the unmanned aerial vehicle and the obstacle, the probability of collision of linear braking and the priority of bypassing the obstacle leftwards, rightwards, upwards and downwards according to the four feature points; the logic synthesis unit is also used for substituting one-dimensional information of the four feature point image points into a binocular vision model by taking tangent points of the view cones in four directions of left/right/up/down as the structural representation of the barrier, and calculating the position coordinates of the four feature point original images in the body coordinate system of the unmanned aerial vehicle; the logic synthesis unit is further used for determining a value range of a real heading angle value of the unmanned aerial vehicle, determining a safe navigation distance of the unmanned aerial vehicle, and calculating an approximate probability collision area of the unmanned aerial vehicle according to the value range of the real heading angle value and the safe navigation distance; the logic synthesis unit is further used for judging whether two feature line segments formed by the four feature points are in the high-probability collision region or not according to the position coordinates of the four feature point primary images in the body coordinate system, calculating a circle center distance value of a point which is in the high-probability collision region and is closest to the circle center of the high-probability collision region, and finally calculating an early warning probability according to the circle center distance value; and
and the early warning prompting unit is used for early warning the unmanned aerial vehicle according to the early warning probability.
CN202111142544.8A 2021-09-28 2021-09-28 Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision Active CN113900443B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111142544.8A CN113900443B (en) 2021-09-28 2021-09-28 Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111142544.8A CN113900443B (en) 2021-09-28 2021-09-28 Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision

Publications (2)

Publication Number Publication Date
CN113900443A true CN113900443A (en) 2022-01-07
CN113900443B CN113900443B (en) 2023-07-18

Family

ID=79029620

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111142544.8A Active CN113900443B (en) 2021-09-28 2021-09-28 Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision

Country Status (1)

Country Link
CN (1) CN113900443B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047788A (en) * 2022-01-11 2022-02-15 南京南机智农农机科技研究院有限公司 Automatic keep away mooring unmanned aerial vehicle of barrier and follow car system
CN115576357A (en) * 2022-12-01 2023-01-06 浙江大有实业有限公司杭州科技发展分公司 Full-automatic unmanned aerial vehicle inspection intelligent path planning method under RTK signal-free scene

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018058356A1 (en) * 2016-09-28 2018-04-05 驭势科技(北京)有限公司 Method and system for vehicle anti-collision pre-warning based on binocular stereo vision
CN108844538A (en) * 2018-05-07 2018-11-20 中国人民解放军国防科技大学 Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
CN109857133A (en) * 2019-03-26 2019-06-07 台州学院 Multi-rotor unmanned aerial vehicle selectivity avoidance obstacle method based on binocular vision
CN112508865A (en) * 2020-11-23 2021-03-16 深圳供电局有限公司 Unmanned aerial vehicle inspection obstacle avoidance method and device, computer equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018058356A1 (en) * 2016-09-28 2018-04-05 驭势科技(北京)有限公司 Method and system for vehicle anti-collision pre-warning based on binocular stereo vision
CN108844538A (en) * 2018-05-07 2018-11-20 中国人民解放军国防科技大学 Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
CN109857133A (en) * 2019-03-26 2019-06-07 台州学院 Multi-rotor unmanned aerial vehicle selectivity avoidance obstacle method based on binocular vision
CN112508865A (en) * 2020-11-23 2021-03-16 深圳供电局有限公司 Unmanned aerial vehicle inspection obstacle avoidance method and device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIN TAO: "Research on UAV Vision Positioning and Obstacle Avoidance Subsystem" (无人机视觉定位与避障子系统研究), Mechanical Engineer (机械工程师), No. 03 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114047788A (en) * 2022-01-11 2022-02-15 南京南机智农农机科技研究院有限公司 Automatic keep away mooring unmanned aerial vehicle of barrier and follow car system
CN114047788B (en) * 2022-01-11 2022-04-15 南京南机智农农机科技研究院有限公司 Automatic keep away mooring unmanned aerial vehicle of barrier and follow car system
CN115576357A (en) * 2022-12-01 2023-01-06 浙江大有实业有限公司杭州科技发展分公司 Full-automatic unmanned aerial vehicle inspection intelligent path planning method under RTK signal-free scene
CN115576357B (en) * 2022-12-01 2023-07-07 浙江大有实业有限公司杭州科技发展分公司 Full-automatic unmanned aerial vehicle inspection intelligent path planning method under RTK signal-free scene

Also Published As

Publication number Publication date
CN113900443B (en) 2023-07-18

Similar Documents

Publication Publication Date Title
US20200108834A1 (en) Image-based velocity control for a turning vehicle
CN110745140B (en) Vehicle lane change early warning method based on continuous image constraint pose estimation
US10179588B2 (en) Autonomous vehicle control system
CN111976718B (en) Automatic parking control method and system
CN109677348B (en) Pre-collision control device and control method of pre-collision control device
CN113900443B (en) Unmanned aerial vehicle obstacle avoidance early warning method and device based on binocular vision
CN106164998A (en) Path prediction means
EP3043202B1 (en) Moving body system
CN112512887B (en) Driving decision selection method and device
US20220234580A1 (en) Vehicle control device
WO2021056499A1 (en) Data processing method and device, and movable platform
CN114442101B (en) Vehicle navigation method, device, equipment and medium based on imaging millimeter wave radar
EP3496040A1 (en) Gradient estimation device, gradient estimation method, computer-readable medium, and controlling system
US20230063845A1 (en) Systems and methods for monocular based object detection
US20230182774A1 (en) Autonomous driving lidar technology
CN115485723A (en) Information processing apparatus, information processing method, and program
CN115923839A (en) Vehicle path planning method
CN115649158A (en) Mine vehicle anti-collision method, equipment and storage medium
Li et al. Pitch angle estimation using a Vehicle-Mounted monocular camera for range measurement
US11607999B2 (en) Method and apparatus for invisible vehicle underbody view
JP7278740B2 (en) Mobile control device
EP4148704A1 (en) System and method for localization of safe zones in dense depth and landing quality heatmaps
WO2022049880A1 (en) Image processing device
US11380110B1 (en) Three dimensional traffic sign detection
CN115793676A (en) Rotor unmanned aerial vehicle vision guiding landing method facing mobile platform

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant