CN116007631A - Unmanned aerial vehicle autonomous line navigation method based on computer vision - Google Patents

Unmanned aerial vehicle autonomous line navigation method based on computer vision

Info

Publication number
CN116007631A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
track
line
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211662356.2A
Other languages
Chinese (zh)
Inventor
李晓峰
郭玉新
杨晗
张浩然
陶震林
贾利民
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jiaotong University
Original Assignee
Beijing Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jiaotong University filed Critical Beijing Jiaotong University
Priority to CN202211662356.2A priority Critical patent/CN116007631A/en
Publication of CN116007631A publication Critical patent/CN116007631A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an unmanned aerial vehicle autonomous line navigation method based on computer vision. The method comprises the following steps: continuously capturing downward-looking video of a track line with an onboard camera of the unmanned aerial vehicle, and segmenting the video into consecutive multi-frame track line images based on a lightweight neural network; constructing a smooth track center line based on the multi-frame track line images; and calculating the expected trajectory, the flight speed and the attitude angle of the onboard sensor of the unmanned aerial vehicle in the unmanned aerial vehicle body coordinate system based on the track center line. Through flight speed control of the unmanned aerial vehicle and attitude control of the onboard sensor, the unmanned aerial vehicle navigates autonomously, flies smoothly and stably along the track line, and keeps a safe distance from the track line.

Description

Unmanned aerial vehicle autonomous line navigation method based on computer vision
Technical Field
The invention relates to the technical field of unmanned aerial vehicle intelligent navigation, in particular to an unmanned aerial vehicle autonomous line navigation method based on computer vision.
Background
The safety of the rail transit operating environment is a vital link in a rail transit system, and autonomous track inspection based on unmanned aerial vehicles has become a development trend of current track inspection schemes owing to its great advantages in efficiency and safety.
Currently, most unmanned aerial vehicle autonomous navigation methods in the prior art rely on GPS (Global Positioning System) for navigation and positioning. Disadvantages of this approach include: coordinate information along the track route is sparse, and GPS position information carries a certain error.
Disclosure of Invention
The embodiment of the invention provides an unmanned aerial vehicle autonomous line navigation method based on computer vision, which is used for effectively realizing autonomous navigation of an unmanned aerial vehicle along a track line.
In order to achieve the above purpose, the present invention adopts the following technical scheme.
An unmanned aerial vehicle autonomous line navigation method based on computer vision, comprising:
continuously capturing downward-looking video of a track line through an onboard camera of the unmanned aerial vehicle, and segmenting the video into consecutive multi-frame track line images based on a lightweight neural network;
constructing a smooth track center line based on the multi-frame track line image;
and calculating an expected trajectory, a flight speed and an attitude angle of an onboard sensor of the unmanned aerial vehicle by using the unmanned aerial vehicle coordinate system based on the track center line, so as to realize navigation control of the unmanned aerial vehicle.
Preferably, the continuously capturing downward-looking video of the track line through the onboard camera of the unmanned aerial vehicle and segmenting the video into consecutive multi-frame track line images based on the lightweight neural network comprises:
capturing downward-looking video of the track line through the onboard camera of the unmanned aerial vehicle and splitting it into consecutive multi-frame track line images; using the consecutive frames during training to add context information; extracting the track structure similarity features of each frame through an atrous spatial pyramid pooling network and a many-to-many bidirectional recurrent convolutional neural network; forming an output feature map from the features extracted from all frames; and obtaining the railway line segmentation result from the feature map through an attention mechanism module and upsampling.
Preferably, the constructing a smooth track center line based on the multi-frame track line image includes:
dividing the segmented track line image into a plurality of discretized trapezoid block nodes, establishing a unidirectional directed graph over the trapezoid block nodes, calculating the distances between different trapezoid block nodes, and clustering the trapezoid block nodes according to these distances;
classifying the nodes as strong or weak: trapezoid blocks that contain a pair of rails and are surrounded by ballast are determined as strong nodes and the remaining trapezoid blocks as weak nodes; then filtering out low-confidence segmentation regions, namely candidate track lines whose length is less than half the image height or whose number of strong nodes is less than half the number of all nodes;
reconstructing the filtered discretized trapezoid block nodes into a continuous smooth curve representing the track center line by fitting a quadratic polynomial curve for the track center line and a linear function for the track width.
Preferably, the calculating the expected trajectory and flight speed of the unmanned aerial vehicle using the unmanned aerial vehicle coordinate system based on the track center line includes:
based on the track center line, performing coordinate conversion from the camera coordinate system to the unmanned aerial vehicle coordinate system: setting U = (u, v) as a point of the track center line L in the image, the camera focal length as f so that the point is expressed in the camera coordinate system as X_c = (f, u, v), the unmanned aerial vehicle height as h_p, and the rotation matrix of the camera coordinate system relative to the unmanned aerial vehicle coordinate system as R, the coordinates of the point U in the unmanned aerial vehicle coordinate system are given by formula (1):

$$X_{body} = \frac{h_p}{e_z^{\top} R X_c}\, R X_c \qquad (1)$$

where e_z is the indication vector of the Z-axis coordinate in the unmanned aerial vehicle body coordinate system;
ignoring the Z axis of the unmanned aerial vehicle coordinate system O_body, the position of the unmanned aerial vehicle is the origin of coordinates O; the control point, namely the point of the track center line nearest to the unmanned aerial vehicle, is X = (x, y), its tangential direction is n_1 = (x_n1, y_n1) and its normal direction is n_2 = (x_n2, y_n2); according to the geometric principle, the vector OX is perpendicular to the tangential direction n_1, the unit vector of OX is OX/d_p, and n_1 and n_2 are given by formulas (2) and (3):

$$n_1 = \frac{k}{d_p}\,(-y,\ x) \qquad (2)$$

$$n_2 = \frac{1}{d_p}\,(x,\ y) \qquad (3)$$
where d_p = √(x² + y²) is the horizontal distance from the unmanned aerial vehicle to the control point and k = {-1, 1} is a fixed parameter set according to the unmanned aerial vehicle position and detection direction; L_1 is the tangent to the track center line at the control point, and translating L_1 a desired safe distance d_th along the unit vector of XO yields the expected trajectory L_2 of the unmanned aerial vehicle;
dividing the speed of the unmanned aerial vehicle into a navigation speed V_g along the tangential direction n_1 and a correction speed V_a along the normal direction n_2, as in formula (4):

$$V = V_g + V_a = k_g n_1 + k_a n_2 \qquad (4)$$

where V_g is the navigation speed vector along the tangential direction, k_g is the speed scalar in the tangential direction, V_a is the correction speed vector along the normal direction, and k_a is the speed scalar in the normal direction;
set D max For the radius of the selection range of the target point, in the target direction, with L 2 The intersection point V is a selected target point, and if no intersection point exists, the unmanned aerial vehicle needs to follow the normal direction
Figure BDA00040145370300000315
Full speed approach to the desired trajectory, k g And k a As in formulas (5) and (6):
k g =ωmax(D max +d th -d p ,0)(5)
Figure BDA00040145370300000316
where ω is the proportionality coefficient of distance and velocity.
Preferably, the calculating the attitude angle of the onboard sensor of the unmanned aerial vehicle using the unmanned aerial vehicle coordinate system based on the track center line to realize navigation control of the unmanned aerial vehicle includes:
controlling the shooting reference point to remain at the center of the image by changing the pitch angle, yaw angle and roll angle of the onboard camera of the unmanned aerial vehicle, wherein the pitch angle of the onboard camera is a fixed value β = -π/2 so that the normal of the control point passes through the origin; with the flight height of the unmanned aerial vehicle h_p, the control point (x, y) and its tangential direction (n_x, n_y), the target yaw angle α and roll angle γ of the onboard sensor are calculated by formulas (7) and (8):

$$\alpha = \arctan\left(\frac{n_y}{n_x}\right) \qquad (7)$$

$$\gamma = \arctan\left(\frac{\sqrt{x^2 + y^2}}{h_p}\right) \qquad (8)$$
According to the technical scheme provided by the embodiment of the invention, the unmanned aerial vehicle can navigate along the track autonomously through real-time, high-precision track target detection.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of an implementation of an unmanned aerial vehicle autonomous flight navigation method based on visual recognition of a virtual center line of a track according to an embodiment of the present invention.
Fig. 2 is a schematic process flow diagram of an unmanned aerial vehicle autonomous flight navigation method based on visual recognition of a virtual center line of a track according to an embodiment of the present invention.
Fig. 3 is a schematic diagram of a relative relationship between an image coordinate system and an unmanned aerial vehicle coordinate system according to an embodiment of the present invention.
Fig. 4 is a schematic diagram of an unmanned aerial vehicle coordinate system after eliminating a Z axis according to an embodiment of the present invention.
Detailed Description
Embodiments of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are exemplary only for explaining the present invention and are not to be construed as limiting the present invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or coupled. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
For the purpose of facilitating an understanding of the embodiments of the invention, reference will now be made to the drawings of several specific embodiments illustrated in the drawings and in no way should be taken to limit the embodiments of the invention.
According to the embodiments of the invention, autonomous along-track navigation of the unmanned aerial vehicle is realized through real-time, high-precision track target detection. During navigation, a sensor on the unmanned aerial vehicle is locked onto the target track line so as to track it continuously. The point on the virtual center line of the track closest to the unmanned aerial vehicle is selected as the control point, and the flight and sensor attitude control strategies of the unmanned aerial vehicle are designed based on the position and direction of the control point relative to the unmanned aerial vehicle.
The implementation schematic diagram of the unmanned aerial vehicle autonomous flight navigation method based on visual recognition of the virtual center line of the track provided by the embodiment of the invention is shown in fig. 1, and the specific processing flow is shown in fig. 2. The method comprises the following processing steps:
step S1: dividing video data into continuous multi-frame track line images based on a lightweight neural network by continuously pitching video data of the track line by the unmanned aerial vehicle;
step S2: constructing a smooth track center line based on the multi-frame track line image;
step S3: and calculating the expected track, the flying speed and the attitude angle of the airborne sensor of the unmanned aerial vehicle by utilizing the track center line, so as to realize the navigation control of the unmanned aerial vehicle.
Preferably, the step S1 specifically includes: capturing downward-looking video of the track line through the onboard camera of the unmanned aerial vehicle and splitting it into consecutive multi-frame track line images; using the consecutive frames during training to add context information; extracting the track structure similarity features of each frame through an atrous spatial pyramid pooling network and a many-to-many bidirectional recurrent convolutional neural network; forming an output feature map from the features extracted from all frames; and obtaining the railway line segmentation result from the feature map through an attention mechanism module and upsampling.
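For concreteness, the following is a minimal, hypothetical PyTorch sketch of such a pipeline: an atrous spatial pyramid pooling block per frame, a many-to-many bidirectional convolutional GRU across frames, and an attention-gated upsampling head. All layer widths, the ConvGRU formulation, and the names (RailSegNet, ASPP, BiConvGRU) are illustrative assumptions, not the patented network.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ASPP(nn.Module):
    """Atrous spatial pyramid pooling: parallel dilated 3x3 convs fused by a 1x1 conv."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r) for r in rates])
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, 1)

    def forward(self, x):
        return self.project(torch.cat([F.relu(b(x)) for b in self.branches], 1))

class BiConvGRU(nn.Module):
    """Many-to-many bidirectional recurrence over frames with convolutional gates."""
    def __init__(self, ch):
        super().__init__()
        self.zr = nn.Conv2d(2 * ch, 2 * ch, 3, padding=1)  # update/reset gates
        self.hc = nn.Conv2d(2 * ch, ch, 3, padding=1)      # candidate state

    def step(self, x, h):
        z, r = torch.sigmoid(self.zr(torch.cat([x, h], 1))).chunk(2, dim=1)
        c = torch.tanh(self.hc(torch.cat([x, r * h], 1)))
        return (1 - z) * h + z * c

    def forward(self, seq):                    # seq: (T, B, C, H, W)
        h = torch.zeros_like(seq[0]); fwd = []
        for x in seq:                          # forward pass over frames
            h = self.step(x, h); fwd.append(h)
        h = torch.zeros_like(seq[0]); bwd = []
        for x in reversed(seq):                # backward pass over frames
            h = self.step(x, h); bwd.append(h)
        return torch.stack([f + b for f, b in zip(fwd, reversed(bwd))])

class RailSegNet(nn.Module):
    def __init__(self, ch=32):
        super().__init__()
        self.stem = nn.Conv2d(3, ch, 3, stride=4, padding=1)  # 4x downsampling
        self.aspp = ASPP(ch, ch)
        self.rnn = BiConvGRU(ch)
        self.attn = nn.Conv2d(ch, 1, 1)                       # attention weights
        self.head = nn.Conv2d(ch, 1, 1)                       # rail mask logits

    def forward(self, frames):                 # frames: (T, B, 3, H, W), H and W divisible by 4
        feats = torch.stack([self.aspp(F.relu(self.stem(f))) for f in frames])
        feats = self.rnn(feats)                # inject cross-frame context
        outs = [F.interpolate(self.head(torch.sigmoid(self.attn(f)) * f),
                              scale_factor=4) for f in feats]
        return torch.stack(outs)               # per-frame segmentation logits
```

The small channel count and the shared per-frame backbone are what would make such a network lightweight; the recurrence is what carries context across consecutive frames.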
Preferably, the step S2 specifically includes: dividing the segmented track line image into a plurality of discretized trapezoid block nodes, establishing a unidirectional directed graph over the trapezoid block nodes, calculating the distances between different trapezoid block nodes, and clustering the trapezoid block nodes according to these distances;
classifying the nodes as strong or weak: trapezoid blocks that contain a pair of rails and are surrounded by ballast are determined as strong nodes and the remaining trapezoid blocks as weak nodes; then filtering out low-confidence segmentation regions, namely candidate track lines whose length is less than half the image height or whose number of strong nodes is less than half the number of all nodes;
reconstructing the filtered discretized trapezoid block nodes into a continuous smooth curve representing the track center line by fitting a quadratic polynomial curve for the track center line and a linear function for the track width.
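As an illustration of this final fitting step, the NumPy sketch below fits the quadratic center line polynomial and the linear width function to the surviving trapezoid-block nodes; the function and variable names are hypothetical.

```python
import numpy as np

def fit_track(node_centers, node_widths):
    """node_centers: (N, 2) pixel centers (x, y) of the kept trapezoid blocks;
    node_widths: (N,) rail-to-rail width measured in each block."""
    px, py = node_centers[:, 0], node_centers[:, 1]
    center_poly = np.polyfit(py, px, deg=2)          # x = a*y^2 + b*y + c
    width_line = np.polyfit(py, node_widths, deg=1)  # w = d*y + e
    ys = np.linspace(py.min(), py.max(), 200)        # dense resampling along the image rows
    centerline = np.stack([np.polyval(center_poly, ys), ys], axis=1)
    return centerline, np.polyval(width_line, ys)
```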
Preferably, the step S3 specifically includes: performing coordinate conversion from the image coordinate system to the unmanned aerial vehicle coordinate system; selecting the point on the track center line closest to the unmanned aerial vehicle as the control point, calculating the expected trajectory and expected flight speed to realize flight control; and calculating the attitude angle of the onboard sensor to realize attitude control, keeping the target in the sensor's field of view for continuous tracking of the track line.
(1) Coordinate conversion from camera coordinate system to unmanned aerial vehicle coordinate system
Unmanned aerial vehicle navigation requires the position information of the track center line in the unmanned aerial vehicle coordinate system; the relative relationship between the image coordinate system and the unmanned aerial vehicle coordinate system is shown in fig. 3.
Let U = (u, v) be a point on the track center line L in the image and let the camera focal length be f, so that the point is expressed in the camera coordinate system as X_c = (f, u, v). The unmanned aerial vehicle height is h_p, and the rotation matrix of the PTZ (Pan-Tilt-Zoom) camera coordinate system relative to the unmanned aerial vehicle coordinate system is R. The coordinates of the point U in the unmanned aerial vehicle coordinate system are given by formula (1):

$$X_{body} = \frac{h_p}{e_z^{\top} R X_c}\, R X_c \qquad (1)$$

where e_z is the indication vector of the Z-axis coordinate.
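A minimal NumPy sketch of this projection, assuming R and the flight height h_p are known and the track point lies on flat ground (function and variable names are illustrative):

```python
import numpy as np

def image_point_to_body(u, v, f, R, h_p):
    """Project the image point (u, v) onto the ground plane in the body frame,
    following formula (1)."""
    x_c = np.array([f, u, v], dtype=float)  # point in the camera coordinate system
    ray = R @ x_c                           # rotate the viewing ray into the body frame
    e_z = np.array([0.0, 0.0, 1.0])         # indication vector of the Z axis
    return (h_p / (e_z @ ray)) * ray        # stretch the ray until it meets the ground
```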
(2) Calculating the expected trajectory and flight speed of the unmanned aerial vehicle and realizing flight control
Since all track targets are on the ground, the Z axis of the unmanned aerial vehicle coordinate system O_body can be ignored. Fig. 4 is a schematic diagram of the unmanned aerial vehicle coordinate system after eliminating the Z axis; as shown in fig. 4, O'_body denotes the body coordinate system O_body without the Z axis. The drone location is the origin of coordinates. The control point is X = (x, y), with tangential direction n_1 = (x_n1, y_n1) and normal direction n_2 = (x_n2, y_n2).
Since the control point is the point of the track center line nearest to the unmanned aerial vehicle, according to the geometric principle the vector OX is perpendicular to the tangential direction n_1, and the unit vector of OX is OX/d_p. The vectors n_1 and n_2 are given by formulas (2) and (3):

$$n_1 = \frac{k}{d_p}\,(-y,\ x) \qquad (2)$$

$$n_2 = \frac{1}{d_p}\,(x,\ y) \qquad (3)$$
where d_p = √(x² + y²) is the horizontal distance from the drone to the control point, and k = {-1, 1} is a fixed parameter set according to the drone position and detection direction.
In FIG. 4, L_1 is the tangent to the track center line at the control point; because the turning radius of a track line is large, the center line in view can be approximated by this tangent. Translating L_1 a desired safe distance d_th along the unit vector of XO yields the expected safe trajectory L_2, shown as the lower-right dashed line in FIG. 4.
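A short sketch of the control point selection, under the reconstruction of formulas (2) and (3) given above; the sampled polyline `centerline` comes from the fitting step, and all names are illustrative.

```python
import numpy as np

def control_point(centerline, k=1):
    """centerline: (N, 2) sampled points of the track center line in the reduced
    body frame; returns the control point, tangent n1, normal n2 and distance d_p."""
    d = np.linalg.norm(centerline, axis=1)
    i = int(np.argmin(d))                  # sample nearest to the drone (origin)
    x, y = centerline[i]
    d_p = d[i]
    n1 = k * np.array([-y, x]) / d_p       # unit tangent, formula (2)
    n2 = np.array([x, y]) / d_p            # unit normal, formula (3)
    return (x, y), n1, n2, d_p
```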
To control the drone to approach the desired trajectory, the drone speed can be divided into two parts: a navigation speed V_g along the tangential direction n_1 and a correction speed V_a along the normal direction n_2, as in formula (4):

$$V = V_g + V_a = k_g n_1 + k_a n_2 \qquad (4)$$

where V_g is the navigation speed vector along the tangential direction, k_g is the speed scalar in the tangential direction, V_a is the correction speed vector along the normal direction, and k_a is the speed scalar in the normal direction.
The scalar k_a is positively correlated with the drone's distance from the desired trajectory. When the distance is small, k_a is small and k_g is large, so the drone mainly moves along the track line; when the distance is large, k_a is large and k_g is small, so the drone quickly moves toward the desired trajectory.
The speed of the drone is controlled by the distance to the target point at each step. The radius D_max in FIG. 3 is the selection range of the target point, and the intersection of this range with L_2 in the target direction is the selected target point. If there is no intersection, the drone approaches the desired trajectory at full speed along the normal direction n_2. k_g and k_a are given by formulas (5) and (6):

$$k_g = \omega \max(D_{max} + d_{th} - d_p,\ 0) \qquad (5)$$

$$k_a = \omega \min(d_p - d_{th},\ D_{max}) \qquad (6)$$

where ω is the proportionality coefficient between distance and speed. Formulas (5) and (6) keep the overall speed of the drone unchanged so that it moves smoothly along the desired trajectory.
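Putting formulas (4) to (6) together, a minimal sketch of the speed law follows, assuming the unit vectors n_1 and n_2 and the distance d_p have been computed as above, and using the reconstructed form of formula (6); names are illustrative.

```python
import numpy as np

def velocity_command(n1, n2, d_p, d_th, D_max, omega):
    """Speed law of formulas (4)-(6): full tangential speed on the expected
    trajectory, full normal speed when far from it, blended in between."""
    k_g = omega * max(D_max + d_th - d_p, 0.0)  # tangential scalar, formula (5)
    k_a = omega * min(d_p - d_th, D_max)        # corrective scalar, formula (6)
    return k_g * np.asarray(n1) + k_a * np.asarray(n2)  # total velocity, formula (4)
```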
(3) Calculating the attitude angle of the onboard sensor of the unmanned aerial vehicle and realizing attitude control
To constrain the target track line to remain in the field of view of the onboard camera, the reference point is controlled to be centered in the image by varying the pitch, yaw and roll angles of the PTZ camera.
The pitch angle is a fixed value β = -π/2, so the normal of the control point passes through the origin. For the unmanned aerial vehicle flight height h_p, control point (x, y) and tangential direction (n_x, n_y), the target yaw angle α and roll angle γ of the sensor are given by formulas (7) and (8):

$$\alpha = \arctan\left(\frac{n_y}{n_x}\right) \qquad (7)$$

$$\gamma = \arctan\left(\frac{\sqrt{x^2 + y^2}}{h_p}\right) \qquad (8)$$
Since the time delay from image acquisition to target pose calculation cannot be neglected, directly applying the angles defined by these equations causes overshoot; a proportional-integral-derivative (PID) algorithm is therefore used to update the actual control angle.
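A minimal sketch of that PID smoothing step; the gains are illustrative assumptions, not values from the patent.

```python
class AnglePID:
    """Tracks the analytic target angle from formulas (7)-(8) instead of
    applying it directly, damping the overshoot caused by image latency."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, target, actual, dt):
        err = target - actual
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return actual + self.kp * err + self.ki * self.integral + self.kd * deriv
```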
In summary, the real-time vision-based unmanned aerial vehicle navigation system for track detection was verified by field tests: the virtual center line of the track line is reconstructed from the recognized and segmented track structure, the smooth curve serves as the expected trajectory of unmanned aerial vehicle flight, and through flight speed control of the unmanned aerial vehicle and attitude control of the onboard sensor, autonomous navigation and smooth, stable flight along the track line are realized while a safe distance from the track line is maintained. The invention realizes autonomous line navigation of the unmanned aerial vehicle and provides the premise and foundation for an autonomous track inspection scheme based on unmanned aerial vehicles.
Those of ordinary skill in the art will appreciate that: the drawing is a schematic diagram of one embodiment and the modules or flows in the drawing are not necessarily required to practice the invention.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for apparatus or system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, with reference to the description of method embodiments in part. The apparatus and system embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.

Claims (5)

1. An unmanned aerial vehicle autonomous line navigation method based on computer vision, which is characterized by comprising the following steps:
continuously capturing downward-looking video of a track line through an onboard camera of the unmanned aerial vehicle, and segmenting the video into consecutive multi-frame track line images based on a lightweight neural network;
constructing a smooth track center line based on the multi-frame track line image;
and calculating an expected trajectory, a flight speed and an attitude angle of an onboard sensor of the unmanned aerial vehicle by using the unmanned aerial vehicle coordinate system based on the track center line, so as to realize navigation control of the unmanned aerial vehicle.
2. The method of claim 1, wherein the continuously capturing downward-looking video of the track line through the onboard camera of the unmanned aerial vehicle and segmenting the video into consecutive multi-frame track line images based on the lightweight neural network comprises:
capturing downward-looking video of the track line through the onboard camera of the unmanned aerial vehicle and splitting it into consecutive multi-frame track line images; using the consecutive frames during training to add context information; extracting the track structure similarity features of each frame through an atrous spatial pyramid pooling network and a many-to-many bidirectional recurrent convolutional neural network; forming an output feature map from the features extracted from all frames; and obtaining the railway line segmentation result from the feature map through an attention mechanism module and upsampling.
3. The method of claim 1, wherein said constructing a smooth track center line based on the multi-frame track line image comprises:
dividing the segmented track line image into a plurality of discretized trapezoid block nodes, establishing a unidirectional directed graph over the trapezoid block nodes, calculating the distances between different trapezoid block nodes, and clustering the trapezoid block nodes according to these distances;
classifying the nodes as strong or weak: trapezoid blocks that contain a pair of rails and are surrounded by ballast are determined as strong nodes and the remaining trapezoid blocks as weak nodes; then filtering out low-confidence segmentation regions, namely candidate track lines whose length is less than half the image height or whose number of strong nodes is less than half the number of all nodes;
reconstructing the filtered discretized trapezoid block nodes into a continuous smooth curve representing the track center line by fitting a quadratic polynomial curve for the track center line and a linear function for the track width.
4. The method of claim 1, wherein the calculating the expected trajectory and flight speed of the unmanned aerial vehicle using the unmanned aerial vehicle coordinate system based on the track center line comprises:
based on the track center line, performing coordinate conversion from the camera coordinate system to the unmanned aerial vehicle coordinate system: setting U = (u, v) as a point of the track center line L in the image, the camera focal length as f so that the point is expressed in the camera coordinate system as X_c = (f, u, v), the unmanned aerial vehicle height as h_p, and the rotation matrix of the camera coordinate system relative to the unmanned aerial vehicle coordinate system as R, the coordinates of the point U in the unmanned aerial vehicle coordinate system are given by formula (1):

$$X_{body} = \frac{h_p}{e_z^{\top} R X_c}\, R X_c \qquad (1)$$

where e_z is the indication vector of the Z-axis coordinate in the unmanned aerial vehicle body coordinate system;
ignoring the Z axis of the unmanned aerial vehicle coordinate system O_body, the position of the unmanned aerial vehicle is the origin of coordinates O; the control point, namely the point of the track center line nearest to the unmanned aerial vehicle, is X = (x, y), its tangential direction is n_1 = (x_n1, y_n1) and its normal direction is n_2 = (x_n2, y_n2); according to the geometric principle, the vector OX is perpendicular to the tangential direction n_1, the unit vector of OX is OX/d_p, and n_1 and n_2 are given by formulas (2) and (3):

$$n_1 = \frac{k}{d_p}\,(-y,\ x) \qquad (2)$$

$$n_2 = \frac{1}{d_p}\,(x,\ y) \qquad (3)$$
where d_p = √(x² + y²) is the horizontal distance from the unmanned aerial vehicle to the control point and k = {-1, 1} is a fixed parameter set according to the unmanned aerial vehicle position and detection direction; L_1 is the tangent to the track center line at the control point, and translating L_1 a desired safe distance d_th along the unit vector of XO yields the expected trajectory L_2 of the unmanned aerial vehicle;
dividing the speed of the unmanned aerial vehicle into a navigation speed V_g along the tangential direction n_1 and a correction speed V_a along the normal direction n_2, as in formula (4):

$$V = V_g + V_a = k_g n_1 + k_a n_2 \qquad (4)$$

where V_g is the navigation speed vector along the tangential direction, k_g is the speed scalar in the tangential direction, V_a is the correction speed vector along the normal direction, and k_a is the speed scalar in the normal direction;
setting D_max as the radius of the selection range of the target point, the intersection of this range with L_2 in the target direction is the selected target point; if there is no intersection, the unmanned aerial vehicle approaches the expected trajectory at full speed along the normal direction n_2; k_g and k_a are given by formulas (5) and (6):

$$k_g = \omega \max(D_{max} + d_{th} - d_p,\ 0) \qquad (5)$$

$$k_a = \omega \min(d_p - d_{th},\ D_{max}) \qquad (6)$$

where ω is the proportionality coefficient between distance and speed.
5. The method of claim 1, wherein the calculating the attitude angle of the onboard sensor of the unmanned aerial vehicle using the unmanned aerial vehicle coordinate system based on the track center line to realize navigation control of the unmanned aerial vehicle comprises:
controlling the shooting reference point to remain at the center of the image by changing the pitch angle, yaw angle and roll angle of the onboard camera of the unmanned aerial vehicle, wherein the pitch angle of the onboard camera is a fixed value β = -π/2 so that the normal of the control point passes through the origin; with the flight height of the unmanned aerial vehicle h_p, the control point (x, y) and its tangential direction (n_x, n_y), the target yaw angle α and roll angle γ of the onboard sensor are calculated by formulas (7) and (8):

$$\alpha = \arctan\left(\frac{n_y}{n_x}\right) \qquad (7)$$

$$\gamma = \arctan\left(\frac{\sqrt{x^2 + y^2}}{h_p}\right) \qquad (8)$$
CN202211662356.2A 2022-12-23 2022-12-23 Unmanned aerial vehicle autonomous line navigation method based on computer vision Pending CN116007631A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211662356.2A CN116007631A (en) 2022-12-23 2022-12-23 Unmanned aerial vehicle autonomous line navigation method based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211662356.2A CN116007631A (en) 2022-12-23 2022-12-23 Unmanned aerial vehicle autonomous line navigation method based on computer vision

Publications (1)

Publication Number Publication Date
CN116007631A true CN116007631A (en) 2023-04-25

Family

ID=86034774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211662356.2A Pending CN116007631A (en) 2022-12-23 2022-12-23 Unmanned aerial vehicle autonomous line navigation method based on computer vision

Country Status (1)

Country Link
CN (1) CN116007631A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114442665A (en) * 2022-01-20 2022-05-06 北京华能新锐控制技术有限公司 Wind power blade inspection line planning method based on unmanned aerial vehicle
CN114442665B (en) * 2022-01-20 2023-12-08 北京华能新锐控制技术有限公司 Wind power blade inspection line planning method based on unmanned aerial vehicle

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination