CN116661470A - Unmanned aerial vehicle pose estimation method based on binocular vision guided landing - Google Patents

Unmanned aerial vehicle pose estimation method based on binocular vision guided landing

Info

Publication number
CN116661470A
Authority
CN
China
Prior art keywords
landing
aerial vehicle
unmanned aerial
landmark
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310398944.8A
Other languages
Chinese (zh)
Inventor
肖励
唐瑞卿
任杰
李静洪
李娟
杨晨
马越峤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Aircraft Industrial Group Co Ltd
Original Assignee
Chengdu Aircraft Industrial Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Aircraft Industrial Group Co Ltd filed Critical Chengdu Aircraft Industrial Group Co Ltd
Priority to CN202310398944.8A
Publication of CN116661470A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/15Correlation function computation including computation of convolution operations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/04Control of altitude or depth
    • G05D1/06Rate of change of altitude or depth
    • G05D1/0607Rate of change of altitude or depth specially adapted for aircraft
    • G05D1/0653Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing
    • G05D1/0676Rate of change of altitude or depth specially adapted for aircraft during a phase of take-off or landing specially adapted for landing
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Optimization (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Algebra (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)

Abstract

The application belongs to the technical field of aviation navigation and guidance, and specifically relates to an unmanned aerial vehicle pose estimation method based on binocular-vision-guided landing, comprising the following steps: step one, calculating the distance to be flown, where the runway uses horizontally placed landing landmarks, the runway end uses a vertically placed landing landmark, and QR-code landing landmarks serve as the runway landmarks; step two, calculating the relative height; step three, the vertical speed; step four, the lateral offset distance; and step five, the attitude estimate. The QR-code landmarks simplify the landmark structure, speed up visual processing, and differ clearly in appearance from the runway background, strengthening detection robustness and enabling rapid identification and localization of the unmanned aerial vehicle's landing point. During the landing glide, using the two landmark placements (horizontal and vertical), a machine-vision-based autonomous landing navigation method is developed and the landing parameters are estimated, effectively preventing dangers during landing and achieving a safe landing of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle pose estimation method based on binocular vision guided landing
Technical Field
The application belongs to the technical field of aviation navigation and guidance, and particularly relates to an unmanned aerial vehicle pose estimation method based on binocular-vision-guided landing.
Background
In recent years, unmanned aerial vehicles have been a hot research topic. GPS, as a global positioning system, offers high navigation accuracy and no time-accumulated error and is widely used in unmanned aerial vehicle systems, but it is susceptible to electromagnetic interference and carries the risk of losing positioning capability through jamming or loss of satellite signal. An unmanned aerial vehicle is especially vulnerable to such interference during the landing phase, so at low speed and low altitude it may be unable to obtain accurate self-motion information such as altitude and velocity; the accident rate of unmanned aerial vehicles during takeoff and landing is higher than in other flight phases. Autonomous safe landing in a GPS-denied environment is therefore of real significance.
With the rapid development of machine vision technology, visual navigation has become a research hot spot, with binocular vision navigation being representative. Compared with traditional navigation, a binocular sensor is cheap to manufacture, small in volume, hard to detect, strongly interference-resistant, and rich in captured information, and is widely used in military applications. Since the start of the 21st century, much theoretical and experimental work on visual landing of unmanned aerial vehicles has been carried out at home and abroad, but autonomous landing of fixed-wing unmanned aerial vehicles remains difficult and accident-prone, so navigation during the landing phase has long been a key research topic in the unmanned aerial vehicle field.
Disclosure of Invention
In order to solve the problem of autonomous safe landing of an unmanned aerial vehicle in a GPS-denied environment in the prior art, the application provides an unmanned aerial vehicle pose estimation method based on binocular-vision-guided landing.
In order to achieve the technical effects, the application is realized by the following technical scheme:
a binocular vision guided landing-based unmanned aerial vehicle pose estimation method comprises the following steps:
Step one, calculating the distance to be flown: the distance from the unmanned aerial vehicle to the landing point during the glide is the distance to be flown. Two landmark placements assist the landing: the runway uses horizontally placed landing landmarks, the runway end uses a vertically placed landing landmark, and QR-code landing landmarks serve as the runway landmarks;
Further, for the landing landmarks laid flat in step one: when the landing landmarks are placed horizontally on the runway, the relationship between the landmarks and the camera field of view covers two cases, one with the landing landmarks on the same side of the unmanned aerial vehicle and one with the landing landmarks on opposite sides; the distance to be flown is calculated in the same way in both cases.
Still further, when the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and B, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, and β is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
Still further, when the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and C, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
Step two, calculating the relative height: the vertical distance from the unmanned aerial vehicle to the horizontal plane of the landing point during the glide is the relative height;
Further, with the landing landmark placed vertically:
where α is the glide pitch angle, MO is the distance to be flown, and PO is the relative height.
Further, with the landing landmarks laid flat: when the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the relative height is calculated;
where D is the distance between landing landmarks A and B, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle, and PO is the relative height.
Further, with the landing landmarks laid flat: when the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the relative height is calculated;
where D is the distance between landing landmarks A and C, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; PO is the relative height.
Step three, calculating the vertical speed: the change in the unmanned aerial vehicle's height per unit time during the glide is the vertical speed;
Further, the specific vertical-speed calculation of step three is:
d_diff = d_pre - d_cur
where d_cur is the distance from the camera to the landmark in the current frame, d_pre is the distance from the camera to the landmark in the previous frame, v_curr is the speed estimated for the current frame, I is the current frame rate, i is the current frame number, θ is the pitch angle, and v_l is the vertical speed.
Step four, calculating the lateral offset distance: the lateral offset is the distance between the aircraft's actual track and its planned track, i.e., the distances by which the image center deviates from the left and right runway lines are the left and right lateral offsets;
Further, step four is specifically calculated as follows:
where D is the actual distance between the runway lines, D1 is the corresponding pixel distance, L1 is the pixel distance from the image center point to the straight line L, R1 is the pixel distance from the image center point to the straight line R, LL is the left offset, and RR is the right offset.
Step five, attitude estimation: the attitude of the unmanned aerial vehicle is estimated from its three-dimensional position relative to the landing landmark, converting target points from the world coordinate system into the camera coordinate system using the binocular-vision coordinate relationships.
Further, step five specifically includes:
where R is the rotation matrix from the world coordinate system to the camera coordinate system, and (θ, φ, ψ) are the pitch, roll, and yaw angles of the unmanned aerial vehicle.
The application has the advantages that:
1. This application takes autonomous landing of an unmanned aerial vehicle as its application background and develops machine-vision algorithms for the distance to be flown, relative height, vertical speed, lateral offset distance, and attitude estimate. The QR-code landmark simplifies the landmark structure, speeds up visual processing, and differs clearly from the runway background, strengthening detection robustness and enabling rapid identification and localization of the landing point. During the landing glide, using the two landmark placements (horizontal and vertical), a machine-vision-based autonomous landing navigation method is developed and the landing parameters are estimated, effectively preventing dangers during landing and achieving a safe landing of the unmanned aerial vehicle.
2. In a GPS-denied environment, binocular-vision landing navigation effectively improves the robustness of the unmanned aerial vehicle system;
3. Developing machine-vision algorithms for the unmanned aerial vehicle's distance to be flown, relative height, vertical speed, lateral offset distance, and attitude estimate solves for the landing information and provides a guarantee for autonomous safe landing;
4. The landing-landmark design using two QR codes assists the landing; the QR codes are simple in structure, rich in information, easily recognized in color, and convenient to detect.
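The QR-code landmarks above must first be detected in each camera frame. The patent does not give detection code; the following is a minimal sketch using OpenCV's built-in QR detector, offered as an assumed stand-in for whatever detector the claimed method uses (the file name is likewise illustrative).

```python
import cv2

# Minimal sketch: locate a QR-code landing landmark in one camera frame.
# Assumption: OpenCV's QRCodeDetector stands in for the (unspecified)
# detector of the patent; "runway_frame.png" is a hypothetical input.
detector = cv2.QRCodeDetector()

frame = cv2.imread("runway_frame.png")
data, corners, _ = detector.detectAndDecode(frame)

if corners is not None:
    # corners holds the four landmark corner points in pixel coordinates;
    # their centroid gives a usable image position for the landmark.
    center = corners.reshape(-1, 2).mean(axis=0)
    print(f"landmark '{data}' at pixel {center}")
```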
Drawings
Fig. 1 is a QR code landing landmark of the present application.
Fig. 2 is triangle schematic 1 of the glide process with horizontally placed landing landmarks of the present application.
Fig. 3 is triangle schematic 2 of the glide process with horizontally placed landing landmarks of the present application.
Fig. 4 is a triangle schematic of the glide process with the vertically placed landing landmark of the present application.
Fig. 5 is a binary map of the runway and the QR-code runway landmarks.
Fig. 6 is a schematic view of the vertical speed of the present application.
Fig. 7 is a velocity resolution schematic of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions provided by the present application, the present application will be further described below with reference to the accompanying drawings and examples. It should be noted that the embodiments provided are only some, not all, embodiments of the present application and therefore should not be construed as limiting its protection scope. All other embodiments obtained by a person of ordinary skill in the art without creative effort, based on the embodiments of the present application, fall within the protection scope of the present application.
Example 1
A binocular vision guided landing-based unmanned aerial vehicle pose estimation method comprises the following steps:
Step one, calculating the distance to be flown: the distance from the unmanned aerial vehicle to the landing point during the glide is the distance to be flown. Two landmark placements assist the landing: the runway uses horizontally placed landing landmarks, the runway end uses a vertically placed landing landmark, and QR-code landing landmarks serve as the runway landmarks;
Further, for the landing landmarks laid flat in step one: when the landing landmarks are placed horizontally on the runway, the relationship between the landmarks and the camera field of view covers two cases, one with the landing landmarks on the same side of the unmanned aerial vehicle and one with the landing landmarks on opposite sides; the distance to be flown is calculated in the same way in both cases.
Still further, when the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and B, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, and β is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
Still further, when the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and C, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
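The landmark ranges D_A, D_B and D_C above come from binocular ranging, which the patent does not spell out. As a hedged sketch under standard assumptions (a calibrated, rectified stereo pair with focal length f in pixels and baseline b in meters), depth follows Z = f·b/d for disparity d in pixels; the function name and numbers below are illustrative.

```python
def stereo_range(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a landmark point from a rectified stereo pair: Z = f * b / d.

    Sketch only: assumes the landmark center has been matched in the left
    and right images and the cameras are calibrated and rectified.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return f_px * baseline_m / disparity_px

# Example: f = 1200 px, baseline = 0.30 m, disparity = 4.5 px -> Z = 80 m
print(stereo_range(1200.0, 0.30, 4.5))
```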
Step two, calculating the relative height: the vertical distance from the unmanned aerial vehicle to the horizontal plane of the landing point during the glide is the relative height;
Further, with the landing landmark placed vertically:
where α is the glide pitch angle, MO is the distance to be flown, and PO is the relative height.
Further, with the landing landmarks laid flat: when the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the relative height is calculated;
where D is the distance between landing landmarks A and B, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle, and PO is the relative height.
Further, with the landing landmarks laid flat: when the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the relative height is calculated;
where D is the distance between landing landmarks A and C, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; PO is the relative height.
Step three, calculating the vertical speed: the change in the unmanned aerial vehicle's height per unit time during the glide is the vertical speed;
Further, the specific vertical-speed calculation of step three is:
d_diff = d_pre - d_cur
where d_cur is the distance from the camera to the landmark in the current frame, d_pre is the distance from the camera to the landmark in the previous frame, v_curr is the speed estimated for the current frame, I is the current frame rate, i is the current frame number, θ is the pitch angle, and v_l is the vertical speed.
Step four, calculating the lateral offset distance: the lateral offset is the distance between the aircraft's actual track and its planned track, i.e., the distances by which the image center deviates from the left and right runway lines are the left and right lateral offsets;
Further, step four is specifically calculated as follows:
where D is the actual distance between the runway lines, D1 is the corresponding pixel distance, L1 is the pixel distance from the image center point to the straight line L, R1 is the pixel distance from the image center point to the straight line R, LL is the left offset, and RR is the right offset.
Step five, attitude estimation: the attitude of the unmanned aerial vehicle is estimated from its three-dimensional position relative to the landing landmark, converting target points from the world coordinate system into the camera coordinate system using the binocular-vision coordinate relationships.
Further, step five specifically includes:
where R is the rotation matrix from the world coordinate system to the camera coordinate system, and (θ, φ, ψ) are the pitch, roll, and yaw angles of the unmanned aerial vehicle.
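The source shows the step-five formula only as a figure. The sketch below extracts (θ, φ, ψ) from R under an assumed Z-Y-X (yaw-pitch-roll) rotation order; the patent does not state its rotation convention, so this is one plausible reading rather than the claimed formula.

```python
import numpy as np

def euler_from_rotation(R: np.ndarray) -> tuple[float, float, float]:
    """Extract (pitch, roll, yaw) in radians from a world-to-camera rotation
    matrix, assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll). The rotation order
    is an assumption; the patent does not state its convention.
    """
    pitch = np.arcsin(-R[2, 0])            # theta
    roll = np.arctan2(R[2, 1], R[2, 2])    # phi
    yaw = np.arctan2(R[1, 0], R[0, 0])     # psi
    return pitch, roll, yaw

# Identity rotation -> all three angles are zero.
print(euler_from_rotation(np.eye(3)))
```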
As shown in fig. 2, with the landing landmarks placed horizontally, assume the straight line through landmarks M, A, B is the runway centerline. After the unmanned aerial vehicle flies between two adjacent landmarks on the B side, with the inter-landmark distances MA and AB known and the measured distances from the unmanned aerial vehicle to the two landmarks being D_A and D_B respectively, the distance to be flown MO and the relative height PO are calculated by the triangle law of cosines.
As shown in fig. 3, with the landing landmarks placed horizontally, assume the straight line through landmarks M, A, C is the runway centerline. After the unmanned aerial vehicle flies between two adjacent landmarks on the C side, with the distance MA between landmarks M and A known and the measured distances from the unmanned aerial vehicle to landmarks A and C being D_A and D_C respectively, the distance to be flown MO and the relative height PO are calculated by the triangle law of cosines.
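The law-of-cosines equations for Figs. 2 and 3 appear only as figures in the source. The sketch below is one consistent reconstruction under stated assumptions (landmarks M, A, B in order along the centerline, the unmanned aerial vehicle's ground projection O between A and B); it is illustrative, not the patent's own formula, and the same routine applies to the A, C case of Fig. 3 with D_C in place of D_B.

```python
import math

def glide_geometry(ma: float, ab: float, d_a: float, d_b: float) -> tuple[float, float]:
    """Reconstructed Fig. 2 geometry: landmarks M, A, B in order along the
    runway centerline, UAV P between A and B, O the projection of P onto
    the centerline plane. Returns (MO, PO). Assumption-laden sketch; the
    patent's own equations are not reproduced in the source text.
    """
    # Angle at landmark A between the centerline (towards B) and sight line AP.
    cos_beta = (ab**2 + d_a**2 - d_b**2) / (2.0 * ab * d_a)
    beta = math.acos(max(-1.0, min(1.0, cos_beta)))
    ao = d_a * math.cos(beta)   # along-track offset of O beyond A
    po = d_a * math.sin(beta)   # relative height PO
    mo = ma + ao                # distance to be flown MO
    return mo, po

# Example: MA = 50 m, AB = 50 m, D_A = 45 m, D_B = 35 m.
print(glide_geometry(50.0, 50.0, 45.0, 35.0))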
As shown in fig. 4, with the vertically placed landing landmark, the relative height PO is calculated by the triangle tangent formula from the unmanned aerial vehicle's glide pitch angle α and the distance to be flown MO.
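The Fig. 4 tangent relation, written out as reconstructed from this description (PO = MO·tan α); the formula is shown only as a figure in the source, and the numeric example is illustrative.

```python
import math

def relative_height(mo: float, alpha_rad: float) -> float:
    """Fig. 4 relation as described: relative height PO from the glide
    pitch angle alpha and the distance to be flown MO (reconstruction)."""
    return mo * math.tan(alpha_rad)

print(relative_height(200.0, math.radians(4.0)))   # ~14 m at a 4 deg glide
```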
As shown in fig. 5, assume the actual distance between the runway lines is D and the corresponding pixel distance is D1. Using the pinhole imaging principle together with the pixel distances L1 and R1 from the image center point to runway line L and runway line R in the image plane, the left offset LL and the right offset RR are calculated.
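A sketch of the Fig. 5 pinhole scaling as described: D/D1 gives the meters-per-pixel scale on the runway plane, converting the pixel offsets L1 and R1 into the metric offsets LL and RR. The explicit formula is a figure in the source, so this linear scaling is a reconstruction.

```python
def lateral_offsets(d_m: float, d1_px: float, l1_px: float, r1_px: float) -> tuple[float, float]:
    """Scale pixel offsets to meters with the known runway-line spacing.
    Reconstruction from the Fig. 5 description, not the patent's formula.
    """
    scale = d_m / d1_px   # meters per pixel on the runway plane
    ll = l1_px * scale    # left offset LL
    rr = r1_px * scale    # right offset RR
    return ll, rr

# Runway lines 30 m apart, 600 px apart in the image; image center is
# 250 px from line L and 350 px from line R -> offsets (12.5, 17.5) m.
print(lateral_offsets(30.0, 600.0, 250.0, 350.0))
```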
As shown in figs. 6 and 7, the vertical speed is estimated from the glide distance covered between two frames and the corresponding image time. To eliminate singular values caused by the distance and time measurements, a multi-frame average is used to obtain the mean speed of the glide and reduce the error between speed estimates; the specific smoothing is a smoothing filter over the speeds of the previous five frames.
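A sketch of the step-three estimate with the five-frame smoothing just described. Taking the vertical component with the sine of the pitch angle is an assumption; the source's exact symbols are garbled, so the names below follow the definitions given earlier.

```python
from collections import deque
import math

class VerticalSpeedEstimator:
    """Per-frame slant-range change times frame rate gives a speed along
    the line of sight; its vertical component is taken with the pitch
    angle (assumed sin projection), and the last five per-frame speeds
    are averaged, matching the smoothing described for Figs. 6 and 7.
    """
    def __init__(self, frame_rate_hz: float, window: int = 5):
        self.f = frame_rate_hz
        self.speeds = deque(maxlen=window)
        self.d_pre = None

    def update(self, d_cur: float, pitch_rad: float):
        if self.d_pre is not None:
            d_diff = self.d_pre - d_cur     # range closed this frame
            v_curr = d_diff * self.f        # speed along the sight line
            self.speeds.append(v_curr * math.sin(pitch_rad))
        self.d_pre = d_cur
        return sum(self.speeds) / len(self.speeds) if self.speeds else None
```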
The foregoing is a further detailed description of the application in connection with specific preferred embodiments, and the application is not to be considered limited to these descriptions. Other embodiments that are apparent to those skilled in the art without departing from the technical scope of the application shall also fall within its protection scope.

Claims (10)

1. A binocular vision guided landing-based unmanned aerial vehicle pose estimation method, characterized by comprising the following steps:
Step one, calculating the distance to be flown: the distance from the unmanned aerial vehicle to the landing point during the glide is the distance to be flown. Two landmark placements assist the landing: the runway uses horizontally placed landing landmarks, the runway end uses a vertically placed landing landmark, and QR-code landing landmarks serve as the runway landmarks;
Step two, calculating the relative height: the vertical distance from the unmanned aerial vehicle to the horizontal plane of the landing point during the glide is the relative height;
Step three, calculating the vertical speed: the change in the unmanned aerial vehicle's height per unit time during the glide is the vertical speed;
Step four, calculating the lateral offset distance: the lateral offset is the distance between the aircraft's actual track and its planned track, i.e., the distances by which the image center deviates from the left and right runway lines are the left and right lateral offsets;
Step five, attitude estimation: the attitude of the unmanned aerial vehicle is estimated from its three-dimensional position relative to the landing landmark, converting target points from the world coordinate system into the camera coordinate system using the binocular-vision coordinate relationships.
2. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 1, wherein the method comprises the following steps:
the land landmark laid flat in the first step: when the landing landmark is horizontally placed on the runway, the relationship between the landmark and the camera in the visual field comprises two conditions, wherein one is that the landing landmark is on the same side of the unmanned aerial vehicle, and the other is that the landing landmark is on the different side of the unmanned aerial vehicle; the stand-by distance calculation of the landing landmark is same as that of the landing landmark.
3. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing according to claim 2, wherein the method is characterized by comprising the following steps of:
When the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and B, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, and β is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
4. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing according to claim 2, wherein the method is characterized by comprising the following steps of:
When the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the distance to be flown is calculated:
where D is the distance between landing landmarks A and C, M is the distance between landing landmarks A and M, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; the distance to be flown is MO.
5. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing according to claim 2, wherein the method is characterized by comprising the following steps of:
the calculation method of the second step is specifically that,
vertically placing landing landmarks:
wherein alpha is a roll-down pitch angle, MO is a distance to be flown, and PO is a relative height.
6. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 5, wherein the method comprises the following steps of:
With the landing landmarks laid flat: when the landing landmarks are on the same side of the unmanned aerial vehicle, taking landing landmarks on the left of the unmanned aerial vehicle as the example, the relative height is calculated;
where D is the distance between landing landmarks A and B, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_B is the distance from the unmanned aerial vehicle to landing landmark B, γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle, and PO is the relative height.
7. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 5, wherein the method comprises the following steps of:
With the landing landmarks laid flat: when the landing landmarks are on opposite sides of the unmanned aerial vehicle, taking the unmanned aerial vehicle between two adjacent landing landmarks as the example, the relative height is calculated;
where D is the distance between landing landmarks A and C, D_A is the distance from the unmanned aerial vehicle to landing landmark A, D_C is the distance from the unmanned aerial vehicle to landing landmark C, and γ is the included angle at landing landmark A between the runway line and the line of sight to the unmanned aerial vehicle; PO is the relative height.
8. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 5, wherein the method comprises the following steps of:
The specific vertical-speed calculation of step three is:
d_diff = d_pre - d_cur
where d_cur is the distance from the camera to the landmark in the current frame, d_pre is the distance from the camera to the landmark in the previous frame, v_curr is the speed estimated for the current frame, I is the current frame rate, i is the current frame number, θ is the pitch angle, and v_l is the vertical speed.
9. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 8, wherein the method comprises the following steps:
Step four is specifically calculated as follows:
where D is the actual distance between the runway lines, D1 is the corresponding pixel distance, L1 is the pixel distance from the image center point to the straight line L, R1 is the pixel distance from the image center point to the straight line R, LL is the left offset, and RR is the right offset.
10. The unmanned aerial vehicle pose estimation method based on binocular vision guided landing of claim 9, wherein step five specifically includes:
where R is the rotation matrix from the world coordinate system to the camera coordinate system, and (θ, φ, ψ) are the pitch, roll, and yaw angles of the unmanned aerial vehicle.
CN202310398944.8A 2023-04-14 2023-04-14 Unmanned aerial vehicle pose estimation method based on binocular vision guided landing Pending CN116661470A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310398944.8A CN116661470A (en) 2023-04-14 2023-04-14 Unmanned aerial vehicle pose estimation method based on binocular vision guided landing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310398944.8A CN116661470A (en) 2023-04-14 2023-04-14 Unmanned aerial vehicle pose estimation method based on binocular vision guided landing

Publications (1)

Publication Number Publication Date
CN116661470A 2023-08-29

Family

ID=87708685

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310398944.8A Pending CN116661470A (en) 2023-04-14 2023-04-14 Unmanned aerial vehicle pose estimation method based on binocular vision guided landing

Country Status (1)

Country Link
CN (1) CN116661470A (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101256412A (en) * 2008-03-31 2008-09-03 北京航空航天大学 Automatic homing control method for accident parking of unmanned vehicle engine
US20160086497A1 (en) * 2013-04-16 2016-03-24 Bae Systems Australia Limited Landing site tracker
CN108122255A (en) * 2017-12-20 2018-06-05 哈尔滨工业大学 It is a kind of based on trapezoidal with circular combination terrestrial reference UAV position and orientation method of estimation
CN110045741A (en) * 2019-04-13 2019-07-23 成都飞机工业(集团)有限责任公司 The integrated navigation system of safety guidance unmanned vehicle glide landing
CN112686149A (en) * 2020-12-29 2021-04-20 中国航天空气动力技术研究院 Vision-based autonomous landing method for near-field section of fixed-wing unmanned aerial vehicle

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ZHANG QI et al.: "The positioning system use for autonomous navigation of unmanned aerial vehicles", 2021 33rd Chinese Control and Decision Conference, 30 November 2021 *
ZHOU LI et al.: "UAV Autonomous Landing Technology Based on AprilTags Vision Positioning Algorithm", 2019 Chinese Control Conference, 17 October 2019 *
JIANG YU: "Research on autonomous tracking and landing control of a quadrotor UAV based on visual navigation" (in Chinese), China Masters' Theses Full-text Database, Engineering Science and Technology II, 15 January 2023 *
WANG YI: "Design of a target tracking algorithm for vision-assisted autonomous UAV landing" (in Chinese), Science and Technology Innovation Herald, 1 August 2017 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination