CN113741534A - Unmanned aerial vehicle vision and positioning double-guidance landing method - Google Patents
- Publication number
- CN113741534A CN113741534A CN202111089256.0A CN202111089256A CN113741534A CN 113741534 A CN113741534 A CN 113741534A CN 202111089256 A CN202111089256 A CN 202111089256A CN 113741534 A CN113741534 A CN 113741534A
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- aerial vehicle
- landing
- vehicle
- airborne
- Prior art date
- Legal status
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a vision and positioning double-guidance landing method for an unmanned aerial vehicle, belonging to the field of autonomous control of unmanned systems. When an auxiliary positioning mode is available, the unmanned aerial vehicle is guided directly toward the landing target until the target falls within the detection range of the airborne visual sensor; the unmanned aerial vehicle then lands under visual guidance until it is 5-10 cm above the landing target, and strong magnetic attraction completes the landing. When no auxiliary positioning mode is available, the method first judges whether the landing target is within the detection range of the airborne visual sensor. If it is not, the three-light combined sensor on the ground landing vehicle searches for the unmanned aerial vehicle, assists in positioning it, and provides the relative distance and bearing between the unmanned aerial vehicle and the landing target, guiding the unmanned aerial vehicle toward the target until the target falls within the detection range of the airborne visual sensor; the unmanned aerial vehicle then lands under visual guidance until it is 5-10 cm above the landing target, and strong magnetic attraction completes the landing.
Description
Technical Field
The invention belongs to the field of unmanned system autonomous control, and particularly relates to a vision and positioning dual-guidance landing method for an unmanned aerial vehicle.
Background
Because unmanned aerial vehicles keep personnel out of harm's way in hazardous environments, they are widely used across many fields and can replace conventional aircraft for patrol and reconnaissance in dangerous areas. With the continued development of related technologies such as microelectronics and digital communication, the cost of unmanned aerial vehicles has fallen steadily, and they are now able to adapt to complex flight environments. At the same time, thanks to their simple structure, small size, agility and high safety, unmanned aerial vehicles can operate in more environments than conventional aircraft.
Although drones have developed rapidly, autonomous landing remains one of their most dangerous phases. Conventional autonomous landing systems rely mainly on visual guidance: when the landing target is outside the detection range of the airborne visual sensor, a navigation system guides the drone until the relative distance between the drone and the landing vehicle satisfies the conditions for vision-guided landing. The common navigation systems are inertial navigation and global navigation satellite systems. However, inertial navigation integrates the drone's position and velocity, and this integration accumulates error over time, making the navigation information inaccurate, while a global navigation satellite system can fail when satellite signals are blocked. A ground vision detection module on the landing vehicle is therefore needed to increase the likelihood that the drone and the landing target find each other. Moreover, existing autonomous landing schemes do not consider a pairing process between the drone and the landing vehicle, so a drone could land on a landing vehicle that is not its own, compromising system safety.
Disclosure of Invention
In view of the above, and aiming at the problem of guided landing of unmanned aerial vehicles in complex scenes, the invention provides a vision and positioning double-guidance landing method that improves the reliability of guided landing in such scenes.
In order to achieve the purpose, the invention adopts the technical scheme that:
a vision and positioning double-guidance landing method for an unmanned aerial vehicle comprises the following steps:
step 1, building an unmanned aerial vehicle vision and positioning double-guidance landing system; the landing system comprises an unmanned aerial vehicle and a landing vehicle, wherein the unmanned aerial vehicle carries an airborne information processor, an airborne three-light combined sensor, an airborne gimbal, a flight control computer, an airborne communication terminal and an airborne auxiliary positioning module, and the landing vehicle carries a landing target, a vehicle-mounted gimbal, a vehicle-mounted three-light combined sensor, a vehicle-mounted information processor, a ground communication terminal and a vehicle-mounted auxiliary positioning module; each three-light combined sensor comprises a visible light module, an infrared module and a laser module; the auxiliary positioning modules comprise Beidou or GPS positioning modules; the landing target is a two-dimensional code;
step 2, judging whether system auxiliary positioning is available, if the system auxiliary positioning is available, turning to step 3, and if the system auxiliary positioning is unavailable, turning to step 4;
step 3, the airborne information processor, using the auxiliary positioning means, guides the unmanned aerial vehicle through the flight control computer until the landing target is located within the detection range of the airborne three-light combined sensor, then turning to step 7;
step 4, the airborne information processor judges whether the landing target is located within the detection range of the airborne three-light combined sensor; if so, turning to step 7, and if not, turning to step 5;
step 5, the vehicle-mounted information processor searches for the unmanned aerial vehicle with the vehicle-mounted three-light combined sensor until the unmanned aerial vehicle is found;
step 6, the vehicle-mounted information processor guides the landing vehicle until the landing target is located within the detection range of the airborne three-light combined sensor;
step 7, pairing the unmanned aerial vehicle with the landing vehicle through the airborne communication terminal and the ground communication terminal; if pairing fails, the task ends, and if pairing succeeds, guiding the unmanned aerial vehicle by vision until it is 5-10 cm above the landing target;
and step 8, attaching the unmanned aerial vehicle by strong magnetic attraction to complete the landing.
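The branching logic of steps 2-8 can be sketched as a small decision function; the function and phase names below are illustrative assumptions, not terms from the patent:

```python
def landing_phases(aux_positioning_available: bool, target_in_view: bool) -> list:
    """Return the ordered guidance phases for one landing attempt (steps 2-8)."""
    phases = []
    if aux_positioning_available:
        phases.append("positioning_guided_approach")   # step 3
    elif not target_in_view:
        phases.append("ground_visual_search")          # step 5
        phases.append("landing_vehicle_guidance")      # step 6
    # By this point the landing target is within the airborne sensor's view.
    phases.append("pairing")                           # step 7: distance cross-check
    phases.append("visual_descent_to_5_10_cm")         # step 7: vision-guided descent
    phases.append("magnetic_capture")                  # step 8
    return phases

print(landing_phases(False, False))
```

Note that the ground-search branch is taken only when both auxiliary positioning is unavailable and the target is not yet visible, mirroring steps 2 and 4.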
Further, the specific manner for determining whether system-assisted positioning is available in step 2 is as follows:
and the onboard information processor judges whether the onboard auxiliary positioning mode is available or not, the vehicle-mounted information processor judges whether the vehicle-mounted auxiliary positioning mode is available or not, if the onboard auxiliary positioning mode is available, the system auxiliary positioning mode is available, otherwise, the system auxiliary positioning mode is unavailable.
Further, in steps 3, 4 and 6, whether the landing target is located within the detection range of the airborne three-light combined sensor is judged as follows:
the unmanned aerial vehicle collects images with the airborne three-light combined sensor, and the visual identification system identifies the landing target in the images; if the landing target can be accurately identified, it is within the detection range of the airborne three-light combined sensor, otherwise it is outside that range.
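As a sketch of this judgment, assume a marker detector returns candidate detections, each with an identifier and a confidence score; the dictionary interface, the marker ID and the confidence threshold are all hypothetical:

```python
LANDING_MARKER_ID = 7   # assumed ID encoded in the two-dimensional code

def target_in_detection_range(detections, min_confidence=0.9):
    """'In range' means the landing marker is accurately identified in the image.

    `detections` is a list of dicts like {"id": int, "confidence": float},
    the assumed output of a fiducial-marker detector.
    """
    return any(d["id"] == LANDING_MARKER_ID and d["confidence"] >= min_confidence
               for d in detections)

print(target_in_detection_range([{"id": 7, "confidence": 0.95}]))   # True
print(target_in_detection_range([{"id": 3, "confidence": 0.99}]))   # False
```

Tying "in range" to a successful, high-confidence identification rather than a raw field-of-view test matches the patent's criterion that the target must be accurately recognizable.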
Further, the specific manner of step 3 is as follows:
and guiding the unmanned aerial vehicle according to the relative positioning relation between the unmanned aerial vehicle and the landing target until the landing target is positioned in the detection range of the airborne three-optical sensor.
Further, the specific manner of step 5 is as follows:
5a) the vehicle-mounted three-light combined sensor collects images for target identification of the unmanned aerial vehicle, using visible light under good illumination conditions and infrared under poor illumination conditions;
5b) after the unmanned aerial vehicle target is identified, the relative distance between the unmanned aerial vehicle and the landing vehicle is measured by laser ranging.
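Steps 5a) and 5b) could be sketched as follows; the lux threshold for switching modalities and the time-of-flight conversion are illustrative assumptions, not values from the patent:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def choose_imaging_modality(ambient_lux: float, threshold_lux: float = 50.0) -> str:
    """Step 5a: visible light in good illumination, infrared otherwise."""
    return "visible" if ambient_lux >= threshold_lux else "infrared"

def laser_range_m(round_trip_time_s: float) -> float:
    """Step 5b: time-of-flight laser ranging, distance = c * t / 2."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

print(choose_imaging_modality(200.0))    # visible
print(round(laser_range_m(2.0e-7), 2))   # 29.98 (metres)
```

The same two primitives serve the airborne side in step 7a), since both three-light combined sensors carry identical module sets.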
Further, the specific manner of step 6 is as follows:
the relative positioning relation between the unmanned aerial vehicle and the landing vehicle is judged from the position of the unmanned aerial vehicle target in the images collected by the vehicle-mounted three-light combined sensor; the landing vehicle is guided so that the unmanned aerial vehicle stays at the centre of the collected images and occupies an increasing proportion of the frame, until the landing target is located within the detection range of the airborne three-light combined sensor.
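A minimal sketch of this guidance rule, assuming the detector returns the UAV's bounding box in pixels (the area-ratio threshold is an assumption):

```python
def vehicle_guidance_command(bbox, frame_w, frame_h, min_area_ratio=0.05):
    """Steering hint from the UAV's position in the vehicle-mounted image.

    bbox = (x, y, w, h) in pixels. Returns the pixel offset of the UAV from
    the image centre and whether the vehicle should keep closing the gap
    (i.e. the UAV still occupies too small a proportion of the frame).
    """
    x, y, w, h = bbox
    dx = (x + w / 2.0) - frame_w / 2.0   # >0: UAV right of centre
    dy = (y + h / 2.0) - frame_h / 2.0   # >0: UAV below centre
    keep_closing = (w * h) / float(frame_w * frame_h) < min_area_ratio
    return dx, dy, keep_closing

print(vehicle_guidance_command((600, 200, 40, 40), 1280, 720))
```

Driving the offsets toward zero keeps the UAV centred, and the growing area ratio plays the role of the "increasing proportion of the frame" stop condition.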
Further, the specific mode of pairing the unmanned aerial vehicle and the landing vehicle through the airborne communication terminal and the ground communication terminal in the step 7 is as follows:
7a) the airborne information processor identifies the landing target with the airborne three-light combined sensor, and after the landing target is identified, laser ranging gives the relative distance from the unmanned aerial vehicle to the landing vehicle;
7b) the vehicle-mounted information processor identifies the unmanned aerial vehicle target with the vehicle-mounted three-light combined sensor, and after the unmanned aerial vehicle is identified, laser ranging gives the relative distance from the landing vehicle to the unmanned aerial vehicle;
7c) whether the two distances obtained in 7a) and 7b) are consistent is judged; if so, the unmanned aerial vehicle and the landing vehicle are paired successfully, and if not, pairing fails.
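The consistency check in 7c) can be sketched as a tolerance comparison of the two independently measured distances (the tolerance value is an assumption, since the patent only requires the distances to be "basically consistent"):

```python
def pairing_succeeds(uav_range_m: float, vehicle_range_m: float,
                     tol_m: float = 0.5) -> bool:
    """Step 7c: pair only if the two laser-ranged distances agree within tol_m.

    uav_range_m comes from the airborne laser (UAV -> landing target);
    vehicle_range_m from the vehicle-mounted laser (vehicle -> UAV).
    """
    return abs(uav_range_m - vehicle_range_m) <= tol_m

print(pairing_succeeds(12.3, 12.1))   # True: both sensors see the same pair
print(pairing_succeeds(12.3, 40.0))   # False: the vehicle is ranging a different UAV
```

Because a wrong vehicle would almost certainly measure a different distance to the UAV than the UAV measures to its own target, this cross-check is what prevents landing on a non-owner's vehicle.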
Compared with the prior art, the invention has the following advantages:
1. The invention adopts vision and positioning double guidance, which adapts to various scenes and completes the guided landing of the unmanned aerial vehicle.
2. The method adopts bidirectional search between the unmanned aerial vehicle and the landing vehicle, which increases the probability of the two finding each other in complex scenes and completes the guided landing of the unmanned aerial vehicle.
3. The invention pairs the unmanned aerial vehicle with the landing vehicle and allows guided landing only after successful pairing, which increases the safety of the system.
Drawings
FIG. 1 is a general flow diagram of a method according to an embodiment of the invention;
FIG. 2 is a hardware architecture diagram of an embodiment of the present invention;
FIG. 3 is a software architecture diagram of an embodiment of the present invention;
FIG. 4 is a schematic diagram of a two-dimensional code used in an embodiment of the invention;
fig. 5 is a schematic view of an application scenario of the method according to the embodiment of the present invention.
Detailed Description
The following describes the steps and effects of the present invention in detail with reference to the accompanying drawings.
A vision and positioning double-guidance landing method for an unmanned aerial vehicle comprises the following steps:
step 1, building an unmanned aerial vehicle vision and positioning double-guidance landing system;
step 2, judging whether the unmanned aerial vehicle vision and positioning double-guidance system currently has auxiliary positioning; if so, turning to step 3, and if not, turning to step 4;
step 3, guiding the unmanned aerial vehicle by the auxiliary positioning means until the landing target is located within the detection range of the airborne three-light combined sensor, then turning to step 7;
step 4, judging whether the landing target is located within the detection range of the airborne three-light combined sensor; if so, turning to step 7, and if not, continuing with step 5;
step 5, the ground landing vehicle searching for the unmanned aerial vehicle with the vehicle-mounted three-light combined sensor until the unmanned aerial vehicle is found;
step 6, guiding the landing vehicle until the landing target is located within the detection range of the airborne three-light combined sensor;
step 7, pairing the unmanned aerial vehicle with the landing vehicle; if pairing fails, the task ends, and if pairing succeeds, guiding the unmanned aerial vehicle by vision until it is located 5-10 cm above the landing target;
and step 8, attaching the unmanned aerial vehicle by strong magnetic attraction to complete the landing.
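The vision-guided descent of step 7 could be sketched as a simple proportional controller that stops inside the 5-10 cm band where magnetic capture takes over; the gain, the band handling and the function names are assumptions:

```python
def descent_command(height_m: float, k_z: float = 0.5,
                    stop_band_m=(0.05, 0.10)) -> float:
    """Vertical speed setpoint (m/s, negative = descend) for the final approach.

    Descent ends once the UAV is within 5-10 cm of the landing target,
    where strong magnetic attraction completes the landing (step 8).
    """
    low, high = stop_band_m
    if height_m <= high:
        return 0.0                       # hold: hand over to magnetic capture
    return -k_z * (height_m - low)       # descend faster when higher up

print(round(descent_command(2.05), 3))   # -1.0
print(descent_command(0.08))             # 0.0
```

A real implementation would also centre the marker laterally at each step; only the vertical channel is shown here.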
Wherein, step 1 is described in detail as follows:
1a) the unmanned aerial vehicle carries a Beidou/GPS positioning module and a three-light combined sensor, the three-light combined sensor comprising a visible light module, an infrared module and a laser module;
1b) the ground landing vehicle carries a landing target and a three-light combined sensor; the landing target mark is a two-dimensional code, and the three-light combined sensor comprises a visible light module, an infrared module and a laser module.
Wherein the auxiliary positioning modes of step 2 include Beidou and GPS.
The judgment in steps 3, 4 and 6 that the landing target is located within the detection range of the airborne three-light combined sensor is made as follows:
the unmanned aerial vehicle's airborne visual intelligent recognition system is used; if the landing target can be accurately recognized, the landing target is within the detection range of the airborne three-light combined sensor, otherwise it is outside that range.
In step 3, the unmanned aerial vehicle is guided by the auxiliary positioning means as follows:
the unmanned aerial vehicle is guided according to the positioning relation between the unmanned aerial vehicle and the landing target until the landing target is located within the detection range of the airborne three-light combined sensor.
The specific operation mode of the step 5 is as follows:
5a) the vision module of the vehicle-mounted three-light combined sensor performs target identification of the unmanned aerial vehicle, using visible light under good illumination conditions and infrared under poor illumination conditions;
5b) after the unmanned aerial vehicle target is identified, the relative distance between the unmanned aerial vehicle and the landing vehicle is measured by laser ranging.
In step 6, the landing vehicle is guided until the landing target is located within the detection range of the airborne three-light combined sensor as follows:
the relative positioning relation between the unmanned aerial vehicle and the landing vehicle is judged from the visible-light (or infrared) image of the unmanned aerial vehicle formed by the vehicle-mounted three-light combined sensor; the landing vehicle is guided so that the unmanned aerial vehicle stays at the centre of the visible-light (or infrared) image and occupies an increasing proportion of the frame, until the landing target is located within the detection range of the airborne three-light combined sensor.
The unmanned aerial vehicle and the landing vehicle are paired in step 7 as follows:
7a) the landing target is identified with the airborne three-light combined sensor, and after the landing target is identified, laser ranging gives the relative distance from the unmanned aerial vehicle to the landing vehicle;
7b) the unmanned aerial vehicle target is identified with the vision module of the vehicle-mounted three-light combined sensor, and after the unmanned aerial vehicle is identified, laser ranging gives the relative distance from the landing vehicle to the unmanned aerial vehicle;
7c) whether the two distances obtained in 7a) and 7b) are basically consistent is judged; if so, pairing succeeds, and if not, pairing fails.
The following is a more specific example:
referring to fig. 1, a method for unmanned aerial vehicle vision and positioning dual guidance landing includes the following steps:
step 1, building an unmanned aerial vehicle vision and positioning dual-guidance landing system, wherein a system hardware architecture is shown in fig. 2, a system software architecture is shown in fig. 3, and details are as follows:
1a) the unmanned aerial vehicle carries a Beidou/GPS positioning module and a three-light combined sensor, the three-light combined sensor comprising a visible light module, an infrared module and a laser module;
1b) the ground landing vehicle carries a landing target and a three-light combined sensor, as shown in fig. 4; the landing target mark is a nested two-dimensional code, and at night LED lamps can be used to enhance recognizability of the special marker; the three-light combined sensor contains visible light, infrared and laser modules.
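One common motivation for a nested marker is scale: the large outer code is detectable from afar, while the small inner code stays fully in view during the final approach. The patent does not state this rationale explicitly, so the sketch below, including its switch altitude, is an assumption:

```python
def marker_layer(height_m: float, switch_height_m: float = 2.0) -> str:
    """Select which layer of the nested two-dimensional code to track.

    Above the switch height the large outer layer is used; below it,
    the small inner layer remains inside the camera's field of view.
    """
    return "outer" if height_m > switch_height_m else "inner"

print(marker_layer(10.0))  # outer
print(marker_layer(0.5))   # inner
```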
Step 2, judge whether unmanned aerial vehicle vision and two bootstrap systems in location have assistance-localization real-time, include: beidou and GPS, if auxiliary positioning exists, turning to the step 3, and turning to the step 4 if auxiliary positioning does not exist;
step 3, using an auxiliary positioning means to guide the unmanned aerial vehicle until the landing target is located in the detection range of the airborne three-photosynthetic sensor, and turning to step 7, wherein the determination mode that the landing target is located in the detection range of the airborne three-photosynthetic sensor is as follows: the unmanned aerial vehicle airborne visual intelligent recognition system is used, if the landing target can be accurately recognized, the landing target is located in the detection range of the airborne three-optical combination sensor, otherwise, the landing target is located outside the detection range of the airborne three-optical combination sensor.
Step 4, judging whether the landing target is positioned in the detection range of the airborne three-photosynthetic sensor, if so, turning to step 7, and if not, continuing to execute step 5, wherein the judgment mode that the landing target is positioned in the detection range of the airborne three-photosynthetic sensor is as follows: the unmanned aerial vehicle airborne visual intelligent recognition system is used, if the landing target can be accurately recognized, the landing target is located in the detection range of the airborne three-optical combination sensor, otherwise, the landing target is located outside the detection range of the airborne three-optical combination sensor.
Step 5, the ground landing vehicle searches for the unmanned aerial vehicle with the vehicle-mounted three-light combined sensor until the unmanned aerial vehicle is found, specifically:
5a) the vision module of the vehicle-mounted three-light combined sensor performs target identification of the unmanned aerial vehicle, using visible light under good illumination conditions and infrared under poor illumination conditions;
5b) after the unmanned aerial vehicle target is identified, the relative distance between the unmanned aerial vehicle and the landing vehicle is measured by laser ranging.
Step 6, guiding the landing vehicle until the landing target is located within the detection range of the airborne three-light combined sensor. Whether the landing target is within that range is determined with the airborne visual intelligent recognition system, as in step 3.
The landing vehicle is guided as follows: the relative positioning relation between the unmanned aerial vehicle and the landing vehicle is judged from the visible-light (or infrared) image of the unmanned aerial vehicle formed by the vehicle-mounted three-light combined sensor; the landing vehicle is guided so that the unmanned aerial vehicle stays at the centre of the image and occupies an increasing proportion of the frame, until the landing target is located within the detection range of the airborne three-light combined sensor.
Step 7, pairing the unmanned aerial vehicle with the landing vehicle; if pairing fails, the task ends, and if pairing succeeds, guiding the unmanned aerial vehicle by vision until it is located 5-10 cm above the landing target. The pairing proceeds as follows:
7a) the landing target is identified with the airborne three-light combined sensor, and after the landing target is identified, laser ranging gives the relative distance from the unmanned aerial vehicle to the landing vehicle;
7b) the unmanned aerial vehicle target is identified with the vision module of the vehicle-mounted three-light combined sensor, and after the unmanned aerial vehicle is identified, laser ranging gives the relative distance from the landing vehicle to the unmanned aerial vehicle;
7c) whether the two distances obtained in 7a) and 7b) are basically consistent is judged; if so, pairing succeeds, and if not, pairing fails.
Step 8, attaching the unmanned aerial vehicle by strong magnetic attraction to complete the landing.
The usage scenario of the above embodiment is shown in fig. 5. When auxiliary positioning is available, the unmanned aerial vehicle is guided directly by positioning information. When it is not, the unmanned aerial vehicle and the landing vehicle each search visually for the other; once the search succeeds, visual guidance begins. After both can perceive the other with their respective visual sensors, each uses its own laser ranging module to measure the relative distance between them; if the two distances are basically consistent, pairing succeeds, the unmanned aerial vehicle lands under visual guidance, and finally strong magnetic attraction completes the landing.
By this method, if an auxiliary positioning mode exists, the unmanned aerial vehicle is guided directly toward the landing target until the target is within the range of the airborne visual sensor; the unmanned aerial vehicle then lands under visual guidance until it is 5-10 cm from the landing target, and strong magnetic attraction completes the landing. If no auxiliary positioning mode exists, the method first judges whether the landing target is within the detection range of the airborne visual sensor; if not, the three-light combined sensor of the ground landing vehicle searches for the unmanned aerial vehicle, assists in positioning it, and provides the relative distance and bearing between the unmanned aerial vehicle and the landing target, guiding the unmanned aerial vehicle toward the target until the target is within the detection range of the airborne visual sensor; the unmanned aerial vehicle then lands under visual guidance until it is 5-10 cm from the landing target, and strong magnetic attraction completes the landing.
In short, the vision and positioning double-guidance approach adapts to various scenes; the added ground vision detection module increases the probability of the unmanned aerial vehicle and the landing vehicle finding each other in complex scenes; and the added pairing step between the unmanned aerial vehicle and the landing vehicle improves the safety of the system.
Claims (7)
1. A vision and positioning double-guidance landing method for an unmanned aerial vehicle is characterized by comprising the following steps:
step 1, building an unmanned aerial vehicle vision and positioning double-guidance landing system; the landing system comprises an unmanned aerial vehicle and a landing vehicle, wherein the unmanned aerial vehicle carries an airborne information processor, an airborne three-light combined sensor, an airborne gimbal, a flight control computer, an airborne communication terminal and an airborne auxiliary positioning module, and the landing vehicle carries a landing target, a vehicle-mounted gimbal, a vehicle-mounted three-light combined sensor, a vehicle-mounted information processor, a ground communication terminal and a vehicle-mounted auxiliary positioning module; each three-light combined sensor comprises a visible light module, an infrared module and a laser module; the auxiliary positioning modules comprise Beidou or GPS positioning modules; the landing target is a two-dimensional code;
step 2, judging whether system auxiliary positioning is available, if the system auxiliary positioning is available, turning to step 3, and if the system auxiliary positioning is unavailable, turning to step 4;
step 3, the airborne information processor, using the auxiliary positioning means, guides the unmanned aerial vehicle through the flight control computer until the landing target is located within the detection range of the airborne three-light combined sensor, then turning to step 7;
step 4, the airborne information processor judges whether the landing target is located within the detection range of the airborne three-light combined sensor; if so, turning to step 7, and if not, turning to step 5;
step 5, the vehicle-mounted information processor searches for the unmanned aerial vehicle with the vehicle-mounted three-light combined sensor until the unmanned aerial vehicle is found;
step 6, the vehicle-mounted information processor guides the landing vehicle until the landing target is located within the detection range of the airborne three-light combined sensor;
step 7, pairing the unmanned aerial vehicle with the landing vehicle through the airborne communication terminal and the ground communication terminal; if pairing fails, the task ends, and if pairing succeeds, guiding the unmanned aerial vehicle by vision until it is 5-10 cm above the landing target;
and step 8, attaching the unmanned aerial vehicle by strong magnetic attraction to complete the landing.
2. The method of claim 1, wherein the specific manner of determining whether system-assisted positioning is available in step 2 is as follows:
The airborne information processor judges whether the airborne auxiliary positioning is available, and the vehicle-mounted information processor judges whether the vehicle-mounted auxiliary positioning is available; if both are available, system auxiliary positioning is available; otherwise it is unavailable.
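A minimal sketch of this two-sided check (the function and argument names are hypothetical):

```python
def system_aux_positioning_available(airborne_fix_ok, vehicle_fix_ok):
    """System-level auxiliary positioning requires a usable BeiDou/GPS
    fix on both the UAV and the landing vehicle."""
    return airborne_fix_ok and vehicle_fix_ok
```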
3. The unmanned aerial vehicle vision and positioning double-guidance landing method according to claim 1, wherein in steps 3, 4 and 6, the manner of judging whether the landing target lies within the detection range of the airborne three-light-combined sensor is as follows:
The unmanned aerial vehicle collects images through the airborne three-light-combined sensor and identifies the landing target in the images with the visual recognition system; if the landing target can be reliably identified, it lies within the detection range of the airborne three-light-combined sensor; otherwise it lies outside that range.
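The range test therefore reduces to whether the recognizer returns corner points for the two-dimensional code. A sketch of just that decision logic, with the detector injected (in practice it might be something like OpenCV's `QRCodeDetector`; that choice is our assumption, not stated in the claim):

```python
def target_in_detection_range(frame, detect_code):
    """detect_code(frame) returns the landing target's corner points,
    or None when no target can be identified in the image; the target
    counts as in range exactly when identification succeeds."""
    return detect_code(frame) is not None
```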
4. The unmanned aerial vehicle vision and positioning double-guidance landing method according to claim 1, wherein the specific manner of step 3 is as follows:
The unmanned aerial vehicle is guided according to the relative positioning relation between the unmanned aerial vehicle and the landing target until the landing target lies within the detection range of the airborne three-light-combined sensor.
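One way to turn the BeiDou/GPS relative positioning relation into a guidance vector is a flat-earth approximation, which is adequate at landing-approach ranges. The specific formula and constant below are our assumption for illustration; the claim does not prescribe them.

```python
import math

def guidance_vector(uav_ll, target_ll):
    """East/north displacement in metres from the UAV to the landing
    target, given (lat, lon) pairs in degrees from BeiDou/GPS.
    Flat-earth approximation: ~111320 m per degree of latitude."""
    lat0, lon0 = uav_ll
    lat1, lon1 = target_ll
    m_per_deg = 111_320.0
    north = (lat1 - lat0) * m_per_deg
    east = (lon1 - lon0) * m_per_deg * math.cos(math.radians(lat0))
    return east, north
```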
5. The unmanned aerial vehicle vision and positioning double-guidance landing method according to claim 1, wherein the specific manner of step 5 is as follows:
5a) acquiring images with the vehicle-mounted three-light-combined sensor and identifying the unmanned aerial vehicle target, using the visible-light module under good illumination conditions and the infrared module under poor illumination conditions;
5b) after the unmanned aerial vehicle target is identified, measuring the relative distance between the unmanned aerial vehicle and the landing vehicle by laser ranging.
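Steps 5a) and 5b) can be sketched as one search iteration. The 10-lux threshold and both interfaces are assumptions for illustration; the claim only distinguishes "good" from "poor" illumination.

```python
def select_imaging_module(illuminance_lux, threshold_lux=10.0):
    """Pick the visible-light module in good light, infrared otherwise
    (the threshold value is an assumed placeholder)."""
    return 'visible' if illuminance_lux >= threshold_lux else 'infrared'

def search_step(illuminance_lux, detect, laser_range):
    """One search iteration: image with the chosen module, and once the
    UAV target is detected, laser-range it. Returns (found, distance_m)."""
    module = select_imaging_module(illuminance_lux)
    if not detect(module):
        return False, None
    return True, laser_range()
```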
6. The unmanned aerial vehicle vision and positioning double-guidance landing method according to claim 1, wherein the specific manner of step 6 is as follows:
According to the position of the unmanned aerial vehicle target in the image collected by the vehicle-mounted three-light-combined sensor, the relative positioning relation between the unmanned aerial vehicle and the landing vehicle is determined and the landing vehicle is guided, so that the unmanned aerial vehicle stays at the center of the collected image while the proportion of the image it occupies increases, until the landing target lies within the detection range of the airborne three-light-combined sensor.
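A proportional-control sketch of this guidance: re-center the UAV in the image and advance until its image proportion (duty ratio) reaches a target value. The gain and ratio threshold are assumed placeholders, not values from the patent.

```python
def vehicle_guidance_command(dx, dy, area_ratio,
                             target_ratio=0.05, gain=0.8):
    """(dx, dy): normalized offset of the UAV from the image center,
    each in [-1, 1]; area_ratio: fraction of the image the UAV occupies.
    Returns (vx, vy, advance): vx and vy re-center the UAV in the image,
    advance closes the range until area_ratio reaches target_ratio."""
    vx = -gain * dx
    vy = -gain * dy
    advance = gain * max(0.0, target_ratio - area_ratio)
    return vx, vy, advance
```

A centered UAV that already fills enough of the frame yields an all-zero command, i.e. the vehicle holds position.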
7. The unmanned aerial vehicle vision and positioning double-guidance landing method according to claim 1, wherein the specific manner of pairing the unmanned aerial vehicle and the landing vehicle through the airborne communication terminal and the ground communication terminal in step 7 is as follows:
7a) the airborne information processor identifies the landing target with the airborne three-light-combined sensor, then performs laser ranging to obtain the relative distance between the unmanned aerial vehicle and the landing vehicle;
7b) the vehicle-mounted information processor identifies the unmanned aerial vehicle target with the vehicle-mounted three-light-combined sensor, then performs laser ranging to obtain the relative distance between the landing vehicle and the unmanned aerial vehicle;
7c) the two distances obtained in 7a) and 7b) are compared; if they are consistent, the unmanned aerial vehicle and the landing vehicle are successfully paired; otherwise the pairing fails.
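The consistency test in 7c) can be sketched as a tolerance check on the two independent laser ranges. The tolerance value is an assumption; the claim only requires the distances to be "consistent".

```python
def ranges_consistent(d_uav_to_vehicle, d_vehicle_to_uav, tol_m=0.5):
    """Pairing succeeds only if the range measured from the UAV and the
    range measured from the landing vehicle agree within tol_m metres
    (tol_m is an assumed placeholder)."""
    return abs(d_uav_to_vehicle - d_vehicle_to_uav) <= tol_m
```

Cross-checking two independent measurements guards against pairing with the wrong platform when several vehicles or UAVs are in view.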
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111089256.0A CN113741534B (en) | 2021-09-16 | 2021-09-16 | Unmanned aerial vehicle vision and positioning double-guide landing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113741534A true CN113741534A (en) | 2021-12-03 |
CN113741534B CN113741534B (en) | 2024-08-20 |
Family
ID=78739465
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111089256.0A Active CN113741534B (en) | 2021-09-16 | 2021-09-16 | Unmanned aerial vehicle vision and positioning double-guide landing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113741534B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105059533A (en) * | 2015-08-14 | 2015-11-18 | 深圳市多翼创新科技有限公司 | Aircraft and landing method thereof |
CN105501457A (en) * | 2015-12-16 | 2016-04-20 | 南京航空航天大学 | Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle) |
CN106444792A (en) * | 2016-09-18 | 2017-02-22 | 中国空气动力研究与发展中心高速空气动力研究所 | Infrared visual recognition-based unmanned aerial vehicle landing positioning system and method |
US20170225800A1 (en) * | 2016-02-05 | 2017-08-10 | Jordan Holt | Visual landing aids for unmanned aerial systems |
CN109911231A (en) * | 2019-03-20 | 2019-06-21 | 武汉理工大学 | Unmanned plane autonomous landing on the ship method and system based on GPS and image recognition hybrid navigation |
CN110244749A (en) * | 2019-04-22 | 2019-09-17 | 西北农林科技大学 | A kind of agricultural unmanned plane mobile platform independently precisely lands control system and method |
CN112462791A (en) * | 2020-12-02 | 2021-03-09 | 成都时代星光科技有限公司 | Full-automatic high-precision flight landing system and method for airport of vehicle-mounted unmanned aerial vehicle |
CN112455705A (en) * | 2020-12-04 | 2021-03-09 | 南京晓飞智能科技有限公司 | Unmanned aerial vehicle autonomous accurate landing system and method |
CN112946766A (en) * | 2021-01-26 | 2021-06-11 | 汕头大学 | Group intelligence-based land mine detection method and system |
Non-Patent Citations (1)
Title |
---|
WEIWEI KONG et al.: "A Ground-Based Optical System for Autonomous Landing of a Fixed Wing UAV", 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, pages 4797-4804 *
Also Published As
Publication number | Publication date |
---|---|
CN113741534B (en) | 2024-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11983894B2 (en) | Determining road location of a target vehicle based on tracked trajectory | |
CN109767637B (en) | Method and device for identifying and processing countdown signal lamp | |
CN105759295B (en) | Apparatus and method for recognizing driving environment for unmanned vehicle | |
US9587948B2 (en) | Method for determining the absolute position of a mobile unit, and mobile unit | |
EP3650814B1 (en) | Vision augmented navigation | |
WO2021069967A1 (en) | Systems and methods for vehicle navigation | |
CN109597077B (en) | Detection system based on unmanned aerial vehicle | |
CN109643494A (en) | Automatic driving vehicle, the parking method of automatic driving vehicle and program | |
CN105501457A (en) | Infrared vision based automatic landing guidance method and system applied to fixed-wing UAV (unmanned aerial vehicle) | |
US20200005058A1 (en) | Method for localizing a more automated, e. g., highly automated vehicle (hav) in a digital localization map | |
CN113316706B (en) | Landmark position estimating apparatus and method, and computer-readable recording medium storing computer program programmed to perform the method | |
US20220355818A1 (en) | Method for a scene interpretation of an environment of a vehicle | |
JP7190699B2 (en) | Flight system and landing control method | |
CN112346103A (en) | V2X-based intelligent networking automobile dynamic co-location method and device | |
CN112083718A (en) | Control method and device of visual navigation robot and computer readable storage medium | |
US20200384983A1 (en) | Management device, vehicle management method, program, and vehicle management system | |
CN114840004A (en) | Unmanned aerial vehicle autonomous landing method based on two-dimensional code recognition | |
CN109073393A (en) | Method for determining the attitude of an at least partially autonomous vehicle by means of landmarks which are specially selected and transmitted by a back-end server | |
US20220373357A1 (en) | Method and device for assisting in landing an aircraft under poor visibility conditions | |
US11812342B2 (en) | Cellular-based navigation method | |
CN113741534A (en) | Unmanned aerial vehicle vision and positioning double-guidance landing method | |
US20200408886A1 (en) | Passive altimeter system for a platform and method thereof | |
CN110717007A (en) | Map data positioning system and method applying roadside feature identification | |
Suganuma et al. | Current status and issues of traffic light recognition technology in Autonomous Driving System | |
CN108877212A (en) | Method and apparatus for establishing the environmental model of vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||