CN111413708A - Unmanned aerial vehicle autonomous landing site selection method based on laser radar - Google Patents

Unmanned aerial vehicle autonomous landing site selection method based on laser radar

Info

Publication number
CN111413708A
CN111413708A (application CN202010278168.4A)
Authority
CN
China
Prior art keywords
laser radar
height
ground
site selection
distance error
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010278168.4A
Other languages
Chinese (zh)
Inventor
罗世彬
胡茂青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Airtops Intelligent Technology Co ltd
Original Assignee
Hunan Airtops Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Airtops Intelligent Technology Co ltd filed Critical Hunan Airtops Intelligent Technology Co ltd
Priority to CN202010278168.4A priority Critical patent/CN111413708A/en
Publication of CN111413708A publication Critical patent/CN111413708A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/933 Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The invention provides an unmanned aerial vehicle autonomous landing site selection method based on a laser radar, which comprises the following steps. Step 1: a landing candidate area is selected on a reference image in advance, and a real-time image shot by an airborne visible-light camera is matched with the reference image. Step 2: at a height of 200 meters, the airborne laser radar images the ground; the site selection search range is calculated from the height and the field of view, the distance error is calculated from the height and the laser radar distance error parameter, the ground slope is identified, and site selection is performed to determine an accurate safe landing area. Step 3: at a height of 100 meters, the site selection search range is again calculated from the height and the field of view, and the distance error is calculated from the height and the laser radar distance error parameter, so that obstacles higher than 0.1 m above the ground are effectively detected. The invention has the characteristics of high precision, low cost and strong adaptability.

Description

Unmanned aerial vehicle autonomous landing site selection method based on laser radar
Technical Field
The invention relates to the technical field of unmanned aerial vehicle autonomous recognition, and in particular to a laser-radar-based method for autonomous landing site selection by an unmanned aerial vehicle.
Background
The approach and landing phase of an aircraft is an important stage of a mission. It is characterized by low flight speed and altitude, strong influence from meteorological conditions, the geographic environment and other factors, and high demands on piloting skill. As a result, a relatively large number of flight accidents, especially serious ones, occur at this stage. Meanwhile, unmanned aerial vehicles play an increasingly important role in performing flight missions. Because there is no pilot on board during autonomous landing, the safety problem of the unmanned aerial vehicle is even more prominent. Whether the unmanned aerial vehicle can land safely and accurately directly determines whether it can be recovered safely and smoothly and whether the flight mission can be completed satisfactorily. In view of the safety problem in the landing stage, autonomous landing technology has become an important research topic for researchers in many countries.
At present, researchers have proposed visual guidance methods, microwave guidance methods and the like for the autonomous landing problem of unmanned aerial vehicles.
① Visual guidance method. Visual guidance is a relatively new guidance technique that applies computer vision to the autonomous landing of unmanned aerial vehicles. It uses imaging systems (such as cameras) in place of visual organs as the input sensing means, obtains the position parameters of the unmanned aerial vehicle through image processing, and then guides an accurate landing by recognizing ground obstacles in the images.
Vision sensors have the advantages of portability, low power consumption and small size. In addition, the working band of a visual navigation system is far from the frequency range of current electromagnetic countermeasures, and such systems offer low cost, strong autonomy, a large amount of information and passive operation.
② Microwave guidance method. In microwave landing guidance, the airborne receiver receives signals from ground azimuth and elevation stations to obtain the azimuth angle of the aircraft relative to the runway centerline and its elevation angle relative to the runway horizontal plane, and obtains the distance of the aircraft relative to the phase center of the ranging antenna through an interrogation-response procedure.
Microwave landing guidance technology offers high guidance precision and a large proportional coverage sector, and can provide a variety of approach paths and all-weather guidance.
During the autonomous landing of an unmanned aerial vehicle, the safety problem is particularly prominent because there is no pilot on board. In particular, the unmanned aerial vehicle may tip over during the landing stage because of ground surface relief and raised obstacles.
The core of vision-based guided landing technology is the vision sensor, whose imaging can vary greatly across application scenarios. In bad weather in particular, the airborne sensor can hardly obtain clear images, which affects the safety and accuracy of the autonomous landing.
Microwave-based guided landing technology requires a ground base station and suffers from high manufacturing cost, demanding requirements on ground and airborne equipment, and high replacement cost, which limits its development.
Disclosure of Invention
To address the above problems, the invention provides an unmanned aerial vehicle autonomous landing site selection method based on a laser radar.
The technical scheme adopted by the invention is as follows:
the invention provides an unmanned aerial vehicle autonomous landing site selection method based on a laser radar, which comprises the following steps:
step 1: selecting a landing candidate area on a reference image in advance, and matching a real-time image shot by an airborne visible light camera with the reference image;
step 2, the airborne laser radar images the ground by a 30-degree × 30-degree visual field at the height of 200 meters, a location searching range is calculated according to the height and the visual field angle, a distance error is calculated according to the height and the distance error parameter of the laser radar, a ground slope identification result needs to be identified, the location is selected, and an accurate safe landing area is determined;
and 3, imaging the ground by the airborne laser radar at the height of 100 meters and a 9-degree × 9-degree view field, calculating a site selection search range according to the height and the view field angle, and calculating a distance error according to the height and the distance error parameter of the laser radar so as to ensure that the obstacle of which the ground exceeds 0.1m is effectively detected.
Preferably, in step 1, a 150 × 150 m² landing candidate area is manually selected in advance on a 1000 × 1000 m² reference image; at a height of 500 m, a real-time image shot by the airborne visible-light camera is matched with the reference image, and PNP is used to solve for and locate the position of the aircraft.
Preferably, in step 1, multiple images and a multi-point matching scheme are used during matching, so as to improve the positioning accuracy.
Preferably, in step 2, the airborne laser radar images the ground with a 30° × 30° field of view at a height of 200 meters;
a site selection search range of about 107 × 107 m² is calculated from the height and the field of view; a ground sample distance (GSD) of 0.2 m is calculated from the laser radar resolution of 512 × 512, and a distance error E of 0.1 m is calculated from the height and the laser radar distance error parameter;
assuming that the size of a detection target is S, the GSD should be kept below S/3 and E (at one times the error parameter) below S/6 for obstacles to be effectively detected, so obstacles higher than 0.6 m above the ground can be identified;
in addition, areas whose ground slope falls outside the range of (-15°, 15°) need to be identified, and the obstacle detection and slope identification results are then combined for site selection.
Preferably, in step 3, the airborne laser radar images the ground with a 9° × 9° field of view at a height of 100 meters;
a site selection search range of 16 × 16 m² is calculated from the height and the field of view, and a ground sample distance (GSD) of 0.3 m is calculated from the laser radar resolution of 512 × 512;
the distance error E at this stage is calculated to be 0.05 m from the height and the laser radar distance error parameter;
assuming that the size of the detection target is S, the obstacle can be effectively detected when the GSD is kept below S/3 and E (at one times the error parameter) below S/6.
Preferably, in step 2, if the one-times laser radar distance error parameter is used, effective detection of obstacles higher than 0.1 m above the ground cannot be ensured, whereas if the three-times laser radar distance error parameter is used, effective detection of obstacles higher than 0.1 m above the ground can be ensured.
The invention has the beneficial effects that:
(1) the invention provides an unmanned aerial vehicle autonomous landing site selection method based on a laser radar, which has the characteristics of high precision, low cost, strong adaptability and the like.
(2) In the embodiment of the invention, the laser radar has the characteristics of high precision and all-weather guiding capability, can be used for guiding the aircraft to land under the complex meteorological conditions of poor visibility and low cloud base, and has low requirement on airborne equipment.
(3) According to the embodiment of the invention, the real-time image shot by the airborne visible-light camera is matched with the reference image, and PNP is used to solve for and locate the position of the aircraft, thereby guiding the aircraft to fly to the designated area. This vision-matching-based guidance method has the characteristics of high precision and low cost.
Drawings
Fig. 1 is a flowchart of an autonomous landing and site selection method for an unmanned aerial vehicle based on a laser radar according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of site selection according to an embodiment of the present invention.
Detailed Description
The conception, specific structure and technical effects of the present invention will be clearly and completely described below in conjunction with the embodiments, so that the objects, features and effects of the present invention can be fully understood. The described embodiments are obviously only a part of the embodiments of the present invention, not all of them; based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without inventive effort fall within the protection scope of the present invention.
The embodiment of the invention provides an unmanned aerial vehicle autonomous landing site selection method based on a laser radar, which roughly comprises the following steps.
Landing site selection is mainly performed in three stages.
In the first stage, a 150 × 150 m² landing candidate area is manually selected in advance on a reference map covering a 1000 × 1000 m² range. At a height of 500 meters, a real-time image shot by the airborne visible-light camera is matched with the reference image, and PNP is used to solve for and locate the position of the aircraft, thereby guiding the aircraft to fly to the designated area. Multiple images and a multi-point matching scheme can be used during matching to improve positioning accuracy.
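As one possible way to realize the PNP step just described, the following Python sketch uses OpenCV's solvePnP to recover the camera (and hence aircraft) pose from matched reference/real-time image points. The landmark coordinates, camera intrinsics and solver flag below are placeholders assumed purely for illustration; they are not values taken from the patent.

```python
import cv2
import numpy as np

# World coordinates of matched landmarks on the reference map (placeholders).
object_points = np.array([[0.0, 0.0, 0.0],
                          [30.0, 0.0, 0.0],
                          [30.0, 30.0, 0.0],
                          [0.0, 30.0, 0.0]], dtype=np.float64)
# Their pixel positions in the real-time image (placeholders).
image_points = np.array([[210.0, 240.0],
                         [310.0, 238.0],
                         [312.0, 340.0],
                         [208.0, 342.0]], dtype=np.float64)
# Assumed pinhole intrinsics of the airborne visible-light camera.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(4)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist_coeffs,
                              flags=cv2.SOLVEPNP_ITERATIVE)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    # Camera (aircraft) position expressed in the world frame.
    camera_position_world = (-R.T @ tvec).ravel()
    print("estimated aircraft position:", camera_position_world)
```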
In the second stage, the airborne laser radar images the ground at a height of 200 meters with a 30° × 30° field of view; a site selection schematic diagram is shown in Figure 2. A site selection search range of about 107 × 107 m² is calculated from the height and the field of view. From the laser radar resolution of 512 × 512, the ground sample distance (GSD) can be calculated to be about 0.2 m, and from the height and the laser radar distance error parameter, the distance error E at this stage can be calculated to be 0.1 m. Assuming the detection target size is S, the GSD should be kept below S/3 and E below S/6 (with forward, downward-looking imaging, the range direction measures the ground relief, so the obstacle detection range accuracy is better than the GSD), so that obstacles can be effectively detected; at this stage, obstacles higher than 0.6 m above the ground can therefore be identified. In addition, areas whose ground slope falls outside the range of (-15°, 15°) are also identified. By combining the obstacle detection and slope identification results, a site selection area of at least 20 × 20 m² that satisfies the landing conditions is chosen. Then, in the third stage, a more precise safe landing area is searched again within this area.
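As a rough check of the numbers quoted for this stage, the sketch below derives the search range, GSD and smallest reliably detectable obstacle from the height, field of view, resolution and distance error, assuming flat ground and a nadir-pointing square field of view; these geometric assumptions belong to the sketch, not to the patent.

```python
import math

def site_selection_geometry(height_m, fov_deg, resolution_px, range_error_m):
    """Flat-ground footprint geometry for one imaging stage (illustrative only)."""
    # Ground footprint of a nadir-pointing, symmetric square field of view.
    swath = 2.0 * height_m * math.tan(math.radians(fov_deg / 2.0))
    # Ground sample distance for a square detector array.
    gsd = swath / resolution_px
    # Rule of thumb quoted in the text: GSD <= S/3 and E <= S/6,
    # so the smallest reliably detectable obstacle size S is roughly:
    s_min = max(3.0 * gsd, 6.0 * range_error_m)
    return swath, gsd, s_min

# Second stage: 200 m height, 30 x 30 degree field of view, 512 x 512 pixels, E = 0.1 m.
swath, gsd, s_min = site_selection_geometry(200.0, 30.0, 512, 0.1)
print(f"search range ~ {swath:.0f} x {swath:.0f} m, GSD ~ {gsd:.2f} m, "
      f"smallest detectable obstacle ~ {s_min:.1f} m")
# -> search range ~ 107 x 107 m, GSD ~ 0.21 m, smallest detectable obstacle ~ 0.6 m
```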
In the third stage, the airborne laser radar images the ground at a height of 100 meters with a 9° × 9° field of view. The site selection search range is calculated from the height and the field of view to be about 16 × 16 m². From the laser radar resolution of 512 × 512, the ground sample distance (GSD) can be calculated to be about 0.3 m, and from the height and the laser radar distance error parameter, the distance error E at this stage can be calculated to be 0.05 m. As before, the GSD should be kept below S/3 and E (at one times the error parameter) below S/6 for obstacles to be effectively detected. It follows that if the one-times laser radar distance error parameter is used, obstacles higher than 0.1 m above the ground cannot be reliably detected, whereas if the three-times laser radar distance error parameter is used, obstacles higher than 0.1 m above the ground can be effectively detected.
In this scheme, the airborne flash laser radar images the ground twice, at the 200-meter and 100-meter stages, and the imaging data are fed into a landing obstacle detection and site selection algorithm that produces the following outputs: a terrain slope map (TSM), a terrain roughness map (TRM) and a site selection suggestion map (hazard cost map, HCM). The TSM describes the ground slope in the form of a heat map, the TRM describes the ground roughness in the form of a heat map, and the HCM describes the site selection confidence in the form of a heat map.
Elevation Map (EM) estimation
EM estimation mainly consists of first transforming the lidar point cloud from the lidar coordinate system (lidar frame) to the ground coordinate system (ground frame), and then projecting the height of each point onto the EM.
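A minimal numpy sketch of this elevation-map construction is given below. The 4 × 4 lidar-to-ground transform, the grid size and cell resolution, and the choice of keeping the highest return per cell are assumptions made for illustration; the patent does not fix these details.

```python
import numpy as np

def build_elevation_map(points_lidar, T_ground_from_lidar,
                        cell_size=0.2, grid_shape=(512, 512)):
    """Project a lidar point cloud onto a 2-D elevation map (EM).

    points_lidar: (N, 3) points in the lidar frame.
    T_ground_from_lidar: 4x4 homogeneous transform from the lidar frame to the ground frame.
    """
    # Step 1: transfer the point cloud from the lidar frame to the ground frame.
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_ground = (T_ground_from_lidar @ pts_h.T).T[:, :3]

    # Step 2: project the height of each point onto the EM grid.
    em = np.full(grid_shape, np.nan)
    ix = np.floor(pts_ground[:, 0] / cell_size).astype(int) + grid_shape[0] // 2
    iy = np.floor(pts_ground[:, 1] / cell_size).astype(int) + grid_shape[1] // 2
    inside = (ix >= 0) & (ix < grid_shape[0]) & (iy >= 0) & (iy < grid_shape[1])
    for x, y, z in zip(ix[inside], iy[inside], pts_ground[inside, 2]):
        # Keep the highest return per cell (illustrative aggregation choice).
        if np.isnan(em[x, y]) or z > em[x, y]:
            em[x, y] = z
    return em
```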
TSM estimation
The slope estimation algorithm is mainly divided into three steps:
(1) A 5 × 5 m² window is slid over the search area.
(2) The plane P is estimated using least median of squares (Least Median Square, LMedSq) estimation.
(3) The normal vector of the plane P is computed, giving the plane slope; the algorithm then returns to step (1). By repeating these steps, the slope can be estimated for all 5 × 5 regions in the search area, yielding a smooth surface map (SSM) and the TSM. Step (2) is clearly the key to the algorithm; its basic principle is that a plane can be estimated from three points:
three non-collinear points (X) on a given planea,Xb,Xc) The plane normal and intercept (n, d) are estimated as follows:
n=(Xb-Xa)×(Xc-Xa)(1)
d=-n·Xa(2)
if more than 3 points are available, the overdetermined solution can be solved by using the least square method. And the sum of the distances from the plane to each point obtained by least square estimation is minimum.
The plane can be estimated by using median least squares or RANSAC least squares, the plane can be correctly estimated under the condition that the outlier rate is less than 50 percent, the RANSAC least squares can correctly estimate the plane under the condition that the outlier rate is 90 percent, but the time consumption and the iteration number of the RANSAC least squares are obviously higher than L MedSq.
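For illustration, the sketch below implements the three-point plane estimate of formulas (1)-(2) and wraps it in a simple least-median-of-squares loop over random three-point samples; the sample count and the way the slope angle is read from the unit normal are assumptions of this sketch rather than details given in the patent.

```python
import numpy as np

def plane_from_three_points(xa, xb, xc):
    """Formulas (1)-(2): normal n and intercept d of the plane through three points."""
    n = np.cross(xb - xa, xc - xa)
    d = -np.dot(n, xa)
    return n, d

def lmedsq_plane(points, n_samples=200, seed=0):
    """Least-median-of-squares plane fit over random three-point samples (illustrative)."""
    rng = np.random.default_rng(seed)
    best_plane, best_median = None, np.inf
    for _ in range(n_samples):
        xa, xb, xc = points[rng.choice(len(points), 3, replace=False)]
        n, d = plane_from_three_points(xa, xb, xc)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                              # nearly collinear sample, skip it
            continue
        distances = np.abs(points @ n + d) / norm    # point-to-plane distances
        median = np.median(distances ** 2)
        if median < best_median:                     # keep the sample with the smallest median
            best_median, best_plane = median, (n / norm, d / norm)
    return best_plane                                 # (unit normal, intercept)

def slope_deg(unit_normal):
    """Plane slope: angle between the plane normal and the vertical axis."""
    return float(np.degrees(np.arccos(abs(unit_normal[2]))))
```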
TRM estimation
The TRM is the difference between the EM and the SSM: TRM = |EM - SSM|.
HCM estimation
Once the TRM and TSM have been obtained, the HCM can be calculated according to the following formula:
(The formula is reproduced as an image in the original filing; it defines HCM(x, y) in terms of TRM(x, y), TSM(x, y) and the thresholds Rt and St.)
where (x, y) are the coordinates in the HCM, Rt represents the slope threshold (the maximum tolerable gradient), and St represents the ground protrusion threshold (the maximum tolerable protrusion).
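Because the formula itself appears only as an image in the filing, the sketch below assumes a simple thresholding rule that is consistent with the surrounding text: a cell is marked hazardous (HCM = 1) when either its slope or its roughness exceeds the corresponding threshold, and otherwise carries a normalized cost below 1. This is an illustrative reconstruction, not the patented formula.

```python
import numpy as np

def hazard_cost_map(tsm, trm, slope_threshold_deg, roughness_threshold_m):
    """Illustrative HCM: slope_threshold_deg plays the role of Rt and
    roughness_threshold_m that of St in the text (assumed rule)."""
    hazardous = (tsm > slope_threshold_deg) | (trm > roughness_threshold_m)
    # Non-hazardous cells get a normalized cost strictly below 1.
    cost = np.maximum(tsm / slope_threshold_deg, trm / roughness_threshold_m)
    return np.where(hazardous, 1.0, np.clip(cost, 0.0, 0.999))
```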
Safe landing site selection
According to the safe landing requirements, a 5 × 5 window is slid over the search area to compute and evaluate candidate sites, as sketched after the following two rules:
(1) Any window that contains a cell with HCM(x, y) = 1 is an unsafe landing area.
(2) The window position with the minimum mean HCM value is the recommended landing position.
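A minimal sketch of this sliding-window evaluation is given below; the treatment of ties and the exact definition of the window centre are assumptions of the sketch.

```python
import numpy as np

def select_landing_site(hcm, window=5):
    """Slide a window x window patch over the HCM, reject patches containing any
    hazardous cell (HCM == 1) and return the centre of the patch with the lowest
    mean HCM, together with that mean value."""
    rows, cols = hcm.shape
    best_mean, best_center = np.inf, None
    for r in range(rows - window + 1):
        for c in range(cols - window + 1):
            patch = hcm[r:r + window, c:c + window]
            if np.any(patch >= 1.0):          # rule (1): unsafe landing area
                continue
            m = float(patch.mean())           # rule (2): minimum mean HCM wins
            if m < best_mean:
                best_mean, best_center = m, (r + window // 2, c + window // 2)
    return best_center, best_mean
```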
It should be noted that, in the embodiment of the present invention:
Autonomous landing: a landing flight process in which the airborne automatic flight system fully controls the aircraft's landing flight.
Vision sensor: an instrument that acquires image information of the external environment using optical elements and an imaging device; image resolution is typically used to describe the performance of a vision sensor.
PNP: the PnP (Perspective-n-Point) problem is how to calculate the pose of the camera from the known coordinates of N spatial points in the world coordinate system and the projections of those points on the image.
RANSAC algorithm: an algorithm that iteratively estimates the parameters of a mathematical model from an observed data set that contains outliers.
LMedSq algorithm: also an algorithm for estimating the parameters of a mathematical model, but the outlier rate should be less than 50%.
The principles and embodiments of the present invention have been explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, owing to the limits of written description, not every possible variation can be enumerated; those skilled in the art may make various modifications, refinements or changes, or combine the above technical features in suitable ways, without departing from the spirit of the invention. Such modifications, variations, combinations or adaptations that employ the concept of the invention, including other uses and embodiments, fall within the protection scope defined by the claims.

Claims (6)

1. An unmanned aerial vehicle autonomous landing site selection method based on laser radar is characterized by comprising the following steps:
step 1: selecting a landing candidate area on a reference image in advance, and matching a real-time image shot by an airborne visible-light camera with the reference image;
step 2: imaging the ground with the airborne laser radar using a 30° × 30° field of view at a height of 200 meters, calculating a site selection search range from the height and the field of view, calculating a distance error from the height and the laser radar distance error parameter, identifying the ground slope, and performing site selection to determine an accurate safe landing area;
and step 3: imaging the ground with the airborne laser radar using a 9° × 9° field of view at a height of 100 meters, calculating a site selection search range from the height and the field of view, and calculating a distance error from the height and the laser radar distance error parameter, so as to ensure that obstacles higher than 0.1 m above the ground are effectively detected.
2. The unmanned aerial vehicle autonomous landing site selection method based on laser radar as claimed in claim 1,
wherein step 1 specifically comprises: manually selecting in advance a 150 × 150 m² landing candidate area on a 1000 × 1000 m² reference image; and, at a height of 500 m, matching a real-time image shot by the airborne visible-light camera with the reference image, and solving for and locating the position of the aircraft using PNP.
3. The unmanned aerial vehicle autonomous landing site selection method based on laser radar as claimed in claim 2, wherein, in step 1, multiple images and a multi-point matching scheme are used during matching, so as to improve positioning accuracy.
4. The unmanned aerial vehicle autonomous landing site selection method based on laser radar as claimed in claim 1,
wherein step 2 specifically comprises: imaging the ground with the airborne laser radar using a 30° × 30° field of view at a height of 200 meters;
calculating a site selection search range of about 107 × 107 m² from the height and the field of view, calculating a ground sample distance (GSD) of 0.2 m from the laser radar resolution of 512 × 512, and calculating a distance error E of 0.1 m from the height and the laser radar distance error parameter;
wherein, assuming that the size of a detection target is S, the GSD should be kept below S/3 and E (at one times the error parameter) below S/6 for obstacles to be effectively detected, so that obstacles higher than 0.6 m above the ground can be identified;
and further identifying areas whose ground slope falls outside the range of (-15°, 15°), and combining the obstacle detection and slope identification results for site selection.
5. The unmanned aerial vehicle autonomous landing site selection method based on laser radar as claimed in claim 1,
wherein step 3 specifically comprises: imaging the ground with the airborne laser radar using a 9° × 9° field of view at a height of 100 meters;
calculating a site selection search range of 16 × 16 m² from the height and the field of view, and calculating a ground sample distance (GSD) of 0.3 m from the laser radar resolution of 512 × 512;
calculating the distance error E at this stage to be 0.05 m from the height and the laser radar distance error parameter;
wherein, assuming that the size of the detection target is S, the obstacle can be effectively detected when the GSD is kept below S/3 and E (at one times the error parameter) below S/6.
6. The unmanned aerial vehicle autonomous landing site selection method based on laser radar as claimed in claim 5, wherein, in step 2, if the one-times laser radar distance error parameter is used, effective detection of obstacles higher than 0.1 m above the ground cannot be ensured, and if the three-times laser radar distance error parameter is used, effective detection of obstacles higher than 0.1 m above the ground can be ensured.
CN202010278168.4A 2020-04-10 2020-04-10 Unmanned aerial vehicle autonomous landing site selection method based on laser radar Pending CN111413708A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010278168.4A CN111413708A (en) 2020-04-10 2020-04-10 Unmanned aerial vehicle autonomous landing site selection method based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010278168.4A CN111413708A (en) 2020-04-10 2020-04-10 Unmanned aerial vehicle autonomous landing site selection method based on laser radar

Publications (1)

Publication Number Publication Date
CN111413708A true CN111413708A (en) 2020-07-14

Family

ID=71491776

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010278168.4A Pending CN111413708A (en) 2020-04-10 2020-04-10 Unmanned aerial vehicle autonomous landing site selection method based on laser radar

Country Status (1)

Country Link
CN (1) CN111413708A (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112185180A (en) * 2020-10-16 2021-01-05 西安应用光学研究所 Virtual three-dimensional landing landmark auxiliary landing method
CN112185180B (en) * 2020-10-16 2022-11-22 西安应用光学研究所 Virtual three-dimensional landing landmark auxiliary landing method
CN112904332A (en) * 2021-01-21 2021-06-04 长沙莫之比智能科技有限公司 Gradient detection algorithm of millimeter wave radar altimeter
CN113359782A (en) * 2021-05-28 2021-09-07 福建工程学院 Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
CN113359782B (en) * 2021-05-28 2022-07-29 福建工程学院 Unmanned aerial vehicle autonomous addressing landing method integrating LIDAR point cloud and image data
CN113917934A (en) * 2021-11-22 2022-01-11 江苏科技大学 Unmanned aerial vehicle accurate landing method based on laser radar
CN113917934B (en) * 2021-11-22 2024-05-28 江苏科技大学 Unmanned aerial vehicle accurate landing method based on laser radar
CN115167512A (en) * 2022-07-25 2022-10-11 亿航智能设备(广州)有限公司 Ground slope detection method and device and computer-readable storage medium
CN116482711A (en) * 2023-06-21 2023-07-25 之江实验室 Local static environment sensing method and device for autonomous selection of landing zone

Similar Documents

Publication Publication Date Title
CN111413708A (en) Unmanned aerial vehicle autonomous landing site selection method based on laser radar
CN111326023B (en) Unmanned aerial vehicle route early warning method, device, equipment and storage medium
JP7263630B2 (en) Performing 3D reconstruction with unmanned aerial vehicles
Kong et al. Autonomous landing of an UAV with a ground-based actuated infrared stereo vision system
CN104197928B (en) Multi-camera collaboration-based method for detecting, positioning and tracking unmanned aerial vehicle
CN109709801A (en) A kind of indoor unmanned plane positioning system and method based on laser radar
WO2017177533A1 (en) Method and system for controlling laser radar based micro unmanned aerial vehicle
CN110426046B (en) Unmanned aerial vehicle autonomous landing runway area obstacle judging and tracking method
CN110609570A (en) Autonomous obstacle avoidance inspection method based on unmanned aerial vehicle
AU2014253606A1 (en) Landing system for an aircraft
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN108089586A (en) A kind of robot autonomous guider, method and robot
CN111273679A (en) Visual-guided network-collision recovery longitudinal guidance method for small fixed-wing unmanned aerial vehicle
CN111649737A (en) Visual-inertial integrated navigation method for precise approach landing of airplane
CN102506867A (en) SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris comer matching and combined navigation system
Andert et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation
Kawamura et al. Simulated vision-based approach and landing system for advanced air mobility
EP3989034B1 (en) Automatic safe-landing-site selection for unmanned aerial systems
Andert et al. Improving monocular SLAM with altimeter hints for fixed-wing aircraft navigation and emergency landing
CN114266821A (en) Online positioning method and device, terminal equipment and storage medium
CN112146627B (en) Aircraft imaging system using projection patterns on featureless surfaces
CN112904895B (en) Image-based airplane guiding method and device
RU2724908C1 (en) Aircraft-type unmanned aerial vehicle landing method to runway using optical devices of different range
Shin et al. 3D LiDAR-based point cloud map registration: Using spatial location of visual features
CN113138382B (en) Fully-automatic approach landing monitoring method for civil and military airport

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination