CN112308900B - Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection - Google Patents


Info

Publication number
CN112308900B
CN112308900B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
LED
ring
quad
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011135322.9A
Other languages
Chinese (zh)
Other versions
CN112308900A (en)
Inventor
闫飞
初玉滨
庄严
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN202011135322.9A
Publication of CN112308900A
Application granted
Publication of CN112308900B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C27/00 Rotorcraft; Rotors peculiar thereto
    • B64C27/04 Helicopters
    • B64C27/08 Helicopters with two or more rotors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/15 Vehicle, aircraft or watercraft design
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

A relative pose estimation method for a quad-rotor unmanned aerial vehicle based on LED (light-emitting diode) ring detection, belonging to the technical field of quad-rotor unmanned aerial vehicles. LED lamps are mounted on the upper surfaces of the quad-rotor's blades, so that the rotation of the blades traces out LED rings; an unmanned aerial vehicle positioned above detects the LED rings through its downward-looking camera and thereby estimates the relative pose. The required hardware is simple: the LED lamps and camera are common components, no external equipment needs to be arranged, and the adaptability is strong.

Description

Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection
Technical Field
The invention belongs to the technical field of quad-rotor unmanned aerial vehicles, and particularly relates to a method for estimating the relative pose of a quad-rotor unmanned aerial vehicle.
Background
In recent years, unmanned aerial vehicles have been applied increasingly widely, and multi-vehicle cooperative tasks, such as drone formation performances and multi-drone environment exploration, are increasingly common. Cooperating unmanned aerial vehicles need to acquire each other's position information for motion planning and mutual collision avoidance in order to complete assigned tasks. Relative pose estimation is therefore of great significance for multi-drone cooperative tasks.
Methods for acquiring the relative pose of unmanned aerial vehicles fall into two categories: one relies on pose information provided by GPS (Global Positioning System) or GPS-RTK (GPS Real-Time Kinematic), and the other has the unmanned aerial vehicle actively sense its surroundings to estimate the relative pose. GPS can provide positioning information for a quad-rotor, but its positioning accuracy generally only reaches the meter level, its dynamic response is poor, and satellite signals are easily lost due to external factors such as occlusion by buildings, so it cannot meet the requirements of multi-vehicle cooperation. GPS-RTK achieves high positioning accuracy, at the centimeter level, but its cost is high and a base station must be erected in advance, so it cannot be used in many scenes. Relative pose estimation methods in which the unmanned aerial vehicle actively senses its surroundings rely on onboard sensors, depend little on external infrastructure, and adapt well, so many researchers have studied them.
A semi-direct monocular visual odometry method is proposed in the literature (C. Forster, M. Pizzoli and D. Scaramuzza. SVO: Fast semi-direct monocular visual odometry [C]. IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, 2014.). The method is fast and suitable for real-time pose estimation of unmanned aerial vehicles, but the global coordinate system of its pose estimate is fixed when the algorithm is initialized; when multiple unmanned aerial vehicles fly cooperatively, the global coordinate systems of the aircraft are not unified, so the relative poses between the unmanned aerial vehicles cannot be computed directly.
The unmanned-aerial-vehicle relative pose estimation method based on cooperative target identification proposed in the literature (Ma Junjie, Huang Daqing, Weng Genhao, et al. Unmanned aerial vehicle relative pose estimation based on cooperative target identification [J]. Electronic Design Engineering, 2020, 028(010): 1-6.) designs a cooperative target composed of the capital letter H and a circular border, extracts the cooperative target from the camera image using an improved YOLO network, and solves the position relation between the unmanned aerial vehicle and the cooperative target by combining the perspective projection principle. The method converts the positioning problem of the unmanned aerial vehicle into a detection and identification task for the cooperative target, so pose estimation of the unmanned aerial vehicle relative to the cooperative target can be completed. However, the method needs the cooperative target to be arranged in advance, and the moving range of the unmanned aerial vehicle is limited by the cooperative target, so the method cannot be extended to relative pose estimation among multiple unmanned aerial vehicles without fixed cooperative targets.
Disclosure of Invention
To overcome the deficiencies of the prior art, the invention provides a method for estimating the relative pose of a quad-rotor unmanned aerial vehicle based on LED (light-emitting diode) ring detection. LED lamps are mounted on the upper surfaces of the quad-rotor's blades, so that the rotation of the blades traces out LED rings; an unmanned aerial vehicle positioned above detects the LED rings through its downward-looking camera and thereby estimates the relative pose. The required hardware is simple: the LED lamps and camera are common components, no external equipment needs to be arranged, and the adaptability is strong.
The technical scheme of the invention is as follows:
a method for estimating relative pose of a quad-rotor unmanned aerial vehicle based on LED ring detection comprises the following steps:
(1) Design of four-rotor unmanned aerial vehicle blade LED ring
The four motors of a quad-rotor unmanned aerial vehicle, connected together, form a rectangle, and during flight the motors drive the blades to rotate at high speed to power the vehicle's motion. Exploiting this characteristic, a small LED lamp and its power supply circuit are mounted on the upper surface of each of the four blades at a distance r from the rotation center. The LED lamps are lit before the quad-rotor takes off; when the blades spin at high speed, four LED rings, each centered on one of the four motors with radius r, can be observed from above. To distinguish the direction of the nose, the color of an LED ring can be adjusted; for example, the LED ring at the front right of the quad-rotor's nose is set to a specific color to distinguish it from the other three rings, thereby marking the nose direction.
(2) Detection of LED rings
First, the onboard camera needs to be calibrated; this camera will acquire images of the LED rings of other quad-rotor unmanned aerial vehicles in its field of view. The camera's intrinsic parameters and distortion parameters are obtained through calibration, and the camera and the quad-rotor are jointly calibrated to obtain the transformation matrix between the camera coordinate system and the quad-rotor body coordinate system.
The downward-looking camera of the quad-rotor positioned above photographs the quad-rotor below and the LED rings on the upper surfaces of its blades, and the rings are extracted from the camera images by image processing. First, noise interference in the image is removed: the picture obtained by the camera is denoised with Gaussian filtering, and edges are detected with the Canny operator. Second, the LED ring edges in the picture are preliminarily extracted with Hough transform detection; because other circular objects may exist in a real scene, the extracted circles need to be screened. The four rings of an unmanned aerial vehicle have the same size and their centers lie close together, so the preliminarily extracted circles are screened by radius, and for every circle the number of neighboring circles is counted, finally screening out the four LED rings of the unmanned aerial vehicle. The colors of the four circles are then checked in the original image to determine the orientation of the nose.
(3) Relative pose estimation
A body coordinate system is established for the unmanned aerial vehicle, and the three-dimensional coordinates of the blades' rotation centers in the body coordinate system are obtained by physical measurement. The center of an LED circle extracted from the camera image is the projection of the corresponding blade center into the two-dimensional image. Substituting the three-dimensional coordinates of the blade centers, the corresponding two-dimensional image coordinates, and the camera's intrinsic and distortion parameters into a PnP (perspective-n-point) model and solving gives the pose of the upper quad-rotor's downward-looking camera in the body coordinate system of the lower unmanned aerial vehicle whose rings were detected; combining this with the coordinate-system transformation matrix obtained from the joint calibration of camera and body gives the pose of the lower quad-rotor in the body coordinate system of the upper quad-rotor. Suppose quad-rotor A is located above quad-rotor B, and A estimates the relative pose of B through its downward-looking camera C. Let the coordinates of some point P in space be P_A in A's body coordinate system, P_B in B's body coordinate system, and P_C in C's coordinate system, and let the transformation matrix from the C coordinate system to the A coordinate system be R_AC. Jointly calibrating the camera and the quad-rotor gives the conversion relation:

P_A = R_AC · P_C

Solving the PnP problem gives another transformation, where R_BC denotes the transformation matrix from coordinate system C to coordinate system B:

P_B = R_BC · P_C

From these two equations, the transformation between P_A and P_B, i.e., the relative pose between the A and B body coordinate systems, follows as shown below, where R′_BC denotes the inverse matrix of R_BC:

P_A = R_AC · R′_BC · P_B
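The composition of the two calibrated transforms can be checked numerically. The sketch below uses 4x4 homogeneous transforms (an assumption; the patent writes the R matrices generically) with made-up rotation and translation values, and confirms that composing the calibration transform with the inverse of the PnP transform maps B-frame coordinates to A-frame coordinates.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Camera frame C expressed in body frame A (from joint calibration) and in
# body frame B (from solving PnP). The numeric values are made up.
T_AC = make_transform(np.eye(3), [0.0, 0.0, -0.1])
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T_BC = make_transform(Rz, [0.4, 0.4, -4.4])

# P_A = T_AC P_C and P_B = T_BC P_C imply P_A = T_AC inv(T_BC) P_B.
T_AB = T_AC @ np.linalg.inv(T_BC)

P_B = np.array([1.0, 2.0, 3.0, 1.0])   # homogeneous point in B's body frame
P_C = np.linalg.inv(T_BC) @ P_B        # the same point in the camera frame
P_A = T_AC @ P_C                       # and in A's body frame
```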
the invention has the beneficial effects that: the paddle LED circular ring designed by the invention has obvious characteristics, is easy to extract, and enhances the robustness and speed of relative pose estimation. Simultaneously, hardware costs such as required sensors are lower, and weight is little, and unmanned aerial vehicle has better duration.
Drawings
FIG. 1 is a flow chart of the present invention.
Fig. 2 is a real shot picture of the LED ring.
Fig. 3 shows the result of image edge extraction.
Fig. 4 shows the LED ring extraction and screening results.
Fig. 5 shows the comparison of the ring extraction result with the original image.
Detailed Description
The following detailed description of the invention refers to the accompanying drawings.
The quad-rotor unmanned aerial vehicle selected for the invention comprises a Pixhawk 4 flight controller, a camera, an onboard computer (ARM or x86), blades fitted with LEDs, and so on. The camera is responsible for acquiring the LED ring images, the onboard computer is responsible for image processing and coordinate-system conversion calculations, and the LED lamps on the upper surfaces of the blades display a ring pattern when the blades rotate.
(1) Design of four-rotor unmanned aerial vehicle blade LED (light-emitting diode) ring
To improve detection precision, the radius of the LED ring on the blades should be as large as possible; the distance from the LED to the blade's rotation center is the radius of the LED ring, so the LED lamp is mounted on the upper blade surface far from the blade center. The radius of all four blade LED rings is set to r = 7.7 cm (the value of r is obtained by measurement), and the color of the LED ring at the front right of the body is set to differ from the other three ring colors in order to mark the nose direction. The LED lamp switch is turned on before the quad-rotor takes off; during flight, the circles formed by the rotating LED lamps are seen in the top-down view, and a live-action picture is shown in figure 2.
(2) LED ring detection
The quad-rotor positioned above can observe the LED rings of the lower vehicle's blades through its camera; the rings in the picture contain the two-dimensional position information of the lower vehicle's four motors and the orientation information of its body. Because of noise interference, the picture is first denoised with Gaussian filtering. The denoised image is then processed with Canny edge detection and related steps. The edge detection result is shown in fig. 3. Rings are extracted from the edge detection result with the Hough transform; since circular objects may exist in the scene and be extracted by mistake when the environment is complex, the rings need to be screened. First, the four LED rings on the blade upper surfaces have the same radius, so the rings are screened a first time using radius information: if a ring cannot find three other rings of the same radius (or with radius difference within a certain threshold), it is deleted. Second, the LED rings lie close together; the threshold is set to n = 6 times the ring radius (the value of n is determined by the hardware dimensions of the unmanned aerial vehicle), and if the distance between the center of one ring and the center of another ring is smaller than this threshold, the two rings are probably two of the four LED rings, and this is recorded. If a ring can find three other rings meeting this condition, it is an LED ring. The center of each circle is the two-dimensional projection of a blade's rotation center in the picture. The detection result for the LED rings is shown in fig. 4, and the comparison of the detection result with the original image is shown in fig. 5.
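The two screening rules described above (equal radii within a tolerance, and centers within n = 6 radii of each other, with three qualifying partners) can be sketched as follows; the function name and input format are assumptions for illustration.

```python
import math

def screen_led_rings(circles, ratio_lo=0.9, ratio_hi=1.1, n=6.0):
    """Screen Hough circles using the two rules from the text.

    circles: list of (cx, cy, r) tuples in pixels.
    A ring is kept only if it has at least three partners of nearly equal
    radius (ratio within [0.9, 1.1]) and exactly three partners whose
    centres lie within n = 6 times its own radius.
    """
    kept = []
    for i, (xi, yi, ri) in enumerate(circles):
        similar = close = 0
        for j, (xj, yj, rj) in enumerate(circles):
            if i == j:
                continue
            if ratio_lo <= ri / rj <= ratio_hi:
                similar += 1
            if math.hypot(xi - xj, yi - yj) < n * ri:
                close += 1
        if similar >= 3 and close == 3:
            kept.append((xi, yi, ri))
    return kept
```

With a single quad-rotor in view, the four true rings each find exactly three partners on both criteria, while spurious circles elsewhere in the scene fail at least one rule.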
(3) Relative pose estimation
By jointly calibrating the quad-rotor and the downward-looking camera, the transformation between the vehicle's body coordinate system and the camera coordinate system is obtained. The three-dimensional coordinates of the four blades' rotation centers in the body coordinate system are obtained by physical measurement. The downward-looking camera is calibrated to obtain its intrinsic parameters, distortion parameters, and so on.
The relative pose between the downward-looking camera and the lower vehicle's body coordinate system can therefore be solved by substituting the downward-looking camera's intrinsic and distortion parameters, the three-dimensional coordinates of the four motors, and the motors' two-dimensional projection coordinates in the image (the centers of the LED rings) into the PnP problem model. The transformation between the downward-looking camera and the upper quad-rotor's body coordinate system was obtained when the two were jointly calibrated, so the pose relation between the upper and lower quad-rotor bodies can be deduced, estimating the relative pose of the two vehicles. Taking fig. 5 as an example, the computed relative pose is (x = 0.38 m, y = 0.39 m, z = -4.43 m, roll = 0.02°, pitch = 0.32°, yaw = 5.3°), and the computation time is 18 ms, which satisfies the real-time requirement.

Claims (3)

1. A relative pose estimation method for a quad-rotor unmanned aerial vehicle based on LED ring detection, characterized by comprising the following steps:
(1) Design of four-rotor unmanned aerial vehicle blade LED (light-emitting diode) ring
The four motors of the quad-rotor unmanned aerial vehicle, connected together, form a rectangle, and during flight the motors drive the blades to rotate at high speed to power the vehicle's motion; exploiting this characteristic, a small LED lamp and its power supply circuit are mounted on the upper surface of each of the four blades at a distance r from the rotation center; the LED lamps are lit before the quad-rotor takes off, and when the blades spin at high speed, four LED rings, each centered on one of the four motors with radius r, can be observed from above; the color of an LED ring is adjusted to distinguish the direction of the nose;
(2) Detection of LED rings
First, the onboard camera is calibrated; the camera acquires LED ring pictures of other quad-rotor unmanned aerial vehicles in its field of view; the camera's intrinsic and distortion parameters are obtained through calibration, and the camera and the quad-rotor are jointly calibrated to obtain the transformation matrix between the camera coordinate system and the quad-rotor body coordinate system;
The downward-looking camera of the upper quad-rotor photographs the lower quad-rotor and the LED rings on the upper surfaces of its blades, and the rings are extracted from the camera images by image processing; first, noise interference in the image is removed: the picture obtained by the camera is denoised with Gaussian filtering, and edges are detected with the Canny operator; second, the LED ring edges in the picture are preliminarily extracted with Hough transform detection, the rings remaining after edge extraction are screened using the characteristics that the four rings of a quad-rotor have the same size and closely spaced centers, other circular objects in the real scene are removed, the four LED rings of the unmanned aerial vehicle are screened out, and the colors of the four rings are checked in the original image to determine the orientation of the nose;
(3) Relative pose estimation
A body coordinate system is established for the quad-rotor, and the three-dimensional coordinates of the blades' rotation centers in the body coordinate system are obtained by physical measurement; the center of an LED ring extracted from the camera image is the projection of the corresponding blade center into the two-dimensional image; substituting the three-dimensional coordinates of the blade centers, the corresponding two-dimensional image coordinates, and the camera's intrinsic and distortion parameters into a PnP model and solving gives the pose of the upper quad-rotor's downward-looking camera in the body coordinate system of the lower unmanned aerial vehicle whose rings were detected, and combining this with the coordinate-system transformation matrix obtained from the joint calibration of camera and body gives the pose of the lower quad-rotor in the body coordinate system of the upper quad-rotor.
2. The method for estimating the relative pose of a quad-rotor unmanned aerial vehicle based on LED ring detection according to claim 1, wherein in step (2) the screening of the rings after edge extraction comprises two criteria:
(1) Screening by the characteristic that the four rings have the same size, specifically: calculate the ratio of the radius of each ring i to the radius of every other ring; if the ratio lies between 0.9 and 1.1, the radii of the two rings are regarded as approximately equal; if the number of rings with radius approximately equal to that of ring i is less than 3, ring i is not an LED ring on an unmanned aerial vehicle's blade and is screened out;
(2) Screening by the characteristic that the ring centers lie close together, specifically: calculate the Euclidean distance between the center of each ring i and the centers of the other rings; if the result is less than 6 times the radius of ring i, the two rings are regarded as close together; if the number of rings close to ring i is not equal to 3, ring i is not an LED ring on an unmanned aerial vehicle's blade and is deleted.
3. The method for estimating the relative pose of a quad-rotor unmanned aerial vehicle based on LED ring detection according to claim 1 or 2, wherein in step (3), when solving the pose, the transformation matrices among the coordinate systems of the upper quad-rotor, the lower detected unmanned aerial vehicle, and the upper quad-rotor's downward-looking camera are obtained as follows:
Suppose quad-rotor A is located above quad-rotor B, and A estimates the relative pose of B through its downward-looking camera C; let the coordinates of some point P in space be P_A in A's body coordinate system, P_B in B's body coordinate system, and P_C in C's coordinate system, and let the transformation matrix from the C coordinate system to the A coordinate system be R_AC; jointly calibrating the camera and the quad-rotor gives the conversion relation:

P_A = R_AC · P_C

solving the PnP problem gives another transformation, where R_BC denotes the transformation matrix from coordinate system C to coordinate system B:

P_B = R_BC · P_C

from these two equations, the transformation between P_A and P_B, i.e., the relative pose between the A and B body coordinate systems, follows as shown below, where R′_BC denotes the inverse matrix of R_BC:

P_A = R_AC · R′_BC · P_B
CN202011135322.9A 2020-10-22 2020-10-22 Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection Active CN112308900B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011135322.9A CN112308900B (en) 2020-10-22 2020-10-22 Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection


Publications (2)

Publication Number Publication Date
CN112308900A 2021-02-02
CN112308900B 2022-10-21

Family

ID=74326926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011135322.9A Active CN112308900B (en) 2020-10-22 2020-10-22 Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection

Country Status (1)

Country Link
CN (1) CN112308900B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113343355B (en) * 2021-06-08 2022-10-18 四川大学 Aircraft skin profile detection path planning method based on deep learning
CN113819889B (en) * 2021-09-09 2024-01-26 中国电子科技集团公司第五十四研究所 Relative ranging and attitude measuring method based on aircraft rotor wing light source detection

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106326892B (en) * 2016-08-01 2020-06-09 西南科技大学 Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN108453738B (en) * 2018-03-30 2021-04-16 东南大学 Control method for four-rotor aircraft aerial autonomous grabbing operation based on Opencv image processing
CN108711166B (en) * 2018-04-12 2022-05-03 浙江工业大学 Monocular camera scale estimation method based on quad-rotor unmanned aerial vehicle
CN109764864B (en) * 2019-01-16 2022-10-21 南京工程学院 Color identification-based indoor unmanned aerial vehicle pose acquisition method and system

Also Published As

Publication number Publication date
CN112308900A (en) 2021-02-02

Similar Documents

Publication Publication Date Title
Martínez et al. On-board and ground visual pose estimation techniques for UAV control
CN109949361A (en) A kind of rotor wing unmanned aerial vehicle Attitude estimation method based on monocular vision positioning
CN106326892B (en) Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
JP6496323B2 (en) System and method for detecting and tracking movable objects
CN109598765B (en) Monocular camera and millimeter wave radar external parameter combined calibration method based on spherical calibration object
US20200301015A1 (en) Systems and methods for localization
CN110058602A (en) Multi-rotor unmanned aerial vehicle autonomic positioning method based on deep vision
CN117369489A (en) Collision avoidance system, depth imaging system, vehicle, map generator, and method thereof
Kalinov et al. High-precision uav localization system for landing on a mobile collaborative robot based on an ir marker pattern recognition
CN112308900B (en) Four-rotor unmanned aerial vehicle relative pose estimation method based on LED (light emitting diode) ring detection
CN108759826B (en) Unmanned aerial vehicle motion tracking method based on multi-sensing parameter fusion of mobile phone and unmanned aerial vehicle
KR20180044279A (en) System and method for depth map sampling
Sanchez-Lopez et al. Toward visual autonomous ship board landing of a VTOL UAV
Demonceaux et al. Omnidirectional vision on UAV for attitude computation
CN110675453B (en) Self-positioning method for moving target in known scene
CN109460046B (en) Unmanned aerial vehicle natural landmark identification and autonomous landing method
CN109341686B (en) Aircraft landing pose estimation method based on visual-inertial tight coupling
CN111562791A (en) System and method for identifying visual auxiliary landing of unmanned aerial vehicle cooperative target
CN107063261A (en) The multicharacteristic information terrestrial reference detection method precisely landed for unmanned plane
CN116578035A (en) Rotor unmanned aerial vehicle autonomous landing control system based on digital twin technology
CN114815871A (en) Vision-based autonomous landing method for vertical take-off and landing unmanned mobile platform
CN117523461B (en) Moving target tracking and positioning method based on airborne monocular camera
CN113655803A (en) System and method for calibrating course of rotor unmanned aerial vehicle in tunnel environment based on vision
Lin et al. Real-time 6DoF deck pose estimation and target tracking for landing an UAV in a cluttered shipboard environment using on-board vision
CN109764864B (en) Color identification-based indoor unmanned aerial vehicle pose acquisition method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant