CN108594848B - Unmanned aerial vehicle staged autonomous landing method based on visual information fusion
- Publication number
- CN108594848B (Application CN201810270640.2A)
- Authority
- CN
- China
- Prior art keywords
- landmark
- unmanned aerial vehicle
- camera
- Prior art date
- Legal status: Active
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The invention relates to an unmanned aerial vehicle staged autonomous landing method based on visual information fusion, which comprises the following steps: 1) landmark making: taking the corresponding target landing point on the unmanned boat as a landmark, attaching an AprilTags tag to the landmark, and adjusting the angle of the camera of the unmanned aerial vehicle; 2) image processing: according to the parameter information of the camera and the image information captured by the camera, acquiring the relative pose X_ct between the camera and the landmark when the landmark is found; 3) information fusion: fusing the relative pose X_ct between the camera and the landmark with the measurement data of the IMU to obtain the real-time relative pose X_vs of the unmanned boat in the unmanned aerial vehicle reference frame; 4) motion control: according to the real-time relative pose X_vs, a nested control scheme is adopted to ensure stable flight and path tracking, and a staged landing method is adopted to land on the boat. Compared with the prior art, the method has the advantages of real-time effectiveness, avoidance of lag, stability and safety.
Description
Technical Field
The invention relates to the technical field of intelligent offshore robots, in particular to a staged autonomous landing method of an unmanned aerial vehicle based on visual information fusion.
Background
With the progress of science and technology, unmanned systems are being applied more and more widely in professional fields such as agriculture, electric power and the ocean. As a favored member of unmanned systems, unmanned aerial vehicles (UAVs) have seen their development speed and fields of application expand markedly in recent years.
In the ocean field, unmanned aerial vehicles, which have limited endurance but a wide search range, are usually carried on unmanned boats, which have strong endurance but a narrow search range, to form a cooperative UAV-unmanned boat formation with complementary advantages for tasks such as maritime rescue, environmental monitoring and battlefield reconnaissance. The core technology of such formations is the autonomous navigation of the shipborne unmanned aerial vehicle.
Taking off, hovering and landing are the three basic problems of shipborne UAV autonomous navigation. Among them, autonomous landing of the drone is the most challenging. During landing, the unmanned aerial vehicle must effectively identify landmarks on a moving, swaying landing platform and control its pose accurately. At present, international research on autonomous landing of UAVs on boats is still limited, and the main challenges lie in two aspects:
First, the navigation precision of autonomous landing on the boat. The GPS of a small unmanned aerial vehicle typically has only meter-level precision, which cannot meet the accuracy requirement of the landing task; an INS/GPS integrated navigation system can only locate the UAV's own pose and lacks the relative pose between the UAV and the landing platform. Although many publications propose computer vision to assist navigation, their image processing involves a large amount of computation, the computing power of a typical onboard computer is limited, and the pose given by visual navigation often lags behind the real-time pose. Therefore, how to guarantee the real-time validity of the pose information while improving navigation precision is an urgent problem in the autonomous landing of the unmanned aerial vehicle.
Second, the safety of the unmanned aerial vehicle at the terminal stage of landing. During landing, some publications directly use the center point of the landmark as the reference position signal for control. However, this tends to cause the camera to lose sight of the landmark, especially on an unmanned boat where the landmark position is not fixed. In addition, because of the ground effect (when a moving body approaches the ground, the ground interferes with the aerodynamic forces it generates), the unmanned aerial vehicle has difficulty remaining stable when close to the unmanned boat's landing platform, so the success rate of such methods in practice is not high, and the safety of the UAV at the terminal stage of landing needs to be improved.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provide a staged autonomous landing method of an unmanned aerial vehicle based on visual information fusion.
The purpose of the invention can be realized by the following technical scheme:
an unmanned aerial vehicle staged autonomous landing method based on visual information fusion comprises the following steps:
1) landmark making: taking a corresponding target landing point on the unmanned ship as a landmark, attaching an AprilTags label to the landmark, and adjusting the angle of a camera of the unmanned aerial vehicle to ensure that the unmanned aerial vehicle can detect the landmark when approaching the landmark;
2) image processing: calculating the parameters of the AprilTags visual reference system configured in the on-board computer of the unmanned aerial vehicle according to the parameter information of the camera and the image information captured by the camera, and acquiring the relative pose X_ct between the camera and the landmark when the landmark is found;
3) information fusion: fusing the relative pose X_ct between the camera and the landmark with the measurement data of the IMU to obtain the real-time relative pose X_vs of the unmanned boat in the unmanned aerial vehicle reference frame;
4) motion control: according to the real-time relative pose X_vs, a nested control mode is adopted to ensure stable flight and path tracking, and a staged landing method is adopted to land on the boat.
The parameters of the AprilTags visual reference system in step 2) comprise the focal length F_w in pixels in the width direction, the focal length F_h in pixels in the height direction, and the image center position (C_w, C_h). The calculation formula is as follows:
wherein F and C are the focal length and center position in the corresponding direction, L_focus and L_receptor are the lens focal length and the photosensitive element size respectively, and N_pixel is the number of pixels; the parameters in the width and height directions are both calculated with this formula.
The step 3) specifically comprises the following steps:
31) the system state estimation comprises two stages, wherein the first stage is the stage in which no landmark is detected, and the second stage is the stage in which the landmark has been detected;
when no landmark is detected, the system state is the pose information X_v provided by the IMU, i.e. X = [X_v], and the system state is estimated as
X̂_k = F_{k-1} X̂_{k-1} + w_{k-1}
wherein X̂_k is the estimate of the system state at step k, X̂_{k-1} is the estimate at step k-1, and F_{k-1} is the Kalman filter state-transition matrix for the stage in which no landmark is detected; a constant-velocity model is used to predict the future state of the system, so F_{k-1} propagates each pose component by its corresponding velocity over one sampling interval;
Δt is the sampling interval and w_{k-1} is Gaussian white noise;
starting from the first detection of the landmark, the unmanned boat state X_s is added to the system state, i.e. X = [X_v X_s], and the Kalman filter state-transition matrix F_{k-1} is updated to the block-diagonal form F_{k-1} = diag(F_v, F_s),
wherein F_v is the block associated with the unmanned aerial vehicle and F_s is the block associated with the unmanned boat; only the motion of the unmanned boat in the horizontal plane is considered, and a constant-velocity model is likewise adopted for F_s;
32) obtaining the observation results of the IMU and the camera, wherein the observation result h_IMU of the IMU is:
h_IMU(X) = [z_lv φ_lv θ_lv ψ_lv u_lv v_lv]^T
wherein z_lv is the height, φ_lv, θ_lv, ψ_lv are the rotation angles about the three motion directions x, y, z respectively, and u_lv and v_lv are the forward and lateral velocities respectively; the corresponding Jacobian matrix is H_IMU;
the observation result h_tag(X_KF) of the camera is:
h_tag(X_KF) = ⊖(X_lv ⊕ X_vc) ⊕ (X_ls ⊕ X_st)
wherein X_vc is the pose of the camera in the unmanned aerial vehicle reference frame, X_st is the pose of the marker in the unmanned boat reference frame, X_lv is the estimated pose of the drone, X_ls is the estimated pose of the unmanned boat, and ⊕ and ⊖ are the coordinate transformation operators;
its corresponding Jacobian matrix is H_tag;
33) performing extended Kalman filtering on the observation results to obtain the real-time system state.
The step 3) further comprises the following steps:
34) using the historical poses of the unmanned aerial vehicle and the unmanned boat, the system state is corrected under the delay condition, and the real-time relative pose X_vs of the unmanned boat in the unmanned aerial vehicle reference frame is obtained.
In step 4), the nested control mode is specifically as follows:
six independent closed-loop PID controllers are adopted for control, wherein inner-loop attitude control ensures stable flight and outer-loop position control performs path tracking.
In the step 4), the staged boat landing method specifically comprises the following steps:
41) landmark finding: when the landmark is found for the first time, the system state is initialized and the unmanned aerial vehicle is made to track the landmark;
42) slow descent stage: the unmanned aerial vehicle is guided to the center of the landmark and kept within a cylindrical space centered on the landmark with radius r_loop and height h_loop; the next waypoint is searched vertically downwards with Δz as the step length, and whenever the view of the landmark is lost the UAV climbs by Δz until the landmark is found again;
43) rapid descent stage: a first height threshold h_land is set; when the height of the unmanned aerial vehicle is less than h_land, and the landmark has been in the camera's field of view for N consecutive frames and the rapid descent conditions are met, the unmanned aerial vehicle enters the rapid descent stage and directly takes 0 as the z-axis reference signal;
44) landing stage: a second height threshold h_min is set; when the height of the unmanned aerial vehicle falls below h_min, the propellers are shut off, so that the unmanned aerial vehicle falls freely and completes the landing.
The rapid descent conditions in step 43) are:
z - h_land ≤ h_loop
||(x, y)||_2 ≤ r_loop
wherein z is the height of the unmanned aerial vehicle relative to the unmanned boat and ||(x, y)||_2 is the horizontal distance between the unmanned aerial vehicle and the unmanned boat.
In step 44), if the unmanned aerial vehicle is above the second height threshold h_min and the camera loses the view of the landmark, the unmanned aerial vehicle is pulled up, whether in the slow descent stage or the rapid descent stage, until the landmark is found again.
Compared with the prior art, the invention has the following advantages:
First, real-time effectiveness: mature, open-source AprilTags are adopted for landmark identification and image processing, lowering the technical threshold; by using a monocular camera, computational complexity and equipment cost are reduced while precision is maintained.
Second, lag avoidance: the relative pose obtained by the camera is fused with the pose obtained by the IMU to improve precision, and the lag introduced by image processing is avoided by keeping a short history of states.
Third, stability and safety: the combination of the nested landing control scheme and the staged safe landing method prevents the UAV's camera from losing the view of the landmark during landing, while guaranteeing stability and safety when the UAV is close to the landing platform of the unmanned boat.
Drawings
Fig. 1 is a block diagram of a nesting control scheme for unmanned aerial vehicle landing.
Fig. 2 is a flow chart of a phased safe landing method of the unmanned aerial vehicle in the invention.
Fig. 3 is a schematic diagram of an autonomous landing method of the unmanned aerial vehicle in the invention.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments.
Examples
The following describes the technical solution of the present invention with reference to fig. 3.
For convenience of explanation, the following symbol conventions are made:
X_ij denotes the pose of reference frame j expressed in reference frame i, where a pose is defined as a 6-dimensional column vector [x y z φ θ ψ]^T, (x, y, z) being the position coordinates in the reference frame and (φ, θ, ψ) the rotation angles about the x, y, z axes, called roll, pitch and yaw respectively. The reference frames used are: the unmanned aerial vehicle reference frame {v}, the local reference frame {l}, the unmanned boat reference frame {s}, the camera reference frame {c} and the landmark reference frame {t}. Some basic symbols for reference-frame transformations are also defined: if i, j, k denote three reference frames, the symbol ⊕ represents the composition of transformations, satisfying X_ik = X_ij ⊕ X_jk, and the symbol ⊖ represents its inverse operation, X_ji = ⊖X_ij. In the method provided by the invention, the calculation is realized in software on the onboard computer of the unmanned aerial vehicle; the unmanned aerial vehicle is equipped with an inertial measurement unit to obtain real-time attitude information, and a camera is used to collect environmental information. Taking a quadrotor unmanned aerial vehicle landing on an electrically driven unmanned boat as an example, the implementation of the method comprises the following 4 specific steps.
Step 1: landmark making. The AprilTags labels are printed and attached to the corresponding target landing point on the unmanned boat to serve as the landmark. The camera of the unmanned aerial vehicle is angled slightly forward and downward so that the UAV can detect the landmark when approaching it.
Step 2: image processing. An AprilTags visual reference system is configured in the onboard computer of the drone. Using the parameter information of the camera and the image information captured by the camera, 4 important parameters are calculated, namely the focal lengths F_w and F_h in pixels in the width and height directions, and the image center position (C_w, C_h). The calculation formula is as follows:
wherein L_focus and L_receptor are the lens focal length and the photosensitive element size respectively, in millimeters, and N_pixel is the number of pixels; the parameters in the width and height directions are both calculated with this formula.
The calculated parameters are passed to the AprilTags visual reference system, which then returns the relative pose X_ct between the camera and the marker whenever the landmark is found.
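As an illustration of step 2, the sketch below computes the four parameters under the common pinhole-camera assumptions that the focal length in pixels equals the lens focal length divided by the photosensitive element size and multiplied by the pixel count, and that the image center lies at half the resolution; the numeric values are placeholders, since the patent gives its formula only as an image.

```python
def intrinsics_from_datasheet(l_focus_mm, l_receptor_mm, n_pixel):
    """Focal length in pixels and image center for one direction (width or height).

    Assumes F = L_focus / L_receptor * N_pixel and C = N_pixel / 2, the usual
    pinhole-model conversion; the patent's exact formula is not reproduced here.
    """
    f_pixels = l_focus_mm / l_receptor_mm * n_pixel
    c_pixels = n_pixel / 2.0
    return f_pixels, c_pixels

# Example with placeholder values: 3.6 mm lens, 4.8 mm x 3.6 mm sensor, 640x480 image.
Fw, Cw = intrinsics_from_datasheet(3.6, 4.8, 640)
Fh, Ch = intrinsics_from_datasheet(3.6, 3.6, 480)
print(Fw, Fh, Cw, Ch)  # parameters passed to the AprilTags visual reference system
```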
Step 3: information fusion. The relative pose X_ct between the camera and the landmark is fused with the measurement data of the IMU to obtain the real-time poses X_lv and X_ls of the unmanned aerial vehicle and the unmanned boat, and the relative pose X_vs. The process is divided into the following 4 sub-steps:
(1) System state estimation. The information fusion is divided into two stages: the first stage is the stage in which no landmark is detected, and the second stage is the stage in which the landmark has been detected.
In the stage in which no landmark is detected, the system state is the pose information provided by the IMU, X_v = [X_lv V_v W_v], where V_v is the velocity along the x, y, z directions and W_v is the angular velocity about the x, y, z directions. The system state is estimated as
X̂_k = F_{k-1} X̂_{k-1} + w_{k-1}
wherein w_{k-1} is Gaussian white noise; a constant-velocity model is used to predict the system state, so F_{k-1} takes the form that propagates the pose by the corresponding velocities over one sampling interval,
F_{k-1} = [[I_6, Δt·I_6], [0, I_6]]
where Δt is the sampling interval.
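A brief sketch of this constant-velocity prediction step for the UAV-only state X_v = [X_lv V_v W_v] is given below; the block layout of F_{k-1} and the process-noise value are assumptions consistent with the text, not the patent's exact matrices.

```python
import numpy as np

def constant_velocity_F(dt, n_pose=6):
    """State transition for [pose(6); velocity(3); angular velocity(3)]:
    pose_k = pose_{k-1} + dt * [V_v; W_v], velocities held constant."""
    F = np.eye(2 * n_pose)
    F[:n_pose, n_pose:] = dt * np.eye(n_pose)
    return F

def predict(x, P, dt, q=1e-3):
    """EKF prediction with Gaussian process noise w_{k-1} ~ N(0, Q)."""
    F = constant_velocity_F(dt)
    Q = q * np.eye(F.shape[0])        # process-noise covariance (placeholder value)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred
```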
Starting from the first detection of a landmark, the state of the unmanned boat is added to the system state, i.e. X = [X_v X_s], with X_v = [X_lv V_v W_v] and X_s = [X_ls V_s W_s]. F_{k-1} then takes the block-diagonal form F_{k-1} = diag(F_v, F_s), where F_v is the block associated with the unmanned aerial vehicle and F_s the block associated with the unmanned boat. Only the motion of the unmanned boat in the horizontal plane is considered, and a constant-velocity model is likewise adopted for F_s.
(2) Obtaining the sensor observations. The observation models of the IMU and the camera can both be expressed as
z[k] = h(X) + v[k]
where v[k] ~ N(0, R_k) is the observation noise and h(X) is a function of the state X.
The IMU's observation of the unmanned aerial vehicle consists of the UAV's height, attitude and velocity information, i.e.
h_IMU(X) = [z_lv φ_lv θ_lv ψ_lv u_lv v_lv]^T
where z_lv is the height, φ_lv, θ_lv, ψ_lv are the rotation angles about the three motion directions x, y, z, and u_lv and v_lv are the forward and lateral velocities respectively. The corresponding Jacobian matrix is H_IMU.
The camera's observation of the i-th landmark is
h_tag(X_KF) = ⊖(X_lv ⊕ X_vc) ⊕ (X_ls ⊕ X_st)
where h_tag(X_KF) is the camera observation, X_vc is the pose of the camera in the unmanned aerial vehicle reference frame, X_st is the pose of the marker in the unmanned boat reference frame, and ⊕ and ⊖ are the coordinate transformation operations defined above. The corresponding Jacobian matrix is H_tag.
(3) Extended Kalman filtering. The above nonlinear filtering problem is solved with an extended Kalman filter, since the state quantities follow a normal distribution X_KF ~ N(μ, Σ), where Σ is the covariance matrix. The extended Kalman filter is updated according to the standard prediction and measurement-update equations. Through the extended Kalman filtering, the real-time system state X = [X_v X_s] is obtained.
(4) Solving the state-lag problem. Because the image computation is expensive, AprilTags can only provide a result for time k-n, so an extended-Kalman-filter state estimate for time k+1 obtained from the state at time k-n would be inaccurate. This problem is solved by recording the historical poses of the drone and the unmanned boat: under the delay condition, the system state estimator is augmented with the n delayed states, the iterative calculation of the state estimate is carried out on the augmented state, and the Jacobian matrix of the IMU sensor model is modified accordingly. Since the observation in the previous step is made for a delayed state, for the j-th delayed state X_j the observation model of the camera is modified to act on that delayed state. After this correction, the augmented state X_DS can be updated with the extended Kalman filter. Once each delayed observation has been processed by the filter, it is discarded and shifted out of the state vector, so the filtering algorithm only needs to store a small segment of historical states additionally.
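One possible realization of this lag-handling idea, sketched under the assumption that a short buffer of past filtered states is kept so that a delayed AprilTags result can be matched to the state it actually observed and then discarded; this is a simplified stand-in for the augmented-state formulation described above, not the patent's exact scheme.

```python
class DelayedStateBuffer:
    """Keep the last n_max filtered states so that a camera measurement computed
    for step k - n can be matched to the state it observed, then discarded."""
    def __init__(self, n_max=10):
        self.n_max = n_max
        self.buffer = []                     # entries: (step, x, P)

    def push(self, step, x, P):
        self.buffer.append((step, x, P))
        if len(self.buffer) > self.n_max:    # only a small history segment is stored
            self.buffer.pop(0)

    def pop_delayed(self, step):
        """Return and remove the stored (x, P) for `step`, or None if too old."""
        for i, (k, x, P) in enumerate(self.buffer):
            if k == step:
                return self.buffer.pop(i)[1:]
        return None
```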
After the filtered system state information is obtained by solving the state-lag problem, the relative pose X_vs is calculated from the poses X_lv and X_ls of the unmanned aerial vehicle and the unmanned boat in the local reference frame:
X_vs = ⊖X_lv ⊕ X_ls
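A sketch of the ⊕ and ⊖ operations used to obtain X_vs from X_lv and X_ls, treating each pose [x y z φ θ ψ] as a 4x4 homogeneous transform; the Z-Y-X (roll-pitch-yaw) Euler convention is an assumption, since the patent defines the operators only abstractly.

```python
import numpy as np

def pose_to_T(pose):
    """[x, y, z, roll, pitch, yaw] -> 4x4 homogeneous transform (Z-Y-X Euler, assumed)."""
    x, y, z, r, p, yw = pose
    cr, sr, cp, sp, cy, sy = np.cos(r), np.sin(r), np.cos(p), np.sin(p), np.cos(yw), np.sin(yw)
    R = np.array([[cy*cp, cy*sp*sr - sy*cr, cy*sp*cr + sy*sr],
                  [sy*cp, sy*sp*sr + cy*cr, sy*sp*cr - cy*sr],
                  [-sp,   cp*sr,            cp*cr]])
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, [x, y, z]
    return T

def T_to_pose(T):
    R, t = T[:3, :3], T[:3, 3]
    pitch = -np.arcsin(R[2, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.array([t[0], t[1], t[2], roll, pitch, yaw])

def compose(a, b):          # a ⊕ b
    return T_to_pose(pose_to_T(a) @ pose_to_T(b))

def inverse(a):             # ⊖ a
    return T_to_pose(np.linalg.inv(pose_to_T(a)))

def relative_pose(X_lv, X_ls):
    """X_vs = ⊖X_lv ⊕ X_ls: pose of the unmanned boat in the UAV reference frame."""
    return compose(inverse(X_lv), X_ls)
```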
Step 4: motion control. The fused relative pose is used for motion control of the unmanned aerial vehicle. A nested control scheme is adopted, using 6 independent closed-loop PID controllers, where inner-loop attitude control ensures stable flight and outer-loop position control performs path tracking. At the same time, a staged safe landing method is adopted. It is divided into the following 4 sub-steps:
(1) when the landmark is found for the first time, the system state is initialized, and the unmanned aerial vehicle is enabled to track the landmark.
(2) The drone is guided to the center of the landmark and then kept within a cylindrical space centered on the landmark with radius r_loop and height h_loop; the next waypoint is searched vertically downwards with step length Δz to descend. Once the view of the landmark is lost, the drone climbs by Δz until the landmark is rediscovered. This phase is called the slow descent phase.
(3) A smaller height h_land is set; if the height is already below h_land, the drone is considered ready to land directly, without lowering the waypoint further. If the landmark is in the camera view for N consecutive frames and satisfies
z - h_land ≤ h_loop,
||(x, y)||_2 ≤ r_loop,
where z is the height of the unmanned aerial vehicle relative to the unmanned boat and ||(x, y)||_2 is the horizontal distance between the unmanned aerial vehicle and the unmanned boat, the fast descent phase can be entered, taking 0 directly as the reference signal for the z-axis.
(4) When the drone descends to a certain height, the camera can no longer identify the landmark at such a short distance, so positioning becomes impossible. Therefore, a very small height h_min is set; once the drone is below this height, the propellers are shut off and the drone lands in free fall. However, if the camera loses the view of the landmark while still above h_min, the drone should be pulled up, whether in the fast or slow descent phase, until the landmark is found again.
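To make the four-stage logic concrete, the following is a hedged sketch of the per-cycle descent decision; the threshold values r_loop, h_loop, h_land, h_min, Δz and N are placeholder parameters, as the patent does not fix them.

```python
def landing_step(state, z, xy_dist, landmark_visible, frames_visible,
                 h_loop=1.0, r_loop=0.5, h_land=0.3, h_min=0.1, dz=0.2, N=10):
    """Return (next_state, z_reference) for one control cycle of the staged landing.

    Stages follow the description: slow descent inside the cylinder (r_loop, h_loop),
    fast descent when z - h_land <= h_loop and ||(x, y)|| <= r_loop hold for N frames,
    free-fall touchdown below h_min, and pull-up whenever the landmark is lost above h_min.
    """
    if z < h_min:
        return "touchdown", None                      # shut off propellers, free fall
    if not landmark_visible:
        return "pull_up", z + dz                      # climb by dz until landmark reacquired
    if state != "fast_descent":
        fast_ok = (z - h_land <= h_loop) and (xy_dist <= r_loop) and (frames_visible >= N)
        if fast_ok:
            return "fast_descent", 0.0                # 0 used directly as z-axis reference
        return "slow_descent", z - dz                 # next waypoint one step lower
    return "fast_descent", 0.0
```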
Claims (7)
1. An unmanned aerial vehicle staged autonomous landing method based on visual information fusion is characterized by comprising the following steps:
1) landmark making: taking a corresponding target landing point on the unmanned ship as a landmark, attaching an AprilTags label to the landmark, and adjusting the angle of a camera of the unmanned aerial vehicle to ensure that the unmanned aerial vehicle can detect the landmark when approaching the landmark;
2) image processing: calculating the parameters of the AprilTags visual reference system configured in the on-board computer of the unmanned aerial vehicle according to the parameter information of the camera and the image information captured by the camera, and acquiring the relative pose X_ct between the camera and the landmark when the landmark is found;
3) information fusion: fusing the relative pose X_ct between the camera and the landmark with the measurement data of the IMU to obtain the real-time relative pose X_vs of the unmanned boat in the unmanned aerial vehicle reference frame, which specifically comprises the following steps:
31) the system state estimation is divided into two stages, wherein the first stage is a stage in which no landmark is detected, and the second stage is a stage in which the landmark is detected;
when no landmark is detected, the system state is the pose information X_v provided by the IMU, i.e. X = [X_v], and the system state is estimated as
X̂_k = F_{k-1} X̂_{k-1} + w_{k-1}
wherein X̂_k is the estimate of the system state at step k, X̂_{k-1} is the estimate at step k-1, F_{k-1} is the Kalman filter state-transition matrix for the stage in which no landmark is detected, w_{k-1} is Gaussian white noise, and Δt is the sampling interval;
starting from the first detection of the landmark, the unmanned boat state X_s is added to the system state, i.e. X = [X_v X_s], and the Kalman filter state-transition matrix F_{k-1} is updated to the block-diagonal form F_{k-1} = diag(F_v, F_s),
wherein F_v is the block associated with the unmanned aerial vehicle and F_s is the block associated with the unmanned boat;
32) obtaining the observation results of the IMU and the camera, wherein the observation result h_IMU of the IMU is:
h_IMU(X) = [z_lv φ_lv θ_lv ψ_lv u_lv v_lv]^T
wherein z_lv is the height, φ_lv, θ_lv, ψ_lv are the rotation angles about the three motion directions x, y, z respectively, u_lv and v_lv are the forward and lateral velocities respectively, and the corresponding Jacobian matrix is H_IMU;
the observation result h_tag(X_KF) of the camera is:
h_tag(X_KF) = ⊖(X_lv ⊕ X_vc) ⊕ (X_ls ⊕ X_st)
wherein X_vc is the pose of the camera in the unmanned aerial vehicle reference frame, X_st is the pose of the marker in the unmanned boat reference frame, X_lv is the estimated pose of the drone, X_ls is the estimated pose of the unmanned boat, and ⊕ and ⊖ are the coordinate transformation operators;
its corresponding Jacobian matrix is H_tag;
33) performing extended Kalman filtering on the observation result to obtain a real-time system state;
4) motion control: according to the real-time relative pose X_vs, a nested control mode is adopted to ensure stable flight and path tracking, and a staged landing method is adopted to land on the boat.
2. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion according to claim 1, wherein the AprilTags visual reference system parameters in step 2) include the focal length F_w in pixels in the width direction, the focal length F_h in pixels in the height direction, and the image center position (C_w, C_h), the calculation formula being as follows:
wherein F and C are the focal length and center position in the corresponding direction, L_focus and L_receptor are the lens focal length and the photosensitive element size respectively, and N_pixel is the number of pixels.
3. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion as claimed in claim 2, wherein said step 3) further comprises the steps of:
34) using the historical poses of the unmanned aerial vehicle and the unmanned boat, the system state is corrected under the delay condition, and the real-time relative pose X_vs of the unmanned boat in the unmanned aerial vehicle reference frame is obtained.
4. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion according to claim 1, wherein in step 4), the nested control mode is specifically:
six independent closed-loop PID controllers are adopted for control, wherein inner-loop attitude control ensures stable flight and outer-loop position control performs path tracking.
5. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion according to claim 1, wherein in the step 4), the staged landing method specifically comprises the following steps:
41) landmark finding: when the landmark is found for the first time, the system state is initialized and the unmanned aerial vehicle is made to track the landmark;
42) slow descent stage: the unmanned aerial vehicle is guided to the center of the landmark and kept within a cylindrical space centered on the landmark with radius r_loop and height h_loop; the next waypoint is searched vertically downwards with Δz as the step length to descend, and when the view of the landmark is lost the UAV climbs by Δz until the landmark is found again;
43) rapid descent stage: a first height threshold h_land is set; when the height of the unmanned aerial vehicle is less than h_land, and the landmark has been in the camera's field of view for N consecutive frames and the rapid descent conditions are met, the unmanned aerial vehicle enters the rapid descent stage and directly takes 0 as the z-axis reference signal;
44) landing stage: a second height threshold h_min is set; when the height of the unmanned aerial vehicle falls below h_min, the propellers are shut off, so that the unmanned aerial vehicle falls freely and completes the landing.
6. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion according to claim 5, wherein the rapid descent conditions in step 43) are:
z - h_land ≤ h_loop
||(x, y)||_2 ≤ r_loop
wherein z is the height of the unmanned aerial vehicle relative to the unmanned boat and ||(x, y)||_2 is the horizontal distance between the unmanned aerial vehicle and the unmanned boat.
7. The unmanned aerial vehicle staged autonomous landing method based on visual information fusion according to claim 5, wherein in step 44), if the unmanned aerial vehicle is above the second height threshold h_min and the camera loses the view of the landmark, the unmanned aerial vehicle is pulled up, whether in the slow descent stage or the rapid descent stage, until the landmark is found again.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810270640.2A CN108594848B (en) | 2018-03-29 | 2018-03-29 | Unmanned aerial vehicle staged autonomous landing method based on visual information fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108594848A CN108594848A (en) | 2018-09-28 |
CN108594848B true CN108594848B (en) | 2021-01-22 |
Family
ID=63623841
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810270640.2A Active CN108594848B (en) | 2018-03-29 | 2018-03-29 | Unmanned aerial vehicle staged autonomous landing method based on visual information fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108594848B (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109341700B (en) * | 2018-12-04 | 2023-06-30 | 中国航空工业集团公司西安航空计算技术研究所 | Visual auxiliary landing navigation method for fixed-wing aircraft under low visibility |
CN109525220B (en) * | 2018-12-10 | 2019-08-30 | 中国人民解放军国防科技大学 | Gaussian mixture CPHD filtering method with track association and extraction capability |
CN111323005A (en) * | 2018-12-17 | 2020-06-23 | 北京华航无线电测量研究所 | Visual auxiliary cooperative landmark design method for omnidirectional autonomous precise landing of unmanned helicopter |
CN109823552B (en) * | 2019-02-14 | 2021-02-12 | 深圳市多翼创新科技有限公司 | Vision-based unmanned aerial vehicle accurate landing method, storage medium, device and system |
CN110058604A (en) * | 2019-05-24 | 2019-07-26 | 中国科学院地理科学与资源研究所 | A kind of accurate landing system of unmanned plane based on computer vision |
CN112099527B (en) * | 2020-09-17 | 2021-07-23 | 湖南大学 | Control method and system for autonomous landing of mobile platform of vertical take-off and landing unmanned aerial vehicle |
CN112286216A (en) * | 2020-11-11 | 2021-01-29 | 鹏城实验室 | Unmanned aerial vehicle autonomous landing unmanned ship method and system based on visual identification |
CN112419403B (en) * | 2020-11-30 | 2024-10-11 | 海南大学 | Indoor unmanned aerial vehicle positioning method based on two-dimensional code array |
CN112987765B (en) * | 2021-03-05 | 2022-03-15 | 北京航空航天大学 | Precise autonomous take-off and landing method of unmanned aerial vehicle/boat simulating attention distribution of prey birds |
CN114326765B (en) * | 2021-12-01 | 2024-02-09 | 爱笛无人机技术(南京)有限责任公司 | Landmark tracking control system and method for unmanned aerial vehicle visual landing |
CN117250995B (en) * | 2023-11-20 | 2024-02-02 | 西安天成益邦电子科技有限公司 | Bearing platform posture correction control method and system |
CN117806328B (en) * | 2023-12-28 | 2024-09-17 | 华中科技大学 | Unmanned ship berthing vision guiding control method and system based on reference marks |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090306840A1 (en) * | 2008-04-08 | 2009-12-10 | Blenkhorn Kevin P | Vision-based automated landing system for unmanned aerial vehicles |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2434256A3 (en) * | 2010-09-24 | 2014-04-30 | Honeywell International Inc. | Camera and inertial measurement unit integration with navigation data feedback for feature tracking |
CN103662091A (en) * | 2013-12-13 | 2014-03-26 | 北京控制工程研究所 | High-precision safe landing guiding method based on relative navigation |
CN104062977A (en) * | 2014-06-17 | 2014-09-24 | 天津大学 | Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM |
CN104679013A (en) * | 2015-03-10 | 2015-06-03 | 无锡桑尼安科技有限公司 | Unmanned plane automatic landing system |
CN105021184A (en) * | 2015-07-08 | 2015-11-04 | 西安电子科技大学 | Pose estimation system and method for visual carrier landing navigation on mobile platform |
CN105335733A (en) * | 2015-11-23 | 2016-02-17 | 西安韦德沃德航空科技有限公司 | Autonomous landing visual positioning method and system for unmanned aerial vehicle |
CN106708066A (en) * | 2015-12-20 | 2017-05-24 | 中国电子科技集团公司第二十研究所 | Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation |
CN107544550A (en) * | 2016-06-24 | 2018-01-05 | 西安电子科技大学 | A kind of Autonomous Landing of UAV method of view-based access control model guiding |
CN106774386A (en) * | 2016-12-06 | 2017-05-31 | 杭州灵目科技有限公司 | Unmanned plane vision guided navigation landing system based on multiple dimensioned marker |
CN107240063A (en) * | 2017-07-04 | 2017-10-10 | 武汉大学 | A kind of autonomous landing method of rotor wing unmanned aerial vehicle towards mobile platform |
CN107687850A (en) * | 2017-07-26 | 2018-02-13 | 哈尔滨工业大学深圳研究生院 | A kind of unmanned vehicle position and orientation estimation method of view-based access control model and Inertial Measurement Unit |
Non-Patent Citations (4)
Title |
---|
Autonomous takeoff, tracking and landing of a UAV on a moving UGV using onboard monocular vision; Cheng Hui; Proceedings of the 32nd Chinese Control Conference; 2013-10-21; pp. 5895-5901 *
Towards autonomous tracking and landing on moving target; Lingyun Xu; 2016 IEEE International Conference on Real-time Computing and Robotics (RCAR); 2016-06-10; pp. 620-625 *
Research on autonomous landing of a quadrotor UAV on a moving platform (in Chinese); Jia Peiyang; Computer Science; 2017-12-31; pp. 520-523 *
Research on vision-based autonomous landing guidance for shipborne UAVs (in Chinese); Liu Gang; Ship Science and Technology; 2017-10-31; pp. 189-191 *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||