CN108594848A - Unmanned aerial vehicle staged autonomous landing method based on visual information fusion - Google Patents

Unmanned aerial vehicle staged autonomous landing method based on visual information fusion

Info

Publication number
CN108594848A
Authority
CN
China
Prior art keywords
stage
uav
landmark
camera
unmanned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810270640.2A
Other languages
Chinese (zh)
Other versions
CN108594848B (en)
Inventor
袁野
陆宇
张卫东
姚瑞文
胡智焕
李茂峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University
Priority to CN201810270640.2A
Publication of CN108594848A
Application granted
Publication of CN108594848B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention relates to a staged autonomous landing method for an unmanned aerial vehicle (UAV) based on visual information fusion, comprising the following steps: 1) landmark preparation: an AprilTags tag is affixed at the corresponding target landing point on the unmanned surface vehicle (USV) to serve as the landmark, and the angle of the UAV camera is adjusted; 2) image processing: from the camera's parameter information and the images it captures, the relative pose X_ct between the camera and the landmark is obtained whenever the landmark is detected; 3) information fusion: after the relative pose X_ct between the camera and the landmark is fused with the measurement data of the IMU, the real-time relative pose X_vs of the USV in the UAV reference frame is obtained; 4) motion control: based on the real-time relative pose X_vs, a nested control scheme ensures flight stability and path tracking, while a staged landing method performs the landing. Compared with the prior art, the present invention has the advantages of real-time effectiveness, lag avoidance, and stable, safe operation.

Description

Unmanned aerial vehicle staged autonomous landing method based on visual information fusion
Technical field
The present invention relates to the technical field of intelligent marine robotics, and more particularly to a staged autonomous landing method for an unmanned aerial vehicle based on visual information fusion.
Background art
With the development of science and technology, unmanned systems are being applied ever more widely in professional domains such as agriculture, electric power, and the ocean. The unmanned aerial vehicle (UAV), as the "favorite" among unmanned systems, has seen its pace of development and range of applications grow continuously in recent years.
In the marine field, a UAV, whose endurance is limited but whose search range is wide, is usually mounted on an unmanned surface vehicle (USV), whose endurance is stronger but whose search range is smaller, forming a UAV-USV cooperative formation with complementary advantages for tasks such as maritime rescue, environmental monitoring, and battlefield reconnaissance. Its core technology is the autonomous navigation of the shipborne UAV.
Takeoff, hovering, and landing are the three basic problems of shipborne-UAV autonomous navigation, and among them autonomous landing is the most challenging. During landing, the UAV faces a moving and swaying landing platform on which it must perform effective landmark recognition and accurate pose control. At present there is little research worldwide on autonomous UAV landing technology; the main challenges lie in two aspects:
First, the navigation accuracy of autonomous UAV landing. The GPS of a small UAV often has only meter-level accuracy, which cannot meet the precision requirement of the landing task; an INS/GPS integrated navigation system can only localize the UAV's own pose and lacks the relative pose between the UAV and the ship platform. Although many publications propose computer-vision-assisted navigation, their image processing is computationally heavy, the computing power of a typical onboard computer is limited, and the pose provided by visual navigation therefore often lags behind the real-time pose. How to guarantee the real-time validity of the pose information while improving navigation accuracy is thus an urgent problem in autonomous UAV landing.
Second, the safety of the final landing phase. During landing, some publications directly use the center point of the landmark as the reference position signal for control. However, this easily causes the camera to lose sight of the ground target, especially on a USV, where the landmark position is not fixed. Furthermore, because of the ground effect (the aerodynamic interference generated by the ground when a moving object operates close to it), the UAV has difficulty remaining stable when approaching the USV landing platform, so the success rate of the above methods is low in practice, and the safety of the final landing phase needs improvement.
Summary of the invention
The object of the present invention is to overcome the above-mentioned drawbacks of the prior art and to provide a staged autonomous landing method for a UAV based on visual information fusion.
The object of the present invention can be achieved through the following technical solution:
A staged autonomous landing method for a UAV based on visual information fusion comprises the following steps:
1) Landmark preparation: an AprilTags tag is affixed at the corresponding target landing point on the USV to serve as the landmark, and the angle of the UAV camera is adjusted to ensure that the UAV can detect the landmark when approaching it;
2) Image processing: from the camera's parameter information and the images it captures, the parameters of the AprilTags visual reference system configured on the UAV's onboard computer are calculated, and the relative pose X_ct between the camera and the landmark is obtained whenever the landmark is detected;
3) Information fusion: after the relative pose X_ct between the camera and the landmark is fused with the measurement data of the IMU, the real-time relative pose X_vs of the USV in the UAV reference frame is obtained;
4) Motion control: based on the real-time relative pose X_vs, a nested control scheme ensures flight stability and path tracking, while a staged landing method performs the landing.
In step 2), the parameters of the AprilTags visual reference system comprise the focal length F_w along the width direction in pixels, the focal length F_h along the height direction in pixels, and the center (C_w, C_h) of the image, calculated as:

$$F = \frac{L_{focus}}{L_{receptor}} N_{pixel},\qquad C = \frac{N_{pixel}}{2}$$

where F and C are respectively the focal length and the center position along a given direction, L_focus and L_receptor are respectively the lens focal length and the photosensitive element size, and N_pixel is the pixel count; the parameters of the width direction and the height direction are each calculated with this formula.
Step 3) specifically comprises the following steps:

31) System state estimation, divided into two stages: the first stage is before the landmark is detected, and the second stage is after the landmark is detected;

before the landmark is detected, the system state is the pose information X_v provided by the IMU, i.e. X = [X_v], and estimating this state gives:

$$\hat{X}_k = F_{k-1}\hat{X}_{k-1} + w_{k-1}$$

where $\hat{X}_k$ is the system state estimate at step k, $\hat{X}_{k-1}$ is the system state estimate at step k-1, and F_{k-1} is the Kalman filter parameter of the stage before the landmark is detected; a constant-velocity model is used here to predict the future system state, i.e. F_{k-1} satisfies

$$F_{k-1} = \begin{bmatrix} I & \Delta t\, I \\ 0 & I \end{bmatrix}$$

where Δt is the sampling interval and w_{k-1} is white Gaussian noise;

from the first detection of the landmark onward, the USV state X_s is added to the system state, i.e. X = [X_v X_s], and the Kalman filter parameter F_{k-1} is updated, giving:

$$F_{k-1} = \begin{bmatrix} F^v_{k-1} & 0 \\ 0 & F^s_{k-1} \end{bmatrix}$$

where F^v_{k-1} is the Kalman filter parameter associated with the UAV and F^s_{k-1} is the one associated with the USV; only the planar motion of the USV is considered here, again with a constant-velocity model for F^s_{k-1};

32) the observations of the IMU and the camera are obtained, where the IMU observation h_IMU is:

$$h_{IMU}(X) = [z_{lv}\ \phi_{lv}\ \theta_{lv}\ \psi_{lv}\ u_{lv}\ v_{lv}]^T$$

where z_lv is the height, φ_lv, θ_lv and ψ_lv are respectively the rotation angles about the three motion directions x, y, z, and u_lv and v_lv are respectively the forward and lateral velocities; the corresponding Jacobian matrix is H_IMU;

the camera observation h_tag(X_KF) is:

$$h_{tag}(X_{KF}) = \ominus(X_{lv} \oplus X_{vc}) \oplus (X_{ls} \oplus X_{st})$$

where X_vc is the pose of the camera in the UAV frame, X_st is the pose of the tag in the USV frame, X_lv is the estimated UAV pose, X_ls is the estimated USV pose, and ⊕ and ⊖ are the coordinate transformation operations; the corresponding Jacobian matrix is H_tag;

33) extended Kalman filtering is applied to the observations to obtain the real-time system state.
Step 3) further comprises the following step:

34) using the recorded history poses of the UAV and the USV, the delayed measurements are applied to the corresponding past system states, yielding the real-time relative pose X_vs of the USV in the UAV reference frame.
In step 4), the nested control scheme is specifically:

six independent closed-loop PID controllers are used, where the inner-loop attitude control ensures flight stability and the outer-loop position control performs path tracking.
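As an illustration only (not the patent's flight code; the gains, loop rate, and sign conventions below are assumptions), the nested scheme can be sketched as three outer-loop position PID controllers feeding three inner-loop attitude PID controllers:

```python
# Sketch of the nested control scheme: 6 independent closed-loop PID controllers.
# Outer loop: x, y, z position errors -> pitch/roll references and thrust.
# Inner loop: roll, pitch, yaw errors -> moment commands.
# All gains and the 50 Hz loop rate are illustrative assumptions.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

DT = 0.02  # 50 Hz control loop (assumed)
pos = {ax: PID(1.2, 0.0, 0.4, DT) for ax in ("x", "y", "z")}           # outer loop
att = {ax: PID(4.0, 0.1, 0.8, DT) for ax in ("roll", "pitch", "yaw")}  # inner loop

def control_step(x_vs, roll, pitch, yaw):
    """x_vs: (x, y, z) of the USV landmark in the UAV frame from the fused pose."""
    ref_pitch = pos["x"].step(x_vs[0])    # lean forward toward the landmark
    ref_roll = -pos["y"].step(x_vs[1])    # lean sideways toward the landmark
    thrust = pos["z"].step(x_vs[2])       # track the vertical reference
    return (thrust,
            att["roll"].step(ref_roll - roll),
            att["pitch"].step(ref_pitch - pitch),
            att["yaw"].step(0.0 - yaw))   # hold heading
```

Cascading the loops this way lets the fast inner loop stabilize the attitude independently of how the slower outer loop steers the position references toward the landmark.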
In step 4), the staged landing method specifically comprises the following steps:
41) Landmark search: when the landmark is detected for the first time, the system state is initialized and the UAV is made to track the landmark;

42) slow-descent stage: the UAV is guided to the center of the landmark and kept within a cylindrical space centered on the landmark with radius r_loop and height h_loop, and, taking Δz as the step size, the next waypoint vertically below is selected for descent; whenever the landmark's field of view is lost, the UAV climbs by Δz until the landmark is found again;

43) rapid-descent stage: a first height threshold h_land is set; when the UAV's height is below h_land, no further stepping descent is deemed necessary and the UAV prepares to land directly; if the landmark stays within the camera's field of view for N consecutive frames and the rapid-descent condition is met, the rapid-descent stage is entered and 0 is used directly as the z-axis reference signal;

44) touchdown stage: a second height threshold h_min is set; when the UAV's height is below h_min, the propellers are shut off and the UAV free-falls to complete the landing.
The rapid-descent condition in step 43) is:

z − h_land ≤ h_loop
||(x, y)||_2 ≤ r_loop

where z is the height of the UAV relative to the USV and ||(x, y)||_2 is the horizontal distance between the UAV and the USV.
In step 44), if the camera loses sight of the ground target while the UAV is still above the second height threshold h_min, whether in the slow-descent stage or the rapid-descent stage, the UAV is pulled up until the landmark is found again.
Compared with the prior art, the present invention has the following advantages:
First, real-time effectiveness: landmark recognition and image processing use the mature open-source AprilTags system, which lowers the technical threshold; computational complexity is reduced, and a monocular camera keeps the equipment cost low while guaranteeing accuracy.
Second, lag avoidance: the relative pose obtained by the camera is fused with the pose obtained by the IMU to improve accuracy, and the lag introduced by image processing is avoided by augmenting the state with history states.
Third, stability and safety: combining the nested landing control scheme of the UAV with the staged safe landing method prevents the UAV camera from losing sight of the ground target during landing, while guaranteeing the UAV's stability and safety when approaching the USV landing platform.
Description of the drawings
Fig. 1 is a block diagram of the nested landing control scheme of the UAV in the present invention.
Fig. 2 is a flow diagram of the staged safe landing method of the UAV in the present invention.
Fig. 3 is a schematic diagram of the autonomous landing method of the UAV in the present invention.
Detailed description of the embodiments
The present invention is described in detail below with reference to the drawings and a specific embodiment.
Embodiment
The technical solution of the present invention is described in a specific implementation with reference to Fig. 3.
For convenience of description, the following notation conventions are adopted first:
X_ij denotes the pose of reference frame j in reference frame i, where a pose is defined as the 6-dimensional vector [x y z φ θ ψ]^T; (x, y, z) are the position coordinates in the reference frame, and (φ, θ, ψ) are the angles of rotation about the x-, y-, and z-axes, called the roll, pitch, and yaw angles respectively. The reference frames used are: the UAV frame {v}, the local frame {l}, the USV frame {s}, the camera frame {c}, and the tag frame {t}. Basic symbols for frame transformations are also defined: if i, j, k denote three reference frames, the symbol ⊕ denotes the composition of transformations and satisfies X_ik = X_ij ⊕ X_jk, and the symbol ⊖ denotes the inverse operation of ⊕ and satisfies X_ji = ⊖X_ij. In the method proposed by the present invention, the computation is implemented in software on the UAV's onboard computer; the UAV is fitted with an inertial measurement unit to obtain real-time pose information and with a camera to acquire environment information. Taking an electrically driven quadrotor UAV landing on a USV as the example, the implementation of the method comprises the following four specific steps.
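For concreteness, the ⊕ and ⊖ operators on such 6-dimensional poses can be realized through 4×4 homogeneous transforms, as in the following sketch (an illustrative implementation; the ZYX yaw-pitch-roll Euler convention is an assumption, since the patent does not state one):

```python
# Sketch of the pose composition (+) and inversion (-) operators on
# poses [x y z phi theta psi], via 4x4 homogeneous transforms.
import numpy as np
from scipy.spatial.transform import Rotation

def to_matrix(pose):
    pose = np.asarray(pose, dtype=float)
    T = np.eye(4)
    T[:3, :3] = Rotation.from_euler("ZYX", pose[5:2:-1]).as_matrix()  # psi, theta, phi
    T[:3, 3] = pose[:3]
    return T

def from_matrix(T):
    psi, theta, phi = Rotation.from_matrix(T[:3, :3]).as_euler("ZYX")
    return np.concatenate([T[:3, 3], [phi, theta, psi]])

def compose(X_ij, X_jk):   # X_ik = X_ij (+) X_jk
    return from_matrix(to_matrix(X_ij) @ to_matrix(X_jk))

def invert(X_ij):          # X_ji = (-) X_ij
    return from_matrix(np.linalg.inv(to_matrix(X_ij)))
```

With these operators, the relative-pose computation used later in step 3 reads X_vs = compose(invert(X_lv), X_ls).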
Step 1: landmark preparation. A tag from AprilTags is printed and affixed at the corresponding target landing point on the USV to serve as the landmark. The angle of the UAV camera is adjusted to point forward and slightly downward, ensuring that the UAV can detect the landmark when approaching it.
Step 2: image processing. The AprilTags visual reference system is configured on the UAV's onboard computer. From the camera's parameter information and the captured images, four important parameters are calculated: the focal lengths F_w and F_h in pixels along the width and height directions, and the center (C_w, C_h) of the image, as follows:

$$F = \frac{L_{focus}}{L_{receptor}} N_{pixel},\qquad C = \frac{N_{pixel}}{2}$$

where L_focus and L_receptor are respectively the lens focal length and the photosensitive element size, in millimeters, and N_pixel is the pixel count. The parameters of the width direction and the height direction are each calculated with this formula.
The calculated parameters are passed to the AprilTags visual reference system, which then returns the relative pose X_ct between the camera and the tag whenever the landmark is detected.
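A minimal sketch of this parameter computation follows (the lens and sensor values are hypothetical, chosen only to illustrate the formula):

```python
# Sketch: the four AprilTags camera parameters from lens and sensor data.
# F = L_focus / L_receptor * N_pixel converts the metric focal length to pixels;
# C = N_pixel / 2 places the principal point at the image center.
def pixel_focal_and_center(l_focus_mm, l_receptor_mm, n_pixel):
    return l_focus_mm / l_receptor_mm * n_pixel, n_pixel / 2.0

# Hypothetical camera: 3.6 mm lens, 4.8 x 3.6 mm sensor, 640 x 480 image.
Fw, Cw = pixel_focal_and_center(3.6, 4.8, 640)   # width direction
Fh, Ch = pixel_focal_and_center(3.6, 3.6, 480)   # height direction
print(Fw, Fh, Cw, Ch)                            # 480.0 480.0 320.0 240.0
```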
Step 3: information fusion. The relative pose X_ct between the camera and the landmark is fused with the measurement data of the IMU to obtain the real-time poses X_lv and X_ls of the UAV and the USV as well as the relative pose X_vs. This process is divided into the following four sub-steps:
(1) System state estimation. The information fusion here is divided into two stages: the first stage is before the landmark is detected, and the second stage is after the landmark is detected.

Before the landmark is detected, the system state is taken as the pose information provided by the IMU, X_v = [X_lv V_v W_v], where V_v is the velocity along the x, y, z directions and W_v is the angular velocity about the x, y, z axes. The state is estimated as follows:

$$\hat{X}_k = F_{k-1}\hat{X}_{k-1} + w_{k-1}$$

where w_{k-1} is white Gaussian noise and a constant-velocity model is used to predict the system state, i.e. F_{k-1} has the following form:

$$F_{k-1} = \begin{bmatrix} I_6 & \Delta t\, I_6 \\ 0 & I_6 \end{bmatrix}$$

where Δt is the sampling interval.

From the first detection of the landmark onward, the USV state is added to the system state, i.e. X = [X_v X_s], with X_v = [X_lv V_v W_v] and X_s = [X_ls V_s W_s]. F_{k-1} then has the block-diagonal form:

$$F_{k-1} = \begin{bmatrix} F^v_{k-1} & 0 \\ 0 & F^s_{k-1} \end{bmatrix}$$

Only the planar motion of the USV is considered here, again with a constant-velocity model for F^s_{k-1}.
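As a sketch, the transition matrix before and after landmark detection might be assembled as follows (the state ordering and the reduced planar USV state are illustrative assumptions):

```python
# Sketch of the constant-velocity transition matrices. Assumed state layout:
#   X_v = [X_lv (6-D pose) | V_v, W_v (6-D velocity)]
#   X_s = [x, y, psi (planar pose) | u, v, r (planar velocity)]
import numpy as np

def cv_block(n, dt):
    """Constant-velocity block: each of the n pose entries integrates its velocity."""
    F = np.eye(2 * n)
    F[:n, n:] = dt * np.eye(n)
    return F

def make_F(dt, landmark_seen):
    F_v = cv_block(6, dt)                  # UAV model; X = [X_v] before detection
    if not landmark_seen:
        return F_v
    F_s = cv_block(3, dt)                  # planar USV model after detection
    return np.block([
        [F_v, np.zeros((12, 6))],
        [np.zeros((6, 12)), F_s],
    ])

x_pred = make_F(0.02, True) @ np.zeros(18)  # one prediction step over X = [X_v X_s]
```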
(2) Obtaining the sensor observations. The observation models of the IMU and the camera can both be expressed as

$$Z[k] = h(X) + v[k]$$

where v[k] ~ N(0, R[k]) is the observation noise and h(X) is a function of the state X.

The IMU observes the height, attitude, and velocity of the UAV, i.e.:

$$h_{IMU}(X) = [z_{lv}\ \phi_{lv}\ \theta_{lv}\ \psi_{lv}\ u_{lv}\ v_{lv}]^T$$

where z_lv is the height, φ_lv, θ_lv and ψ_lv are respectively the rotation angles about the three motion directions x, y, z, and u_lv and v_lv are respectively the forward and lateral velocities. Its corresponding Jacobian matrix is H_IMU.

The camera's observation of the i-th landmark is

$$h_{tag}(X_{KF}) = \ominus(X_{lv} \oplus X_{vc}) \oplus (X_{ls} \oplus X_{st})$$

where h_tag(X_KF) is the camera observation, X_vc is the pose of the camera in the UAV frame, X_st is the pose of the tag in the USV frame, and ⊕ and ⊖ are the coordinate transformation operations. Its corresponding Jacobian matrix is H_tag.
(3) Extended Kalman filtering. The above nonlinear filtering problem is solved with an extended Kalman filter. The state satisfies the normal distribution X_KF ~ N(μ, Σ), where Σ is the covariance matrix. The filter is updated with the standard extended Kalman filter prediction and update equations:

$$\hat{X}_{k|k-1} = F_{k-1}\hat{X}_{k-1},\qquad P_{k|k-1} = F_{k-1}P_{k-1}F_{k-1}^T + Q_{k-1}$$
$$K_k = P_{k|k-1}H_k^T \left(H_k P_{k|k-1} H_k^T + R_k\right)^{-1}$$
$$\hat{X}_k = \hat{X}_{k|k-1} + K_k\left(Z[k] - h(\hat{X}_{k|k-1})\right),\qquad P_k = (I - K_k H_k)P_{k|k-1}$$

Through the extended Kalman filter, the real-time state X = [X_v X_s] of the system is obtained.
(4) Solving the state-lag problem. Because the image computation is relatively complex, at time k AprilTag can often only provide the computation result of time k−n; the extended Kalman filter's estimate of the state at time k+1 would then be obtained from the state at time k−n, which is inaccurate. We solve this problem by recording the history poses of the UAV and the USV. In the delayed case, the system state estimator is augmented with the delayed states,

$$X_{DS} = [X[k]\ X[k-1]\ \cdots\ X[k-n]]$$

where n is the number of delayed states; the iterative state estimate then propagates X[k] with the motion model while shifting the delayed entries accordingly.

Correspondingly, the Jacobian matrix of the IMU sensor model is extended with zeros over the delayed entries, and since the observation of the previous step applies to a delayed state, for the j-th delayed state X[j] the camera's observation model is evaluated at X[j] rather than at the current state.

With the above modifications, the augmented state X_DS can be updated with the extended Kalman filter. Once the observation associated with a delayed state has been used in an extended Kalman filter update, that state is ignored and removed from the state vector. The filtering algorithm therefore only needs to store a small additional segment of history states.
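One way to realize this history mechanism is sketched below. It is a simplified stand-in rather than the patent's exact augmented-state filter: a scalar constant-velocity state replaces the full UAV/USV state, and a delayed AprilTag result is fused by rolling back to the stored state, updating it, and re-propagating to the present (all noise values are assumptions):

```python
# Sketch: Kalman-style filtering with lagged vision measurements, handled by
# a short history of filter states (a simplification of the augmented-state
# formulation described above).
from collections import deque
import numpy as np

DT = 0.02
F = np.array([[1.0, DT], [0.0, 1.0]])   # constant-velocity model
Q = np.diag([1e-4, 1e-3])               # process noise (assumed)
H = np.array([[1.0, 0.0]])              # vision observes position only
R = np.array([[1e-2]])                  # vision noise (assumed)

history = deque(maxlen=50)              # a small segment of history states
x, P = np.zeros(2), np.eye(2)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ (z - H @ x), (np.eye(2) - K @ H) @ P

def step(z_delayed=None, n=0):
    """Advance one step; optionally fuse a vision result that is n steps old."""
    global x, P
    if z_delayed is not None and n < len(history):
        x, P = history[-(n + 1)]        # roll back to the lagged state
        x, P = update(x, P, np.atleast_1d(z_delayed))
        for _ in range(n):              # re-propagate to the present
            x, P = predict(x, P)        # (intermediate snapshots left stale here)
    x, P = predict(x, P)
    history.append((x.copy(), P.copy()))
```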
After the state-lag problem is solved and the filtered system state is obtained, the relative pose X_vs can be found from the poses X_lv and X_ls of the UAV and the USV in the local frame. The specific calculation is

$$X_{vs} = \ominus X_{lv} \oplus X_{ls}$$
Step 4: motion control. The fused relative pose is used for motion control of the UAV. A nested control scheme is adopted, with six independent closed-loop PID controllers: the inner-loop attitude control ensures flight stability, and the outer-loop position control performs path tracking. At the same time, the staged safe landing method is applied. The method is divided into the following four sub-steps, with a state-machine sketch following them:
(1) When the landmark is detected for the first time, the system state is initialized and the UAV is made to track this landmark.

(2) The UAV is guided to the center of the ground target and then kept within a cylinder centered on the landmark with radius r_loop and height h_loop; taking Δz as the step size, the next waypoint vertically below is selected for descent. Whenever the landmark's field of view is lost, the UAV climbs by Δz until the landmark is found again. This stage is called the slow-descent stage.

(3) A small height h_land is set; if the height is below h_land, no further stepping descent is deemed necessary and the UAV prepares to land directly. If the landmark stays within the camera's field of view for N consecutive frames and satisfies

z − h_land ≤ h_loop,
||(x, y)||_2 ≤ r_loop,

the rapid-descent stage is entered, and 0 is used directly as the z-axis reference signal, where z is the height of the UAV relative to the USV and ||(x, y)||_2 is the horizontal distance between the UAV and the USV.

(4) When the UAV descends to a certain height, the camera can no longer recognize the landmark at short range and therefore can no longer localize. A very small height h_min is therefore set; below this height the propellers are shut off and the UAV is allowed to free-fall to land. If, however, the camera loses sight of the ground target while above h_min, whether in the rapid-descent or the slow-descent stage, the UAV is pulled up until the landmark is found again.
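The four sub-steps above can be condensed into the following state-machine sketch (the thresholds, step size, and frame count are illustrative assumptions, not values from the patent):

```python
# Sketch of the staged landing logic. Inputs come from the fused relative
# pose X_vs: z is the height over the USV, (x, y) the horizontal offset.
import math

R_LOOP, H_LOOP = 0.5, 1.0    # cylinder radius and height (assumed, meters)
H_LAND, H_MIN = 1.5, 0.2     # rapid-descent and motor-cutoff thresholds (assumed)
DZ, N_FRAMES = 0.3, 10       # descent step and consecutive-frame requirement (assumed)

def landing_step(state, z, x, y, landmark_visible, frames_visible):
    """Return (next_state, z_reference). States: SEARCH, SLOW, RAPID, TOUCHDOWN."""
    if z < H_MIN:
        return "TOUCHDOWN", None                 # shut off propellers, free-fall
    if not landmark_visible:
        return "SLOW", z + DZ                    # pull up by dz until tag reacquired
    if state == "SEARCH":
        return "SLOW", z - DZ                    # initialize tracking, start descending
    rapid_ok = (z - H_LAND <= H_LOOP and math.hypot(x, y) <= R_LOOP
                and frames_visible >= N_FRAMES)
    if state == "RAPID" or rapid_ok:
        return "RAPID", 0.0                      # rapid descent: z reference is 0
    return "SLOW", z - DZ                        # step down to the next waypoint
```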

Claims (8)

1. A staged autonomous landing method for an unmanned aerial vehicle (UAV) based on visual information fusion, characterized by comprising the following steps:
1) landmark preparation: an AprilTags tag is affixed at the corresponding target landing point on the unmanned surface vehicle (USV) to serve as the landmark, and the angle of the UAV camera is adjusted to ensure that the UAV can detect the landmark when approaching it;
2) image processing: from the camera's parameter information and the captured images, the parameters of the AprilTags visual reference system configured on the UAV's onboard computer are calculated, and the relative pose X_ct between the camera and the landmark is obtained whenever the landmark is detected;
3) information fusion: after the relative pose X_ct between the camera and the landmark is fused with the measurement data of the IMU, the real-time relative pose X_vs of the USV in the UAV reference frame is obtained;
4) motion control: based on the real-time relative pose X_vs, a nested control scheme ensures flight stability and path tracking, while a staged landing method performs the landing.
2. The staged autonomous landing method for a UAV based on visual information fusion according to claim 1, characterized in that in step 2) the parameters of the AprilTags visual reference system comprise the focal length F_w along the width direction in pixels, the focal length F_h along the height direction in pixels, and the center (C_w, C_h) of the image, calculated as:

$$F = \frac{L_{focus}}{L_{receptor}} N_{pixel},\qquad C = \frac{N_{pixel}}{2}$$

where F and C are respectively the focal length and the center position along a given direction, L_focus and L_receptor are respectively the lens focal length and the photosensitive element size, and N_pixel is the pixel count.
3. The staged autonomous landing method for a UAV based on visual information fusion according to claim 1, characterized in that step 3) specifically comprises the following steps:
31) system state estimation, divided into two stages: the first stage is before the landmark is detected, and the second stage is after the landmark is detected;
before the landmark is detected, the system state is the pose information X_v provided by the IMU, i.e. X = [X_v], and estimating this state gives:

$$\hat{X}_k = F_{k-1}\hat{X}_{k-1} + w_{k-1}$$

where $\hat{X}_k$ is the system state estimate at step k, $\hat{X}_{k-1}$ is the system state estimate at step k-1, F_{k-1} is the Kalman filter parameter of the stage before the landmark is detected, w_{k-1} is white Gaussian noise, and Δt is the sampling interval;
from the first detection of the landmark onward, the USV state X_s is added to the system state, i.e. X = [X_v X_s], and the Kalman filter parameter F_{k-1} is updated, giving:

$$F_{k-1} = \begin{bmatrix} F^v_{k-1} & 0 \\ 0 & F^s_{k-1} \end{bmatrix}$$

where F^v_{k-1} is the Kalman filter parameter associated with the UAV and F^s_{k-1} is the Kalman filter parameter associated with the USV;
32) the observations of the IMU and the camera are obtained, where the IMU observation h_IMU is:

$$h_{IMU}(X) = [z_{lv}\ \phi_{lv}\ \theta_{lv}\ \psi_{lv}\ u_{lv}\ v_{lv}]^T$$

where z_lv is the height, φ_lv, θ_lv and ψ_lv are respectively the rotation angles about the three motion directions x, y, z, and u_lv and v_lv are respectively the forward and lateral velocities; the corresponding Jacobian matrix is H_IMU;
the camera observation h_tag(X_KF) is:

$$h_{tag}(X_{KF}) = \ominus(X_{lv} \oplus X_{vc}) \oplus (X_{ls} \oplus X_{st})$$

where X_vc is the pose of the camera in the UAV frame, X_st is the pose of the tag in the USV frame, X_lv is the estimated UAV pose, X_ls is the estimated USV pose, and ⊕ and ⊖ are the coordinate transformation operations; the corresponding Jacobian matrix is H_tag;
33) extended Kalman filtering is applied to the observations to obtain the real-time system state.
4. The staged autonomous landing method for a UAV based on visual information fusion according to claim 3, characterized in that step 3) further comprises the following step:
34) using the recorded history poses of the UAV and the USV, the delayed measurements are applied to the corresponding past system states, obtaining the real-time relative pose X_vs of the USV in the UAV reference frame.
5. The staged autonomous landing method for a UAV based on visual information fusion according to claim 1, characterized in that in step 4) the nested control scheme is specifically:
six independent closed-loop PID controllers are used, where the inner-loop attitude control ensures flight stability and the outer-loop position control performs path tracking.
6. The staged autonomous landing method for a UAV based on visual information fusion according to claim 1, characterized in that in step 4) the staged landing method specifically comprises the following steps:
41) landmark search: when the landmark is detected for the first time, the system state is initialized and the UAV is made to track the landmark;
42) slow-descent stage: the UAV is guided to the center of the landmark and kept within a cylindrical space centered on the landmark with radius r_loop and height h_loop, and, taking Δz as the step size, the next waypoint vertically below is selected for descent; whenever the landmark's field of view is lost, the UAV climbs by Δz until the landmark is found again;
43) rapid-descent stage: a first height threshold h_land is set; when the UAV's height is below h_land, no further stepping descent is deemed necessary and the UAV prepares to land directly; if the landmark stays within the camera's field of view for N consecutive frames and the rapid-descent condition is met, the rapid-descent stage is entered and 0 is used directly as the z-axis reference signal;
44) touchdown stage: a second height threshold h_min is set; when the UAV's height is below h_min, the propellers are shut off and the UAV free-falls to complete the landing.
7. The staged autonomous landing method for a UAV based on visual information fusion according to claim 6, characterized in that the rapid-descent condition in step 43) is:
z − h_land ≤ h_loop
||(x, y)||_2 ≤ r_loop
where z is the height of the UAV relative to the USV and ||(x, y)||_2 is the horizontal distance between the UAV and the USV.
8. The staged autonomous landing method for a UAV based on visual information fusion according to claim 6, characterized in that in step 44), if the camera loses sight of the ground target while the UAV is still above the second height threshold h_min, whether in the slow-descent stage or the rapid-descent stage, the UAV is pulled up until the landmark is found again.
CN201810270640.2A 2018-03-29 2018-03-29 Unmanned aerial vehicle staged autonomous landing method based on visual information fusion Active CN108594848B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810270640.2A CN108594848B (en) 2018-03-29 2018-03-29 Unmanned aerial vehicle staged autonomous landing method based on visual information fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810270640.2A CN108594848B (en) 2018-03-29 2018-03-29 Unmanned aerial vehicle staged autonomous landing method based on visual information fusion

Publications (2)

Publication Number Publication Date
CN108594848A true CN108594848A (en) 2018-09-28
CN108594848B CN108594848B (en) 2021-01-22

Family

ID=63623841

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810270640.2A Active CN108594848B (en) 2018-03-29 2018-03-29 Unmanned aerial vehicle staged autonomous landing method based on visual information fusion

Country Status (1)

Country Link
CN (1) CN108594848B (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090306840A1 (en) * 2008-04-08 2009-12-10 Blenkhorn Kevin P Vision-based automated landing system for unmanned aerial vehicles
EP2434256A3 (en) * 2010-09-24 2014-04-30 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking
CN103662091A (en) * 2013-12-13 2014-03-26 北京控制工程研究所 High-precision safe landing guiding method based on relative navigation
CN104062977A (en) * 2014-06-17 2014-09-24 天津大学 Full-autonomous flight control method for quadrotor unmanned aerial vehicle based on vision SLAM
CN104679013A * 2015-03-10 2015-06-03 无锡桑尼安科技有限公司 UAV automatic landing system
CN105021184A (en) * 2015-07-08 2015-11-04 西安电子科技大学 Pose estimation system and method for visual carrier landing navigation on mobile platform
CN105335733A (en) * 2015-11-23 2016-02-17 西安韦德沃德航空科技有限公司 Autonomous landing visual positioning method and system for unmanned aerial vehicle
CN106708066A (en) * 2015-12-20 2017-05-24 中国电子科技集团公司第二十研究所 Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN107544550A * 2016-06-24 2018-01-05 西安电子科技大学 Vision-guided autonomous UAV landing method
CN106774386A * 2016-12-06 2017-05-31 杭州灵目科技有限公司 UAV visual navigation landing system based on multi-scale markers
CN107240063A * 2017-07-04 2017-10-10 武汉大学 Autonomous landing method for a rotor UAV oriented to a mobile platform
CN107687850A * 2017-07-26 2018-02-13 哈尔滨工业大学深圳研究生院 UAV pose estimation method based on vision and an inertial measurement unit

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHENG HUI: "Autonomous takeoff, tracking and landing of a UAV on a moving UGV using onboard monocular vision", 《PROCEEDINGS OF THE 32ND CHINESE CONTROL CONFERENCE》 *
LINGYUN XU: "Towards autonomous tracking and landing on moving target", 《2016 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (RCAR)》 *
刘刚: "Research on autonomous landing guidance technology of carrier-borne UAV based on airborne vision", 《舰船科学技术》 (Ship Science and Technology) *
贾配洋: "Research on autonomous moving landing of a quadrotor UAV", 《计算机科学》 (Computer Science) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109341700A * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Vision-assisted landing navigation method for fixed-wing aircraft under low visibility
CN109525220A * 2018-12-10 2019-03-26 中国人民解放军国防科技大学 Gaussian mixture CPHD filtering method with track association and extraction capability
CN111323005A * 2018-12-17 2020-06-23 北京华航无线电测量研究所 Visual auxiliary cooperative landmark design method for omnidirectional autonomous precise landing of unmanned helicopter
CN109823552A * 2019-02-14 2019-05-31 深圳市多翼创新科技有限公司 Vision-based UAV precision landing method, storage medium, apparatus and system
CN110058604A * 2019-05-24 2019-07-26 中国科学院地理科学与资源研究所 Computer-vision-based UAV precision landing system
CN112099527A (en) * 2020-09-17 2020-12-18 湖南大学 Control method and system for autonomous landing of mobile platform of vertical take-off and landing unmanned aerial vehicle
CN112286216A * 2020-11-11 2021-01-29 鹏城实验室 Method and system for autonomous UAV landing on an unmanned ship based on visual recognition
CN112419403A (en) * 2020-11-30 2021-02-26 海南大学 Indoor unmanned aerial vehicle positioning method based on two-dimensional code array
CN112987765A (en) * 2021-03-05 2021-06-18 北京航空航天大学 Precise autonomous take-off and landing method of unmanned aerial vehicle/boat simulating attention distribution of prey birds
CN112987765B (en) * 2021-03-05 2022-03-15 北京航空航天大学 Precise autonomous take-off and landing method of unmanned aerial vehicle/boat simulating attention distribution of prey birds
CN114326765A (en) * 2021-12-01 2022-04-12 爱笛无人机技术(南京)有限责任公司 Landmark tracking control system and method for visual landing of unmanned aerial vehicle
CN114326765B (en) * 2021-12-01 2024-02-09 爱笛无人机技术(南京)有限责任公司 Landmark tracking control system and method for unmanned aerial vehicle visual landing
CN117250995A (en) * 2023-11-20 2023-12-19 西安天成益邦电子科技有限公司 Bearing platform posture correction control method and system
CN117250995B (en) * 2023-11-20 2024-02-02 西安天成益邦电子科技有限公司 Bearing platform posture correction control method and system

Also Published As

Publication number Publication date
CN108594848B (en) 2021-01-22

Similar Documents

Publication Publication Date Title
CN108594848A Unmanned aerial vehicle staged autonomous landing method based on visual information fusion
CN105652891B Autonomous moving-target tracking device for a rotor UAV and control method thereof
CN111596693B Ground target tracking control method and system for unmanned aerial vehicle based on pan-tilt camera
CN107463181A AprilTag-based adaptive tracking system for a quadrotor UAV
Wang et al. Cooperative USV–UAV marine search and rescue with visual navigation and reinforcement learning-based control
Spica et al. Active structure from motion: Application to point, sphere, and cylinder
CN108453738A Control method for autonomous aerial grasping by a quadrotor UAV based on OpenCV image processing
CN113627473B (en) Multi-mode sensor-based water surface unmanned ship environment information fusion sensing method
Li et al. UAV autonomous landing technology based on AprilTags vision positioning algorithm
CN108759826A UAV motion tracking method based on multi-parameter sensor fusion between a mobile phone and the UAV
Mills et al. Vision based control for fixed wing UAVs inspecting locally linear infrastructure using skid-to-turn maneuvers
Zhang et al. A object detection and tracking method for security in intelligence of unmanned surface vehicles
Battiato et al. A system for autonomous landing of a UAV on a moving vehicle
CN114200948A (en) Unmanned aerial vehicle autonomous landing method based on visual assistance
Shi et al. Real-Time Multi-Modal Active Vision for Object Detection on UAVs Equipped With Limited Field of View LiDAR and Camera
CN108227749A UAV and tracking system thereof
CN108170160A Autonomous grasping method for a rotor UAV using monocular vision and onboard sensors
Zhang et al. Enhanced fiducial marker based precise landing for quadrotors
CN114326765B (en) Landmark tracking control system and method for unmanned aerial vehicle visual landing
Yubo et al. Survey of UAV autonomous landing based on vision processing
Gaspar et al. Model-based filters for 3-D positioning of marine mammals using AHRS-and GPS-equipped UAVs
Duan et al. Image digital zoom based single target apriltag recognition algorithm in large scale changes on the distance
CN113075937B (en) Control method for capturing target by unmanned aerial vehicle based on target acceleration estimation
Yuan et al. Eagle vision-based coordinate landing control framework of unmanned aerial vehicles on an unmanned surface vehicle
Baldini et al. Learning pose estimation for uav autonomous navigation andlanding using visual-inertial sensor data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant