CN108645408A - Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information - Google Patents


Info

Publication number
CN108645408A
CN108645408A (application CN201810424143.3A)
Authority
CN
China
Prior art keywords
target
coordinate system
cooperative target
unmanned aerial vehicle (UAV)
navigation information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810424143.3A
Other languages
Chinese (zh)
Other versions
CN108645408B (en)
Inventor
方强 (Fang Qiang)
唐邓清 (Tang Dengqing)
赵框 (Zhao Kuang)
周正元 (Zhou Zhengyuan)
周晗 (Zhou Han)
周勇 (Zhou Yong)
曹正江 (Cao Zhengjiang)
王树源 (Wang Shuyuan)
高平海 (Gao Pinghai)
胡天江 (Hu Tianjiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN201810424143.3A
Publication of CN108645408A
Application granted
Publication of CN108645408B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

Addressing the case in which the cooperative target is moving, the invention provides a UAV autonomous recovery target prediction method based on navigation information. Movement of the cooperative target means that its displacement component in the world coordinate system, i.e. its world-frame position, changes over time. During autonomous recovery, the UAV provides navigation information such as its pose in real time. Based on the UAV pose information over the preceding period, a motion curve of the cooperative target is fitted with a suitable numerical method, the spatial position of the cooperative target at the current time is estimated from that curve, and the imaging point of the cooperative target is then predicted by combining the navigation information. Because the method predicts the imaging region of the cooperative target from navigation information, target detection, tracking, and relative pose computation need only be carried out within that region, which improves both the real-time performance and the accuracy of the cooperative target pose estimation method.

Description

A target prediction method for autonomous UAV recovery based on navigation information
Technical field
The present invention relates to the technical field of autonomous UAV recovery, and more particularly to a target prediction method for autonomous UAV recovery based on navigation information.
Background technology
Autonomous recovery of UAVs in the civil field (ground recovery, vehicle-mounted recovery, etc.) is currently a research hotspot and a difficult problem. Safe autonomous recovery rests on accurate, real-time relative pose estimation, which is especially challenging for small UAVs, above all for fast-moving ones.
The autonomous recovery methods in common use are based on cooperative target calibration: the relative pose of the UAV is estimated by detecting and tracking a cooperative target on the ground or on a vehicle. However, the payload of a small UAV is limited and its onboard processing power is constrained, so a conventional cooperative-target calibration method may fail to meet real-time requirements, or the pose estimate may degrade because the target's position in the image changes too quickly.
There is therefore an urgent need for a method that addresses, at least in part, the real-time performance and accuracy of cooperative target prediction in autonomous UAV recovery.
Summary of the invention
To overcome the defects of the prior art in the case where the cooperative target is moving, the present invention provides a UAV autonomous recovery target prediction method based on navigation information. Movement of the cooperative target means that its displacement component in the world coordinate system changes, i.e. its world-frame position p_T^w = (x_T^w, y_T^w, z_T^w)^T varies with time, so the first task is to obtain p_T^w.
During autonomous recovery, the UAV provides navigation information such as its pose in real time, i.e. its world-frame position p_U^w and its attitude rotation matrix R_w^b are known. Based on the UAV pose information over the preceding period, the present invention fits the motion curve of the cooperative target with a suitable numerical method, estimates from that curve the spatial position of the cooperative target at the current time, and then predicts the imaging point of the cooperative target by combining the navigation information. The implementation steps are as follows:
(1) Define the coordinate systems
Camera coordinate system o_c x_c y_c z_c: the origin is the optical center of the camera; the o_c x_c and o_c y_c axes are parallel to the image u and v axes; the o_c z_c axis is the camera optical axis; the focal length is f, and the effective focal lengths along the x and y directions are f_x and f_y;
UAV body coordinate system o_b x_b y_b z_b: the o_b y_b axis points toward the nose along the fuselage symmetry axis; the o_b x_b axis is perpendicular to the UAV symmetry plane and points to the right of the fuselage; the o_b z_b axis completes the right-hand rule;
Imaging plane coordinate system: the origin is the center point of the image; the abscissa x and the ordinate y are parallel to the image rows and columns, respectively;
Image coordinate system: the origin is the upper-left corner of the image; the abscissa u and the ordinate v run along the image row and column directions, respectively, and the center point (u_0, v_0) is the principal point;
(2) Movement of the cooperative target means that the displacement component of the cooperative target in the world coordinate system changes, i.e. its world-frame position p_T^w varies with time. Let the target point P be the center point of the cooperative target and, as shown in Fig. 2, consider the pinhole camera model, where O_c is the optical center, O_c z_c is the optical axis, and f is the focal length. The projection (x_c, y_c, z_c) of the target point P in the camera coordinate system, relative to the camera optical center O_c, can be expressed as:

(x_c, y_c, z_c)^T = R_b^c R_w^b (p_T^w - p_U^w)    (1)

where p_T^w is the projection of the target point P in the world coordinate system, p_U^w is the projection of the UAV in the world coordinate system, R_b^c is the rotation matrix from the UAV body coordinate system to the camera coordinate system, and R_w^b is the attitude rotation matrix of the UAV.
Assume the coordinates of the target point P in the imaging plane coordinate system and in the image coordinate system are (x, y) and (u, v), respectively, and that the principal point of the image coordinate system is (u_0, v_0). The projections of the target point P in these two coordinate systems then satisfy:

x = f·x_c/z_c,  y = f·y_c/z_c
u = u_0 + f_x·x_c/z_c,  v = v_0 + f_y·y_c/z_c    (2)
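As an illustration, the projection chain of Eqs. (1) and (2) fits in a few lines of NumPy. This is a sketch under the notation above, not code from the patent; the function and argument names are made up here, with R_b_c standing for the body-to-camera rotation R_b^c and R_w_b for the attitude rotation R_w^b.

```python
import numpy as np

def project_target(p_t_w, p_u_w, R_b_c, R_w_b, fx, fy, u0, v0):
    """Project a world-frame target point to pixel coordinates.

    Eq. (1): camera-frame coordinates from world-frame positions and rotations.
    Eq. (2): pinhole projection with effective focal lengths (fx, fy)
             and principal point (u0, v0).
    """
    # Eq. (1): rotate the world-frame offset into the camera frame
    xc, yc, zc = R_b_c @ R_w_b @ (np.asarray(p_t_w, float) - np.asarray(p_u_w, float))
    # Eq. (2): perspective division onto the image plane
    u = u0 + fx * xc / zc
    v = v0 + fy * yc / zc
    return (xc, yc, zc), (u, v)
```

With identity rotations, a target at (1, 2, 10) m in front of a camera at the origin, f_x = f_y = 1000, and principal point (640, 360), this projects the target to pixel (740, 560).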
(3) Estimate the spatial position of the cooperative target at the next instant by polynomial fitting.
Assume the position of the target relative to the camera at the previous n instants is (x_c(t_{k-i}), y_c(t_{k-i}), z_c(t_{k-i})), i = 1, …, n. Combining this relative pose information with the UAV navigation information, the spatial position of the cooperative target at the previous n instants, p_T^w(t_{k-i}), i = 1, …, n, can be obtained.
Within the time interval [t_{k-n}, t_{k-1}], the motion curve of the cooperative target can be fitted with low-order polynomials, i.e.:

x_T^w(t) = c_x0 + c_x1·t + c_x2·t² + … + c_xn·t^n
y_T^w(t) = c_y0 + c_y1·t + c_y2·t² + … + c_yn·t^n
z_T^w(t) = c_z0 + c_z1·t + c_z2·t² + … + c_zn·t^n    (3)

The polynomial coefficients (c_x0, c_x1, c_x2, …, c_xn), (c_y0, c_y1, c_y2, …, c_yn), (c_z0, c_z1, c_z2, …, c_zn) can be obtained from Eq. (3), e.g. by least squares. At instant t_k, the spatial position p_T^w(t_k) of the cooperative target can then be estimated by evaluating Eq. (3) with the fitted coefficients.
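Step (3) amounts to an independent least-squares polynomial fit per world axis, followed by an evaluation at t_k. A minimal sketch using numpy.polyfit; the function and variable names are illustrative, not from the patent:

```python
import numpy as np

def fit_and_predict(times, positions, t_k, order=3):
    """Fit Eq. (3) per axis by least squares and evaluate it at t_k.

    times     : the n previous sample instants t_{k-n} .. t_{k-1}
    positions : n x 3 array of estimated target positions in the world frame
    t_k       : current instant at which the target position is predicted
    """
    positions = np.asarray(positions, dtype=float)
    # np.polyfit returns coefficients with the highest power first;
    # np.polyval evaluates the fitted polynomial at t_k.
    return np.array([
        np.polyval(np.polyfit(times, positions[:, axis], order), t_k)
        for axis in range(3)
    ])
```

For data generated by a polynomial of degree at most `order`, the least-squares fit reproduces it exactly, so the extrapolated value at t_k is exact as well.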
(4) Predict the imaging position of the cooperative target.
From the navigation information p_U^w(t_k) and R_w^b(t_k) at instant t_k and the estimate p_T^w(t_k), the camera-frame projection (x_c(t_k), y_c(t_k), z_c(t_k)) is obtained with Eq. (1), and the imaging pixel (u(t_k), v(t_k)) of the cooperative target is then predicted with Eq. (2).
Compared with the prior art, the present invention achieves the following technical effects:
In the pose estimation process, the entire image need not be processed; only the target region is processed, which reduces detection time. At the same time, the navigation system integrated on the UAV provides navigation information in real time, and the target imaging geometry depends on that navigation information. The present invention can therefore predict the imaging region of the cooperative target from the navigation information and carry out target detection, tracking, and relative pose computation only within that region, which improves the real-time performance and the accuracy of the cooperative target pose estimation algorithm.
Description of the drawings
Fig. 1 is a schematic diagram of a cooperative target.
Fig. 2 is a schematic diagram of the camera imaging principle.
Specific embodiments
The present invention is described in detail below so that its advantages and features can be more easily understood by those skilled in the art, and so that the protection scope of the present invention is more clearly defined.
For the case in which the cooperative target is moving, the present invention provides a UAV autonomous recovery target prediction method based on navigation information. Movement of the cooperative target means that its displacement component in the world coordinate system changes, i.e. p_T^w varies with time, so the first task is to obtain p_T^w. During autonomous recovery, the UAV provides navigation information such as its pose in real time. Based on the UAV pose information over the preceding period, the present invention fits the motion curve of the cooperative target with a suitable numerical method, estimates from that curve the spatial position of the cooperative target at the current time, and predicts the imaging point of the cooperative target in combination with the navigation information.
A specific embodiment is given below:
Case: assume the camera focal lengths are f_x = f_y = 1000 and the resolution is 1280 × 720. At the current instant t_k, the UAV's projection in the world coordinate system is p_U^w, the attitude rotation matrix R_w^b of the UAV is the identity matrix, and the rotation matrix R_b^c between the camera and the UAV body is given. The cooperative target on the ground moves uniformly at a speed of 2 m/s, and the spatial position sequence of the cooperative target estimated over the preceding period, p_T^w(t_{k-i}), i = 1, …, 5 (five values here, spaced dt = 0.2 s), is (9.2, 8.9, 0.1), (9.3, 9.3, 0.15), (9.15, 9.5, 0.2), (9.25, 9.8, 0.05), (9.5, 10.2, 0.12).
The imaging position of the cooperative target at the current instant t_k is to be predicted.
Using the method of the present invention, the steps are as follows:
1) From the known spatial position sequence of the cooperative target: since the cooperative target is moving, a polynomial is first fitted to the estimated position sequence according to Eq. (3), i.e. the polynomial coefficients are sought. Since the previous five data groups are used here, Eq. (3) yields the 3rd-order polynomial coefficients:

c_x0 = -0.6, c_x1 = 6.9583, c_x2 = -15.625, c_x3 = 10.4167
c_y0 = -0.3, c_y1 = 5, c_y2 = -8.75, c_y3 = 6.257
c_z0 = -0.24, c_z1 = 3.35, c_z2 = -10.625, c_z3 = 8.125
2) Using the result of 1) and Eq. (3), predict the spatial position p_T^w(t_k) of the target at the current instant t_k.
3) Using Eq. (1), obtain the projection (x_c(t_k), y_c(t_k), z_c(t_k)) of the target in the camera coordinate system.
4) Using Eq. (2), predict the imaging position (u(t_k), v(t_k)) of the cooperative target in the image at instant t_k.
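Under stated assumptions, the four steps of the case can be run end to end. Only the five target positions, the spacing dt = 0.2 s, the identity attitude, f_x = f_y = 1000, and the 1280 × 720 resolution are taken from the case; the UAV position and the camera-body rotation matrix are assumptions made here (the UAV hovers at 10 m near the target track and the camera looks straight down, a 180° rotation about the body x axis).

```python
import numpy as np

# Estimated target world positions at the five previous instants (from the case)
positions = np.array([
    [9.2,  8.9,  0.10],
    [9.3,  9.3,  0.15],
    [9.15, 9.5,  0.20],
    [9.25, 9.8,  0.05],
    [9.5, 10.2,  0.12],
])
dt = 0.2
times = np.arange(5) * dt          # t_{k-5} .. t_{k-1}, taken as 0 .. 0.8 s
t_k = 5 * dt                       # current instant, 1.0 s

# Steps 1-2: fit a 3rd-order polynomial per axis (Eq. 3) and predict p_T^w(t_k)
p_t_w = np.array([
    np.polyval(np.polyfit(times, positions[:, axis], 3), t_k)
    for axis in range(3)
])

# Step 3, Eq. (1): camera-frame projection. Assumed geometry (not from the
# case): UAV hovering at (9.0, 9.5, 10.0) m, identity attitude, and a
# downward-looking camera (180 deg rotation about the body x axis).
p_u_w = np.array([9.0, 9.5, 10.0])
R_w_b = np.eye(3)
R_b_c = np.diag([1.0, -1.0, -1.0])
xc, yc, zc = R_b_c @ R_w_b @ (p_t_w - p_u_w)

# Step 4, Eq. (2): pixel coordinates with fx = fy = 1000 and the principal
# point at the center of the 1280 x 720 image
fx = fy = 1000.0
u0, v0 = 640.0, 360.0
u = u0 + fx * xc / zc
v = v0 + fy * yc / zc
```

With these assumptions the extrapolated target position comes out near (10.3, 10.9, 0.3) m and the predicted pixel lands inside the 1280 × 720 image; with the actual p_U^w and R_b^c of the case, the same algebra applies unchanged.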
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit it; for those skilled in the art, the invention may be modified and varied in many ways. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in its protection scope.

Claims (3)

1. A UAV autonomous recovery target prediction method based on navigation information, characterized by comprising the following steps:
(1) Define the coordinate systems
Camera coordinate system o_c x_c y_c z_c: the origin is the optical center of the camera; the o_c x_c and o_c y_c axes are parallel to the image u and v axes; the o_c z_c axis is the camera optical axis; the focal length is f, and the effective focal lengths along the x and y directions are f_x and f_y;
UAV body coordinate system o_b x_b y_b z_b: the o_b y_b axis points toward the nose along the fuselage symmetry axis; the o_b x_b axis is perpendicular to the UAV symmetry plane and points to the right of the fuselage; the o_b z_b axis completes the right-hand rule;
Imaging plane coordinate system: the origin is the center point of the image; the abscissa x and the ordinate y are parallel to the image rows and columns, respectively;
Image coordinate system: the origin is the upper-left corner of the image; the abscissa u and the ordinate v run along the image row and column directions, respectively, and the center point (u_0, v_0) is the principal point;
(2) Movement of the cooperative target means that the displacement component of the cooperative target in the world coordinate system changes, i.e. its world-frame position p_T^w varies with time; let the target point P be the center point of the cooperative target and consider the pinhole camera model, where O_c is the optical center, O_c z_c is the optical axis, and f is the focal length; the projection (x_c, y_c, z_c) of the target point P in the camera coordinate system, relative to the camera optical center O_c, can be expressed as:

(x_c, y_c, z_c)^T = R_b^c R_w^b (p_T^w - p_U^w)    (1)

where p_T^w is the projection of the target point P in the world coordinate system, p_U^w is the projection of the UAV in the world coordinate system, R_b^c is the rotation matrix from the UAV body coordinate system to the camera coordinate system, and R_w^b is the attitude rotation matrix of the UAV;
Assume the coordinates of the target point P in the imaging plane coordinate system and in the image coordinate system are (x, y) and (u, v), respectively, and that the principal point of the image coordinate system is (u_0, v_0); the projections of the target point P in these two coordinate systems then satisfy:

x = f·x_c/z_c,  y = f·y_c/z_c
u = u_0 + f_x·x_c/z_c,  v = v_0 + f_y·y_c/z_c    (2)
(3) Estimate the spatial position of the cooperative target at the next instant by polynomial fitting;
(4) Predict the imaging position of the cooperative target.
2. The UAV autonomous recovery target prediction method based on navigation information according to claim 1, characterized in that in step (3): assume the position of the target relative to the camera at the previous n instants is (x_c(t_{k-i}), y_c(t_{k-i}), z_c(t_{k-i})), i = 1, …, n; combining this relative pose information with the UAV navigation information, the spatial position of the cooperative target at the previous n instants, p_T^w(t_{k-i}), i = 1, …, n, can be obtained;
Within the time interval [t_{k-n}, t_{k-1}], the motion curve of the cooperative target can be fitted with low-order polynomials, i.e.:

x_T^w(t) = c_x0 + c_x1·t + c_x2·t² + … + c_xn·t^n
y_T^w(t) = c_y0 + c_y1·t + c_y2·t² + … + c_yn·t^n
z_T^w(t) = c_z0 + c_z1·t + c_z2·t² + … + c_zn·t^n    (3)

The polynomial coefficients (c_x0, c_x1, c_x2, …, c_xn), (c_y0, c_y1, c_y2, …, c_yn), (c_z0, c_z1, c_z2, …, c_zn) can be obtained from Eq. (3); at instant t_k, the spatial position p_T^w(t_k) of the cooperative target can then be estimated by evaluating Eq. (3) with the fitted coefficients.
3. The UAV autonomous recovery target prediction method based on navigation information according to claim 2, characterized in that in step (4): from the navigation information p_U^w(t_k) and R_w^b(t_k) at instant t_k and the estimate p_T^w(t_k), the camera-frame projection (x_c(t_k), y_c(t_k), z_c(t_k)) is obtained with Eq. (1), and the imaging pixel (u(t_k), v(t_k)) of the cooperative target is then predicted with Eq. (2).
CN201810424143.3A 2018-05-07 2018-05-07 Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information Active CN108645408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810424143.3A CN108645408B (en) 2018-05-07 2018-05-07 Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information


Publications (2)

Publication Number Publication Date
CN108645408A 2018-10-12
CN108645408B 2020-07-17

Family

ID=63749031

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810424143.3A Active CN108645408B (en) 2018-05-07 2018-05-07 Unmanned aerial vehicle autonomous recovery target prediction method based on navigation information

Country Status (1)

Country Link
CN (1) CN108645408B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103604427A (en) * 2013-12-10 2014-02-26 中国航天空气动力技术研究院 Unmanned aerial vehicle system and method for dynamically positioning ground moving target
CN105206109A (en) * 2015-08-13 2015-12-30 长安大学 Infrared CCD based foggy day identifying early-warning system and method for vehicle
CN106373159A (en) * 2016-08-30 2017-02-01 中国科学院长春光学精密机械与物理研究所 Simplified unmanned aerial vehicle multi-target location method
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
US20180092484A1 (en) * 2016-10-04 2018-04-05 Wal-Mart Stores, Inc. Landing pad receptacle for package delivery and receipt


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110044212A (en) * 2019-03-12 2019-07-23 西安电子科技大学 The rotor wing unmanned aerial vehicle of view-based access control model metrical information arrests recovery method
CN111399542A (en) * 2020-04-02 2020-07-10 重庆市亿飞智联科技有限公司 Unmanned aerial vehicle landing method and device, storage medium, automatic pilot and unmanned aerial vehicle
CN111399542B (en) * 2020-04-02 2024-01-30 重庆市亿飞智联科技有限公司 Unmanned aerial vehicle landing method and device, storage medium, autopilot and unmanned aerial vehicle
CN111951331A (en) * 2020-07-07 2020-11-17 中国人民解放军93114部队 Precise positioning method and device for flight device based on video image and electronic equipment
CN111951331B (en) * 2020-07-07 2024-02-27 中国人民解放军93114部队 Flight device accurate positioning method and device based on video image and electronic equipment
CN112419417A (en) * 2021-01-25 2021-02-26 成都翼比特自动化设备有限公司 Unmanned aerial vehicle-based photographing point positioning method and related device
CN117516485A (en) * 2024-01-04 2024-02-06 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine
CN117516485B (en) * 2024-01-04 2024-03-22 东北大学 Pose vision measurement method for automatic guiding and mounting of aircraft engine

Also Published As

Publication number Publication date
CN108645408B (en) 2020-07-17


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant