CN105424059A - Wide baseline infrared camera pose estimation method - Google Patents


Info

Publication number
CN105424059A
CN105424059A (application CN201510750446.0A)
Authority
CN
China
Prior art keywords
reference point
camera
line
point
coordinate system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510750446.0A
Other languages
Chinese (zh)
Other versions
CN105424059B (en)
Inventor
杨涛
张艳宁
张卓越
肖彬
李广坡
王熙文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201510750446.0A priority Critical patent/CN105424059B/en
Publication of CN105424059A publication Critical patent/CN105424059A/en
Application granted granted Critical
Publication of CN105424059B publication Critical patent/CN105424059B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 25/00 — Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a wide-baseline infrared camera pose estimation method, which addresses the poor practicability of existing methods in this setting. In the technical scheme, six pairs of bilaterally symmetric reference points are selected in the large-scale scene region jointly visible to all cameras on a carrier landing runway, and the world coordinates of the reference points are measured accurately with a total station. During calibration, marker lights are placed on the reference points, and the pose of each camera is computed accurately by detecting the marker lights. The method accounts for the fact that infrared cameras cannot capture usable calibration-board photographs, for the interference of visible light in the complex natural scenes of UAV landing, and for the characteristics of the infrared cameras themselves: near-infrared lasers are used as cooperative marker lights, and optical filters are mounted on the infrared cameras. The six pairs of reference points are arranged on the two sides of the landing runway, and a total station performs survey-grade measurement of their spatial coordinates. In tests, the calibration results are accurate, with a reprojection error on the images of 0.05 pixel or below.

Description

Wide-baseline near-infrared camera pose estimation method
Technical field
The present invention relates to a wide-baseline near-infrared camera pose estimation method, and in particular to such a method for the wide-baseline near-infrared cameras of an autonomous UAV carrier-landing guidance system.
Background technology
Vision-based navigation for autonomous UAV carrier landing is a current research hotspot. Compared with traditional inertial and satellite navigation technology, it is bionically inspired and offers system autonomy, strong interference resistance, and high precision. In such a navigation system, measurement cameras are mounted along the landing runway. Fixed-focus cameras are easily affected by ambient light during tracking and localization; to avoid this influence, existing vision navigation systems use near-infrared cameras that capture only light of the specific wavelength emitted by the luminous markers. In addition, to obtain the correspondence between three-dimensional points in space and camera image coordinates, the cameras must be calibrated, and the calibration precision directly determines the precision of optical localization.
Conventional calibration methods place a calibration object of known shape and size, such as a checkerboard or cross target, in front of the camera to be calibrated. However, a near-infrared camera cannot capture the texture of an ordinary optical calibration board, so it cannot be calibrated with the boards used for visible-light cameras. Moreover, the vision navigation system must cover a sufficiently long stretch of the landing runway, so the cameras are fitted with telephoto lenses; the shallow depth of field of a telephoto lens makes accurate focusing difficult, which poses great difficulty for the traditional approach of photographing a precision calibration object.
The document "A camera calibration method for large field optical measurement", Optik 123 (2013) 6553–6558, proposes a calibration method for wide-baseline multi-camera systems in large scenes. The method first computes the intrinsic matrix of each camera with a checkerboard calibration board, then computes the extrinsic matrix by locally placing marker points; because the projections of the markers onto the camera image are approximately parallel to the image plane, the computation of the lens distortion coefficients can be simplified. For an autonomous UAV carrier-landing system, however, visible light in the scene causes severe interference, and the limitations of conventional calibration described above — the inability of near-infrared cameras to image ordinary calibration boards, and the focusing difficulty of telephoto lenses over a long runway — still apply.
Summary of the invention
In order to overcome the poor practicability of existing methods, the invention provides a wide-baseline near-infrared camera pose estimation method. Six pairs of bilaterally symmetric reference points are chosen in the large scene region jointly visible to all cameras on the carrier landing runway, and the world coordinates of the reference points are measured accurately with a total station. During calibration, marker lights are placed at the reference points, and the pose of each camera is computed accurately by detecting the marker lights. The method takes into account that a near-infrared camera cannot capture usable calibration-board photographs, so that traditional calibration methods cannot be used for pose estimation, as well as the interference of visible light in the complex natural scenes of UAV landing and the characteristics of infrared cameras themselves: near-infrared lasers are used as cooperative marker lights, and optical filters are mounted on the infrared cameras. The six pairs of reference points are arranged on the two sides of the landing runway, and a total station performs survey-grade measurement of their spatial coordinates. In tests, the calibration results are accurate, with a reprojection error on the images below 0.05 pixel.
The technical solution adopted by the present invention is a wide-baseline near-infrared camera pose estimation method comprising the following steps:
Step 1. Establish a world coordinate system for the UAV autonomous landing runway. The near-infrared cameras are placed symmetrically on the two sides of the landing runway. The midpoint of the camera baseline is chosen as the world origin; the X-axis runs along the runway centreline, the Y-axis is perpendicular to the runway direction, and the Z-axis points straight up, forming a right-handed coordinate system.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between three-dimensional points and image coordinates are needed; therefore 12 reference points are selected on the two sides of the runway:
Reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y-direction, and the midpoint of their connecting line l_12 is the world origin. The length of l_12 is 15 m.
Reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and the length of l_34 is 15 m.
Reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and the length of l_56 is 15 m.
Reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and the length of l_78 is 15 m.
Reference points 9, 10: their connecting line l_9,10 is parallel to l_12; the perpendicular distance between l_12 and l_9,10 is 200 m, and the length of l_9,10 is 15 m.
Reference points 11, 12: their connecting line l_11,12 is parallel to l_12; the perpendicular distance between l_12 and l_11,12 is 300 m, and the length of l_11,12 is 15 m.
Some of the 12 reference points are elevated: reference points 7 and 8 are elevated points at a height of 2.8 m, and reference points 11 and 12 are elevated points at a height of 2.6 m.
After the points are selected, a total station prism is placed at each reference point. The total station is placed at the world origin with its positive direction aligned along the runway centreline, and the coordinates of the 12 reference points are measured accurately in the total station coordinate system.
Step 2. Near-infrared laser lamps are chosen as the cooperative marker lights of the near-infrared cameras in the landing navigation system, and optical filters are mounted on the camera lenses. With all marker lights switched on, pictures are taken with the cameras of the landing navigation system, and the image coordinates of the reference points are obtained by manual point selection combined with speck detection.
Step 3. Given correspondences between points X in the three-dimensional world coordinate system and camera image points x, the camera matrix P can be determined. For each corresponding pair X_i and x_i, the following relation is derived:
$$
\begin{bmatrix}
\mathbf{0}^{T} & -w_i \mathbf{X}_i^{T} & y_i \mathbf{X}_i^{T} \\
w_i \mathbf{X}_i^{T} & \mathbf{0}^{T} & -x_i \mathbf{X}_i^{T} \\
-y_i \mathbf{X}_i^{T} & x_i \mathbf{X}_i^{T} & \mathbf{0}^{T}
\end{bmatrix}
\begin{bmatrix} \mathbf{P}^{1} \\ \mathbf{P}^{2} \\ \mathbf{P}^{3} \end{bmatrix} = \mathbf{0}
$$
where P^{iT} is the i-th row of the matrix P, a 4-vector.
Step 4. Using the camera matrix P computed in the first three steps, the two-dimensional image coordinates of any point in the three-dimensional world coordinate system are computed by the formula x = PX.
The beneficial effects of the invention are as follows. Six pairs of bilaterally symmetric reference points are chosen in the large scene region jointly visible to all cameras on the carrier landing runway, and the world coordinates of the reference points are measured accurately with a total station; during calibration, marker lights are placed at the reference points, and the pose of each camera is computed accurately by detecting the marker lights. The method takes into account that a near-infrared camera cannot capture usable calibration-board photographs, so that traditional calibration methods cannot be used for pose estimation, as well as the interference of visible light in the complex natural scenes of UAV landing and the characteristics of infrared cameras themselves: near-infrared lasers are used as cooperative marker lights, and optical filters are mounted on the infrared cameras. The six pairs of reference points are arranged on the two sides of the landing runway, and a total station performs survey-grade measurement of their spatial coordinates. In tests, the calibration results are accurate, with a reprojection error on the images below 0.05 pixel.
The present invention is described in detail below in conjunction with an embodiment.
Embodiment
The specific steps of the wide-baseline near-infrared camera pose estimation method of the present invention are as follows:
1. Selection and coordinate measurement of the reference points.
Camera pose estimation is in essence the computation of the transformation matrix from the world coordinate system to the camera coordinate system, so a world coordinate system must first be established for the UAV autonomous landing runway. In the present invention it is chosen as follows: the cameras are distributed symmetrically on the two sides of the landing runway; the midpoint of the camera baseline is the world origin; the X-axis runs along the runway centreline, the Y-axis is perpendicular to the runway direction, and the Z-axis points straight up, forming a right-handed coordinate system.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between three-dimensional points and image coordinates are needed. The present invention selects 12 reference points (6 pairs) on the two sides of the runway, chosen as follows:
Reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y-direction, and the midpoint of their connecting line l_12 is the world origin. The length of l_12 is 15 m.
Reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and the length of l_34 is 15 m.
Reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and the length of l_56 is 15 m.
Reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and the length of l_78 is 15 m.
Reference points 9, 10: their connecting line l_9,10 is parallel to l_12; the perpendicular distance between l_12 and l_9,10 is 200 m, and the length of l_9,10 is 15 m.
Reference points 11, 12: their connecting line l_11,12 is parallel to l_12; the perpendicular distance between l_12 and l_11,12 is 300 m, and the length of l_11,12 is 15 m.
Under large-scene calibration conditions, in order to guarantee detection precision both on the ground and in the air, some of the reference points are elevated: reference points 7 and 8 are elevated points at a height of 2.8 m, and reference points 11 and 12 are elevated points at a height of 2.6 m.
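Under these distances and heights, the world coordinates of the 12 reference points follow directly from the layout above. The sketch below enumerates them; the helper name is hypothetical, and it assumes the odd-numbered point of each pair lies on the positive-Y side (consistent with the direction from reference point 2 to reference point 1 being +Y):

```python
# Sketch of the reference-point layout described above (hypothetical
# helper; assumes odd-numbered points lie on the positive-Y side).
# Axes: X along the runway centreline, Y across it, Z straight up;
# origin at the midpoint of line l_12. Units: metres.
HALF_LINE = 15.0 / 2.0  # each pair is 15 m apart, symmetric about the centreline

# (point index, perpendicular distance from l_12 along X, height Z)
LAYOUT = [
    (1, 0.0, 0.0),    (2, 0.0, 0.0),
    (3, 50.0, 0.0),   (4, 50.0, 0.0),
    (5, 100.0, 0.0),  (6, 100.0, 0.0),
    (7, 150.0, 2.8),  (8, 150.0, 2.8),    # elevated pair
    (9, 200.0, 0.0),  (10, 200.0, 0.0),
    (11, 300.0, 2.6), (12, 300.0, 2.6),   # elevated pair
]

def reference_points():
    """Return {index: (X, Y, Z)} world coordinates of the 12 points."""
    return {idx: (x, HALF_LINE if idx % 2 else -HALF_LINE, z)
            for idx, x, z in LAYOUT}
```

Under these assumptions, reference point 8, for example, sits at (150, −7.5, 2.8) m.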
A tape measure is used as the measuring tool when laying out the reference points. After the points are selected, a total station prism is placed at each reference point; the total station is placed at the world origin with its positive direction aligned along the runway centreline, and the coordinates of the 12 reference points are measured accurately in the total station coordinate system. In this way the total station coordinate system coincides with the world coordinate system, which simplifies the conversion between the two.
2. Photographing the cooperative marker lights at the reference points and obtaining the reference points' image coordinates.
The purpose of this step is to determine the image coordinates of the reference points in the images captured by the cameras. A cooperative marker light is first placed at each reference point. Considering the various weather conditions that may occur at the landing runway, as well as the background interference and long distances of the large scene, the present invention chooses near-infrared laser lamps as the cooperative marker lights of the near-infrared cameras in the landing navigation system, and mounts optical filters on the camera lenses. With all marker lights switched on, pictures are taken with the cameras of the landing navigation system; since all reference points are selected on the two sides of the landing runway, every marker light appears in the images captured by the cameras. Exploiting the imaging characteristics of the near-infrared camera, the image coordinates of the reference points are obtained by manual point selection combined with speck detection: in interactive software, the position of each marker light is clicked with the mouse, an intensity-weighted average is then taken over the 5×5 neighbourhood of the clicked point to compute the centre of the bright speck in the picture, and this centre coordinate is used as the image coordinate of the reference point.
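The intensity-weighted centroid described above can be sketched as follows. This is a minimal illustration with NumPy; the function name and arguments are hypothetical, and image-boundary checking is omitted:

```python
import numpy as np

def weighted_centroid(img, click_xy, half=2):
    """Refine a hand-clicked marker-light position to sub-pixel accuracy
    by taking the intensity-weighted mean over the (2*half+1)^2
    neighbourhood of the click (5x5 for half=2)."""
    cx, cy = click_xy
    # pixel coordinate grids and intensity patch around the click
    ys, xs = np.mgrid[cy - half:cy + half + 1, cx - half:cx + half + 1]
    patch = img[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    w = patch.sum()
    if w == 0:
        return float(cx), float(cy)  # flat patch: keep the clicked point
    return float((xs * patch).sum() / w), float((ys * patch).sum() / w)
```

A bright speck straddling two pixels thus yields a centre between them, which is what makes the sub-0.05-pixel reprojection figure plausible.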
3. Calibration of the camera parameters.
In the simplest formulation, the three-dimensional coordinate points X are assumed to correspond to camera image points x; given enough correspondences between X and x, the camera matrix P can be determined by the Direct Linear Transformation (DLT) method.
For each corresponding pair X_i and x_i, the following relation can be derived:
$$
\begin{bmatrix}
\mathbf{0}^{T} & -w_i \mathbf{X}_i^{T} & y_i \mathbf{X}_i^{T} \\
w_i \mathbf{X}_i^{T} & \mathbf{0}^{T} & -x_i \mathbf{X}_i^{T} \\
-y_i \mathbf{X}_i^{T} & x_i \mathbf{X}_i^{T} & \mathbf{0}^{T}
\end{bmatrix}
\begin{bmatrix} \mathbf{P}^{1} \\ \mathbf{P}^{2} \\ \mathbf{P}^{3} \end{bmatrix} = \mathbf{0}
$$
where P^{iT} is the i-th row of the matrix P, a 4-vector.
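This relation leads to the standard DLT estimate: stacking two of its three rows per correspondence gives a 2n × 12 homogeneous system, whose null-space direction (the right singular vector of the smallest singular value) yields P up to scale. A minimal sketch, assuming the image points are given in inhomogeneous pixel coordinates so that w_i = 1:

```python
import numpy as np

def dlt_camera_matrix(world_pts, image_pts):
    """Estimate the 3x4 camera matrix P from n >= 6 non-degenerate
    correspondences X_i <-> x_i = (x_i, y_i), w_i = 1, by stacking two
    rows of the DLT relation per correspondence and taking the right
    singular vector of the smallest singular value."""
    A = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        Xh = [X, Y, Z, 1.0]  # homogeneous world point
        A.append([0.0] * 4 + [-c for c in Xh] + [y * c for c in Xh])
        A.append(Xh + [0.0] * 4 + [-x * c for c in Xh])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)  # P, determined up to scale
```

In practice the world and image coordinates would first be normalized (shifted and scaled) for numerical conditioning, and the result refined by minimizing reprojection error; this sketch shows only the linear core.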
4. Camera pose estimation.
Steps 1, 2 and 3 above complete the estimation of the camera matrix P. The reprojection error of P can then be computed via the formula x = PX: the measured world coordinates of the reference points are projected onto the camera image through the estimated camera matrix P, and the pixel distance between each reprojected pixel and the originally detected pixel is computed, which quantitatively evaluates the accuracy of the camera pose estimate. Likewise, the two-dimensional image coordinates of any point in the three-dimensional world coordinate system can be computed directly by x = PX.
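The reprojection-error check in this step can be sketched as follows (hypothetical helper name; it assumes the estimated P and the measured world-to-image correspondences are available):

```python
import numpy as np

def reprojection_error(P, world_pts, image_pts):
    """Mean pixel distance between the detected reference-point image
    coordinates and their reprojections x = P X through the estimated
    camera matrix P -- the accuracy figure quoted in the text."""
    errs = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        x, y, w = np.asarray(P) @ np.array([X, Y, Z, 1.0])
        errs.append(np.hypot(x / w - u, y / w - v))
    return float(np.mean(errs))
```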

Claims (1)

1. A wide-baseline near-infrared camera pose estimation method, characterized by comprising the following steps:
Step 1: establish a world coordinate system for the UAV autonomous landing runway; the near-infrared cameras are placed symmetrically on the two sides of the landing runway; the midpoint of the camera baseline is chosen as the world origin, the X-axis runs along the runway centreline, the Y-axis is perpendicular to the runway direction, and the Z-axis points straight up, forming a right-handed coordinate system;
to estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several correspondences between three-dimensional points and image coordinates are needed, therefore 12 reference points are selected on the two sides of the runway:
reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y-direction, and the midpoint of their connecting line l_12 is the world origin; the length of l_12 is 15 m;
reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and the length of l_34 is 15 m;
reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and the length of l_56 is 15 m;
reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and the length of l_78 is 15 m;
reference points 9, 10: their connecting line l_9,10 is parallel to l_12; the perpendicular distance between l_12 and l_9,10 is 200 m, and the length of l_9,10 is 15 m;
reference points 11, 12: their connecting line l_11,12 is parallel to l_12; the perpendicular distance between l_12 and l_11,12 is 300 m, and the length of l_11,12 is 15 m;
some of the 12 reference points are elevated: reference points 7 and 8 are elevated points at a height of 2.8 m, and reference points 11 and 12 are elevated points at a height of 2.6 m;
after the points are selected, a total station prism is placed at each reference point; the total station is placed at the world origin with its positive direction aligned along the runway centreline, and the coordinates of the 12 reference points are measured accurately in the total station coordinate system;
Step 2: near-infrared laser lamps are chosen as the cooperative marker lights of the near-infrared cameras in the landing navigation system, and optical filters are mounted on the camera lenses; with all marker lights switched on, pictures are taken with the cameras of the landing navigation system, and the image coordinates of the reference points are obtained by manual point selection combined with speck detection;
Step 3: given correspondences between points X in the three-dimensional world coordinate system and camera image points x, the camera matrix P can be determined; for each corresponding pair X_i and x_i, the following relation is derived:
$$
\begin{bmatrix}
\mathbf{0}^{T} & -w_i \mathbf{X}_i^{T} & y_i \mathbf{X}_i^{T} \\
w_i \mathbf{X}_i^{T} & \mathbf{0}^{T} & -x_i \mathbf{X}_i^{T} \\
-y_i \mathbf{X}_i^{T} & x_i \mathbf{X}_i^{T} & \mathbf{0}^{T}
\end{bmatrix}
\begin{bmatrix} \mathbf{P}^{1} \\ \mathbf{P}^{2} \\ \mathbf{P}^{3} \end{bmatrix} = \mathbf{0}
$$
where P^{iT} is the i-th row of the matrix P, a 4-vector;
Step 4: using the camera matrix P computed in the first three steps, compute the two-dimensional image coordinates of points in the three-dimensional world coordinate system by the formula x = PX.
CN201510750446.0A 2015-11-06 2015-11-06 Wide-baseline near-infrared camera pose estimation method Expired - Fee Related CN105424059B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510750446.0A CN105424059B (en) 2015-11-06 2015-11-06 Wide baseline near infrared camera position and orientation estimation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510750446.0A CN105424059B (en) 2015-11-06 2015-11-06 Wide baseline near infrared camera position and orientation estimation method

Publications (2)

Publication Number Publication Date
CN105424059A true CN105424059A (en) 2016-03-23
CN105424059B CN105424059B (en) 2018-10-16

Family

ID=55502430

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510750446.0A Expired - Fee Related CN105424059B (en) 2015-11-06 2015-11-06 Wide baseline near infrared camera position and orientation estimation method

Country Status (1)

Country Link
CN (1) CN105424059B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105890590A (en) * 2016-04-12 2016-08-24 西北工业大学 UAV (unmanned aerial vehicle) remote optical landing guidance system based on infrared laser lamps and multi-camera array
CN106228534A (en) * 2016-07-08 2016-12-14 众趣(北京)科技有限公司 Relation scaling method between a kind of rotating shaft based on constrained global optimization and camera
CN106444792A (en) * 2016-09-18 2017-02-22 中国空气动力研究与发展中心高速空气动力研究所 Infrared visual recognition-based unmanned aerial vehicle landing positioning system and method
CN110764117A (en) * 2019-10-31 2020-02-07 成都圭目机器人有限公司 Method for calibrating relative position of detection robot antenna and sensor based on total station

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103940364A (en) * 2014-05-04 2014-07-23 赵鸣 Subway tunnel relative deformation photogrammetry method
CN104200086A (en) * 2014-08-25 2014-12-10 西北工业大学 Wide-baseline visible light camera pose estimation method
CN104215239A (en) * 2014-08-29 2014-12-17 西北工业大学 Vision-based autonomous unmanned plane landing guidance device and method
CN104637053A (en) * 2015-01-29 2015-05-20 西北工业大学 Method for calibrating wide baseline multi-array camera system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103940364A (en) * 2014-05-04 2014-07-23 赵鸣 Subway tunnel relative deformation photogrammetry method
CN104200086A (en) * 2014-08-25 2014-12-10 西北工业大学 Wide-baseline visible light camera pose estimation method
CN104215239A (en) * 2014-08-29 2014-12-17 西北工业大学 Vision-based autonomous unmanned plane landing guidance device and method
CN104637053A (en) * 2015-01-29 2015-05-20 西北工业大学 Method for calibrating wide baseline multi-array camera system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FENG GUANMING: "A calibration methods for Vision measuring system with large view field", 2011 4th International Congress on Image and Signal Processing *
MEILIAN LIU: "Accurate Installation Method and Precision Analysis for Vision Measurement of Remote Falling Point", International Conference on Information Science and Technology *
ZHANG HENG: "Research on moving-target detection and tracking from a UAV platform and its vision-aided landing system" (in Chinese), China Doctoral Dissertations Full-text Database, Information Science and Technology series *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105890590A (en) * 2016-04-12 2016-08-24 西北工业大学 UAV (unmanned aerial vehicle) remote optical landing guidance system based on infrared laser lamps and multi-camera array
CN106228534A (en) * 2016-07-08 2016-12-14 众趣(北京)科技有限公司 Relation scaling method between a kind of rotating shaft based on constrained global optimization and camera
CN106228534B (en) * 2016-07-08 2019-05-17 众趣(北京)科技有限公司 Relationship scaling method between a kind of shaft and camera based on constrained global optimization
CN106444792A (en) * 2016-09-18 2017-02-22 中国空气动力研究与发展中心高速空气动力研究所 Infrared visual recognition-based unmanned aerial vehicle landing positioning system and method
CN110764117A (en) * 2019-10-31 2020-02-07 成都圭目机器人有限公司 Method for calibrating relative position of detection robot antenna and sensor based on total station

Also Published As

Publication number Publication date
CN105424059B (en) 2018-10-16

Similar Documents

Publication Publication Date Title
CN104197928B (en) Multi-camera collaboration-based method for detecting, positioning and tracking unmanned aerial vehicle
US12008824B2 (en) Target positioning method and device, and unmanned aerial vehicle
CN104200086A (en) Wide-baseline visible light camera pose estimation method
CA3027921C (en) Integrated sensor calibration in natural scenes
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN105588563B (en) Binocular camera and inertial navigation combined calibrating method in a kind of intelligent driving
CN104217439B (en) Indoor visual positioning system and method
CN106651953B (en) A kind of vehicle position and orientation estimation method based on traffic sign
US10909395B2 (en) Object detection apparatus
CN106408601B (en) A kind of binocular fusion localization method and device based on GPS
KR20180050823A (en) Generating method and apparatus of 3d lane model
CN108335337B (en) method and device for generating orthoimage picture
CN105424006A (en) Unmanned aerial vehicle hovering precision measurement method based on binocular vision
CN105913410A (en) Long-distance moving object height measurement apparatus and method based on machine vision
CN106019264A (en) Binocular vision based UAV (Unmanned Aerial Vehicle) danger vehicle distance identifying system and method
CN109685855A (en) A kind of camera calibration optimization method under road cloud monitor supervision platform
CN102650886A (en) Vision system based on active panoramic vision sensor for robot
CN109146958B (en) Traffic sign space position measuring method based on two-dimensional image
CN105424059A (en) Wide baseline infrared camera pose estimation method
CN106197382B (en) A kind of vehicle-mounted single camera target dynamic distance measuring method
CN103679647A (en) Point cloud model true color processing method of three-dimensional laser imaging system
CN103411587A (en) Positioning and attitude-determining method and system
Crispel et al. All-sky photogrammetry techniques to georeference a cloud field
CN113340272B (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
CN102168973B (en) Automatic navigating Z-shaft positioning method for omni-directional vision sensor and positioning system thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20181016

Termination date: 20191106

CF01 Termination of patent right due to non-payment of annual fee