CN105424059A - Wide baseline infrared camera pose estimation method (Google Patents)
 Publication number
 CN105424059A (application CN201510750446.0A)
 Authority
 CN
 China
 Prior art keywords
 reference point
 camera
 line
 point
 coordinate system
 Prior art date
 Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Classifications

 G—PHYSICS
 G01—MEASURING; TESTING
 G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
 G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
Abstract
The invention discloses a wide-baseline infrared camera pose estimation method that addresses the poor practicality of existing methods. In the technical scheme, six bilaterally symmetric pairs of reference points are selected in the large-scale scene region commonly visible to all cameras along the carrier landing runway, and the world coordinates of the reference points are measured precisely with a total station. During calibration, marker lights are placed at the reference points, and the pose of each camera is computed accurately by detecting the marker lights. The method accounts for the fact that infrared cameras cannot capture usable photographs of a calibration board, for the influence of visible light in the complex natural scenes of an unmanned aerial vehicle (UAV) landing site on calibration results, and for the characteristics of the infrared cameras themselves: infrared laser lamps are used as cooperative markers, and optical filters are mounted on the infrared cameras. The six pairs of reference points are arranged on the two sides of the landing runway, and the total station is used to measure their spatial coordinates with high accuracy. Testing shows the calibration results are accurate, with a reprojection error on the image of 0.05 pixel or below.
Description
Technical field
The present invention relates to a wide-baseline near-infrared camera pose estimation method, and in particular to such a method for an unmanned aerial vehicle (UAV) autonomous carrier-landing guidance system.
Background technology
Vision-based autonomous UAV carrier-landing navigation is a current research hotspot in aviation technology; it is biologically inspired and, compared with traditional inertial and satellite navigation, offers system autonomy, strong interference immunity, and high precision. In such navigation, measurement cameras are mounted in a navigation system installed along the landing runway. The fixed-focus cameras in the navigation system are often affected by ambient light during tracking and localization; to avoid this influence, existing visual navigation systems use near-infrared cameras that capture only light of the specific wavelength emitted by the target light points. In addition, to obtain the correspondence between three-dimensional points in space and camera image coordinates, the cameras must be calibrated, and the precision of camera calibration directly determines the precision of optical localization.
Conventional calibration methods require placing a calibration reference of known shape and size, such as a checkerboard or cross target, in front of the camera to be calibrated. However, a near-infrared camera cannot capture the texture of an ordinary optical calibration board, so it cannot be calibrated with the boards used for ordinary visible-light cameras. Moreover, the visual navigation system must cover a sufficiently large portion of the landing runway, so the cameras must be fitted with telephoto lenses; because a telephoto lens has a shallow depth of field and is difficult to focus accurately, the traditional approach of photographing a precision calibration object becomes very difficult.
Document " Acameracalibrationmethodforlargefieldopticalmeasurement; Optik123 (2013) 65536558 " proposes a kind of scaling method for the wide baseline multicamera system of large scene, first the method utilizes gridiron pattern scaling board to calculate the internal reference matrix of camera, then local mode of placing monumented point is utilized to calculate the outer ginseng matrix of camera, wherein the projection be placed on camera image of monumented point is similar to the transverse matrix being parallel to image, can simplify the calculating of longitudinal distortion coefficients of camera lens thus.With regard to unmanned plane autonomous landing on the ship system, in scene, the meeting of visible ray produces interference greatly.And for infrared camera, conventional scaling method needs to place calibrated reference before camera to be calibrated, the scaling board of usual known form and size is demarcated, as gridiron pattern, cross etc., but because near infrared camera can not obtain the texture information of ordinary optical camera calibration plate, so near infrared video camera cannot be demarcated with the scaling board of general visible video camera.And vision navigation system needs the landing runway covering enough scopes, so need to install telephoto lens additional to video camera, because the telephoto lens depth of field is short, be difficult to accurate focusing, bring very large difficulty to the method for tradition shooting precision calibration thing.
Summary of the invention
To overcome the poor practicality of existing methods, the invention provides a wide-baseline near-infrared camera pose estimation method. Six bilaterally symmetric pairs of reference points are chosen in the large scene region commonly visible to each camera along the carrier landing runway, and the world coordinates of the reference points are measured precisely with a total station. During calibration, marker lights are placed at the reference point locations, and the pose of each camera is computed accurately by detecting the marker lights. The method takes into account that a near-infrared camera cannot photograph a calibration board effectively, so traditional calibration methods cannot be used for pose estimation, as well as the influence of visible light in the complex natural scenes of a UAV landing site on calibration results and the characteristics of the infrared cameras themselves. The design uses infrared laser lamps as cooperative markers and mounts optical filters on the infrared cameras. The six pairs of reference points are placed on both sides of the landing runway, and a total station measures their spatial coordinates with high accuracy. Testing shows the calibration results are accurate, with a reprojection error on the image of 0.05 pixel or below.
The technical solution adopted by the present invention to solve this technical problem is a wide-baseline near-infrared camera pose estimation method, characterized by comprising the following steps:
Step 1: establish a world coordinate system for the UAV autonomous landing runway. The near-infrared cameras are placed symmetrically on the two sides of the landing runway. The midpoint of the camera baseline is chosen as the world origin; the X-axis runs along the runway centerline, the Y-axis is perpendicular to the runway heading, and the Z-axis points straight up, forming a right-handed coordinate system.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several corresponding pairs of three-dimensional points and image coordinates are needed, so 12 reference points are selected along the two sides of the runway:
Reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y direction, and the midpoint of their connecting line l_12 is the world origin; l_12 is 15 m long.
Reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and l_34 is 15 m long.
Reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and l_56 is 15 m long.
Reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and l_78 is 15 m long.
Reference points 9, 10: their connecting line l_910 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m, and l_910 is 15 m long.
Reference points 11, 12: their connecting line l_1112 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m, and l_1112 is 15 m long.
Some of the 12 reference points are elevated: reference points 7 and 8 are high points at a height of 2.8 m, and reference points 11 and 12 are high points at a height of 2.6 m.
After the points are selected, a total station prism is placed at each reference point. The total station is set up at the world origin with its positive direction aligned along the runway centerline, and the coordinates of the 12 reference points are measured precisely in the total station coordinate system.
Step 2: choose near-infrared laser lamps as the cooperative marker lights for the near-infrared cameras in the landing navigation system, and mount optical filters on the camera lenses. After each marker light is switched on, pictures are taken with the cameras of the landing navigation system, and the image coordinates of the reference points are obtained by manual point selection combined with neighborhood detection.
Step 3: given correspondences between points X in the three-dimensional world coordinate system and camera image points x, the camera matrix P can be determined. For each correspondence X <-> x = (x, y, w)^T, the following relation is derived:

    [  0^T    -w X^T    y X^T ] [ P^1 ]
    [ w X^T     0^T    -x X^T ] [ P^2 ] = 0
                                [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-vector.
Step 4: using the camera transformation matrix P computed in the first three steps, the two-dimensional image coordinates of points given in the three-dimensional world coordinate system are computed by the formula x = PX.
The beneficial effects of the invention are as follows. Six bilaterally symmetric pairs of reference points are chosen in the large scene region commonly visible to each camera along the carrier landing runway, and the world coordinates of the reference points are measured precisely with a total station; during calibration, marker lights are placed at the reference points, and the pose of each camera is computed accurately by detecting them. The method takes into account that a near-infrared camera cannot photograph a calibration board effectively, so traditional calibration methods cannot be used for pose estimation, as well as the influence of visible light in the complex natural scene of a UAV landing site and the characteristics of the infrared cameras themselves. Infrared laser lamps are used as cooperative markers, and optical filters are mounted on the infrared cameras. The six pairs of reference points are placed on both sides of the landing runway, and a total station measures their spatial coordinates with high accuracy. Testing shows the calibration results are accurate, with a reprojection error on the image of 0.05 pixel or below.
The present invention is described in detail below with reference to an embodiment.
Embodiment
The specific steps of the wide-baseline near-infrared camera pose estimation method of the present invention are as follows:
1. Selection and coordinate measurement of the reference points.
Camera pose estimation is in fact the computation of the transformation matrix from the world coordinate system to the camera coordinate system, so a world coordinate system must first be established for the UAV autonomous landing runway. In the present invention it is chosen as follows: the cameras are distributed symmetrically on the two sides of the landing runway; the midpoint of the camera baseline is the world origin; the X-axis runs along the runway centerline, the Y-axis is perpendicular to the runway heading, and the Z-axis points straight up, forming a right-handed coordinate system.
To estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several corresponding pairs of three-dimensional points and image coordinates are needed. In the present invention, 12 reference points (6 pairs) are selected along the two sides of the runway, chosen as follows:
Reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y direction, and the midpoint of their connecting line l_12 is the world origin; l_12 is 15 m long.
Reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and l_34 is 15 m long.
Reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and l_56 is 15 m long.
Reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and l_78 is 15 m long.
Reference points 9, 10: their connecting line l_910 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m, and l_910 is 15 m long.
Reference points 11, 12: their connecting line l_1112 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m, and l_1112 is 15 m long.
To guarantee detection accuracy both on the ground and in the air when calibrating in a large scene, some of the reference points are elevated: reference points 7 and 8 are high points at a height of 2.8 m, and reference points 11 and 12 are high points at a height of 2.6 m.
A tape measure is used to lay out the reference point positions. After the points are selected, a total station prism is placed at each reference point; the total station is set up at the world origin with its positive direction aligned along the runway centerline, and the coordinates of the 12 reference points are measured precisely in the total station coordinate system. In this way the total station coordinate system coincides with the world coordinate system, which simplifies the conversion between the two.
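The nominal world coordinates implied by the layout above can be written down directly. The sketch below (Python) is illustrative only: the text fixes which point of a pair lies on the +Y side only for pair (1, 2), so assigning the odd-numbered point of every pair to +Y is an assumption.

```python
def reference_points():
    """Nominal world coordinates (metres) of the 12 reference points.

    Frame convention from the text: origin at the midpoint of l_12, X along
    the runway centerline, Y across the runway, Z up (right-handed). Placing
    the odd-numbered point of each pair on the +Y side is an assumption
    (the text fixes this only for points 1 and 2).
    """
    half = 15.0 / 2.0  # each pair is 15 m apart, centred on the runway axis
    # (pair ids, distance along X from l_12, height Z); pairs 7,8 and 11,12 are elevated
    layout = [((1, 2), 0.0, 0.0), ((3, 4), 50.0, 0.0), ((5, 6), 100.0, 0.0),
              ((7, 8), 150.0, 2.8), ((9, 10), 200.0, 0.0), ((11, 12), 300.0, 2.6)]
    pts = {}
    for (a, b), x, z in layout:
        pts[a] = (x, +half, z)  # odd-numbered point on the +Y side (assumed)
        pts[b] = (x, -half, z)
    return pts
```

In the actual system these nominal positions only guide the layout; the coordinates fed into the calibration are those measured by the total station.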
2. Photographing the cooperative marker lights at the reference points and obtaining the image coordinates of the reference points.
The purpose of this step is to determine the image coordinates of the reference points in the images captured by the cameras. A cooperative marker light is first placed at each reference point. Considering the various weather conditions that may occur on the landing runway, as well as the background interference and long distances of the large scene, the present invention chooses near-infrared laser lamps as the cooperative marker lights for the near-infrared cameras in the landing navigation system and mounts optical filters on the camera lenses. After each marker light is switched on, pictures are taken with the cameras of the landing navigation system; because the reference points are all selected along the two sides of the landing runway, every marker light appears in the images captured by the cameras. Exploiting the imaging characteristics of the near-infrared camera, the image coordinates of the reference points are obtained by manual point selection combined with neighborhood detection: in purpose-built interactive software, the position of each marker light is clicked with the mouse, then an intensity-weighted average over the 5x5 neighborhood of the clicked point is used to compute the center of the bright spot in the picture, and that center is taken as the image coordinate of the reference point.
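The intensity-weighted average over the 5x5 neighborhood of the clicked pixel can be sketched as follows (Python with NumPy; the function name and argument order are illustrative, and the window is assumed to lie fully inside the image):

```python
import numpy as np

def spot_center(img, u, v, half=2):
    """Refine a hand-clicked point (u, v) to the intensity-weighted centroid
    of the (2*half+1) x (2*half+1) window around it (5x5 for half=2)."""
    patch = img[v - half:v + half + 1, u - half:u + half + 1].astype(float)
    total = patch.sum()
    if total == 0.0:                      # no signal: keep the clicked point
        return float(u), float(v)
    ys, xs = np.mgrid[v - half:v + half + 1, u - half:u + half + 1]
    return float((xs * patch).sum() / total), float((ys * patch).sum() / total)
```

Sub-pixel spot centers of this kind are what make a 0.05-pixel reprojection figure attainable at all.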
3. Calibration of the camera parameters.
The simplest correspondence assumption is that 3D coordinate points X correspond to camera image points x; if enough correspondences between X and x are known, the camera matrix P can be determined using the Direct Linear Transformation (DLT) method.
For each correspondence X <-> x = (x, y, w)^T, the following relation can be derived:

    [  0^T    -w X^T    y X^T ] [ P^1 ]
    [ w X^T     0^T    -x X^T ] [ P^2 ] = 0
                                [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-dimensional vector.
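The relation above yields two linear equations in the entries of P per correspondence; stacking them for all reference points and taking the right null vector gives the DLT solution. A minimal sketch (Python with NumPy; the function name is illustrative, and in practice Hartley-style coordinate normalization would be added for numerical stability):

```python
import numpy as np

def dlt_camera_matrix(world_pts, image_pts):
    """Estimate the 3x4 camera matrix P from correspondences X <-> x.

    Each correspondence (with w = 1) contributes the two rows
      [ 0^T  -X^T  y*X^T ]  and  [ X^T  0^T  -x*X^T ],
    and P, stacked row-wise into a 12-vector, is the null vector of the
    stacked system, taken as the singular vector of the smallest singular
    value. Needs at least 6 points, not all coplanar.
    """
    rows = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        Xh = [X, Y, Z, 1.0]           # homogeneous world point
        zeros = [0.0] * 4
        rows.append(zeros + [-c for c in Xh] + [y * c for c in Xh])
        rows.append(Xh + zeros + [-x * c for c in Xh])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)       # P is defined only up to scale
```

This is why the patent elevates two of the point pairs: with all 12 points on the ground plane the system would be degenerate and P could not be recovered in full.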
4. Camera pose estimation.
Steps 1-3 above complete the estimation of the camera transformation matrix P. The reprojection error of P can then be computed with the formula x = PX: the measured world coordinates of the reference points are projected through the estimated camera transformation matrix P onto the camera image, and the pixel distance between each reprojected point and the original image point is computed, providing a quantitative evaluation of the camera pose estimation accuracy. The formula x = PX can also be used directly to compute the two-dimensional image coordinates of points given in the three-dimensional world coordinate system.
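The reprojection check described above amounts to projecting each measured world point through the estimated P and comparing with the detected image point. A small helper sketch (Python with NumPy; the function name is illustrative):

```python
import numpy as np

def reprojection_error(P, world_pts, image_pts):
    """Mean pixel distance between x = P X (dehomogenized) and the measured
    image points -- the quantity the text reports as 0.05 pixel or below."""
    errs = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        h = P @ np.array([X, Y, Z, 1.0])          # project: x = P X
        errs.append(np.hypot(h[0] / h[2] - x, h[1] / h[2] - y))
    return float(np.mean(errs))
```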
Claims (1)
1. A wide-baseline near-infrared camera pose estimation method, characterized by comprising the following steps:
Step 1: establish a world coordinate system for the UAV autonomous landing runway; the near-infrared cameras are placed symmetrically on the two sides of the landing runway; the midpoint of the camera baseline is chosen as the world origin; the X-axis runs along the runway centerline, the Y-axis is perpendicular to the runway heading, and the Z-axis points straight up, forming a right-handed coordinate system;
to estimate the transformation matrix between three-dimensional world coordinates and two-dimensional camera image coordinates, several corresponding pairs of three-dimensional points and image coordinates are needed, so 12 reference points are selected along the two sides of the runway:
reference points 1, 2: the direction from reference point 2 to reference point 1 is the positive Y direction, and the midpoint of their connecting line l_12 is the world origin; l_12 is 15 m long;
reference points 3, 4: their connecting line l_34 is parallel to l_12; the perpendicular distance between l_12 and l_34 is 50 m, and l_34 is 15 m long;
reference points 5, 6: their connecting line l_56 is parallel to l_12; the perpendicular distance between l_12 and l_56 is 100 m, and l_56 is 15 m long;
reference points 7, 8: their connecting line l_78 is parallel to l_12; the perpendicular distance between l_12 and l_78 is 150 m, and l_78 is 15 m long;
reference points 9, 10: their connecting line l_910 is parallel to l_12; the perpendicular distance between l_12 and l_910 is 200 m, and l_910 is 15 m long;
reference points 11, 12: their connecting line l_1112 is parallel to l_12; the perpendicular distance between l_12 and l_1112 is 300 m, and l_1112 is 15 m long;
some of the 12 reference points are elevated: reference points 7 and 8 are high points at a height of 2.8 m, and reference points 11 and 12 are high points at a height of 2.6 m;
after the points are selected, a total station prism is placed at each reference point; the total station is set up at the world origin with its positive direction aligned along the runway centerline, and the coordinates of the 12 reference points are measured precisely in the total station coordinate system;
Step 2: choose near-infrared laser lamps as the cooperative marker lights for the near-infrared cameras in the landing navigation system, and mount optical filters on the camera lenses; after each marker light is switched on, pictures are taken with the cameras of the landing navigation system, and the image coordinates of the reference points are obtained by manual point selection combined with neighborhood detection;
Step 3: given correspondences between points X in the three-dimensional world coordinate system and camera image points x, the camera matrix P can be determined; for each correspondence X <-> x = (x, y, w)^T, the following relation is derived:

    [  0^T    -w X^T    y X^T ] [ P^1 ]
    [ w X^T     0^T    -x X^T ] [ P^2 ] = 0
                                [ P^3 ]

where P^{iT} is the i-th row of the matrix P, a 4-vector;
Step 4: using the camera transformation matrix P computed in the first three steps, the two-dimensional image coordinates of points given in the three-dimensional world coordinate system are computed by the formula x = PX.
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

CN201510750446.0A CN105424059B (en)  20151106  20151106  Wide baseline near infrared camera position and orientation estimation method 
Publications (2)
Publication Number  Publication Date 

CN105424059A true CN105424059A (en)  20160323 
CN105424059B CN105424059B (en)  20181016 
Family
ID=55502430
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

CN201510750446.0A Expired  Fee Related CN105424059B (en)  20151106  20151106  Wide baseline near infrared camera position and orientation estimation method 
Country Status (1)
Country  Link 

CN (1)  CN105424059B (en) 
Citations (4)
Publication number  Priority date  Publication date  Assignee  Title 

CN103940364A (en) *  20140504  20140723  赵鸣  Subway tunnel relative deformation photogrammetry method 
CN104200086A (en) *  20140825  20141210  西北工业大学  Widebaseline visible light camera pose estimation method 
CN104215239A (en) *  20140829  20141217  西北工业大学  Visionbased autonomous unmanned plane landing guidance device and method 
CN104637053A (en) *  20150129  20150520  西北工业大学  Method for calibrating wide baseline multiarray camera system 
NonPatent Citations (3)
Title 

FENG GUANMING: "A calibration method for vision measuring systems with large view field", 2011 4th International Congress on Image and Signal Processing
MEILIAN LIU: "Accurate Installation Method and Precision Analysis for Vision Measurement of Remote Falling Point", International Conference on Information Science and Technology
张恒 (Zhang Heng): "Research on moving target detection and tracking on a UAV platform and its vision-aided landing system", China Doctoral Dissertations Full-text Database, Information Science and Technology series
Cited By (5)
Publication number  Priority date  Publication date  Assignee  Title 

CN105890590A (en) *  20160412  20160824  西北工业大学  UAV (unmanned aerial vehicle) remote optical landing guidance system based on infrared laser lamps and multicamera array 
CN106228534A (en) *  20160708  20161214  众趣（北京）科技有限公司  Relation scaling method between a kind of rotating shaft based on constrained global optimization and camera 
CN106228534B (en) *  20160708  20190517  众趣（北京）科技有限公司  Relationship scaling method between a kind of shaft and camera based on constrained global optimization 
CN106444792A (en) *  20160918  20170222  中国空气动力研究与发展中心高速空气动力研究所  Infrared visual recognitionbased unmanned aerial vehicle landing positioning system and method 
CN110764117A (en) *  20191031  20200207  成都圭目机器人有限公司  Method for calibrating relative position of detection robot antenna and sensor based on total station 
Also Published As
Publication number  Publication date 

CN105424059B (en)  20181016 
Legal Events
Date  Code  Title  Description 

C06  Publication
PB01  Publication
C10  Entry into substantive examination
SE01  Entry into force of request for substantive examination
GR01  Patent grant
CF01  Termination of patent right due to non-payment of annual fee (granted publication date: 20181016; termination date: 20191106)