CN113324538B - Cooperative target remote high-precision six-degree-of-freedom pose measurement method - Google Patents


Info

Publication number
CN113324538B
CN113324538B (application CN202110501861.8A)
Authority
CN
China
Prior art keywords
pose
coordinate system
camera
initial
cooperative target
Prior art date
Legal status
Active
Application number
CN202110501861.8A
Other languages
Chinese (zh)
Other versions
CN113324538A (en)
Inventor
赵汝进
朱自发
马跃博
颜坤
刘恩海
易晋辉
曾思康
Current Assignee
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Optics and Electronics of CAS
Priority to CN202110501861.8A
Publication of CN113324538A
Application granted
Publication of CN113324538B
Active legal status, current
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a remote high-precision six-degree-of-freedom pose measurement method for a cooperative target, comprising the following steps: (1) acquire the initial six-degree-of-freedom pose and initial depth value of the cooperative target; (2) compute the equation of the laser-spot circle; (3) solve the equation of the spatial straight line determined by the two three-dimensional points given by the initial position and the camera optical center; (4) solve the intersection of the spot circle and the straight line, and use it to correct the initial pose; (5) further refine the corrected result with a nonlinear optimization method. The method uses the accurate depth value provided by the laser range finder to correct the large error along the line-of-sight direction of monocular vision pose measurement, while also eliminating the interference error caused by the dispersion of the laser range finder's spot.

Description

Cooperative target remote high-precision six-degree-of-freedom pose measurement method
Technical Field
The invention relates to the technical field of pose measurement based on multi-sensor fusion, in particular to a cooperative target long-distance high-precision six-degree-of-freedom pose measurement method.
Background
High-precision six-degree-of-freedom pose measurement of a target enables precise and efficient machine operation, and is therefore widely applied in fields such as autonomous robotics, precision machining, and spacecraft docking. Monocular vision is a non-contact six-degree-of-freedom pose measurement method with low hardware complexity, high precision, and a wide measurement range, making it one of the most common approaches at present. However, owing to limitations of the model itself, the measurement accuracy along the camera optical axis is usually much lower than that along the other two directions; according to the accuracy analysis of monocular three-point pose measurement by Zhao Lianjun et al. (Optics and Precision Engineering, 2014, Vol. 22, No. 5), the accuracy along the optical axis is only one twentieth of that along the other two directions.
In the monocular camera-laser ranging sensor pose measurement system of Chao Zhichao et al. (Acta Optica Sinica, Vol. 31, No. 3, pp. 85-91), the depth value provided by the laser range finder directly replaces the line-of-sight displacement in the monocular vision result; the error caused by laser spot dispersion is not considered, so the correction effect is limited and only millimeter-level precision is achieved.
Disclosure of Invention
Aiming at the large measurement error along the camera optical axis in existing monocular vision measurement, the invention provides a cooperative target remote high-precision six-degree-of-freedom pose measurement method based on multi-sensor fusion. The method corrects the initial pose measured by the camera using the accurate depth information obtained by a laser range finder, and eliminates the measurement error caused by laser spot dispersion. It establishes the spot dispersion circle, determines the straight line through the initially measured position and the camera optical center, and thereby converts the fusion of the two sensors into the geometric problem of intersecting a spatial circular surface with a spatial straight line. The computed intersection point is the corrected measurement position; after correction, the displacement measurement accuracy along the line-of-sight direction is higher than that of the other two directions.
The technical scheme adopted by the invention is as follows: a cooperative target remote high-precision six-degree-of-freedom pose measurement method comprises the following steps:
step (1), obtain a cooperative target image and solve the initial pose T_i = [R_i | t_i] of the cooperative target with a PnP algorithm, where R_i represents the attitude information, t_i the displacement information, and the subscript i marks the initial result;
the specific process of the step (1) is as follows:
step (11), image the cooperative target with the camera, extract the sub-pixel coordinates p_i = (u_i, v_i) of each feature imaging center with the gray-scale gravity-center method, and match the 3D cooperative feature points P_i^T (i = 1, 2, 3, 4) on the target with their corresponding 2D imaging points p_i in the image;
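As an illustrative sketch only (not part of the patent text): the gray-scale gravity-center method of step (11) computes the intensity-weighted mean of pixel coordinates over a window around each LED spot. A minimal Python/NumPy version, with the window size, synthetic spot values, and function name chosen here for illustration, might look like:

```python
import numpy as np

def gray_centroid(patch, origin=(0, 0)):
    """Sub-pixel spot center by the gray-scale gravity-center method:
    the intensity-weighted mean of the pixel coordinates in the patch.
    `origin` is the (u, v) image position of the patch's top-left pixel."""
    patch = np.asarray(patch, dtype=float)
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    total = patch.sum()
    u = (xs * patch).sum() / total + origin[0]
    v = (ys * patch).sum() / total + origin[1]
    return u, v

# A symmetric synthetic spot: the centroid falls at the patch center (1, 1).
spot = np.array([[0, 1, 0],
                 [1, 4, 1],
                 [0, 1, 0]], dtype=float)
u, v = gray_centroid(spot)
```

In practice the patch would be cut from the camera image around each detected LED, and the resulting (u, v) pairs are the 2D points fed to the PnP solver.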
step (12), based on a PnP algorithm, perform pose calculation with the 4 pairs of 2D-3D matching points from step (11) to obtain the pose transformation T_i = [R_i | t_i] of the cooperative target coordinate system relative to the camera optical-center coordinate system.
Step (2), measure the accurate depth value D_a with the laser range finder.
Step (3), use the accurate depth D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1), obtaining the corrected pose T_a = [R_a | t_a], where the subscript a marks the corrected result;
the specific process of the step (3) is as follows:
step (31), in the camera coordinate system, the initial position t_i = [x_t^i, y_t^i, z_t^i]^T measured by the camera and the origin O = [0, 0, 0]^T of the camera optical-center coordinate system determine a straight line L_OT with equation

x / l = y / m = z / n,

where the superscript i marks the initial result and {l, m, n} are the direction parameters of the line (since the line passes through the origin, they can be taken as l = x_t^i, m = y_t^i, n = z_t^i);
step (32), the transformation between the laser-range-finder coordinate system and the camera optical-center coordinate system is calibrated in advance, and can be represented by a spatial three-dimensional point and a spatial direction:

O_LO = [x_L, y_L, z_L]^T
n_LO = [n_xL, n_yL, n_zL]^T

where x_L, y_L, z_L are the coordinates of the laser-range-finder origin along the X, Y, Z directions of the camera coordinate system, and n_xL, n_yL, n_zL are the Euler rotation angles from the camera coordinate system to the laser-range-finder coordinate system.
Because the laser beam diverges at its far end, the laser spot is not an exact point but has a finite size; without loss of generality, assume the spot disperses into a circle of radius r. Since the representation of the laser-range-finder coordinate system in the camera coordinate system and the distance measured by the range finder are known, the center point P = [x_r, y_r, z_r]^T of the spot circle can be computed, and the dispersion circle then satisfies

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_x (X - x_r) + n_y (Y - y_r) + n_z (Z - z_r) = 0

where [X, Y, Z]^T is a point of the circle in the camera coordinate system and [n_x, n_y, n_z]^T is the unit direction of the laser beam in that system (the plane of the circle is perpendicular to the beam).
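The spot-circle construction of step (32) can be sketched as follows (not the patent's implementation): the calibrated range-finder origin O_LO plus the measured distance along the beam direction gives the spot center, and the beam direction doubles as the normal of the circle's plane. The Z-Y-X Euler convention and all numeric values below are assumptions for illustration; the patent only says "Euler rotation angles".

```python
import numpy as np

def euler_zyx_to_R(rz, ry, rx):
    """Rotation matrix from Z-Y-X Euler angles in radians.
    The convention is an assumption -- the patent does not specify one."""
    cz, sz = np.cos(rz), np.sin(rz)
    cy, sy = np.cos(ry), np.sin(ry)
    cx, sx = np.cos(rx), np.sin(rx)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def laser_spot_center(O_LO, euler, D_a):
    """Spot-circle center P = O_LO + D_a * d, with d the beam direction
    (range-finder Z axis) expressed in the camera frame; d is also the
    normal of the dispersion-circle plane."""
    R = euler_zyx_to_R(*euler)
    d = R @ np.array([0.0, 0.0, 1.0])   # beam direction = rotated Z axis
    return O_LO + D_a * d, d

# Illustrative calibration: range finder 10 cm to the right of the camera,
# beam parallel to the optical axis, measured distance 10 m.
P, n = laser_spot_center(np.array([0.1, 0.0, 0.0]), (0.0, 0.0, 0.0), 10.0)
```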
step (33), find the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the coordinates of the intersection point give the corrected cooperative target position. The joint system combines the line equation with the plane of the dispersion circle:

x / l = y / m = z / n
n_x (x - x_r) + n_y (y - y_r) + n_z (z - z_r) = 0

Solving this system yields the corrected position coordinates t_a = [x_t^a, y_t^a, z_t^a]^T, where the superscript a marks the corrected result (the solution is accepted when it lies within radius r of the spot center). Updating the displacement component of T_i = [R_i | t_i] with t_a gives the initially corrected pose T_a = [R_a | t_a].
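The intersection of step (33) reduces to a line-plane intersection with a radius check, since the line through the optical center meets the plane of the dispersion circle in one point, accepted only if it lies inside the spot. A minimal sketch under that reading (names and numbers illustrative, not the patent's code):

```python
import numpy as np

def correct_position(t_i, P, n, r):
    """Intersect the sight line through the camera optical center O = 0 and
    the initial monocular position t_i with the plane of the laser dispersion
    circle (center P, unit normal n); accept the hit only if it falls within
    the spot of radius r. Returns the corrected position t_a."""
    d = np.asarray(t_i, dtype=float)
    d = d / np.linalg.norm(d)                 # direction of line L_OT
    denom = float(np.dot(n, d))
    if abs(denom) < 1e-12:
        raise ValueError("sight line is parallel to the spot plane")
    s = float(np.dot(n, P)) / denom           # signed distance along the line
    X = s * d                                 # line-plane intersection
    if np.linalg.norm(X - P) > r:
        raise ValueError("intersection lies outside the dispersion circle")
    return X

# Illustrative numbers: monocular depth off by 10 cm, laser spot centered at
# 10 m on a plane perpendicular to the optical axis, spot radius 5 cm.
t_a = correct_position([0.1, 0.05, 9.9],
                       np.array([0.1, 0.05, 10.0]),
                       np.array([0.0, 0.0, 1.0]), r=0.05)
```

Note how the corrected Z component snaps to the laser-accurate depth while the transverse components move only slightly, which matches the stated goal of fixing the line-of-sight error.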
Step (4), further refine the corrected pose T_a = [R_a | t_a] from step (3) with a nonlinear optimization method; the specific optimization model is

T'_a = argmin over T = [R | t] of  sum_{i=1}^{4} || p_i - Pro(P_i^T) ||^2 + lambda (z_t - z_t^a)^2

where p_i are the 2D imaging points in the image, Pro(P_i^T) is the reprojection function of the feature points on the spatial cooperative target, lambda is a weighting factor typically set to 5, z_t^a is the corrected Z-direction displacement from step (3), and z_t is the Z-direction displacement of the current iterate (initialized with the initial value z_t^i). Minimizing this expression yields the more accurate pose T'_a = [R'_a | t'_a].
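The joint cost of step (4) can be written out as reprojection error plus a weighted depth term. The sketch below (a hypothetical reading, not the patent's implementation) assumes a pinhole camera with illustrative intrinsics K, a fixed rotation R, and optimization over the translation t only; in practice the full pose would be refined, e.g. with a nonlinear least-squares solver.

```python
import numpy as np

def reproject(K, R, t, Pts):
    """Pinhole reprojection of 3D target points (N x 3) into the image."""
    Pc = (R @ Pts.T).T + t            # points in the camera frame
    uv = (K @ Pc.T).T                 # homogeneous pixel coordinates
    return uv[:, :2] / uv[:, 2:3]

def fusion_cost(t, K, R, Pts, obs, z_a, lam=5.0):
    """Step (4) cost: squared reprojection residuals plus lam times the
    squared deviation of the Z displacement from the corrected value z_a."""
    res = reproject(K, R, t, Pts) - obs
    return float((res ** 2).sum() + lam * (t[2] - z_a) ** 2)

# Illustrative setup: four coplanar feature points 10 m in front of the camera.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
R = np.eye(3)
Pts = np.array([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0],
                [-0.1, 0.0, 0.0], [0.0, -0.1, 0.0]])
t_true = np.array([0.0, 0.0, 10.0])
obs = reproject(K, R, t_true, Pts)          # noise-free observations
print(fusion_cost(t_true, K, R, Pts, obs, z_a=10.0))  # 0.0
```

At the true pose with a consistent laser depth the cost is zero, and it grows for any perturbation, so minimizing it (by gradient descent, Gauss-Newton, or a library optimizer) recovers the refined pose T'_a.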
Compared with the prior art, the invention has the advantages that:
(1) The invention solves the problem of large error along the line-of-sight direction in existing monocular pose measurement methods;
(2) The invention eliminates the error caused by laser spot dispersion in existing methods that fuse a camera with a laser range finder;
(3) The invention achieves extremely high precision with a small computational load and high stability.
Drawings
FIG. 1 is a detailed flow chart of a remote high-precision six-degree-of-freedom pose measurement method for a cooperative target according to the present invention;
FIG. 2 is a target coordinate layout of the present invention;
FIG. 3 is an image point of a cooperative target feature of the present invention in the camera plane;
FIG. 4 is a schematic diagram of the fusion method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples.
The specific flow of the cooperative target remote high-precision six-degree-of-freedom pose measurement method is shown in Fig. 1; the test distance is set to 10 meters. The cooperative target design is shown in Fig. 2, and the accurate coordinates of the six LEDs, calibrated with a coordinate measuring machine, are given in Table 1.
TABLE 1 exact coordinates of six LEDs
The method comprises the following specific steps:
step (1), obtain a cooperative target image and solve the initial pose T_i = [R_i | t_i] of the cooperative target with a PnP algorithm; the specific process of step (1) is as follows:
step (11), image with the camera, then extract the sub-pixel coordinates p_i = (u_i, v_i) of the feature imaging centers with the gray-scale gravity-center method (Fig. 3 shows the image of the cooperative target features in one experiment; Table 2 gives the sub-pixel image coordinates of the imaging points), and match the 3D cooperative feature points P_i^T (i = 1, 2, 3, 4) on the target with their corresponding 2D imaging points p_i in the image;
TABLE 2 sub-Pixel coordinates of imaging points
Step (12), based on a PnP algorithm, perform pose calculation with the 4 pairs of 2D-3D matching points from step (11) to obtain the pose transformation T_i = [R_i | t_i] of the cooperative target coordinate system relative to the camera optical-center coordinate system. Table 3 shows the initial pose obtained in this measurement.
TABLE 3 initial pose measured
Step (2), measure the accurate depth value D_a with the laser range finder.
Step (3), use the accurate depth D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1), obtaining the corrected pose T_a = [R_a | t_a]; the specific process of step (3) is as follows:
step (31), in the camera coordinate system, the initial position t_i = [x_t^i, y_t^i, z_t^i]^T measured by the camera and the origin O = [0, 0, 0]^T of the camera optical-center coordinate system determine a straight line L_OT with equation

x / l = y / m = z / n,

where {l, m, n} are the direction parameters of the line;
Step (32), the conversion relation between the coordinate system of the laser range finder and the coordinate system of the optical center of the camera is calibrated in advance, and can be represented by a space three-dimensional point and a space direction vector:
O LO =[x L ,y L ,z L ] T
n LO =[n xL ,n yL ,n zL ] T
wherein x is L 、y L 、z L Respectively the coordinate value n of the origin position of the coordinate system of the laser range finder in the X, Y and Z directions of the camera coordinate system xL 、n yL 、n zL Euler rotation angle from the camera coordinate system to the laser rangefinder coordinate system.
Because the laser beam diverges at its far end, the laser spot is not an exact point but has a finite size; without loss of generality, assume the spot disperses into a circle of radius r. Since the representation of the range-finder coordinate system in the camera coordinate system and the distance measured by the laser range finder are known, the center point P = [x_r, y_r, z_r]^T of the spot can be computed, and the dispersion circle then satisfies

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_x (X - x_r) + n_y (Y - y_r) + n_z (Z - z_r) = 0

where [X, Y, Z]^T is a point of the circle in the camera coordinate system and [n_x, n_y, n_z]^T is the unit direction of the laser beam in that system.
step (33), find the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the coordinates of the intersection point are the corrected coordinates of the cooperative target position. The joint system combines the line equation with the plane of the dispersion circle:

x / l = y / m = z / n
n_x (x - x_r) + n_y (y - y_r) + n_z (z - z_r) = 0

Solving this system yields the corrected position coordinates t_a = [x_t^a, y_t^a, z_t^a]^T. Updating the displacement component of T_i = [R_i | t_i] with t_a gives the initially corrected pose T_a = [R_a | t_a]. Table 4 shows the pose corrected with this measurement value.
TABLE 4 pose obtained by correcting measured values
Step (4), further refine the corrected pose T_a = [R_a | t_a] from step (3) with the nonlinear optimization method; the specific optimization model is

T'_a = argmin over T = [R | t] of  sum_{i=1}^{4} || p_i - Pro(P_i^T) ||^2 + lambda (z_t - z_t^a)^2
Minimizing the above expression yields the more accurate pose T'_a = [R'_a | t'_a]. Table 5 shows the pose obtained by further optimizing the correction result, the pose measured by the AT460 laser range finder (which can be regarded as the true value), and the measurement error of this test.
Table 5 pose obtained by further optimizing the correction result
Details not described in the present invention belong to the prior art well known to those skilled in the art. The above description covers only specific embodiments of the present invention and does not limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its protection scope.

Claims (1)

1. A cooperative target remote high-precision six-degree-of-freedom pose measurement method is characterized by comprising the following steps:
step (1), obtain a cooperative target image and solve the initial pose T_i = [R_i | t_i] of the cooperative target with a PnP algorithm, where R_i represents the attitude information, t_i the displacement information, and the subscript i marks the initial result; the specific process of step (1) is as follows:
step (11), image the cooperative target with the camera, extract the sub-pixel coordinates p_i = (u_i, v_i) of the feature imaging centers with the gray-scale gravity-center method, and match the 2D imaging points with the corresponding 3D feature points;
step (12), based on a PnP algorithm, perform pose calculation with the 4 pairs of 2D-3D matching points from step (11) to obtain the initial pose T_i = [R_i | t_i] of the cooperative target;
Step (2), measure the accurate depth value D_a with a laser range finder;
Step (3), use the accurate depth D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1), obtaining the corrected pose T_a = [R_a | t_a], where the subscript a marks the corrected result;
the specific process of the step (3) is as follows:
step (31), in the camera coordinate system, the initial position t_i = [x_t^i, y_t^i, z_t^i]^T measured by the camera and the origin O = [0, 0, 0]^T of the camera optical-center coordinate system determine a straight line L_OT with equation

x / l = y / m = z / n,

where the superscript i marks the initial result and {l, m, n} are the direction parameters of the line;
step (32), the transformation between the laser-range-finder coordinate system and the camera optical-center coordinate system is calibrated in advance, and can be represented by a spatial three-dimensional point and a spatial direction:

O_LO = [x_L, y_L, z_L]^T
n_LO = [n_xL, n_yL, n_zL]^T

where x_L, y_L, z_L are the coordinates of the laser-range-finder origin along the X, Y, Z directions of the camera coordinate system, and n_xL, n_yL, n_zL are the Euler rotation angles from the camera coordinate system to the laser-range-finder coordinate system;
assuming that the laser spot disperses into a circle of radius r, and knowing the representation of the laser ray in the camera coordinate system and the distance value measured by the laser range finder, the center point P = [x_r, y_r, z_r]^T of the laser spot surface is obtained, and the dispersion circle then satisfies

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_x (X - x_r) + n_y (Y - y_r) + n_z (Z - z_r) = 0

where X, Y, Z are the coordinate values of points on the spatial circle along the X, Y, Z directions of the camera coordinate system, and [n_x, n_y, n_z]^T is the unit direction of the laser beam in that system;
step (33), find the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the intersection gives the corrected cooperative target position t_a = [x_t^a, y_t^a, z_t^a]^T; updating the displacement component of T_i = [R_i | t_i] with t_a yields the initially corrected pose T_a = [R_a | t_a];
Step (4), further refine the corrected pose T_a = [R_a | t_a] from step (3) with a nonlinear optimization method; the specific optimization model is

T'_a = argmin over T = [R | t] of  sum_{i=1}^{4} || p_i - Pro(P_i^T) ||^2 + lambda (z_t - z_t^a)^2

where p_i are the 2D imaging points in the image, Pro(P_i^T) is the reprojection function of the feature points on the spatial cooperative target, lambda is a weighting factor set to 5, z_t^a is the corrected Z-direction displacement, and z_t^i, the initial Z-direction displacement, serves as the starting value; minimizing the above expression yields the more accurate pose T'_a = [R'_a | t'_a].
CN202110501861.8A 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method Active CN113324538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110501861.8A CN113324538B (en) 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method


Publications (2)

Publication Number Publication Date
CN113324538A CN113324538A (en) 2021-08-31
CN113324538B (en) 2022-10-21

Family

ID=77415054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110501861.8A Active CN113324538B (en) 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method

Country Status (1)

Country Link
CN (1) CN113324538B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117284499B (en) * 2023-11-24 2024-01-19 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism

Citations (16)

Publication number Priority date Publication date Assignee Title
CN102063718A (en) * 2010-12-24 2011-05-18 江南大学 Field calibration and precision measurement method for spot laser measuring system
CN103837084A (en) * 2014-02-18 2014-06-04 浙江华东工程安全技术有限公司 Three-direction displacement measurement method based on laser speckle imaging technology
CN104111071A (en) * 2014-07-10 2014-10-22 上海宇航系统工程研究所 High-precision position posture calculating method based on laser ranging and camera visual fusion
CN104880178A (en) * 2015-06-01 2015-09-02 中国科学院光电技术研究所 Monocular vision pose measuring method based on tetrahedral side length and volume weighting constraint
JP6046852B1 (en) * 2016-07-06 2016-12-21 株式会社エクサ Shape analysis program
CN106441151A (en) * 2016-09-30 2017-02-22 中国科学院光电技术研究所 Measuring system for three-dimensional target Euclidean space reconstruction based on vision and active optical fusion
CN109448055A (en) * 2018-09-20 2019-03-08 中国科学院光电研究院 Monocular vision attitude determination method and system
CN110178045A (en) * 2016-11-17 2019-08-27 特里纳米克斯股份有限公司 Detector at least one object of optical detection
CN111121722A (en) * 2019-12-13 2020-05-08 南京理工大学 Binocular three-dimensional imaging method combining laser dot matrix and polarization vision
CN111199564A (en) * 2019-12-23 2020-05-26 中国科学院光电研究院 Indoor positioning method and device of intelligent mobile terminal and electronic equipment
CN111210468A (en) * 2018-11-22 2020-05-29 中移(杭州)信息技术有限公司 Image depth information acquisition method and device
CN111220126A (en) * 2019-11-19 2020-06-02 中国科学院光电技术研究所 Space object pose measurement method based on point features and monocular camera
CN111815712A (en) * 2020-06-24 2020-10-23 中国地质大学(武汉) High-precision camera-single laser combined calibration method
CN111964694A (en) * 2020-08-19 2020-11-20 天津大学 Laser range finder calibration method for three-dimensional measurement
CN111965807A (en) * 2020-09-18 2020-11-20 安徽科技学院 Optical aiming system, camera module and electronic equipment
CN112229323A (en) * 2020-09-29 2021-01-15 华南农业大学 Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
DE112005000738T5 (en) * 2004-03-29 2007-04-26 Evolution Robotics, Inc., Pasadena Method and device for determining position using reflected light sources
JP4644540B2 (en) * 2005-06-28 2011-03-02 富士通株式会社 Imaging device
US7855778B2 (en) * 2007-04-27 2010-12-21 Robert Bosch Company Limited Method and apparatus for locating and measuring the distance to a target
US9222771B2 (en) * 2011-10-17 2015-12-29 Kla-Tencor Corp. Acquisition of information for a construction site
TWI534451B (en) * 2012-04-27 2016-05-21 鴻海精密工業股份有限公司 3d laser location apparatus


Non-Patent Citations (6)

Title
Calibration of a 3D laser rangefinder and a camera based on optimization solution; Yi An et al.; Journal of Industrial and Management Optimization; 2021-02-28; pp. 427-445 *
Lidar-Monocular Visual Odometry; Johannes Graeter et al.; 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS); 2019-01-07; pp. 7872-7879 *
Mono-camera based calibration method for two-axes LRF measurement system; Langming Zhou et al.; SPIE; 2019-08-14; pp. 1-7 *
Monocular camera-laser ranging sensor pose measurement system; Chao Zhichao et al.; Acta Optica Sinica; 2011-03-15; Vol. 31, No. 3; pp. 1-7 *
Research on non-cooperative target reconstruction and pose measurement based on active-passive fusion; Zhang Zhuang; China Doctoral Dissertations Full-text Database, Basic Sciences; 2019-08-15; pp. A005-69 *
Inter-spacecraft pose parameter estimation based on multi-sensor fusion; Feng Chun et al.; Infrared and Laser Engineering; 2015-05-25; pp. 1616-1622 *

Also Published As

Publication number Publication date
CN113324538A (en) 2021-08-31

Similar Documents

Publication Publication Date Title
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN108362266B (en) Auxiliary monocular vision measurement method and system based on EKF laser ranging
CN108198219B (en) Error compensation method for camera calibration parameters for photogrammetry
CN109269512B (en) Relative navigation method for fusing planet landing image and distance measurement
CN108827264B (en) Mobile workbench and its mechanical arm optics target positioning device and localization method
CN112700501B (en) Underwater monocular subpixel relative pose estimation method
CN105091744A (en) Pose detection apparatus and method based on visual sensor and laser range finder
CN107300382B (en) Monocular vision positioning method for underwater robot
CN109373894B (en) Automatic positioning method for aircraft component connection intersection point hole system based on distributed monocular vision
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
CN110751123B (en) Monocular vision inertial odometer system and method
CN111879354A (en) Unmanned aerial vehicle measurement system that becomes more meticulous
CN114608554B (en) Handheld SLAM equipment and robot instant positioning and mapping method
CN113028990B (en) Laser tracking attitude measurement system and method based on weighted least square
Liu et al. A high-accuracy pose measurement system for robotic automated assembly in large-scale space
CN110211175B (en) Method for calibrating space pose of collimated laser beam
CN114459345B (en) Aircraft fuselage position and posture detection system and method based on visual space positioning
CN113324538B (en) Cooperative target remote high-precision six-degree-of-freedom pose measurement method
Cai et al. A novel measurement system based on binocular fisheye vision and its application in dynamic environment
CN113947638A (en) Image orthorectification method for fisheye camera
CN110260817B (en) Complex surface deflection measurement self-positioning method based on virtual mark points
CN104154931A (en) Optical machine positioning survey method of intersection survey system
CN102096918A (en) Calibration method of parameters of camera for rendezvous and docking
Kong et al. Online measurement method for assembly pose of gear structure based on monocular vision
CN113916128A (en) Method for improving precision based on optical pen type vision measurement system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant