CN113324538A - Cooperative target remote high-precision six-degree-of-freedom pose measurement method - Google Patents


Info

Publication number
CN113324538A
Authority
CN
China
Prior art keywords
pose
coordinate system
camera
initial
cooperative target
Prior art date
Legal status
Granted
Application number
CN202110501861.8A
Other languages
Chinese (zh)
Other versions
CN113324538B (en)
Inventor
赵汝进
朱自发
马跃博
颜坤
刘恩海
易晋辉
曾思康
Current Assignee
Institute of Optics and Electronics of CAS
Original Assignee
Institute of Optics and Electronics of CAS
Priority date
Filing date
Publication date
Application filed by Institute of Optics and Electronics of CAS
Priority to CN202110501861.8A
Publication of CN113324538A
Application granted
Publication of CN113324538B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 Interpretation of pictures
    • G01C3/00 Measuring distances in line of sight; Optical rangefinders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a remote high-precision six-degree-of-freedom pose measurement method for a cooperative target, which comprises the following steps: (1) acquiring an initial six-degree-of-freedom pose and an initial depth value of the cooperative target; (2) calculating the equation of the laser-spot circle; (3) solving the equation of the spatial straight line determined by two 3D points, namely the initial position and the camera optical center; (4) solving the intersection of the spot circle and the spatial straight line, and using it to correct the initial pose; (5) further refining the corrected result with a nonlinear optimization method. By means of the accurate depth value provided by the laser rangefinder, the method corrects the large error along the line-of-sight direction in monocular vision pose measurement, and at the same time eliminates the interference error caused by the dispersion of the rangefinder's laser spot.

Description

Cooperative target remote high-precision six-degree-of-freedom pose measurement method
Technical Field
The invention relates to the technical field of pose measurement based on multi-sensor fusion, in particular to a cooperative target long-distance high-precision six-degree-of-freedom pose measurement method.
Background
High-precision six-degree-of-freedom pose measurement of a target enables accurate and efficient machine operation, and is therefore widely applied in fields such as autonomous robotics, precision machining, and spacecraft docking. Monocular vision is a non-contact six-degree-of-freedom pose measurement method with low hardware complexity, high precision, and a wide measurement range, making it one of the most common six-degree-of-freedom pose measurement methods at present. However, owing to limitations of the model itself, the measurement accuracy along the camera's optical axis is usually much lower than that along the other two directions; according to the monocular three-point pose measurement accuracy analysis of Zhao et al. (see Optics and Precision Engineering, 2014, Vol. 22, No. 5), the accuracy along the optical axis is only one twentieth of that in the other two directions.
In the monocular camera-laser ranging sensor pose measurement system of Chao et al. (see Acta Optica Sinica, Vol. 31, No. 3, pp. 85-91), the depth value provided by the laser rangefinder directly replaces the line-of-sight displacement in the monocular vision pose measurement result; the error caused by laser spot dispersion is not considered, so the correction effect is limited and only millimeter-level precision is achieved.
Disclosure of Invention
Aiming at the large measurement error along the camera optical axis in existing monocular vision measurement, the invention provides a remote high-precision six-degree-of-freedom pose measurement method for a cooperative target based on multi-sensor fusion. The method corrects the initial pose measured by the camera with the accurate depth information obtained by a laser rangefinder, and eliminates the measurement error caused by laser spot dispersion. It establishes the laser-spot dispersion circle and the straight line through the initial measured position and the camera optical center, thereby converting the fusion of the two sensors into the geometric problem of intersecting a spatial circle with a spatial straight line. The computed intersection is the corrected measurement position; after correction, the displacement measurement accuracy along the line-of-sight direction is higher than that of the other two directions.
The technical scheme adopted by the invention is as follows: a cooperative target remote high-precision six-degree-of-freedom pose measurement method comprises the following steps:
step (1), obtaining a cooperative target image, and resolving an initial pose T_i = [R_i | t_i] of the cooperative target according to a PnP algorithm, where R represents attitude information, t represents displacement information, and the subscript i indicates the initial result;
the specific process of step (1) is as follows:
step (11), a camera images the cooperative target to obtain a target image, the sub-pixel coordinates p_i of the feature imaging center points are extracted by the gray-scale gravity-center method, and the 3D cooperative feature points P_i^T (i = 1, 2, 3, 4) on the cooperative target are matched with the corresponding 2D imaging points p_i on the picture;
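The gray-scale gravity-center extraction of step (11) can be sketched as follows. This is a generic illustration of the intensity-centroid technique, not code from the patent; the function name and the thresholding parameter are choices of this sketch:

```python
import numpy as np

def gray_centroid(patch, threshold=0.0):
    """Sub-pixel spot center of a gray-level patch via the
    gray-scale gravity-center (intensity centroid) method."""
    w = np.where(patch > threshold, patch.astype(float), 0.0)
    total = w.sum()
    if total == 0:
        raise ValueError("patch contains no signal above threshold")
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    # Intensity-weighted mean of pixel coordinates -> (x, y) sub-pixel center
    return (xs * w).sum() / total, (ys * w).sum() / total
```

For a symmetric spot the centroid lands on the geometric center; for an asymmetric spot it shifts toward the brighter side, which is what gives the sub-pixel resolution.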
step (12) based on a PnP algorithm, performing pose calculation on the 2D-3D matching points by using the 4 pairs in the step (11) to obtain a pose transformation relation T of a cooperative target coordinate system relative to a camera optical center coordinate systemi=[Ri|ti]。
Step (2), measuring the accurate depth information D_a with the laser rangefinder;
Step (3), using the accurate depth information D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1), obtaining the corrected pose T_a = [R_a | t_a], where the subscript a indicates the corrected result;
the specific process of step (3) is as follows:
step (31), in the camera coordinate system, the initial position coordinate t_i = [x^i, y^i, z^i]^T measured by the camera and the origin O = [0 0 0]^T of the camera optical-center coordinate system determine a straight line L_OT, whose equation is

X/l = Y/m = Z/n

where the superscript i indicates the initial result and {l, m, n} are the direction parameters of the line; since the line passes through the origin, {l, m, n} = {x^i, y^i, z^i};
step (32), the transformation between the laser rangefinder coordinate system and the camera optical-center coordinate system is calibrated in advance and can be represented by a spatial 3D point and a spatial direction vector:

O_LO = [x_L, y_L, z_L]^T
n_LO = [n_xL, n_yL, n_zL]^T

where x_L, y_L, z_L are the coordinates of the origin of the laser rangefinder coordinate system along the X, Y, and Z directions of the camera coordinate system, and n_xL, n_yL, n_zL are determined by the Euler rotation angles from the camera coordinate system to the laser rangefinder coordinate system.
Because the laser beam diverges at its far end, the laser spot is not an exact point but has a finite size; without loss of generality, assume the spot disperses into a circle of radius r. Since the representation of the laser rangefinder coordinate system in the camera coordinate system and the distance value measured by the rangefinder are known, the coordinate P = [x_r, y_r, z_r]^T of the center of the laser spot surface can be found, and the equation of the dispersion circle is then obtained from the conditions:

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_xL(X - x_r) + n_yL(Y - y_r) + n_zL(Z - z_r) = 0

that is, the circle is the intersection of the sphere of radius r centered at P with the plane through P perpendicular to the beam direction.
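The spot-center computation of step (32) can be illustrated as below, under the assumption that the calibrated vector n_LO acts as the unit direction of the laser beam in the camera frame, so the center is O_LO plus the measured distance along that direction. The patent only states that the center "can be found", so this closed form, and the function name, are assumptions of the sketch:

```python
import numpy as np

def spot_center(O_LO, n_LO, D_a):
    """Center of the laser spot in the camera frame.
    Assumption (not spelled out in the patent): n_LO is the beam
    direction expressed in the camera frame, and D_a is the distance
    measured by the rangefinder along that beam."""
    n = np.asarray(n_LO, float)
    n = n / np.linalg.norm(n)           # normalize to a unit direction
    return np.asarray(O_LO, float) + D_a * n
```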
step (33), solving for the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the obtained intersection coordinate is the corrected cooperative target position. The joint equations are:

X/l = Y/m = Z/n
(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_xL(X - x_r) + n_yL(Y - y_r) + n_zL(Z - z_r) = 0

Solving these equations yields the corrected position coordinate of the cooperative target t_a = [x^a, y^a, z^a]^T, where the superscript a indicates the corrected result; updating the displacement component of T_i = [R_i | t_i] with t_a gives the preliminarily corrected pose T_a = [R_a | t_a].
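Geometrically, steps (31)-(33) amount to intersecting the viewing ray through the camera optical center with the plane of the spot circle, then checking that the hit point lies within radius r of the spot center. A minimal numpy sketch of that geometry (function and argument names are illustrative, not from the patent):

```python
import numpy as np

def correct_position(t_i, P_r, n, r):
    """Intersect the viewing ray O->t_i (O = camera optical center at the
    origin) with the plane of the laser-spot circle (center P_r, normal n,
    radius r).  Returns the corrected target position, or raises if the
    ray is parallel to the plane or the hit point lies outside the spot."""
    d = np.asarray(t_i, float)              # ray direction: {l, m, n} = t_i
    P_r, n = np.asarray(P_r, float), np.asarray(n, float)
    denom = n @ d
    if abs(denom) < 1e-12:
        raise ValueError("viewing ray parallel to the spot plane")
    s = (n @ P_r) / denom                   # from the plane equation n.(X - P_r) = 0
    X = s * d                               # candidate corrected position
    if np.linalg.norm(X - P_r) > r:
        raise ValueError("intersection outside the laser spot circle")
    return X
```

Note how the correction leaves the direction of the position vector untouched and only rescales it: the camera's bearing measurement is trusted, while the depth is pulled onto the laser-determined plane.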
Step (4), further optimizing the corrected pose T_a = [R_a | t_a] from step (3) with a nonlinear optimization method; the optimization model is:

T'_a = argmin_T { Σ_{i=1}^{4} || p_i - pro(P_i^T) ||^2 + λ (z - z^a)^2 }

where p_i are the 2D imaging points on the picture, pro(P_i^T) is the reprojection function of the feature points on the spatial cooperative target, λ is a weighting factor typically set to 5, z^a is the corrected Z-direction displacement value, z^i is the initial Z-direction displacement value from which the optimization starts, and z is the Z-direction displacement of the pose being optimized; minimizing the above expression yields a more accurate pose T'_a = [R'_a | t'_a].
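The refinement of step (4) can be sketched with a simplified cost of the same shape: reprojection error plus λ times the squared deviation of the Z displacement from the laser-corrected value. For brevity this sketch holds the rotation fixed at identity, optimizes only the displacement, and uses plain numerical gradient descent; the patent's method refines the full pose, so this illustrates the cost structure only, and the focal length, names, and step sizes are assumptions:

```python
import numpy as np

def project(P, t, f=1000.0):
    """Pinhole projection of 3-D points P shifted by displacement t;
    focal length f in pixels, rotation fixed at identity for brevity."""
    Pc = P + t
    return f * Pc[:, :2] / Pc[:, 2:3]

def cost(P, p_obs, t, z_a, lam=5.0):
    """Reprojection error plus lam * (t_z - z_a)^2, mirroring the
    shape of the patent's optimization model."""
    res = project(P, t) - p_obs
    return (res ** 2).sum() + lam * (t[2] - z_a) ** 2

def refine_t(P, p_obs, t0, z_a, lam=5.0, lr=1e-5, iters=2000):
    """Minimize the cost over t by central-difference gradient descent
    (a simplified stand-in for the patent's nonlinear optimization)."""
    t = np.asarray(t0, float).copy()
    for _ in range(iters):
        g = np.zeros(3)
        for k in range(3):                  # numerical gradient, coordinate-wise
            e = np.zeros(3); e[k] = 1e-6
            g[k] = (cost(P, p_obs, t + e, z_a, lam)
                    - cost(P, p_obs, t - e, z_a, lam)) / 2e-6
        t -= lr * g
    return t
```

On synthetic data the cost decreases and the Z displacement is pulled toward the laser-derived value, since both the depth penalty and the reprojection term favor it.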
Compared with the prior art, the invention has the following advantages:
(1) it resolves the large line-of-sight error of existing monocular pose measurement methods;
(2) it eliminates the error caused by laser spot dispersion in existing camera and laser-rangefinder fusion methods;
(3) it offers extremely high precision, a small computational load, and high stability.
Drawings
FIG. 1 is a detailed flow chart of a remote high-precision six-degree-of-freedom pose measurement method for a cooperative target according to the present invention;
FIG. 2 is a target coordinate layout of the present invention;
FIG. 3 is an image point of a cooperative target feature of the present invention in the camera plane;
FIG. 4 is a schematic diagram of the fusion method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples.
The specific flow of the cooperative target remote high-precision six-degree-of-freedom pose measurement method is shown in figure 1; the test distance is set to 10 meters. The cooperative target design is shown in fig. 2, and the accurate coordinates of the six LEDs, calibrated with a three-coordinate measuring machine, are listed in table 1.
TABLE 1 exact coordinates of six LEDs
Figure BDA0003056718050000041
The method comprises the following specific steps:
step (1), obtaining a cooperative target image, and resolving an initial pose T_i = [R_i | t_i] of the cooperative target according to a PnP algorithm; the specific process of step (1) is as follows:
step (11), the camera images the target, and the sub-pixel coordinates p_i of the feature imaging center points are extracted by the gray-scale gravity-center method (fig. 3 shows the imaging of the cooperative target features in one experiment, and table 2 lists the sub-pixel image coordinates of the imaging points); the 3D cooperative feature points P_i^T (i = 1, 2, 3, 4) on the cooperative target are then matched with the corresponding 2D imaging points p_i on the picture;
TABLE 2 sub-Pixel coordinates of imaging points
Figure BDA0003056718050000044
Step (12), based on a PnP algorithm, pose calculation is performed on the 4 pairs of 2D-3D matching points from step (11) to obtain the pose transformation T_i = [R_i | t_i] of the cooperative target coordinate system relative to the camera optical-center coordinate system. Table 3 shows the initial pose obtained in this measurement.
TABLE 3 initial pose measured
Figure BDA0003056718050000051
Step (2), measuring the accurate depth information D_a with the laser rangefinder;
Step (3), using the accurate depth information D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1) and obtain the corrected accurate pose T_a = [R_a | t_a]; the specific process of step (3) is as follows:
step (31), in the camera coordinate system, the initial position coordinate t_i = [x^i, y^i, z^i]^T measured by the camera and the origin O = [0 0 0]^T of the camera optical-center coordinate system determine a straight line L_OT, whose equation is

X/l = Y/m = Z/n, with {l, m, n} = {x^i, y^i, z^i};
Step (32), the conversion relation between the laser range finder coordinate system and the camera optical center coordinate system is calibrated in advance, and can be represented by a space three-dimensional point and a space direction vector:
OLO=[xL,yL,zL]T
nLO=[nxL,nyL,nzL]T
wherein x isL、yL、zLThe coordinate values of the origin position of the coordinate system of the laser range finder in the X, Y and Z directions of the camera coordinate system, nxL、nyL、nzLEuler rotation angle from the camera coordinate system to the laser rangefinder coordinate system.
Because the laser beam diverges at its far end, the laser spot is not an exact point but has a finite size; without loss of generality, assume the spot disperses into a circle of radius r. Since the representation of the rangefinder coordinate system in the camera coordinate system and the distance value measured by the laser rangefinder are known, the coordinate P = [x_r, y_r, z_r]^T of the laser spot center can be determined, and the equation of the dispersion circle is then obtained from the conditions:

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_xL(X - x_r) + n_yL(Y - y_r) + n_zL(Z - z_r) = 0
step (33), solving for the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the obtained intersection coordinate is the corrected coordinate of the cooperative target position. The joint equations are:

X/l = Y/m = Z/n
(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_xL(X - x_r) + n_yL(Y - y_r) + n_zL(Z - z_r) = 0

Solving these equations yields the corrected position coordinate of the cooperative target t_a = [x^a, y^a, z^a]^T; updating the displacement component of T_i = [R_i | t_i] with t_a gives the preliminarily corrected pose T_a = [R_a | t_a]. Table 4 shows the pose corrected with this measurement.
TABLE 4 pose obtained by correcting measured values
Figure BDA0003056718050000063
Step (4), further optimizing the corrected pose T_a = [R_a | t_a] from step (3) with a nonlinear optimization method to improve accuracy; the optimization model is:

T'_a = argmin_T { Σ_{i=1}^{4} || p_i - pro(P_i^T) ||^2 + λ (z - z^a)^2 }

Minimizing the above expression yields a more accurate pose T'_a = [R'_a | t'_a]. Table 5 shows the pose obtained by further optimizing the correction result, the pose measured by the AT460 laser range finder (which can be regarded as the true value), and the measurement error.
Table 5 pose obtained by further optimizing the correction result
Figure BDA0003056718050000065
Contents not described in detail in the present invention belong to the prior art well known to those skilled in the field. The above description is only an exemplary embodiment of the present invention and should not be taken as limiting it; any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (1)

1. A cooperative target remote high-precision six-degree-of-freedom pose measurement method is characterized by comprising the following steps:
step (1), acquiring a cooperative target image, and resolving the initial pose T_i = [R_i | t_i] of the cooperative target according to a PnP algorithm, where R represents attitude information, t represents displacement information, and the subscript i indicates the initial result; the specific process of step (1) is as follows:
step (11), a camera images the cooperative target to obtain a target image, the sub-pixel coordinates p_i of the feature imaging center points are extracted by the gray-scale gravity-center method, and the 2D-3D point matching is performed;
step (12), based on a PnP algorithm, pose calculation is performed on the 4 pairs of 2D-3D matching points from step (11) to obtain the initial pose T_i = [R_i | t_i] of the cooperative target;
Step (2), measuring the accurate depth information D_a with a laser rangefinder;
Step (3), using the accurate depth information D_a from step (2) to correct the initial pose T_i = [R_i | t_i] from step (1), obtaining the corrected pose T_a = [R_a | t_a], where the subscript a indicates the corrected result;
the specific process of the step (3) is as follows:
step (31), in the camera coordinate system, the initial position coordinate t_i = [x^i, y^i, z^i]^T measured by the camera and the origin O = [0 0 0]^T of the camera optical-center coordinate system determine a straight line L_OT, whose equation is

X/l = Y/m = Z/n

where the superscript i indicates the initial result and {l, m, n} are the direction parameters of the line;
step (32), the transformation between the laser rangefinder coordinate system and the camera optical-center coordinate system is calibrated in advance and can be represented by a spatial 3D point and a spatial direction vector:

O_LO = [x_L, y_L, z_L]^T
n_LO = [n_xL, n_yL, n_zL]^T

where x_L, y_L, z_L are the coordinates of the origin of the laser rangefinder coordinate system along the X, Y, and Z directions of the camera coordinate system, and n_xL, n_yL, n_zL are determined by the Euler rotation angles from the camera coordinate system to the laser rangefinder coordinate system;
assuming that the laser spot disperses into a circle of radius r, and knowing the representation of the laser ray in the camera coordinate system and the distance value measured by the laser rangefinder, the coordinate P = [x_r, y_r, z_r]^T of the center of the laser spot surface is obtained, and the equation of the dispersion circle follows from the conditions:

(X - x_r)^2 + (Y - y_r)^2 + (Z - z_r)^2 = r^2
n_xL(X - x_r) + n_yL(Y - y_r) + n_zL(Z - z_r) = 0

where X, Y, and Z are the coordinates of points on the spatial circle along the X, Y, and Z directions of the camera coordinate system;
step (33), solving for the intersection of the straight line L_OT determined in step (31) with the circular surface determined in step (32); the resulting intersection is the corrected cooperative target position t_a = [x^a, y^a, z^a]^T; updating the displacement component of T_i = [R_i | t_i] with t_a gives the preliminarily corrected pose T_a = [R_a | t_a];
Step (4), further optimizing the corrected pose T_a = [R_a | t_a] from step (3) with a nonlinear optimization method; the optimization model is:

T'_a = argmin_T { Σ_{i=1}^{4} || p_i - pro(P_i^T) ||^2 + λ (z - z^a)^2 }

where p_i are the 2D imaging points on the picture, pro(P_i^T) is the reprojection function of the feature points on the spatial cooperative target, λ is a weighting factor typically set to 5, z^a is the corrected Z-direction displacement value, z^i is the initial Z-direction displacement value from which the optimization starts, and z is the Z-direction displacement of the pose being optimized; minimizing the above expression yields a more accurate pose T'_a = [R'_a | t'_a].
CN202110501861.8A 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method Active CN113324538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110501861.8A CN113324538B (en) 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110501861.8A CN113324538B (en) 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method

Publications (2)

Publication Number Publication Date
CN113324538A true CN113324538A (en) 2021-08-31
CN113324538B CN113324538B (en) 2022-10-21

Family

ID=77415054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110501861.8A Active CN113324538B (en) 2021-05-08 2021-05-08 Cooperative target remote high-precision six-degree-of-freedom pose measurement method

Country Status (1)

Country Link
CN (1) CN113324538B (en)


Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050213082A1 (en) * 2004-03-29 2005-09-29 Evolution Robotics, Inc. Methods and apparatus for position estimation using reflected light sources
US20060290781A1 (en) * 2005-06-28 2006-12-28 Fujitsu Limited Image obtaining apparatus
US20080266541A1 (en) * 2007-04-27 2008-10-30 Wai Lam Yung Method and apparatus for locating and measuring the distance to a target
CN102063718A (en) * 2010-12-24 2011-05-18 江南大学 Field calibration and precision measurement method for spot laser measuring system
US20130096873A1 (en) * 2011-10-17 2013-04-18 Kla-Tencor Corporation Acquisition of Information for a Construction Site
US20130286375A1 (en) * 2012-04-27 2013-10-31 Hon Hai Precision Industry Co., Ltd. High precise laser rangefinder
CN103837084A (en) * 2014-02-18 2014-06-04 浙江华东工程安全技术有限公司 Three-direction displacement measurement method based on laser speckle imaging technology
CN104111071A (en) * 2014-07-10 2014-10-22 上海宇航系统工程研究所 High-precision position posture calculating method based on laser ranging and camera visual fusion
CN104880178A (en) * 2015-06-01 2015-09-02 中国科学院光电技术研究所 Monocular vision pose measuring method based on tetrahedral side length and volume weighting constraint
JP6046852B1 (en) * 2016-07-06 2016-12-21 株式会社エクサ Shape analysis program
CN106441151A (en) * 2016-09-30 2017-02-22 中国科学院光电技术研究所 Measuring system for three-dimensional target Euclidean space reconstruction based on vision and active optical fusion
CN109448055A (en) * 2018-09-20 2019-03-08 中国科学院光电研究院 Monocular vision attitude determination method and system
CN110178045A (en) * 2016-11-17 2019-08-27 特里纳米克斯股份有限公司 Detector at least one object of optical detection
CN111121722A (en) * 2019-12-13 2020-05-08 南京理工大学 Binocular three-dimensional imaging method combining laser dot matrix and polarization vision
CN111199564A (en) * 2019-12-23 2020-05-26 中国科学院光电研究院 Indoor positioning method and device of intelligent mobile terminal and electronic equipment
CN111210468A (en) * 2018-11-22 2020-05-29 中移(杭州)信息技术有限公司 Image depth information acquisition method and device
CN111220126A (en) * 2019-11-19 2020-06-02 中国科学院光电技术研究所 Space object pose measurement method based on point features and monocular camera
CN111815712A (en) * 2020-06-24 2020-10-23 中国地质大学(武汉) High-precision camera-single laser combined calibration method
CN111965807A (en) * 2020-09-18 2020-11-20 安徽科技学院 Optical aiming system, camera module and electronic equipment
CN111964694A (en) * 2020-08-19 2020-11-20 天津大学 Laser range finder calibration method for three-dimensional measurement
CN112229323A (en) * 2020-09-29 2021-01-15 华南农业大学 Six-degree-of-freedom measurement method of checkerboard cooperative target based on monocular vision of mobile phone and application of six-degree-of-freedom measurement method


Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
JOHANNES GRAETER et al.: "Lidar-Monocular Visual Odometry", 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) *
LANGMING ZHOU et al.: "Mono-camera based calibration method for two-axes LRF measurement system", SPIE *
YI AN et al.: "Calibration of a 3D laser rangefinder and a camera based on optimization solution", Journal of Industrial and Management Optimization *
Feng Chun et al.: "Estimation of pose parameters between spacecraft based on multi-sensor fusion", Infrared and Laser Engineering *
Zhang Zhuang: "Research on reconstruction and pose measurement of non-cooperative targets based on active-passive fusion", China Doctoral Dissertations Full-text Database, Basic Sciences *
Chao Zhichao et al.: "Monocular camera-laser ranging sensor pose measurement system", Acta Optica Sinica *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117284499A (en) * 2023-11-24 2023-12-26 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism
CN117284499B (en) * 2023-11-24 2024-01-19 北京航空航天大学 Monocular vision-laser-based pose measurement method for spatial unfolding mechanism

Also Published As

Publication number Publication date
CN113324538B (en) 2022-10-21

Similar Documents

Publication Publication Date Title
CN111156998B (en) Mobile robot positioning method based on RGB-D camera and IMU information fusion
CN108362266B (en) Auxiliary monocular vision measurement method and system based on EKF laser ranging
CN102607457B (en) Measuring device and measuring method for large three-dimensional morphology based on inertial navigation technology
CN111775146A (en) Visual alignment method under industrial mechanical arm multi-station operation
CN108198219B (en) Error compensation method for camera calibration parameters for photogrammetry
CN108827264B (en) Mobile workbench and its mechanical arm optics target positioning device and localization method
CN109269512B (en) Relative navigation method for fusing planet landing image and distance measurement
CN112700501B (en) Underwater monocular subpixel relative pose estimation method
CN109373894B (en) Automatic positioning method for aircraft component connection intersection point hole system based on distributed monocular vision
CN114714356A (en) Method for accurately detecting calibration error of hand eye of industrial robot based on binocular vision
CN106157322B (en) A kind of camera installation site scaling method based on plane mirror
CN110751123B (en) Monocular vision inertial odometer system and method
CN111879354A (en) Unmanned aerial vehicle measurement system that becomes more meticulous
CN112819711B (en) Monocular vision-based vehicle reverse positioning method utilizing road lane line
CN114608554B (en) Handheld SLAM equipment and robot instant positioning and mapping method
CN113028990B (en) Laser tracking attitude measurement system and method based on weighted least square
CN115446836B (en) Visual servo method based on mixing of various image characteristic information
CN114459345B (en) Aircraft fuselage position and posture detection system and method based on visual space positioning
CN113947638A (en) Image orthorectification method for fisheye camera
Cai et al. A novel measurement system based on binocular fisheye vision and its application in dynamic environment
CN113324538B (en) Cooperative target remote high-precision six-degree-of-freedom pose measurement method
CN110260817B (en) Complex surface deflection measurement self-positioning method based on virtual mark points
CN102096918A (en) Calibration method of parameters of camera for rendezvous and docking
Kong et al. Online measurement method for assembly pose of gear structure based on monocular vision
CN112508933B (en) Flexible mechanical arm movement obstacle avoidance method based on complex space obstacle positioning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant