CN111127560B - Calibration method and system for trinocular vision system for three-dimensional reconstruction

Calibration method and system for trinocular vision system for three-dimensional reconstruction

Info

Publication number
CN111127560B
CN111127560B CN201911095822.1A CN201911095822A
Authority
CN
China
Prior art keywords
cameras
calibration
camera
control points
pairwise
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911095822.1A
Other languages
Chinese (zh)
Other versions
CN111127560A (en
Inventor
李学钧
戴相龙
蒋勇
王晓鹏
何成虎
杨政
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Haohan Information Technology Co ltd
Nantong Power Supply Co Of State Grid Jiangsu Electric Power Co
Original Assignee
Jiangsu Haohan Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Haohan Information Technology Co ltd filed Critical Jiangsu Haohan Information Technology Co ltd
Priority to CN201911095822.1A priority Critical patent/CN111127560B/en
Publication of CN111127560A publication Critical patent/CN111127560A/en
Application granted granted Critical
Publication of CN111127560B publication Critical patent/CN111127560B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration

Abstract

The embodiment of the invention relates to a calibration method and a calibration system for a trinocular vision system for three-dimensional reconstruction, wherein the method comprises the following steps: independently calibrating the internal parameters and distortion parameters of each camera in the trinocular vision system by adopting the Zhang calibration method; acquiring images through the independently calibrated cameras, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images; and calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images. According to the technical scheme, calibration of the trinocular vision system can be achieved, and the trinocular camera can obtain richer information, so that the three-dimensional reconstruction precision is higher than that of a binocular camera.

Description

Calibration method and system for trinocular vision system for three-dimensional reconstruction
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a calibration method and system for a trinocular vision system for three-dimensional reconstruction.
Background
Vision-based three-dimensional reconstruction is an important problem in the field of computer vision, with wide application in distance detection, scene reconstruction and other fields. The conventional three-dimensional scene reconstruction method is based on binocular vision, in which two vision sensors simulate human visual imaging to reconstruct a three-dimensional scene. To improve reconstruction accuracy, three-dimensional reconstruction based on trinocular vision is currently receiving increasing attention.
In multi-view three-dimensional reconstruction with several cameras, camera calibration is a precondition and a key problem. Compared with the traditional binocular vision system, a trinocular vision system involves three cameras, and its calibration is more complex.
Disclosure of Invention
The application aims to provide a calibration method and a calibration system for a trinocular vision system for three-dimensional reconstruction, which can realize calibration of the trinocular vision system.
To achieve the above object, the present application provides a calibration method for a trinocular vision system for three-dimensional reconstruction, the method comprising:
the internal parameters and distortion parameters of each camera in the trinocular vision system are independently calibrated by adopting a Zhang calibration method;
acquiring images through the cameras after independent calibration, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images;
and calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images.
Further, the internal parameters are used for representing internal structure parameters of the camera, and the distortion parameters are used for representing radial distortion and tangential distortion of the camera; the independent calibration of each camera is completed according to the following formula:
s · [u, v, 1]^T = K · [R | t] · [X, Y, Z, 1]^T
wherein
K = | fx  0   cx |
    | 0   fy  cy |
    | 0   0   1  |
denotes the camera's intrinsic parameter matrix, and
[R | t] = | r11  r12  r13  t1 |
          | r21  r22  r23  t2 |
          | r31  r32  r33  t3 |
denotes the extrinsic rotation-translation matrix; fx and fy denote the focal-length scale factors, cx and cy denote the principal-point coordinates, rij and tk denote the elements of the rotation matrix and of the translation vector respectively, s denotes a scale factor, (u, v) denotes the imaging coordinates of a pixel point, and (X, Y, Z) denotes the corresponding three-dimensional coordinates.
Further, calibrating each pair of cameras in the trinocular vision system based on the control points marked in the images comprises:
calculating SIFT feature descriptors on the control points, and matching corresponding control points in different images pairwise by using the Hamming distance to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments.
Further, after pairwise matching of corresponding control points in different images using Hamming distance, the method further comprises:
random sample consensus (RANSAC) is used to filter out false matches between control points.
Further, each pair of cameras is calibrated according to the following formula:
R = R2 · R1^(-1)
T = T2 - R2 · R1^(-1) · T1
wherein R and T are respectively the rotation matrix and translation vector representing the relative relationship between the left and right cameras, R1 and T1 are respectively the rotation matrix and translation vector relative to the calibration object obtained by independent calibration of the first camera, and R2 and T2 are respectively the rotation matrix and translation vector relative to the calibration object obtained by independent calibration of the second camera.
Further, calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration comprises the following steps:
simultaneously optimizing the internal parameters, external parameters and distortion parameters of the three cameras that have completed pairwise calibration by adopting a bundle adjustment parameter optimization method, eliminating distortion among the views of the cameras and realizing line alignment, so that the imaging origin coordinates of the left and right views are consistent, the optical axes of the cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, thereby completing the calibration of the trinocular camera.
Further, the three cameras of the trinocular camera are installed on the same horizontal line, the optical axes of the cameras are kept horizontal, and the middle camera is equidistant from the two cameras on the left and right sides.
Further, the bundle adjustment parameter optimization method solves the minimization problem represented by the following formula:

min_{aj, bi} Σ_{i=1}^{n} Σ_{j=1}^{m} vij · d(Q(aj, bi), xij)^2

where m = 3 is the number of images taken by the three cameras, n is the number of control points commonly visible in the three images, xij is the coordinate of the i-th control point on image j, and vij is an indicator value: vij = 1 if control point i has a projection on image j, otherwise vij = 0. Each image j is parameterized by a vector aj and each control point i by a vector bi; Q(aj, bi) denotes the coordinate of control point i on image j computed from these parameters, and d(p, q) denotes the Euclidean distance between the vectors p and q.
To achieve the above object, the present application further provides a calibration system for a trinocular vision system for three-dimensional reconstruction, the calibration system comprising:
the monocular calibration unit is used for independently calibrating the internal parameters and distortion parameters of each camera in the trinocular vision system by adopting the Zhang calibration method;
the binocular calibration unit is used for acquiring images through the independently calibrated cameras, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images;
and the trinocular calibration unit is used for calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images.
Further, the binocular calibration unit includes:
the attitude relationship determination module is used for calculating SIFT feature descriptors on the control points and matching corresponding control points in different images pairwise by using the Hamming distance, so as to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments.
From the above, compared with the common calibration technology of binocular vision systems, the calibration technology of the trinocular vision system for three-dimensional reconstruction provided by the invention adds a joint calibration of the three cameras on top of the pairwise binocular calibration, so that the interrelation of the cameras in the trinocular vision system can be established more accurately, accurate calibration of each camera in the trinocular vision system is realized, and higher three-dimensional reconstruction precision can be obtained by using the trinocular vision system.
Drawings
FIG. 1 is a step diagram of a calibration method of a trinocular vision system for three-dimensional reconstruction in an embodiment of the present application;
fig. 2 is a functional block diagram of a calibration system of a trinocular vision system for three-dimensional reconstruction in an embodiment of the present application.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art without any inventive work based on the embodiments in the present application shall fall within the scope of protection of the present application.
The present application provides a calibration method for a trinocular vision system for three-dimensional reconstruction. Referring to FIG. 1, the method includes the following steps.
S1: independently calibrating the internal parameters and distortion parameters of each camera in the trinocular vision system by adopting the Zhang calibration method.
S2: acquiring images through the independently calibrated cameras, and calibrating each pair of cameras in the trinocular vision system pairwise based on the control points marked in the images.
S3: calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images.
Specifically, the internal and distortion parameters of each camera can be individually calibrated using the well-known Zhang calibration method, which determines these parameters from checkerboard images. The internal parameters are the internal structural parameters of the camera; the distortion parameters describe radial distortion and tangential distortion.
Then, pairwise calibration between the cameras can be performed. Based on the images acquired by each camera, all control points in each image are marked, Scale-Invariant Feature Transform (SIFT) feature descriptors are calculated on the control points, the Hamming distance is used to match corresponding control points in different images pairwise, RANdom SAmple Consensus (RANSAC) is adopted to filter out false matches between the control points, and the attitude relationship between any two cameras is established.
The control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute, end points of line segments, and the like.
Finally, based on the control points in the images, a bundle adjustment parameter optimization method is adopted to simultaneously optimize the internal, external and distortion parameters of the three cameras, so that distortion elimination and line alignment among the views of the cameras are achieved: the imaging origin coordinates of the left and right views are consistent, the optical axes of the cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, completing the calibration of the trinocular camera. The three cameras are installed on the same horizontal line with their optical axes horizontal, and the middle camera is equidistant from the cameras on the left and right sides.
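The line alignment described here corresponds to standard stereo rectification of a camera pair. The sketch below is a minimal illustration using OpenCV, assuming the intrinsics, distortion coefficients and relative pose (R, T) from the preceding steps are available; the function choice and all variable names are assumptions of this example, not the patent's prescribed implementation.

```python
import cv2
import numpy as np

def rectify_pair(K1, d1, K2, d2, R, T, img_size):
    """Line-align one camera pair given its intrinsics and relative pose.

    K1, K2: 3x3 intrinsic matrices; d1, d2: distortion coefficient vectors;
    R, T: rotation and translation of camera 2 relative to camera 1;
    img_size: (width, height) of the images.
    """
    # Rectifying rotations and projection matrices: after remapping,
    # epipolar lines become horizontal and row-aligned across the views.
    R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
        K1, d1, K2, d2, img_size, R, T, alpha=0)
    # Remap tables that remove lens distortion and apply the rectification.
    map1x, map1y = cv2.initUndistortRectifyMap(K1, d1, R1, P1, img_size, cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(K2, d2, R2, P2, img_size, cv2.CV_32FC1)
    return (map1x, map1y), (map2x, map2y), Q

# Usage: rectified = cv2.remap(image, map1x, map1y, cv2.INTER_LINEAR)
```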
In a specific application example, the internal parameters are used for characterizing internal structural parameters of the cameras, the distortion parameters are used for characterizing radial distortion and tangential distortion of the cameras, and independent calibration of each camera is completed according to the following formula:
s · [u, v, 1]^T = K · [R | t] · [X, Y, Z, 1]^T
wherein
K = | fx  0   cx |
    | 0   fy  cy |
    | 0   0   1  |
denotes the camera's intrinsic parameter matrix, and
[R | t] = | r11  r12  r13  t1 |
          | r21  r22  r23  t2 |
          | r31  r32  r33  t3 |
denotes the extrinsic rotation-translation matrix; fx and fy denote the focal-length scale factors, cx and cy denote the principal-point coordinates, rij and tk denote the elements of the rotation matrix and of the translation vector respectively, s denotes a scale factor, (u, v) denotes the imaging coordinates of a pixel point, and (X, Y, Z) denotes the corresponding three-dimensional coordinates. Monocular calibration is realized by collecting a series of correspondences (u, v, X, Y, Z) and computing the optimal values of the remaining coefficients.
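As an illustration of this monocular step, the sketch below runs the Zhang calibration method via OpenCV on checkerboard images; the board dimensions, image folder and variable names are assumptions made for the example, not values taken from the patent.

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard: 9x6 inner corners, 25 mm squares (illustrative).
PATTERN = (9, 6)
SQUARE_MM = 25.0

# 3D control points of the board in its own plane (Z = 0).
obj_template = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
obj_template[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_points, img_points = [], []
for path in glob.glob("camera0/*.png"):          # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_points.append(obj_template)
        img_points.append(corners)

# Zhang calibration: recovers the intrinsic matrix K (fx, fy, cx, cy)
# and the distortion coefficients (radial and tangential).
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("reprojection error:", ret)
```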
In one embodiment, calibrating each pair of cameras in the trinocular vision system pairwise based on the control points marked in the images comprises:
calculating 128-dimensional SIFT feature descriptors on the control points, matching corresponding control points in different images pairwise by using the Hamming distance, and adopting random sample consensus (RANSAC) to filter out false matches between control points, so as to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments.
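A minimal sketch of this pairwise matching step with OpenCV follows. Note that OpenCV's floating-point SIFT descriptors are conventionally matched with the L2 norm, whereas the Hamming distance named above applies to binarized descriptors, so L2 is used here as a stand-in; all function and variable names are assumptions of the example.

```python
import cv2
import numpy as np

def match_control_points(img1, img2):
    """Match control points between two views and filter them with RANSAC."""
    # Detect keypoints (control points) and compute 128-dimensional SIFT descriptors.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Pairwise matching; L2 is the usual metric for float SIFT descriptors
    # (a Hamming matcher would require binarized descriptors).
    matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC on the fundamental matrix filters out false matches and
    # yields the epipolar geometry relating the two cameras.
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    inliers = inlier_mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers], F
```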
In one embodiment, each pair of cameras requires calibration of the relative relationship between the left and right camera coordinate systems, which is described by a rotation matrix R and a translation vector T. Specifically, a world coordinate system is established on camera 1. Suppose there is a point Q in space whose coordinates in the world coordinate system are Qw; its coordinates in the camera 1 and camera 2 coordinate systems can be expressed as:
Q1 = R1 · Qw + T1
Q2 = R2 · Qw + T2
Eliminating Qw further gives
Q2 = R2 · R1^(-1) · (Q1 - T1) + T2 = R2 · R1^(-1) · Q1 + T2 - R2 · R1^(-1) · T1
By comparing this with Q2 = R · Q1 + T, it can be seen that each pair of cameras calibrates the relative relationship between the left and right camera coordinate systems according to the following equations:
R = R2 · R1^(-1)
T = T2 - R2 · R1^(-1) · T1
wherein R and T are respectively the rotation matrix and translation vector representing the relative relationship between the left and right cameras, R1 and T1 are respectively the rotation matrix and translation vector relative to the calibration object obtained by independent calibration of the first camera, and R2 and T2 are respectively the rotation matrix and translation vector relative to the calibration object obtained by independent calibration of the second camera.
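A short numerical sketch of these two equations follows; the names are illustrative, and R1, T1, R2, T2 are assumed to come from the independent calibrations of the two cameras against a common calibration object.

```python
import numpy as np

def pairwise_extrinsics(R1, T1, R2, T2):
    """Relative pose of camera 2 with respect to camera 1.

    R1, R2: 3x3 rotation matrices of each camera relative to the
    calibration object; T1, T2: 3x1 translation vectors.
    """
    # R = R2 * R1^-1 ; for a rotation matrix the inverse is its transpose.
    R = R2 @ R1.T
    # T = T2 - R2 * R1^-1 * T1
    T = T2 - R @ T1
    return R, T
```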
In one embodiment, calibrating the trinocular camera by using a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration comprises the following steps:
simultaneously optimizing the internal parameters, external parameters and distortion parameters of the three cameras that have completed pairwise calibration by adopting a bundle adjustment parameter optimization method, eliminating distortion among the views of the cameras and realizing line alignment, so that the imaging origin coordinates of the left and right views are consistent, the optical axes of the cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, to complete the calibration of the trinocular camera.
In one embodiment, the three cameras of the trinocular camera are mounted on the same horizontal line, the optical axes of the cameras are kept horizontal, and the middle camera is equidistant from the two cameras on the left and right.
In one embodiment, the bundle adjustment parameter optimization method solves the minimization problem represented by the following equation:

min_{aj, bi} Σ_{i=1}^{n} Σ_{j=1}^{m} vij · d(Q(aj, bi), xij)^2

where m = 3 is the number of images taken by the three cameras, n is the number of control points commonly visible in the three images, xij is the coordinate of the i-th control point on image j, and vij is an indicator value: vij = 1 if control point i has a projection on image j, otherwise vij = 0. Each image j is parameterized by a vector aj and each control point i by a vector bi; Q(aj, bi) denotes the coordinate of control point i on image j computed from these parameters, and d(p, q) denotes the Euclidean distance between the vectors p and q. Minimizing this expression minimizes the reprojection errors of the n control points over the three images, thereby calibrating the three cameras.
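The sketch below illustrates this minimization with SciPy's least_squares. The parameterization (per-camera rotation vector, translation and focal length, plus the 3D control points) and all names are assumptions of this example; distortion terms are omitted for brevity, so this is a simplified stand-in for the full optimization described above.

```python
import numpy as np
from scipy.optimize import least_squares

def project(points3d, rvec, tvec, f, c):
    """Pinhole projection of 3D control points into one image."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        Rm = np.eye(3)
    else:
        k = rvec / theta
        K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
        # Rodrigues formula: R = I + sin(theta) K + (1 - cos(theta)) K^2
        Rm = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    cam = points3d @ Rm.T + tvec
    return f * cam[:, :2] / cam[:, 2:3] + c

def residuals(params, x_obs, v, c):
    """Stacked reprojection errors vij * (Q(aj, bi) - xij)."""
    n, m = v.shape
    cams = params[:m * 7].reshape(m, 7)      # aj = (rvec, tvec, focal length)
    pts = params[m * 7:].reshape(n, 3)       # bi = 3D control point
    res = []
    for j in range(m):
        rvec, tvec, f = cams[j, :3], cams[j, 3:6], cams[j, 6]
        proj = project(pts, rvec, tvec, f, c)
        res.append((v[:, j:j + 1] * (proj - x_obs[:, j])).ravel())
    return np.concatenate(res)

# x_obs: (n, m, 2) observed control-point coordinates, v: (n, m) visibility,
# params0: initial guess from the pairwise calibration, c: principal point.
# sol = least_squares(residuals, params0, args=(x_obs, v, c))
```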
Referring to FIG. 2, the present application further provides a calibration system for a trinocular vision system for three-dimensional reconstruction, the calibration system comprising:
the monocular calibration unit is used for independently calibrating the internal parameters and distortion parameters of each camera in the trinocular vision system by adopting the Zhang calibration method;
the binocular calibration unit is used for acquiring images through the independently calibrated cameras, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images;
and the trinocular calibration unit is used for calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images.
In one embodiment, the binocular calibration unit includes:
the attitude relationship determination module is used for calculating SIFT feature descriptors on the control points and matching corresponding control points in different images pairwise by using the Hamming distance, so as to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments.
From the above, compared with the common calibration technology of binocular vision systems, the calibration technology of the trinocular vision system for three-dimensional reconstruction provided by the invention adds a joint calibration of the three cameras on top of the pairwise binocular calibration, so that the interrelation of the cameras in the trinocular vision system can be established more accurately, accurate calibration of each camera in the trinocular vision system is realized, and higher three-dimensional reconstruction precision can be obtained by using the trinocular vision system. Meanwhile, the trinocular camera can obtain richer information, so the three-dimensional reconstruction precision is higher than that of a binocular camera.
The foregoing description of various embodiments of the present application is provided for the purpose of illustration to those skilled in the art. It is not intended to be exhaustive or to limit the invention to a single disclosed embodiment. As described above, various alternatives and modifications of the present application will be apparent to those skilled in the art to which the above-described technology pertains. Thus, while some alternative embodiments have been discussed in detail, other embodiments will be apparent or relatively easy to derive by those of ordinary skill in the art. This application is intended to cover all alternatives, modifications, and variations of the invention that have been discussed herein, as well as other embodiments that fall within the spirit and scope of the above-described application.

Claims (5)

1. A method for calibrating a trinocular vision system for three-dimensional reconstruction, the method comprising:
the internal parameters and distortion parameters of each camera in the trinocular vision system are independently calibrated by adopting a Zhang calibration method;
acquiring images through the cameras after independent calibration, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images;
calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images;
based on the control points marked in the image, pairwise calibrating each pair of cameras in the trinocular vision system comprises:
calculating SIFT feature descriptors on the control points, and matching corresponding control points in different images pairwise by using the Hamming distance to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments;
wherein calibrating the trinocular camera by adopting the bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration comprises the following steps:
simultaneously optimizing the internal parameters, external parameters and distortion parameters of the three cameras that have completed pairwise calibration by adopting the bundle adjustment parameter optimization method, eliminating distortion among the views of the cameras and realizing line alignment, so that the imaging origin coordinates of the left and right views are consistent, the optical axes of the cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, to complete the calibration of the trinocular camera;
the bundle adjustment parameter optimization method solves the minimization problem represented by the following formula:

min_{aj, bi} Σ_{i=1}^{n} Σ_{j=1}^{m} vij · d(Q(aj, bi), xij)^2

where m = 3 is the number of images taken by the three cameras, n is the number of control points commonly visible in the three images, xij is the coordinate of the i-th control point on image j, and vij is an indicator value: vij = 1 if control point i has a projection on image j, otherwise vij = 0; each image j is parameterized by a vector aj and each control point i by a vector bi; Q(aj, bi) denotes the coordinate of control point i on image j computed from these parameters, and d(p, q) denotes the Euclidean distance between the vectors p and q.
2. The method of claim 1, wherein the internal parameters are used to characterize internal structural parameters of the camera, and the distortion parameters are used to characterize radial distortion and tangential distortion of the camera; the independent calibration of each camera is completed according to the following formula:
s · [u, v, 1]^T = K · [R | t] · [X, Y, Z, 1]^T
wherein
K = | fx  0   cx |
    | 0   fy  cy |
    | 0   0   1  |
denotes the camera's intrinsic parameter matrix, and
[R | t] = | r11  r12  r13  t1 |
          | r21  r22  r23  t2 |
          | r31  r32  r33  t3 |
denotes the extrinsic rotation-translation matrix; fx and fy denote the focal-length scale factors, cx and cy denote the principal-point coordinates, rij and tk denote the elements of the rotation matrix and of the translation vector respectively, s denotes a scale factor, (u, v) denotes the imaging coordinates of a pixel point, and (X, Y, Z) denotes the corresponding three-dimensional coordinates.
3. The method of claim 1, wherein after pairwise matching of corresponding control points in different images using Hamming distance, the method further comprises:
random sample consensus (RANSAC) is used to filter out false matches between control points.
4. The method of claim 1, wherein the three cameras of the trinocular camera are installed on the same horizontal line, the optical axes of the cameras are kept horizontal, and the middle camera is equidistant from the two cameras on the left and right sides.
5. A calibration system for a trinocular vision system for three-dimensional reconstruction, the calibration system comprising:
the monocular calibration unit is used for independently calibrating the internal parameters and distortion parameters of each camera in the trinocular vision system by adopting the Zhang calibration method;
the binocular calibration unit is used for acquiring images through the independently calibrated cameras, and calibrating each pair of cameras in the trinocular vision system pairwise based on control points marked in the images;
the trinocular calibration unit is used for calibrating the trinocular camera by adopting a bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration, based on the control points in the images;
the binocular calibration unit comprises:
the attitude relationship determination module is used for calculating SIFT feature descriptors on the control points and matching corresponding control points in different images pairwise by using the Hamming distance, so as to establish the attitude relationship between the two cameras; the control points are pixel points whose attributes are particularly prominent in some respect in the image, such as isolated points with maximum or minimum intensity in some attribute and the end points of line segments;
wherein calibrating the trinocular camera by adopting the bundle adjustment parameter optimization method for the cameras that have completed pairwise calibration comprises the following steps:
simultaneously optimizing the internal parameters, external parameters and distortion parameters of the three cameras that have completed pairwise calibration by adopting the bundle adjustment parameter optimization method, eliminating distortion among the views of the cameras and realizing line alignment, so that the imaging origin coordinates of the left and right views are consistent, the optical axes of the cameras are parallel, the left and right imaging planes are coplanar, and the epipolar lines are aligned, to complete the calibration of the trinocular camera;
the bundle adjustment parameter optimization method solves the minimization problem represented by the following formula:

min_{aj, bi} Σ_{i=1}^{n} Σ_{j=1}^{m} vij · d(Q(aj, bi), xij)^2

where m = 3 is the number of images taken by the three cameras, n is the number of control points commonly visible in the three images, xij is the coordinate of the i-th control point on image j, and vij is an indicator value: vij = 1 if control point i has a projection on image j, otherwise vij = 0; each image j is parameterized by a vector aj and each control point i by a vector bi; Q(aj, bi) denotes the coordinate of control point i on image j computed from these parameters, and d(p, q) denotes the Euclidean distance between the vectors p and q.
CN201911095822.1A 2019-11-11 2019-11-11 Calibration method and system for trinocular vision system for three-dimensional reconstruction Active CN111127560B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911095822.1A CN111127560B (en) 2019-11-11 2019-11-11 Calibration method and system for trinocular vision system for three-dimensional reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911095822.1A CN111127560B (en) 2019-11-11 2019-11-11 Calibration method and system for trinocular vision system for three-dimensional reconstruction

Publications (2)

Publication Number Publication Date
CN111127560A CN111127560A (en) 2020-05-08
CN111127560B true CN111127560B (en) 2022-05-03

Family

ID=70495523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911095822.1A Active CN111127560B (en) 2019-11-11 2019-11-11 Calibration method and system for trinocular vision system for three-dimensional reconstruction

Country Status (1)

Country Link
CN (1) CN111127560B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112361959B (en) * 2020-11-06 2022-02-22 西安新拓三维光测科技有限公司 Method and system for correcting coordinate of coding point for measuring motion attitude of helicopter blade and computer-readable storage medium
CN115731303B (en) * 2022-11-23 2023-10-27 江苏濠汉信息技术有限公司 Large-span transmission conductor sag three-dimensional reconstruction method based on bidirectional binocular vision

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102034238A (en) * 2010-12-13 2011-04-27 西安交通大学 Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN108230397A (en) * 2017-12-08 2018-06-29 深圳市商汤科技有限公司 Multi-lens camera is demarcated and bearing calibration and device, equipment, program and medium
CN108805939A (en) * 2018-06-19 2018-11-13 河海大学常州校区 The caliberating device and method of trinocular vision system based on statistics feature

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011143813A1 (en) * 2010-05-19 2011-11-24 深圳泰山在线科技有限公司 Object projection method and object projection system
CN101876533B (en) * 2010-06-23 2011-11-30 北京航空航天大学 Microscopic stereovision calibrating method
WO2015024407A1 (en) * 2013-08-19 2015-02-26 国家电网公司 Power robot based binocular vision navigation system and method based on
CN103914874B (en) * 2014-04-08 2017-02-01 中山大学 Compact SFM three-dimensional reconstruction method without feature extraction

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102034238A (en) * 2010-12-13 2011-04-27 西安交通大学 Multi-camera system calibrating method based on optical imaging test head and visual graph structure
CN108230397A (en) * 2017-12-08 2018-06-29 深圳市商汤科技有限公司 Multi-lens camera is demarcated and bearing calibration and device, equipment, program and medium
CN108805939A (en) * 2018-06-19 2018-11-13 河海大学常州校区 The caliberating device and method of trinocular vision system based on statistics feature

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Multi-camera calibration with a one-dimensional calibration object based on a normalization algorithm; Quan Yanming et al.; Acta Optica Sinica; 2019-04-30; Vol. 39, No. 04; 0415001-1 to 0415001-8 *

Also Published As

Publication number Publication date
CN111127560A (en) 2020-05-08

Similar Documents

Publication Publication Date Title
CN108734744B (en) Long-distance large-view-field binocular calibration method based on total station
CN108510551B (en) Method and system for calibrating camera parameters under long-distance large-field-of-view condition
Zeller et al. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry
JP2010513907A (en) Camera system calibration
CN109579695B (en) Part measuring method based on heterogeneous stereoscopic vision
CN109544628B (en) Accurate reading identification system and method for pointer instrument
CN110009690A (en) Binocular stereo vision image measuring method based on polar curve correction
CN110458952B (en) Three-dimensional reconstruction method and device based on trinocular vision
CN104537707A (en) Image space type stereo vision on-line movement real-time measurement system
CN110874854B (en) Camera binocular photogrammetry method based on small baseline condition
CN110223355B (en) Feature mark point matching method based on dual epipolar constraint
CN111127560B (en) Calibration method and system for three-dimensional reconstruction binocular vision system
CN116129037B (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
Perdigoto et al. Calibration of mirror position and extrinsic parameters in axial non-central catadioptric systems
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN110675436A (en) Laser radar and stereoscopic vision registration method based on 3D feature points
CN112686961A (en) Method and device for correcting calibration parameters of depth camera
CN111798507A (en) Power transmission line safety distance measuring method, computer equipment and storage medium
WO2022218161A1 (en) Method and apparatus for target matching, device, and storage medium
CN115797461A (en) Flame space positioning system calibration and correction method based on binocular vision
US20240054662A1 (en) Capsule endoscope image three-dimensional reconstruction method, electronic device, and readable storage medium
CN115375745A (en) Absolute depth measurement method based on polarization microlens light field image parallax angle
KR20230137937A (en) Device and method for correspondence analysis in images
CN110992463B (en) Three-dimensional reconstruction method and system for sag of transmission conductor based on three-eye vision
CN110487254B (en) Rapid underwater target size measuring method for ROV

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231227

Address after: Jianghai dream Valley, no.998, Century Avenue, high tech Zone, Nantong City, Jiangsu Province, 226300

Patentee after: JIANGSU HAOHAN INFORMATION TECHNOLOGY Co.,Ltd.

Patentee after: Nantong Power Supply Company of State Grid Jiangsu Electric Power Company

Address before: Jianghai dream Valley, no.998, Century Avenue, high tech Zone, Nantong City, Jiangsu Province, 226300

Patentee before: JIANGSU HAOHAN INFORMATION TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right