CN117115268A - Binocular camera calibration method, system and device - Google Patents

Binocular camera calibration method, system and device

Publication number: CN117115268A
Authority: CN (China)
Prior art keywords: calibration, camera, coordinate system, coordinates, pixel
Legal status: Pending
Application number: CN202311075014.5A
Other languages: Chinese (zh)
Inventors: 李忠辉, 曹志强, 公续荣, 管培育, 刘希龙, 亢晋立
Assignees (current and original): Beijing Nengchuang Technology Co ltd; Institute of Automation of Chinese Academy of Science
Application filed by Beijing Nengchuang Technology Co ltd and the Institute of Automation of Chinese Academy of Science; priority to CN202311075014.5A; publication of CN117115268A; legal status pending.
Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G06T7/70 Determining position or orientation of objects or cameras


Abstract

The application belongs to the field of machine vision and specifically relates to a binocular camera calibration method, system and device, aiming to solve the problem that adding a transparent baffle causes uneven refraction and makes binocular camera calibration difficult. The method comprises: acquiring images of calibration plates placed in different poses with the left and right cameras of the binocular camera; combining the coordinates of the center of each effective circular spot on each calibration plate, expressed in the corresponding calibration plate coordinate system, with the corresponding four-dimensional image coordinates to form control point sets; constructing intersection point sets and fitting them to obtain the corresponding intersection line equations; taking any calibration plate coordinate system as the reference coordinate system and solving the pose transformation matrices of the other calibration plate coordinate systems relative to it; and taking the pixel projection lines of all pixels of the left and right cameras as the calibration result, thereby realizing the calibration of the binocular camera. The application effectively solves the difficulty of calibrating a binocular camera caused by the addition of a transparent baffle.

Description

Binocular camera calibration method, system and device
Technical Field
The application belongs to the field of machine vision, and particularly relates to a binocular camera calibration method, a binocular camera calibration system and a binocular camera calibration device.
Background
The development of detection technology has significantly improved the quality of industrial production; the surface quality inspection of precision workpieces in particular is of great importance. With the development of computer vision technology, binocular camera measurement has become an important means of non-contact measurement, and its measurement accuracy depends on the calibration accuracy. Binocular camera calibration has therefore been studied extensively. A binocular camera combines two monocular cameras with a fixed pose relationship, so its calibration is usually built on monocular camera calibration. For monocular cameras, the prior art generally represents the imaging process with a pinhole imaging model, an orthographic projection model, or the like; to improve calibration accuracy, researchers additionally calibrate distortion parameters related to lens distortion on top of the imaging model. For a binocular camera, a common approach is to calibrate the two monocular cameras separately and then determine the pose relationship between them. Such a staged calibration method lets the calibration error of each monocular camera propagate into the determination of the pose relationship between the two cameras, reducing the calibration accuracy of the binocular camera. Moreover, binocular cameras operating in complex industrial environments such as oil and gas, smoke, and dust typically require transparent baffles for protection. A transparent baffle refracts the optical path, and the refraction tends to be uneven; this change in the camera imaging model makes calibration difficult.
For this reason, those skilled in the art need a way to calibrate the binocular camera when the imaging model has changed, so as to solve the problem that adding a transparent baffle causes uneven refraction and thereby makes binocular camera calibration difficult.
Disclosure of Invention
In order to solve the above problems in the prior art, that is, to solve the problem that the addition of a transparent baffle causes uneven refraction, thereby making the calibration of a binocular camera difficult, the application provides a calibration method of the binocular camera, comprising the following steps:
step S10, acquiring images of calibration plates placed in different postures through a left camera and a right camera of a binocular camera, and taking the images as input images;
step S20, respectively establishing a calibration plate coordinate system for calibration plates with different postures, acquiring the coordinates of the centers of all effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extracting four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
step S30, respectively searching two groups of control points with Euclidean distance of four-dimensional pixel coordinates smaller than a set distance threshold value in the two control point sets, and forming an intersection point set of intersection lines between calibration plates corresponding to the two control point sets; fitting each intersection point set to obtain a corresponding intersection equation;
step S40, combining each intersection equation, taking any calibration plate coordinate system as a reference coordinate system, respectively calculating the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solving the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
step S50, according to the control point sets corresponding to the calibration plates in each pose, respectively obtaining the image coordinates of the centers of the four effective circular spots nearest to each pixel of the left camera and the right camera of the binocular camera in each calibration plate image; calculating by interpolation the coordinates of the intersection points of the pixel projection line of each pixel of the left camera with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system, and converting them into the reference coordinate system to obtain the first intersection point coordinate set corresponding to each pixel of the left camera; calculating by interpolation the coordinates of the intersection points of the pixel projection line of each pixel of the right camera with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system, and converting them into the reference coordinate system to obtain the second intersection point coordinate set corresponding to each pixel of the right camera; performing straight-line fitting on the coordinates in the first intersection point coordinate set corresponding to each pixel of the left camera, and taking the fitted straight line as that pixel's projection line; performing straight-line fitting on the coordinates in the second intersection point coordinate set corresponding to each pixel of the right camera, and taking the fitted straight line as that pixel's projection line; and taking the pixel projection lines of all pixels of the left camera and the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
In some preferred embodiments, images of calibration plates placed in different poses are acquired by a left camera and a right camera of a binocular camera as input images by:
the left camera and the right camera of the binocular camera are used for collecting images of calibration plates with different postures, and under the calibration plates with each posture, the left camera and the right camera of the binocular camera respectively collect one image and combine the images into a pair, and the images of the set pair are obtained together and are used as input images.
In some preferred embodiments, the four-dimensional image coordinates are obtained by:
and obtaining image coordinates corresponding to the centers of all the effective circular spots in each image of the input image through a findCirclesGrid function of an OpenCV library, and combining the image coordinates corresponding to the centers of all the effective circular spots in each pair of input images to form four-dimensional image coordinates.
In some preferred embodiments, any calibration plate coordinate system is taken as a reference coordinate system, and the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system is calculated respectively, so as to obtain the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system, and the method comprises the following steps:
calibration plate coordinate system corresponding to calibration plate with first postureOther calibration plate coordinate System for reference coordinate System +.>XY plane of +.>Normal vector of (2) in the reference coordinate system +.>The following expression 1 n m The calculation is as follows:
1 n 21 I 121 I 23 /| 1 I 121 I 23 |
1 n 31 I 131 I 23 /| 1 I 131 I 23 |
wherein ∈A represents an antisymmetric operation +. 1 I 121 I 23 I is vector 1 I 121 I 23 Is of the modular length | 1 I 131 I 23 I is vector 1 I 131 I 23 Is used for the mold length of the mold, 1 I 23 is l 23 In a calibration plate coordinate systemAccording to the direction vector of (3)Solving, T represents the transposition operation, m I ij representing intersection line l ij Direction vector in calibration plate coordinate system, i, j=1, 2,3, i<j, m represents the number of coordinate systems of the calibration plate, m=1, 2,3;
calibration plate coordinate systemRelative to->Pose transformation matrix of (a) 1 T 2 The method comprises the following steps:
1 T 2 =[ 1 I 12 , 1 n 21 I 12 , 1 n 2 , 1 O][ 2 I 12 ,e z2 I 12 ,e z , 2 O] -1
wherein [ among others ] 2 I 12 ,e z ∧2 I 12 ,e z , 2 O] -1 Is a matrix of [ 2 I 12 ,e z2 I 12 ,e z , 2 O]Performing inversion operation, e z Is a unit vector of [0, 1 ]] T1 O is l 12 And l 13 In plane surfaceCoordinates of the intersection point on the road byThe solution is carried out to obtain the product, m A ijm B ijm C ij coefficients of the x term, coefficients of the y term, constant term, i, j=1, 2,3, and i of the intersecting line equation, respectively<j, m=i or m=j; 2 o is l 12 And l 23 In plane->The coordinates of the intersection point on the upper part by +.>Solving to obtain;
calibration plate coordinate systemRelative to->Pose transformation matrix of (a) 1 T 3 The method comprises the following steps:
1 T 3 =[ 1 I 13 , 1 n 3 ∧1 I 13 , 1 n 3 , 1 O][ 3 I 13 ,e z3 I 13 ,e z , 3 O] -1
wherein [ among others ] 3 I 13 ,e z3 I 13 ,e z , 3 O] -1 Is a matrix of [ 3 I 13 ,e z3 I 13 ,e z , 3 O]Performing inversion operation; 3 o is l 13 And l 23 In plane surfaceThe coordinates of the intersection point on the upper part by +.>Solving to obtain the final product.
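A numpy sketch of this pose construction: each plate frame is expressed as a matrix whose columns are the shared intersection-line direction, its in-plane perpendicular, the plane normal, and an in-plane reference point, and the relative pose is the product of one frame with the inverse of the other. The 4×4 homogeneous padding is an assumption (the text lists four 3-vector columns without stating the padding), and the function name is illustrative:

```python
import numpy as np

def pose_from_intersection_lines(l1, n1, O1, l2, O2):
    """Recover 1T2 = [1I, 1n ^ 1I, 1n, 1O] . [2I, e_z ^ 2I, e_z, 2O]^-1.

    l1, n1, O1: intersection-line direction, plate-2 normal, and line
                intersection point, all expressed in reference frame 1.
    l2, O2:     the same line direction and point expressed in frame 2,
                where the plate normal is simply e_z = (0, 0, 1).
    """
    def frame(l, n, O):
        F = np.eye(4)
        F[:3, 0] = l
        F[:3, 1] = np.cross(n, l)   # the "wedge" (cross-product) column
        F[:3, 2] = n
        F[:3, 3] = O
        return F

    e_z = np.array([0.0, 0.0, 1.0])  # plate normal in its own frame
    return frame(l1, n1, O1) @ np.linalg.inv(frame(l2, e_z, O2))
```

For a rigid motion, the returned matrix carries the rotation in its upper-left 3×3 block and the translation in its last column.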
In some preferred embodiments, the coordinates of the intersection points of the pixel projection line of any pixel of the left camera or the right camera with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system, are calculated as follows:

according to the control point set corresponding to the calibration plate in each pose, obtain the image coordinates $({}^{m}u_{lb}, {}^{m}v_{lb})$, $b = 1, 2, 3, 4$, $m = 1, 2, 3$, of the centers of the four effective circular spots nearest to pixel $px_l$ of the left camera in each calibration plate image;

obtain from the control point set the coordinates $({}^{m}x_b, {}^{m}y_b, 0)$ of the center of the effective circular spot corresponding to $({}^{m}u_{lb}, {}^{m}v_{lb})$ in the calibration plate coordinate system ${}^{m}F$;

based on the pixel coordinates $(u_l, v_l)$ of pixel $px_l$, $({}^{m}u_{lb}, {}^{m}v_{lb})$ and $({}^{m}x_b, {}^{m}y_b, 0)$, calculate by interpolation the coordinates of the intersection point of the pixel projection line of $px_l$ with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system;

according to the control point set corresponding to the calibration plate in each pose, obtain the image coordinates $({}^{m}u_{rc}, {}^{m}v_{rc})$, $c = 1, 2, 3, 4$, of the centers of the four effective circular spots nearest to pixel $px_r$ of the right camera in each calibration plate image;

obtain from the control point set the coordinates $({}^{m}x_c, {}^{m}y_c, 0)$ of the center of the effective circular spot corresponding to $({}^{m}u_{rc}, {}^{m}v_{rc})$ in the calibration plate coordinate system ${}^{m}F$;

based on the pixel coordinates $(u_r, v_r)$ of pixel $px_r$, $({}^{m}u_{rc}, {}^{m}v_{rc})$ and $({}^{m}x_c, {}^{m}y_c, 0)$, calculate by interpolation the coordinates of the intersection point of the pixel projection line of $px_r$ with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system.
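A minimal sketch of such an interpolation step. The application only says "interpolation" without fixing the scheme, so inverse-distance weighting over the four nearest spot centers is an assumption here, and the function name is illustrative:

```python
import numpy as np

def interpolate_plate_point(pixel_uv, spot_uv, spot_xy):
    """Estimate where a pixel's projection line meets the plate plane (z = 0)
    from the four nearest circular-spot centers.

    pixel_uv: (2,) pixel coordinate (u, v) of the query pixel
    spot_uv:  (4, 2) image coordinates of the four nearest spot centers
    spot_xy:  (4, 2) plate-frame (x, y) coordinates of those same centers
    Returns the interpolated point (x, y, 0) in the plate frame.
    """
    p = np.asarray(pixel_uv, dtype=float)
    uv = np.asarray(spot_uv, dtype=float)
    xy = np.asarray(spot_xy, dtype=float)
    d = np.linalg.norm(uv - p, axis=1)
    if np.any(d < 1e-9):                 # pixel coincides with a spot center
        xy0 = xy[np.argmin(d)]
    else:                                # inverse-distance weighted average
        w = 1.0 / d
        xy0 = (w[:, None] * xy).sum(axis=0) / w.sum()
    return np.array([xy0[0], xy0[1], 0.0])
```

Bilinear interpolation over the local spot grid would be an equally plausible realization; the essential point is mapping image-plane proximity to plate-plane coordinates.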
In a second aspect of the present application, a binocular camera calibration system is provided, comprising:
an image acquisition module configured to acquire images of calibration plates placed in different postures through a left camera and a right camera of the binocular camera as input images;
the control point extraction module is configured to respectively establish a calibration plate coordinate system for calibration plates with different postures, acquire the coordinates of the centers of all the effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extract four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
the intersection line extraction module is configured to search two groups of control points, of which the Euclidean distance of four-dimensional pixel coordinates is smaller than a set distance threshold value, in the two control point sets respectively, and form an intersection line point set of intersection lines between the calibration plates corresponding to the two control point sets; fitting each intersection point set to obtain a corresponding intersection equation;
the pose matrix solving module is configured to combine each intersecting line equation, take any calibration plate coordinate system as a reference coordinate system, respectively calculate the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solve the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
the camera calibration module is configured to obtain image coordinates of centers of four nearest effective circular spots of pixels in a left camera and a right camera of the binocular camera on each calibration plate image respectively according to control point sets corresponding to the calibration plates in each gesture, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the left camera and planes of the calibration plates in each gesture under a corresponding calibration plate coordinate system by interpolation, and convert the coordinates into a reference coordinate system to obtain a first intersection point coordinate set corresponding to the pixels in the left camera, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the right camera and the planes of the calibration plates in each gesture under the corresponding calibration plate coordinate system by interpolation, and convert the coordinates into the reference coordinate system to obtain a second intersection point coordinate set corresponding to the pixels in the right camera; performing straight line fitting on coordinates in a first intersection point coordinate set corresponding to each pixel in the left camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the left camera; performing straight line fitting on coordinates in a second intersection point coordinate set corresponding to each pixel in the right camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the right camera; and taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
In a third aspect of the present application, a storage device is provided, in which a plurality of programs are stored, the programs being adapted to be loaded and executed by a processor to implement a binocular camera calibration method as described above.
In a fourth aspect of the present application, a processing device is provided, including a processor and a storage device; a processor adapted to execute each program; a storage device adapted to store a plurality of programs; the program is adapted to be loaded and executed by a processor to implement a binocular camera calibration method as described above.
The application has the beneficial effects that:
the application can effectively solve the problem of difficult calibration of the binocular camera caused by the introduction of the transparent baffle plate, and provides technical support for the calibration of the binocular camera in complex industrial scenes.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the accompanying drawings in which:
fig. 1 is a flow chart of a binocular camera calibration method of the present application.
FIG. 2 is a schematic diagram of a binocular camera calibration system of the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be noted that, for convenience of description, only the portions related to the present application are shown in the drawings.
It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other. The application will be described in detail below with reference to the drawings in connection with embodiments.
The application relates to a binocular camera calibration method, as shown in fig. 1, which comprises the following steps:
step S10, acquiring images of calibration plates placed in different postures through a left camera and a right camera of a binocular camera, and taking the images as input images;
step S20, respectively establishing a calibration plate coordinate system for calibration plates with different postures, acquiring the coordinates of the centers of all effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extracting four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
step S30, respectively searching two groups of control points with Euclidean distance of four-dimensional pixel coordinates smaller than a set distance threshold value in the two control point sets, and forming an intersection point set of intersection lines between calibration plates corresponding to the two control point sets; fitting each intersection point set to obtain a corresponding intersection equation;
step S40, combining each intersection equation, taking any calibration plate coordinate system as a reference coordinate system, respectively calculating the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solving the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
step S50, according to the control point sets corresponding to the calibration plates in each pose, respectively obtaining the image coordinates of the centers of the four effective circular spots nearest to each pixel of the left camera and the right camera of the binocular camera in each calibration plate image; calculating by interpolation the coordinates of the intersection points of the pixel projection line of each pixel of the left camera with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system, and converting them into the reference coordinate system to obtain the first intersection point coordinate set corresponding to each pixel of the left camera; calculating by interpolation the coordinates of the intersection points of the pixel projection line of each pixel of the right camera with the plane of the calibration plate in each pose, expressed in the corresponding calibration plate coordinate system, and converting them into the reference coordinate system to obtain the second intersection point coordinate set corresponding to each pixel of the right camera; performing straight-line fitting on the coordinates in the first intersection point coordinate set corresponding to each pixel of the left camera, and taking the fitted straight line as that pixel's projection line; performing straight-line fitting on the coordinates in the second intersection point coordinate set corresponding to each pixel of the right camera, and taking the fitted straight line as that pixel's projection line; and taking the pixel projection lines of all pixels of the left camera and the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
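The straight-line fitting of step S50 is not pinned to a particular method in the text. One standard choice, sketched here with numpy (illustrative names), fits each pixel's cloud of plate-plane intersection points with the principal direction of the centered points via SVD:

```python
import numpy as np

def fit_projection_line(points):
    """Fit a 3-D line to the intersection points collected for one pixel
    (its first or second intersection point coordinate set, expressed in
    the reference frame). Total-least-squares fit via SVD of the centered
    points; the application does not name a fitting method, so this is
    one standard choice. Returns (point_on_line, unit_direction)."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # Principal right singular vector of the centered cloud = line direction.
    _, _, vt = np.linalg.svd(P - centroid)
    return centroid, vt[0]
```

Running this once per pixel of each camera yields the two clusters of pixel projection lines that constitute the calibration result.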
In order to more clearly describe the binocular camera calibration method of the present application, each step of the method embodiment of the present application will be described in detail below with reference to fig. 1.
The embodiment is a preferred implementation mode, and the imaging process of the binocular camera is modeled into two clusters of pixel projection straight lines, so that the problem of binocular parameter calibration is converted into the fitting problem of the pixel projection straight lines corresponding to each pixel. As shown in fig. 1, the specific steps are as follows:
step S10, acquiring images of calibration plates placed in different postures through a left camera and a right camera of a binocular camera, and taking the images as input images;
in this embodiment, the calibration plate is placed in the field of view of the binocular camera, and preferably, the left camera and the right camera of the binocular camera are used to collect images of the three calibration plates with different postures, where dense circular spots are uniformly distributed on the calibration plates, and the calibration plates with different postures should satisfy two-by-two non-parallelism. Under the calibration plate of each gesture, the left camera and the right camera of the binocular camera respectively acquire one image and combine the images into a pair; three calibration plates with different postures are considered, and three pairs of images are obtained as input images.
Step S20, respectively establishing a calibration plate coordinate system for calibration plates with different postures, acquiring the coordinates of the centers of all effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extracting four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
in the present embodiment, the center of the calibration plate is taken as the origin O P X is the direction parallel to the long side of the calibration plate P An axis Y in a direction parallel to the short side of the calibration plate P Axis, Z is determined according to right hand rule P Shaft forming a calibration plate coordinate system O P X P Y P Z P Wherein O is P X P Y P Is the plane of the calibration plate. The three calibration plates with different postures correspond to three coordinate systems of the calibration plates and are respectively marked asAnd m=1, 2 and 3, and obtaining the coordinates of the center of the effective circular spot on the calibration plate under the corresponding coordinate system of the calibration plate, wherein the effective circular spot refers to the circular spot which can be observed by the left camera and the right camera of the binocular camera at the same time. Obtaining image coordinates corresponding to the centers of all effective circular spots in each image of the input images through the findClescgrid function of the OpenCV library, and combining each effective spot in each pair of input imagesThe image coordinates corresponding to the center of the circular spot form four-dimensional image coordinates; combining the coordinates of the center of each effective circular spot on the calibration plate under the coordinate system of the calibration plate with the corresponding four-dimensional image coordinates to form a control point set D corresponding to each effective circular spot m ,D m ={ m x a , m y a ,0, m u la , m v la , m u ra , m v ra },a=1,2,…,N m M=1, 2,3, where N m The number of effective circular spots on the calibration plate with the mth gesture is # m x a , m y a 0) the center of the a-th effective circular spot on the calibration plate with the m-th gesture is +.>Coordinates of a coordinate system m u la , m v la ) And% m u ra , m v ra ) The center of the a effective circular spot on the m-th 
gesture calibration plate is the image coordinates of the images shot by the left camera and the right camera of the binocular camera, (-) m u la , m v la , m u ra , m v ra ) Is the four-dimensional image coordinate corresponding to the center of the a effective circular spot on the m-th gesture calibration plate.
Step S30, respectively searching two groups of control points with Euclidean distance of four-dimensional pixel coordinates smaller than a set distance threshold value in the two control point sets, and forming an intersection point set of intersection lines between calibration plates corresponding to the two control point sets; fitting each intersection point set to obtain a corresponding intersection equation;
in this embodiment, the control point set D corresponding to the calibration plate in each posture is based on m M=1, 2,3, searching for four-dimensional pixel coordinates between every two control point sets for a euclidean distance less than σ d And the two groups of control points corresponding to the two control point sets respectively form intersection point sets of intersection lines between the two calibration plates on the corresponding calibration plates, and a least square method (https:// zhuanlan zh) is adopted for each intersection point setihu.com/p/38128785) to obtain the corresponding intersection equation m A ij x+ m B ij y+ m C ij =0, where i, j=1, 2,3, and i<j, m=i or m=j, m A ijm B ijm C ij the coefficients of the x term, the coefficients of the y term and the constant term of the intersecting line equation are respectively; sigma (sigma) d As the distance threshold, the application is preferably set to be 1; an intersection line exists between any two attitude calibration plates, and each intersection line corresponds to an intersection line equation on the two attitude calibration plates associated with the intersection line.
Step S40, combining each intersection equation, taking any calibration plate coordinate system as a reference coordinate system, respectively calculating the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solving the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
In this embodiment, three intersection lines are formed between the three calibration plates of different postures, respectively denoted $l_{ij}$, where $i, j = 1, 2, 3$ and $i < j$. The direction vector of an intersection line in the calibration plate coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$ is denoted ${}^m I_{ij}$, $m = 1, 2, 3$; when $m = i$ or $m = j$, ${}^m I_{ij} = ({}^m B_{ij}, -{}^m A_{ij}, 0)$, i.e. ${}^1 I_{12} = ({}^1 B_{12}, -{}^1 A_{12}, 0)$, ${}^1 I_{13} = ({}^1 B_{13}, -{}^1 A_{13}, 0)$, ${}^2 I_{12} = ({}^2 B_{12}, -{}^2 A_{12}, 0)$, ${}^2 I_{23} = ({}^2 B_{23}, -{}^2 A_{23}, 0)$, ${}^3 I_{13} = ({}^3 B_{13}, -{}^3 A_{13}, 0)$, ${}^3 I_{23} = ({}^3 B_{23}, -{}^3 A_{23}, 0)$.
Preferably, with $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ as the reference coordinate system, the representation ${}^1 n_m$ of the normal vector of the XY plane of the other calibration plate coordinate systems $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$ in the reference coordinate system is calculated as follows:

$${}^1 n_2 = {}^1 I_{12} \wedge {}^1 I_{23} / |{}^1 I_{12} \wedge {}^1 I_{23}|, \quad {}^1 n_3 = {}^1 I_{13} \wedge {}^1 I_{23} / |{}^1 I_{13} \wedge {}^1 I_{23}| \qquad (1)$$

where $\wedge$ represents the antisymmetric (cross product) operation, $|{}^1 I_{12} \wedge {}^1 I_{23}|$ is the modular length of the vector ${}^1 I_{12} \wedge {}^1 I_{23}$, $|{}^1 I_{13} \wedge {}^1 I_{23}|$ is the modular length of the vector ${}^1 I_{13} \wedge {}^1 I_{23}$, and ${}^1 I_{23}$ is the direction vector of $l_{23}$ in the calibration plate coordinate system $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, solved from the intersection line equations; $T$ represents the transpose operation.
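Equation (1) is a normalized cross product of two intersection-line directions lying in the same plate plane. A minimal sketch, with illustrative function names:

```python
import math

def cross(u, v):
    """Cross product of two 3-D vectors given as tuples."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def plane_normal(dir_a, dir_b):
    """Unit normal of the plane spanned by two intersection-line
    directions, e.g. 1n2 = 1I12 ^ 1I23 / |1I12 ^ 1I23| in equation (1)."""
    c = cross(dir_a, dir_b)
    norm = math.sqrt(sum(x * x for x in c))
    return tuple(x / norm for x in c)
```

For instance, `plane_normal(I12_in_frame1, I23_in_frame1)` gives ${}^1 n_2$, since both lines $l_{12}$ and $l_{23}$ lie in the second plate's plane.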
The pose transformation matrix ${}^1 T_2$ of the calibration plate coordinate system $O_{P_2} X_{P_2} Y_{P_2} Z_{P_2}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ is:

$${}^1 T_2 = [{}^1 I_{12}, {}^1 n_2 \wedge {}^1 I_{12}, {}^1 n_2, {}^1 O][{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]^{-1} \qquad (2)$$

where $[{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]^{-1}$ is the inversion of the matrix $[{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]$, $e_z$ is the unit vector $[0, 0, 1]^T$, ${}^1 O$ is the coordinate of the intersection point of $l_{12}$ and $l_{13}$ on the plane $O_{P_1} X_{P_1} Y_{P_1}$, obtained by jointly solving ${}^1 A_{12} x + {}^1 B_{12} y + {}^1 C_{12} = 0$ and ${}^1 A_{13} x + {}^1 B_{13} y + {}^1 C_{13} = 0$; ${}^2 O$ is the coordinate of the intersection point of $l_{12}$ and $l_{23}$ on the plane $O_{P_2} X_{P_2} Y_{P_2}$, obtained by jointly solving ${}^2 A_{12} x + {}^2 B_{12} y + {}^2 C_{12} = 0$ and ${}^2 A_{23} x + {}^2 B_{23} y + {}^2 C_{23} = 0$.
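Equation (2) (and likewise equation (3)) can be sketched with homogeneous 4x4 matrices. The homogeneous completion is an assumption of this sketch: each bracketed quadruple is read as a matrix whose first three columns are directions (homogeneous coordinate 0) and whose last column is a point (homogeneous coordinate 1), which makes the bracket invertible.

```python
import numpy as np

def pose_from_line_frames(i_ref, n_ref, o_ref, i_loc, o_loc):
    """Sketch of equation (2): 1T2 built from the matched frames
    [1I12, 1n2 ^ 1I12, 1n2, 1O] and [2I12, e_z ^ 2I12, e_z, 2O]."""
    e_z = np.array([0.0, 0.0, 1.0])

    def frame(direction, normal, origin):
        # Columns: line direction, normal x direction, normal, origin point.
        m = np.zeros((4, 4))
        m[:3, 0] = direction
        m[:3, 1] = np.cross(normal, direction)
        m[:3, 2] = normal
        m[:3, 3] = origin
        m[3, 3] = 1.0
        return m

    return frame(i_ref, n_ref, o_ref) @ np.linalg.inv(frame(i_loc, e_z, o_loc))
```

Both brackets describe the same physical frame (the intersection line, the plate normal, and a point on the line), once in the reference coordinate system and once in the local plate coordinate system, so their product is the rigid transform between the two.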
The pose transformation matrix ${}^1 T_3$ of the calibration plate coordinate system $O_{P_3} X_{P_3} Y_{P_3} Z_{P_3}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ is:

$${}^1 T_3 = [{}^1 I_{13}, {}^1 n_3 \wedge {}^1 I_{13}, {}^1 n_3, {}^1 O][{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]^{-1} \qquad (3)$$

where $[{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]^{-1}$ is the inversion of the matrix $[{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]$; ${}^3 O$ is the coordinate of the intersection point of $l_{13}$ and $l_{23}$ on the plane $O_{P_3} X_{P_3} Y_{P_3}$, obtained by jointly solving ${}^3 A_{13} x + {}^3 B_{13} y + {}^3 C_{13} = 0$ and ${}^3 A_{23} x + {}^3 B_{23} y + {}^3 C_{23} = 0$.
Step S50, according to control point sets corresponding to the calibration plates in each gesture, respectively obtaining image coordinates of centers of four nearest effective circular spots of each pixel in a left camera and a right camera of the binocular camera on each calibration plate image, calculating coordinates of intersection points of pixel projection straight lines of each pixel in the left camera and planes of the calibration plates in each gesture under a corresponding calibration plate coordinate system by interpolation, converting the coordinates into the reference coordinate system to obtain a first intersection point coordinate set corresponding to each pixel in the left camera, calculating coordinates of intersection points of pixel projection straight lines of each pixel in the right camera and planes of the calibration plates in each gesture under the corresponding calibration plate coordinate system by interpolation, and converting the coordinates into the reference coordinate system to obtain a second intersection point coordinate set corresponding to each pixel in the right camera; performing straight line fitting on coordinates in a first intersection point coordinate set corresponding to each pixel in the left camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the left camera; performing straight line fitting on coordinates in a second intersection point coordinate set corresponding to each pixel in the right camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the right camera; and taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
In this embodiment, according to the control point sets $D_m = \{({}^m x_a, {}^m y_a, 0, {}^m u_{la}, {}^m v_{la}, {}^m u_{ra}, {}^m v_{ra})\}$, $a = 1, 2, \ldots, N_m$, $m = 1, 2, 3$, corresponding to the calibration plates in each posture, the image coordinates $({}^m u_{lb}, {}^m v_{lb})$, $b = 1, 2, 3, 4$, of the centers of the four effective circular spots nearest to the pixel $px_l$ in the left camera of the binocular camera are respectively obtained on each calibration plate image; further, the coordinates $({}^m x_b, {}^m y_b, 0)$, $b = 1, 2, 3, 4$, $m = 1, 2, 3$, of the centers of the effective circular spots corresponding to $({}^m u_{lb}, {}^m v_{lb})$ in the coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$ are obtained from $D_m$. Based on the pixel coordinates $(u_l, v_l)$ of the pixel $px_l$, $({}^m u_{lb}, {}^m v_{lb})$ and $({}^m x_b, {}^m y_b, 0)$, $b = 1, 2, 3, 4$, bilinear interpolation (https://blog.csdn.net/qq_37577735/article/details/80041586) is preferably used to calculate, for $m = 1$, $m = 2$ and $m = 3$ respectively, the coordinates $({}^m x_{pl}, {}^m y_{pl}, 0)$ of the intersection point of the pixel projection line of $px_l$ and the plane $O_{P_m} X_{P_m} Y_{P_m}$ in the coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$, which are then converted to the reference coordinate system $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ to form the first intersection point coordinate set of the pixel $px_l$. The specific conversion process is as follows: when $m = 1$, the coordinate of the intersection point in the reference coordinate system is $({}^1 x_{pl}, {}^1 y_{pl}, 0)$ itself; when $m = 2$, combining the pose transformation matrix ${}^1 T_2$ of $O_{P_2} X_{P_2} Y_{P_2} Z_{P_2}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, the coordinate of the intersection point in the reference coordinate system is calculated through ${}^1 T_2 [{}^2 x_{pl}, {}^2 y_{pl}, 0, 1]^T$; when $m = 3$, combining the pose transformation matrix ${}^1 T_3$ of $O_{P_3} X_{P_3} Y_{P_3} Z_{P_3}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, the coordinate of the intersection point in the reference coordinate system is calculated through ${}^1 T_3 [{}^3 x_{pl}, {}^3 y_{pl}, 0, 1]^T$. The three coordinates so obtained constitute the first intersection point coordinate set of the pixel $px_l$.
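The bilinear interpolation step can be sketched as below. This is a simplified illustration under stated assumptions: the four nearest spot centers are taken to form an axis-aligned cell in image coordinates, ordered top-left, top-right, bottom-left, bottom-right, which a real (projectively distorted) image only approximates; the function name is not from the source.

```python
def bilinear_plate_point(u, v, img_corners, plate_corners):
    """Bilinearly interpolate the plate-frame point where a pixel's
    projection line meets the plate plane (z = 0).

    img_corners:   four (u, v) spot centers, assumed to form an
                   axis-aligned cell [TL, TR, BL, BR] in the image
    plate_corners: the same four spot centers as (x, y) in the plate frame
    """
    u0, v0 = img_corners[0]             # top-left corner of the cell
    u1 = img_corners[1][0]              # right edge
    v1 = img_corners[2][1]              # bottom edge
    s = (u - u0) / (u1 - u0)            # horizontal weight in [0, 1]
    t = (v - v0) / (v1 - v0)            # vertical weight in [0, 1]

    def lerp(p, q, w):
        return tuple(pi + w * (qi - pi) for pi, qi in zip(p, q))

    top = lerp(plate_corners[0], plate_corners[1], s)
    bot = lerp(plate_corners[2], plate_corners[3], s)
    x, y = lerp(top, bot, t)
    return (x, y, 0.0)
```

The returned point is expressed in the posture's own calibration plate coordinate system; conversion to the reference frame via ${}^1 T_2$ or ${}^1 T_3$ happens afterwards.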
For the coordinates in the first intersection point coordinate set of the pixel $px_l$, straight-line fitting is preferably performed by a least squares method (https://zhuanlan.zhihu.com/p/38128785), and the fitted straight line is taken as the pixel projection line of the pixel $px_l$ in the left camera. Further, the pixel projection lines of the other pixels in the left camera are obtained, the method being consistent with that used for the pixel $px_l$.
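The per-pixel straight-line fit can be sketched via an SVD of the centred intersection coordinates, a total-least-squares variant of the referenced least squares fit; the function name is illustrative.

```python
import numpy as np

def fit_projection_line(points):
    """Fit a 3-D line (point + unit direction) to a pixel's intersection
    coordinate set: the centroid plus the principal direction of the
    centred points, obtained from the SVD."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]                   # principal direction (unit length)
    return centroid, direction
```

Storing one (centroid, direction) pair per pixel for each camera constitutes the calibration result described in step S50.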
According to the control point sets $D_m = \{({}^m x_a, {}^m y_a, 0, {}^m u_{la}, {}^m v_{la}, {}^m u_{ra}, {}^m v_{ra})\}$, $a = 1, 2, \ldots, N_m$, $m = 1, 2, 3$, corresponding to the calibration plates in each posture, the image coordinates $({}^m u_{rc}, {}^m v_{rc})$, $c = 1, 2, 3, 4$, of the centers of the four effective circular spots nearest to the pixel $px_r$ in the right camera of the binocular camera are respectively obtained on each calibration plate image; further, the coordinates $({}^m x_c, {}^m y_c, 0)$, $c = 1, 2, 3, 4$, $m = 1, 2, 3$, of the centers of the effective circular spots corresponding to $({}^m u_{rc}, {}^m v_{rc})$ in the coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$ are obtained from $D_m$. Based on the pixel coordinates $(u_r, v_r)$ of the pixel $px_r$, $({}^m u_{rc}, {}^m v_{rc})$ and $({}^m x_c, {}^m y_c, 0)$, $c = 1, 2, 3, 4$, bilinear interpolation (https://blog.csdn.net/qq_37577735/article/details/80041586) is preferably used to calculate, for $m = 1$, $m = 2$ and $m = 3$ respectively, the coordinates $({}^m x_{pr}, {}^m y_{pr}, 0)$ of the intersection point of the pixel projection line of $px_r$ and the plane $O_{P_m} X_{P_m} Y_{P_m}$ in the coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$, which are then converted to the reference coordinate system $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ to form the second intersection point coordinate set of the pixel $px_r$. The specific conversion process is as follows: when $m = 1$, the coordinate of the intersection point in the reference coordinate system is $({}^1 x_{pr}, {}^1 y_{pr}, 0)$ itself; when $m = 2$, combining the pose transformation matrix ${}^1 T_2$ of $O_{P_2} X_{P_2} Y_{P_2} Z_{P_2}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, the coordinate of the intersection point in the reference coordinate system is calculated through ${}^1 T_2 [{}^2 x_{pr}, {}^2 y_{pr}, 0, 1]^T$; when $m = 3$, combining the pose transformation matrix ${}^1 T_3$ of $O_{P_3} X_{P_3} Y_{P_3} Z_{P_3}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, the coordinate of the intersection point in the reference coordinate system is calculated through ${}^1 T_3 [{}^3 x_{pr}, {}^3 y_{pr}, 0, 1]^T$. The three coordinates so obtained constitute the second intersection point coordinate set of the pixel $px_r$.
For the coordinates in the second intersection point coordinate set of the pixel $px_r$, straight-line fitting is preferably performed by a least squares method (https://zhuanlan.zhihu.com/p/38128785), and the fitted straight line is taken as the pixel projection line of the pixel $px_r$ in the right camera. Further, the pixel projection lines of the other pixels in the right camera are obtained, the method being consistent with that used for the pixel $px_r$.
And taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
A binocular camera calibration system according to a second embodiment of the present application, as shown in fig. 2, includes:
an image acquisition module 100 configured to acquire images of calibration plates placed in different postures as input images by left and right cameras of a binocular camera;
the control point extraction module 200 is configured to respectively establish a calibration plate coordinate system for calibration plates with different postures, acquire coordinates of centers of all effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extract four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
the intersection line extraction module 300 is configured to search, for every two control point sets, the two groups of control points whose four-dimensional pixel coordinates have a Euclidean distance smaller than a set distance threshold, and to form an intersection point set of the intersection line between the calibration plates corresponding to the two control point sets; and to fit each intersection point set to obtain a corresponding intersection line equation;
the pose matrix solving module 400 is configured to calculate the representation of the normal vector of the XY plane of the other calibration plate coordinate system under the reference coordinate system by taking any calibration plate coordinate system as the reference coordinate system in combination with each intersecting line equation, so as to solve the pose transformation matrix of the other calibration plate coordinate system relative to the reference coordinate system;
the camera calibration module 500 is configured to obtain image coordinates of centers of four nearest effective circular spots of pixels in a left camera and a right camera of the binocular camera on each calibration plate image according to control point sets corresponding to calibration plates in each gesture, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the left camera and planes of the calibration plates in each gesture under a corresponding calibration plate coordinate system by interpolation, and convert the coordinates into a reference coordinate system to obtain a first intersection point coordinate set corresponding to the pixels in the left camera, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the right camera and the planes of the calibration plates in each gesture under the corresponding calibration plate coordinate system by interpolation, and convert the coordinates into the reference coordinate system to obtain a second intersection point coordinate set corresponding to the pixels in the right camera; performing straight line fitting on coordinates in a first intersection point coordinate set corresponding to each pixel in the left camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the left camera; performing straight line fitting on coordinates in a second intersection point coordinate set corresponding to each pixel in the right camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the right camera; and taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working processes and related descriptions of the above-described system may refer to corresponding processes in the foregoing method embodiments, which are not repeated herein.
It should be noted that, in the binocular camera calibration system provided in the foregoing embodiment, only the division of the functional modules is illustrated, and in practical application, the functional allocation may be performed by different functional modules according to needs, that is, the modules or steps in the foregoing embodiment of the present application are further decomposed or combined, for example, the modules in the foregoing embodiment may be combined into one module, or may be further decomposed into a plurality of sub-modules, so as to complete all or part of the functions described above. The names of the modules and steps related to the embodiments of the present application are merely for distinguishing the respective modules or steps, and are not to be construed as unduly limiting the present application.
A storage device according to a third embodiment of the present application stores therein a plurality of programs adapted to be loaded and executed by a processor to implement a binocular camera calibration method as described above.
A processing device according to a fourth embodiment of the present application includes a processor, a storage device; a processor adapted to execute each program; a storage device adapted to store a plurality of programs; the program is adapted to be loaded and executed by a processor to implement a binocular camera calibration method as described above.
It will be clear to those skilled in the art that, for convenience and brevity of description, the specific working process of the storage device and the processing device and the related description of the foregoing description may refer to the corresponding process in the foregoing method example, which is not repeated herein.
Those of skill in the art will appreciate that the various illustrative modules and method steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the programs corresponding to the software modules and method steps may be stored in random access memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. To clearly illustrate this interchangeability of electronic hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as electronic hardware or software depends upon the particular application and the design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not intended to be limiting.
The terms "first," "second," "third," and the like, are used for distinguishing between similar objects and not for describing a particular sequential or chronological order.
Thus far, the technical solution of the present application has been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of protection of the present application is not limited to these specific embodiments. Equivalent modifications and substitutions for related technical features may be made by those skilled in the art without departing from the principles of the present application, and such modifications and substitutions will be within the scope of the present application.

Claims (8)

1. The binocular camera calibration method is characterized by comprising the following steps of:
step S10, acquiring images of calibration plates placed in different postures through a left camera and a right camera of a binocular camera, and taking the images as input images;
step S20, respectively establishing a calibration plate coordinate system for calibration plates with different postures, acquiring the coordinates of the centers of all effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extracting four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
step S30, for every two control point sets, respectively searching for the two groups of control points whose four-dimensional pixel coordinates have a Euclidean distance smaller than a set distance threshold, and forming an intersection point set of the intersection line between the calibration plates corresponding to the two control point sets; fitting each intersection point set to obtain a corresponding intersection line equation;
step S40, combining each intersection equation, taking any calibration plate coordinate system as a reference coordinate system, respectively calculating the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solving the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
step S50, according to control point sets corresponding to the calibration plates in each gesture, respectively obtaining image coordinates of centers of four nearest effective circular spots of each pixel in a left camera and a right camera of the binocular camera on each calibration plate image, calculating coordinates of intersection points of pixel projection straight lines of each pixel in the left camera and planes of the calibration plates in each gesture under a corresponding calibration plate coordinate system by interpolation, converting the coordinates into the reference coordinate system to obtain a first intersection point coordinate set corresponding to each pixel in the left camera, calculating coordinates of intersection points of pixel projection straight lines of each pixel in the right camera and planes of the calibration plates in each gesture under the corresponding calibration plate coordinate system by interpolation, and converting the coordinates into the reference coordinate system to obtain a second intersection point coordinate set corresponding to each pixel in the right camera; performing straight line fitting on coordinates in a first intersection point coordinate set corresponding to each pixel in the left camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the left camera; performing straight line fitting on coordinates in a second intersection point coordinate set corresponding to each pixel in the right camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the right camera; and taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
2. The binocular camera calibration method according to claim 1, wherein images of calibration plates placed in different postures are collected through a left camera and a right camera of the binocular camera as input images, and the method comprises the following steps:
the left camera and the right camera of the binocular camera are used to collect images of the calibration plates with different postures; under the calibration plate of each posture, the left camera and the right camera of the binocular camera each collect one image, and the two images are combined into a pair; a set number of pairs of images are obtained in total as the input images.
3. The binocular camera calibration method of claim 2, wherein the four-dimensional image coordinates are obtained by the following steps:
and obtaining image coordinates corresponding to the centers of all the effective circular spots in each image of the input image through a findCirclesGrid function of an OpenCV library, and combining the image coordinates corresponding to the centers of all the effective circular spots in each pair of input images to form four-dimensional image coordinates.
4. The binocular camera calibration method of claim 1, wherein any calibration plate coordinate system is used as a reference coordinate system, the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system is calculated respectively, and the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system is obtained, and the method comprises the following steps:
the calibration plate coordinate system $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ corresponding to the calibration plate with the first posture is taken as the reference coordinate system, and the representation ${}^1 n_m$ of the normal vector of the XY plane of the other calibration plate coordinate systems $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$ in the reference coordinate system is calculated as follows:

$${}^1 n_2 = {}^1 I_{12} \wedge {}^1 I_{23} / |{}^1 I_{12} \wedge {}^1 I_{23}|$$

$${}^1 n_3 = {}^1 I_{13} \wedge {}^1 I_{23} / |{}^1 I_{13} \wedge {}^1 I_{23}|$$

wherein $\wedge$ represents the antisymmetric (cross product) operation, $|{}^1 I_{12} \wedge {}^1 I_{23}|$ is the modular length of the vector ${}^1 I_{12} \wedge {}^1 I_{23}$, $|{}^1 I_{13} \wedge {}^1 I_{23}|$ is the modular length of the vector ${}^1 I_{13} \wedge {}^1 I_{23}$, ${}^1 I_{23}$ is the direction vector of $l_{23}$ in the calibration plate coordinate system $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$, solved from the intersection line equations, $T$ represents the transpose operation, ${}^m I_{ij}$ represents the direction vector of the intersection line $l_{ij}$ in the calibration plate coordinate system, $i, j = 1, 2, 3$, $i < j$, and $m$ represents the number of the calibration plate coordinate system, $m = 1, 2, 3$;

the pose transformation matrix ${}^1 T_2$ of the calibration plate coordinate system $O_{P_2} X_{P_2} Y_{P_2} Z_{P_2}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ is:

$${}^1 T_2 = [{}^1 I_{12}, {}^1 n_2 \wedge {}^1 I_{12}, {}^1 n_2, {}^1 O][{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]^{-1}$$

wherein $[{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]^{-1}$ is the inversion of the matrix $[{}^2 I_{12}, e_z \wedge {}^2 I_{12}, e_z, {}^2 O]$, $e_z$ is the unit vector $[0, 0, 1]^T$, ${}^1 O$ is the coordinate of the intersection point of $l_{12}$ and $l_{13}$ on the plane $O_{P_1} X_{P_1} Y_{P_1}$, obtained by jointly solving ${}^1 A_{12} x + {}^1 B_{12} y + {}^1 C_{12} = 0$ and ${}^1 A_{13} x + {}^1 B_{13} y + {}^1 C_{13} = 0$, where ${}^m A_{ij}$, ${}^m B_{ij}$ and ${}^m C_{ij}$ are respectively the coefficient of the $x$ term, the coefficient of the $y$ term and the constant term of the intersection line equation, $i, j = 1, 2, 3$, $i < j$, and $m = i$ or $m = j$; ${}^2 O$ is the coordinate of the intersection point of $l_{12}$ and $l_{23}$ on the plane $O_{P_2} X_{P_2} Y_{P_2}$, obtained by jointly solving ${}^2 A_{12} x + {}^2 B_{12} y + {}^2 C_{12} = 0$ and ${}^2 A_{23} x + {}^2 B_{23} y + {}^2 C_{23} = 0$;

the pose transformation matrix ${}^1 T_3$ of the calibration plate coordinate system $O_{P_3} X_{P_3} Y_{P_3} Z_{P_3}$ relative to $O_{P_1} X_{P_1} Y_{P_1} Z_{P_1}$ is:

$${}^1 T_3 = [{}^1 I_{13}, {}^1 n_3 \wedge {}^1 I_{13}, {}^1 n_3, {}^1 O][{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]^{-1}$$

wherein $[{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]^{-1}$ is the inversion of the matrix $[{}^3 I_{13}, e_z \wedge {}^3 I_{13}, e_z, {}^3 O]$; ${}^3 O$ is the coordinate of the intersection point of $l_{13}$ and $l_{23}$ on the plane $O_{P_3} X_{P_3} Y_{P_3}$, obtained by jointly solving ${}^3 A_{13} x + {}^3 B_{13} y + {}^3 C_{13} = 0$ and ${}^3 A_{23} x + {}^3 B_{23} y + {}^3 C_{23} = 0$.
5. The binocular camera calibration method according to claim 1, wherein coordinates of an intersection point of a pixel projection straight line of any pixel of the left camera and the right camera and a plane where the calibration plate is located under each posture under a corresponding calibration plate coordinate system are calculated respectively, and the method is as follows:
according to the control point set corresponding to the calibration plate under each posture, respectively obtaining the image coordinates $({}^m u_{lb}, {}^m v_{lb})$, $b = 1, 2, 3, 4$, $m = 1, 2, 3$, of the centers of the four effective circular spots nearest to the pixel $px_l$ in the left camera of the binocular camera on each calibration plate image;

obtaining, from the control point set, the coordinates $({}^m x_b, {}^m y_b, 0)$ of the centers of the effective circular spots corresponding to $({}^m u_{lb}, {}^m v_{lb})$ in the calibration plate coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$;

based on the pixel coordinates $(u_l, v_l)$ of the pixel $px_l$, $({}^m u_{lb}, {}^m v_{lb})$ and $({}^m x_b, {}^m y_b, 0)$, calculating by interpolation the coordinates of the intersection points of the pixel projection line of the pixel $px_l$ in the left camera and the planes of the calibration plates under all postures in the corresponding calibration plate coordinate systems;

according to the control point set corresponding to the calibration plate under each posture, respectively obtaining the image coordinates $({}^m u_{rc}, {}^m v_{rc})$, $c = 1, 2, 3, 4$, of the centers of the four effective circular spots nearest to the pixel $px_r$ in the right camera of the binocular camera on each calibration plate image;

obtaining, from the control point set, the coordinates $({}^m x_c, {}^m y_c, 0)$ of the centers of the effective circular spots corresponding to $({}^m u_{rc}, {}^m v_{rc})$ in the calibration plate coordinate system $O_{P_m} X_{P_m} Y_{P_m} Z_{P_m}$;

based on the pixel coordinates $(u_r, v_r)$ of the pixel $px_r$, $({}^m u_{rc}, {}^m v_{rc})$ and $({}^m x_c, {}^m y_c, 0)$, calculating by interpolation the coordinates of the intersection points of the pixel projection line of the pixel $px_r$ in the right camera and the planes of the calibration plates under all postures in the corresponding calibration plate coordinate systems.
6. A binocular camera calibration system, comprising:
an image acquisition module configured to acquire images of calibration plates placed in different postures through a left camera and a right camera of the binocular camera as input images;
the control point extraction module is configured to respectively establish a calibration plate coordinate system for calibration plates with different postures, acquire the coordinates of the centers of all the effective circular spots on each calibration plate under the corresponding calibration plate coordinate system, and respectively extract four-dimensional image coordinates corresponding to the centers of each effective circular spot in each pair of input images; combining the coordinates of the center of each effective circular spot on each calibration plate under the corresponding calibration plate coordinate system with the corresponding four-dimensional image coordinates to form a control point set, wherein the effective circular spots can be observed by a left camera and a right camera of the binocular camera at the same time;
the intersection line extraction module is configured to search, for every two control point sets, the two groups of control points whose four-dimensional pixel coordinates have a Euclidean distance smaller than a set distance threshold, and to form an intersection point set of the intersection line between the calibration plates corresponding to the two control point sets; and to fit each intersection point set to obtain a corresponding intersection line equation;
the pose matrix solving module is configured to combine each intersecting line equation, take any calibration plate coordinate system as a reference coordinate system, respectively calculate the representation of the normal vector of the XY plane of other calibration plate coordinate systems under the reference coordinate system, and further solve the pose transformation matrix of the other calibration plate coordinate systems relative to the reference coordinate system;
the camera calibration module is configured to obtain image coordinates of centers of four nearest effective circular spots of pixels in a left camera and a right camera of the binocular camera on each calibration plate image respectively according to control point sets corresponding to the calibration plates in each gesture, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the left camera and planes of the calibration plates in each gesture under a corresponding calibration plate coordinate system by interpolation, and convert the coordinates into a reference coordinate system to obtain a first intersection point coordinate set corresponding to the pixels in the left camera, calculate coordinates of intersection points of pixel projection straight lines of the pixels in the right camera and the planes of the calibration plates in each gesture under the corresponding calibration plate coordinate system by interpolation, and convert the coordinates into the reference coordinate system to obtain a second intersection point coordinate set corresponding to the pixels in the right camera; performing straight line fitting on coordinates in a first intersection point coordinate set corresponding to each pixel in the left camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the left camera; performing straight line fitting on coordinates in a second intersection point coordinate set corresponding to each pixel in the right camera, and taking the fitted straight line as a pixel projection straight line of each pixel in the right camera; and taking the pixel projection straight lines of all pixels in the left camera and the pixel projection straight lines of all pixels in the right camera as the calibration result of the binocular camera to realize the calibration of the binocular camera.
7. A storage device in which a plurality of programs are stored, characterized in that the programs are adapted to be loaded and executed by a processor to implement a binocular camera calibration method according to any of claims 1-5.
8. A processing device, comprising a processor and a storage device; a processor adapted to execute each program; a storage device adapted to store a plurality of programs; characterized in that the program is adapted to be loaded and executed by a processor for implementing a binocular camera calibration method according to any of the claims 1-5.
CN202311075014.5A 2023-08-24 2023-08-24 Binocular camera calibration method, system and device Pending CN117115268A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311075014.5A CN117115268A (en) 2023-08-24 2023-08-24 Binocular camera calibration method, system and device


Publications (1)

Publication Number Publication Date
CN117115268A true CN117115268A (en) 2023-11-24

Family

ID=88801587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311075014.5A Pending CN117115268A (en) 2023-08-24 2023-08-24 Binocular camera calibration method, system and device

Country Status (1)

Country Link
CN (1) CN117115268A (en)

Similar Documents

Publication Publication Date Title
CN109584156B (en) Microscopic sequence image splicing method and device
CN110969668A (en) Stereoscopic calibration algorithm of long-focus binocular camera
CN109443200B (en) Mapping method and device for global visual coordinate system and mechanical arm coordinate system
CN116309880A (en) Object pose determining method, device, equipment and medium based on three-dimensional reconstruction
CN112200157A (en) Human body 3D posture recognition method and system for reducing image background interference
CN116129037B (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
CN111476835A (en) Unsupervised depth prediction method, system and device for consistency of multi-view images
CN108447092B (en) Method and device for visually positioning marker
CN113034593A (en) 6D pose marking method and system and storage medium
CN113763478A (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN109978928B (en) Binocular vision stereo matching method and system based on weighted voting
CN117292064A (en) Three-dimensional object modeling method and system based on structured light scanning data
CN117115268A (en) Binocular camera calibration method, system and device
CN104156952B (en) A kind of image matching method for resisting deformation
CN111145266A (en) Fisheye camera calibration method and device, fisheye camera and readable storage medium
CN116012449A (en) Image rendering method and device based on depth information
CN106874916B (en) Complex outside plate point cloud scene contrast extraction method and device
CN112652056B (en) 3D information display method and device
CN114757849A (en) Imaging method with high dynamic range
CN115018932A (en) Camera calibration method and device, electronic equipment and storage medium
CN209279912U (en) A kind of object dimensional information collecting device
JP7182528B2 (en) Method and apparatus for processing image data for machine vision
CN112529960A (en) Target object positioning method and device, processor and electronic device
CN112634439A (en) 3D information display method and device
CN114972536B (en) Positioning and calibrating method for aviation area array swing scanning type camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination