CN111445533A - Binocular camera calibration method, device, equipment and medium - Google Patents

Binocular camera calibration method, device, equipment and medium

Info

Publication number
CN111445533A
Authority
CN
China
Prior art keywords
camera
mechanical arm
coordinate system
image
calibrated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010235123.9A
Other languages
Chinese (zh)
Other versions
CN111445533B (en)
Inventor
吴贵龙
刘玉平
丁智辉
蒋涛江
黄伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Bozhilin Robot Co Ltd
Original Assignee
Guangdong Bozhilin Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Bozhilin Robot Co Ltd filed Critical Guangdong Bozhilin Robot Co Ltd
Priority to CN202010235123.9A priority Critical patent/CN111445533B/en
Publication of CN111445533A publication Critical patent/CN111445533A/en
Application granted granted Critical
Publication of CN111445533B publication Critical patent/CN111445533B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the invention discloses a binocular camera calibration method, a device, equipment and a medium, wherein the method comprises the following steps: performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated; acquiring images of a target object in the visual fields of the cameras to be calibrated, and determining the geometric relationship among vectors formed by characteristic points in the images; and obtaining the coordinates of the characteristic points in the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points in the target coordinate system and the geometric relationship. According to the binocular camera calibration method provided by the embodiment of the invention, the coordinates of the characteristic points under the target coordinate system are obtained according to the coordinate transformation matrix corresponding to each camera to be calibrated, and the physical distance between the cameras to be calibrated is obtained by combining the geometric relationship between the vectors formed by the characteristic points, so that the deviations and errors caused by manual installation and measurement are eliminated, and the calibration accuracy of the binocular camera is improved.

Description

Binocular camera calibration method, device, equipment and medium
Technical Field
The embodiment of the invention relates to the technical field of robots, in particular to a binocular camera calibration method, device, equipment and medium.
Background
With the continuous development of industrial automation and intelligence, the application of cameras and image technology in the robot field is also continuously progressing. The combination of the mechanical arm and the camera enables the hand and eye of the robot to complete operation tasks more efficiently and with higher quality. At present, many robots are provided with binocular or multi-view cameras, and the physical position relationship between the cameras is often measured during installation. However, it is difficult to obtain an accurate positional relationship due to factors such as mounting errors and measurement errors. Therefore, how to accurately calibrate the binocular camera is a technical problem to be solved.
Disclosure of Invention
The embodiment of the invention provides a binocular camera calibration method, a binocular camera calibration device, binocular camera calibration equipment and a binocular camera calibration medium, which are used for solving the problems of deviation and errors caused by manual installation and measurement and improving the accuracy of binocular camera calibration.
In a first aspect, an embodiment of the present invention provides a binocular camera calibration method, including:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the visual fields of the cameras to be calibrated, and determining the geometric relationship among vectors formed by characteristic points in the images;
and obtaining the coordinates of the characteristic points in the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points in the target coordinate system and the geometric relationship.
In a second aspect, an embodiment of the present invention further provides a binocular camera calibration apparatus, including:
the transformation matrix acquisition module is used for carrying out hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relation determining module is used for acquiring images of the target object in the visual fields of the cameras to be calibrated and determining the geometric relation among vectors formed by the characteristic points in the images;
and the physical distance calculation module is used for obtaining the coordinates of the characteristic points in the target coordinate system according to the coordinate transformation matrix and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points in the target coordinate system and the geometric relationship.
In a third aspect, an embodiment of the present invention further provides a computer device, where the computer device includes:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors implement the binocular camera calibration method provided by any of the embodiments of the present invention.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the binocular camera calibration method provided in any embodiment of the present invention.
The embodiment of the invention obtains the coordinate transformation matrix corresponding to each camera to be calibrated by carrying out hand-eye calibration on each camera to be calibrated; acquiring images of a target object in the visual fields of the cameras to be calibrated, and determining the geometric relationship among vectors formed by characteristic points in the images; the coordinates of the feature points under the target coordinate system are obtained according to the coordinate transformation matrix, and the physical distance between the cameras to be calibrated is determined according to the coordinates of the feature points under the target coordinate system and the geometric relationship, so that the deviations and errors caused by manual installation and measurement are eliminated, and the calibration accuracy of the binocular camera is improved.
Drawings
Fig. 1a is a flowchart of a binocular camera calibration method according to a first embodiment of the present invention;
fig. 1b is a schematic diagram of a camera to be calibrated according to a first embodiment of the present invention;
FIG. 1c is a schematic diagram of a coordinate system according to an embodiment of the present invention;
FIG. 1d is a schematic illustration of a nine-point calibration according to a first embodiment of the present invention;
fig. 1e is a schematic diagram of a photographing process of a camera to be calibrated according to a first embodiment of the present invention;
fig. 1f is a schematic diagram of coordinate transformation of an offset of a robot according to an embodiment of the present invention;
fig. 2a is a flowchart of a binocular camera calibration method according to a second embodiment of the present invention;
FIG. 2b is a schematic diagram of a first geometric relationship determination provided in the second embodiment of the present invention;
FIG. 2c is a schematic diagram of a second geometric relationship determination provided in the second embodiment of the present invention;
fig. 3a is a flowchart of a binocular camera calibration method according to a third embodiment of the present invention;
FIG. 3b is a schematic diagram of a coordinate system provided in the third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a binocular camera calibration apparatus according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computer device according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1a is a flowchart of a binocular camera calibration method according to a first embodiment of the present invention. The present embodiment is applicable to a case where calibration is performed for a binocular camera or a multi-view camera. The method may be performed by a binocular camera calibration apparatus, which may be implemented in software and/or hardware, for example, which may be configured in a computer device. As shown in fig. 1a, the method comprises:
and S110, carrying out hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated.
In the embodiment, the physical distance between the cameras to be calibrated is determined by the movement distance of the mechanical arm. The movement distance of the mechanical arm can be further determined by the mechanical arm offset when the image of the target object in the visual field of each camera to be calibrated is acquired and the mechanical arm offset coordinate of the same point on the target object under each camera to be calibrated.
In the above process, in order to obtain the mechanical arm offset coordinate of the same point on the target object under each camera to be calibrated, a mechanical arm coordinate system needs to be established for each camera to be calibrated, the coordinate system is the mechanical arm offset coordinate system, and then the mechanical arm offset coordinate of the same point on the target object under each camera to be calibrated is obtained based on the mapping relationship between the pixel coordinate system of each camera to be calibrated and the mechanical arm offset coordinate system, so as to obtain the relative position between each camera to be calibrated.
Optionally, the mapping relationship between the pixel coordinate system of the camera to be calibrated and the mechanical arm offset coordinate system may be established by performing hand-eye calibration on the camera to be calibrated.
In one embodiment, the performing the hand-eye calibration on each camera to be calibrated to obtain the coordinate transformation matrix corresponding to each camera to be calibrated includes: and respectively carrying out nine-point calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated. Optionally, for each camera to be calibrated, the camera to be calibrated is calibrated by using a nine-point calibration method, so as to obtain a coordinate transformation matrix between a pixel coordinate system of the camera to be calibrated and a mechanical arm offset coordinate system. Taking an example that the camera to be calibrated comprises a first camera and a second camera, nine-point calibration is performed on the first camera and the second camera respectively to obtain a coordinate transformation matrix for converting a pixel coordinate system of the first camera into a mechanical arm offset coordinate system and a coordinate transformation matrix for converting a pixel coordinate system of the second camera into the mechanical arm offset coordinate system.
When each camera to be calibrated is subjected to hand-eye calibration, the cameras to be calibrated do not need to be parallel. Here, "do not need to be parallel" means that when the cameras to be calibrated are mounted on the support driven by the mechanical arm, the positive directions of the cameras do not need to be installed exactly parallel; however, the height of each camera to be calibrated must remain constant throughout the hand-eye calibration process, so that the mechanical arm offset coordinate systems corresponding to the cameras to be calibrated lie in the same plane. Moreover, because the cameras only perform translational motion, the mechanical arm offset coordinate systems corresponding to the cameras to be calibrated are parallel to one another. Any non-parallelism of the camera positive directions at installation is reflected in the transformation matrices, and the non-parallel factors between the cameras to be calibrated are eliminated through the transformation matrices. Optionally, the cameras to be calibrated are arranged on the support driven by the mechanical arm and are rigidly connected with each other, so that the relative positions of the cameras to be calibrated cannot change. Taking a binocular camera as an example, when the binocular camera is mounted on the support driven by the mechanical arm, it is only necessary to ensure that the camera orientations are consistent; the cameras do not need to be completely parallel. The support can be driven by the mechanical arm to translate in a set direction. When the support translates in the set direction, the heights of the two cameras remain unchanged during calibration, and only horizontal translation without rotation is performed, which makes it convenient to calculate the vectors to be solved using the closed quadrilateral principle in the subsequent operations; if the support also rotates when translating in the set direction, the variables to be solved are obtained in the subsequent calculation by combining the closed quadrilateral principle with a rotation transformation.
Fig. 1b is a schematic diagram of a camera to be calibrated according to an embodiment of the present invention. As shown in fig. 1b, the left camera and the right camera of the binocular camera are rigidly connected and located on the two sides of the mechanical arm, respectively. Fig. 1c is a schematic diagram of a coordinate system according to an embodiment of the present invention. As shown in fig. 1c, there are the pixel coordinate system x_l-o_l-y_l corresponding to the left camera, the pixel coordinate system x_r-o_r-y_r corresponding to the right camera, the mechanical arm offset coordinate system X_L-O_L-Y_L corresponding to the left camera, the mechanical arm offset coordinate system X_R-O_R-Y_R corresponding to the right camera, and the mechanical arm coordinate system X-O-Y. In fig. 1c, the left camera and the right camera are not parallel, but because the cameras do not rotate during hand-eye calibration and only translate, the mechanical arm offset coordinate system X_L-O_L-Y_L of the left camera and the mechanical arm offset coordinate system X_R-O_R-Y_R of the right camera contain no rotation amount; and because the heights of the left camera and the right camera do not change during hand-eye calibration, X_L-O_L-Y_L and X_R-O_R-Y_R lie in the same plane. That is, since X_L-O_L-Y_L and X_R-O_R-Y_R lie in the same plane and contain no rotation amount, they can be considered parallel to each other and parallel to the mechanical arm coordinate system X-O-Y.
Optionally, the left camera and the right camera are respectively subjected to hand-eye calibration to obtain their respective coordinate transformation matrices. Taking the left camera as an example, the camera hand-eye calibration process constructs the relationship between the image pixel coordinate system of the left camera and the mechanical arm offset coordinate system of the left camera, namely the relationship between x_l-o_l-y_l and X_L-O_L-Y_L. This relationship can be expressed by the following equation:
$$\begin{bmatrix} X \\ Y \\ 1 \end{bmatrix} = A \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} a & b & c \\ d & e & f \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \qquad (1)$$
where (x, y) are the pixel coordinates of a certain point in the image pixel coordinate system of the left camera, and (X, Y) are the corresponding coordinates of that point in the mechanical arm offset coordinate system; optionally, the unit of (X, Y) may be millimeters.
When nine-point calibration is performed on a camera, the first position of the nine-point calibration may be taken as the origin of that camera's mechanical arm offset coordinate system. Because the calibration object photographed at the 1st point is not necessarily at the same position when nine-point calibration is performed on different cameras, the pixel points corresponding to the origins of the mechanical arm offset coordinate systems of the different cameras do not coincide; however, this does not affect the solution of the physical distance between the cameras. As long as the offset between the origins of the mechanical arm offset coordinate systems corresponding to the two cameras is solved, the physical distance between the two cameras can be obtained from that offset. In subsequent applications such as distance measurement and angle measurement with the binocular camera, the mechanical arm coordinates corresponding to the pixel coordinates of a point in each of the two cameras are obtained, and the physical distance is added, so that the real physical distance between the two points is obtained.
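The ranging use just described can be illustrated with a short sketch (a minimal illustration only, not text from the patent): the function names, the 2x3 matrices A_left and A_right, and the origin offset vector o_left_to_o_right are assumed placeholders standing for the hand-eye calibration results and the calibrated origin offset.

```python
import numpy as np

def pixel_to_arm(A, pixel_xy):
    """Map pixel coordinates to mechanical arm offset coordinates with a 2x3 affine matrix A."""
    x, y = pixel_xy
    return A @ np.array([x, y, 1.0])

def real_distance(A_left, A_right, o_left_to_o_right, p_left, p_right):
    """Distance between a point seen by the left camera and a point seen by the right camera,
    adding the calibrated offset between the two arm offset coordinate system origins."""
    P_left = pixel_to_arm(A_left, p_left)                          # in the left arm offset frame
    P_right = pixel_to_arm(A_right, p_right) + o_left_to_o_right   # shifted into the left frame
    return float(np.linalg.norm(P_right - P_left))
```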
In order to improve the precision of the nine-point calibration, the mechanical arm offsets of the nine points can be chosen reasonably so that the calibration objects of the nine-point calibration appear uniformly over the camera's field of view. Fig. 1d is a schematic diagram of a nine-point calibration provided in the first embodiment of the present invention; as shown in fig. 1d, the calibration objects of the nine-point calibration appear uniformly in the field of view of the camera. After the nine-point calibration process, the mechanical arm offset coordinates of the 9 points are obtained and 9 pictures are taken, and the pixel coordinates of a specific corner point of the calibration plate can be obtained by extracting the corner points of the calibration object. There are 6 unknowns in equation (1), and each point pair provides two equations, so 3 sets of equations are required to solve the transformation matrix. Assuming that the 1st, 2nd and 3rd points in fig. 1d are selected, their arm offset coordinates are (X_1, Y_1), (X_2, Y_2), (X_3, Y_3) and their pixel coordinates are (x_1, y_1), (x_2, y_2), (x_3, y_3); the system of equations can then be expressed as:
$$\begin{cases} X_i = a\,x_i + b\,y_i + c \\ Y_i = d\,x_i + e\,y_i + f \end{cases} \qquad i = 1, 2, 3 \qquad (2)$$
Rewriting equation set (2) in homogeneous matrix form:

$$\begin{bmatrix} a & b & c \\ d & e & f \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_1 & x_2 & x_3 \\ y_1 & y_2 & y_3 \\ 1 & 1 & 1 \end{bmatrix} = \begin{bmatrix} X_1 & X_2 & X_3 \\ Y_1 & Y_2 & Y_3 \\ 1 & 1 & 1 \end{bmatrix} \qquad (3)$$
equation set (3) can be abbreviated as:
$$A\,m = M \qquad (4)$$
where A is the coordinate transformation matrix and is the unknown quantity; m is the 3x3 matrix whose columns are the three homogeneous pixel coordinate vectors; and M is the 3x3 matrix whose columns are the three homogeneous mechanical arm offset coordinate vectors. The coordinate transformation matrix A can be obtained from equation (4). Optionally, A may be obtained with the '/' operator in Matlab, i.e. A = M/m; it can also be solved in OpenCV by multiplying M by the inverse matrix of m, i.e. $A = M\,m^{-1}$. Optionally, the coordinate transformation matrix may also be determined by other methods, such as intrinsic parameter calibration or an over-determined system of equations.
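For illustration, the following is a minimal NumPy sketch of solving the transformation matrix A from equation (4); the three point pairs are made-up values, not data from the patent.

```python
import numpy as np

# Three pixel coordinates (x_i, y_i) and the matching mechanical arm offset
# coordinates (X_i, Y_i), stored as homogeneous column vectors (illustrative values).
pixel_pts = np.array([[120.0, 640.0, 1130.0],   # x_1, x_2, x_3
                      [100.0, 110.0,   95.0],   # y_1, y_2, y_3
                      [  1.0,   1.0,    1.0]])
arm_pts   = np.array([[  0.0,  50.0,  100.0],   # X_1, X_2, X_3 (mm)
                      [  0.0,   1.0,   -1.0],   # Y_1, Y_2, Y_3 (mm)
                      [  1.0,   1.0,    1.0]])

# Equation (4): A * m = M  =>  A = M * m^{-1}
A = arm_pts @ np.linalg.inv(pixel_pts)
print(A)  # 3x3 affine transform; last row is [0, 0, 1] up to floating-point error
```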
In this embodiment, through the nine-point calibration of the camera, different transformation matrices can be selected according to the different regions in which the pixel points are located, which improves the precision of the coordinate transformation and reduces errors caused by distortion. A checkerboard can be selected as the calibration object for the nine-point calibration. With the checkerboard as the calibration object, the pixel coordinates of the corner points can be solved more accurately through image processing, so that the coordinate transformation matrix obtained by solving is more accurate. In addition, several points on the checkerboard can be selected and averaged to obtain a more accurate offset in the mechanical arm coordinate system.
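Since nine point pairs are available, the transformation can also be fitted in a least-squares sense, which corresponds to the over-determined-equation option mentioned above. The sketch below is only an illustration under that assumption; the input arrays are placeholders for the nine pixel coordinates and nine mechanical arm offset coordinates.

```python
import numpy as np

def fit_affine(pixel_pts, arm_pts):
    """Least-squares fit of a 2x3 affine map from pixel to mechanical arm offset coordinates.

    pixel_pts, arm_pts: (N, 2) arrays with N >= 3 (here N = 9 for nine-point calibration).
    """
    n = pixel_pts.shape[0]
    ones = np.ones((n, 1))
    # Rows [x_i, y_i, 1] so that [X_i, Y_i] = [x_i, y_i, 1] @ A_T
    design = np.hstack([pixel_pts, ones])                            # (N, 3)
    A_T, residuals, rank, _ = np.linalg.lstsq(design, arm_pts, rcond=None)
    return A_T.T                                                     # 2x3 matrix [[a, b, c], [d, e, f]]

# Usage (placeholder data): A = fit_affine(pixel_pts, arm_pts) with (9, 2) arrays
# collected during the nine-point calibration.
```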
And S120, obtaining images of the target object in the visual fields of the cameras to be calibrated, and determining the geometric relation among vectors formed by the characteristic points in the images.
In this embodiment, after the coordinate transformation matrix corresponding to each camera to be calibrated is obtained, a calibration object is selected as a target object, the height of the mechanical arm is kept unchanged, the mechanical arm is moved to obtain an image of the target object in the visual field of each camera to be calibrated, then feature points in the image are selected, and the geometric relationship between vectors formed by the feature points in the image is determined. Alternatively, the calibration object as the target object may be a checkerboard. Because each camera to be calibrated does not rotate in the process of acquiring an image and only translates, the mechanical arm offset coordinate system of each camera does not contain a rotation amount, and because the height of each camera to be calibrated is not changed during calibration, the mechanical arm offset coordinate systems of the cameras to be calibrated and the mechanical arm coordinate systems of the cameras to be calibrated are in the same plane, and the geometric relationship among vectors formed by characteristic points in the image acquired by each camera to be calibrated can be determined according to a closed quadrilateral rule.
In one embodiment, the target object is placed at a set position, fixed, and then the mechanical arm is moved to enable the target object to appear in the visual field of each camera to be calibrated in sequence, so as to obtain the image of the target object in the visual field of each camera to be calibrated, and the offset of the mechanical arm when the image of the target object in the visual field of each camera to be calibrated is shot is recorded. The position of the target object needs to ensure that the target object can completely appear in the view of the camera to be calibrated when each camera to be calibrated moves above the target object. Therefore, the target object is generally placed at the center of each camera to be calibrated. When the number of cameras to be calibrated is 2, the target object may be placed at the middle position of the left camera and the right camera, as shown in fig. 1 b. Fig. 1e is a schematic diagram of a photographing process of a camera to be calibrated according to a first embodiment of the present invention, and fig. 1e schematically illustrates a photographing process of two cameras by taking a binocular camera as an example. As shown in fig. 1e, the robot arm is moved so that the target object appears in the field of view of the left and right cameras in succession, two photographs are taken and the offset of the robot arm movement is recorded.
S130, obtaining coordinates of the feature points in the target coordinate system according to the coordinate transformation matrix, and determining the position relation between the cameras to be calibrated according to the coordinates and the geometric relation of the feature points in the target coordinate system.
In this embodiment, for each feature point, the coordinates of each feature point in the pixel coordinate system are determined according to the position of the feature point in the image, the pixel coordinates of the feature point are subjected to coordinate transformation according to the coordinate transformation matrix corresponding to the image to which the feature point belongs, the arm offset coordinate of the feature point in the arm offset coordinate system is obtained, and the position relationship between the cameras to be calibrated is calculated according to the arm offset coordinate of each feature point in the arm offset coordinate system and the geometric relationship determined in the above step.
The position relationship between the cameras to be calibrated may be a physical distance between the cameras to be calibrated. Specifically, the physical distance between the cameras to be calibrated may be a physical distance between feature points of the cameras to be calibrated. The feature point of the camera to be calibrated may be a center point of the camera to be calibrated or an upper left corner of the camera to be calibrated. Optionally, the physical distance between the cameras to be calibrated may be determined by first taking out the pixel coordinates of the corresponding feature points from the image, then converting the pixel coordinates into the mechanical arm coordinate system of each camera to be calibrated, and then determining the physical distance between the cameras to be calibrated by combining the relationship between the mechanical arm coordinate system origin points of each camera to be calibrated and the mechanical arm offset determined in the above process.
Fig. 1f is a schematic diagram of the coordinate transformation of a mechanical arm offset according to an embodiment of the present invention. As shown in fig. 1f, X'-O'-Y' is the mechanical arm offset coordinate system and x'-o'-y' is the pixel coordinate system; P(X'_1, Y'_1) is the coordinate of the 1st corner point of the target object in the mechanical arm offset coordinate system, and p(x'_1, y'_1) is the coordinate of the same corner point in the pixel coordinate system. The conversion from p(x'_1, y'_1) to P(X'_1, Y'_1) can be calculated by the following formula:

$$\begin{bmatrix} X'_1 \\ Y'_1 \end{bmatrix} = A' \begin{bmatrix} x'_1 \\ y'_1 \\ 1 \end{bmatrix} \qquad (5)$$

where A' is the 2×3 coordinate transformation matrix corresponding to the camera.
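As a hedged illustration of extracting a corner's pixel coordinates by image processing and pushing them through formula (5), the sketch below uses OpenCV's checkerboard corner detector; the file name, the pattern size and the matrix A' are assumptions made for the example, not values from the patent.

```python
import cv2
import numpy as np

# Assumed 2x3 transform A' from hand-eye calibration (placeholder values).
A_prime = np.array([[0.05, 0.001, -12.3],
                    [0.002, 0.049,  -8.7]])

img = cv2.imread("target_left.png", cv2.IMREAD_GRAYSCALE)    # assumed image of the target object
assert img is not None, "image not found"
found, corners = cv2.findChessboardCorners(img, (7, 7))      # assumed 7x7 inner-corner pattern
if found:
    x1, y1 = corners[0].ravel()                               # pixel coordinates of the 1st corner
    X1, Y1 = A_prime @ np.array([x1, y1, 1.0])                # formula (5): pixel -> arm offset (mm)
    print(X1, Y1)
```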
In this embodiment, the step of performing coordinate transformation on the feature points and the step of determining the geometric relationship between vectors formed by the feature points are not limited. Optionally, coordinate transformation may be performed on the feature points to obtain mechanical arm offset coordinates of each feature point in a mechanical arm offset coordinate system, and then a geometric relationship between vectors formed by the feature points is determined; the geometric relationship among vectors formed by the characteristic points can be determined firstly, and then the coordinate transformation is carried out on the characteristic points to obtain the mechanical arm offset coordinate of each characteristic point under the mechanical arm offset coordinate system; or determining the geometric relationship among vectors formed by the characteristic points, and performing coordinate transformation on the characteristic points to obtain the mechanical arm offset coordinate of each characteristic point in the mechanical arm offset coordinate system.
In one embodiment, determining the geometric relationship between the vectors formed by the feature points in the images includes: determining the relative relationship between the origins of the mechanical arm coordinate systems of the cameras to be calibrated according to the offset of the mechanical arm movement and the coordinates of the feature points of the calibration object in the mechanical arm coordinate systems of the cameras to be calibrated.
In the above process, a relationship between the offset of the mechanical arm movement and the mechanical arm coordinate system corresponding to each camera to be calibrated needs to be established. Firstly, the cameras to be calibrated perform translation without rotation in the process of acquiring images, so that the mechanical arm coordinate systems corresponding to the cameras to be calibrated are positioned on the same plane; secondly, each camera to be calibrated translates under a mechanical arm coordinate system, so that the translation offset is also in the plane of the mechanical arm coordinate system; finally, because the target object is fixed, the offset of translation between the first camera to be calibrated and the second camera to be calibrated is equivalent to the amount of movement of the camera to be calibrated from the first camera to be calibrated to the second camera to be calibrated, that is, the offset of the same point on the target object in the coordinate plane of the mechanical arm. The source of the offset of the mechanical arm movement is integrated, and then a quadrangle can be formed in the same plane by combining the coordinates of the feature points in the target object under the mechanical arm coordinate system of each camera to be calibrated and the vector between the original points of the mechanical arm coordinate system, so that the relationship between the mechanical arm coordinate systems can be solved through the vector operation of the quadrangle, and the relationship between the original points of the mechanical arms is established. After the relationship between the original points of the mechanical arms is established, the subsequent calculation of the physical distance between any points of the cameras at two sides can be carried out.
Taking a binocular camera as an example, the calibration process of the relationship between the origin points of the mechanical arm coordinate systems of the camera to be calibrated may be as follows: (1) the mechanical arm outputs the offset of the movement of the camera during two times of photographing, and it can be understood that the offset is the same for the two cameras; (2) according to the pixel coordinates and the transformation matrix of the characteristic points of each camera, the coordinates under the mechanical arm coordinate system corresponding to each characteristic point are solved; (3) and calculating the offset relation between the original points of the two mechanical arm coordinate systems according to the closed quadrilateral vector relation.
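A minimal sketch of the three steps just listed, under the assumption that the transformation matrices of the two cameras are already available; every value and name below (A_left, A_right, the pixel coordinates, the recorded offset) is a placeholder rather than data from the patent.

```python
import numpy as np

def pixel_to_arm(A, pt):
    """Step (2): pixel coordinates -> mechanical arm offset coordinates via a 2x3 transform."""
    return A @ np.array([pt[0], pt[1], 1.0])

# Step (1): the mechanical arm reports its movement offset between the two shots (placeholder, mm).
arm_offset = np.array([300.0, 0.0])

# Step (2): the same feature point on the calibration object, as seen by each camera
# (assumed transforms and pixel coordinates).
A_left  = np.array([[0.05, 0.0, -10.0], [0.0, 0.05, -8.0]])
A_right = np.array([[0.05, 0.0, -11.0], [0.0, 0.05, -7.5]])
P1_left  = pixel_to_arm(A_left,  (820.0, 615.0))
P1_right = pixel_to_arm(A_right, (410.0, 598.0))

# Step (3): closed quadrilateral vector relation giving the offset between the origins
# of the two mechanical arm (offset) coordinate systems.
OL_to_OR = P1_left + arm_offset - P1_right
print(OL_to_OR)
```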
The present embodiment calculates the distance between the two cameras through the movement distance of the mechanical arm. The movement distance of the mechanical arm mainly comprises two parts: first, the distance the mechanical arm moves between the two shots; second, the mechanical arm offset coordinates of the same point on the target object under the different cameras. The former is determined mainly by the precision of the mechanical arm, and the latter by the mechanical arm and by the precision of the transformation from pixel coordinates to mechanical arm coordinates. Therefore, as long as the motion precision of the mechanical arm and the precision of the coordinate transformation are ensured, the precision of the obtained physical position relationship of the cameras is also ensured, which improves the calibration precision of the binocular camera. In addition, in the calibration method provided by the embodiment of the present invention, the information about the camera orientation is retained in the calibration parameters (i.e., the coordinate transformation matrices) obtained in hand-eye calibration, and the mechanical arm offset coordinate systems are related by a pure translation, so the cameras do not need to be precisely leveled or parallel, which simplifies the calibration. It should be noted that the mechanical arm may not be strictly level during its translation. Taking a binocular camera as an example, the two cameras may not lie in the same horizontal plane when acquiring images, and the physical distance of the two cameras calculated according to the present embodiment is only the physical distance in the horizontal plane. However, since the height information of each camera is embodied in its transformation matrix, the height difference between the two cameras does not affect the calibration of the physical distance in the horizontal plane; similarly, the height difference does not affect the distance measurement of the binocular camera, so it can be ignored during binocular camera calibration. To sum up: (1) what the embodiment of the present invention calibrates is the physical distance between the two cameras in the horizontal plane; (2) the height difference between the two cameras in the vertical direction has no influence on binocular camera distance measurement and does not need to be calibrated.
The embodiment of the invention obtains the coordinate transformation matrix corresponding to each camera to be calibrated by carrying out hand-eye calibration on each camera to be calibrated; acquiring images of a target object in the visual fields of the cameras to be calibrated, and determining the geometric relationship among vectors formed by characteristic points in the images; the coordinates of the feature points under the target coordinate system are obtained according to the coordinate transformation matrix, and the physical distance between the cameras to be calibrated is determined according to the coordinates of the feature points under the target coordinate system and the geometric relationship, so that the deviations and errors caused by manual installation and measurement are eliminated, and the calibration accuracy of the binocular camera is improved.
Example two
Fig. 2a is a flowchart of a binocular camera calibration method according to a second embodiment of the present invention. The present embodiment is further optimized on the basis of the above embodiments. As shown in fig. 2a, the method comprises:
s210, performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated, wherein each camera to be calibrated comprises a first camera and a second camera.
S220, acquiring a first image of the target object in the first camera view field and a second image of the target object in the second camera view field.
In this embodiment, the target object may be a checkerboard marker, the checkerboard marker is placed at the center positions of the first camera and the second camera, the mechanical arm is moved to obtain a first image of the checkerboard marker in the field of view of the first camera and a second image of the checkerboard marker in the second camera, respectively, and the offset of the mechanical arm from when the first image is shot to when the second image is shot is recorded.
And S230, determining the geometric relationship between vectors formed by the feature points in the first image and the second image.
In this embodiment, feature points may be selected from the first image and the second image, two closed quadrangles may be constructed based on the selected feature points, and a geometric relationship between vectors formed by the feature points may be determined according to the constructed closed quadrangles.
In one embodiment of the present invention, determining the geometric relationship between the vectors formed by the feature points in the first image and the second image includes: taking the origin of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point corresponding to the target object in the first image, the origin of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point corresponding to the target object in the second image as first feature points, and constructing a first geometric relationship between the vectors formed by the first feature points; and taking the origin of the first mechanical arm offset coordinate system, the first center point of the first image, the origin of the second mechanical arm offset coordinate system, and the second center point of the second image as second feature points, and constructing a second geometric relationship between the vectors formed by the second feature points.
Fig. 2b is a schematic diagram of the determination of the first geometric relationship according to the second embodiment of the present invention. As shown in fig. 2b, the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point P_1L corresponding to the target object in the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point P_1R corresponding to the target object in the second image are taken as the first feature points, and the first geometric relationship between the vectors formed by the first feature points is constructed.
Fig. 2c is a schematic diagram of the determination of the second geometric relationship according to the second embodiment of the present invention. As shown in fig. 2c, the origin O_L of the first mechanical arm offset coordinate system, the first center point P_2L of the first image, the origin O_R of the second mechanical arm offset coordinate system, and the second center point P_2R of the second image are taken as the second feature points, and the second geometric relationship between the vectors formed by the second feature points is constructed.
On the basis of the above scheme, taking the origin of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point corresponding to the target object in the first image, the origin of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point corresponding to the target object in the second image as the first feature points, and constructing the first geometric relationship between the vectors formed by the first feature points, includes: constructing a first vector from the origin of the first coordinate system and the first mark point, constructing a second vector from the first mark point and the second mark point (i.e., from the offset of the mechanical arm movement between the two shots), constructing a third vector from the origin of the second coordinate system and the second mark point, and constructing a fourth vector from the origin of the first coordinate system and the origin of the second coordinate system; and obtaining a first solving equation for the fourth vector according to the closed quadrilateral vector relationship.
Specifically, as shown in fig. 2b, the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point P_1L corresponding to the target object in the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point P_1R corresponding to the target object in the second image form the closed quadrilateral O_L P_1L P_1R O_R. The first vector $\overrightarrow{O_L P_{1L}}$, the second vector $\overrightarrow{P_{1L} P_{1R}}$, the third vector $\overrightarrow{O_R P_{1R}}$ and the fourth vector $\overrightarrow{O_L O_R}$ are then constructed, and the first solving equation for the fourth vector is obtained from the closed quadrilateral vector relationship:

$$\overrightarrow{O_L O_R} = \overrightarrow{O_L P_{1L}} + \overrightarrow{P_{1L} P_{1R}} - \overrightarrow{O_R P_{1R}} \qquad (6)$$

Here $\overrightarrow{O_L P_{1L}}$ is the mechanical arm offset coordinate of the point P_1L in the first coordinate system, and $\overrightarrow{O_R P_{1R}}$ is the mechanical arm offset coordinate of the point P_1R in the second coordinate system; both coordinates are obtained by formula (1) or (5). The vector $\overrightarrow{P_{1L} P_{1R}}$ is the offset of the mechanical arm output by the mechanical arm when the two cameras take their pictures; alternatively, the position of the mechanical arm is recorded at each shot and the offset is obtained by taking the difference.
On the basis of the above scheme, taking the origin of the first mechanical arm offset coordinate system, the first center point of the first image, the origin of the second mechanical arm offset coordinate system, and the second center point of the second image as the second feature points, and constructing the second geometric relationship between the vectors formed by the second feature points, includes: constructing a fifth vector from the origin of the first coordinate system and the first center point, constructing a sixth vector from the first center point and the second center point, and constructing a seventh vector from the origin of the second coordinate system and the second center point; and obtaining a second solving equation for the sixth vector according to the closed quadrilateral vector relationship.
Specifically, as shown in fig. 2c, the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first center point P_2L of the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second center point P_2R of the second image form the closed quadrilateral O_L P_2L P_2R O_R. The fifth vector $\overrightarrow{O_L P_{2L}}$, the sixth vector $\overrightarrow{P_{2L} P_{2R}}$ and the seventh vector $\overrightarrow{O_R P_{2R}}$ are then constructed, and the second solving equation for the sixth vector is obtained from the closed quadrilateral vector relationship:

$$\overrightarrow{P_{2L} P_{2R}} = -\overrightarrow{O_L P_{2L}} + \overrightarrow{O_L O_R} + \overrightarrow{O_R P_{2R}} \qquad (7)$$
S240, for each feature point, acquiring the pixel coordinates of the feature point in the image pixel coordinate system, and obtaining the mechanical arm coordinates of the feature point in the mechanical arm offset coordinate system according to the pixel coordinates and the transformation matrix corresponding to the feature point.
For example, taking fig. 2b and 2c as an example, the pixel coordinates of the first mark point P_1L and the first center point P_2L in the pixel coordinate system x_l-o_l-y_l of the first image, and the pixel coordinates of the second mark point P_1R and the second center point P_2R in the pixel coordinate system x_r-o_r-y_r of the second image, are obtained first. The coordinate transformation matrix corresponding to the pixel coordinate system of the first image is then used to transform the pixel coordinates of P_1L and P_2L in x_l-o_l-y_l, giving the mechanical arm offset coordinates of P_1L and P_2L in the first mechanical arm offset coordinate system X_L-O_L-Y_L; and the coordinate transformation matrix corresponding to the pixel coordinate system of the second image is used to transform the pixel coordinates of P_1R and P_2R in x_r-o_r-y_r, giving the mechanical arm offset coordinates of P_1R and P_2R in the second mechanical arm offset coordinate system X_R-O_R-Y_R. It should be noted that the pixel coordinates of the origin of each camera's coordinate system cannot be obtained in the pixel coordinate system, but this does not affect the solution of the relative position relationship between the origins of the mechanical arm coordinate systems. One may proceed as follows: take the origin O_L of the first coordinate system as the coordinate origin, and use the offset relationship between the two coordinate systems as the coordinates of the origin O_R of the second coordinate system in the first coordinate system. Alternatively, the origins O_L and O_R can both be regarded as known and equal to (0, 0) in their respective coordinate systems.
And S250, obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinates of the characteristic points, the mechanical arm offset between the first image and the second image and the geometric relation.
In this embodiment, after the mechanical arm coordinates of each feature point are obtained, a partial vector in the geometric relationship may be obtained, and the physical distance between the first camera and the second camera may be calculated according to the partial known vector.
In one embodiment of the present invention, obtaining a physical distance between the first camera and the second camera according to the robot arm coordinate of each feature point, the robot arm offset between the first image and the second image, and the geometric relationship includes: obtaining a fourth vector according to the mechanical arm coordinate of each first characteristic point, the mechanical arm offset between the first image and the second image and the first solving equation; obtaining a sixth vector according to the mechanical arm coordinate of each second feature point, the fourth vector and the second solving equation; and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
Still taking fig. 2b and 2c as an example, the feature points obtained are: the mechanical arm offset coordinates of the first mark point P_1L and the first center point P_2L in the first mechanical arm offset coordinate system X_L-O_L-Y_L, and the mechanical arm offset coordinates of the second mark point P_1R and the second center point P_2R in the second mechanical arm offset coordinate system X_R-O_R-Y_R. Setting the origins O_L and O_R to (0, 0) in their respective coordinate systems, the mechanical arm offset coordinates of the first mark point P_1L in X_L-O_L-Y_L can be taken as the first vector $\overrightarrow{O_L P_{1L}}$, the mechanical arm offset coordinates of the second mark point P_1R in X_R-O_R-Y_R can be taken as the third vector $\overrightarrow{O_R P_{1R}}$, and the second vector $\overrightarrow{P_{1L} P_{1R}}$ is obtained from the offset of the mechanical arm movement between the two shots. The fourth vector $\overrightarrow{O_L O_R}$ is then found from the first solving equation (6) together with the first vector $\overrightarrow{O_L P_{1L}}$, the second vector $\overrightarrow{P_{1L} P_{1R}}$ and the third vector $\overrightarrow{O_R P_{1R}}$. Next, the fifth vector $\overrightarrow{O_L P_{2L}}$ is obtained from the origin O_L and the mechanical arm offset coordinates of the first center point P_2L in X_L-O_L-Y_L, and the seventh vector $\overrightarrow{O_R P_{2R}}$ is obtained from the origin O_R and the mechanical arm offset coordinates of the second center point P_2R in X_R-O_R-Y_R. Continuing with the second solving equation (7) together with the fourth vector $\overrightarrow{O_L O_R}$, the fifth vector $\overrightarrow{O_L P_{2L}}$ and the seventh vector $\overrightarrow{O_R P_{2R}}$, the sixth vector $\overrightarrow{P_{2L} P_{2R}}$ is found; that is, the positional relationship between the center point of the first camera and the center point of the second camera is obtained. Finally, the physical distance between the center point of the first camera and the center point of the second camera is obtained from the sixth vector $\overrightarrow{P_{2L} P_{2R}}$, and the calibration of the binocular camera is completed.
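Putting the first and second solving equations (6) and (7) together, the final distance computation can be sketched as follows; the coordinate values are assumed placeholders (in millimeters), and the sketch only illustrates the vector arithmetic described above, not measured data from the patent.

```python
import numpy as np

# Assumed inputs: mechanical arm offset coordinates of the feature points (from S240) and
# the mechanical arm offset recorded between the two shots (all placeholder values, in mm).
P1L = np.array([ 28.7, 22.8])         # mark point in the first arm offset frame
P1R = np.array([  9.5, 22.4])         # same mark point in the second arm offset frame
P2L = np.array([ 22.0, 17.6])         # first image center in the first arm offset frame
P2R = np.array([ 21.0, 18.1])         # second image center in the second arm offset frame
arm_offset = np.array([301.0, 1.2])   # recorded mechanical arm movement between the two shots

# Equation (6): offset between the two arm offset coordinate system origins.
OL_to_OR = P1L + arm_offset - P1R

# Equation (7): vector between the two camera center points.
center_to_center = -P2L + OL_to_OR + P2R

# Physical distance between the two camera centers in the horizontal plane.
distance = float(np.linalg.norm(center_to_center))
print(OL_to_OR, center_to_center, distance)
```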
The embodiment of the invention makes concrete the geometric relationship among the vectors formed by the feature points in the images, determines the physical distance between the cameras to be calibrated according to the geometric relationship, and solves the offset between the origins of the two coordinate systems through closed quadrilateral and vector operations, thereby improving the calibration precision of the binocular camera.
EXAMPLE III
Fig. 3a is a flowchart of a binocular camera calibration method according to a third embodiment of the present invention. The present embodiment provides a preferred embodiment based on the above-described embodiments. As shown in fig. 3a, the method comprises:
and S310, setting a binocular camera and adjusting the position of the mechanical arm.
Two cameras are arranged on a support driven by a mechanical arm; the cameras are rigidly connected, and their relative positions cannot change. During installation, it is only necessary to ensure that the camera orientations are consistent; complete parallelism is not required. The support can be driven by the mechanical arm to translate in the X and Y directions, and the cameras do not rotate during the translation. The calibration object should be placed at a suitable position: when the left camera is moved above the calibration object, the calibration object can appear completely in the field of view of the left camera; the calibration object remains stationary, and when the right camera is moved above the calibration object, the calibration object can appear completely in the field of view of the right camera. Therefore, the calibration object is generally placed at a position midway between the two cameras.
And S320, establishing an image pixel coordinate, a mechanical arm coordinate system and a mechanical arm offset coordinate system.
FIG. 3b is a schematic diagram of a coordinate system according to the third embodiment of the present invention. As shown in fig. 3b, x_l-o_l-y_l and x_r-o_r-y_r are the image pixel coordinate systems, and X-O-Y is the mechanical arm coordinate system; the relationship between the two camera mechanical arm offset coordinate systems can be seen in fig. 1c. The image pixel coordinate system takes the upper left corner of the image as its origin, with the downward direction as the positive x direction and the rightward direction as the positive y direction, in units of pixels. The mechanical arm coordinate system takes the center of the mechanical arm base as its coordinate origin, with the downward direction as the positive X direction and the rightward direction as the positive Y direction, in units of millimeters.
And S330, respectively calibrating the hands and eyes of the two cameras to obtain respective transformation matrixes.
The hand-eye calibration can be performed on each camera by using a nine-point calibration method to obtain a transformation matrix corresponding to each camera.
The process of performing hand-eye calibration of the two cameras is as follows: and fixing the calibration object, moving the camera to be calibrated, and taking a picture when the calibration object is completely appeared in the field of view of the camera. It should be noted that the height of each camera to be calibrated is not changed, and only horizontal translation is performed without rotation.
Here, S310 to S340 constitute the calibration process of each camera; it must be ensured that every camera calibration shot involves only translation without rotation, and the transformation matrix corresponding to each camera is output after the nine-point calibration.
S340, moving the mechanical arm to enable the calibration plate to appear in the visual fields of the two cameras respectively, and taking pictures and recording the pictures.
The step is a calibration process among cameras, and specifically comprises the following steps: fixing the calibration object, translating the camera to enable the calibration object to be completely in the camera view, photographing and recording the offset of the mechanical arm, and outputting the images shot by the cameras and the offset of the mechanical arm.
And S350, transforming coordinates to obtain a vector between the origin points of the offset coordinate systems of the two camera mechanical arms.
Calculating the vector between the origins of the two camera mechanical arm offset coordinate systems can be divided into two steps: solving the coordinates of the same point on the calibration object in the mechanical arm coordinate systems corresponding to the different cameras, and then constructing a closed quadrilateral and solving the relationship between the origins of the mechanical arm coordinate systems.
In the process of solving the coordinates of the same point on the calibration object under the mechanical arm coordinate systems corresponding to different cameras, inputting data into images shot by the cameras and transformation matrixes, extracting pixel coordinates of the feature points through image processing, then calculating with the transformation matrixes to obtain the mechanical arm coordinates corresponding to the point, and outputting data into the coordinates of the feature points under the mechanical arm coordinate systems of the cameras. In the process of constructing a closed quadrangle and solving the relation between the original points of the coordinate system of the mechanical arm, input data are coordinates of each characteristic point under the coordinate system of the mechanical arm of each camera and the offset of the mechanical arm, and output data are the relation between the original points of the coordinate system of each mechanical arm. For more detailed schemes, reference may be made to the above embodiments, which are not described herein again.
S360, solving for the physical distance between the center points of the two cameras in the mechanical arm offset coordinate system.
The physical distance between the center points of the two cameras is solved through the closed quadrangle and vector operations. In this process, the input data are the transformation matrix of each camera and the relationship between the origins of the cameras' mechanical arm coordinate systems. The pixel coordinates of the center position of each camera image are taken first, the corresponding mechanical arm coordinates are obtained from the transformation matrices, and the physical distance between the camera center points is then output by combining the closed quadrangle with the relationship between the origins of the mechanical arm coordinate systems.
Specifically, the coordinates of the same point in the mechanical arm coordinate system of each camera are first calculated from each camera's transformation matrix by using formula 1 or formula 5 provided in the above embodiments. The relationship between the coordinate origins of the mechanical arm coordinate systems corresponding to the cameras is then obtained from the coordinates of the same point of the calibration object in each mechanical arm coordinate system, the mechanical arm offset corresponding to each camera, and the vector relationship of the closed quadrangle (formula 6). Finally, the physical distance between the centers of the cameras is obtained from the relationship between the origins of the mechanical arm coordinate systems, combined with the mechanical arm coordinates corresponding to the pixel center position of each camera image (obtained by formula 1) and the closed quadrangle principle (formula 7). For a more detailed scheme, reference may be made to the above embodiments, which are not described herein again.
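Because formulas 1, 5, 6 and 7 belong to earlier embodiments that are not reproduced here, the following is only one plausible reading of the closed-quadrangle bookkeeping, with assumed variable names and sign conventions; the patent's own formulas remain authoritative.

import numpy as np

def origin_relation(p1_mm, p2_mm, arm_offset_mm):
    """Vector from the origin of camera 1's arm offset frame to the origin of
    camera 2's arm offset frame (the 'fourth vector'), closing the quadrangle
    origin 1 -> mark point -> arm offset -> mark point -> origin 2.
    The sign convention here is an assumption; formula 6 is authoritative."""
    return np.asarray(p1_mm) + np.asarray(arm_offset_mm) - np.asarray(p2_mm)

def camera_center_distance(T1, T2, size1, size2, o1_to_o2):
    """Physical distance between the two camera center points (a formula 7 analogue).
    size1/size2 are (width, height) in pixels; each image center is mapped into
    its own arm offset frame, and camera 2's result is shifted into frame 1 by
    o1_to_o2, which is valid because the frames differ only by a translation."""
    c1_mm = T1 @ np.array([size1[0] / 2.0, size1[1] / 2.0, 1.0])
    c2_mm = o1_to_o2 + T2 @ np.array([size2[0] / 2.0, size2[1] / 2.0, 1.0])
    return float(np.linalg.norm(c2_mm - c1_mm))

Here o1_to_o2 plays the role of the fourth vector of S350, and the returned norm corresponds to the physical distance sought in S360.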
In this process, the construction of the two mechanical arm coordinate systems rests on the premise that the cameras only translate, without rotation, during calibration. Since the cameras are rigidly connected, every translation moves all of the cameras together, that is, all of the cameras move within one coordinate system; although multiple coordinate systems therefore exist, they differ only in their reference origins, and those origins lie in the same coordinate plane. The quadrangle constructed above is therefore also located in this coordinate plane, so vector operations can be performed on it. In addition, whether in the self-calibration of a single camera or in the distance calibration of the binocular camera, errors may be introduced because the above precondition cannot be met perfectly, for example a height difference may exist between the cameras; however, as long as the height difference is within a reasonable range it does not affect the distance measurement, because the translation is maintained in the same plane, the coordinates end up in the same plane after the transformation, and the difference in height is absorbed by the transformation matrix.
The method provided by the embodiment of the invention can also be applied to distance measurement with a multi-view camera. It can be understood that the key to multi-view camera ranging is to transform the mechanical arm (offset) coordinate systems corresponding to the multiple cameras into the same coordinate system. Two main cases arise in a multi-camera scene: in the first case, not all of the cameras can see the calibration object at the same time; in the second case, some of the cameras can see the calibration object at the same time. In the first case, every camera is calibrated pairwise with respect to the 1st camera, so that not every pair of cameras needs to be calibrated; in the second case, the cameras that can see the calibration object at the same time are calibrated simultaneously, and the other cameras that cannot see it at the same time are calibrated pairwise with respect to one of the cameras that can, as sketched below.
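A minimal sketch of the first case, assuming the pairwise origin-relation vectors with respect to the 1st camera have already been computed (for example with the hypothetical origin_relation above); composing them expresses every camera's mechanical arm offset frame in the 1st camera's frame, after which any pair of camera centers can be compared.

import numpy as np

def to_reference_frame(pairwise_to_cam1):
    """pairwise_to_cam1 maps camera id -> vector from camera 1's frame origin
    to that camera's frame origin; camera 1 itself is the zero vector."""
    frames = {"cam1": np.zeros(2)}
    frames.update({cam: np.asarray(v, dtype=float) for cam, v in pairwise_to_cam1.items()})
    return frames

def pairwise_distance(frames, centers_mm, cam_a, cam_b):
    """Distance between two camera centers, each given in its own arm offset
    frame, after shifting both into camera 1's frame (translation-only assumption)."""
    a = frames[cam_a] + np.asarray(centers_mm[cam_a], dtype=float)
    b = frames[cam_b] + np.asarray(centers_mm[cam_b], dtype=float)
    return float(np.linalg.norm(a - b))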
According to the technical scheme of this embodiment, the mapping relationship between the camera's pixel coordinates and the mechanical arm coordinate system is established through hand-eye calibration of the camera; the calibration process only needs to record the amount by which the mechanical arm moves (its offset), and the mechanical arm offset coordinate system can be established from that offset. Since the mechanical arm can only move in the X and Y directions, the mechanical arm offset coordinate systems of the two cameras can be transformed into each other by translation, so the two mechanical arm offset coordinate systems can take part in vector operations within a unified coordinate system. The relative position of the cameras is linked to the offset of the mechanical arm: the mechanical arm is moved so that the calibration object appears successively in the fields of view of the left and right cameras, and the problem that the calibration object does not occupy the same position for the two cameras is resolved through camera calibration and coordinate transformation, making the measured distance between the two cameras more accurate.
Example four
Fig. 4 is a schematic structural diagram of a binocular camera calibration apparatus provided in the fourth embodiment of the present invention. The binocular camera calibration apparatus may be implemented in software and/or hardware; for example, the binocular camera calibration apparatus may be configured in a computer device. As shown in fig. 4, the apparatus includes a transformation matrix obtaining module 410, a geometric relationship determining module 420, and a physical distance calculating module 430, wherein:
a transformation matrix obtaining module 410, configured to perform hand-eye calibration on each camera to be calibrated, so as to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relationship determining module 420 is configured to obtain an image of the target object in each field of view of the camera to be calibrated, and determine a geometric relationship between vectors formed by each feature point in the image;
and the physical distance calculation module 430 is configured to obtain coordinates of the feature points in the target coordinate system according to the coordinate transformation matrix, and determine a physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship.
According to the embodiment of the invention, the transformation matrix acquisition module performs hand-eye calibration on each camera to be calibrated to obtain the coordinate transformation matrix corresponding to each camera to be calibrated; the geometric relationship determining module acquires images of the target object in the fields of view of the cameras to be calibrated and determines the geometric relationship among vectors formed by the feature points in the images; and the physical distance calculation module obtains the coordinates of the feature points in the target coordinate system according to the coordinate transformation matrix and determines the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship, so that the deviations and errors caused by manual installation and measurement are overcome and the calibration accuracy of the binocular camera is improved.
Optionally, on the basis of the above scheme, the camera to be calibrated includes a first camera and a second camera, and the geometric relationship determining module 420 includes:
an image acquisition unit for acquiring a first image of a target object in a first camera field of view and a second image of the target object in a second camera field of view;
and the relation determining unit is used for determining the geometric relation between vectors formed by the feature points in the first image and the second image.
Optionally, on the basis of the foregoing scheme, the physical distance calculating module 430 includes:
the mechanical arm coordinate determining unit is used for acquiring the pixel coordinates of the characteristic points in an image pixel coordinate system aiming at each characteristic point and obtaining the mechanical arm coordinates of the characteristic points in a mechanical arm offset coordinate system according to the pixel coordinates and the transformation matrix corresponding to the characteristic points;
and the camera distance calculating unit is used for obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinates of the characteristic points, the mechanical arm offset between the first image and the second image and the geometric relation.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: take the origin of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point corresponding to the target object in the first image, the origin of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point corresponding to the target object in the second image as first feature points, and construct a first geometric relationship among the vectors formed by the first feature points; and take the origin of the first mechanical arm offset coordinate system, the first center point of the first image, the origin of the second mechanical arm offset coordinate system, and the second center point of the second image as second feature points, and construct a second geometric relationship among the vectors formed by the second feature points.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: construct a first vector based on the origin of the first mechanical arm offset coordinate system and the first mark point, construct a second vector based on the first mark point and the second mark point, construct a third vector based on the origin of the second mechanical arm offset coordinate system and the second mark point, and construct a fourth vector based on the origin of the first mechanical arm offset coordinate system and the origin of the second mechanical arm offset coordinate system; and obtain a first solving equation of the fourth vector according to the closed quadrilateral vector algorithm.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: construct a fifth vector based on the origin of the first mechanical arm offset coordinate system and the first center point, construct a sixth vector based on the first center point and the second center point, and construct a seventh vector based on the origin of the second mechanical arm offset coordinate system and the second center point; and obtain a second solving equation of the sixth vector according to the closed quadrilateral vector algorithm.
Optionally, on the basis of the above scheme, the camera distance calculating unit is specifically configured to: obtaining a fourth vector according to the mechanical arm coordinate of each first characteristic point, the mechanical arm offset between the first image and the second image and the first solving equation; obtaining a sixth vector according to the mechanical arm coordinate of each second feature point, the fourth vector and the second solving equation; and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
The binocular camera calibration device provided by the embodiment of the invention can execute the binocular camera calibration method provided by any embodiment, and has the corresponding functional modules and beneficial effects of the execution method.
EXAMPLE five
Fig. 5 is a schematic structural diagram of a computer device according to a fifth embodiment of the present invention. FIG. 5 illustrates a block diagram of an exemplary computer device 512 suitable for use in implementing embodiments of the present invention. The computer device 512 shown in FIG. 5 is only an example and should not bring any limitations to the functionality or scope of use of embodiments of the present invention.
As shown in FIG. 5, computer device 512 is in the form of a general purpose computing device. Components of computer device 512 may include, but are not limited to: one or more processors 516, a system memory 528, and a bus 518 that couples the various system components including the system memory 528 and the processors 516.
Bus 518 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor, or a local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Computer device 512 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computer device 512 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 528 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)530 and/or cache memory 532. The computer device 512 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage 534 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 518 through one or more data media interfaces. Memory 528 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 540 having a set (at least one) of program modules 542 may be stored, for example, in the memory 528; such program modules 542 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which or some combination of which may include an implementation of a network environment. The program modules 542 generally perform the functions and/or methods of the described embodiments of the invention.
Computer device 512 may also communicate with one or more external devices 514 (e.g., keyboard, pointing device, display 524, etc.), and may also communicate with one or more devices that enable a user to interact with the computer device 512, and/or with any devices (e.g., network card, modem, etc.) that enable the computer device 512 to communicate with one or more other computing devices.
The processor 516 executes various functional applications and data processing by running programs stored in the system memory 528, for example implementing the binocular camera calibration method provided by the embodiments of the present invention, which includes:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining the geometric relationship among vectors formed by the feature points in the images;
and obtaining the coordinates of the feature points in a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship.
Of course, those skilled in the art can understand that the processor may also implement the technical solution of the binocular camera calibration method provided by any embodiment of the present invention.
EXAMPLE six
The sixth embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the binocular camera calibration method provided by the embodiments of the present invention, the method including:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining the geometric relationship among vectors formed by the feature points in the images;
and obtaining the coordinates of the feature points in a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship.
Of course, the computer program stored on the computer-readable storage medium provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the binocular camera calibration method provided by any embodiment of the present invention.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A binocular camera calibration method is characterized by comprising the following steps:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining a geometric relationship between vectors formed by feature points in the images;
and obtaining the coordinates of the feature points in a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship.
2. The method according to claim 1, wherein the cameras to be calibrated comprise a first camera and a second camera, and the acquiring images of the target object in the fields of view of the cameras to be calibrated and determining the geometric relationship between vectors formed by feature points in the images comprises:
acquiring a first image of the target object in the first camera field of view and a second image of the target object in the second camera field of view;
and determining the geometrical relation between vectors formed by the characteristic points in the first image and the second image.
3. The method according to claim 2, wherein the obtaining of the coordinates of the feature points in a target coordinate system according to the coordinate transformation matrix, and the determining of the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship comprise:
acquiring a pixel coordinate of each feature point in an image pixel coordinate system, and obtaining a mechanical arm coordinate of the feature point in a mechanical arm offset coordinate system according to the pixel coordinate and a transformation matrix corresponding to the feature point;
and obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinate of each characteristic point, the mechanical arm offset between the first image and the second image and the geometric relationship.
4. The method of claim 3, wherein determining the geometric relationship between vectors of feature points in the first image and the second image comprises:
taking the origin of a first mechanical arm offset coordinate system corresponding to the first image, a first mark point corresponding to the target object in the first image, the origin of a second mechanical arm offset coordinate system corresponding to the second image, and a second mark point corresponding to the target object in the second image as first feature points, and constructing a first geometric relationship among vectors formed by the first feature points;
and taking the origin of the first mechanical arm offset coordinate system, a first center point of the first image, the origin of the second mechanical arm offset coordinate system, and a second center point of the second image as second feature points, and constructing a second geometric relationship among vectors formed by the second feature points.
5. The method according to claim 4, wherein the taking the origin of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point corresponding to the target object in the first image, the origin of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point corresponding to the target object in the second image as the first feature points and constructing the first geometric relationship among the vectors formed by the first feature points comprises:
constructing a first vector based on the origin of the first mechanical arm offset coordinate system and the first mark point, constructing a second vector based on the first mark point and the second mark point, constructing a third vector based on the origin of the second mechanical arm offset coordinate system and the second mark point, and constructing a fourth vector based on the origin of the first mechanical arm offset coordinate system and the origin of the second mechanical arm offset coordinate system;
and obtaining a first solving equation of the fourth vector according to a closed quadrilateral vector algorithm.
6. The method of claim 5, wherein the taking the origin of the first mechanical arm offset coordinate system, the first center point of the first image, the origin of the second mechanical arm offset coordinate system, and the second center point of the second image as the second feature points and constructing the second geometric relationship among the vectors formed by the second feature points comprises:
constructing a fifth vector based on the origin of the first mechanical arm offset coordinate system and the first center point, constructing a sixth vector based on the first center point and the second center point, and constructing a seventh vector based on the origin of the second mechanical arm offset coordinate system and the second center point;
and obtaining a second solving equation of the sixth vector according to a closed quadrilateral vector algorithm.
7. The method of claim 6, wherein the obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinate of each of the feature points, the mechanical arm offset between the first image and the second image, and the geometric relationship comprises:
obtaining the fourth vector according to the mechanical arm coordinate of each first feature point, the mechanical arm offset between the first image and the second image, and the first solving equation;
obtaining a sixth vector according to the mechanical arm coordinate of each second feature point, the fourth vector and the second solution equation;
and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
8. A binocular camera calibration device is characterized by comprising:
the transformation matrix acquisition module is used for carrying out hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relation determining module is used for acquiring images of a target object in the visual fields of the cameras to be calibrated and determining the geometric relation among vectors formed by characteristic points in the images;
and the physical distance calculation module is used for obtaining the coordinates of the characteristic points in a target coordinate system according to the coordinate transformation matrix and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points in the target coordinate system and the geometric relationship.
9. A computer device, the device comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the binocular camera calibration method according to any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the binocular camera calibration method according to any one of claims 1 to 7.
CN202010235123.9A 2020-03-27 2020-03-27 Binocular camera calibration method, device, equipment and medium Active CN111445533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010235123.9A CN111445533B (en) 2020-03-27 2020-03-27 Binocular camera calibration method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN111445533A true CN111445533A (en) 2020-07-24
CN111445533B CN111445533B (en) 2023-08-01

Family

ID=71656107

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010235123.9A Active CN111445533B (en) 2020-03-27 2020-03-27 Binocular camera calibration method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN111445533B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0896267A2 (en) * 1997-08-04 1999-02-10 Fuji Jukogyo Kabushiki Kaisha Position recognizing system of autonomous running vehicle
CN101893425A (en) * 2010-07-09 2010-11-24 清华大学 Visual full-parameter wheel alignment detection system and method based on linear array images
US20190012807A1 (en) * 2017-07-04 2019-01-10 Baidu Online Network Technology (Beijing) Co., Ltd.. Three-dimensional posture estimating method and apparatus, device and computer storage medium
CN108562274A (en) * 2018-04-20 2018-09-21 南京邮电大学 A kind of noncooperative target pose measuring method based on marker
CN108734744A (en) * 2018-04-28 2018-11-02 国网山西省电力公司电力科学研究院 A kind of remote big field-of-view binocular scaling method based on total powerstation
WO2020027605A1 (en) * 2018-08-01 2020-02-06 한국원자력연구원 Image processing method and apparatus for stereoscopic images of nearby object in binocular camera system of parallel axis type
US20200082183A1 (en) * 2018-09-07 2020-03-12 Baidu Online Network Technology (Beijing) Co., Ltd. Method for position detection, device, and storage medium
CN109129445A (en) * 2018-09-29 2019-01-04 先临三维科技股份有限公司 Hand and eye calibrating method, scaling board, device, equipment and the storage medium of mechanical arm
CN109483516A (en) * 2018-10-16 2019-03-19 浙江大学 A kind of mechanical arm hand and eye calibrating method based on space length and epipolar-line constraint
CN109658460A (en) * 2018-12-11 2019-04-19 北京无线电测量研究所 A kind of mechanical arm tail end camera hand and eye calibrating method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GUO YAN XU; LI PENG CHEN: "Study on binocular stereo camera calibration method", pages 1-5 *
高飞; 葛一粟; 汪韬; 卢书芳; 张元鸣: "Research on a visual positioning model based on spatial plane constraints" (基于空间平面约束的视觉定位模型研究), no. 07, pages 1-8 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132904B (en) * 2020-08-26 2024-04-12 江苏迪盛智能科技有限公司 Method and system for determining position relationship between alignment camera and optical device
CN112132904A (en) * 2020-08-26 2020-12-25 江苏迪盛智能科技有限公司 Method and system for determining position relation between alignment camera and optical device
CN112634376A (en) * 2020-12-25 2021-04-09 深圳中科飞测科技股份有限公司 Calibration method and device, calibration equipment and storage medium
CN112634376B (en) * 2020-12-25 2024-06-04 深圳中科飞测科技股份有限公司 Calibration method and device, calibration equipment and storage medium
CN112700505A (en) * 2020-12-31 2021-04-23 山东大学 Binocular three-dimensional tracking-based hand-eye calibration method, equipment and storage medium
CN112929535A (en) * 2021-01-25 2021-06-08 北京中科慧眼科技有限公司 Binocular camera-based lens attitude correction method and system and intelligent terminal
CN112884847B (en) * 2021-03-02 2022-08-09 济南大学 Dual-camera calibration method and system
CN112884847A (en) * 2021-03-02 2021-06-01 济南大学 Dual-camera calibration method and system
WO2022259051A1 (en) * 2021-06-07 2022-12-15 Alcon Inc. Optical axis calibration of robotic camera system
CN113850873A (en) * 2021-09-24 2021-12-28 成都圭目机器人有限公司 Offset position calibration method of linear array camera under carrying platform positioning coordinate system
CN113850873B (en) * 2021-09-24 2024-06-07 成都圭目机器人有限公司 Offset position calibration method of linear array camera under carrying platform positioning coordinate system
CN114406985A (en) * 2021-10-18 2022-04-29 苏州迪凯尔医疗科技有限公司 Target tracking mechanical arm method, system, equipment and storage medium
CN114406985B (en) * 2021-10-18 2024-04-12 苏州迪凯尔医疗科技有限公司 Mechanical arm method, system, equipment and storage medium for target tracking
CN114677429A (en) * 2022-05-27 2022-06-28 深圳广成创新技术有限公司 Positioning method and device of manipulator, computer equipment and storage medium
CN115781665A (en) * 2022-11-01 2023-03-14 深圳史河机器人科技有限公司 Monocular camera-based mechanical arm control method and device and storage medium
CN115781665B (en) * 2022-11-01 2023-08-08 深圳史河机器人科技有限公司 Mechanical arm control method and device based on monocular camera and storage medium

Also Published As

Publication number Publication date
CN111445533B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN111445533B (en) Binocular camera calibration method, device, equipment and medium
CN106408612B (en) Machine vision system calibration
KR20210056964A (en) Method and apparatus for calibrating external parameters of image acquisition device, device and storage medium
CN110689579A (en) Rapid monocular vision pose measurement method and measurement system based on cooperative target
Stahs et al. Fast and robust range data acquisition in a low-cost environment
CN111579561B (en) Position point compensation method, device, equipment and storage medium
CN113012226A (en) Camera pose estimation method and device, electronic equipment and computer storage medium
CN113034612A (en) Calibration device and method and depth camera
CN115552486A (en) System and method for characterizing an object pose detection and measurement system
WO2010013289A1 (en) Camera calibration image creation apparatus and camera calibration image creation program
CN115713563A (en) Camera calibration method and device, electronic equipment and storage medium
CN114049401A (en) Binocular camera calibration method, device, equipment and medium
Dornaika et al. Robust Camera Calibration using 2D-to-3D feature correspondences
CN117173254A (en) Camera calibration method, system, device and electronic equipment
CN112902961A (en) Calibration method, medium, calibration equipment and system based on machine vision positioning
CN116038720A (en) Hand-eye calibration method, device and equipment based on point cloud registration
Steger Algorithms for the orthographic-n-point problem
CN115953478A (en) Camera parameter calibration method and device, electronic equipment and readable storage medium
CN115965697A (en) Projector calibration method, calibration system and device based on Samm's law
CN115042184A (en) Robot hand-eye coordinate conversion method and device, computer equipment and storage medium
CN114926542A (en) Mixed reality fixed reference system calibration method based on optical positioning system
CN113781581A (en) Depth of field distortion model calibration method based on target loose attitude constraint
CN116100564B (en) High-precision calibration method and device for calibrating manipulator
Ribeiro et al. Photogrammetric multi-camera calibration using an industrial programmable robotic arm
Weilong et al. On camera calibration and distortion correction based on bundle adjustment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant