CN111445533B - Binocular camera calibration method, device, equipment and medium - Google Patents

Binocular camera calibration method, device, equipment and medium

Info

Publication number: CN111445533B (application CN202010235123.9A)
Authority: CN (China)
Prior art keywords: mechanical arm, coordinate system, camera, image, point
Legal status: Active
Application number: CN202010235123.9A
Other languages: Chinese (zh)
Other versions: CN111445533A
Inventors: 吴贵龙, 刘玉平, 丁智辉, 蒋涛江, 黄伟
Current Assignee: Guangdong Bozhilin Robot Co Ltd
Original Assignee: Guangdong Bozhilin Robot Co Ltd
Application filed by Guangdong Bozhilin Robot Co Ltd
Priority to CN202010235123.9A
Publication of CN111445533A
Application granted
Publication of CN111445533B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the invention discloses a binocular camera calibration method, device, equipment and medium. The binocular camera calibration method comprises the following steps: performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated; acquiring images of a target object in the field of view of each camera to be calibrated, and determining the geometric relationships among vectors formed by the feature points in the images; and obtaining the coordinates of the feature points in the target coordinate system according to the coordinate transformation matrices, and determining the physical distance between the cameras to be calibrated according to those coordinates and the geometric relationships. According to the binocular camera calibration method provided by the embodiment of the invention, the coordinates of the feature points in the target coordinate system are obtained from the coordinate transformation matrix corresponding to each camera to be calibrated, and the physical distance between the cameras to be calibrated is obtained by combining the geometric relationships between vectors formed by the feature points, so that the deviations and errors caused by manual installation and measurement are eliminated and the accuracy of binocular camera calibration is improved.

Description

Binocular camera calibration method, device, equipment and medium
Technical Field
The embodiment of the invention relates to the technical field of robots, in particular to a binocular camera calibration method, a binocular camera calibration device, binocular camera calibration equipment and a binocular camera calibration medium.
Background
With the continuous development of industrial automation and intelligence, cameras and image technology are increasingly applied in the robot field. Combining a mechanical arm with cameras gives a robot "hands and eyes" and enables it to finish tasks more efficiently and with higher quality. Currently, many robots are equipped with binocular or multi-view cameras, and the physical positional relationship between the cameras is usually given by measurement during installation. However, it is difficult to give an accurate positional relationship because of factors such as installation errors and measurement errors. How to accurately calibrate a binocular camera is therefore a technical problem to be solved.
Disclosure of Invention
The embodiment of the invention provides a binocular camera calibration method, device, equipment and medium, which are used for solving the problems of deviation and error caused by manual installation and measurement and improving the accuracy of binocular camera calibration.
In a first aspect, an embodiment of the present invention provides a binocular camera calibration method, including:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the fields of view of cameras to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images;
And obtaining coordinates of the feature points under the target coordinate system according to the coordinate transformation matrix, and determining physical distances between cameras to be calibrated according to the coordinates and geometric relations of the feature points under the target coordinate system.
In a second aspect, an embodiment of the present invention further provides a binocular camera calibration apparatus, including:
the transformation matrix acquisition module is used for performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relation determining module is used for acquiring images of the target object in the vision field of each camera to be calibrated and determining geometric relations among vectors formed by each characteristic point in the images;
and the physical distance calculation module is used for obtaining the coordinates of the characteristic points under the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between cameras to be calibrated according to the coordinates and the geometric relationship of the characteristic points under the target coordinate system.
In a third aspect, an embodiment of the present invention further provides a computer apparatus, including:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the binocular camera calibration method as provided by any of the embodiments of the present invention.
In a fourth aspect, embodiments of the present invention further provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a binocular camera calibration method as provided by any of the embodiments of the present invention.
According to the embodiment of the invention, the camera to be calibrated is subjected to hand-eye calibration, so that the coordinate transformation matrix corresponding to each camera to be calibrated is obtained; acquiring images of a target object in the fields of view of cameras to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images; and obtaining the coordinates of the characteristic points under the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between cameras to be calibrated according to the coordinates and the geometric relationship of the characteristic points under the target coordinate system, so that the deviation and the error caused by manual installation and measurement are solved, and the accuracy of the binocular camera calibration is improved.
Drawings
FIG. 1a is a flowchart of a binocular camera calibration method according to an embodiment of the present invention;
FIG. 1b is a schematic diagram of a camera setup to be calibrated according to an embodiment of the present invention;
FIG. 1c is a schematic diagram of a coordinate system according to an embodiment of the present invention;
FIG. 1d is a schematic illustration of a nine-point calibration according to a first embodiment of the present invention;
Fig. 1e is a schematic diagram of a photographing process of a camera to be calibrated according to an embodiment of the present invention;
FIG. 1f is a schematic diagram of a coordinate transformation of an offset of a mechanical arm according to an embodiment of the present invention;
FIG. 2a is a flowchart of a binocular camera calibration method according to a second embodiment of the present invention;
FIG. 2b is a schematic diagram illustrating a first geometric relationship determination according to a second embodiment of the present invention;
FIG. 2c is a schematic diagram illustrating a second geometric relationship determination according to a second embodiment of the present invention;
FIG. 3a is a flowchart of a binocular camera calibration method according to a third embodiment of the present invention;
FIG. 3b is a schematic diagram of a coordinate system according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of a binocular camera calibration apparatus according to a fourth embodiment of the present invention;
fig. 5 is a schematic structural diagram of a computer device according to a fifth embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof. It should be further noted that, for convenience of description, only some, but not all of the structures related to the present invention are shown in the drawings.
Example 1
Fig. 1a is a flowchart of a binocular camera calibration method according to an embodiment of the present invention. This embodiment is applicable to the calibration of a binocular or multi-view camera. The method may be performed by a binocular camera calibration apparatus, which may be implemented in software and/or hardware, e.g., configured in a computer device. As shown in fig. 1a, the method comprises:
s110, performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated.
In this embodiment, the physical distance between the cameras to be calibrated is determined from the movement distance of the mechanical arm. The movement distance of the mechanical arm is in turn determined from the offset of the mechanical arm when the target object is captured in the field of view of each camera to be calibrated, together with the mechanical arm offset coordinates of the same point on the target object under each camera to be calibrated.
In the above process, in order to obtain the mechanical arm offset coordinates of the same point on the target object under each camera to be calibrated, a mechanical arm coordinate system, namely the mechanical arm offset coordinate system, is established for each camera to be calibrated. The mechanical arm offset coordinates of the same point on the target object under each camera to be calibrated are then obtained from the mapping relationship between the pixel coordinate system of each camera to be calibrated and its mechanical arm offset coordinate system, so as to obtain the relative position between the cameras to be calibrated.
Optionally, the mapping relationship between the pixel coordinate system of the camera to be calibrated and the mechanical arm offset coordinate system can be established by performing hand-eye calibration on the camera to be calibrated.
In one embodiment, performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated, including: and respectively carrying out nine-point calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated. Optionally, for each camera to be calibrated, calibrating the camera to be calibrated by using a nine-point calibration method to obtain a coordinate transformation matrix between a pixel coordinate system of the camera to be calibrated and a mechanical arm offset coordinate system. Taking the example that the camera to be calibrated comprises a first camera and a second camera, performing nine-point calibration on the first camera and the second camera respectively to obtain a coordinate transformation matrix from a pixel coordinate system of the first camera to a mechanical arm offset coordinate system and a coordinate transformation matrix from the pixel coordinate system of the second camera to the mechanical arm offset coordinate system.
It should be noted that when each camera to be calibrated is subjected to hand-eye calibration, the cameras to be calibrated do not need to be parallel. That is, when the cameras to be calibrated are mounted on a support driven by the mechanical arm, their positive directions are not required to be parallel; however, the height of each camera to be calibrated must remain unchanged throughout the hand-eye calibration, which guarantees that the mechanical arm offset coordinate systems corresponding to the cameras to be calibrated lie in the same plane. Moreover, because the cameras only translate, the mechanical arm offset coordinate systems corresponding to the cameras to be calibrated are parallel to one another. Any non-parallelism of the cameras' positive directions at installation is absorbed into the transformation matrices, so the non-parallelism between the cameras to be calibrated is eliminated through the transformation matrices. Optionally, the cameras to be calibrated are mounted on a bracket driven by the mechanical arm and rigidly connected to each other, so that their relative positions do not change. Taking a binocular camera as an example, when the binocular camera is mounted on the bracket driven by the mechanical arm, the camera directions should be consistent, but the cameras are not required to be completely parallel. The bracket can translate in a set direction under the drive of the mechanical arm. If, while translating in the set direction, the calibration heights of the two cameras remain unchanged and the bracket only translates horizontally without rotating, the vectors to be solved can conveniently be calculated with the closed quadrilateral principle in the subsequent operations; if the bracket also rotates while translating, the unknowns must be solved by combining the closed quadrilateral principle with a rotation transformation in the subsequent calculation.
Fig. 1b is a schematic diagram of a camera setup to be calibrated according to an embodiment of the present invention. As shown in fig. 1b, the left camera and the right camera of the binocular camera are rigidly connected and located on the two sides of the mechanical arm. Fig. 1c is a schematic diagram of a coordinate system according to an embodiment of the present invention. As shown in fig. 1c, there are a pixel coordinate system x_l-o_l-y_l corresponding to the left camera, a pixel coordinate system x_r-o_r-y_r corresponding to the right camera, a mechanical arm offset coordinate system X_L-O_L-Y_L corresponding to the left camera, a mechanical arm offset coordinate system X_R-O_R-Y_R corresponding to the right camera, and a mechanical arm coordinate system X-O-Y. In fig. 1c the left camera is not parallel to the right camera, but the cameras do not rotate and only translate during the hand-eye calibration, so the mechanical arm offset coordinate system X_L-O_L-Y_L of the left camera and the mechanical arm offset coordinate system X_R-O_R-Y_R of the right camera contain no rotation; and since the heights of the left and right cameras do not change during the hand-eye calibration, X_L-O_L-Y_L and X_R-O_R-Y_R lie in the same plane. That is, because the mechanical arm offset coordinate system X_L-O_L-Y_L of the left camera and the mechanical arm offset coordinate system X_R-O_R-Y_R of the right camera lie in the same plane and contain no rotation, they can be regarded as parallel to the mechanical arm coordinate system X-O-Y.
Optionally, the left camera and the right camera are each hand-eye calibrated to obtain their respective coordinate transformation matrices. Taking the left camera as an example, the hand-eye calibration process constructs the relationship between the image pixel coordinate system of the left camera and the mechanical arm offset coordinate system of the left camera, i.e. the relationship between x_l-o_l-y_l and X_L-O_L-Y_L. The relationship can be expressed by the following equation:

    [X]   [a11  a12  a13]   [x]
    [Y] = [a21  a22  a23] * [y]      (1)
                            [1]

where (x, y) is the pixel coordinate of a point in the image pixel coordinate system of the left camera, and (X, Y) is the coordinate corresponding to the point (x, y) in the mechanical arm offset coordinate system; optionally, the unit of (X, Y) may be millimeters.
When a camera is calibrated at nine points, the first of the nine positions can be taken as the origin of that camera's mechanical arm offset coordinate system. Since the calibration object is not guaranteed to be at the same position for the first photograph when nine-point calibration is performed on different cameras, the pixel points corresponding to the origins of the mechanical arm offset coordinate systems of different cameras do not coincide. However, this does not affect the solving of the physical distance between the cameras: as long as the offset between the origins of the mechanical arm offset coordinate systems is obtained, the physical distance between the two cameras can be obtained from that offset. In subsequent applications such as measuring distances and angles with the binocular camera, the mechanical arm coordinates corresponding to the pixel coordinates of two points seen by the two cameras are obtained and the calibrated offset between the origins is added, so that the real physical distance between the two points can be obtained.
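The following is a minimal sketch (not taken from the patent text) of that ranging application: once the offset between the origins of the two mechanical arm offset coordinate systems is known, the real physical distance between a point seen by the left camera and a point seen by the right camera can be computed. The function and variable names are illustrative assumptions.

```python
import numpy as np

def true_distance(p_left_px, p_right_px, A_left, A_right, o_l_to_o_r):
    """A_left/A_right: 2x3 pixel-to-arm-offset matrices; o_l_to_o_r: vector O_L -> O_R (mm)."""
    p_left_mm = A_left @ np.array([p_left_px[0], p_left_px[1], 1.0])     # formula (1)
    p_right_mm = A_right @ np.array([p_right_px[0], p_right_px[1], 1.0])
    # express the right-camera point in the left camera's arm offset frame
    p_right_in_left = np.asarray(o_l_to_o_r, dtype=float) + p_right_mm
    return float(np.linalg.norm(p_right_in_left - p_left_mm))
```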
In order to improve the precision of the nine-point calibration, the mechanical arm offsets of the nine points can be chosen reasonably, so that the calibration object appears uniformly in the camera's field of view. Fig. 1d is a schematic diagram of nine-point calibration according to the first embodiment of the present invention; as shown in fig. 1d, the calibration object appears uniformly in the field of view of the camera. After the nine-point calibration process, the mechanical arm offset coordinates of the 9 points are obtained and 9 pictures are taken, and the pixel coordinates of specific corner points on the calibration plate can be obtained by extracting the corners of the calibration object. There are 6 unknowns in equation (1) and each point gives two equations, so 3 sets of equations are needed to solve the transformation matrix. Assuming the 1st, 2nd and 3rd points in fig. 1d are selected, with mechanical arm offset coordinates (X_1, Y_1), (X_2, Y_2), (X_3, Y_3) and pixel coordinates (x_1, y_1), (x_2, y_2), (x_3, y_3), the system of equations can be expressed as:

    X_i = a11*x_i + a12*y_i + a13
    Y_i = a21*x_i + a22*y_i + a23,   i = 1, 2, 3      (2)
Writing the equation set (2) in homogeneous form gives:

    [a11  a12  a13]   [x_1  x_2  x_3]   [X_1  X_2  X_3]
    [a21  a22  a23] * [y_1  y_2  y_3] = [Y_1  Y_2  Y_3]      (3)
    [ 0    0    1 ]   [ 1    1    1 ]   [ 1    1    1 ]
equation set (3) may be abbreviated as:
A*m=M (4)
wherein A is the coordinate transformation matrix, which is the unknown quantity; m is a 3x3 matrix whose columns are the 3 pixel coordinate points; M is a 3x3 matrix whose columns are the 3 mechanical arm offset coordinate points. The coordinate transformation matrix A can be obtained from equation (4). Optionally, A can be found in Matlab with the '/' operator, A = M/m; it can also be solved in OpenCV by multiplying M by the inverse of m, i.e. A = M*m^(-1). Alternatively, the coordinate transformation matrix may be determined by other methods, such as intrinsic calibration or an overdetermined system of equations.
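As an illustration of formula (4), the sketch below solves the transformation matrix with NumPy; the three point pairs are placeholder values rather than calibration data, and a least-squares variant over all nine points is indicated in a comment.

```python
import numpy as np

# Placeholder pixel corners and the corresponding recorded arm offsets (mm);
# real values come from the nine-point calibration described above.
pix = np.array([[100.0, 400.0, 250.0],   # x_1, x_2, x_3
                [200.0, 210.0, 420.0],   # y_1, y_2, y_3
                [  1.0,   1.0,   1.0]])  # homogeneous row -> matrix m
arm = np.array([[ 10.0,  40.0,  25.0],   # X_1, X_2, X_3
                [  5.0,   5.5,  26.0],   # Y_1, Y_2, Y_3
                [  1.0,   1.0,   1.0]])  # matrix M

A = arm @ np.linalg.inv(pix)             # formula (4): A = M * m^(-1)
# With all nine points, an overdetermined least-squares fit is more robust:
# AT, *_ = np.linalg.lstsq(pix9.T, arm9.T, rcond=None); A9 = AT.T
```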
In this embodiment, through the nine-point calibration of the camera, the precision of the coordinate transformation can be improved and errors caused by distortion can be reduced by selecting different transformation matrices according to the different image regions in which the pixel points are located. A checkerboard can be selected as the calibration object for the nine-point calibration: with a checkerboard, the pixel coordinates of the corner points can be solved accurately through image processing, so the coordinate transformation matrix obtained is more accurate. In addition, averaging over a plurality of points on the checkerboard yields a more accurate offset in the mechanical arm coordinate system.
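The patent only states that the corner pixel coordinates are obtained through image processing; the snippet below is one common way to do it with OpenCV, where the 9x6 inner-corner pattern size and the image file name are purely illustrative assumptions.

```python
import cv2

img = cv2.imread("nine_point_shot.png", cv2.IMREAD_GRAYSCALE)
found, corners = cv2.findChessboardCorners(img, (9, 6))
if found:
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
    # averaging several corners, as suggested above, reduces noise in the pixel coordinate
    pixel_xy = corners.reshape(-1, 2).mean(axis=0)
```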
S120, obtaining images of the target object in the field of view of each camera to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images.
In this embodiment, after the coordinate transformation matrix corresponding to each camera to be calibrated is obtained, a calibration object is selected as the target object, the height of the mechanical arm is kept unchanged, and the mechanical arm is moved to acquire an image of the target object in the field of view of each camera to be calibrated; feature points are then selected in the images, and the geometric relationships between the vectors formed by the feature points in the images are determined. Optionally, the calibration object used as the target object may be a checkerboard. Because each camera to be calibrated does not rotate while the images are acquired and only translates, the mechanical arm offset coordinate system of each camera contains no rotation; and because the height of each camera to be calibrated does not change during calibration, the mechanical arm offset coordinate systems of the cameras to be calibrated and the mechanical arm coordinate system all lie in the same plane. Therefore the geometric relationships between the vectors formed by the feature points in the images acquired by the cameras to be calibrated can be determined by the closed quadrilateral method.
In one embodiment, the target object is placed at a set position and fixed; the mechanical arm is then moved so that the target object appears in the field of view of each camera to be calibrated in turn, images of the target object in the field of view of each camera to be calibrated are acquired, and the offset of the mechanical arm when each image is taken is recorded. The position of the target object must ensure that the target object appears completely in the field of view of each camera to be calibrated when that camera moves above it. Therefore the target object is generally placed centrally with respect to the cameras to be calibrated; when there are 2 cameras to be calibrated, the target object may be placed midway between the left camera and the right camera, as shown in fig. 1b. Fig. 1e is a schematic diagram of the photographing process of the cameras to be calibrated according to an embodiment of the present invention, taking a binocular camera as an example. As shown in fig. 1e, the mechanical arm is moved so that the target object appears in the field of view of the left camera and then of the right camera, two pictures are taken, and the offset of the mechanical arm movement is recorded.
And S130, obtaining coordinates of the feature points under the target coordinate system according to the coordinate transformation matrix, and determining the position relationship between cameras to be calibrated according to the coordinates and the geometric relationship of the feature points under the target coordinate system.
In this embodiment, for each feature point, the coordinates of each feature point in the pixel coordinate system are determined according to the position of the feature point in the image, the pixel coordinates of the feature point are transformed according to the coordinate transformation matrix corresponding to the image to which the feature point belongs, the mechanical arm offset coordinates of the feature point in the mechanical arm offset coordinate system are obtained, and the positional relationship between the cameras to be calibrated is calculated according to the mechanical arm offset coordinates of each feature point in the mechanical arm offset coordinate system and the geometric relationship determined in the above steps.
The positional relationship between the cameras to be calibrated may be a physical distance between the cameras to be calibrated. Specifically, the physical distance between the cameras to be calibrated may be the physical distance between the feature points of the cameras to be calibrated. The feature point of the camera to be calibrated may be a center point of the camera to be calibrated or an upper left corner position of the camera to be calibrated. Optionally, determining the physical distance between the cameras to be calibrated may be that the pixel coordinates of the corresponding feature points are first extracted from the image, then converted into the mechanical arm coordinate system of each camera to be calibrated, and then the physical distance between the cameras to be calibrated can be determined by combining the relationship between the coordinate origins of the mechanical arm coordinate systems of each camera to be calibrated and the mechanical arm offset determined in the above process.
Fig. 1f is a schematic diagram of the coordinate transformation of a mechanical arm offset according to an embodiment of the invention. As shown in fig. 1f, X'-O'-Y' is the mechanical arm offset coordinate system and x'-o'-y' is the pixel coordinate system; P(X'_1, Y'_1) is the coordinate of the 1st corner of the target object in the mechanical arm offset coordinate system, and p(x'_1, y'_1) is the coordinate of the 1st corner of the target object in the pixel coordinate system. Converting from p(x'_1, y'_1) to P(X'_1, Y'_1) can be done with the following formula:

    [X'_1]        [x'_1]
    [Y'_1] = A' * [y'_1]      (5)
                  [ 1  ]

wherein A' is the 2x3 coordinate transformation matrix corresponding to the camera.
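A minimal sketch of applying formula (5) follows; the entries of the 2x3 matrix A' are placeholders, not real calibration results.

```python
import numpy as np

A_prime = np.array([[0.05, 0.001, -12.0],    # placeholder 2x3 transformation matrix
                    [0.002, 0.05,  -8.5]])
x1, y1 = 640.0, 480.0                        # pixel coordinates of the 1st corner
X1, Y1 = A_prime @ np.array([x1, y1, 1.0])   # arm offset coordinates (mm), formula (5)
```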
In the present embodiment, the order of transforming the coordinates of the feature points and determining the geometric relationships between the vectors formed by the feature points is not limited. Optionally, the feature points may first be coordinate-transformed to obtain their mechanical arm offset coordinates in the mechanical arm offset coordinate system, and the geometric relationships between the vectors formed by the feature points determined afterwards; the geometric relationships may be determined first and the coordinate transformation performed afterwards; or the two steps may be performed at the same time.
In one embodiment, determining the geometric relationships between the vectors formed by the feature points in the images includes: determining the relative relationship between the origins of the mechanical arm coordinate systems of the cameras to be calibrated according to the offset of the mechanical arm movement and the coordinates of the feature points of the calibration object in the mechanical arm coordinate systems of the cameras to be calibrated.
In the above process, a relationship between the offset of the mechanical arm movement and the mechanical arm coordinate system corresponding to each camera to be calibrated needs to be established. First, during image acquisition each camera to be calibrated translates without rotating, so the mechanical arm coordinate systems corresponding to the cameras to be calibrated lie in the same plane. Second, each camera to be calibrated translates within the mechanical arm coordinate system, so the translation offset also lies in the plane of that coordinate system. Finally, because the target object is fixed, the translation offset between the first camera to be calibrated and the second camera to be calibrated is equivalent to the movement of the target object from the first camera to the second camera, i.e. the offset of the same point on the target object in the mechanical arm coordinate plane. Combining these sources of mechanical arm movement offsets with the coordinates of the feature points of the target object in the mechanical arm coordinate systems of the cameras to be calibrated, and with the vector between the origins of the mechanical arm coordinate systems, forms a quadrilateral in one plane, so the relationship between the mechanical arm coordinate systems, i.e. the relationship between the mechanical arm origins, can be solved through vector operations on this quadrilateral. Once the relationship between the origins is established, the physical distance between any points of the cameras on the two sides can subsequently be calculated.
Taking a binocular camera as an example, the calibration process of the relation between the mechanical arm coordinate system origins of the camera to be calibrated can be as follows: (1) The mechanical arm outputs the offset of the camera movement when photographing twice, and it is understood that the offset is the same for the two cameras; (2) According to the pixel coordinates of the characteristic points of each camera and the transformation matrix, solving the coordinates of the characteristic points under the mechanical arm coordinate system; (3) And calculating the offset relation between the origins of the coordinate systems of the two mechanical arms according to the closed quadrilateral vector relation.
The present embodiment calculates the distance between the two cameras from the movement distance of the mechanical arm. The movement distance of the mechanical arm mainly comprises two parts: the movement of the mechanical arm between the two photographings, and the mechanical arm offset coordinates of the same point on the target object under the different cameras. The former is determined mainly by the accuracy of the mechanical arm, and the latter by the mechanical arm and the accuracy of the conversion from pixel coordinates to mechanical arm coordinates. Therefore, as long as the motion accuracy of the mechanical arm and the accuracy of the coordinate transformation are guaranteed, the accuracy of the obtained physical positional relationship of the cameras is also guaranteed, which improves the calibration precision of the binocular camera. In addition, in the calibration method provided by the embodiment of the invention, the information about the camera orientation is retained in the calibration parameters of the hand-eye calibration (namely the coordinate transformation matrices), and the mechanical arm offset coordinate systems are related by pure translation, so the cameras do not need to be accurately level and parallel, which simplifies the calibration. It should be noted that the mechanical arm may not be strictly horizontal during translation. Taking a binocular camera as an example, the two cameras may not lie in one horizontal plane when acquiring the images, and the physical distance of the two cameras calculated according to this embodiment is only the physical distance in the horizontal plane. However, since the height information of the cameras is reflected in each camera's transformation matrix, the height difference between the two cameras does not affect the calibration of the physical distance in the horizontal plane; similarly, the height difference does not affect the distance measurement of the binocular camera, so it can be ignored when the binocular camera is calibrated. In summary: (1) what is calibrated in the embodiments of the present invention is the physical distance in the horizontal plane of the two cameras; (2) the height difference between the two cameras in the vertical direction has no influence on the distance measurement of the binocular camera and need not be calibrated.
According to the embodiment of the invention, the camera to be calibrated is subjected to hand-eye calibration, so that the coordinate transformation matrix corresponding to each camera to be calibrated is obtained; acquiring images of a target object in the fields of view of cameras to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images; and obtaining the coordinates of the characteristic points under the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between cameras to be calibrated according to the coordinates and the geometric relationship of the characteristic points under the target coordinate system, so that the deviation and the error caused by manual installation and measurement are solved, and the accuracy of the binocular camera calibration is improved.
Example two
Fig. 2a is a flowchart of a binocular camera calibration method according to a second embodiment of the present invention. This embodiment is further optimized on the basis of the above-described embodiments. As shown in fig. 2a, the method comprises:
s210, performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated, wherein the cameras to be calibrated comprise a first camera and a second camera.
S220, acquiring a first image of the target object in a first camera view and a second image of the target object in a second camera view.
In this embodiment, the target object may be a checkerboard calibration object, the checkerboard calibration object is placed at the center positions of the first camera and the second camera, the mechanical arm is moved to obtain a first image of the checkerboard calibration object in the field of view of the first camera and a second image of the checkerboard calibration object in the second camera, and the offset of the mechanical arm from when the first image is captured to when the second image is captured is recorded.
S230, determining geometric relations among vectors formed by the feature points in the first image and the second image.
In this embodiment, feature points may be selected from the first image and the second image, two closed quadrilaterals may be constructed based on the selected feature points, and a geometric relationship between vectors composed of the feature points may be determined according to the constructed closed quadrilaterals.
In one embodiment of the present invention, determining the geometric relationships between the vectors formed by the feature points in the first image and the second image includes: taking the origin of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point corresponding to the target object in the first image, the origin of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point corresponding to the target object in the second image as the first feature points, and constructing a first geometric relationship between the vectors formed by the first feature points; and taking the origin of the first mechanical arm offset coordinate system, the first center point of the first image, the origin of the second mechanical arm offset coordinate system, and the second center point of the second image as the second feature points, and constructing a second geometric relationship between the vectors formed by the second feature points.
Fig. 2b is a schematic diagram of the determination of the first geometric relationship according to the second embodiment of the present invention. In the figure, the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point P_1L corresponding to the target object in the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point P_1R corresponding to the target object in the second image are taken as the first feature points, and a first geometric relationship between the vectors formed by the first feature points is constructed.
Fig. 2c is a schematic diagram of the determination of the second geometric relationship according to the second embodiment of the present invention. In the figure, the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first center point P_2L of the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second center point P_2R of the second image are taken as the second feature points, and a second geometric relationship between the vectors formed by the second feature points is constructed.
On the basis of the above scheme, constructing the first geometric relationship between the vectors formed by the first feature points includes: constructing a first vector from the origin of the first mechanical arm offset coordinate system to the first mark point, a second vector from the first mark point to the second mark point, a third vector from the origin of the second mechanical arm offset coordinate system to the second mark point, and a fourth vector from the origin of the first mechanical arm offset coordinate system to the origin of the second mechanical arm offset coordinate system; and obtaining a first solving equation for the fourth vector according to the closed quadrilateral vector algorithm.
Specifically, as shown in fig. 2b, with the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first mark point P_1L corresponding to the target object in the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second mark point P_1R corresponding to the target object in the second image, the closed quadrilateral is O_L P_1L P_1R O_R. The first vector vec(O_L P_1L), the second vector vec(P_1L P_1R), the third vector vec(O_R P_1R) and the fourth vector vec(O_L O_R) are constructed, and according to the closed quadrilateral algorithm the first solving equation for the fourth vector vec(O_L O_R) is

    vec(O_L O_R) = vec(O_L P_1L) + vec(P_1L P_1R) - vec(O_R P_1R)      (6)

vec(O_L P_1L) is the coordinate of point P_1L in the first mechanical arm offset coordinate system, and vec(O_R P_1R) is the coordinate of point P_1R in the second mechanical arm offset coordinate system; both coordinates are obtained by formula (1) or (5). The vector vec(P_1L P_1R) is the offset of the mechanical arm between the two photographings; it is either output by the mechanical arm, or obtained by recording the mechanical arm position at each photographing and taking the difference.
On the basis of the above scheme, constructing the second geometric relationship between the vectors formed by the second feature points includes: constructing a fifth vector from the origin of the first mechanical arm offset coordinate system to the first center point, a sixth vector from the first center point to the second center point, and a seventh vector from the origin of the second mechanical arm offset coordinate system to the second center point; and obtaining a second solving equation for the sixth vector according to the closed quadrilateral vector algorithm.
Specifically, as shown in fig. 2c, with the origin O_L of the first mechanical arm offset coordinate system corresponding to the first image, the first center point P_2L of the first image, the origin O_R of the second mechanical arm offset coordinate system corresponding to the second image, and the second center point P_2R of the second image, the closed quadrilateral is O_L P_2L P_2R O_R. The fifth vector vec(O_L P_2L), the sixth vector vec(P_2L P_2R) and the seventh vector vec(O_R P_2R) are constructed, and according to the closed quadrilateral algorithm the second solving equation for the sixth vector vec(P_2L P_2R) is

    vec(P_2L P_2R) = -vec(O_L P_2L) + vec(O_L O_R) + vec(O_R P_2R)      (7)
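A minimal sketch of the first solving equation (6) follows, under the assumption used below that O_L and O_R are each (0, 0) in their own coordinate systems; the function and variable names are illustrative.

```python
import numpy as np

def origin_offset(p1l_mm, p1r_mm, arm_shift_mm):
    """Fourth vector O_L -> O_R (mm), from the closed quadrilateral O_L P_1L P_1R O_R."""
    v1 = np.asarray(p1l_mm, dtype=float)        # first vector  O_L -> P_1L
    v2 = np.asarray(arm_shift_mm, dtype=float)  # second vector P_1L -> P_1R (arm offset between shots)
    v3 = np.asarray(p1r_mm, dtype=float)        # third vector  O_R -> P_1R
    return v1 + v2 - v3                         # formula (6)
```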
S240, for each feature point, acquiring the pixel coordinates of the feature point in the image pixel coordinate system, and obtaining the mechanical arm coordinates of the feature point in the mechanical arm offset coordinate system according to the pixel coordinates and the transformation matrix corresponding to the feature point.
Taking fig. 2b and 2c as examples: the pixel coordinates of the first mark point P_1L and the first center point P_2L in the pixel coordinate system x_l-o_l-y_l of the first image, and the pixel coordinates of the second mark point P_1R and the second center point P_2R in the pixel coordinate system x_r-o_r-y_r of the second image, are obtained. The coordinate transformation matrix corresponding to the first image pixel coordinate system is then applied to the pixel coordinates of P_1L and P_2L to obtain their mechanical arm offset coordinates in the first mechanical arm offset coordinate system X_L-O_L-Y_L, and the coordinate transformation matrix corresponding to the second image pixel coordinate system is applied to the pixel coordinates of P_1R and P_2R to obtain their mechanical arm offset coordinates in the second mechanical arm offset coordinate system X_R-O_R-Y_R. It should be noted that the pixel coordinates of the origin of each camera's coordinate system cannot be obtained in the pixel coordinate system, but this does not affect the solving of the relative positional relationship between the origins of the mechanical arm offset coordinate systems. Specifically, assuming the origin O_L of the first mechanical arm offset coordinate system is taken as the coordinate origin, the offset relationship between the two coordinate systems is the coordinate of the origin O_R of the second mechanical arm offset coordinate system expressed in the first mechanical arm offset coordinate system. Alternatively, the coordinates of the origin O_L of the first mechanical arm offset coordinate system and the origin O_R of the second mechanical arm offset coordinate system in their own coordinate systems may be considered known, namely (0, 0).
S250, obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinates of the feature points, the mechanical arm offset between the first image and the second image and the geometric relationship.
In this embodiment, after the mechanical arm coordinates of the feature points are obtained, the known vectors in the geometric relationships can be evaluated, and the physical distance between the first camera and the second camera can be calculated from them.
In one embodiment of the present invention, obtaining the physical distance between the first camera and the second camera according to the arm coordinates of each feature point, the arm offset between the first image and the second image, and the geometric relationship includes: obtaining a fourth vector according to the mechanical arm coordinates of each first characteristic point, the mechanical arm offset between the first image and the second image and the first solving equation; obtaining a sixth vector according to the mechanical arm coordinates of each second characteristic point, the fourth vector and the second solving equation; and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
Taking fig. 2b and fig. 2c as an example: the mechanical arm offset coordinates of the origin O_L of the first mechanical arm offset coordinate system, the first mark point P_1L and the first center point P_2L in the first mechanical arm offset coordinate system X_L-O_L-Y_L are obtained, as are the mechanical arm offset coordinates of the origin O_R of the second mechanical arm offset coordinate system, the second mark point P_1R and the second center point P_2R in the second mechanical arm offset coordinate system X_R-O_R-Y_R; the coordinates of O_L and O_R in their own coordinate systems are set to (0, 0). The mechanical arm offset coordinate of the first mark point P_1L in X_L-O_L-Y_L is taken as the first vector vec(O_L P_1L), the mechanical arm offset coordinate of the second mark point P_1R in X_R-O_R-Y_R is taken as the third vector vec(O_R P_1R), and the second vector vec(P_1L P_1R) is obtained from the offset of the mechanical arm movement between the two photographings. The fourth vector vec(O_L O_R) is then found from the first solving equation (6) using the first, second and third vectors. Next, the fifth vector vec(O_L P_2L) is obtained from the mechanical arm offset coordinate of the first center point P_2L in X_L-O_L-Y_L, and the seventh vector vec(O_R P_2R) is obtained from the mechanical arm offset coordinate of the second center point P_2R in X_R-O_R-Y_R. Continuing with the second solving equation (7), the sixth vector vec(P_2L P_2R) is obtained from the fourth, fifth and seventh vectors, which gives the positional relationship between the first camera center point and the second camera center point. Finally, the physical distance between the first camera center point and the second camera center point is obtained from the sixth vector, completing the calibration of the binocular camera.
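A sketch of the second solving equation (7) and the final distance, continuing the assumptions of the previous snippet:

```python
import numpy as np

def centre_distance(p2l_mm, p2r_mm, o_l_to_o_r):
    """Sixth vector P_2L -> P_2R and its length (mm), from the quadrilateral O_L P_2L P_2R O_R."""
    v5 = np.asarray(p2l_mm, dtype=float)                   # fifth vector   O_L -> P_2L
    v7 = np.asarray(p2r_mm, dtype=float)                   # seventh vector O_R -> P_2R
    v6 = -v5 + np.asarray(o_l_to_o_r, dtype=float) + v7    # formula (7)
    return v6, float(np.linalg.norm(v6))
```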
This embodiment of the invention specifies how the geometric relationships between the vectors formed by the feature points in the images are determined and how the physical distance between the cameras to be calibrated is determined from them; the offset between the origins of the two coordinate systems is solved through closed quadrilaterals and vector operations, which improves the calibration precision of the binocular camera.
Example III
Fig. 3a is a flowchart of a binocular camera calibration method according to a third embodiment of the present invention. This embodiment provides a preferred embodiment on the basis of the above-described embodiments. As shown in fig. 3a, the method comprises:
s310, setting a binocular camera and adjusting the position of the mechanical arm.
The two cameras are arranged on a support driven by the mechanical arm and are rigidly connected, so their relative positions do not change. When the cameras are installed, their directions should be consistent, but complete parallelism is not required. The support can translate in the XY directions under the drive of the mechanical arm, and the cameras do not rotate during the translation. The calibration object must be suitably positioned: it should appear completely in the field of view of the left camera when the left camera is moved over it, remain stationary, and appear completely in the field of view of the right camera when the right camera is moved over it. The calibration object is therefore typically placed midway between the two cameras.
S320, establishing an image pixel coordinate, a mechanical arm coordinate system and a mechanical arm offset coordinate system.
FIG. 3b is a schematic diagram of a coordinate system according to a third embodiment of the present invention. As shown in fig. 3b, x_l-o_l-y_l and x_r-o_r-y_r are the image pixel coordinate systems and X-O-Y is the mechanical arm coordinate system; for the relationship between the two cameras' mechanical arm offset coordinate systems, refer to fig. 1c. The image pixel coordinate system takes the upper left corner of the image as its origin, with downward the positive x direction and rightward the positive y direction, in units of pixels. The mechanical arm coordinate system takes the center of the mechanical arm base as its origin, with downward the positive X direction and rightward the positive Y direction, in units of millimeters.
S330, calibrating the eyes of the two cameras respectively to obtain respective transformation matrixes.
And (3) performing hand-eye calibration on each camera by using a nine-point calibration method to obtain a transformation matrix corresponding to each camera.
The process of calibrating the eyes of the two cameras is as follows: and fixing the calibration object, moving the camera to be calibrated, and photographing when the calibration object is completely in the field of view of the camera. It should be noted that the height of each camera to be calibrated is unchanged when being calibrated, only horizontal translation is performed, and no rotation is carried out.
S310-S340 are calibration processes of each camera, translation without rotation is required to be ensured when all cameras are calibrated, and a transformation matrix corresponding to each camera is output after nine-point calibration.
S340, moving the mechanical arm to enable the calibration plates to be respectively displayed in the visual fields of the two cameras, photographing and recording.
The specific photographing process is as follows: fix the calibration object, translate the camera so that the calibration object is completely within the camera's field of view, photograph and record the offset of the mechanical arm, and output the images taken by the cameras together with the mechanical arm offsets.
S350, transforming coordinates to obtain vectors between the origins of the offset coordinate systems of the two camera mechanical arms.
Calculating the vector between the origins of the two camera robotic arm offset coordinate systems can be divided into two steps: and solving the coordinates of the same point on the calibration object under the mechanical arm coordinate systems corresponding to different cameras, then constructing a closed quadrangle and solving the relation between the origins of the mechanical arm coordinate systems.
In the process of solving the coordinates of the same point on the calibration object under the mechanical arm coordinate systems corresponding to different cameras, input data are images shot by the cameras and a transformation matrix, pixel coordinates of the characteristic points are extracted through image processing, then the coordinates of the mechanical arm corresponding to the point are obtained through operation of the transformation matrix, and output data are the coordinates of the characteristic points under the mechanical arm coordinate systems of the cameras. In the process of constructing a closed quadrangle and solving the relation between the origins of the coordinate systems of the mechanical arms, input data are the coordinates of each characteristic point under the coordinate system of each camera mechanical arm and the offset of the mechanical arm, and output data are the relation between the origins of the coordinate systems of each mechanical arm. For more details, reference may be made to the above embodiments, and details are not repeated here.
S360, obtaining the physical distance between the two camera center points under the mechanical arm offset coordinate system.
The physical distance between the two camera center points is solved through closed quadrilaterals and vector operations. In this process, the input data are each camera's transformation matrix and the relationship between the origins of the cameras' mechanical arm coordinate systems. First, the pixel coordinates of the image center position are taken and the corresponding mechanical arm coordinates are obtained from the transformation matrix; then, combining the closed quadrilateral with the relationship between the origins of the mechanical arm coordinate systems, the physical distance between the camera center points is output.
Specifically, first, the coordinates of the same point under each camera's mechanical arm coordinate system are calculated from each camera's transformation matrix using formula (1) or formula (5) of the above embodiments; then the relationship between the coordinate origins of the mechanical arm coordinate systems corresponding to the cameras is obtained from the coordinates of that point of the calibration object under each camera's mechanical arm coordinate system, the corresponding mechanical arm offsets, and the closed quadrilateral vector relation (formula (6)); finally, from the relationship between the coordinate origins, the mechanical arm coordinates corresponding to each camera's pixel center position (obtained with formula (1)) and the closed quadrilateral principle (formula (7)), the physical distance between the camera centers is obtained. For more details, reference may be made to the above embodiments, which are not repeated here.
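Putting S330 to S360 together, the sketch below assumes the hand-eye matrices A_left and A_right, the pixel coordinates extracted from the two photographs, and the recorded mechanical arm offset between the shots; all names are illustrative, not taken from the patent.

```python
import numpy as np

def to_arm(A, px):                              # formula (1)/(5): pixel -> arm offset (mm)
    return A @ np.array([px[0], px[1], 1.0])

def binocular_centre_distance(A_left, A_right, p1l_px, p1r_px, c_l_px, c_r_px, arm_shift):
    p1l, p1r = to_arm(A_left, p1l_px), to_arm(A_right, p1r_px)  # same calibration point, two frames
    v4 = p1l + np.asarray(arm_shift, dtype=float) - p1r         # formula (6): O_L -> O_R
    c_l, c_r = to_arm(A_left, c_l_px), to_arm(A_right, c_r_px)  # image centre points
    v6 = -c_l + v4 + c_r                                        # formula (7): P_2L -> P_2R
    return float(np.linalg.norm(v6))                            # physical distance between centres
```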
In the above process, the two mechanical arm offset coordinate systems are constructed on the precondition that the camera motion during calibration is pure translation without rotation and that the cameras are rigidly connected, so every translation moves all cameras together. In other words, all cameras move within one coordinate system; although several coordinate systems appear, they differ only in the reference origin and lie in the same coordinate plane. The constructed quadrilateral therefore also lies in that coordinate plane, and vector operations can be performed. In addition, neither in the camera calibration process nor in the binocular distance calibration process does this precondition introduce error. A height difference may exist between the cameras, but as long as it is within a reasonable range it does not affect the distance measurement: the height difference is part of the cameras' physical placement, the translation is kept on the same plane, the coordinate transformation is ultimately carried out on that plane, and the height difference is reflected in the transformation matrix.
The method provided by the embodiment of the invention can also be applied to multi-camera ranging. It can be understood that the key to multi-camera ranging is likewise to transform the mechanical arm (offset) coordinates corresponding to the multiple cameras into the same coordinate system. With multiple cameras, there are two main cases: 1. the cameras cannot all see the calibration object at the same time; 2. two or a few cameras can see the calibration object at the same time. In the first case, every camera needs to be calibrated pairwise relative to the first camera, and the cameras do not need to be calibrated pairwise with one another. In the second case, the cameras that can see the calibration object simultaneously are calibrated together, and the remaining cameras that cannot see the calibration object simultaneously are calibrated pairwise relative to one camera that can.
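For illustration only, chaining the pairwise results against the first camera can be sketched as follows; the pairwise vectors are assumed to come from repeated runs of the two-camera procedure above, and the function and its inputs are hypothetical.

import numpy as np

def place_cameras(pairwise_to_cam1):
    # pairwise_to_cam1[i] is the vector from camera 1's center to camera i's center,
    # obtained by calibrating camera i against camera 1 with the two-camera procedure.
    positions = {1: np.zeros(2)}
    for cam_id, vec in pairwise_to_cam1.items():
        positions[cam_id] = np.asarray(vec, dtype=float)
    return positions   # all camera centers expressed in camera 1's frame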
According to the technical scheme, the mapping relation between a camera's pixel coordinates and the mechanical arm coordinate system is established through hand-eye calibration of the camera; it is only necessary to command the mechanical arm to move by a certain offset during calibration, and the mechanical arm offset coordinate system can then be established from the offset of the mechanical arm motion. Since the mechanical arm moves only in the X and Y directions, the mechanical arm offset coordinate systems of the two cameras can be transformed into each other by translation, so the two mechanical arm offset coordinate systems allow vector operations under a unified coordinate system. The relative position of the cameras is also related to the offset of the mechanical arm: the mechanical arm is moved so that the calibration object appears in the left and right camera views in succession, and the problem that the calibration object is not located at the same position for the two cameras is resolved through camera calibration and coordinate transformation, making the calibrated distance between the two cameras more accurate.
Example IV
Fig. 4 is a schematic structural diagram of a binocular camera calibration apparatus according to a fourth embodiment of the present invention. The binocular camera calibration apparatus may be implemented in software and/or hardware, for example, the binocular camera calibration apparatus may be configured in a computer device. As shown in fig. 4, the apparatus includes a transformation matrix acquisition module 410, a geometric relationship determination module 420, and a physical distance calculation module 430, wherein:
The transformation matrix acquisition module 410 is configured to perform hand-eye calibration on each camera to be calibrated, and obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relationship determining module 420 is configured to obtain images of the target object in the fields of view of the cameras to be calibrated, and determine geometric relationships between vectors formed by the feature points in the images;
the physical distance calculating module 430 is configured to obtain coordinates of the feature points in the target coordinate system according to the coordinate transformation matrix, and determine the physical distance between the cameras to be calibrated according to the coordinates and the geometric relationship of the feature points in the target coordinate system.
In the embodiment of the invention, the transformation matrix acquisition module performs hand-eye calibration on each camera to be calibrated and obtains the coordinate transformation matrix corresponding to each camera to be calibrated; the geometric relation determining module acquires images of the target object in the field of view of each camera to be calibrated and determines the geometric relations among the vectors formed by the feature points in the images; and the physical distance calculation module obtains the coordinates of the feature points under the target coordinate system according to the coordinate transformation matrices and determines the physical distance between the cameras to be calibrated according to those coordinates and the geometric relations, so that the deviation and error caused by manual installation and measurement are overcome and the accuracy of binocular camera calibration is improved.
Optionally, on the basis of the above solution, the camera to be calibrated includes a first camera and a second camera, and the geometric relationship determining module 420 includes:
an image acquisition unit configured to acquire a first image of a target object in a first camera view and a second image of the target object in a second camera view;
and the relation determining unit is used for determining the geometric relation between vectors formed by the feature points in the first image and the second image.
Optionally, based on the above scheme, the physical distance calculating module 430 includes:
the mechanical arm coordinate determining unit is used for obtaining the pixel coordinates of the characteristic points in the image pixel coordinate system aiming at each characteristic point, and obtaining the mechanical arm coordinates of the characteristic points in the mechanical arm offset coordinate system according to the pixel coordinates and the transformation matrix corresponding to the characteristic points;
and the camera distance calculation unit is used for obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinates of the feature points, the mechanical arm offset between the first image and the second image and the geometric relationship.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: take the first coordinate system origin of the mechanical arm offset corresponding to the first image, the first mark point corresponding to the target object in the first image, the second coordinate system origin of the mechanical arm offset corresponding to the second image, and the second mark point corresponding to the target object in the second image as the first feature points, and construct a first geometric relationship among the vectors formed by the first feature points; and take the first coordinate system origin of the mechanical arm offset, the first center point of the first image, the second coordinate system origin of the mechanical arm offset, and the second center point of the second image as the second feature points, and construct a second geometric relationship among the vectors formed by the second feature points.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: construct a first vector based on the first coordinate system origin of the mechanical arm offset and the first mark point, construct a second vector based on the first mark point and the second mark point, construct a third vector based on the second coordinate system origin of the mechanical arm offset and the second mark point, and construct a fourth vector based on the first coordinate system origin of the mechanical arm offset and the second coordinate system origin of the mechanical arm offset; and obtain a first solving equation of the fourth vector according to a closed quadrilateral vector algorithm.
Optionally, on the basis of the above scheme, the relationship determining unit is specifically configured to: construct a fifth vector based on the first coordinate system origin of the mechanical arm offset and the first center point, construct a sixth vector based on the first center point and the second center point, and construct a seventh vector based on the second coordinate system origin of the mechanical arm offset and the second center point; and obtain a second solving equation of the sixth vector according to the closed quadrilateral vector algorithm.
Optionally, based on the above scheme, the camera distance calculating unit is specifically configured to: obtain a fourth vector according to the mechanical arm coordinates of each first feature point, the mechanical arm offset between the first image and the second image, and the first solving equation; obtain a sixth vector according to the mechanical arm coordinates of each second feature point, the fourth vector, and the second solving equation; and obtain the physical distance between the first camera and the second camera according to the sixth vector.
The binocular camera calibration device provided by the embodiment of the invention can execute the binocular camera calibration method provided by any embodiment, and has the corresponding functional modules and beneficial effects of the execution method.
Example five
Fig. 5 is a schematic structural diagram of a computer device according to a fifth embodiment of the present invention. Fig. 5 illustrates a block diagram of an exemplary computer device 512 suitable for use in implementing embodiments of the present invention. The computer device 512 shown in fig. 5 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in FIG. 5, computer device 512 is in the form of a general purpose computing device. Components of computer device 512 may include, but are not limited to: one or more processors 516, a system memory 528, and a bus 518 that connects the various system components (including the system memory 528 and the processors 516).
Bus 518 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, a processor 516, or a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
Computer device 512 typically includes a variety of computer system readable media. Such media can be any available media that is accessible by computer device 512 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 528 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 530 and/or cache memory 532. The computer device 512 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage 534 may be used to read from or write to a non-removable, non-volatile magnetic media (not shown in FIG. 5, commonly referred to as a "hard disk drive"). Although not shown in fig. 5, a magnetic disk drive for reading from and writing to a removable non-volatile magnetic disk (e.g., a "floppy disk"), and an optical disk drive for reading from or writing to a removable non-volatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In such cases, each drive may be coupled to bus 518 through one or more data media interfaces. Memory 528 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of embodiments of the invention.
A program/utility 540 having a set (at least one) of program modules 542 may be stored in, for example, memory 528, such program modules 542 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment. Program modules 542 generally perform the functions and/or methods in the described embodiments of the invention.
The computer device 512 may also communicate with one or more external devices 514 (e.g., keyboard, pointing device, display 524, etc.), one or more devices that enable a user to interact with the computer device 512, and/or any devices (e.g., network card, modem, etc.) that enable the computer device 512 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 522. Also, the computer device 512 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN) and/or a public network, such as the Internet, through a network adapter 520. As shown, network adapter 520 communicates with other modules of computer device 512 via bus 518. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with computer device 512, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
The processor 516 executes various functional applications and data processing by running programs stored in the system memory 528, for example, implementing the binocular camera calibration method provided by the embodiments of the present invention, the method comprising:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images;
and obtaining the coordinates of the characteristic points under a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points under the target coordinate system and the geometric relationship.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the binocular camera calibration method provided by any embodiment of the present invention.
Example six
The sixth embodiment of the present invention also provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the binocular camera calibration method as provided by the embodiments of the present invention, the method comprising:
Performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images;
and obtaining the coordinates of the characteristic points under a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points under the target coordinate system and the geometric relationship.
Of course, the computer program stored on the computer readable storage medium provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the binocular camera calibration method provided by any embodiment of the present invention.
The computer storage media of embodiments of the invention may take the form of any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Note that the above is only a preferred embodiment of the present invention and the technical principle applied. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, while the invention has been described in connection with the above embodiments, the invention is not limited to the embodiments, but may be embodied in many other equivalent forms without departing from the spirit or scope of the invention, which is set forth in the following claims.

Claims (5)

1. A binocular camera calibration method, comprising:
performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
acquiring images of a target object in the field of view of each camera to be calibrated, and determining geometric relations among vectors formed by characteristic points in the images;
obtaining coordinates of the feature points under a target coordinate system according to the coordinate transformation matrix, and determining physical distances between cameras to be calibrated according to the coordinates of the feature points under the target coordinate system and the geometric relationship;
Acquiring a first image of the target object in a first camera view and a second image of the target object in a second camera view;
taking a first coordinate system origin of the mechanical arm offset corresponding to the first image, a first mark point corresponding to the target object in the first image, a second coordinate system origin of the mechanical arm offset corresponding to the second image, and a second mark point corresponding to the target object in the second image as first feature points, and constructing a first geometric relationship among vectors formed by the first feature points;
taking the first coordinate system origin of the mechanical arm offset, the first center point of the first image, the second coordinate system origin of the mechanical arm offset, and the second center point of the second image as second feature points, and constructing a second geometric relationship among vectors formed by the second feature points;
constructing a first vector based on the first coordinate system origin of the mechanical arm offset and the first mark point, constructing a second vector based on the first mark point and the second mark point, constructing a third vector based on the second coordinate system origin of the mechanical arm offset and the second mark point, and constructing a fourth vector based on the first coordinate system origin of the mechanical arm offset and the second coordinate system origin of the mechanical arm offset;
Obtaining a first solving equation of the fourth vector according to a closed quadrilateral vector algorithm;
constructing a fifth vector based on the first coordinate system origin of the mechanical arm offset and the first center point, constructing a sixth vector based on the first center point and the second center point, and constructing a seventh vector based on the second coordinate system origin of the mechanical arm offset and the second center point;
obtaining a second solving equation of the sixth vector according to a closed quadrilateral vector algorithm;
obtaining a fourth vector according to the mechanical arm coordinates of each first feature point, the mechanical arm offset between the first image and the second image and the first solving equation;
obtaining a sixth vector according to the mechanical arm coordinates of each second characteristic point, the fourth vector and the second solving equation;
and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
2. The method according to claim 1, wherein the obtaining the coordinates of the feature points in the target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the feature points in the target coordinate system and the geometric relationship, includes:
For each characteristic point, acquiring a pixel coordinate of the characteristic point under an image pixel coordinate system, and acquiring a mechanical arm coordinate of the characteristic point under a mechanical arm offset coordinate system according to the pixel coordinate and a transformation matrix corresponding to the characteristic point;
and obtaining the physical distance between the first camera and the second camera according to the mechanical arm coordinates of each characteristic point, the mechanical arm offset between the first image and the second image and the geometric relationship.
3. A binocular camera calibration apparatus, comprising:
the transformation matrix acquisition module is used for performing hand-eye calibration on each camera to be calibrated to obtain a coordinate transformation matrix corresponding to each camera to be calibrated;
the geometric relation determining module is used for acquiring images of the target object in the field of view of each camera to be calibrated and determining geometric relations among vectors formed by characteristic points in the images;
the physical distance calculation module is used for obtaining the coordinates of the characteristic points under a target coordinate system according to the coordinate transformation matrix, and determining the physical distance between the cameras to be calibrated according to the coordinates of the characteristic points under the target coordinate system and the geometric relationship;
A geometric relationship determination module comprising:
an image acquisition unit configured to acquire a first image of a target object in a first camera view and a second image of the target object in a second camera view;
a relationship determining unit for determining a geometric relationship between vectors composed of feature points in the first image and the second image;
the relation determining unit is specifically configured to:
taking the first coordinate system origin of the mechanical arm offset corresponding to the first image, the first mark point corresponding to the target object in the first image, the second coordinate system origin of the mechanical arm offset corresponding to the second image, and the second mark point corresponding to the target object in the second image as the first feature points, and constructing a first geometric relationship among vectors formed by the first feature points;
taking the first coordinate system origin of the mechanical arm offset, the first center point of the first image, the second coordinate system origin of the mechanical arm offset, and the second center point of the second image as the second feature points, and constructing a second geometric relationship among vectors formed by the second feature points;
constructing a first vector based on the first coordinate system origin of the mechanical arm offset and the first mark point, constructing a second vector based on the first mark point and the second mark point, constructing a third vector based on the second coordinate system origin of the mechanical arm offset and the second mark point, and constructing a fourth vector based on the first coordinate system origin of the mechanical arm offset and the second coordinate system origin of the mechanical arm offset;
Obtaining a first solving equation of a fourth vector according to a closed quadrilateral vector algorithm;
constructing a fifth vector based on the first coordinate system origin of the mechanical arm offset and the first center point, constructing a sixth vector based on the first center point and the second center point, and constructing a seventh vector based on the second coordinate system origin of the mechanical arm offset and the second center point;
obtaining a second solving equation of a sixth vector according to the closed quadrilateral vector algorithm;
the camera distance calculating unit is specifically used for:
obtaining a fourth vector according to the mechanical arm coordinates of each first characteristic point, the mechanical arm offset between the first image and the second image and the first solving equation;
obtaining a sixth vector according to the mechanical arm coordinates of each second characteristic point, the fourth vector and the second solving equation;
and obtaining the physical distance between the first camera and the second camera according to the sixth vector.
4. A computer device, the device comprising:
one or more processors;
a storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the binocular camera calibration method of any of claims 1-2.
5. A computer readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, implements the binocular camera calibration method according to any one of claims 1-2.
CN202010235123.9A 2020-03-27 2020-03-27 Binocular camera calibration method, device, equipment and medium Active CN111445533B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010235123.9A CN111445533B (en) 2020-03-27 2020-03-27 Binocular camera calibration method, device, equipment and medium

Publications (2)

Publication Number Publication Date
CN111445533A CN111445533A (en) 2020-07-24
CN111445533B true CN111445533B (en) 2023-08-01




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant