CN115810052A - Camera calibration method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115810052A
Authority
CN
China
Prior art keywords
camera; coordinate system; position coordinates; position points; determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111083399.0A
Other languages
Chinese (zh)
Inventor
汪力骁
李鹏飞
丁有爽
邵天兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mech Mind Robotics Technologies Co Ltd
Original Assignee
Mech Mind Robotics Technologies Co Ltd
Application filed by Mech Mind Robotics Technologies Co Ltd filed Critical Mech Mind Robotics Technologies Co Ltd
Priority to CN202111083399.0A
Priority to PCT/CN2021/138574 (WO2023040095A1)
Publication of CN115810052A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a camera calibration method and apparatus, an electronic device, and a readable storage medium. The camera calibration method comprises the following steps: acquiring measured position coordinates of a plurality of position points in a camera coordinate system; determining an initial pose of the camera in a robot coordinate system; and determining a compensation matrix for the camera according to the measured position coordinates of the position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system. The calibration method can determine the extrinsic parameters of the camera and a compensation matrix for its intrinsic parameters at the same time, so that the position coordinates of position points in the camera coordinate system can be compensated according to the compensation matrix to obtain more accurate position coordinates, effectively improving the precision of the 3D point cloud images captured by a 3D camera.

Description

Camera calibration method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of industrial camera technologies, and in particular, to a camera calibration method and apparatus, an electronic device, and a storage medium.
Background
With the development of robotics, machine vision technology is needed in more and more scenarios. For example, machine vision may be used to identify an object to be detected: before identification, the position coordinates of a plurality of position points on the object surface in the camera coordinate system may be collected by a 3D camera, and a 3D point cloud image of the object may then be generated from those coordinates. The accuracy of the 3D point cloud image directly affects the accuracy with which the robot subsequently identifies the object. For example, a grasping robot needs to determine the position at which to grasp an object from the 3D point cloud image of that object. In a specific application scenario, during the acquisition of the position coordinates of surface points in the camera coordinate system, the intrinsic and extrinsic parameters of the camera directly influence the accuracy of those coordinates. (In image measurement and machine vision applications, a geometric model of camera imaging must be established in order to relate the three-dimensional position of a point on an object in space to the corresponding point in the image; the parameters of this geometric model are the camera parameters.)
In the prior art, all position points of a 3D point cloud image are obtained with the same camera intrinsic parameters determined by calibration (calibration is the process of determining the parameters that map between 2D points in the camera image and 3D points in the real scene captured by the camera). However, a single set of intrinsic parameters does not generalize equally well to position points at different depths from the camera; that is, the intrinsics are not uniformly applicable to points at different depths across the whole spatial region. If the 3D point cloud image is obtained with the same intrinsics everywhere, its accuracy inside a preset region will differ from its accuracy outside that region. Therefore, how to determine accurate position coordinates for the point cloud points acquired by the camera is the technical problem to be solved by this application.
Disclosure of Invention
The invention provides a camera calibration method for accurately determining the position coordinates corresponding to the point cloud points acquired by a camera, comprising the following steps:
acquiring measurement position coordinates of a plurality of position points in a camera coordinate system;
determining an initial pose of a camera under a robot coordinate system;
and determining a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, the acquiring of the measured position coordinates of the plurality of position points in the camera coordinate system further includes:
acquiring a plurality of calibration plate images collected by a camera when the calibration plate moves to a plurality of spatial positions;
and acquiring the measurement position coordinates of the plurality of position points in a camera coordinate system according to the plurality of calibration plate images acquired by the camera.
In a specific implementation, the determining an initial pose of the camera in the robot coordinate system further includes:
in the process of the robot driving the calibration plate to move through the flange, acquiring the poses of the flange in the robot coordinate system and the poses of the calibration plate in the camera coordinate system at a plurality of positions;
and determining the initial pose of the camera in the robot coordinate system according to the poses of the flange in the robot coordinate system and the poses of the calibration plate in the camera coordinate system.
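The relationship between these poses can be illustrated with homogeneous transforms. The numpy sketch below is not the patent's actual solver: for simplicity it assumes the pose of the calibration plate relative to the flange is known, whereas in practice that pose is an additional unknown solved jointly from multiple robot poses (the classic AX = XB hand-eye formulation). All function names are hypothetical.

```python
import numpy as np

def make_pose(rot, trans):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rot
    T[:3, 3] = trans
    return T

def camera_pose_in_robot(T_flange_in_robot, T_board_in_flange, T_board_in_cam):
    """Chain board -> flange -> robot, then divide out the camera's view of the board:
    T_cam_in_robot = T_flange_in_robot @ T_board_in_flange @ inv(T_board_in_cam)."""
    return T_flange_in_robot @ T_board_in_flange @ np.linalg.inv(T_board_in_cam)
```

With real data, several (flange pose, board pose) pairs would be collected and the result averaged or solved in a least-squares sense.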
In a specific implementation, the determining of a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system further includes:
dividing the plurality of position points into a plurality of groups according to the spatial region they occupy in the camera coordinate system;
and determining a compensation matrix for the camera according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, the dividing of the plurality of position points into a plurality of groups according to spatial region in the camera coordinate system further includes:
dividing the space into a plurality of layers by height in the camera coordinate system, each layer being further divided into a plurality of partitions;
and grouping the position points that fall in the same partition of the same layer into one group, according to the position coordinates of the position points in the camera coordinate system.
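The layer-and-partition grouping described above can be sketched as follows. This is an illustrative numpy fragment with hypothetical bin boundaries; the patent does not fix the number of layers or partitions.

```python
import numpy as np

def group_points(points, z_edges, n_rows, n_cols, xy_bounds):
    """Group camera-frame points by (layer, row, col) cell.

    points: (N, 3) array of x, y, z coordinates in the camera frame.
    z_edges: ascending layer boundaries along the optical axis (z).
    xy_bounds: (xmin, xmax, ymin, ymax) extent covered by the partitions."""
    xmin, xmax, ymin, ymax = xy_bounds
    # Layer index from the z coordinate; clip keeps out-of-range points in edge layers.
    layer = np.clip(np.searchsorted(z_edges, points[:, 2]) - 1, 0, len(z_edges) - 2)
    # Partition indices from a uniform grid over the x/y extent.
    row = np.clip(((points[:, 1] - ymin) / (ymax - ymin) * n_rows).astype(int), 0, n_rows - 1)
    col = np.clip(((points[:, 0] - xmin) / (xmax - xmin) * n_cols).astype(int), 0, n_cols - 1)
    groups = {}
    for i, key in enumerate(zip(layer.tolist(), row.tolist(), col.tolist())):
        groups.setdefault(key, []).append(i)
    return groups
```

Each resulting group would then be fitted with its own compensation matrix, one per spatial cell.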
In a specific implementation, after determining the compensation matrix for the camera, the method further includes:
acquiring position coordinates of object surface position points acquired by a camera in a camera coordinate system;
and calibrating the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates.
In a specific implementation, the calibrating of the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix to obtain compensated accurate position coordinates further includes:
determining the layer and partition in which an object surface position point lies according to its position coordinates in the camera coordinate system and the intrinsic parameters of the camera;
and compensating the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix corresponding to that layer and partition, to obtain the compensated accurate position coordinates.
In a specific implementation, the determining of the layer and partition in which an object surface position point lies, according to its position coordinates in the camera coordinate system and the intrinsic parameters of the camera, further includes:
determining the layer of the object surface position point according to its z-axis coordinate in the camera coordinate system;
determining the pixel coordinates corresponding to the object surface position point according to its x-axis and y-axis coordinates in the camera coordinate system and the intrinsic parameters of the camera;
and determining the partition in which the object surface position point lies according to those pixel coordinates.
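The lookup above can be sketched with the standard pinhole model, where fx, fy, cx, cy are the camera intrinsics (the values in the test below are purely illustrative):

```python
import numpy as np

def locate_point(p_cam, z_edges, fx, fy, cx, cy):
    """Return (layer, u, v): the z-layer index of a camera-frame point and its
    pixel coordinates under the pinhole model u = fx*x/z + cx, v = fy*y/z + cy."""
    x, y, z = p_cam
    layer = int(np.clip(np.searchsorted(z_edges, z) - 1, 0, len(z_edges) - 2))
    return layer, fx * x / z + cx, fy * y / z + cy
```

The pixel coordinates (u, v) would then be tested against the partition grid of that layer to pick the matching compensation matrix.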
In a specific implementation, the compensating of the position coordinates of an object surface position point in the camera coordinate system according to the compensation matrices of the layers and partitions it spans further includes:
when the object surface position point lies in a region spanning layers and/or partitions, determining a weight for each such layer and/or partition according to the distance between the position point and that layer and/or partition;
compensating the position coordinates of the position point in the camera coordinate system separately according to the compensation matrix of each spanned layer and/or partition, to obtain a plurality of compensated position coordinates;
and carrying out a weighted summation of the plurality of compensated position coordinates according to the weights of the spanned layers and/or partitions, to obtain the weighted-sum compensated position coordinates.
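The weighted blending can be sketched as below. The inverse-distance weighting is an assumed scheme (the claim only requires weights derived from distances), and all names are illustrative.

```python
import numpy as np

def blend_compensation(p_cam, region_matrices, region_distances, eps=1e-9):
    """Compensate a camera-frame point with several regions' 4x4 matrices and blend
    the results, weighting each region inversely to the point's distance from it."""
    w = 1.0 / (np.asarray(region_distances, float) + eps)
    w /= w.sum()  # normalize weights to sum to 1
    p_h = np.append(p_cam, 1.0)  # homogeneous coordinates
    compensated = [(T @ p_h)[:3] for T in region_matrices]
    return sum(wi * c for wi, c in zip(w, compensated))
```

A point exactly on a boundary, equidistant from two regions, is thus compensated by the average of the two regions' results, which avoids discontinuities at cell borders.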
In a specific implementation, the determining of a compensation matrix for the camera according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system further includes:
determining initial theoretical position coordinates of each group of position points in the camera coordinate system according to the initial pose of the camera in the robot coordinate system;
fitting the measured position coordinates and the initial theoretical position coordinates of each group of position points in the camera coordinate system, and determining an initial compensation matrix for the camera;
acquiring a plurality of adjusted poses of the camera in the robot coordinate system and determining the corresponding adjusted theoretical position coordinates of the groups of position points;
and adjusting the initial compensation matrix according to the adjusted poses of the camera in the robot coordinate system, the measured position coordinates of the groups of position points in the camera coordinate system and the adjusted theoretical position coordinates, until the Euclidean distance between the measured position coordinates and the current theoretical position coordinates of the groups of position points is smaller than a preset threshold and/or a preset number of adjustments is reached, and taking the current compensation matrix as the compensation matrix for the camera.
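The fitting step can be sketched as a least-squares affine fit mapping measured to theoretical coordinates. This is one illustrative choice; the patent does not prescribe the fitting method, and a rigid-only fit (e.g. the Kabsch algorithm) would be another option.

```python
import numpy as np

def fit_compensation_matrix(measured, theoretical):
    """Least-squares 4x4 affine T such that T @ [p; 1] is close to [q; 1] for each
    measured point p and theoretical point q ((N, 3) arrays, N >= 4, non-coplanar)."""
    n = measured.shape[0]
    A = np.hstack([measured, np.ones((n, 1))])           # (N, 4) homogeneous inputs
    M, *_ = np.linalg.lstsq(A, theoretical, rcond=None)  # (4, 3) solution
    T = np.eye(4)
    T[:3, :] = M.T
    return T

def compensate(T, p):
    """Apply the compensation matrix to a single 3D point."""
    return (T @ np.append(p, 1.0))[:3]
```

Iterating this fit against re-derived theoretical coordinates, as the claim describes, refines T until the residual Euclidean error drops below the preset threshold.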
In a specific implementation, the determining of the theoretical position coordinates of each group of position points in the camera coordinate system, according to the pose of the camera in the robot coordinate system, further includes:
acquiring the pose of the flange corresponding to each group of position points in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
determining the pose of the calibration plate relative to the flange according to the initial pose of the camera in the robot coordinate system, the pose of the flange in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
determining the position coordinates of each group of position points in the robot coordinate system according to their position coordinates in the calibration plate coordinate system, the pose of the calibration plate relative to the flange and the pose of the flange in the robot coordinate system;
and determining the initial theoretical position coordinates of the position points in the camera coordinate system according to their position coordinates in the robot coordinate system and the initial pose of the camera in the robot coordinate system.
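The chain of transforms described above can be sketched as follows, with all poses expressed as 4x4 homogeneous matrices (an illustrative fragment, not the patent's implementation):

```python
import numpy as np

def theoretical_in_camera(p_board, T_board_in_flange, T_flange_in_robot, T_cam_in_robot):
    """Map a calibration-plate point through board -> flange -> robot, then into the
    camera frame using the (initial) pose of the camera in the robot coordinate system."""
    p_h = np.append(p_board, 1.0)
    p_robot = T_flange_in_robot @ (T_board_in_flange @ p_h)   # point in robot frame
    return (np.linalg.inv(T_cam_in_robot) @ p_robot)[:3]      # point in camera frame
```

The difference between this theoretical coordinate and the camera's measured coordinate for the same point is the error the compensation matrix is fitted to remove.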
The invention also provides a camera calibration apparatus, which comprises:
the coordinate acquisition module is used for acquiring the measurement position coordinates of the plurality of position points in a camera coordinate system;
the external parameter determining module is used for determining the initial pose of the camera under the robot coordinate system;
and the compensation matrix determining module is used for determining a compensation matrix for the camera according to the measurement position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, the coordinate obtaining module further includes:
the image acquisition sub-module is used for acquiring a plurality of calibration plate images acquired by the camera when the calibration plate moves to a plurality of spatial positions;
and the coordinate calculation sub-module is used for acquiring the measurement position coordinates of the plurality of position points in the camera coordinate system according to the plurality of calibration plate images acquired by the camera.
In a specific implementation, the external reference determining module further includes:
the flange coordinate acquisition sub-module is used for acquiring the poses of the flange in the robot coordinate system and the poses of the calibration plate in the camera coordinate system at a plurality of positions, in the process of the robot driving the calibration plate to move through the flange;
and the external parameter calculation sub-module is used for determining the initial pose of the camera in the robot coordinate system according to the poses of the flange in the robot coordinate system and the poses of the calibration plate in the camera coordinate system.
In a specific implementation, the compensation matrix determining module further includes:
the coordinate grouping sub-module is used for dividing the plurality of position points into a plurality of groups according to the spatial region they occupy in the camera coordinate system;
and the compensation matrix calculation submodule is used for determining a compensation matrix for the camera according to the measurement position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, the coordinate grouping sub-module further includes:
the space division submodule is used for dividing the space into a plurality of layers according to different heights under a camera coordinate system, and a plurality of partitions are divided in each layer;
and the grouping submodule is used for dividing the position points in the same partition in the same layer into a group of position points according to the position coordinates of the position points in the camera coordinate system.
In a specific implementation, the calibration apparatus of the camera further includes:
the position point coordinate acquisition module is used for acquiring the position coordinates of the surface position points of the object acquired by the camera in a camera coordinate system after determining a compensation matrix for the camera;
and the accurate position coordinate calculation module is used for calibrating the position coordinates of the object surface position points under the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates.
In a specific implementation, the accurate position coordinate calculation module further includes:
the space region determining submodule is used for determining the layering and the partition where the object surface position point is located according to the position coordinate of the object surface position point in the camera coordinate system and the internal parameters of the camera;
and the accurate position coordinate acquisition sub-module is used for compensating the position coordinates of the object surface position points under the camera coordinate system according to the compensation matrixes corresponding to the layering and the division of the object surface position points to obtain the compensated accurate position coordinates.
In a specific implementation, the spatial region determining sub-module further includes:
the layering determination submodule is used for determining the layering of the object surface position points according to the z-axis coordinates of the object surface position points in a camera coordinate system;
the pixel coordinate determination submodule is used for determining the pixel coordinate corresponding to the object surface position point according to the x-axis coordinate and the y-axis coordinate of the object surface position point in the camera coordinate system and the internal reference of the camera;
and the partition determining submodule is used for determining the partition where the object surface position point is located according to the pixel coordinate corresponding to the object surface position point.
In a specific implementation, the accurate position coordinate obtaining sub-module further includes:
the weight determination sub-module is used for determining, when an object surface position point lies in a region spanning layers and/or partitions, a weight for each such layer and/or partition according to the distance between the position point and that layer and/or partition;
the compensation sub-module is used for compensating the position coordinates of the position point in the camera coordinate system separately according to the compensation matrix of each spanned layer and/or partition, to obtain a plurality of compensated position coordinates;
and the weighted summation sub-module is used for carrying out a weighted summation of the plurality of compensated position coordinates according to the weights of the spanned layers and/or partitions, to obtain the weighted-sum compensated position coordinates.
In a specific implementation, the compensation matrix obtaining sub-module further includes:
the initial theoretical position coordinate acquisition sub-module is used for determining initial theoretical position coordinates of each group of position points in a camera coordinate system according to the initial pose of the camera in the robot coordinate system;
the initial compensation matrix acquisition submodule is used for fitting the measured position coordinates and the initial theoretical position coordinates of each group of position points in a camera coordinate system and determining an initial compensation matrix for the camera;
the adjustment data acquisition sub-module is used for acquiring a plurality of adjusted poses of the camera in the robot coordinate system and determining the corresponding adjusted theoretical position coordinates of the groups of position points;
and the matrix determination sub-module is used for adjusting the initial compensation matrix according to the adjusted poses of the camera in the robot coordinate system, the measured position coordinates of the groups of position points in the camera coordinate system and the adjusted theoretical position coordinates, until the Euclidean distance between the measured position coordinates and the current theoretical position coordinates of the groups of position points is smaller than a preset threshold and/or a preset number of adjustments is reached, and taking the current compensation matrix as the compensation matrix for the camera.
In a specific implementation, the initial theoretical position coordinate obtaining sub-module further includes:
the flange and calibration plate pose acquisition sub-module is used for acquiring the poses of the flange corresponding to each group of position points in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
the calibration plate relative flange pose acquisition sub-module is used for determining the pose of the calibration plate relative to the flange according to the initial pose of the camera in the robot coordinate system, the pose of the flange in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
the position point relative robot coordinate acquisition submodule is used for determining the position coordinates of each group of position points under the robot coordinate system according to the position coordinates of each group of position points under the calibration plate coordinate system, the pose of the calibration plate relative to the flange and the pose of the flange under the robot coordinate system;
and the initial theoretical position coordinate determination submodule is used for determining the initial theoretical position coordinates of each group of position points in the camera coordinate system according to the position coordinates of each group of position points in the robot coordinate system and the initial pose of the camera in the robot coordinate system.
The invention also provides an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the above camera calibration method when executing the computer program.
The present invention also provides a computer-readable storage medium storing a computer program for executing the calibration method of the camera.
The invention provides a camera calibration method and apparatus, an electronic device, and a readable storage medium, wherein the method comprises the following steps: acquiring measured position coordinates of a plurality of position points in a camera coordinate system; determining an initial pose of the camera in a robot coordinate system; and determining a compensation matrix for the camera according to the measured position coordinates of the position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system. The calibration method can determine the extrinsic parameters of the camera and a compensation matrix for its intrinsic parameters at the same time, so that the position coordinates of position points in the camera coordinate system can be compensated according to the compensation matrix to obtain more accurate position coordinates, effectively improving the precision of the 3D point cloud images captured by a 3D camera.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort. In the drawings:
FIG. 1 is a schematic flow chart of a calibration method for a camera according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a process for obtaining position coordinates of a plurality of position points in a camera coordinate system according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart illustrating a process for determining initial position coordinates of a camera in a robot coordinate system according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart for determining a compensation matrix for a camera according to an embodiment of the present invention;
FIG. 5 is a schematic flow chart of location point grouping in accordance with one embodiment of the present invention;
FIG. 6 is a schematic flow chart of compensated accurate position coordinates according to an embodiment of the present invention;
FIG. 7 is a schematic diagram illustrating a detailed process for determining the stratification and zoning where the position point on the surface of the object is located according to an embodiment of the present invention;
FIG. 8 is a schematic flow diagram for compensating cross-tier and/or cross-partition location points in accordance with one embodiment of the present invention;
FIG. 9 is a flowchart illustrating the determination of a compensation matrix for a camera according to an embodiment of the present invention;
FIG. 10 is a flowchart illustrating the process of determining initial theoretical position coordinates of each set of position points in a camera coordinate system according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of a calibration apparatus for a camera according to an embodiment of the present invention;
FIG. 12 is a block diagram of a coordinate acquisition module according to an embodiment of the present invention;
FIG. 13 is a block diagram illustrating a configuration of the external parameter determination module in accordance with an embodiment of the present invention;
FIG. 14 is a block diagram of a compensation matrix determination module according to an embodiment of the present invention;
FIG. 15 is a block diagram of a coordinate grouping submodule in accordance with an embodiment of the present invention;
FIG. 16 is a block diagram of an accurate position coordinate determination module according to an embodiment of the present invention;
FIG. 17 is a block diagram of a spatial region determination submodule in accordance with an embodiment of the present invention;
FIG. 18 is a block diagram of an accurate position coordinate acquisition sub-module in accordance with one embodiment of the present invention;
FIG. 19 is a block diagram of a compensation matrix calculation sub-module according to an embodiment of the present invention;
FIG. 20 is a block diagram of an initial theoretical position coordinate acquisition sub-module in accordance with an embodiment of the present invention;
FIG. 21 is a schematic diagram of a camera calibration system according to one embodiment of the present invention;
fig. 22 is a schematic diagram illustrating the principle of spatial segmentation in accordance with an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
Before calibration is carried out, a camera calibration system can be set up. The system can include the camera, a robot with a mechanical arm, and a calibration plate (a calibration plate is used for correcting lens distortion and the like in applications such as machine vision, image measurement, photogrammetry and three-dimensional reconstruction). The calibration plate is connected to the flange at the front end of the mechanical arm, so that it can be driven by the mechanical arm to move freely and change its position. As shown in fig. 21, the front end of the robot's mechanical arm fixes the calibration plate through the flange, and the camera can be fixed at a preset position; the initial pose of the camera in the robot coordinate system is unknown. Further, there are various schemes for placing the camera in implementation. For example, the camera may be fixed to a stand outside the robot, or it may be fixed on the robot itself, in which case its position changes following the mechanical arm.
As shown in fig. 1, the present invention provides a calibration method for a camera, which is used to accurately determine the position coordinates corresponding to the point cloud collected by the camera. The calibration method includes:
step 101: acquiring measurement position coordinates of a plurality of position points in a camera coordinate system;
step 102: determining an initial pose of a camera under a robot coordinate system;
step 103: and determining a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
The compensation matrix includes compensation for the internal parameters of the camera and compensation for the external parameters, both of which affect the accuracy of the position coordinates of the position points acquired by the camera; in other words, the compensation matrix compensates for the errors caused by deviations in the internal and external parameters.
In a specific implementation, the step 101: the acquisition of the measured position coordinates of the plurality of position points in the camera coordinate system may have various embodiments. For example, as shown in fig. 2, the step 101: acquiring the measured position coordinates of the plurality of position points in the camera coordinate system may further include:
step 201: acquiring a plurality of calibration plate images collected by a camera when the calibration plate moves to a plurality of spatial positions;
step 202: and acquiring the measurement position coordinates of the plurality of position points in a camera coordinate system according to the plurality of calibration plate images acquired by the camera.
In implementation, the position of the calibration plate can be changed by adjusting the mechanical arm of the robot. Each time the mechanical arm is adjusted, the pose of the flange at the front end of the mechanical arm in the robot coordinate system can be obtained; this pose is a known quantity that can be measured and determined by sensors on the robot. Meanwhile, after each adjustment of the pose of the calibration plate, the measurement position coordinates of the position points on the calibration plate in the camera coordinate system can be collected by the camera.
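As an illustration only (not the patent's implementation), the geometry of step 202 can be sketched in numpy: given a board-to-camera pose estimated from a calibration plate image (for example by a PnP solver), the board corners are mapped into the camera coordinate system as the measured coordinates. All names and numeric values below are hypothetical.

```python
import numpy as np

def board_points(rows, cols, square):
    """Corner coordinates in the calibration-plate frame (the z = 0 plane)."""
    xs, ys = np.meshgrid(np.arange(cols), np.arange(rows))
    return np.stack([xs, ys, np.zeros_like(xs)], axis=-1).reshape(-1, 3) * square

def to_camera_frame(T_cam_board, pts_board):
    """Measured coordinates: apply the 4x4 board-to-camera pose to Nx3 points."""
    homo = np.hstack([pts_board, np.ones((len(pts_board), 1))])
    return (T_cam_board @ homo.T).T[:, :3]

# hypothetical board pose in the camera frame (as a PnP solver would estimate)
T = np.eye(4)
T[:3, 3] = [0.1, 0.0, 0.8]            # plate roughly 0.8 m in front of the camera
meas = to_camera_frame(T, board_points(4, 5, 0.03))  # 4x5 corners, 30 mm squares
```

Each row of `meas` is one position point's measured coordinate in the camera coordinate system, collected once per calibration plate pose.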
In a specific implementation, there may be various schemes for determining the initial pose of the camera in the robot coordinate system. For example, as shown in fig. 3, the pose of the flange in the robot coordinate system and the pose of the calibration plate in the camera coordinate system may be acquired multiple times. The step 102 of determining an initial pose of the camera in the robot coordinate system may further include:
step 301: in the process that the robot drives the calibration plate to move through the flange, acquiring a plurality of poses of the flange in the robot coordinate system and a plurality of poses of the calibration plate in the camera coordinate system;
step 302: and determining the initial pose of the camera in the robot coordinate system according to the plurality of poses of the flange in the robot coordinate system and the plurality of poses of the calibration plate in the camera coordinate system.
Further, the step 302 of determining the initial pose of the camera in the robot coordinate system according to the poses of the flange in the robot coordinate system and the poses of the calibration plate in the camera coordinate system can be calculated according to the following formula:

T_robot_cam = T_robot_flange · T_flange_board · (T_cam_board)^(-1)

wherein T_robot_cam represents the initial pose of the camera in the robot coordinate system; T_robot_flange represents the pose of the flange in the robot coordinate system; T_flange_board represents the pose of the calibration plate relative to the flange; and T_cam_board represents the pose of the calibration plate in the camera coordinate system, as measured by the camera. It can be understood that while the mechanical arm drives the calibration plate to move through the flange, the pose of the calibration plate relative to the flange does not change, i.e. T_flange_board is a constant, and the initial pose of the camera in the robot coordinate system can therefore be obtained according to the formula.
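The pose chain described above can be sketched in numpy. The sketch assumes the pose of the calibration plate relative to the flange is already known; in practice, when it is also unknown, both unknowns are typically recovered together with a hand-eye (AX = XB) solver. The synthetic pose values below are hypothetical.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous pose from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def camera_pose_in_robot(T_robot_flange, T_flange_board, T_cam_board):
    """T_robot_cam = T_robot_flange @ T_flange_board @ inv(T_cam_board)."""
    return T_robot_flange @ T_flange_board @ np.linalg.inv(T_cam_board)

# synthetic ground truth (all values hypothetical)
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90 deg about z
T_robot_cam = make_T(Rz, [1.0, 0.2, 0.5])
T_flange_board = make_T(np.eye(3), [0.0, 0.0, 0.1])
T_robot_flange = make_T(np.eye(3), [0.4, 0.0, 0.3])
# what the camera would measure for this flange pose
T_cam_board = np.linalg.inv(T_robot_cam) @ T_robot_flange @ T_flange_board
recovered = camera_pose_in_robot(T_robot_flange, T_flange_board, T_cam_board)
```

Because the board-to-flange pose is fixed, every flange pose / camera measurement pair yields the same camera pose, which is how the synthetic round trip verifies the chain.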
In a specific implementation, the step 103: the determining a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system may be implemented in various embodiments, for example, as shown in fig. 4, and may further include:
step 401: dividing a plurality of position points into a plurality of groups of position points according to the difference of space areas under a camera coordinate system;
step 402: and determining a compensation matrix aiming at the camera according to the measurement position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
The position points are divided according to the space areas under the camera coordinate system, so that different space depths and areas have corresponding compensation matrixes, and the accuracy of the position point coordinates under the camera coordinate system is improved.
In a specific implementation, the pose of the calibration plate relative to the robot flange can be determined according to the initial pose of the camera in the robot coordinate system, the measurement position coordinates of the position points in the camera coordinate system, and the pose of the flange in the robot coordinate system. Further, the theoretical position coordinates of the position points in the camera coordinate system can be determined according to the position coordinates of the position points in the calibration plate coordinate system and the pose of the flange in the robot coordinate system:

P_theory = (T_robot_cam)^(-1) · T_robot_flange · T_flange_board · P_board

wherein P_theory represents the matrix of theoretical position coordinates of the plurality of points on the calibration plate in the camera coordinate system; T_robot_cam represents the initial pose of the camera in the robot coordinate system; T_robot_flange represents the pose of the flange in the robot coordinate system; T_flange_board represents the pose of the calibration plate relative to the flange; and P_board represents the position coordinates of the points in the calibration plate coordinate system. That is, the theoretical coordinates corresponding to the n position points can be expressed as P_theory = [p_1, p_2, ..., p_n].
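Under the same notation, mapping calibration-plate points to their theoretical camera-frame coordinates can be sketched as follows (the poses below are hypothetical placeholders, not values from the patent):

```python
import numpy as np

def theoretical_in_camera(T_robot_cam, T_robot_flange, T_flange_board, pts_board):
    """Theoretical camera-frame coordinates: chain board points through the
    flange into the robot frame, then back into the camera frame."""
    T = np.linalg.inv(T_robot_cam) @ T_robot_flange @ T_flange_board
    homo = np.hstack([pts_board, np.ones((len(pts_board), 1))])
    return (homo @ T.T)[:, :3]

# hypothetical poses: flange and board frames coincide with the robot frame,
# camera placed at z = -1 m in the robot frame
T_robot_cam = np.eye(4)
T_robot_cam[2, 3] = -1.0
pts_board = np.array([[0.0, 0.0, 0.0], [0.03, 0.0, 0.0]])  # two corners, 30 mm apart
theo = theoretical_in_camera(T_robot_cam, np.eye(4), np.eye(4), pts_board)
```

With these placeholder poses the board points simply appear 1 m in front of the camera, which makes the chain easy to check by hand.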
In a specific implementation, there are various ways of dividing the plurality of position points into a plurality of groups. For example, as shown in fig. 22, the position points are divided according to different regions of space in the camera coordinate system: the space may be divided into a plurality of layers by height, and each layer may then be divided into a plurality of partitions according to set regions, so as to obtain a plurality of groups of position points. Specifically, as shown in fig. 5, the step 401 of dividing the plurality of position points into a plurality of groups of position points according to the spatial regions in the camera coordinate system may further include:
step 501: dividing a space into a plurality of layers according to different heights under a camera coordinate system, wherein a plurality of partitions are divided in each layer;
step 502: and dividing the position points in the same partition in the same layer into a group of position points according to the position coordinates of the position points in the camera coordinate system.
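Steps 501 and 502 can be sketched as a simple binning scheme, assuming layer boundaries along z and a rectangular x-y grid of partitions (the boundary values below are hypothetical, not taken from the patent):

```python
import numpy as np
from collections import defaultdict

def group_by_region(pts_cam, z_edges, x_edges, y_edges):
    """Step 501: layer index from the z coordinate.
    Step 502: partition index from the (x, y) coordinates within that layer."""
    groups = defaultdict(list)
    for p in pts_cam:
        layer = int(np.searchsorted(z_edges, p[2]))
        part = (int(np.searchsorted(x_edges, p[0])),
                int(np.searchsorted(y_edges, p[1])))
        groups[(layer, part)].append(p)
    return {k: np.array(v) for k, v in groups.items()}

# hypothetical boundaries: one z boundary (two layers), x-y quadrants per layer
pts = np.array([[0.1, 0.1, 0.5],
                [0.1, 0.1, 1.5],
                [-0.2, 0.1, 0.5]])
g = group_by_region(pts, z_edges=[1.0], x_edges=[0.0], y_edges=[0.0])
```

The three sample points fall into three different (layer, partition) groups, each of which would later receive its own compensation matrix.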
In specific implementation, after the compensation matrix is obtained, the position coordinates of the position points in the camera coordinate system can be compensated according to the compensation matrix, so that accurate position coordinates are obtained. As shown in fig. 1, at step 103: after determining the compensation matrix for the camera according to the position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system, the method may further include:
step 104: acquiring position coordinates of object surface position points acquired by a camera in a camera coordinate system;
step 105: and calibrating the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates.
In specific implementation, after the compensation matrix is obtained, the position coordinates of the camera can be compensated through the compensation matrix, so that the requirement on the accuracy of the initial position coordinates of the position points shot by the camera can be reduced, and the requirement on the precision of the camera is further reduced.
In a specific implementation, the step 105: various embodiments may be implemented to calibrate the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates, for example, as shown in fig. 6, the method may further include:
step 601: determining the layering and the partition of the object surface position point according to the position coordinate of the object surface position point in a camera coordinate system and the internal reference of a camera;
step 602: and compensating the position coordinates of the object surface position points under the camera coordinate system according to the compensation matrix corresponding to the layering and the partition where the object surface position points are located, so as to obtain the compensated accurate position coordinates.
In a specific implementation, the step 601 of determining the layering and the partition where the object surface position point is located may have various embodiments. For example, the z-axis coordinate, the x-axis coordinate and the y-axis coordinate of the object surface position point in the camera coordinate system may be extracted respectively, so as to determine the layering and the partition of the position point. Specifically, as shown in fig. 7, the layering and the partition of the object surface position point may be determined as follows:
step 701: determining the layering of the object surface position points according to the z-axis coordinates of the object surface position points in a camera coordinate system;
step 702: determining pixel coordinates corresponding to the object surface position points according to the x-axis coordinates and the y-axis coordinates of the object surface position points in a camera coordinate system and internal parameters of a camera;
step 703: and determining the partition where the object surface position point is located according to the pixel coordinate corresponding to the object surface position point.
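Steps 701 to 703 can be sketched as follows, assuming a pinhole intrinsic matrix K and an image divided into an nx-by-ny grid of partitions (K and all numeric values are hypothetical):

```python
import numpy as np

def locate_point(p_cam, z_edges, K, img_w, img_h, nx, ny):
    """Step 701: layer from the z coordinate. Step 702: pixel via the pinhole
    model. Step 703: partition from the pixel's cell in an nx-by-ny grid."""
    x, y, z = p_cam
    layer = int(np.searchsorted(z_edges, z))
    u = K[0, 0] * x / z + K[0, 2]   # u = fx * x / z + cx
    v = K[1, 1] * y / z + K[1, 2]   # v = fy * y / z + cy
    part = (min(int(u / img_w * nx), nx - 1),
            min(int(v / img_h * ny), ny - 1))
    return layer, part

K = np.array([[600.0, 0.0, 320.0],   # assumed pinhole intrinsics
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
layer, part = locate_point((0.1, -0.05, 1.2), z_edges=[1.0, 2.0], K=K,
                           img_w=640, img_h=480, nx=2, ny=2)
```

The (layer, part) pair then selects which compensation matrix is applied to the point.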
In a specific implementation, the step 602 of compensating the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix corresponding to the layering and the partition where the point is located, so as to obtain the compensated accurate position coordinates, may have various implementation schemes. For example, in practice a position point may not belong entirely to any single layer or partition; in order to improve the compensation accuracy for position points that lie across layers or partitions, when compensating the position coordinates of the object surface position point in the camera coordinate system, as shown in fig. 8, the method may further include:
step 801: when the object surface position point is in the cross-layering and/or cross-partitioning area, determining the weight of each cross-layering and/or partitioning area according to the distance between the object surface position point and the cross-layering and/or partitioning area;
step 802: respectively compensating the position coordinates of the object surface position points under a camera coordinate system according to the compensation matrixes of the cross-layers and/or the partitions to obtain a plurality of compensation position coordinates of the object surface position points;
step 803: and carrying out weighted summation on a plurality of compensation position coordinates of the object surface position points according to the weights of all cross-layering and/or partitioning areas, and obtaining the compensation position coordinates after weighted summation.
Specifically, under the condition that the position point belongs to different partitions of the same hierarchy, the compensation result of the position point is weighted and summed according to the distance between the position point and the different partitions and the weight corresponding to the distance between the position point and the different partitions, so as to obtain the compensation position coordinate corresponding to the position point after weighted summation. And under the condition that the position points belong to different layers, carrying out weighted summation on the compensation results of the position points according to the distances between the position points and the different layers and the weights corresponding to the distances between the position points and the different layers to obtain the compensation position coordinates corresponding to the position points after weighted summation. And under the condition that the position points belong to different layers and different partitions, carrying out weighted summation on the compensation results of the position points according to the distances between the position points and the different layers and the distances between the position points and the different partitions to obtain the compensation position coordinates after weighted summation corresponding to the position points. Therefore, by performing weighted summation on the compensation results of the position points located in the boundary region, the accuracy of the obtained coordinate positions of the boundary position points can be improved.
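One possible reading of steps 801 to 803, using normalized inverse-distance weights (the specific weighting function and all numeric values are illustrative assumptions, not mandated by the text):

```python
import numpy as np

def blended_compensation(p, region_Ms, region_dists):
    """Compensate p with each neighboring region's matrix, then weight the
    results by normalized inverse distance to each region (steps 801-803)."""
    ph = np.append(p, 1.0)
    w = 1.0 / (np.asarray(region_dists) + 1e-9)   # closer region -> larger weight
    w = w / w.sum()
    outs = [(M @ ph)[:3] / (M @ ph)[3] for M in region_Ms]
    return sum(wi * o for wi, o in zip(w, outs))

# a point equidistant from two regions whose matrices shift x by +/- 1 mm:
# the blended result should land exactly in between
M1, M2 = np.eye(4), np.eye(4)
M1[0, 3], M2[0, 3] = 0.001, -0.001
blend = blended_compensation(np.array([0.1, 0.2, 0.9]), [M1, M2], [0.02, 0.02])
```

With equal distances the two opposite shifts cancel, so the blended coordinate equals the original point, which is the continuity property the weighted summation is meant to provide at region boundaries.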
In a specific implementation, the step 402: determining a compensation matrix for the camera according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system, as shown in fig. 9, the method may further include:
901: determining initial theoretical position coordinates of each group of position points in a camera coordinate system according to the initial pose of the camera in the robot coordinate system;
902: fitting the measurement position coordinates and the initial theoretical position coordinates of each group of position points in a camera coordinate system, and determining an initial compensation matrix for the camera;
903: acquiring a plurality of adjusted poses of the camera in the robot coordinate system, and determining the theoretical position coordinates of the plurality of groups of position points after each camera adjustment;
904: adjusting the initial compensation matrix according to the current poses returned after each adjustment of the camera in the robot coordinate system, the measurement position coordinates of the plurality of groups of position points in the camera coordinate system, and the current theoretical position coordinates after each camera adjustment, until the Euclidean distance of the error between the measurement position coordinates of each group of position points in the camera coordinate system and the current theoretical position coordinates is smaller than a preset threshold and/or the number of adjustments reaches a preset count, and taking the current compensation matrix as the compensation matrix for the camera.
Further, step 902: when the measured position coordinates and the initial theoretical position coordinates of each group of position points in the camera coordinate system are fitted to determine the initial compensation matrix for the camera, various embodiments are possible, for example, the fitting may be performed by a least square method.
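A least-squares fit of the kind named in step 902 can be sketched in numpy, assuming the compensation matrix M is a 4x4 homogeneous matrix applied to the measured coordinates (the synthetic correction values below are hypothetical):

```python
import numpy as np

def fit_compensation(P_meas, P_theory):
    """Least-squares 4x4 M such that M @ p_meas ~= p_theory (homogeneous)."""
    A = np.hstack([P_meas, np.ones((len(P_meas), 1))])     # N x 4
    B = np.hstack([P_theory, np.ones((len(P_theory), 1))])
    M_T, *_ = np.linalg.lstsq(A, B, rcond=None)            # solves A @ M.T = B
    return M_T.T

# recover a known small correction from synthetic data
rng = np.random.default_rng(0)
P = rng.uniform(-0.5, 0.5, size=(20, 3))
M_true = np.eye(4)
M_true[:3, 3] = [0.002, -0.001, 0.003]   # a few millimetres of offset
M_true[0, 1] = 0.001                     # a slight shear term
Q = (np.hstack([P, np.ones((20, 1))]) @ M_true.T)[:, :3]
M_fit = fit_compensation(P, Q)
```

Because the synthetic data is exactly consistent with an affine correction, the least-squares solution recovers the ground-truth matrix; with real measurements the fit instead minimizes the residual over all points in the group.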
In a specific implementation, the step 402: according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system, determining a compensation matrix for the camera, and calculating according to the following formula:
e = P_theory - M · P_meas

wherein e represents the error of the position coordinates; P_theory represents the theoretical position coordinates; P_meas represents the measurement position coordinates; and M denotes the compensation matrix.
In a specific implementation, according to the measurement position coordinates and the theoretical position coordinates respectively corresponding to the plurality of groups of position points and the error formula e = P_theory - M · P_meas, the pose of the camera in the robot coordinate system, T_robot_cam, is adjusted multiple times so that the Euclidean distance ||e|| is minimized; the compensation matrix M for the camera is thereby obtained, where the compensation matrix includes a compensation quantity for each position region in each layer. The measurement position coordinates P_meas can be captured directly by the camera.
In a specific implementation, the step 901 of determining the initial theoretical position coordinates of each group of position points in the camera coordinate system according to the initial pose of the camera in the robot coordinate system can adopt various implementation schemes. For example, the theoretical position coordinates of each group of position points in the camera coordinate system can be determined by finding an intermediate medium between the position points and the camera coordinate system, so as to carry the position points on the calibration plate into the camera coordinate system. As shown in fig. 10, the step 901 may further include the following steps:
1001: acquiring the pose of the flange corresponding to each group of position points in a robot coordinate system and the pose of the calibration plate in a camera coordinate system;
1002: determining the pose of the calibration plate relative to the flange according to the initial pose of the camera in the robot coordinate system, the pose of the flange in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
1003: determining the position coordinates of each group of position points in the robot coordinate system according to the position coordinates of each group of position points in the calibration plate coordinate system, the pose of the calibration plate relative to the flange and the pose of the flange in the robot coordinate system;
1004: and determining initial theoretical position coordinates of the position points in the camera coordinate system according to the position coordinates of the position points in the robot coordinate system and the initial pose of the camera in the robot coordinate system.
In specific implementation, the step 105: there are various embodiments for calibrating the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates, for example, the following formula may be used for calculation:
M·A=A’
wherein M represents the compensation matrix; A represents the position coordinates of the object surface position point in the camera coordinate system; and A' represents the compensated position coordinates of the object surface position point in the camera coordinate system. Furthermore, after normalization processing is carried out on A', the accurate position coordinates of the position point can be obtained.
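The formula M·A = A' together with the normalization step can be sketched as follows (the depth correction value is hypothetical):

```python
import numpy as np

def apply_compensation(M, pts_cam):
    """A' = M @ A in homogeneous form, then normalize by the w component."""
    homo = np.hstack([pts_cam, np.ones((len(pts_cam), 1))])
    out = homo @ M.T
    return out[:, :3] / out[:, 3:4]       # normalization of A'

M = np.eye(4)
M[2, 3] = -0.004                          # hypothetical 4 mm depth correction
fixed = apply_compensation(M, np.array([[0.1, 0.2, 0.903]]))
```

Here the compensation simply pulls the measured depth back by 4 mm; in general M mixes rotation, translation and scale corrections for the region it was fitted to.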
As shown in fig. 11, the present invention further provides a camera calibration apparatus, which includes:
a coordinate acquisition module 1101, configured to acquire measurement position coordinates of a plurality of position points in a camera coordinate system;
an external parameter determining module 1102, configured to determine an initial pose of the camera in a robot coordinate system;
a compensation matrix determining module 1103, configured to determine a compensation matrix for the camera according to the measured position coordinates of the multiple position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, as shown in fig. 12, the coordinate obtaining module 1101 further includes:
the image acquisition sub-module 1201 is configured to acquire a plurality of calibration plate images acquired by the camera when the calibration plate moves to a plurality of spatial positions;
and the coordinate calculation submodule 1202 is configured to obtain measurement position coordinates of the plurality of position points in the camera coordinate system according to the plurality of calibration plate images acquired by the camera.
In a specific implementation, as shown in fig. 13, the external parameter determining module 1102 further includes:
the flange coordinate acquisition submodule 1301 is configured to acquire a plurality of poses of the flange in the robot coordinate system and a plurality of poses of the calibration plate in the camera coordinate system in the process that the robot drives the calibration plate to move through the flange;
and the external parameter calculation submodule 1302 is configured to determine the initial pose of the camera in the robot coordinate system according to the plurality of poses of the flange in the robot coordinate system and the plurality of poses of the calibration plate in the camera coordinate system.
In a specific implementation, as shown in fig. 14, the compensation matrix determining module 1103 further includes:
a coordinate grouping submodule 1401 for dividing the plurality of position points into a plurality of groups of position points according to the difference of the spatial regions in the camera coordinate system;
and the compensation matrix obtaining submodule 1402 is configured to determine a compensation matrix for the camera according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
In a specific implementation, as shown in fig. 15, the coordinate grouping sub-module 1401 further includes:
the space division submodule 1501 is used for dividing a space into a plurality of layers according to different heights in a camera coordinate system, and a plurality of partitions are divided in each layer;
the grouping sub-module 1502 is configured to divide the location points in the same partition in the same layer into a group of location points according to the location coordinates of the location points in the camera coordinate system.
In a specific implementation, the calibration apparatus for a camera further includes:
a position point coordinate acquisition module 1104, configured to obtain position coordinates of a position point on the surface of the object, acquired by the camera, in a camera coordinate system after determining a compensation matrix for the camera;
the accurate position coordinate calculation module 1105 is configured to calibrate the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix, so as to obtain the compensated accurate position coordinates.
In a specific implementation, as shown in fig. 16, the accurate position coordinate obtaining module 1105 further includes:
the spatial region determining sub-module 1601 is configured to determine a hierarchy and a partition where the object surface location point is located according to the location coordinate of the object surface location point in the camera coordinate system and the internal reference of the camera;
the accurate position coordinate obtaining sub-module 1602 is configured to compensate the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix corresponding to the layering and the partitioning where the object surface position points are located, so as to obtain compensated accurate position coordinates.
In a specific implementation, as shown in fig. 17, the spatial region determining sub-module 1601 further includes:
a layering determination submodule 1701 for determining the layering of the object surface position point according to the z-axis coordinate of the object surface position point in the camera coordinate system;
a pixel coordinate determining submodule 1702 for determining a pixel coordinate corresponding to the object surface location point according to the x-axis coordinate and the y-axis coordinate of the object surface location point in the camera coordinate system and the internal reference of the camera;
and a partition determining submodule 1703, configured to determine, according to the pixel coordinates corresponding to the object surface location point, a partition where the object surface location point is located.
In a specific implementation, as shown in fig. 18, the accurate position coordinate obtaining sub-module 1602 further includes:
a weight determination submodule 1801, configured to determine, when the object surface location point is in a cross-hierarchy and/or cross-partition region, a weight of each cross-hierarchy and/or partition region according to a distance between the object surface location point and the cross-hierarchy and/or partition region;
the compensation submodule 1802 is configured to compensate the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrices of the cross-hierarchies and/or partitions, so as to obtain a plurality of compensation position coordinates of the object surface position points;
and a weighted summation submodule 1803, configured to perform weighted summation on the multiple compensated position coordinates of the object surface position point according to the weights across the hierarchies and/or partitions, so as to obtain a compensated position coordinate after weighted summation.
In a specific implementation, as shown in fig. 19, the compensation matrix obtaining sub-module 1402 further includes:
an initial theoretical position coordinate obtaining submodule 1901, configured to determine initial theoretical position coordinates of each group of position points in a camera coordinate system according to an initial pose of the camera in the robot coordinate system;
an initial compensation matrix obtaining submodule 1902, configured to fit measurement position coordinates and initial theoretical position coordinates of each group of position points in a camera coordinate system, and determine an initial compensation matrix for a camera;
an adjustment data acquisition submodule 1903, configured to acquire a plurality of adjusted poses of the camera in the robot coordinate system and determine the theoretical position coordinates of the plurality of groups of position points after each camera adjustment;
the matrix determination submodule 1904 is configured to adjust the initial compensation matrix according to the current poses returned after each adjustment of the camera in the robot coordinate system, the measurement position coordinates of the plurality of groups of position points in the camera coordinate system, and the current theoretical position coordinates after each camera adjustment, until the Euclidean distance between the measurement position coordinates of each group of position points in the camera coordinate system and the current theoretical position coordinates is smaller than a preset threshold and/or the number of adjustments reaches a preset count, and to take the current compensation matrix as the compensation matrix for the camera.
In a specific implementation, as shown in fig. 20, the initial theoretical position coordinate obtaining sub-module 1901 further includes:
a flange and calibration plate pose acquisition submodule 2001 for acquiring poses of the flange corresponding to each set of position points in the robot coordinate system and the calibration plate in the camera coordinate system;
a calibration plate relative flange pose acquisition sub-module 2002 for determining the pose of the calibration plate relative to the flange according to the initial pose of the camera in the robot coordinate system, the pose of the flange in the robot coordinate system, and the pose of the calibration plate in the camera coordinate system;
the position point relative robot coordinate obtaining sub-module 2003 is used for determining the position coordinates of each group of position points in the robot coordinate system according to the position coordinates of each group of position points in the calibration plate coordinate system, the pose of the calibration plate relative to the flange and the pose of the flange in the robot coordinate system;
the initial theoretical position coordinate determination submodule 2004 is configured to determine initial theoretical position coordinates of each group of position points in the camera coordinate system according to the position coordinates of each group of position points in the robot coordinate system and the initial pose of the camera in the robot coordinate system.
The invention also provides electronic equipment which comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, wherein the processor realizes the calibration method of the camera when executing the computer program.
The present invention also provides a computer-readable storage medium storing a computer program for executing the calibration method of the camera.
In summary, the camera calibration method and apparatus, electronic device and readable storage medium provided by the invention acquire the measurement position coordinates of a plurality of position points in the camera coordinate system; determine the initial pose of the camera in the robot coordinate system; and determine a compensation matrix for the camera according to the measurement position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system. The calibration method can determine the external parameters of the camera and a compensation matrix for the internal parameters at the same time, so that the position coordinates of position points in the camera coordinate system can be compensated according to the compensation matrix to obtain more accurate position coordinates, effectively improving the precision of the 3D point cloud image captured by a 3D camera.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, and it should be understood that the above-mentioned embodiments are only examples of the present invention and should not be used to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (14)

1. A camera calibration method, characterized by comprising the following steps:
acquiring measurement position coordinates of a plurality of position points in a camera coordinate system;
determining an initial pose of a camera under a robot coordinate system;
and determining a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
2. The calibration method according to claim 1, wherein the obtaining of the measured position coordinates of the plurality of position points in the camera coordinate system further comprises:
acquiring a plurality of calibration plate images acquired by a camera when the calibration plate moves to a plurality of spatial positions;
and acquiring the measurement position coordinates of the plurality of position points in a camera coordinate system according to the plurality of calibration plate images acquired by the camera.
3. The calibration method according to claim 1, wherein the determining of the initial pose of the camera in the robot coordinate system further comprises:
in the process in which the robot drives the calibration plate to move through the flange, acquiring a plurality of poses of the flange in the robot coordinate system and a plurality of poses of the calibration plate in the camera coordinate system;
and determining the initial pose of the camera in the robot coordinate system according to the plurality of poses of the flange in the robot coordinate system and the plurality of poses of the calibration plate in the camera coordinate system.
4. The calibration method according to claim 1, wherein determining a compensation matrix for the camera according to the measured position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system further comprises:
dividing the plurality of position points into a plurality of groups of position points according to the spatial regions in which they lie in the camera coordinate system;
and determining a compensation matrix aiming at the camera according to the measurement position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
5. The calibration method according to claim 4, wherein the dividing the plurality of position points into a plurality of groups of position points according to the difference of the spatial regions in the camera coordinate system further comprises:
dividing the space into a plurality of layers according to height in the camera coordinate system, wherein each layer is divided into a plurality of partitions;
and dividing the position points in the same partition in the same layer into a group of position points according to the position coordinates of the position points in the camera coordinate system.
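A minimal sketch of the layering-and-partitioning scheme of claim 5, assuming uniform z-slabs and a uniform x-y grid within each layer (the patent does not fix the partition shape); all names and parameters are illustrative:

```python
def partition_index(point, z_min, layer_height, grid_origin, cell_size, n_cols):
    """Map a camera-frame point (x, y, z) to a (layer, partition) pair:
    layers are horizontal z-slabs of equal thickness, and each layer is
    split into an n_cols-wide grid of square x-y cells, numbered row-major."""
    x, y, z = point
    layer = int((z - z_min) // layer_height)
    col = int((x - grid_origin[0]) // cell_size)
    row = int((y - grid_origin[1]) // cell_size)
    return layer, row * n_cols + col
```

Position points that land in the same (layer, partition) pair form one group and share one compensation matrix.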
6. The calibration method of claim 1, after determining the compensation matrix for the camera, further comprising:
acquiring position coordinates of object surface position points acquired by a camera in a camera coordinate system;
and calibrating the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates.
7. The calibration method according to claim 6, wherein the calibrating the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix to obtain the compensated accurate position coordinates further comprises:
determining the layering and the partition of the object surface position point according to the position coordinate of the object surface position point in a camera coordinate system and the internal reference of a camera;
and compensating the position coordinates of the object surface position points under the camera coordinate system according to the compensation matrix corresponding to the layering and the partition of the object surface position points to obtain the compensated accurate position coordinates.
8. The calibration method according to claim 7, wherein the determining the layer and the partition where the position point on the surface of the object is located according to the position coordinates of the position point on the surface of the object in the camera coordinate system and the internal parameters of the camera further comprises:
determining the layering of the object surface position points according to the z-axis coordinates of the object surface position points in a camera coordinate system;
determining pixel coordinates corresponding to the object surface position points according to the x-axis coordinates and the y-axis coordinates of the object surface position points in a camera coordinate system and internal parameters of a camera;
and determining the partition where the object surface position point is located according to the pixel coordinate corresponding to the object surface position point.
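The pixel-coordinate step of claim 8 is the standard pinhole projection; a sketch under the assumption of an undistorted pinhole model with focal lengths `fx`, `fy` and principal point `(cx, cy)` as the camera intrinsics:

```python
def project_to_pixel(point, fx, fy, cx, cy):
    """Standard pinhole projection of a camera-frame point (x, y, z)
    to pixel coordinates (u, v), ignoring lens distortion:
    u = fx * x / z + cx,  v = fy * y / z + cy."""
    x, y, z = point
    return fx * x / z + cx, fy * y / z + cy
```

The resulting (u, v) then selects the partition within the layer already chosen from the point's z coordinate.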
9. The calibration method according to claim 7, wherein the compensating the position coordinates of the object surface position points in the camera coordinate system according to the compensation matrix corresponding to the layer and partition where the object surface position points are located, to obtain compensated accurate position coordinates, further comprises:
when an object surface position point lies in a region spanning multiple layers and/or partitions, determining a weight for each such layer and/or partition according to the distance between the position point and that layer and/or partition;
compensating the position coordinates of the object surface position point in the camera coordinate system according to the compensation matrix of each such layer and/or partition, to obtain a plurality of compensated position coordinates of the position point;
and weighting and summing the plurality of compensated position coordinates of the position point according to the weights of the layers and/or partitions, to obtain the weighted-sum compensated position coordinates.
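A sketch of the cross-region blending in claim 9, assuming inverse-distance weights normalized to sum to one (the claim only states that weights are derived from distances, so this particular weighting is an assumption):

```python
def blend_compensations(candidates, distances, eps=1e-9):
    """Weighted sum of candidate compensated coordinates, one candidate per
    neighboring layer/partition, weighted by inverse distance to that region.
    `candidates` is a list of (x, y, z) tuples, `distances` the matching
    list of distances; `eps` guards against division by zero on a boundary."""
    weights = [1.0 / (d + eps) for d in distances]
    total = sum(weights)
    weights = [w / total for w in weights]
    dim = len(candidates[0])
    return tuple(sum(w * c[i] for w, c in zip(weights, candidates))
                 for i in range(dim))
```

Blending in this way avoids discontinuities in the compensated point cloud at layer and partition boundaries, since neighboring matrices are interpolated rather than switched abruptly.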
10. The calibration method according to claim 4, wherein the determining a compensation matrix for the camera according to the measured position coordinates of each group of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system further comprises:
determining initial theoretical position coordinates of each group of position points in a camera coordinate system according to the initial pose of the camera in the robot coordinate system;
fitting the measured position coordinates and the initial theoretical position coordinates of each group of position points in a camera coordinate system, and determining an initial compensation matrix for the camera;
acquiring a plurality of adjusted poses of the camera in the robot coordinate system, and determining the theoretical position coordinates of each group of position points after each adjustment of the camera;
and adjusting the initial compensation matrix according to the current pose of the camera in the robot coordinate system after each adjustment, the measured position coordinates of each group of position points in the camera coordinate system, and the current theoretical position coordinates after the adjustment, until the Euclidean distance between the measured position coordinates and the current theoretical position coordinates of each group of position points in the camera coordinate system is smaller than a preset threshold and/or a preset number of adjustments has been performed, and taking the current compensation matrix as the compensation matrix for the camera.
11. The calibration method according to claim 10, wherein the determining the initial theoretical position coordinates of each set of position points in the camera coordinate system according to the initial pose of the camera in the robot coordinate system further comprises:
acquiring the positions and postures of the flanges corresponding to the groups of position points in a robot coordinate system and the positions and postures of the calibration plates in a camera coordinate system;
determining the pose of the calibration plate relative to the flange according to the initial pose of the camera in the robot coordinate system, the pose of the flange in the robot coordinate system and the pose of the calibration plate in the camera coordinate system;
determining the position coordinates of each group of position points in the robot coordinate system according to the position coordinates of each group of position points in the calibration plate coordinate system, the pose of the calibration plate relative to the flange and the pose of the flange in the robot coordinate system;
and determining the initial theoretical position coordinates of each group of position points in the camera coordinate system according to the position coordinates of each group of position points in the robot coordinate system and the initial pose of the camera in the robot coordinate system.
12. A camera calibration device, characterized in that the device comprises:
the coordinate acquisition module is used for acquiring the measurement position coordinates of the plurality of position points in a camera coordinate system;
the external parameter determining module is used for determining the initial pose of the camera under the robot coordinate system;
and the compensation matrix determining module is used for determining a compensation matrix for the camera according to the measurement position coordinates of the plurality of position points in the camera coordinate system and the initial pose of the camera in the robot coordinate system.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the calibration method of the camera according to any one of claims 1 to 11 when executing the computer program.
14. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program for executing the calibration method of the camera according to any one of claims 1 to 11.
CN202111083399.0A 2021-09-16 2021-09-16 Camera calibration method and device, electronic equipment and storage medium Pending CN115810052A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111083399.0A CN115810052A (en) 2021-09-16 2021-09-16 Camera calibration method and device, electronic equipment and storage medium
PCT/CN2021/138574 WO2023040095A1 (en) 2021-09-16 2021-12-15 Camera calibration method and apparatus, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111083399.0A CN115810052A (en) 2021-09-16 2021-09-16 Camera calibration method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115810052A true CN115810052A (en) 2023-03-17

Family

ID=85482008

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111083399.0A Pending CN115810052A (en) 2021-09-16 2021-09-16 Camera calibration method and device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN115810052A (en)
WO (1) WO2023040095A1 (en)

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010031248A1 (en) * 2010-07-12 2012-01-12 Kuka Roboter Gmbh Method for measuring a robot arm of an industrial robot
JP7035657B2 (en) * 2018-03-15 2022-03-15 セイコーエプソン株式会社 Robot control device, robot, robot system, and camera calibration method
CN110276806B (en) * 2019-05-27 2023-06-09 江苏大学 Online hand-eye calibration and grabbing pose calculation method for four-degree-of-freedom parallel robot stereoscopic vision hand-eye system
CN111136656B (en) * 2019-12-24 2020-12-08 上海智殷自动化科技有限公司 Method for automatically identifying and grabbing three-dimensional irregular object of robot
CN111660295B (en) * 2020-05-28 2023-01-03 中国科学院宁波材料技术与工程研究所 Industrial robot absolute precision calibration system and calibration method
CN112561886A (en) * 2020-12-18 2021-03-26 广东工业大学 Automatic workpiece sorting method and system based on machine vision
CN112847341B (en) * 2020-12-25 2024-02-02 中国科学院宁波材料技术与工程研究所 Industrial robot step-by-step calibration system and method

Also Published As

Publication number Publication date
WO2023040095A1 (en) 2023-03-23

Similar Documents

Publication Publication Date Title
CN111982072B (en) Target ranging method based on monocular vision
JP5999615B2 (en) Camera calibration information generating apparatus, camera calibration information generating method, and camera calibration information generating program
CN109297436B (en) Binocular line laser stereo measurement reference calibration method
CN111192235B (en) Image measurement method based on monocular vision model and perspective transformation
US10499038B2 (en) Method and system for recalibrating sensing devices without familiar targets
WO2018201677A1 (en) Bundle adjustment-based calibration method and device for telecentric lens-containing three-dimensional imaging system
CN112184811B (en) Monocular space structured light system structure calibration method and device
CN111899305A (en) Camera automatic calibration optimization method and related system and equipment
US20220230348A1 (en) Method and apparatus for determining a three-dimensional position and pose of a fiducial marker
CN111207670A (en) Line structured light calibration device and method
CN113494893A (en) Calibration method and device of three-dimensional laser scanning system and computer equipment
CN110044266B (en) Photogrammetry system based on speckle projection
CN113822920B (en) Method for acquiring depth information by structured light camera, electronic equipment and storage medium
CN112712566B (en) Binocular stereo vision sensor measuring method based on structure parameter online correction
KR102023087B1 (en) Method for camera calibration
CN113393413B (en) Water area measuring method and system based on monocular and binocular vision cooperation
CN109141344A (en) A kind of method and system based on the accurate ranging of binocular camera
CN112525106A (en) Three-phase machine cooperative laser-based 3D detection method and device
CN115810052A (en) Camera calibration method and device, electronic equipment and storage medium
CN115014398B (en) Monocular stereoscopic vision measurement system position and attitude calibration method, device and system
CN113012279B (en) Non-contact three-dimensional imaging measurement method and system and computer readable storage medium
CN115018922A (en) Distortion parameter calibration method, electronic device and computer readable storage medium
CN110232715B (en) Method, device and system for self calibration of multi-depth camera
CN114170321A (en) Camera self-calibration method and system based on distance measurement
CN115719387A (en) 3D camera calibration method, point cloud image acquisition method and camera calibration system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination