CN110866956A - Robot calibration method and terminal - Google Patents

Robot calibration method and terminal

Info

Publication number
CN110866956A
Authority
CN
China
Prior art keywords
robot
calibration
calibrated
image
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911028907.8A
Other languages
Chinese (zh)
Inventor
程俊
胡颖
宋呈群
杨远源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The application is applicable to the technical field of computers, and provides a robot calibration method and a terminal. The method comprises the following steps: acquiring a first image shot by a robot to be calibrated; calculating the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image. In the above manner, the terminal acquires an image of the calibration device shot by the robot to be calibrated, and calculates the camera external parameters corresponding to the robot to be calibrated based on the corner point coordinates in the image, the target corner point coordinates and the camera internal parameters, so as to obtain the calibration data of the robot to be calibrated. The calibration method has low calibration cost, good flexibility, high efficiency and simple steps, and facilitates robot calibration.

Description

Robot calibration method and terminal
Technical Field
The application belongs to the technical field of computers, and particularly relates to a robot calibration method and a terminal.
Background
At present, with the increase of labor cost, factory production lines have an ever-growing need for automated production. In addition, some environments, such as high-temperature environments, polluted-air operation environments, dangerous operation environments and strong-radiation operation environments, require robots to complete the work. Therefore, it is important to calibrate the position of the robot.
However, in the existing robot calibration method, calibration data (which may be used to indicate the current position of the robot) is obtained by using a precision measuring instrument, such as a laser tracker, to irradiate calibration points. This calibration method suffers from high calibration cost, poor flexibility, low efficiency and complicated operation steps, and is not conducive to robot calibration.
Disclosure of Invention
In view of this, the embodiments of the present application provide a robot calibration method and a terminal, so as to solve the problems of the conventional robot calibration method: high calibration cost, poor flexibility, low efficiency and complicated operation steps, which are not conducive to robot calibration.
A first aspect of an embodiment of the present application provides a robot calibration method, including:
acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration equipment, which correspond to the corner point coordinates in the first calibration image;
and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
Further, in order to calculate each camera external parameter accurately and quickly, the calculating of the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image may include:
constructing a calibration equation set corresponding to each robot to be calibrated based on the corner point coordinates in each first calibration image, each first target corner point coordinates and camera internal parameters corresponding to each first image;
and solving each calibration equation set to obtain the camera external parameter corresponding to each robot to be calibrated.
Further, in order to facilitate the robots in the factory to cooperate with each other and determine the positions of the robots, the method further includes:
adjusting calibration data of a second robot to be calibrated based on the calibration data of the first robot to be calibrated to obtain first target data; the first robot to be calibrated is any robot to be calibrated, which can shoot a calibration plate on the first calibration equipment; the second robot to be calibrated is any robot to be calibrated, which can shoot a calibration plate on the first calibration equipment except the first robot to be calibrated; the first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated.
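The adjustment above amounts to composing the two camera extrinsic transforms. A minimal sketch (not the patent's implementation), assuming each extrinsic is expressed as a 4x4 homogeneous matrix that maps calibration-board coordinates into the respective camera's frame:

```python
import numpy as np

def first_target_data(T_first, T_second):
    """Pose of the second robot's camera expressed in the first robot's
    camera frame (hypothetical convention: each T maps board coordinates
    into that camera's frame)."""
    # cam2 -> board composed with board -> cam1 gives cam2's pose in cam1
    return T_first @ np.linalg.inv(T_second)
```

For example, with pure translations where the board sits at (1, 0, 0) in the first camera's frame and at (0, 2, 0) in the second camera's frame, the second camera is located at (1, -2, 0) relative to the first.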
Further, in order to guarantee that each robot is accurately calibrated, the method further comprises the following steps:
when the uncalibrated robot is detected to exist, acquiring a second image shot based on the uncalibrated robot; the second image comprises a second calibration image corresponding to a calibration plate on second calibration equipment; the second calibration equipment is located at a position different from the position of the first calibration equipment;
calculating camera external parameters corresponding to each second image based on the corner point coordinates in each second calibration image, second target corner point coordinates and camera internal parameters corresponding to each second image, wherein the second target corner point coordinates are coordinates corresponding to the corner point coordinates in the second calibration image in a calibration plate on the second calibration equipment;
and determining calibration data of each uncalibrated robot based on the camera external parameters corresponding to each second image.
Further, in order to facilitate the robots in the factory to cooperate with each other and determine the positions of the robots, the method further includes:
adjusting the calibration data of the uncalibrated robot based on the calibration data of the third robot to be calibrated to obtain second target data; the third robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration device and can shoot the calibration plate on the second calibration device; the second target data is used for representing the position of the uncalibrated robot relative to the third robot to be calibrated.
Further, in order to accelerate calibration, calibrate multiple robots to be calibrated simultaneously, and facilitate calibration in a large scene, when the terminal detects that the number of robots to be calibrated is greater than or equal to a preset threshold value, the acquiring of the first image shot by the robot to be calibrated includes: dividing the robots to be calibrated into at least two groups, and acquiring the images to be analyzed sent by each group of robots to be calibrated; each group of robots corresponds to one first calibration device, and different groups of robots correspond to different first calibration devices;
grouping the images to be analyzed according to the grouping identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to a group identification;
the determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image comprises: and determining calibration data of each robot to be calibrated based on camera external parameters corresponding to each first image and the grouping identification of the robot to be calibrated to which each first image belongs.
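The grouping step above can be sketched as a simple bucketing of incoming images by the group identifier each image carries. The tuple layout (group_id, robot_id, image) is an assumption for illustration, not the patent's data format:

```python
from collections import defaultdict

def group_images_to_analyze(images):
    """Bucket the images to be analyzed by their group identifier.

    `images`: iterable of (group_id, robot_id, image) tuples; this
    layout is hypothetical, for illustration only.
    """
    first_images = defaultdict(list)
    for group_id, robot_id, image in images:
        first_images[group_id].append((robot_id, image))
    # each bucket holds the first images of one group of robots
    return dict(first_images)
```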
Further, in order to facilitate the robots in the factory to cooperate with each other and determine the positions of the robots, the method further includes:
determining a target robot from the grouped robots to be calibrated; the target robot belongs to all grouped robots to be calibrated at the same time;
based on the grouping identification, acquiring calibration data corresponding to each group of the target robot;
based on calibration data corresponding to each group of the target robot, adjusting the calibration data of the robot to be calibrated corresponding to each group identifier to obtain third target data; the third target data is used for representing the position of each robot to be calibrated in each group relative to the target robot in each group.
A second aspect of an embodiment of the present invention provides a terminal, including:
the robot calibration system comprises an acquisition unit, a calibration unit and a calibration unit, wherein the acquisition unit is used for acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
the calculation unit is used for calculating the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration device corresponding to the corner point coordinates in the first calibration image.
And the determining unit is used for determining the calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
Further, the computing unit is specifically configured to:
constructing a calibration equation set corresponding to each robot to be calibrated based on the corner point coordinates in each first calibration image, each first target corner point coordinates and camera internal parameters corresponding to each first image;
and solving each calibration equation set to obtain the camera external parameter corresponding to each robot to be calibrated.
Further, the terminal further includes:
the first adjusting unit is used for adjusting calibration data of a second robot to be calibrated based on the calibration data of the first robot to be calibrated to obtain first target data; the first robot to be calibrated is any robot to be calibrated, which can shoot a calibration plate on the first calibration equipment; the second robot to be calibrated is any robot to be calibrated, which can shoot a calibration plate on the first calibration equipment except the first robot to be calibrated; the first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated.
Further, the terminal further includes:
an image acquisition unit configured to acquire a second image taken based on an uncalibrated robot when it is detected that the uncalibrated robot remains; the second image comprises a second calibration image corresponding to a calibration plate on second calibration equipment; the second calibration equipment is located at a position different from the position of the first calibration equipment;
the camera external reference calculation unit is used for calculating the camera external reference corresponding to each second image based on the corner point coordinates in each second calibration image, the second target corner point coordinates and the camera internal reference corresponding to each second image, wherein the second target corner point coordinates are coordinates corresponding to the corner point coordinates in the second calibration image in a calibration plate on the second calibration equipment;
and the calibration data determining unit is used for determining the calibration data of each uncalibrated robot based on the camera external parameters corresponding to each second image.
The uncalibrated robots at least comprise a third robot to be calibrated;
further, the terminal further includes:
the second adjusting unit is used for adjusting the calibration data of the uncalibrated robot based on the calibration data of the third robot to be calibrated to obtain second target data; the third robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration device and can shoot the calibration plate on the second calibration device; the second target data is used for representing the position of the uncalibrated robot relative to the third robot to be calibrated.
The number of the robots to be calibrated is greater than or equal to a preset threshold value;
further, the obtaining unit is specifically configured to:
dividing the robots to be calibrated into at least two groups, and acquiring images to be analyzed sent by each group of robots to be calibrated; each group of robots corresponds to one first calibration device, and different groups of robots correspond to different first calibration devices;
grouping the images to be analyzed according to the grouping identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to a group identification.
Further, the determining unit is specifically configured to: and determining calibration data of each robot to be calibrated based on camera external parameters corresponding to each first image and the grouping identification of the robot to be calibrated to which each first image belongs.
Further, the terminal further includes:
the target robot determining unit is used for determining a target robot from the grouped robots to be calibrated; the target robot belongs to all grouped robots to be calibrated at the same time;
the data acquisition unit is used for acquiring calibration data corresponding to each group of the target robot based on the group identification;
the third adjusting unit is used for adjusting the calibration data of the robot to be calibrated corresponding to each group identifier based on the calibration data corresponding to each group of the target robot to obtain third target data; the third target data is used for representing the position of each robot to be calibrated in each group relative to the target robot in each group.
A third aspect of an embodiment of the present invention provides another terminal, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, where the memory is used to store a computer program that supports the terminal to execute the above method, where the computer program includes program instructions, and the processor is configured to call the program instructions and execute the following steps:
acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration equipment, which correspond to the corner point coordinates in the first calibration image;
and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of:
acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration equipment, which correspond to the corner point coordinates in the first calibration image;
and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
The robot calibration method and the terminal provided by the embodiment of the application have the following beneficial effects:
according to the embodiments of the present application, a first image shot by a robot to be calibrated is obtained; the camera external parameters corresponding to each first image are calculated based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; and the calibration data of each robot to be calibrated is determined based on the camera external parameters corresponding to each first image. In the above manner, the terminal acquires an image of the calibration device shot by the robot to be calibrated, and calculates the camera external parameters corresponding to the robot to be calibrated based on the corner point coordinates in the image, the target corner point coordinates and the camera internal parameters, so as to obtain the calibration data of the robot to be calibrated. The calibration method can be realized with only the calibration device and a terminal, requires no precision measuring instrument, and has low calibration cost; the calibration device can be moved according to the positions of the robots to be calibrated, giving high flexibility; the method can calibrate a plurality of robots to be calibrated simultaneously, which improves robot calibration efficiency; and the whole calibration process is mainly completed by the robots to be calibrated, the terminal and the calibration device, with simple operation steps, which facilitates robot calibration.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a flowchart illustrating an implementation of a robot calibration method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a calibration apparatus provided herein;
fig. 3 is a flowchart illustrating an implementation of a robot calibration method according to another embodiment of the present disclosure;
FIG. 4 is a flowchart illustrating an implementation of a robot calibration method according to another embodiment of the present disclosure;
fig. 5 is a flowchart illustrating an implementation of a robot calibration method according to still another embodiment of the present disclosure;
fig. 6 is a schematic diagram of a terminal according to an embodiment of the present application;
fig. 7 is a schematic diagram of a terminal according to another embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Referring to fig. 1, fig. 1 is a schematic flow chart of a robot calibration method according to an embodiment of the present invention. In this embodiment, the execution subject of the robot calibration method is a terminal, which includes but is not limited to a computer, a smart phone, a desktop computer, a tablet computer, a Personal Digital Assistant (PDA) and other terminals. The robot calibration method shown in fig. 1 may include:
s101: acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment.
The calibration device is used to assist the terminal in calibrating the robot to be calibrated. Fig. 2 shows a calibration device, which comprises a base, a conical tower-type calibration plate, and a support column. A plurality of wheels are arranged below the base and can drive the calibration device to move freely. The conical tower-type calibration plate is composed of a plurality of isosceles trapezoid surfaces, and each isosceles trapezoid surface is provided with a black-and-white chessboard grid. The support column connects the base and the conical tower-type calibration plate; it can be driven by a motor to rotate, thereby driving the conical tower-type calibration plate to rotate, and its length can be adjusted so that a user can set the conical tower-type calibration plate to a suitable height. The user may set a corresponding coordinate system for the calibration device in advance; as shown in fig. 2, a coordinate system (R_x, R_y, R_z) is established with the center of the base of the calibration device as the origin. This coordinate system determines the coordinates corresponding to the squares on each isosceles trapezoid surface. The number of isosceles trapezoid surfaces is freely set and adjusted by the user; for example, the conical tower calibration plate of one calibration device may be composed of 3 isosceles trapezoid surfaces, or of 5 isosceles trapezoid surfaces. The description is given for illustrative purposes only and is not intended to be limiting. It should be noted that the calibration plate of the calibration device mentioned in this embodiment is the conical tower-type calibration plate of the calibration device.
In this embodiment, the calibration devices corresponding to the robots to be calibrated are the same, that is, the robots to be calibrated all correspond to the first calibration device. The user can place the first calibration device at a first preset position in advance, and the position can enable a plurality of robots to be calibrated to shoot the first calibration device. Further, the height of the first calibration device can be adjusted by a user, so that the effect of the first image obtained by shooting the first calibration device by each robot to be calibrated is optimal.
The terminal obtains a first image shot by the robot to be calibrated. Specifically, a camera is mounted on the robot to be calibrated, and the robot to be calibrated shoots calibration equipment corresponding to the robot to be calibrated through the camera to obtain a first image; the first image comprises a first calibration image corresponding to a calibration plate on the calibration equipment. The robot to be calibrated sends the shot first image and the robot identification information to a terminal, and the terminal receives the first image and the robot identification information sent by the robot to be calibrated; and the terminal extracts a first calibration image corresponding to the calibration plate on the calibration equipment from the first image and stores the first calibration image.
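The receive-extract-store flow of S101 can be sketched as follows. The class and method names are hypothetical; the board-region extractor is passed in as a callable (in practice it could be a chessboard-corner detector such as OpenCV's findChessboardCorners):

```python
class CalibrationTerminal:
    """Minimal sketch of the terminal side of S101 (names hypothetical)."""

    def __init__(self, extract_board_region):
        # callable that crops the first calibration image (the board
        # region) out of a first image; implementation not specified here
        self.extract_board_region = extract_board_region
        self.first_calibration_images = {}

    def receive_first_image(self, robot_id, first_image):
        # store the extracted calibration image keyed by the robot's
        # identification information
        calib = self.extract_board_region(first_image)
        self.first_calibration_images[robot_id] = calib
        return calib
```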
S102: calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration device corresponding to the corner point coordinates in the first calibration image.
The terminal calculates the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image. The camera internal parameters are those of the camera on the robot to be calibrated that shot the first image; they are provided with the camera when it leaves the factory. The first target corner coordinates are the coordinates in the calibration plate on the first calibration device that correspond to the corner coordinates in the first calibration image; that is, for each corner coordinate in the first calibration image, there is a corresponding coordinate in the calibration plate of the first calibration device. The camera external parameter corresponds to the first image, and can also be understood as corresponding to the camera on the robot to be calibrated that shot the first image.
Specifically, the terminal acquires the camera internal parameters corresponding to the first image, namely the camera internal parameters of the camera on the robot to be calibrated that was used to shoot the first image. The terminal acquires the corner coordinates in the first calibration image corresponding to the robot to be calibrated, and acquires the first target corner coordinates corresponding to each corner coordinate. According to the corner coordinates, the first target corner coordinates and the camera internal parameters corresponding to the robot to be calibrated, the terminal constructs an equation set corresponding to that robot, and solves the equation set to obtain the camera external parameters corresponding to the first image shot by that robot. The terminal constructs an equation set for each robot to be calibrated in the same way, and the camera external parameters corresponding to the first image shot by each robot to be calibrated can be obtained by solving each equation set.
Further, in order to accurately and quickly calculate each camera external parameter, S102 may include S1021-S1022, as follows:
s1021: and constructing a calibration equation set corresponding to each robot to be calibrated based on the corner point coordinates in each first calibration image, each first target corner point coordinates and the camera internal parameters corresponding to each first image.
The terminal acquires the corner coordinates corresponding to each robot to be calibrated and the first target corner coordinates corresponding to each corner coordinate. The number of corner coordinates corresponding to each robot to be calibrated is greater than or equal to a preset number set by the user; for example, the number of corner coordinates corresponding to one robot to be calibrated may be 6, 15, and so on. Accordingly, the number of first target corner coordinates corresponds to the number of corner coordinates.
The terminal obtains camera internal parameters corresponding to the first image based on the robot identification information, namely, the camera internal parameters corresponding to the camera on the robot to be calibrated identified by the robot identification information are obtained. For example, the terminal sends camera internal reference acquisition information to the robot to be calibrated based on the robot identification information, and when the robot to be calibrated receives the camera internal reference acquisition information, the camera internal reference corresponding to the camera is acquired, and the camera internal reference is sent to the terminal.
The terminal can construct one equation for the robot to be calibrated from one corner coordinate corresponding to that robot, the first target corner coordinate corresponding to that corner coordinate, and the camera internal parameters corresponding to that robot. In the same way, an equation is established for each acquired corner coordinate, and the resulting equations are combined to generate the calibration equation set corresponding to the robot to be calibrated.
For example, the coordinate of one corner point in the first calibration image corresponding to the robot A to be calibrated is (u1, v1, 1), the first target corner coordinate corresponding to that corner coordinate is (Xi, Yi, Zi, 1), and the camera internal parameter matrix corresponding to the robot A to be calibrated is

    K = | fx  0   u0 |
        | 0   fy  v0 |
        | 0   0   1  |

An equation corresponding to the robot A to be calibrated is constructed and obtained as follows:

    s · (u1, v1, 1)^T = K · [R | t] · (Xi, Yi, Zi, 1)^T

wherein [R | t] is the camera external parameter corresponding to the first image of the robot A to be calibrated, s is a scale factor, and (u0, v0) is the coordinate of the principal point, namely the projection of the optical axis, in the image coordinate system. The terminal constructs a plurality of such equations for the robot A to be calibrated to form the calibration equation set corresponding to the robot A to be calibrated.
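The evaluation of one such projection equation can be sketched in Python. The intrinsic values (focal lengths of 800 pixels, principal point (320, 240)) and the plate corner are illustrative assumptions, not values from the patent:

```python
import numpy as np

# Hypothetical intrinsics K for robot A's camera; fx, fy are focal lengths
# in pixels and (u0, v0) is the principal point. Values are illustrative.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, R, t, Xw):
    """Project a 3-D calibration-plate corner Xw into pixel coordinates
    via the pinhole model  s*(u, v, 1)^T = K [R | t] (X, Y, Z, 1)^T."""
    p = K @ (R @ np.asarray(Xw, dtype=float) + t)  # homogeneous image point
    return p[:2] / p[2]                            # divide out the scale s

# Identity extrinsics and a plate corner 2 m in front of the camera:
R = np.eye(3)
t = np.array([0.0, 0.0, 0.0])
u, v = project(K, R, t, [0.5, 0.25, 2.0])  # u = 520.0, v = 340.0
```

Each observed corner contributes one such equation in the unknown [R | t], which is what the calibration equation set stacks together.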
S1022: and solving each calibration equation set to obtain the camera external parameter corresponding to each robot to be calibrated.
The terminal can solve the calibration equation set corresponding to each robot to be calibrated by a least square method to obtain the camera external parameter corresponding to each robot to be calibrated. That is, solving the calibration equation set corresponding to a given robot to be calibrated yields the camera external parameter corresponding to that robot. For example, the terminal solves the system of equations in S1021 to obtain the value of [R | t], which is the camera external parameter corresponding to the robot A to be calibrated, and may also be referred to as the camera external parameter of the first image corresponding to the robot A to be calibrated.
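The least-squares solution in S1022 can be sketched as follows, under the assumption (usual for checkerboard-style calibration plates) that the target corners are planar with Z = 0, so the stacked equations reduce to a homography whose least-squares solution is given by a singular-value decomposition. All numeric values are illustrative; this is a sketch, not the patent's exact procedure:

```python
import numpy as np

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])

def solve_extrinsics(K, plate_pts, img_pts):
    """plate_pts: N (X, Y) corner coords on the plate (Z = 0);
    img_pts: N (u, v) pixel coords. Returns (R, t) solving the
    stacked projection equations in the least-squares sense."""
    A = []
    for (X, Y), (u, v) in zip(plate_pts, img_pts):
        A.append([X, Y, 1, 0, 0, 0, -u*X, -u*Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v*X, -v*Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)          # least-squares homography
    B = np.linalg.inv(K) @ H          # B = lambda * [r1 r2 t]
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if B[2, 2] < 0:                   # fix sign so the plate is in front
        lam = -lam
    r1, r2, t = lam*B[:, 0], lam*B[:, 1], lam*B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t

# Synthetic check: project a 3x3 grid with known extrinsics, then recover them.
R_true, t_true = np.eye(3), np.array([0.1, -0.2, 2.0])
plate = [(x, y) for x in (0.0, 0.1, 0.2) for y in (0.0, 0.1, 0.2)]
img = []
for X, Y in plate:
    p = K @ (R_true @ np.array([X, Y, 0.0]) + t_true)
    img.append((p[0]/p[2], p[1]/p[2]))
R_est, t_est = solve_extrinsics(K, plate, img)
```

With noisy corners the same SVD still returns the minimiser of the stacked residuals, which is the sense in which the equation set is solved "by a least square method".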
S103: and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
The terminal determines the calibration data of each robot to be calibrated based on each camera external parameter, and the calibration data can represent the position of the robot to be calibrated. The camera external parameter obtained by solving the equation set corresponding to a given robot to be calibrated corresponds to that robot. Specifically, the camera external parameter calculated by the terminal is the one corresponding to the camera on the robot to be calibrated that shot the first image, which is equivalent to calibration data of the robot to be calibrated. The terminal may directly use the camera external parameter as the calibration data of the robot to be calibrated, or may convert the camera external parameter into coordinates and use the converted coordinates as the calibration data.
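One hedged way to perform the conversion of the camera external parameter into coordinates is to take the camera centre C = -R^T t, which follows from x_cam = R x_world + t. The extrinsic values below are illustrative:

```python
import numpy as np

def camera_position(R, t):
    """Camera centre in the calibration plate's coordinate system:
    solve R C + t = 0  =>  C = -R^T t."""
    return -R.T @ np.asarray(t, dtype=float)

# Illustrative extrinsic: camera axis aligned with the plate normal,
# plate origin 2 m in front of the camera.
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
C = camera_position(R, t)  # [0, 0, -2]: camera at Z = -2 in plate coordinates
```

Position-style calibration data of this form is what makes the later relative-position adjustments (vector subtraction between robots) meaningful.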
According to the embodiment of the application, a first image shot by a robot to be calibrated is obtained; calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image. In the above manner, the terminal acquires an image obtained by the calibration device shot by the robot to be calibrated, and calculates and obtains the camera external reference corresponding to the robot to be calibrated based on the corner point coordinates, the target corner point coordinates and the camera internal reference in the image, so as to obtain the calibration data of the robot to be calibrated. The calibration method can be realized only by calibrating equipment and a terminal, does not need a precise measuring instrument, and has low calibration cost; the calibration equipment can move according to the position of the robot to be calibrated, and the flexibility is very high; the calibration method can calibrate a plurality of robots to be calibrated simultaneously, so that the robot calibration efficiency is improved; the whole calibration process is mainly completed by the robot to be calibrated, the terminal and the calibration equipment, the operation steps are simple, and the robot calibration is facilitated.
Referring to fig. 3, fig. 3 is a schematic flow chart of a robot calibration method according to another embodiment of the present invention. In this embodiment, the execution subject of the robot calibration method is a terminal; the terminal includes, but is not limited to, a mobile terminal such as a computer, a smart phone, a desktop computer, a tablet computer, or a Personal Digital Assistant (PDA).
The difference between this embodiment and the embodiment corresponding to fig. 1 lies in S204; S201 to S203 in this embodiment are identical to S101 to S103 in the embodiment corresponding to fig. 1, so reference is made to the description of S101 to S103 there, which is not repeated herein. In order to facilitate cooperation between the robots and determination of their mutual positions, S204 may be further included after S203, specifically as follows:
S204: Adjusting calibration data of a second robot to be calibrated based on the calibration data of the first robot to be calibrated to obtain first target data; the first robot to be calibrated is any robot to be calibrated which can shoot the calibration plate on the first calibration device; the second robot to be calibrated is any robot to be calibrated, other than the first robot to be calibrated, which can shoot the calibration plate on the first calibration device; the first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated.
And the terminal adjusts the calibration data of the second robot to be calibrated based on the calibration data of the first robot to be calibrated to obtain first target data. The first robot to be calibrated is any robot to be calibrated, which can shoot a calibration plate on the first calibration equipment. For example, M to-be-calibrated robots correspond to the same first calibration device, that is, the M to-be-calibrated robots can all shoot the calibration plate on the first calibration device, and any one of the M to-be-calibrated robots can all be used as the first to-be-calibrated robot. The second robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration equipment except the first robot to be calibrated. For example, if any one of the M robots to be calibrated is used as the first robot to be calibrated, any one of the M robots to be calibrated, except the first robot to be calibrated, may be used as the second robot to be calibrated. The first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated. For example, one of the M robots to be calibrated serves as a first robot to be calibrated, and the first target data represents the position of each of the remaining robots to be calibrated in the M robots to be calibrated relative to the first robot to be calibrated.
Specifically, the terminal takes the calibration data of the first robot to be calibrated as reference data, and each of the other, second robots to be calibrated adjusts its own calibration data according to the reference data; the adjusted data is recorded as first target data. For example, the vector included in the calibration data of the first robot to be calibrated and the vector included in the calibration data of the second robot to be calibrated are acquired, and a vector operation is performed on the two: the former vector is subtracted from the latter vector, and the resulting vector is used as the calibration data of the second robot to be calibrated, namely as the first target data. The first target data may represent the position of the second robot to be calibrated relative to the first robot to be calibrated.
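The vector subtraction described in S204 can be sketched in a few lines, treating each robot's calibration data as a position vector; the values are illustrative:

```python
def relative_calibration(reference, data):
    """Subtract the first robot's reference vector from the second
    robot's vector, component by component."""
    return [d - r for r, d in zip(reference, data)]

robot_a = [1.0, 2.0, 0.0]   # first robot to be calibrated (reference data)
robot_b = [4.0, 6.0, 0.0]   # a second robot to be calibrated
first_target = relative_calibration(robot_a, robot_b)  # [3.0, 4.0, 0.0]
```

The result expresses the second robot's position relative to the first robot rather than relative to the calibration plate.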
Further, since the first robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration device, different first robots to be calibrated can be set, calibration data of other second robots to be calibrated can be adjusted, different first target data can be obtained, and thus the position relationship between any two robots in the M robots to be calibrated can be obtained. Therefore, the robots in the factory can cooperate with each other more conveniently, the working efficiency is improved, and the user management is facilitated.
Referring to fig. 4, fig. 4 is a schematic flow chart of a robot calibration method according to another embodiment of the present invention. In this embodiment, the execution subject of the robot calibration method is a terminal; the terminal includes, but is not limited to, a mobile terminal such as a computer, a smart phone, a desktop computer, a tablet computer, or a Personal Digital Assistant (PDA).
The difference between the present embodiment and the embodiment corresponding to fig. 1 is S304-S307, where S301-S303 in the present embodiment are completely the same as S101-S103 in the embodiment corresponding to fig. 1, and please refer to the description related to S101-S103 in the embodiment corresponding to fig. 1, which is not repeated herein. In order to ensure that each robot is accurately calibrated, S304-S306 may be further included after S303, specifically as follows:
S304: When it is detected that an uncalibrated robot exists, acquiring a second image shot by the uncalibrated robot; the second image comprises a second calibration image corresponding to the calibration plate on a second calibration device; the second calibration device is located at a different position than the first calibration device.
The terminal detects whether there is currently any robot to be calibrated that has not been calibrated, that is, whether an uncalibrated robot exists. The terminal can look up the calibration data of each robot to be calibrated through its robot identification information, and when it detects that a certain robot has no corresponding calibration data, that robot is judged to be an uncalibrated robot.
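The detection step can be sketched as a simple lookup keyed by robot identification information; the identifiers and stored data are illustrative:

```python
# Calibration data the terminal has recorded so far, keyed by robot id.
calibration_store = {
    "robot-01": [0.0, 0.0, 0.0],
    "robot-02": [3.0, 4.0, 0.0],
}
all_robot_ids = ["robot-01", "robot-02", "robot-03", "robot-04"]

# Any robot whose id has no calibration entry is judged uncalibrated.
uncalibrated = [rid for rid in all_robot_ids if rid not in calibration_store]
# uncalibrated == ["robot-03", "robot-04"]
```

Each robot found this way is then asked for a second image of the second calibration device.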
And when the terminal detects that the uncalibrated robot exists, acquiring a second image shot by the uncalibrated robot. Specifically, a camera is mounted on the uncalibrated robot, and the uncalibrated robot shoots a second calibration device corresponding to the uncalibrated robot through the camera to obtain a second image; the second image comprises a second calibration image corresponding to the calibration plate on the second calibration device. The uncalibrated robot sends the shot second image and the robot identification information to the terminal, and the terminal receives the uncalibrated robot sent second image and the robot identification information; and the terminal extracts a second calibration image corresponding to the calibration plate on the second calibration device from the second image and stores the second calibration image. When the second calibration device and the first calibration device are two different calibration devices, the first calibration device and the second calibration device are respectively placed at a first preset position and a second preset position. The first preset position can enable a plurality of robots to be calibrated to shoot the first calibration equipment; the second preset position enables a plurality of uncalibrated robots to shoot the second calibration equipment. When the second calibration device and the first calibration device are the same calibration device, the calibration device is placed at a first preset position when the robot to be calibrated is calibrated for the first time, and after the calibration of the robot to be calibrated is completed, the calibration device is moved to a second preset position, so that the robot which is not calibrated can be conveniently calibrated.
S305: calculating camera external parameters corresponding to each second image based on the corner point coordinates in each second calibration image, second target corner point coordinates and camera internal parameters corresponding to each second image, wherein the second target corner point coordinates are coordinates corresponding to the corner point coordinates in the second calibration image in a calibration plate on the second calibration equipment;
and the terminal calculates the camera external parameter corresponding to each second image based on the corner coordinates in each second calibration image, the second target corner coordinates, and the camera internal parameters corresponding to each second image. The camera internal parameters are those of the camera on the uncalibrated robot that shot the second image, and are provided with the camera when it leaves the factory. The second target corner coordinates are the coordinates, on the calibration plate of the second calibration device, corresponding to the corner coordinates in the second calibration image; that is, each corner coordinate in the second calibration image has a corresponding coordinate on the calibration plate of the second calibration device. The camera external parameter corresponds to the second image, and can also be understood as corresponding to the camera on the uncalibrated robot that shot the second image.
Specifically, the terminal acquires the camera internal parameters corresponding to the second image, that is, the camera internal parameters corresponding to the camera on the uncalibrated robot. The terminal acquires the corner coordinates in the second calibration image corresponding to the uncalibrated robot and acquires the second target corner coordinate corresponding to each corner coordinate. An equation set corresponding to the uncalibrated robot is constructed according to the corner coordinates, the second target corner coordinates and the camera internal parameters corresponding to the uncalibrated robot, and the equation set is solved to obtain the camera external parameter corresponding to the second image shot by the uncalibrated robot. The terminal constructs an equation set for each uncalibrated robot in the same way, and the camera external parameter corresponding to the second image shot by each uncalibrated robot can be obtained by solving each equation set. The process of calculating the camera external parameter corresponding to each second image is similar to S102 in the embodiment corresponding to fig. 1, and reference may be made to the description of S102 there, which is not repeated herein.
S306: and determining calibration data of each uncalibrated robot based on the camera external parameters corresponding to each second image.
The terminal determines the calibration data of each uncalibrated robot based on the camera external parameter corresponding to each second image, and the calibration data can represent the position of the uncalibrated robot. The camera external parameter obtained by solving the equation set corresponding to a given uncalibrated robot corresponds to that robot. Specifically, the camera external parameter calculated by the terminal is the one corresponding to the camera on the uncalibrated robot that shot the second image, which is equivalent to obtaining the calibration data of the uncalibrated robot. The terminal may directly use the camera external parameter as the calibration data of the uncalibrated robot, or may convert the camera external parameter into coordinates and use the converted coordinates as the calibration data.
Furthermore, the uncalibrated robots include at least a third robot to be calibrated, where the third robot to be calibrated is any robot to be calibrated that can shoot both the calibration plate on the first calibration device and the calibration plate on the second calibration device.
In order to facilitate cooperation between the robots in the factory and determination of their mutual positions, S307 may be further included after S306, specifically as follows:
S307: Adjusting the calibration data of the uncalibrated robot based on the calibration data of a third robot to be calibrated to obtain second target data; the third robot to be calibrated is any robot to be calibrated which can shoot both the calibration plate on the first calibration device and the calibration plate on the second calibration device; the second target data is used for representing the position of the uncalibrated robot relative to the third robot to be calibrated.
The terminal adjusts the calibration data of the uncalibrated robot based on the calibration data of the third robot to be calibrated to obtain second target data. The third robot to be calibrated is any robot to be calibrated that can shoot both the calibration plate on the first calibration device and the calibration plate on the second calibration device. When the second calibration device and the first calibration device are two different calibration devices, the third robot to be calibrated is any robot to be calibrated that can shoot the calibration plate on the first calibration device at the first preset position and the calibration plate on the second calibration device at the second preset position. When the second calibration device is the same as the first calibration device, the third robot to be calibrated is any robot to be calibrated that can shoot the calibration plate when the device is first placed at the first preset position and can also shoot the calibration plate when the device is then moved to the second preset position. The first placement is used for calibrating the robots to be calibrated, and the second placement is used for calibrating the uncalibrated robots.
Specifically, the terminal takes the calibration data of the third robot to be calibrated as reference data, and the remaining uncalibrated robots adjust their own calibration data according to the reference data; the adjusted data is recorded as second target data. For example, the vector included in the calibration data of the third robot to be calibrated and the vector included in the calibration data of the uncalibrated robot are acquired, and a vector operation is performed on the two: the former vector is subtracted from the latter vector, and the resulting vector is used as the calibration data of the uncalibrated robot, namely as the second target data. The second target data may represent the position of the uncalibrated robot relative to the third robot to be calibrated.
Further, S304-S307 in this embodiment may also be executed after S204 in the embodiment corresponding to fig. 3, and the specific execution process is subject to the actual situation, which is not limited herein.
Referring to fig. 5, fig. 5 is a schematic flow chart of a robot calibration method according to still another embodiment of the present invention. In this embodiment, the execution subject of the robot calibration method is a terminal; the terminal includes, but is not limited to, a mobile terminal such as a computer, a smart phone, a desktop computer, a tablet computer, or a Personal Digital Assistant (PDA).
The differences between the present embodiment and the embodiment corresponding to fig. 1 are S401-S402 and S404-S407; S403 in the present embodiment is identical to S102 in the embodiment corresponding to fig. 1, so reference is made to the description of S102 there, which is not repeated herein.
In order to accelerate the calibration of the robots to be calibrated, realize simultaneous calibration of a plurality of robots to be calibrated, and facilitate calibration in a large-scale scene, when the terminal detects that the number of robots to be calibrated is greater than or equal to a preset threshold, the present embodiment may include S401-S402, specifically as follows:
S401: Dividing the robots to be calibrated into at least two groups, and acquiring the images to be analyzed sent by each group of robots to be calibrated; each group of robots corresponds to one first calibration device, and different groups of robots correspond to different first calibration devices.
The method comprises the steps that a terminal detects the number of robots to be calibrated, when the number of the robots to be calibrated is larger than or equal to a preset threshold value, the robots to be calibrated are divided into at least two groups, and images to be analyzed sent by each group of the robots to be calibrated are obtained. Wherein, the preset threshold is set and adjusted by the user; each group of robots to be calibrated corresponds to one first calibration device, and different groups of robots to be calibrated correspond to different first calibration devices. For example, when there are 10 robots to be calibrated, the robots are divided into two groups, each group includes 5 robots to be calibrated, each group of robots corresponds to a calibration device, and the calibration devices of the two groups are different.
The terminal divides the robots to be calibrated into at least two groups and obtains the images to be analyzed sent by each group of robots to be calibrated. Specifically, the terminal divides the robots to be calibrated into groups according to their positions relative to the first calibration devices; each robot to be calibrated in a group shoots the calibration plate on the first calibration device corresponding to that group to obtain the images to be analyzed corresponding to each group of robots to be calibrated, the robots to be calibrated send the images to be analyzed to the terminal, and the terminal obtains the images to be analyzed sent by each group.
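The grouping step can be sketched as follows; the preset threshold and the group size are illustrative assumptions (the patent leaves both to the user):

```python
# Illustrative values: split into fixed-size groups once the robot count
# reaches the preset threshold; one first calibration device per group.
PRESET_THRESHOLD = 8
GROUP_SIZE = 5

def group_robots(robot_ids, group_size=GROUP_SIZE):
    """Partition the robot ids into consecutive groups of group_size."""
    return [robot_ids[i:i + group_size]
            for i in range(0, len(robot_ids), group_size)]

robots = [f"robot-{i:02d}" for i in range(10)]
groups = group_robots(robots) if len(robots) >= PRESET_THRESHOLD else [robots]
# 10 robots -> two groups of 5, each assigned its own calibration device
```

In practice the partition would follow each robot's position relative to the calibration devices rather than its index, but the bookkeeping is the same.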
S402: grouping the images to be analyzed according to the grouping identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to a group identification.
The terminal groups the images to be analyzed according to the group identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to the group identification. The method comprises the steps that a terminal firstly groups images to be analyzed according to a group identification of a robot to be calibrated to obtain a first image corresponding to each group of robots to be calibrated; and then acquiring a first calibration image corresponding to each robot to be calibrated based on the first image.
S403: calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration device corresponding to the corner point coordinates in the first calibration image.
S403 in this embodiment is identical to S102 in the embodiment corresponding to fig. 1, and please refer to the description related to S102 in the embodiment corresponding to fig. 1, which is not repeated herein.
S404: and determining calibration data of each robot to be calibrated based on camera external parameters corresponding to each first image and the grouping identification of the robot to be calibrated to which each first image belongs.
The terminal determines the calibration data of each robot to be calibrated based on the camera external parameter and the grouping identifier corresponding to each robot to be calibrated, and the calibration data can represent the position of the robot to be calibrated. The camera external parameter obtained by solving the equation set corresponding to a given robot to be calibrated corresponds to that robot. Specifically, the camera external parameter calculated by the terminal is the one corresponding to the camera on the robot to be calibrated that shot the first image, which is equivalent to calibration data of the robot to be calibrated. The terminal may directly use the camera external parameter as the calibration data of the robot to be calibrated, or may convert the camera external parameter into coordinates and use the converted coordinates as the calibration data.
In order to facilitate cooperation between the robots in the factory and determination of their mutual positions, S405-S407 may be further included after S404, specifically as follows:
S405: Determining a target robot from the grouped robots to be calibrated; the target robot is a robot to be calibrated that belongs to all groups at the same time.
And the terminal determines a target robot from the grouped robots to be calibrated, wherein the target robot belongs to all the grouped robots to be calibrated at the same time, namely the target robot is the robot to be calibrated which is commonly owned in each group. For example, when the robots to be calibrated are divided into two groups, namely a first group of robots to be calibrated and a second group of robots to be calibrated, the target robot is in both the first group of robots to be calibrated and the second group of robots to be calibrated. And the terminal marks the robots to be calibrated which belong to all the groups at the same time as the target robots.
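Determining the target robot amounts to intersecting the group membership lists; the memberships below are illustrative:

```python
# Illustrative group memberships: the target robot must appear in every
# group, i.e. it can shoot every group's first calibration device.
group_members = [
    ["robot-01", "robot-02", "robot-05"],   # first group of robots
    ["robot-03", "robot-04", "robot-05"],   # second group of robots
]

common = set(group_members[0]).intersection(*group_members[1:])
target_robots = sorted(common)   # ["robot-05"]
```

If several robots belong to every group, any one of them can serve as the target robot.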
S406: and acquiring calibration data corresponding to each group of the target robot based on the group identification.
And the terminal acquires the calibration data corresponding to each group of the target robot according to the group identifier. For example, when the robots to be calibrated are divided into two groups, the robots to be calibrated are respectively a first group of robots to be calibrated and a second group of robots to be calibrated, calibration data corresponding to the target robot in the first group of robots to be calibrated is obtained according to the grouping identifier and can be recorded as first calibration data; and acquiring calibration data corresponding to the target robot in the second group of robots to be calibrated according to the grouping identification, and recording the calibration data as second calibration data.
S407: based on calibration data corresponding to each group of the target robot, adjusting the calibration data of the robot to be calibrated corresponding to each group identifier to obtain third target data; the third target data is used for representing the position of each robot to be calibrated in each group relative to the target robot in each group.
And the terminal adjusts the calibration data of the robot to be calibrated corresponding to each group identifier based on the calibration data corresponding to each group of the target robot to obtain third target data. The third target data may be used to represent the position of each robot to be calibrated within the respective group relative to the target robots in the respective group.
For example, when the robots to be calibrated are divided into two groups, the robots to be calibrated are respectively a first group of robots to be calibrated and a second group of robots to be calibrated, calibration data corresponding to the robots to be calibrated of the target robot in the first group are obtained according to the grouping identification and are recorded as first calibration data; and acquiring calibration data corresponding to the target robot in the second group of robots to be calibrated according to the grouping identification, and recording the calibration data as second calibration data. The terminal takes first calibration data corresponding to the target robot in the first group of robots to be calibrated as reference data of the first group of robots to be calibrated, other robots in the first group of robots to be calibrated adjust the calibration data of the other robots according to the reference data, the adjusted data are recorded as third target data, and the third target data can be used for representing the positions of the other robots to be calibrated in the first group of robots to be calibrated relative to the target robots in the group. And the terminal takes the second calibration data corresponding to the target robot in the second group of the robots to be calibrated as the reference data of the second group of the robots to be calibrated, other robots in the second group of the robots to be calibrated adjust the calibration data of the other robots according to the reference data, the adjusted data is recorded as third target data, and the third target data can be used for representing the positions of the other robots to be calibrated in the second group of the robots to be calibrated relative to the target robots in the group.
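The per-group adjustment described above can be sketched as follows; the group data and identifiers are illustrative:

```python
def adjust_group(group_data, target_id):
    """Subtract the target robot's calibration vector from every member's
    vector, so the group is expressed relative to the shared target robot."""
    ref = group_data[target_id]
    return {rid: [d - r for d, r in zip(vec, ref)]
            for rid, vec in group_data.items()}

# Illustrative calibration data; "robot-05" is the target robot in both groups.
group1 = {"robot-01": [1.0, 0.0], "robot-05": [2.0, 2.0]}
group2 = {"robot-03": [5.0, 1.0], "robot-05": [4.0, 3.0]}

third_target = {gid: adjust_group(g, "robot-05")
                for gid, g in (("group1", group1), ("group2", group2))}
# robot-01 relative to the target robot: [-1.0, -2.0]
```

Because the same target robot anchors every group, the adjusted data of all groups share one reference, which is what lets robots from different groups locate each other.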
According to the embodiments of the present application, the terminal acquires the first images shot by the robots to be calibrated; calculates the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates, and the camera internal parameters corresponding to each first image; and determines the calibration data of each robot to be calibrated based on those camera external parameters. In this manner, the terminal acquires the images of the calibration device shot by the robots to be calibrated and computes each robot's camera external parameters from the corner point coordinates in the image, the target corner point coordinates, and the camera internal parameters, thereby obtaining the calibration data of each robot. This calibration method requires only the calibration equipment and the terminal, needs no precision measuring instrument, and therefore has a low calibration cost. The calibration equipment can be moved according to the positions of the robots to be calibrated, giving high flexibility; multiple robots to be calibrated can be calibrated simultaneously, improving calibration efficiency; and since the whole calibration process is completed mainly by the robots to be calibrated, the terminal, and the calibration equipment, the operation steps are simple, which facilitates robot calibration.
Referring to fig. 6, fig. 6 is a schematic view of a terminal for robot calibration according to an embodiment of the present disclosure. The terminal includes units for executing the steps in the embodiments corresponding to fig. 1, fig. 3, fig. 4, and fig. 5; please refer to those embodiments for details. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 6, the terminal includes:
an obtaining unit 510, configured to obtain a first image captured by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
a calculating unit 520, configured to calculate a camera external parameter corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates, and the camera internal parameter corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration device corresponding to the corner point coordinates in the first calibration image.
A determining unit 530, configured to determine calibration data of each robot to be calibrated based on the camera external parameter corresponding to each first image.
Further, the calculating unit 520 is specifically configured to:
constructing a calibration equation set corresponding to each robot to be calibrated based on the corner point coordinates in each first calibration image, each first target corner point coordinates and camera internal parameters corresponding to each first image;
and solving each calibration equation set to obtain the camera external parameter corresponding to each robot to be calibrated.
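The patent does not spell out the form of the calibration equation set. One common concrete realization for a planar calibration plate is the Zhang-style homography decomposition sketched below; this is an illustrative assumption, not the patent's stated method, and the function name and arguments are invented for the example:

```python
import numpy as np

def extrinsics_from_planar_corners(obj_pts, img_pts, K):
    """Recover camera external parameters (R, t) from one view of a planar
    calibration board, given the camera internal parameter matrix K.

    obj_pts: (N, 2) target corner coordinates on the board (Z = 0 plane).
    img_pts: (N, 2) detected corner coordinates in the image, in pixels.

    Classic planar decomposition: estimate the board-to-image homography H
    with a DLT, then [r1 r2 t] = s * K^-1 * H and r3 = r1 x r2.
    """
    # Direct Linear Transform: each correspondence yields two linear equations.
    A = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)  # null-space vector, defined up to scale

    B = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(B[:, 0])  # r1 must be a unit vector
    if s * B[2, 2] < 0:                # enforce positive depth (t_z > 0)
        s = -s
    r1, r2, t = s * B[:, 0], s * B[:, 1], s * B[:, 2]
    r3 = np.cross(r1, r2)
    R = np.column_stack([r1, r2, r3])
    # With noisy corners R is only approximately a rotation; project it
    # back onto SO(3) via SVD.
    U, _, Vt2 = np.linalg.svd(R)
    return U @ Vt2, t
```

Solving one such system per first image yields the camera external parameters for the corresponding robot to be calibrated; in practice a nonlinear refinement step usually follows the closed-form solution.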
Further, the terminal further includes:
the first adjusting unit is used for adjusting the calibration data of a second robot to be calibrated based on the calibration data of a first robot to be calibrated to obtain first target data; the first robot to be calibrated is any robot to be calibrated that can shoot the calibration plate on the first calibration equipment; the second robot to be calibrated is any robot to be calibrated, other than the first robot to be calibrated, that can shoot the calibration plate on the first calibration equipment; the first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated.
Further, the terminal further includes:
an image acquisition unit, configured to acquire a second image taken by an uncalibrated robot when it is detected that an uncalibrated robot remains; the second image comprises a second calibration image corresponding to a calibration plate on second calibration equipment; the second calibration equipment is located at a position different from that of the first calibration equipment;
the camera external reference calculation unit is used for calculating the camera external reference corresponding to each second image based on the corner point coordinates in each second calibration image, the second target corner point coordinates and the camera internal reference corresponding to each second image, wherein the second target corner point coordinates are coordinates corresponding to the corner point coordinates in the second calibration image in a calibration plate on the second calibration equipment;
and the calibration data determining unit is used for determining the calibration data of each uncalibrated robot based on the camera external parameters corresponding to each second image.
The uncalibrated robots comprise at least a third robot to be calibrated;
further, the terminal further includes:
the second adjusting unit is used for adjusting the calibration data of the uncalibrated robot based on the calibration data of the third robot to be calibrated to obtain second target data; the third robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration device and can shoot the calibration plate on the second calibration device; the second target data is used for representing the position of the uncalibrated robot relative to the third robot to be calibrated.
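Because the third robot observes both calibration plates, its two extrinsics bridge the two board frames. A hypothetical sketch of this adjustment (the function and variable names are illustrative, not from the patent): an uncalibrated robot that sees only the second board can be expressed with respect to the first board by chaining through the bridging robot:

```python
import numpy as np

def extrinsic_wrt_first_board(T_robot_b2, T_bridge_b2, T_bridge_b1):
    """Chain transforms through the bridging ("third") robot.

    Each T is a 4x4 extrinsic mapping board coordinates to a camera frame:
      T_bridge_b1: board 1 -> bridge camera (the third robot)
      T_bridge_b2: board 2 -> bridge camera
      T_robot_b2:  board 2 -> the uncalibrated robot's camera
    The composition board1 -> bridge camera -> board2 -> robot camera
    yields the uncalibrated robot's extrinsic with respect to board 1.
    """
    return T_robot_b2 @ np.linalg.inv(T_bridge_b2) @ T_bridge_b1

def translation(x, y, z):
    """Helper: 4x4 homogeneous pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Toy check with pure translations.
T = extrinsic_wrt_first_board(
    T_robot_b2=translation(0, 0, 2),
    T_bridge_b2=translation(0, 1, 0),
    T_bridge_b1=translation(1, 0, 0),
)
print(T[:3, 3])  # translations compose additively: x=1, y=-1, z=2
```

The same chaining extends to further calibration devices: each new bridging robot adds one inverse-and-multiply step to the composition.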
The number of the robots to be calibrated is greater than or equal to a preset threshold value;
further, the obtaining unit 510 is specifically configured to:
dividing the robots to be calibrated into at least two groups, and acquiring images to be analyzed sent by each group of robots to be calibrated; each group of robots corresponds to one first calibration device, and different groups of robots correspond to different first calibration devices;
grouping the images to be analyzed according to the grouping identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to a group identification.
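As an illustrative sketch (the tuple layout and all names are assumptions, not from the patent), grouping the images to be analysed by the grouping identification is a simple bucketing step:

```python
from collections import defaultdict

def group_images(images):
    """Bucket images-to-analyse by the group identifier of the sending robot.

    `images` is an iterable of (group_id, robot_id, image) tuples; the field
    layout is illustrative. Returns a mapping group_id -> list of
    (robot_id, image), i.e. one batch of "first images" per first
    calibration device.
    """
    groups = defaultdict(list)
    for group_id, robot_id, image in images:
        groups[group_id].append((robot_id, image))
    return dict(groups)

batches = group_images([
    ("A", "r1", "img1"), ("B", "r3", "img3"), ("A", "r2", "img2"),
])
print(sorted(batches))   # group identifiers: ['A', 'B']
print(len(batches["A"])) # 2 first images in group A
```

Each resulting batch then feeds the per-group external-parameter calculation independently, which is what allows different groups to use different first calibration devices.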
Further, the determining unit 530 is specifically configured to: and determining calibration data of each robot to be calibrated based on camera external parameters corresponding to each first image and the grouping identification of the robot to be calibrated to which each first image belongs.
Further, the terminal further includes:
the target robot determining unit is used for determining a target robot from the grouped robots to be calibrated; the target robot belongs to all grouped robots to be calibrated at the same time;
the data acquisition unit is used for acquiring calibration data corresponding to each group of the target robot based on the group identification;
the third adjusting unit is used for adjusting the calibration data of the robot to be calibrated corresponding to each group identifier based on the calibration data corresponding to each group of the target robot to obtain third target data; the third target data is used for representing the position of each robot to be calibrated in each group relative to the target robot in each group.
Referring to fig. 7, fig. 7 is a schematic diagram of a terminal for robot calibration according to another embodiment of the present disclosure. As shown in fig. 7, the terminal 6 of this embodiment includes: a processor 60, a memory 61, and computer readable instructions 62 stored in the memory 61 and executable on the processor 60. The processor 60, when executing the computer readable instructions 62, implements the steps in the above-described embodiments of the robot calibration method, for example S101 to S103 shown in fig. 1. Alternatively, the processor 60, when executing the computer readable instructions 62, implements the functions of the units in the embodiments described above, such as the functions of the units 510 to 530 shown in fig. 6.
Illustratively, the computer readable instructions 62 may be divided into one or more units, which are stored in the memory 61 and executed by the processor 60 to complete the present application. The one or more units may be a series of computer readable instruction segments capable of performing particular functions, and the segments are used to describe the execution of the computer readable instructions 62 in the terminal 6. For example, the computer readable instructions 62 may be divided into an obtaining unit, a calculating unit, and a determining unit, with each unit functioning as described above.
The terminal may include, but is not limited to, the processor 60 and the memory 61. It will be appreciated by those skilled in the art that fig. 7 is only an example of the terminal 6 and does not constitute a limitation of the terminal 6: it may comprise more or fewer components than those shown, some components may be combined, or different components may be used; for example, the terminal may also comprise input/output devices, network access devices, buses, and the like.
The processor 60 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 61 may be an internal storage unit of the terminal 6, such as a hard disk or memory of the terminal 6. The memory 61 may also be an external storage device of the terminal 6, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the terminal 6. Further, the memory 61 may include both an internal storage unit and an external storage device of the terminal 6. The memory 61 is used for storing the computer readable instructions and the other programs and data required by the terminal, and may also be used to temporarily store data that has been output or is to be output.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, not for limiting them. Although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included within the scope of the present application.

Claims (10)

1. A robot calibration method is characterized by comprising the following steps:
acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
calculating camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration equipment, which correspond to the corner point coordinates in the first calibration image;
and determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
2. The robot calibration method according to claim 1, wherein said calculating the camera external parameters corresponding to each of the first images based on the corner point coordinates in each of the first calibration images, the first target corner point coordinates, and the camera internal parameters corresponding to each of the first images comprises:
constructing a calibration equation set corresponding to each robot to be calibrated based on the corner point coordinates in each first calibration image, each first target corner point coordinates and camera internal parameters corresponding to each first image;
and solving each calibration equation set to obtain the camera external parameter corresponding to each robot to be calibrated.
3. The robot calibration method according to claim 1, wherein after determining calibration data of each robot to be calibrated based on the camera external reference corresponding to each first image, the method further comprises:
adjusting calibration data of a second robot to be calibrated based on the calibration data of a first robot to be calibrated to obtain first target data; the first robot to be calibrated is any robot to be calibrated that can shoot the calibration plate on the first calibration equipment; the second robot to be calibrated is any robot to be calibrated, other than the first robot to be calibrated, that can shoot the calibration plate on the first calibration equipment; the first target data is used for representing the position of the second robot to be calibrated relative to the first robot to be calibrated.
4. The robot calibration method according to claim 1, wherein after determining calibration data of each robot to be calibrated based on the camera external reference corresponding to each first image, the method further comprises:
when it is detected that an uncalibrated robot exists, acquiring a second image shot by the uncalibrated robot; the second image comprises a second calibration image corresponding to a calibration plate on second calibration equipment; the second calibration equipment is located at a position different from that of the first calibration equipment;
calculating camera external parameters corresponding to each second image based on the corner point coordinates in each second calibration image, the second target corner point coordinates and the camera internal parameters corresponding to each second image; the second target corner point coordinates are coordinates corresponding to corner point coordinates in the second calibration image in a calibration plate on the second calibration device;
and determining calibration data of each uncalibrated robot based on the camera external parameters corresponding to each second image.
5. A robot calibration method according to claim 4, wherein the uncalibrated robots comprise at least one third robot to be calibrated; after determining calibration data of each uncalibrated robot based on the camera external reference corresponding to each second image, the method further includes:
adjusting the calibration data of the uncalibrated robot based on the calibration data of the third robot to be calibrated to obtain second target data; the third robot to be calibrated is any robot to be calibrated, which can shoot the calibration plate on the first calibration device and can shoot the calibration plate on the second calibration device; the second target data is used for representing the position of the uncalibrated robot relative to the third robot to be calibrated.
6. The robot calibration method according to claim 1, wherein the number of the robots to be calibrated is greater than or equal to a preset threshold, and the acquiring the first image taken by the robot to be calibrated comprises:
dividing the robots to be calibrated into at least two groups, and acquiring images to be analyzed sent by each group of robots to be calibrated; each group of robots corresponds to one first calibration device, and different groups of robots correspond to different first calibration devices;
grouping the images to be analyzed according to the grouping identification of the robots to be calibrated to obtain first images corresponding to each group of robots to be calibrated; wherein the first image corresponds to a group identification;
the determining calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image comprises:
and determining calibration data of each robot to be calibrated based on camera external parameters corresponding to each first image and the grouping identification of the robot to be calibrated to which each first image belongs.
7. The robot calibration method according to claim 6, wherein after determining the calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image, the method further comprises:
determining a target robot from the grouped robots to be calibrated; the target robot belongs to all grouped robots to be calibrated at the same time;
based on the grouping identification, acquiring calibration data corresponding to each group of the target robot;
based on calibration data corresponding to each group of the target robot, adjusting the calibration data of the robot to be calibrated corresponding to each group identifier to obtain third target data; the third target data is used for representing the position of each robot to be calibrated in each group relative to the target robot in each group.
8. A terminal, comprising:
the robot calibration system comprises an acquisition unit, a calibration unit and a calibration unit, wherein the acquisition unit is used for acquiring a first image shot by a robot to be calibrated; the first image comprises a first calibration image corresponding to a calibration plate on first calibration equipment;
the calculation unit is used for calculating the camera external parameters corresponding to each first image based on the corner point coordinates in each first calibration image, the first target corner point coordinates and the camera internal parameters corresponding to each first image; the first target corner point coordinates are coordinates in a calibration plate on the first calibration device corresponding to the corner point coordinates in the first calibration image.
And the determining unit is used for determining the calibration data of each robot to be calibrated based on the camera external parameters corresponding to each first image.
9. A terminal comprising a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, wherein the processor when executing the computer readable instructions implements the method of any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN201911028907.8A 2019-10-28 2019-10-28 Robot calibration method and terminal Pending CN110866956A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911028907.8A CN110866956A (en) 2019-10-28 2019-10-28 Robot calibration method and terminal

Publications (1)

Publication Number Publication Date
CN110866956A true CN110866956A (en) 2020-03-06

Family

ID=69654882

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911028907.8A Pending CN110866956A (en) 2019-10-28 2019-10-28 Robot calibration method and terminal

Country Status (1)

Country Link
CN (1) CN110866956A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102294695A (en) * 2010-06-25 2011-12-28 鸿富锦精密工业(深圳)有限公司 Robot calibration method and calibration system
CN103646403A (en) * 2013-12-26 2014-03-19 北京经纬恒润科技有限公司 Vehicle-mounted multi-camera calibration method, system and image processing device
CN105447877A (en) * 2015-12-13 2016-03-30 大巨龙立体科技有限公司 Parallel dual-camera stereo calibration method
CN107633536A (en) * 2017-08-09 2018-01-26 武汉科技大学 A kind of camera calibration method and system based on two-dimensional planar template
CN109523597A (en) * 2017-09-18 2019-03-26 百度在线网络技术(北京)有限公司 The scaling method and device of Camera extrinsic
CN109767474A (en) * 2018-12-31 2019-05-17 深圳积木易搭科技技术有限公司 A kind of more mesh camera calibration method, apparatus and storage medium

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
HUANG LI 等: "A humanoid robot localization method for", 《IEEE》 *
YINGHAO NING 等: "A Practical Calibration Method for Spinal Surgery Robot", 《IEEE》 *
邓勇军等: "基于OpenCV的移动焊接机器人视觉系统自主标定方法", 《焊接学报》 *
郑剑斌等: "视觉点胶机的摄像机标定技术", 《科技创新与应用》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111627073A (en) * 2020-04-30 2020-09-04 贝壳技术有限公司 Calibration method, calibration device and storage medium based on human-computer interaction
CN111627073B (en) * 2020-04-30 2023-10-24 贝壳技术有限公司 Calibration method, calibration device and storage medium based on man-machine interaction


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200306