CN109952176B - Robot calibration method and system, robot and storage medium

Info

Publication number
CN109952176B
Authority
CN
China
Prior art keywords
boundary
calibration
robot
moving
images
Legal status
Active
Application number
CN201780034033.XA
Other languages
Chinese (zh)
Other versions
CN109952176A (en)
Inventor
阳光
Current Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Original Assignee
Shenzhen A&E Intelligent Technology Institute Co Ltd
Application filed by Shenzhen A&E Intelligent Technology Institute Co Ltd filed Critical Shenzhen A&E Intelligent Technology Institute Co Ltd
Publication of CN109952176A publication Critical patent/CN109952176A/en
Application granted granted Critical
Publication of CN109952176B publication Critical patent/CN109952176B/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot calibration method, a robot calibration system, a robot and a storage medium. The calibration method comprises: controlling the front end of the robot to move according to a preset rule (S101); detecting the change of the gray value caused by the movement and recording the moving distance of the front end (S102); calculating the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system (S103); and calculating a conversion relation between the robot coordinate system and the visual coordinate system according to the coordinates of the initial position in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system (S104). The method improves the accuracy of the conversion relation between the robot coordinate system and the visual coordinate system.

Description

Robot calibration method and system, robot and storage medium
Technical Field
The invention relates to the technical field of robot coordinate calibration, and in particular to a robot calibration method, a robot calibration system, a robot and a storage medium.
Background
Industrial robots now play an increasingly important role in manufacturing all over the world. For an industrial robot to perform more complex tasks, it needs not only a better control system but also the ability to sense more of its environment. Among the sensing devices, robot vision is the most important because of the large amount and high completeness of the information it provides. Taking a robot that performs precision welding of electronic components on a circuit board as an example: during welding, the robot can use a camera in the vision system to locate a workpiece or working surface and calculate the position of the working scene relative to the robot, so as to assist the robot in completing the operation.
Using the vision system to calculate the transformation relationship between the vision coordinate system and the robot coordinate system has become an important topic in robot development, and obtaining a more accurate transformation relationship between the two coordinate systems is a prerequisite for the robot to complete high-precision operations. However, an existing robot is usually moved to the calibration point of the vision system by manual control so that the robot can calculate the transformation relationship between the vision coordinate system and the robot coordinate system using the vision system; this manual control usually introduces large human errors, resulting in large errors in the obtained transformation relationship.
Disclosure of Invention
The invention aims to provide a robot calibration method, a robot calibration system, a robot and a storage medium, which can improve the accuracy of the conversion relation between the robot coordinate system and the visual coordinate system.
In order to achieve the above object, the present invention provides a robot calibration method, including:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
Wherein the controlling of the front end of the robot to move from the initial position on the calibration plate according to a preset rule comprises:
moving the front end from the initial position in a y-direction of the vision system;
and moving the front end from the initial position in an x-direction of the vision system.
Wherein the acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting a change condition of gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording a moving distance of the front end comprises:
acquiring the plurality of images in the process of moving the front end from the initial position along the y direction of the visual system, and detecting the change situation of first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, recording a first moving distance of a front end, and obtaining a first position and a second position of the front end in the y direction before and after the change of the first gray value is caused, and a first distance between an initial position of the front end in the y direction and the first boundary;
acquiring the plurality of images in the process of moving the front end from the initial position along the x direction of the vision system, and detecting the change condition of second gray values on two sides of the second boundary from the plurality of images; recording a second moving distance of the front end when the gray values at the two sides of the second boundary are changed; and obtaining a third position and a fourth position of the front end in the x-direction before and after causing the second gray value to change, and a second distance between the initial position of the front end in the x-direction and the second boundary.
Wherein the controlling of the front end of the robot to move on the calibration plate according to the preset rule further comprises:
moving the front end from the initial position to the fourth position along the x direction of the vision system according to a second moving distance, and then moving the front end from the fourth position along the y direction of the vision system;
and moving the front end from the initial position to the second position along the y direction of the vision system according to the first moving distance, and then moving the front end from the second position along the x direction of the vision system.
Wherein, the obtaining of the plurality of images of the front end acquired by the vision system in the moving process, the detecting of the change of the gray values at both sides of the first boundary and the second boundary from the plurality of images, and the recording of the moving distance of the front end further comprise:
acquiring the plurality of images during the moving of the front end from the fourth position along the y direction of the visual system, and detecting the change situation of the first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, a fifth position and a sixth position of the front end in the y direction before and after the change of the first gray value is obtained;
acquiring the plurality of images during the process of moving the front end from the second position along the x direction of the vision system, and detecting the change situation of the second gray value at two sides of the second boundary from the plurality of images; and when the gray values on the two sides of the second boundary are changed, the seventh position and the eighth position of the front end in the x direction before and after the change of the second gray value is obtained.
Wherein, the calculating according to the change situation of the gray value and the moving distance to obtain the coordinate of the initial position of the front end in a visual coordinate system and the coordinate of the calibration point in a robot coordinate system includes:
calculating to obtain the coordinates of the initial position of the front end in the visual coordinate system according to the first distance and the second distance;
and obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position, obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, wherein the coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system.
Obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position specifically includes:
connecting a midpoint between the first position and the second position with a midpoint between the fifth position and the sixth position to obtain a first reference line;
obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, specifically:
and connecting a midpoint between the third position and the fourth position with a midpoint between the seventh position and the eighth position to obtain a second reference line.
Wherein the origin of the vision system coincides with the calibration point;
and defining the boundary line which is closest to the x axis of the vision system as the first boundary, and the boundary line which is closest to the y axis of the vision system as the second boundary.
The calibration plate is a checkerboard formed by alternately arranging black lattices and white lattices, and calibration points arranged on the calibration plate are intersections of the black lattices and the white lattices;
the first boundary and the second boundary determined according to the calibration point include:
and respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line.
The detecting of the change of the gray level values at both sides of the first boundary and the second boundary specifically includes:
and detecting the change situation of the gray value of a plurality of pixels at two sides of the first boundary and the second boundary.
In another aspect, the present invention provides a storage medium, wherein the storage medium stores program data, and the program data can be executed to implement the robot calibration method described above.
In another aspect, the present invention provides a robot calibration system, comprising: a vision device, a processor, and a memory; wherein the vision device and the memory are respectively connected to the processor;
the vision device is used for shooting the space operated by the front end of the robot and establishing a corresponding vision system;
the memory is used for storing preset movement rules of the front end of the robot and execution instructions of the processor;
the processor is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
Wherein the controlling the front end of the robot performed by the processor to move from an initial position on a calibration plate according to a preset rule comprises:
moving the front end from the initial position in a y-direction of the vision system;
and moving the front end from the initial position in an x-direction of the vision system.
Wherein the acquiring the plurality of images of the front end acquired by the vision system in the moving process, detecting the change of the gray values at both sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end comprises:
acquiring the plurality of images in the process of moving the front end from the initial position along the y direction of the visual system, and detecting the change situation of first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, recording a first moving distance of the front end, and obtaining a first position and a second position of the front end in the y direction before and after the change of the first gray value is caused, and a first distance between an initial position of the front end in the y direction and the first boundary;
acquiring the plurality of images in the process of moving the front end from the initial position along the x direction of the vision system, and detecting the change condition of second gray values on two sides of the second boundary from the plurality of images; recording a second moving distance of the front end when the gray values at the two sides of the second boundary are changed; and obtaining a third position and a fourth position of the front end in the x-direction before and after causing the second gray value to change, and a second distance between the initial position of the front end in the x-direction and the second boundary.
Wherein the controlling the front end of the robot performed by the processor to move from an initial position on a calibration plate according to a preset rule further comprises:
moving the front end from the initial position to the fourth position along the x direction of the vision system according to a second moving distance, and then moving the front end from the fourth position along the y direction of the vision system;
and moving the front end from the initial position to the second position along the y direction of the vision system according to the first moving distance, and then moving the front end from the second position along the x direction of the vision system.
Wherein the acquiring, by the processor, a plurality of images of the front end acquired by a vision system in the moving process, detecting a change of gray values at both sides of the first boundary and the second boundary from the plurality of images, and recording a moving distance of the front end further includes:
acquiring the plurality of images during the moving of the front end from the fourth position along the y direction of the visual system, and detecting the change situation of the first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, a fifth position and a sixth position of the front end in the y direction before and after the change of the first gray value is obtained;
acquiring the plurality of images during the process of moving the front end from the second position along the x direction of the vision system, and detecting the change situation of the second gray value at two sides of the second boundary from the plurality of images; and when the gray values on the two sides of the second boundary are changed, the seventh position and the eighth position of the front end in the x direction before and after the change of the second gray value is obtained.
Wherein the calculating, by the processor, according to the change of the gray value and the moving distance to obtain the coordinate of the initial position of the front end in a visual coordinate system and the coordinate of the calibration point in a robot coordinate system includes:
calculating the coordinates of the initial position of the front end in the visual coordinate system according to the first distance and the second distance;
and obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position, and obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, wherein the coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system.
Obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position specifically includes:
connecting a midpoint between the first position and the second position with a midpoint between the fifth position and the sixth position to obtain a first reference line;
obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, specifically:
and connecting a midpoint between the third position and the fourth position with a midpoint between the seventh position and the eighth position to obtain a second reference line.
Wherein the origin of the vision system coincides with the calibration point;
and defining the boundary line which is closest to the x axis of the vision system as the first boundary, and the boundary line which is closest to the y axis of the vision system as the second boundary.
The calibration plate is a checkerboard formed by alternately arranging black lattices and white lattices, and calibration points arranged on the calibration plate are intersections of the black lattices and the white lattices;
the first boundary and the second boundary determined according to the calibration point include:
and respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line.
In another aspect, the present invention provides a robot, which includes a robot body and a robot calibration system,
the robot body comprising a front end for gripping an object;
the robot calibration system includes:
a vision device, a processor, and a memory; wherein the vision device and the memory are respectively connected with the processor;
the vision device is used for shooting the space operated by the front end of the robot and establishing a corresponding vision system;
the memory is used for storing preset movement rules of the front end of the robot and execution instructions of the processor;
the processor is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
Wherein the origin of the vision system coincides with the calibration point;
and defining the boundary line which is closest to the x axis of the vision system as the first boundary, and the boundary line which is closest to the y axis of the vision system as the second boundary.
Beneficial effects: different from the prior art, the invention controls the front end of the robot to move according to a preset rule, detects the change of the gray value caused by the movement, and records the moving distance of the front end; calculates the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system; and calculates the conversion relation between the robot coordinate system and the visual coordinate system according to the coordinates of the initial position in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system. With this method, the coordinates of the initial position of the front end in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system can be calculated. Since the coordinates of the initial position in the robot coordinate system and the coordinates of the calibration point in the visual coordinate system are known, the conversion relation between the robot coordinate system and the visual coordinate system can be calculated from the coordinates of the initial position in the two coordinate systems together with the coordinates of the calibration point in the two coordinate systems, and the accuracy of the conversion relation between the robot coordinate system and the visual coordinate system is thereby improved.
Drawings
FIG. 1 is a schematic flow chart of a first embodiment of a robot calibration method according to the present invention;
FIG. 2 is a schematic flow chart of step S101 in the first embodiment of the method shown in FIG. 1;
FIG. 3 is a schematic flow chart of step S102 in the first embodiment of the method shown in FIG. 1;
FIG. 4 is a schematic flow chart of further steps of step S101 in the first embodiment of the method shown in FIG. 1;
FIG. 5 is a schematic flow chart of further steps of step S102 in the first embodiment of the method shown in FIG. 1;
FIG. 6 is a schematic flow chart of step S103 in the first embodiment of the method shown in FIG. 1;
FIGS. 7a to 7h are schematic diagrams of the first embodiment of the robot calibration method of the present invention;
FIG. 8 is a schematic structural diagram of an embodiment of a storage medium of the present invention;
FIG. 9 is a functional block diagram of a first embodiment of a robot calibration system of the present invention;
fig. 10 is a schematic structural diagram of an embodiment of the robot of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the following describes a robot calibration method, a system, a robot and a storage medium provided by the present invention in detail with reference to the accompanying drawings and the detailed description. It is obvious that the described embodiments are only a part of embodiments of a robot calibration method, system, robot and storage medium of the present invention, and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without any creative effort belong to the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a first embodiment of the robot calibration method of the present invention, which includes the following steps:
s101, the front end of the robot is controlled to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary which are determined according to the calibration point are arranged on the calibration plate.
Before the robot is calibrated, a calibration plate is arranged, and calibration points for calibrating the robot are arranged on the calibration plate. The calibration plate is arranged within the visual range of the vision system, so that the calibration plate is photographed while the robot moves to obtain corresponding images for subsequent analysis and calculation. It can be understood that the robot in this embodiment is an industrial robot, calibrating the robot means calibrating the front end of the robot, and moving the robot means moving the front end of the robot.
In this embodiment, a checkerboard in which black lattices and white lattices are alternately arranged is taken as an example of the calibration board. The calibration point in the vision system is an intersection point of the black and white lattices, and the first boundary and the second boundary are edge lines of the black and white lattices passing through the calibration point; the gray values on the two sides of the first boundary and of the second boundary therefore differ greatly, one side having the gray value of a black lattice and the other side having the gray value of a white lattice. Meanwhile, since the coordinate system of the vision system is known and its x axis and y axis are determined, the origin of the vision system is made to coincide with the calibration point; of the edge lines of the black and white lattices, the one closest to the x axis of the vision system is defined as the first boundary and the one closest to the y axis of the vision system is defined as the second boundary.
In this embodiment, this step brings the front end of the robot into contact with the calibration plate and then sets the movement rule of the front end; the front end of the robot is moved from its initial placement position according to this rule. Further, the movement rule of the front end means that the robot moves step by step by its minimum moving step.
It will be appreciated that at this point the coordinates of the initial position of the robot front end on the calibration plate in the robot coordinate system are known, as well as the coordinates of the calibration point in the visual coordinate system of the vision system.
S102, acquiring a plurality of images of the front end acquired by a vision system in the moving process of the front end of the robot in the step S101, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end.
During the movement of the front end of the robot, images of the calibration plate are acquired by the vision system. Since the robot moves step by step by its minimum moving step, one image of the calibration plate can be acquired each time the front end moves; the number of moves of the front end is recorded, and the moving distance of the front end can then be calculated from the recorded number of moves and the distance of each move.
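This bookkeeping can be organized as a simple move-and-capture loop. The sketch below is illustrative only: move_min_step, capture_image, boundary_changed and min_step are hypothetical stand-ins for the robot and vision interfaces and are not named in the patent; the stopping predicate is passed in so the same loop can be reused for the first and the second boundary.

```python
def move_and_capture(axis, boundary_changed, move_min_step, capture_image, min_step):
    """Step the front end along one axis of the vision system, acquiring one
    image of the calibration plate per step, until the gray values on the two
    sides of the monitored boundary change.  move_min_step and capture_image
    are caller-supplied robot/vision interfaces (hypothetical, not from the
    patent); min_step is the robot's minimum moving step."""
    images = [capture_image()]              # image at the current position
    moves = 0
    while True:
        move_min_step(axis)                 # one minimum step along 'axis'
        moves += 1
        images.append(capture_image())
        if boundary_changed(images[-2], images[-1]):
            break
    return images, moves, moves * min_step  # moving distance = moves x step size
```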
Because the vision system analyzes the gray values of the acquired images, the acquired images are binarized, and the specific position of the front end on the calibration plate cannot be read directly from an acquired image; what is visible is only a region of the corresponding gray value, namely the projection of the front end on the calibration plate. The front end can be regarded as a contact pin: when it touches the calibration board it casts a corresponding projection on the board, and the gray value of the projection area differs from the gray values of the other positions of the calibration board. The position of the front end on the calibration board can therefore be obtained by detecting and calculating the change of the gray values on the calibration board.
Since the first boundary and the second boundary both pass through the calibration point and the difference between the gray values at the two sides is obvious, the change conditions of the gray values at the two sides of the first boundary and the second boundary are detected from the acquired image, and then a plurality of positions close to the calibration point are respectively obtained at the two sides of the first boundary and the second boundary, and then step S103 is executed.
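The detection just described can be sketched as follows. This is a minimal illustration, assuming the images are NumPy grayscale arrays, that the first boundary lies roughly along a known image row near the calibration point, and that a small sampling window and gray-level threshold (both assumptions, not values from the patent) are enough to flag a change.

```python
import numpy as np

def boundary_gray_change(prev_img, curr_img, boundary_row, col_slice,
                         window=3, threshold=10):
    """Return True if the mean gray value of the pixels within `window` rows
    on either side of a (roughly horizontal) boundary changes by more than
    `threshold` between two consecutive images."""
    def side_means(img):
        above = img[boundary_row - window:boundary_row, col_slice].mean()
        below = img[boundary_row:boundary_row + window, col_slice].mean()
        return above, below

    prev_above, prev_below = side_means(prev_img)
    curr_above, curr_below = side_means(curr_img)
    # The front end's projection crossing the boundary changes the gray
    # value on at least one side between consecutive images.
    return (abs(curr_above - prev_above) > threshold or
            abs(curr_below - prev_below) > threshold)
```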
And S103, calculating to obtain the coordinates of the initial position of the front end in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system according to the change situation of the gray value and the moving distance.
In step S102, several positions close to the calibration point are obtained, together with the first distance and the second distance between the initial position of the front end of the robot and the first boundary and the second boundary respectively; the coordinates of the calibration point in the robot coordinate system can be calculated from the several positions close to the calibration point. Theoretically, the difference between these coordinates of the calibration point in the robot coordinate system and the coordinates of the set calibration point in the visual coordinate system is smaller than or equal to the minimum moving step of the robot, and in an ideal state the coordinates of the calibration point in the robot coordinate system are the same as the coordinates of the calibration point in the visual coordinate system.
And S104, calculating to obtain a conversion relation between the robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
In step S103, the coordinates of the initial position of the front end in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system are obtained by calculation, while the coordinates of the initial position of the front end in the robot coordinate system and the coordinates of the calibration point in the visual coordinate system are known. A mapping relationship is therefore established as a matrix equation using the coordinates of the initial position in the visual coordinate system and in the robot coordinate system together with the coordinates of the calibration point in the robot coordinate system and in the visual coordinate system; solving it yields the conversion relationship between the robot coordinate system and the visual coordinate system, thereby completing the calibration of the robot.
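The patent does not spell out the form of the matrix equation; one common way to set it up, shown as a hedged sketch below, is to assume the conversion is a 2D similarity transform (rotation, uniform scale and translation, no reflection), which the two point correspondences named above determine exactly.

```python
import numpy as np

def similarity_from_two_points(robot_pts, vision_pts):
    """Solve vision = a * robot + b over the complex plane, where a encodes
    rotation and uniform scale and b encodes translation.  robot_pts and
    vision_pts each hold two corresponding (x, y) points: the initial
    position and the calibration point, expressed in the two coordinate
    systems."""
    r0, r1 = (complex(*p) for p in robot_pts)
    v0, v1 = (complex(*p) for p in vision_pts)
    a = (v1 - v0) / (r1 - r0)
    b = v0 - a * r0
    scale = abs(a)
    theta = np.angle(a)                  # rotation angle, in radians
    t = np.array([b.real, b.imag])       # translation
    return scale, theta, t
```

With more than two calibration points the same equation could be solved in a least-squares sense; two correspondences are the minimum that fixes all four parameters of such a transform.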
Further, as shown in fig. 2, step S101 includes the following steps:
and S1011, moving the front end from the initial position along the y direction of the visual system.
The front end is controlled to move along the y direction of the vision system. It will be appreciated that the movement path of the front end is parallel to the y axis of the vision system, but not necessarily parallel to the set second boundary: when the calibration plate is placed, it may form a certain angle with the vision system, so that the second boundary is not parallel to the y direction of the vision system, and hence the movement path of the front end along the y direction of the vision system is not parallel to the second boundary.
S1012, moving the front end from the initial position in the x direction of the vision system.
Similarly to step S1011, the movement path of the front end of the robot is parallel to the x axis of the vision system, but not necessarily parallel to the set first boundary. When the calibration plate is placed, it may form a certain angle with the vision system, so that the first boundary is not parallel to the x direction of the vision system, and hence the movement path of the front end along the x direction of the vision system is not parallel to the first boundary.
It should be noted that step S1011 and step S1012 do not have a strict execution order: step S1011 can be executed first and then step S1012, or step S1012 can be executed first and then step S1011. The execution order of step S1011 and step S1012 does not affect the final calculation result.
Further, as shown in fig. 3, step S102 includes the following steps:
S1021, acquiring a plurality of images, and detecting the change situation of the first gray value at two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, recording a first moving distance of the front end, and obtaining a first position and a second position of the front end in the y direction before and after the change of the first gray value is caused, and a first distance between the initial position of the front end in the y direction and the first boundary.
It should be noted that this step is not performed only after steps S1011 and S1012 have been completed; rather, step S1021 is performed during step S1011, while the front end is moved from the initial position in the y direction of the vision system.
That is, while the front end is moved from the initial position along the y direction of the vision system in step S1011, an image of the calibration board is acquired each time the front end moves, and the change of the gray values on both sides of the first boundary is detected. When a change of the gray values on both sides of the first boundary is detected, the first moving distance of the front end at that moment is recorded, and the first position and the second position of the front end in the y direction before and after the change of the first gray value, as well as the first distance between the initial position of the front end in the y direction and the first boundary, are obtained.
Specifically, in this embodiment the calculation is performed by detecting the change of the gray values of several pixels on the two sides of the first boundary. When such a change is detected, the number of moves of the front end up to that moment is recorded, and the first moving distance of the front end is obtained from this number and the minimum moving step of the front end. When, at a certain move, the front end crosses from one side of the first boundary to the other, the first gray value changes; the first position of the front end in the y direction is its position before this move and the second position is its position after this move. The first position and the second position are both on the same side of the second boundary. The distance between the first position and the first boundary can be calculated from the gray values of the pixels on the two sides of the first boundary, and the first distance between the initial position of the front end in the y direction and the first boundary can then be calculated using the number of moves taken to reach the first position.
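As a sketch of how these two quantities might be combined, the fragment below assumes the sub-pixel residual between the first position and the boundary is estimated by linearly interpolating the gray value of the partially covered pixel between the pure black and pure white levels; the interpolation model, the pixel size and the helper names are illustrative assumptions, not details given in the patent.

```python
def subpixel_offset_to_boundary(gray_pixel, gray_black, gray_white, pixel_size):
    """Estimate, within the pixel adjacent to the boundary, how far the edge of
    the front end's projection lies from the boundary, by linear interpolation
    of the pixel's gray value between the pure black and pure white levels
    (assumed model, for illustration only)."""
    covered_fraction = (gray_white - gray_pixel) / float(gray_white - gray_black)
    return covered_fraction * pixel_size

def first_distance(moves_to_first_position, min_step, subpixel_offset):
    """First distance between the initial position (in y) and the first
    boundary: the whole steps travelled to reach the first position plus the
    sub-pixel residual estimated from the gray values near the boundary."""
    return moves_to_first_position * min_step + subpixel_offset
```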
S1022, acquiring a plurality of images, and detecting the change situation of the second gray value at two sides of the second boundary from the plurality of images; recording a second moving distance of the front end when the gray values at the two sides of the second boundary are changed; and obtaining a third position and a fourth position of the front end in the x-direction before and after causing the second gray value to change, and a second distance between the initial position of the front end in the x-direction and the second boundary.
It should be noted that this step is not performed only after steps S1011 and S1012 have been completed; rather, step S1022 is performed during step S1012, while the front end is moved from the initial position in the x direction of the vision system.
That is, while the front end is moved from the initial position along the x direction of the vision system in step S1012, an image of the calibration plate is acquired each time the front end moves, and the change of the gray values on both sides of the second boundary is detected. When a change of the gray values on both sides of the second boundary is detected, the second moving distance of the front end at that moment is recorded, and the third position and the fourth position of the front end in the x direction before and after the change of the second gray value, as well as the second distance between the initial position of the front end in the x direction and the second boundary, are obtained.
Specifically, in this embodiment the calculation is performed by detecting the change of the gray values of several pixels on the two sides of the second boundary. When such a change is detected, the number of moves of the front end up to that moment is recorded, and the second moving distance of the front end is obtained from this number and the minimum moving step of the front end. When, at a certain move, the front end crosses from one side of the second boundary to the other, the second gray value changes; the third position of the front end in the x direction is its position before this move and the fourth position is its position after this move. The third position and the fourth position are both on the same side of the first boundary. The distance between the third position and the second boundary can be calculated from the gray values of the pixels on the two sides of the second boundary, and the second distance between the initial position of the front end in the x direction and the second boundary can then be calculated using the number of moves taken to reach the third position.
Further, as shown in fig. 4, step S101 further includes the following steps:
and S1013, according to the second moving distance, moving the front end from the initial position to a fourth position along the x direction of the vision system, and then moving the front end from the fourth position along the y direction of the vision system.
The front end of the robot is controlled to move from its initial position on the calibration plate along the x direction of the vision system according to the second moving distance until it reaches the fourth position obtained in step S1022, and the front end is then moved from the fourth position along the y direction of the vision system. The specific manner of moving the front end from the fourth position in the y direction of the vision system is similar to step S1011.
And S1014, moving the front end from the initial position to a second position along the y direction of the visual system according to the first moving distance, and then moving the front end from the second position along the x direction of the visual system.
The front end of the robot is controlled to move from its initial position on the calibration plate along the y direction of the vision system according to the first moving distance until it reaches the second position obtained in step S1021, and the front end is then moved from the second position along the x direction of the vision system. The specific manner of moving the front end from the second position in the x direction of the vision system is similar to step S1012.
Further, as shown in fig. 5, step S102 further includes the following steps:
S1023, in the process of moving the front end from the fourth position in step S1013 along the y direction of the vision system, acquiring a plurality of images, and detecting the change situation of the first gray values on the two sides of the first boundary from the plurality of images; when the gray values on both sides of the first boundary change, a fifth position and a sixth position of the front end in the y direction before and after the change of the first gray value are obtained.
It is to be noted that this step is executed in the execution process of step S1013.
That is, in moving the front end from the fourth position in step S1013, an image of the calibration plate is acquired each time the front end moves along the y direction of the vision system. When, at a certain move, the front end crosses from one side of the first boundary to the other, the first gray value changes; the fifth position of the front end in the y direction is its position before this move and the sixth position is its position after this move. The distance between the fifth position and the sixth position is the minimum moving step of the front end, and the fifth position and the sixth position are on the two sides of the first boundary respectively.
S1024, in the process of moving the front end from the second position in step S1014 along the x direction of the vision system, acquiring a plurality of images, and detecting the change situation of the second gray values on the two sides of the second boundary from the plurality of images; when the gray values on both sides of the second boundary change, a seventh position and an eighth position of the front end in the x direction before and after the change of the second gray value are obtained.
It should be noted that this step is executed in the execution process of step S1014.
That is, in moving the front end from the second position in step S1014, an image of the calibration plate is acquired each time the front end moves along the x direction of the vision system. When, at a certain move, the front end crosses from one side of the second boundary to the other, the second gray value changes; the seventh position of the front end in the x direction is its position before this move and the eighth position is its position after this move. The distance between the seventh position and the eighth position is the minimum moving step of the front end, and the seventh position and the eighth position are on the two sides of the second boundary respectively.
Further, as shown in fig. 6, step S103 includes the following steps:
and S1031, calculating to obtain the coordinates of the initial position of the front end in the visual coordinate system according to the first distance and the second distance.
The first distance is the distance between the front end and the first boundary in the y direction, and the second distance is the distance between the front end of the robot and the second boundary in the x direction. The calibration plate is arranged so that the first boundary is as parallel as possible to the x direction of the vision system and the second boundary as parallel as possible to the y direction, so the first boundary can be regarded as the x axis and the second boundary as the y axis. The first distance can then be regarded as the distance between the front end and the x axis in the y direction, and the second distance as the distance between the front end and the y axis in the x direction; together they give the coordinates of the initial position of the front end in the visual coordinate system. It is understood that there may be an error in setting the calibration plate, and thus there may be an error in the coordinates of the initial position of the front end in the visual coordinate system.
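A minimal numeric sketch of this step under the approximation just stated (first boundary treated as the x axis, second boundary as the y axis); the function name and the sign handling, i.e. on which side of each boundary the initial position lies, are illustrative assumptions.

```python
def initial_position_in_vision(first_distance, second_distance,
                               below_first_boundary=True,
                               left_of_second_boundary=True):
    """Treat the first boundary as the x axis and the second boundary as the
    y axis: the second distance (to the y axis) gives |x| and the first
    distance (to the x axis) gives |y|; the flags encode the side of each
    boundary on which the initial position lies (assumed convention)."""
    x = -second_distance if left_of_second_boundary else second_distance
    y = -first_distance if below_first_boundary else first_distance
    return x, y
```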
S1032, obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position, and obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, wherein the coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system.
Specifically, the first reference line is obtained by connecting the midpoint between the first position and the second position with the midpoint between the fifth position and the sixth position; the second reference line is obtained by connecting the midpoint between the third position and the fourth position with the midpoint between the seventh position and the eighth position. The coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system. Theoretically, the difference between these coordinates and the coordinates of the set calibration point in the visual coordinate system is smaller than or equal to the minimum moving step of the robot, and in an ideal state the coordinates of the calibration point in the robot coordinate system are the same as the coordinates of the calibration point in the visual coordinate system. A mapping relationship is then established as a matrix equation using the coordinates of the initial position of the front end in the visual coordinate system and in the robot coordinate system together with the coordinates of the calibration point in the robot coordinate system and in the visual coordinate system; solving it yields the conversion relationship between the robot coordinate system and the visual coordinate system, and the calibration of the robot is completed.
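A minimal sketch of the reference-line construction and intersection described above, assuming each recorded position is a 2D point (x, y) in the robot coordinate system; the intersection is found by solving a standard 2x2 linear system, and the function name is an assumption for illustration.

```python
import numpy as np

def midpoint(p, q):
    return (np.asarray(p, dtype=float) + np.asarray(q, dtype=float)) / 2.0

def calibration_point_in_robot(pos1, pos2, pos5, pos6, pos3, pos4, pos7, pos8):
    """First reference line: through the midpoints of (pos1, pos2) and
    (pos5, pos6).  Second reference line: through the midpoints of
    (pos3, pos4) and (pos7, pos8).  Their intersection approximates the
    calibration point in the robot coordinate system."""
    a1, a2 = midpoint(pos1, pos2), midpoint(pos5, pos6)   # two points on line 1
    b1, b2 = midpoint(pos3, pos4), midpoint(pos7, pos8)   # two points on line 2
    d1, d2 = a2 - a1, b2 - b1                             # direction vectors
    # Solve a1 + t*d1 = b1 + s*d2 for t (2x2 linear system).
    t, _ = np.linalg.solve(np.column_stack((d1, -d2)), b1 - a1)
    return a1 + t * d1
```

The same construction reproduces the intersection point H obtained graphically in FIGS. 7f to 7h below.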
Referring to figs. 7a to 7h, the robot calibration method according to the first embodiment of the present invention is further illustrated below on the basis of the above detailed description.
The calibration plate is placed facing the vision system, with the calibration point on the calibration plate at the origin of the vision system, the first boundary coinciding with the x axis and the second boundary coinciding with the y axis. The calibration plate is a checkerboard formed by alternately arranged black and white lattices. As shown in fig. 7a, areas A and D are white lattices and areas B and C are black lattices; in order to show clearly the influence of the movement of the front end on the gray values, the black areas B and C are drawn unfilled. At this point the calibration plate faces the vision system, the x axis and the y axis are the first boundary and the second boundary respectively, and the calibration point in the vision system is the intersection point of the x axis and the y axis.
Fig. 7a is a gray scale image of the calibration board obtained by the vision system, and the shaded area E is the projection on the calibration board when the front end contacts the calibration board (it is understood that only the projection near where the front end contacts the calibration board is drawn in the figure, and the projection of the front end should be continuous and extend to the edge of the calibration board). Each cell in the figure represents a pixel. At this time, the vision system can only obtain the gray distribution on the calibration plate, and cannot directly obtain the position of the contact point of the front end and the calibration plate from the image.
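For experimenting with the detection logic sketched earlier without robot hardware, a small synthetic image in the spirit of FIG. 7a can be built as below; the quadrant assignment, gray levels, image size and the position of the projection stripe are all illustrative assumptions, not values taken from the figure.

```python
import numpy as np

def synthetic_board(size=40, dark=40, light=220, tip_col=18, tip_rows=slice(22, 40)):
    """Build a small grayscale image resembling FIG. 7a: four checkerboard
    quadrants around the calibration point at the image centre, plus a dark
    one-pixel-wide stripe standing in for the projection of the front end."""
    half = size // 2
    img = np.full((size, size), light, dtype=np.uint8)   # white quadrants (A, D)
    img[:half, half:] = dark                             # one black quadrant (B)
    img[half:, :half] = dark                             # the other black quadrant (C)
    img[tip_rows, tip_col] = dark // 2                   # projection of the front end (E)
    return img
```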
In the process of moving the front end from the initial position in the y direction of the vision system through step S1011, a plurality of images are acquired through step S1021, and the change of the first gray values on both sides of the first boundary is detected from the acquired images. Suppose that at a certain minimum moving step the front end of the robot moves from one side of the first boundary to the other; the acquired image is then as shown in fig. 7b. The change of the first gray values on both sides of the first boundary is calculated from the gray distributions of figs. 7a and 7b, from which the position of the contact point F between the front end of the robot and the calibration plate, and the first position 10 and the second position 20 of the front end before and after the move (i.e. the positions of the contact point between the front end and the calibration plate before and after the move), can be obtained. Meanwhile, the first distance between the initial position of the front end of the robot in the y direction and the first boundary is calculated from the gray values of the pixels on both sides of the first boundary.
The front end of the robot is returned to the initial position, and the front end is moved from the initial position in the x direction of the vision system through step S1012; during this movement a plurality of images are acquired through step S1022, and the change of the second gray values on both sides of the second boundary is detected from the acquired images. Suppose that at a certain minimum moving step the front end of the robot moves from one side of the second boundary to the other; the acquired image is then as shown in fig. 7c. The change of the second gray values on both sides of the second boundary is calculated from the gray distributions of figs. 7a and 7c, from which the third position 30 and the fourth position 40 of the front end of the robot before and after the move can be obtained. Meanwhile, the second distance between the initial position of the front end in the x direction and the second boundary is calculated from the gray values of the pixels on both sides of the second boundary.
The front end of the robot is returned to the initial position, and through step S1013 the front end is moved from the initial position to the fourth position in the x direction according to the second moving distance, and is then moved from the fourth position in the y direction of the vision system. In the process of moving the front end from the fourth position in the y direction of the vision system, a plurality of images are acquired through step S1023, and the change of the first gray values on both sides of the first boundary is detected from the acquired images. Suppose that at a certain minimum moving step the front end moves from one side of the first boundary to the other; the acquired image is then as shown in fig. 7d. The change of the first gray values on both sides of the first boundary is calculated from the image (not shown) obtained when the front end moved from one side of the second boundary to the other along the x direction and from the gray distribution of fig. 7d, from which the fifth position 50 and the sixth position 60 of the front end before and after the move can be obtained.
The front end of the robot is returned to the initial position again, and through step S1014 the front end is moved from the initial position to the second position in the y direction according to the first moving distance, and is then moved from the second position in the x direction of the vision system. In the process of moving the front end from the second position in the x direction of the vision system, a plurality of images are acquired through step S1024, and the change of the second gray values on both sides of the second boundary is detected from the acquired images. Suppose that at a certain minimum moving step the front end of the robot moves from one side of the second boundary to the other; the acquired image is then as shown in fig. 7e. The change of the second gray values on both sides of the second boundary is calculated from the image (not shown) obtained when the front end moved from one side of the first boundary to the other along the y direction and from the gray distribution of fig. 7e, from which the seventh position 70 and the eighth position 80 of the front end before and after the move can be obtained.
According to the first distance between the initial position of the front end in the y direction and the first boundary and the second distance between the initial position of the front end in the x direction and the second boundary, which are obtained in the above steps, the coordinates of the initial position of the front end in the visual coordinate system can be calculated.
As shown in fig. 7f, the first position 10, the second position 20, the fifth position 50 and the sixth position 60 are obtained according to steps S1021 and S1023, and the midpoint between the first position 10 and the second position 20 is connected with the midpoint between the fifth position 50 and the sixth position 60, giving the first reference line L1 shown in fig. 7f. As shown in fig. 7g, the third position 30, the fourth position 40, the seventh position 70 and the eighth position 80 are obtained according to steps S1022 and S1024, and the midpoint between the third position 30 and the fourth position 40 is connected with the midpoint between the seventh position 70 and the eighth position 80, giving the second reference line L2 shown in fig. 7g. As shown in fig. 7h, the first reference line L1 and the second reference line L2 intersect, and the coordinates of the intersection point H are the coordinates of the calibration point in the robot coordinate system.
Because each change of the gray values on either side of the first boundary or the second boundary is caused by the front end of the robot moving by one minimum moving step, the distance between the first and second positions, between the third and fourth positions, between the fifth and sixth positions, and between the seventh and eighth positions are each substantially equal to the minimum moving step of the front end. In theory, the difference between the coordinates of the calibration point in the robot coordinate system and the coordinates of the set calibration point in the visual coordinate system is therefore no greater than the minimum moving step of the robot; ideally the two coincide. A mapping is then established by a matrix equation using the coordinates of the initial position of the front end in the visual coordinate system and in the robot coordinate system, and the coordinates of the calibration point in the robot coordinate system and in the visual coordinate system, from which the conversion relation between the robot coordinate system and the visual coordinate system is obtained, completing the calibration of the robot.
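The specification does not spell out the matrix equation used for the mapping. The sketch below assumes a planar similarity model (rotation, uniform scale and translation), which the two available point correspondences, namely the initial position and the calibration point expressed in both coordinate systems, determine exactly; the function name and the sample values are hypothetical.

```python
import numpy as np

def similarity_from_two_points(robot_pts, vision_pts):
    """Solve p_robot = s*R(theta) @ p_vision + t from two point pairs.
    Writing s*R as [[a, -b], [b, a]], each pair contributes two linear
    equations in the unknowns (a, b, tx, ty)."""
    A, y = [], []
    for (xr, yr), (xv, yv) in zip(robot_pts, vision_pts):
        A.append([xv, -yv, 1.0, 0.0]); y.append(xr)
        A.append([yv,  xv, 0.0, 1.0]); y.append(yr)
    a, b, tx, ty = np.linalg.solve(np.array(A), np.array(y))
    return np.array([[a, -b], [b, a]]), np.array([tx, ty])

# Hypothetical sample values: initial position and calibration point,
# each expressed in the robot frame and in the vision frame.
robot_pts = [(10.0, 20.0), (12.5, 20.0)]
vision_pts = [(0.4, -0.8), (0.9, -0.8)]
sR, t = similarity_from_two_points(robot_pts, vision_pts)
# A vision-frame point p maps to the robot frame as sR @ p + t.
```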
Referring to fig. 8, fig. 8 is a schematic structural diagram of a storage medium according to an embodiment of the present invention. As shown in fig. 8, the storage medium 400 in this embodiment stores executable program data 401, and when the program data 401 is executed, the robot calibration method of the embodiments shown in fig. 1 to 6 is implemented. In this embodiment, the storage medium 400 may be any medium with a storage function, such as the storage module of an intelligent terminal, a mobile storage device (e.g., a mobile hard disk or a USB flash drive), a network cloud disk, or an application storage platform.
Referring to fig. 9, fig. 9 is a functional block diagram of the first embodiment of the robot calibration system of the present invention. The robot calibration system 200 comprises a vision device 201, a processor 202 and a memory 204, where the vision device 201 and the memory 204 are each connected to the processor 202.
The vision device 201 is used for imaging the space in which the front end of the robot operates and for establishing the corresponding vision system. The memory 204 is used for storing the preset movement rule of the front end and the execution instructions of the processor.
The processor 202 is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate; acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end; calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance; and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
In this embodiment, the actions executed by the processor 202 correspond to the robot calibration method shown in fig. 1 to 6; for details, reference is made to the detailed description of that method, which is not repeated here.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a robot according to an embodiment of the present invention. The robot 300 includes a robot body 301 and a robot calibration system 302.
The robot body 301 includes the robot front end, which is used for gripping an object. The robot calibration system 302 includes a vision device, a processor, and a memory, where the vision device and the memory are each connected to the processor. The vision device is used for imaging the space in which the front end operates and for establishing the corresponding vision system. The memory is used for storing the preset movement rule of the front end and the execution instructions of the processor. The processor is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate; acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end; calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance; and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
The robot calibration system in this embodiment is the same as the robot calibration system shown in fig. 9, and is not described here again.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (23)

1. A robot calibration method comprises the following steps:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
2. The calibration method according to claim 1, wherein the controlling the front end of the robot to move on the calibration plate from the initial position according to the preset rule comprises:
moving the front end from the initial position in a y-direction of the vision system;
and moving the front end from the initial position in an x-direction of the vision system.
3. The calibration method according to claim 2, wherein the acquiring a plurality of images of the front end acquired by a vision system during the moving process, detecting a change of gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording a moving distance of the front end comprises:
acquiring the plurality of images in the process of moving the front end from the initial position along the y direction of the visual system, and detecting the change situation of first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, recording a first moving distance of a front end, and obtaining a first position and a second position of the front end in the y direction before and after the change of the first gray value is caused, and a first distance between an initial position of the front end in the y direction and the first boundary;
acquiring the plurality of images in the process of moving the front end from the initial position along the x direction of the vision system, and detecting the change condition of second gray values on two sides of the second boundary from the plurality of images; recording a second moving distance of the front end when the gray values at the two sides of the second boundary are changed; and obtaining a third position and a fourth position of the front end in the x-direction before and after causing the second gray value to change, and a second distance between the initial position of the front end in the x-direction and the second boundary.
4. The calibration method according to claim 3, wherein the controlling the front end of the robot to move on the calibration plate according to the preset rule further comprises:
moving the front end from the initial position to the fourth position along the x direction of the vision system according to a second moving distance, and then moving the front end from the fourth position along the y direction of the vision system;
and moving the front end from the initial position to the second position along the y direction of the vision system according to the first moving distance, and then moving the front end from the second position along the x direction of the vision system.
5. The calibration method according to claim 4, wherein the acquiring a plurality of images of the front end acquired by a vision system during the moving process, detecting a change of gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording a moving distance of the front end further comprises:
acquiring the plurality of images during the moving of the front end from the fourth position along the y direction of the visual system, and detecting the change situation of the first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, a fifth position and a sixth position of the front end in the y direction before and after the first gray value changes are obtained;
acquiring the plurality of images during the process of moving the front end from the second position along the x direction of the vision system, and detecting the change situation of the second gray value at two sides of the second boundary from the plurality of images; and when the gray values on the two sides of the second boundary are changed, obtaining a seventh position and an eighth position of the front end in the x direction before and after the second gray value is changed.
6. The calibration method according to claim 5, wherein the calculating the coordinates of the initial position of the front end in the visual coordinate system and the coordinates of the calibration point in the robot coordinate system according to the change of the gray value and the moving distance includes:
calculating to obtain the coordinates of the initial position of the front end in the visual coordinate system according to the first distance and the second distance;
and obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position, obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, wherein the coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system.
7. The calibration method according to claim 6, wherein the obtaining of the first reference line according to the first position, the second position, the fifth position, and the sixth position specifically includes:
connecting a midpoint between the first position and the second position with a midpoint between the fifth position and the sixth position to obtain a first reference line;
obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, specifically:
and connecting a midpoint between the third position and the fourth position with a midpoint between the seventh position and the eighth position to obtain a second reference line.
8. The calibration method according to claim 7, wherein the calibration plate is a checkerboard arranged at intervals of black grids and white grids, and the calibration points arranged on the calibration plate are intersections of the black grids and the white grids; the first boundary and the second boundary determined according to the calibration point include:
respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line;
when the origin of the vision system coincides with the calibration point;
and defining one boundary line which is closest to the x axis of the vision system as a first boundary and defining one boundary line which is closest to the y axis of the vision system as a second boundary.
9. The calibration method according to claim 1, wherein the calibration plate is a checkerboard arranged alternately with black grids and white grids, and the calibration points arranged on the calibration plate are intersections of the black grids and the white grids;
the first boundary and the second boundary determined according to the calibration point include:
and respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line.
10. The calibration method according to claim 1, wherein the detecting a change of the gray-scale values at two sides of the first boundary and the second boundary specifically comprises:
and detecting the change situation of the gray value of a plurality of pixels at two sides of the first boundary and the second boundary.
11. A storage medium, wherein the storage medium stores program data executable to implement a robot calibration method according to any one of claims 1-10.
12. A robot calibration system, comprising: a vision device, a processor, and a memory; wherein the vision device and the memory are respectively connected with the processor;
the vision device is used for shooting the space operated by the front end of the robot and establishing a corresponding vision system;
the memory is used for storing preset movement rules of the front end of the robot and execution instructions of the processor;
the processor is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
13. The calibration system of claim 12, wherein the controlling the front end of the robot performed by the processor to move from an initial position on the calibration plate according to a preset rule comprises:
moving the front end from the initial position in a y-direction of the vision system;
and moving the front end from the initial position in an x-direction of the vision system.
14. The calibration system according to claim 13, wherein a plurality of images of the front end acquired by the vision system are acquired during the movement, and a change of gray values on both sides of the first boundary and the second boundary is detected from the plurality of images; and recording the moving distance of the front end, comprising:
acquiring the plurality of images in the process of moving the front end from the initial position along the y direction of the visual system, and detecting the change situation of first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, recording a first moving distance of the front end, and obtaining a first position and a second position of the front end in the y direction before and after the change of the first gray value is caused, and a first distance between an initial position of the front end in the y direction and the first boundary;
acquiring the plurality of images in the process of moving the front end from the initial position along the x direction of the vision system, and detecting the change condition of second gray values on two sides of the second boundary from the plurality of images; recording a second moving distance of the front end when the gray values at the two sides of the second boundary are changed; and obtaining a third position and a fourth position of the front end in the x-direction before and after causing the second gray value to change, and a second distance between the initial position of the front end in the x-direction and the second boundary.
15. The calibration system of claim 14, wherein the controlling the front end of the robot performed by the processor to move from an initial position on the calibration plate according to a preset rule further comprises:
moving the front end from the initial position to the fourth position along the x direction of the vision system according to a second moving distance, and then moving the front end from the fourth position along the y direction of the vision system;
and moving the front end from the initial position to the second position along the y direction of the vision system according to the first moving distance, and then moving the front end from the second position along the x direction of the vision system.
16. The calibration system as set forth in claim 15, wherein the processor performs the steps of acquiring a plurality of images of the front end acquired by a vision system during the moving, detecting a change in gray level values on both sides of the first and second boundaries from the plurality of images, and recording a moving distance of the front end, further comprising:
acquiring the plurality of images during the moving of the front end from the fourth position along the y direction of the visual system, and detecting the change situation of the first gray values on two sides of the first boundary from the plurality of images; when the gray values on two sides of the first boundary change, a fifth position and a sixth position of the front end in the y direction before and after the first gray value changes are obtained;
acquiring the plurality of images during the process of moving the front end from the second position along the x direction of the vision system, and detecting the change situation of the second gray value at two sides of the second boundary from the plurality of images; and when the gray values on the two sides of the second boundary are changed, obtaining a seventh position and an eighth position of the front end in the x direction before and after the second gray value is changed.
17. The calibration system of claim 16, wherein the calculation performed by the processor based on the change of the gray value and the moving distance to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system comprises:
calculating the coordinates of the initial position of the front end in the visual coordinate system according to the first distance and the second distance;
and obtaining a first reference line according to the first position, the second position, the fifth position and the sixth position, obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, wherein the coordinates of the intersection point of the first reference line and the second reference line are the coordinates of the calibration point in the robot coordinate system.
18. The calibration system of claim 17, wherein the obtaining of the first reference line according to the first position, the second position, the fifth position, and the sixth position is specifically:
connecting a midpoint between the first position and the second position with a midpoint between the fifth position and the sixth position to obtain a first reference line;
obtaining a second reference line according to the third position, the fourth position, the seventh position and the eighth position, specifically:
and connecting a midpoint between the third position and the fourth position with a midpoint between the seventh position and the eighth position to obtain a second reference line.
19. The calibration system of claim 18, wherein the calibrated calibration point is an intersection of a first reference line and a second reference line.
20. The calibration system according to claim 19, wherein the calibration plate is a checkerboard arranged at intervals of black grids and white grids, and the calibration points arranged on the calibration plate are intersections of the black grids and the white grids; the first boundary and the second boundary determined according to the calibration point include:
respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line;
when the origin of the vision system coincides with the calibration point;
and defining one boundary line which is closest to the x axis of the vision system as a first boundary and defining one boundary line which is closest to the y axis of the vision system as a second boundary.
21. The calibration system according to claim 12, wherein the calibration plate is a checkerboard arranged at intervals of black grids and white grids, and the calibration points arranged on the calibration plate are intersections of the black grids and the white grids;
the first boundary and the second boundary determined according to the calibration point include:
and respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line.
22. A robot comprises a robot body and a robot calibration system,
the robot comprises a front end of the robot for gripping an object;
the robot calibration system includes:
a vision device, a processor, and a memory; wherein the vision device and the memory are respectively connected with the processor;
the vision device is used for shooting the space operated by the front end of the robot and establishing a corresponding vision system;
the memory is used for storing preset movement rules of the front end of the robot and execution instructions of the processor;
the processor is configured to perform the following actions:
controlling the front end of the robot to move from an initial position on a calibration plate according to a preset rule, wherein a calibration point for calibration, a first boundary and a second boundary determined according to the calibration point are arranged on the calibration plate;
acquiring a plurality of images of the front end acquired by a vision system in the moving process, detecting the change conditions of the gray values at two sides of the first boundary and the second boundary from the plurality of images, and recording the moving distance of the front end;
calculating to obtain the coordinates of the initial position of the front end in a visual coordinate system and the coordinates of the calibration point in a robot coordinate system according to the change condition of the gray value and the moving distance;
and calculating to obtain a conversion relation between a robot coordinate system and the visual coordinate system according to the initial position and the position of the calibration point, thereby completing the calibration of the robot.
23. The robot according to claim 22, wherein the calibration plate is a checkerboard arranged at intervals of black grids and white grids, and the calibration points arranged on the calibration plate are intersections of the black grids and the white grids; the first boundary and the second boundary determined according to the calibration point include:
respectively taking the edge lines of the two black and white grids passing through the calibration point as a first boundary line and a second boundary line;
when the origin of the vision system coincides with the calibration point;
and defining one boundary line which is closest to the x axis of the vision system as a first boundary and defining one boundary line which is closest to the y axis of the vision system as a second boundary.
CN201780034033.XA 2017-05-26 2017-05-26 Robot calibration method and system, robot and storage medium Active CN109952176B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/086094 WO2018214147A1 (en) 2017-05-26 2017-05-26 Robot calibration method and system, robot and storage medium

Publications (2)

Publication Number Publication Date
CN109952176A CN109952176A (en) 2019-06-28
CN109952176B true CN109952176B (en) 2022-04-15

Family

ID=64395167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780034033.XA Active CN109952176B (en) 2017-05-26 2017-05-26 Robot calibration method and system, robot and storage medium

Country Status (2)

Country Link
CN (1) CN109952176B (en)
WO (1) WO2018214147A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109636859B (en) * 2018-12-24 2022-05-10 武汉大音科技有限责任公司 Single-camera-based calibration method for three-dimensional visual inspection
CN110017852B (en) * 2019-04-25 2023-03-21 广东省智能机器人研究院 Navigation positioning error measuring method
CN111145247B (en) * 2019-12-18 2023-07-07 配天机器人技术有限公司 Position degree detection method based on vision, robot and computer storage medium
CN112123329A (en) * 2020-02-21 2020-12-25 深圳市三宝创新智能有限公司 Robot 3D vision hand-eye calibration method
CN114838659B (en) * 2022-04-26 2024-04-12 深圳市商汤科技有限公司 Manipulator testing device, testing method, calibration method and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101698303A (en) * 2009-09-11 2010-04-28 大连理工大学 Automatic calibration method between three-dimensional laser and monocular vision
KR20100125015A (en) * 2009-05-20 2010-11-30 삼성전자주식회사 Apparatus and method for calibration, and calibration rig
CN102063721A (en) * 2011-01-06 2011-05-18 大连理工大学 Method for compensating inclination of straight calibration object in calibration process of external parameter of vision system
CN103192397A (en) * 2012-01-09 2013-07-10 沈阳新松机器人自动化股份有限公司 Off-line visual programming method and system for robot
CN104180753A (en) * 2014-07-31 2014-12-03 东莞市奥普特自动化科技有限公司 Rapid calibration method of robot visual system
CN105945909A (en) * 2016-05-13 2016-09-21 大族激光科技产业集团股份有限公司 Error correction method and system for three-degree-of-freedom parallel robot

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1292878C (en) * 2003-09-03 2007-01-03 中国科学院自动化研究所 Pickup camera self calibration method based on robot motion
CN103292725A (en) * 2012-02-29 2013-09-11 鸿富锦精密工业(深圳)有限公司 Special boundary measuring system and method
CN103292695B (en) * 2013-05-10 2016-02-24 河北科技大学 A kind of single eye stereo vision measuring method
CN104827480A (en) * 2014-02-11 2015-08-12 泰科电子(上海)有限公司 Automatic calibration method of robot system
CN104786226A (en) * 2015-03-26 2015-07-22 华南理工大学 Posture and moving track positioning system and method of robot grabbing online workpiece
CN105234943B (en) * 2015-09-09 2018-08-14 大族激光科技产业集团股份有限公司 A kind of industrial robot teaching device and method of view-based access control model identification

Also Published As

Publication number Publication date
CN109952176A (en) 2019-06-28
WO2018214147A1 (en) 2018-11-29

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant