CN112116664B - Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Info

Publication number
CN112116664B
CN112116664B (application CN202010923062.5A)
Authority
CN
China
Prior art keywords
robot
target
calibration plate
pose
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010923062.5A
Other languages
Chinese (zh)
Other versions
CN112116664A (en)
Inventor
许金鹏
温志庆
周德成
李伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ji Hua Laboratory
Priority to CN202010923062.5A
Publication of CN112116664A
Application granted
Publication of CN112116664B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a method, a device, electronic equipment and a storage medium for generating a hand-eye calibration track. A plurality of target rotation angle values are obtained; an initial pose of a robot is acquired; a plurality of target poses are generated according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; the robot moves to the target poses in turn; at each target pose, the position is adjusted according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and the adjusted pose data of the robot are recorded. The calibration track therefore does not need to be designed manually, which improves working efficiency and places low demands on the professional level and experience of staff.

Description

Method and device for generating hand-eye calibration track, electronic equipment and storage medium
Technical Field
The present invention relates to the field of industrial robots, and in particular, to a method and apparatus for generating a hand-eye calibration track, an electronic device, and a storage medium.
Background
At present, achieving precision and intelligence in industrial robots relies on a robot vision system.
The first step in applying robot vision is hand-eye calibration. The most common robot vision configuration is the Eye-in-Hand system, in which a 3D camera is mounted on the flange at the end of the robot so that the camera moves with the robot; the accuracy of the vision system, and hence the machining and manufacturing accuracy of the robot, is determined mainly by the accuracy of the hand-eye calibration. Robot hand-eye calibration is therefore particularly important.
For an Eye-in-Hand system, hand-eye calibration generally requires a motion track and photographing position points of the robot to be set manually; as the robot moves along the set motion track, the camera collects a photo of the calibration plate at each preset photographing position point and the pose of the robot is recorded. Manually designing the robot motion track and the photographing position points is not only inefficient but also demands a high professional level and considerable experience from staff, so the cost to the user is high.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, an object of an embodiment of the present application is to provide a method, an apparatus, an electronic device, and a storage medium for generating a hand-eye calibration track, which can improve working efficiency and have low requirements on the expertise level and experience of a worker.
In a first aspect, an embodiment of the present application provides a method for generating a hand-eye calibration track, which is applied to a robot, including the steps of:
A1. acquiring a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
A4. moving according to the plurality of target poses;
A5. at each target pose, carrying out position adjustment according to the position of the calibration plate, so that the calibration plate is completely exposed in the field of view of the camera;
A6. recording the adjusted pose data of the robot.
In the method for generating the hand-eye calibration track, step A2 comprises: acquiring an initial attitude and an initial position of the robot;
and step A3 comprises the following steps:
A301. according to each target rotation angle value, calculating the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the plurality of normalized quaternions and then carrying out normalization processing, so as to obtain a plurality of target attitudes;
A303. combining the initial position with each of the target attitudes to obtain a plurality of target poses.
In the method for generating the hand-eye calibration track, step A5 comprises the following steps:
A501. acquiring a picture shot by the camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is incomplete in the picture, judging the direction in which the robot needs to move according to how the calibration plate is missing from the picture;
A504. moving the robot in the direction in which it needs to move until the calibration plate is completely exposed to the field of view of the camera.
Further, step A503 comprises:
acquiring information on the feature points of the calibration plate that are missing from the picture;
judging the direction in which the robot needs to move according to the positions of the missing feature points on the calibration plate and the relative position relationship between the calibration plate and the robot.
Further, step A504 comprises:
moving the robot step by step, with a preset step length, in the direction in which it needs to move;
after each step, acquiring a picture shot by the camera and judging whether the calibration plate is complete in the picture;
stopping the movement once the calibration plate is complete in the picture.
In a second aspect, an embodiment of the present application provides a device for generating a hand-eye calibration track, which is applied to a robot, including:
The first acquisition module is used for acquiring a plurality of target rotation angle values;
the second acquisition module is used for acquiring the initial pose of the robot;
The generating module is used for generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
the first execution module is used for moving according to the plurality of target poses;
The position adjustment module is used for carrying out position adjustment at each target pose according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera;
and the recording module is used for recording the adjusted pose data of the robot.
In the hand-eye calibration track generating device, when acquiring the initial pose of the robot, the second acquisition module acquires an initial attitude and an initial position of the robot;
and when generating the plurality of target poses according to the plurality of target rotation angle values and the initial pose, the generation module:
calculates, according to each target rotation angle value, the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the plurality of normalized quaternions and then carries out normalization processing, so as to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain a plurality of target poses.
In the hand-eye calibration track generating device, when adjusting the position of the robot, the position adjustment module:
acquires a picture shot by the camera;
judges whether the calibration plate is complete in the picture;
if the calibration plate is incomplete in the picture, judges the direction in which the robot needs to move according to how the calibration plate is missing from the picture;
and moves the robot in the direction in which it needs to move until the calibration plate is completely exposed to the field of view of the camera.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, where the memory stores a computer program, and the processor is configured to execute the steps of the method for generating a hand-eye calibration track by calling the computer program stored in the memory.
In a fourth aspect, an embodiment of the present application provides a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the hand-eye calibration track generation method described above.
The beneficial effects are that:
The embodiments of the present application provide a method, a device, electronic equipment and a storage medium for generating a hand-eye calibration track. A plurality of target rotation angle values are obtained; an initial pose of the robot is acquired; a plurality of target poses are generated according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; the robot moves to the target poses in turn; at each target pose, the position is adjusted according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and the adjusted pose data of the robot are recorded. The calibration track therefore does not need to be designed manually, which improves working efficiency and places low demands on the professional level and experience of staff.
Drawings
Fig. 1 is a flowchart of a method for generating a hand-eye calibration track according to an embodiment of the present application.
Fig. 2 is a block diagram of a hand-eye calibration track generating device according to an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, the method for generating a hand-eye calibration track provided by the embodiment of the application is applied to a robot, and includes the following steps:
A1. acquiring a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
A4. moving according to the plurality of target poses;
A5. at each target pose, carrying out position adjustment according to the position of the calibration plate, so that the calibration plate is completely exposed in the field of view of the camera;
A6. recording the adjusted pose data of the robot.
The method is aimed at the Eye-in-Hand system, i.e. the camera is arranged at the end of the robot. The principle of the method is as follows. Given several target rotation angle values, the robot rotates around the corresponding axes from the initial pose (at the initial pose the camera can shoot the complete calibration plate) by the given target rotation angle values. The pose changes after each rotation, so the position of the calibration plate in the photo changes. When the calibration plate is incomplete in the photo, the position of the robot is adjusted until the calibration plate is fully exposed in the field of view of the camera again; the pose of the robot changes again through this adjustment, and the adjusted pose is taken as a node of the calibration track. Since a plurality of target rotation angle values are given, a plurality of nodes are obtained, i.e. the calibration track, and it is ensured that the calibration plate can be completely photographed at each node. Therefore, the calibration track does not need to be designed manually, working efficiency is improved, and the demands on the professional level and experience of staff are low.
In step A1, a plurality of preset target rotation angle values (for example 25°, 30°, 35°, 40°, 45°, 50°, -25°, -30°, -35°, -40°, -45°, -50°) can be obtained directly; alternatively, an angle range and a target number can be obtained, and the corresponding number of angle values within the angle range is then selected as the target rotation angle values. In the latter case, the angle values may be selected at equal intervals within the angle range according to the target number, or selected randomly within the angle range.
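For illustration, the range-based selection just described can be sketched in Python as follows; the function name and defaults are illustrative assumptions, not part of the patent:

```python
import random

def select_target_angles(angle_min, angle_max, target_number, mode="equal"):
    """Select target rotation angle values (degrees) within [angle_min, angle_max].

    mode="equal"  -> equally spaced values across the range
    mode="random" -> uniformly random values within the range
    """
    if mode == "equal":
        if target_number == 1:
            return [(angle_min + angle_max) / 2.0]
        step = (angle_max - angle_min) / (target_number - 1)
        return [angle_min + i * step for i in range(target_number)]
    return [random.uniform(angle_min, angle_max) for _ in range(target_number)]

# Either use the preset list from the text, or select 12 values from a range.
preset = [25, 30, 35, 40, 45, 50, -25, -30, -35, -40, -45, -50]
selected = select_target_angles(-50, 50, 12, mode="equal")
```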
Since the pose of the robot comprises an attitude and a position, with the attitude generally represented by three attitude angles (Euler angles) and the position by the coordinates (x, y, z) along three axes, step A2 comprises: acquiring the initial attitude and the initial position of the robot.
Further, step A3 comprises:
A301. according to each target rotation angle value, calculating the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the normalized quaternions and then carrying out normalization processing, so as to obtain a plurality of target attitudes;
A303. combining the initial position with each of the target attitudes to obtain a plurality of target poses.
According to the Z-Y-X Euler angle convention in robot kinematics, the rotation transformation matrix can be expressed as
R(α, β, γ) = Rz(α) · Ry(β) · Rx(γ),
where α, β and γ are the three attitude angles.
Expanding the product gives the coefficient matrix m = [mij]:
m11 = cos α cos β, m12 = cos α sin β sin γ − sin α cos γ, m13 = cos α sin β cos γ + sin α sin γ,
m21 = sin α cos β, m22 = sin α sin β sin γ + cos α cos γ, m23 = sin α sin β cos γ − cos α sin γ,
m31 = −sin β, m32 = cos β sin γ, m33 = cos β cos γ.
From the coefficient matrix m, the quaternion q = (q0, q1, q2, q3) is calculated as
q0 = ½·√(1 + m11 + m22 + m33), q1 = (m32 − m23)/(4·q0), q2 = (m13 − m31)/(4·q0), q3 = (m21 − m12)/(4·q0).
The normalized quaternion is then calculated as
q̂ = q / ‖q‖,
where q̂ is the normalized quaternion and ‖q‖ is the modulus (norm) of the quaternion q.
In step A302, the calculation is performed as
q_t = q′ / ‖q′‖, with q′ = q̂ ⊗ q_0,
where q_t is the target attitude, q′ is the attitude obtained after the quaternion multiplication, ‖q′‖ is the modulus of q′, and q_0 is the initial attitude.
In step A303, each target attitude calculated in step A302 is taken as the attitude of the corresponding target pose, and the initial position obtained in step A2 is taken as the position of the target pose, so that the target pose is obtained.
For example, suppose the target rotation angle values 25°, 30°, 35°, 40°, 45°, 50°, -25°, -30°, -35°, -40°, -45°, -50° are obtained in step A1. In step A301, for the value 25°, the normalized quaternion for rotating 25° around the X axis (α=25°, β=0°, γ=0°), the normalized quaternion for rotating 25° around the Y axis (α=0°, β=25°, γ=0°) and the normalized quaternion for rotating 25° around the Z axis (α=0°, β=0°, γ=25°) are calculated; for the value 30°, the normalized quaternions (α=30°, β=0°, γ=0°), (α=0°, β=30°, γ=0°) and (α=0°, β=0°, γ=30°) are calculated; and so on for the remaining values, giving 36 normalized quaternions in total. Accordingly, pose data for 36 robot poses can finally be obtained through step A6, i.e. at most 36 nodes of the calibration track.
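The computation in steps A301 to A303 can be sketched in Python as follows. The quaternion utilities implement the formulas given above, the (α, β, γ) combinations follow the example in the preceding paragraph, and the function names, the sample initial pose and the assumption that the initial attitude is already available as a unit quaternion are all illustrative:

```python
import math

def quat_from_euler(alpha, beta, gamma):
    """Quaternion q = (q0, q1, q2, q3) from the Z-Y-X coefficient matrix m defined above; angles in radians."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    m11, m12, m13 = ca * cb, ca * sb * sg - sa * cg, ca * sb * cg + sa * sg
    m21, m22, m23 = sa * cb, sa * sb * sg + ca * cg, sa * sb * cg - ca * sg
    m31, m32, m33 = -sb, cb * sg, cb * cg
    q0 = 0.5 * math.sqrt(1.0 + m11 + m22 + m33)
    q = (q0, (m32 - m23) / (4.0 * q0), (m13 - m31) / (4.0 * q0), (m21 - m12) / (4.0 * q0))
    return normalize(q)

def normalize(q):
    """Divide a quaternion by its modulus."""
    n = math.sqrt(sum(c * c for c in q))
    return tuple(c / n for c in q)

def quat_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def generate_target_poses(initial_position, initial_attitude_q, angles_deg):
    """Steps A301-A303: one target pose per (angle, axis) combination; the position stays the initial position."""
    poses = []
    for deg in angles_deg:
        r = math.radians(deg)
        # one single-axis rotation per angle value, using the (alpha, beta, gamma) combinations of the example
        for euler in ((r, 0.0, 0.0), (0.0, r, 0.0), (0.0, 0.0, r)):
            q_hat = quat_from_euler(*euler)                        # A301: normalized quaternion
            q_t = normalize(quat_mul(q_hat, initial_attitude_q))   # A302: multiply by the initial attitude, renormalize
            poses.append((initial_position, q_t))                  # A303: combine with the initial position
    return poses

# 12 angle values x 3 axes -> 36 target poses, as in the example above (sample initial pose is illustrative).
poses = generate_target_poses((0.4, 0.0, 0.6), (1.0, 0.0, 0.0, 0.0),
                              [25, 30, 35, 40, 45, 50, -25, -30, -35, -40, -45, -50])
```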
In step A4, when the robot moves according to the plurality of target poses, the robot sequentially moves to each target pose.
In step A3, the target pose data may be numbered in the order in which they are calculated, so that in step A4 the robot moves to the target poses in numbered order. Alternatively, before step A4, a preferred order may be derived from the generated target pose data such that the total path travelled by the robot between the target poses is minimized, and in step A4 the robot is moved to the target poses sequentially in this preferred order.
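The patent does not prescribe how the preferred order is obtained; one simple possibility, sketched below under that assumption, is a greedy nearest-neighbor pass that always visits the closest remaining target pose according to a quaternion angular distance. The metric and the function names are illustrative, and `poses` reuses the (position, attitude) tuples from the sketch above:

```python
import math

def attitude_distance(qa, qb):
    """Angular distance between two unit quaternions (one possible pose metric; the text does not fix one)."""
    dot = abs(sum(x * y for x, y in zip(qa, qb)))
    return 2.0 * math.acos(min(1.0, dot))

def preferred_order(poses, start_attitude):
    """Greedy nearest-neighbor ordering: always visit the closest remaining target pose next."""
    remaining = list(range(len(poses)))
    order, current = [], start_attitude
    while remaining:
        nxt = min(remaining, key=lambda i: attitude_distance(current, poses[i][1]))
        remaining.remove(nxt)
        order.append(nxt)
        current = poses[nxt][1]
    return order  # indices into `poses`, giving the visiting order
```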
Specifically, step A5 comprises:
A501. acquiring a picture shot by the camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is incomplete in the picture, judging the direction in which the robot needs to move according to how the calibration plate is missing from the picture;
A504. moving the robot in the direction in which it needs to move until the calibration plate is completely exposed to the field of view of the camera.
In step A502, whether the calibration plate is complete in the picture can be judged by checking whether the number of detected feature points is correct, although the method is not limited to this. For example, if the calibration plate is a rectangular plate whose four corner points serve as feature points, the expected number of feature points is 4; if the number of corner points in the picture is not 4, the calibration plate is incomplete in the picture.
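For a chessboard-style calibration plate, such a completeness check can be implemented with OpenCV's corner detector, as in the sketch below. The chessboard pattern and its size are assumptions made for illustration; the plate described here uses four colored corner points instead, for which a color-based detector would play the same role:

```python
import cv2

PATTERN_SIZE = (7, 5)  # inner-corner grid of the assumed chessboard-style plate

def calibration_plate_complete(image_bgr):
    """Return (complete, corners): complete is True only when every expected feature point is detected."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(
        gray, PATTERN_SIZE,
        flags=cv2.CALIB_CB_ADAPTIVE_THRESH | cv2.CALIB_CB_NORMALIZE_IMAGE)
    return found, corners
```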
Step A503 comprises:
acquiring information on the feature points of the calibration plate that are missing from the picture;
judging the direction in which the robot needs to move according to the positions of the missing feature points on the calibration plate and the relative position relationship between the calibration plate and the robot.
The feature points are points on the calibration plate, such as the corner points of the plate or dot patterns specially arranged on its upper surface; generally, different feature points are given different features (such as colors and/or shapes, but not limited to these) so that the identity of a feature point appearing in the picture can be determined. Taking a rectangular calibration plate as an example, the plate has four corner points coated with different colors. If the corner point of the lower left corner is missing from the picture (which is judged from the colors of the three corner points that do appear), the robot needs to move towards the lower left corner of the calibration plate; the direction from the center of the calibration plate to the missing corner point can be taken as the target direction. Because the conversion relationship between the calibration plate coordinate system and the robot base coordinate system has been obtained by calibration in advance (for example, using an existing robot coordinate-system calibration method), the target direction can be converted into a direction in the robot base coordinate system, which gives the direction in which the robot needs to move.
It can be seen that the direction from the center of the calibration plate to the missing feature point (hereinafter referred to as the first target direction) can be obtained first and then converted into a direction in the robot base coordinate system according to the conversion relationship between the calibration plate coordinate system and the robot base coordinate system; this direction is taken as the direction in which the robot needs to move. If several feature points are missing, the centroid of all the missing feature points can be calculated, the direction from the center of the calibration plate to this centroid (hereinafter referred to as the second target direction) is obtained, and the second target direction is then converted into a direction in the robot base coordinate system in the same way and taken as the direction in which the robot needs to move.
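A sketch of this direction computation, assuming the calibration-plate-to-base rotation matrix has been obtained by a prior coordinate-system calibration and that the missing feature points are given in plate-frame coordinates (names and data layout are illustrative):

```python
import numpy as np

def move_direction(plate_center, missing_points, R_base_plate):
    """Unit direction, in the robot base frame, from the plate center towards the missing feature point(s).

    plate_center   : (x, y, z) of the calibration plate center, in the calibration-plate frame
    missing_points : list of (x, y, z) plate-frame coordinates of the missing feature points
    R_base_plate   : 3x3 rotation matrix from the calibration-plate frame to the robot base frame
    """
    target = np.mean(np.asarray(missing_points, dtype=float), axis=0)   # centroid when several points are missing
    d_plate = target - np.asarray(plate_center, dtype=float)            # first (or second) target direction
    d_base = np.asarray(R_base_plate, dtype=float) @ d_plate            # express it in the robot base frame
    return d_base / np.linalg.norm(d_base)
```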
If the part of the calibration plate shown in the picture does not contain the center point of the calibration plate, the center point of the displayed part can be obtained first (obtaining the center point of a region in an image is prior art) and used in place of the center point of the calibration plate in the above method to move the robot, until the center of the calibration plate appears in the picture; the above method is then executed with the actual center point of the calibration plate.
In step A504, the calibration plate being completely exposed to the field of view of the camera means that the whole calibration plate can appear in the picture shot by the camera. Specifically, step A504 comprises:
moving the robot step by step, with a preset step length (which may be called the first step length), in the direction in which it needs to move;
after each step, acquiring a picture shot by the camera and judging whether the calibration plate is complete in the picture;
stopping the movement once the calibration plate is complete in the picture.
Because a picture shot by the camera is acquired after every movement of the first step length to judge whether the calibration plate is complete, and because the set of missing feature points in each newly acquired picture may change, the direction in which the robot needs to move is re-evaluated at every step. The moving direction of the robot therefore changes automatically as the displayed portion of the calibration plate changes during the movement, which more reliably ensures that the calibration plate ends up completely exposed in the field of view of the camera.
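Combining steps A501 to A504, the adjustment loop can be sketched as follows. The camera and robot interfaces are passed in as placeholder callables because the patent does not specify them, and the step bound is an added safety assumption:

```python
def adjust_until_complete(capture_image, plate_status, direction_from_missing, move_robot,
                          first_step_length, max_steps=200):
    """Step towards the missing part of the calibration plate until it is complete in the picture.

    capture_image()              -> image from the eye-in-hand camera
    plate_status(image)          -> (complete: bool, missing_points: list)
    direction_from_missing(pts)  -> unit direction in the robot base frame (see the sketch above)
    move_robot(direction, dist)  -> translate the robot end by dist along direction
    """
    for _ in range(max_steps):
        image = capture_image()                        # A501: picture shot by the camera
        complete, missing = plate_status(image)        # A502: completeness check
        if complete:
            return True                                # plate fully exposed in the field of view
        direction = direction_from_missing(missing)    # A503: direction re-evaluated after every step
        move_robot(direction, first_step_length)       # A504: one step of the first step length
    return False                                       # safety bound reached (an assumption, not in the patent)
```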
If the obtained target rotation angle values are too large, the picture shot at some target poses may not contain the calibration plate at all. (If no feature point appears in the picture, it is difficult to determine which part of the calibration plate, if any, is shown, and therefore also impossible to know which part is missing, so the target direction cannot be obtained.) Whether the shot picture contains the calibration plate is therefore generally judged by whether it contains at least one feature point of the calibration plate, i.e. the picture is considered to contain the calibration plate if it contains at least one feature point. In this case, in step A5, the robot may first be moved along a preset path until the calibration plate appears in the field of view of the camera (i.e. until at least one feature point appears in the picture shot by the camera), after which the position of the robot is adjusted according to steps A501 to A504.
For example, the preset path may be: move forward by a preset distance, return to the initial position and move backward by a preset distance, return to the initial position and move left by a preset distance, then return to the initial position and move right by a preset distance, stopping as soon as the calibration plate is found to appear in the field of view of the camera. Alternatively, the preset path may be a spiral path of gradually increasing radius, with the movement stopping as soon as the calibration plate appears in the field of view of the camera while moving along the spiral. When moving along the preset path, the robot can move step by step with a preset step length (which may be called the second step length); after each step, a picture shot by the camera is acquired and it is judged whether the calibration plate appears in the picture. If the calibration plate still does not appear in the field of view of the camera after moving along the whole preset path, the robot moves on to the next target pose and continues the subsequent processing (i.e. this target pose is abandoned).
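The two preset search paths described above can be generated as position offsets from the initial position and walked with the second step length; the sketch below is purely illustrative and the distances and point counts are assumptions:

```python
import math

def cross_search_offsets(distance, step):
    """Forward/backward/left/right legs from the initial position, returning to it between legs."""
    offsets = []
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):       # forward, backward, left, right
        n = int(distance / step)
        offsets += [(dx * step * i, dy * step * i) for i in range(1, n + 1)]
        offsets.append((0.0, 0.0))                           # return to the initial position
    return offsets

def spiral_search_offsets(turns, step, points_per_turn=36):
    """Spiral path of gradually increasing radius around the initial position."""
    offsets = []
    for k in range(turns * points_per_turn):
        theta = 2.0 * math.pi * k / points_per_turn
        r = step * k / points_per_turn                       # radius grows by one step per turn
        offsets.append((r * math.cos(theta), r * math.sin(theta)))
    return offsets
```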
It should be noted that, to ensure that the calibration plate does not deviate too far from the field of view of the camera when the robot moves to each target pose, the robot may generally be moved to an optimal initial pose before step A1, such that the calibration plate is located directly below the camera and the optical axis of the camera points downward at the center of the calibration plate.
In summary, the method for generating the hand-eye calibration track obtains a plurality of target rotation angle values; acquires the initial pose of the robot; generates a plurality of target poses according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; moves according to the target poses; at each target pose, adjusts the position according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and records the adjusted pose data of the robot. The calibration track therefore does not need to be designed manually, working efficiency is improved, and the demands on the professional level and experience of staff are low.
Referring to fig. 2, an embodiment of the present application provides a device for generating a hand-eye calibration track, which is applied to a robot, and includes a first acquisition module 1, a second acquisition module 2, a generation module 3, a first execution module 4, a position adjustment module 5, and a recording module 6;
the first acquisition module 1 is used for acquiring a plurality of target rotation angle values;
The second acquisition module 2 is used for acquiring the initial pose of the robot;
The generating module 3 is used for generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
the first execution module 4 is used for moving according to a plurality of target poses;
The position adjustment module 5 is used for adjusting the position of each target pose according to the position of the calibration plate, so that the calibration plate is completely exposed in the field of view of the camera;
The recording module 6 is used for recording the adjusted pose data of the robot.
In some embodiments, the first acquisition module 1 directly acquires a plurality of preset target rotation angle values (for example 25 °,30 °, 35 °, 40 °, 45 °,50 °, -25 °, -30 °, -35 °, -40 °, -45 °, -50 °).
In other embodiments, the first obtaining module 1 obtains the angle range and the target number, and then selects a corresponding number of angle values within the angle range as the target rotation angle value according to the target number. The angle values may be selected at equal intervals in the angle range according to the target number, or may be selected randomly in the angle range.
When acquiring the initial pose of the robot, the second acquisition module 2 acquires the initial attitude and the initial position of the robot,
so that when generating the plurality of target poses according to the plurality of target rotation angle values and the initial pose, the generation module 3:
calculates, according to each target rotation angle value, the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the normalized quaternions and then carries out normalization processing, so as to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain a plurality of target poses.
When the first execution module 4 moves according to the plurality of target poses, the robot is sequentially moved to each target pose.
In some embodiments, when generating the plurality of target pose data, the generating module 3 numbers the target pose data according to the sequence of calculating the target pose data, so that the first executing module 4 sequentially moves the robot to the target poses according to the number sequence.
In other embodiments, the hand-eye calibration track generating device further comprises a third acquisition module, which acquires a preferred order from the generated target pose data such that the total path travelled by the robot when moving between the target poses in that order is shortest; the first execution module 4 then moves the robot to each target pose sequentially in the preferred order.
Wherein, when the position adjusting module 5 adjusts the position of the robot,
Acquiring a picture shot by a camera;
Judging whether the calibration plate is complete in the picture;
If the calibration plate is incomplete in the picture, judging the direction in which the robot needs to move according to the missing condition of the calibration plate in the picture;
The robot is moved in the direction to be moved until the calibration plate is completely exposed to the field of view of the camera.
Whether the calibration plate is complete in the picture can be judged by checking whether the number of detected feature points is correct, although the method is not limited to this. For example, if the calibration plate is a rectangular plate whose four corner points serve as feature points, the expected number of feature points is 4; if the number of corner points in the picture is not 4, the calibration plate is incomplete in the picture.
When judging the direction in which the robot needs to move according to how the calibration plate is missing from the picture, the position adjustment module 5:
acquires information on the feature points of the calibration plate that are missing from the picture;
and judges the direction in which the robot needs to move according to the positions of the missing feature points on the calibration plate and the relative position relationship between the calibration plate and the robot.
The feature points are points on the calibration plate, such as the corner points of the plate or dot patterns specially arranged on its upper surface. Taking a rectangular calibration plate with four corner points as an example, when the corner point of the lower left corner is missing from the picture, the robot needs to move towards the lower left corner of the calibration plate; the direction from the center of the calibration plate to the missing corner point can be taken as the target direction, and because the conversion relationship between the calibration plate coordinate system and the robot base coordinate system has been obtained in advance (for example, by an existing robot workpiece coordinate-system calibration method), the target direction can be converted into a direction in the robot base coordinate system, which gives the direction in which the robot needs to move.
It can be seen that the direction from the center of the calibration plate to the missing feature point (the first target direction) can be obtained first and then converted into a direction in the robot base coordinate system according to the conversion relationship between the calibration plate coordinate system and the robot base coordinate system; this direction is taken as the direction in which the robot needs to move. If several feature points are missing, the centroid of all the missing feature points can be calculated, the direction from the center of the calibration plate to this centroid (the second target direction) is obtained, and the second target direction is converted into a direction in the robot base coordinate system in the same way and taken as the direction in which the robot needs to move.
If the part of the calibration plate shown in the picture does not contain the center point of the calibration plate, the center point of the displayed part can be obtained first (obtaining the center point of a region in an image is prior art) and used in place of the center point of the calibration plate in the above method to move the robot, until the center of the calibration plate appears in the picture; the above method is then executed with the actual center point of the calibration plate.
When moving the robot in the direction in which it needs to move, the position adjustment module 5:
moves the robot step by step, with a preset step length (which may be called the first step length), in that direction;
after each step, acquires a picture shot by the camera and judges whether the calibration plate is complete in the picture;
and stops the movement once the calibration plate is complete in the picture.
Because a picture shot by the camera is acquired after every movement of the first step length to judge whether the calibration plate is complete, and because the set of missing feature points in each newly acquired picture may change, the direction in which the robot needs to move is re-evaluated at every step; the moving direction of the robot therefore changes automatically as the displayed portion of the calibration plate changes, which more reliably ensures that the calibration plate ends up completely exposed in the field of view of the camera.
If the obtained target rotation angle values are too large, the pictures shot at some target poses may not contain the calibration plate at all. In this case the position adjustment module 5 may move the robot along a preset path until the calibration plate appears in the field of view of the camera, and then execute the above steps: acquire a picture shot by the camera; judge whether the calibration plate is complete in the picture; if the calibration plate is incomplete in the picture, judge the direction in which the robot needs to move according to how the calibration plate is missing from the picture; and move the robot in that direction until the calibration plate is completely exposed to the field of view of the camera.
For example, the preset path may be: move forward by a preset distance, return to the initial position and move backward by a preset distance, return to the initial position and move left by a preset distance, then return to the initial position and move right by a preset distance, stopping as soon as the calibration plate is found to appear in the field of view of the camera. Alternatively, the preset path may be a spiral path of gradually increasing radius, with the movement stopping as soon as the calibration plate appears in the field of view of the camera while moving along the spiral. When moving along the preset path, the robot can move step by step with a preset step length (which may be called the second step length); after each step, a picture shot by the camera is acquired and it is judged whether the calibration plate appears in the picture.
In summary, the hand-eye calibration track generating device obtains a plurality of target rotation angle values; acquires the initial pose of the robot; generates a plurality of target poses according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; moves according to the target poses; at each target pose, adjusts the position according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and records the adjusted pose data of the robot. The calibration track therefore does not need to be designed manually, working efficiency is improved, and the demands on the professional level and experience of staff are low.
Referring to fig. 3, an embodiment of the present application further provides an electronic device 100, including a processor 101 and a memory 102, where the memory 102 stores a computer program, and the processor 101 is configured to execute the steps of the method for generating a hand-eye calibration track by calling the computer program stored in the memory 102.
The processor 101 is electrically connected to the memory 102. The processor 101 is a control center of the electronic device 100, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or calling computer programs stored in the memory 102, and calling data stored in the memory 102, thereby performing overall monitoring of the electronic device.
Memory 102 may be used to store computer programs and data. The memory 102 stores a computer program having instructions executable in a processor. The computer program may constitute various functional modules. The processor 101 executes various functional applications and data processing by calling a computer program stored in the memory 102.
In this embodiment, the processor 101 in the electronic device 100 loads instructions corresponding to the processes of one or more computer programs into the memory 102, and executes the computer programs stored in the memory 102 so as to implement the following functions: acquiring a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; moving according to the target poses; at each target pose, carrying out position adjustment according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and recording the adjusted pose data of the robot.
In summary, the electronic device obtains a plurality of target rotation angle values; acquires the initial pose of the robot; generates a plurality of target poses according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; moves according to the target poses; at each target pose, adjusts the position according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and records the adjusted pose data of the robot. The calibration track therefore does not need to be designed manually, working efficiency is improved, and the demands on the professional level and experience of staff are low.
An embodiment of the present application also provides a storage medium on which a computer program is stored; when executed by a processor, the computer program runs the steps of the hand-eye calibration track generation method described above to implement the following functions: acquiring a plurality of target rotation angle values; acquiring an initial pose of the robot; generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose, each target pose being the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value; moving according to the target poses; at each target pose, carrying out position adjustment according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera; and recording the adjusted pose data of the robot.
The storage medium may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk.
In summary, although the present invention has been described with reference to the preferred embodiments, it is not limited thereto, and various modifications and variations can be made by those skilled in the art without departing from the spirit and scope of the present invention.

Claims (8)

1. A hand-eye calibration track generation method, applied to a robot having a camera arranged at the end of the robot, characterized by comprising the following steps:
A1. acquiring a plurality of target rotation angle values;
A2. acquiring an initial pose of the robot, wherein in the initial pose the optical axis of the camera points downward at the center of the calibration plate;
A3. generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
A4. moving according to the plurality of target poses;
A5. at each target pose, carrying out position adjustment according to the position of the calibration plate, so that the calibration plate is completely exposed in the field of view of the camera;
A6. recording the adjusted pose data of the robot;
wherein step A2 comprises: acquiring an initial attitude and an initial position of the robot;
and step A3 comprises the following steps:
A301. according to each target rotation angle value, calculating the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
A302. multiplying the initial attitude by each of the plurality of normalized quaternions and then carrying out normalization processing, so as to obtain a plurality of target attitudes;
A303. combining the initial position with each of the target attitudes to obtain a plurality of target poses;
the normalized quaternions are calculated as follows:
q̂ = q / ‖q‖,
wherein q̂ is the normalized quaternion, ‖q‖ is the modulus of the quaternion q, and q = (q0, q1, q2, q3) with q0 = ½·√(1 + m11 + m22 + m33), q1 = (m32 − m23)/(4·q0), q2 = (m13 − m31)/(4·q0), q3 = (m21 − m12)/(4·q0); m11 = cαcβ, m12 = cαsβsγ − sαcγ, m13 = cαsβcγ + sαsγ, m21 = sαcβ, m22 = sαsβsγ + cαcγ, m23 = sαsβcγ − cαsγ, m31 = −sβ, m32 = cβsγ, m33 = cβcγ, where c and s denote the cosine and the sine; α, β and γ are the attitude angles when the robot rotates around the X axis, the Y axis and the Z axis respectively;
the plurality of target attitudes are calculated as follows:
q_t = q′ / ‖q′‖, with q′ = q̂ ⊗ q_0,
wherein q_t is the target attitude, q′ is the attitude obtained after the quaternion multiplication, ‖q′‖ is the modulus of q′, and q_0 is the initial attitude.
2. The hand-eye calibration track generation method according to claim 1, wherein step A5 comprises:
A501. acquiring a picture shot by the camera;
A502. judging whether the calibration plate is complete in the picture;
A503. if the calibration plate is incomplete in the picture, judging the direction in which the robot needs to move according to how the calibration plate is missing from the picture;
A504. moving the robot in the direction in which it needs to move until the calibration plate is completely exposed to the field of view of the camera.
3. The hand-eye calibration track generation method according to claim 2, wherein step A503 comprises:
acquiring information on the feature points of the calibration plate that are missing from the picture;
judging the direction in which the robot needs to move according to the positions of the missing feature points on the calibration plate and the relative position relationship between the calibration plate and the robot.
4. The hand-eye calibration track generation method according to claim 2, wherein step A504 comprises:
moving the robot step by step, with a preset step length, in the direction in which it needs to move;
after each step, acquiring a picture shot by the camera and judging whether the calibration plate is complete in the picture;
stopping the movement once the calibration plate is complete in the picture.
5. A hand-eye calibration track generation device, applied to a robot having a camera arranged at the end of the robot, characterized by comprising:
The first acquisition module is used for acquiring a plurality of target rotation angle values;
The second acquisition module is used for acquiring the initial pose of the robot; in the initial pose, the optical axis of the camera faces downwards to the center of the calibration plate;
The generating module is used for generating a plurality of target poses according to the plurality of target rotation angle values and the initial pose; each target pose is the pose obtained after the robot rotates from the initial pose by the corresponding target rotation angle value;
the first execution module is used for moving according to the plurality of target poses;
The position adjustment module is used for carrying out position adjustment at each target pose according to the position of the calibration plate so that the calibration plate is completely exposed in the field of view of the camera;
The recording module is used for recording the adjusted pose data of the robot;
wherein, when acquiring the initial pose of the robot, the second acquisition module acquires an initial attitude and an initial position of the robot;
and when generating the plurality of target poses according to the plurality of target rotation angle values and the initial pose, the generation module:
calculates, according to each target rotation angle value, the normalized quaternion for rotating the robot by the corresponding angle around each of the X, Y and Z axes, so as to obtain a plurality of normalized quaternions;
multiplies the initial attitude by each of the plurality of normalized quaternions and then carries out normalization processing, so as to obtain a plurality of target attitudes;
and combines the initial position with each of the target attitudes to obtain a plurality of target poses;
the normalized quaternions are calculated as follows:
q̂ = q / ‖q‖,
wherein q̂ is the normalized quaternion, ‖q‖ is the modulus of the quaternion q, and q = (q0, q1, q2, q3) with q0 = ½·√(1 + m11 + m22 + m33), q1 = (m32 − m23)/(4·q0), q2 = (m13 − m31)/(4·q0), q3 = (m21 − m12)/(4·q0); m11 = cαcβ, m12 = cαsβsγ − sαcγ, m13 = cαsβcγ + sαsγ, m21 = sαcβ, m22 = sαsβsγ + cαcγ, m23 = sαsβcγ − cαsγ, m31 = −sβ, m32 = cβsγ, m33 = cβcγ, where c and s denote the cosine and the sine; α, β and γ are the attitude angles when the robot rotates around the X axis, the Y axis and the Z axis respectively;
the plurality of target attitudes are calculated as follows:
q_t = q′ / ‖q′‖, with q′ = q̂ ⊗ q_0,
wherein q_t is the target attitude, q′ is the attitude obtained after the quaternion multiplication, ‖q′‖ is the modulus of q′, and q_0 is the initial attitude.
6. The hand-eye calibration track generation device according to claim 5, wherein, when adjusting the position of the robot, the position adjustment module:
acquires a picture shot by the camera;
judges whether the calibration plate is complete in the picture;
if the calibration plate is incomplete in the picture, judges the direction in which the robot needs to move according to how the calibration plate is missing from the picture;
and moves the robot in the direction in which it needs to move until the calibration plate is completely exposed to the field of view of the camera.
7. An electronic device, comprising a processor and a memory, wherein the memory stores a computer program and the processor is configured to execute the steps of the hand-eye calibration track generation method of any one of claims 1 to 4 by calling the computer program stored in the memory.
8. A storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the hand-eye calibration track generation method of any one of claims 1 to 4.
CN202010923062.5A 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium Active CN112116664B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010923062.5A CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010923062.5A CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112116664A CN112116664A (en) 2020-12-22
CN112116664B true CN112116664B (en) 2024-05-28

Family

ID=73802260

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010923062.5A Active CN112116664B (en) 2020-09-04 2020-09-04 Method and device for generating hand-eye calibration track, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112116664B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115439555A (en) * 2022-08-29 2022-12-06 佛山职业技术学院 Multi-phase machine external parameter calibration method without public view field

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885585A (en) * 2016-12-30 2017-06-23 国家测绘地理信息局卫星测绘应用中心 A kind of satellite borne photography measuring system integration calibration method based on bundle adjustment
CN107063228A (en) * 2016-12-21 2017-08-18 上海交通大学 Targeted attitude calculation method based on binocular vision
CN107478223A (en) * 2016-06-08 2017-12-15 南京理工大学 A kind of human body attitude calculation method based on quaternary number and Kalman filtering
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN108549322A (en) * 2018-04-11 2018-09-18 广州启帆工业机器人有限公司 Pose synchronization method and device for arc track motion of robot
CN108592950A (en) * 2018-05-17 2018-09-28 北京航空航天大学 A kind of monocular camera and Inertial Measurement Unit are with respect to established angle scaling method
JP2018202608A (en) * 2018-09-28 2018-12-27 キヤノン株式会社 Robot device, control method of robot device, program, and recording medium
CN110202573A (en) * 2019-06-04 2019-09-06 上海知津信息科技有限公司 Full-automatic hand and eye calibrating, working face scaling method and device
CN110497386A (en) * 2019-08-26 2019-11-26 中科新松有限公司 A kind of cooperation Robot Hand-eye relationship automatic calibration device and method
CN111152223A (en) * 2020-01-09 2020-05-15 埃夫特智能装备股份有限公司 Full-automatic robot hand-eye calibration method
CN111347426A (en) * 2020-03-26 2020-06-30 季华实验室 Mechanical arm accurate placement track planning method based on 3D vision
CN111415417A (en) * 2020-04-14 2020-07-14 大连理工江苏研究院有限公司 Mobile robot topology experience map construction method integrating sparse point cloud
CN111515944A (en) * 2020-03-30 2020-08-11 季华实验室 Automatic calibration method for non-fixed path robot

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6415190B2 (en) * 2014-09-03 2018-10-31 キヤノン株式会社 ROBOT DEVICE, ROBOT CONTROL PROGRAM, RECORDING MEDIUM, AND ROBOT DEVICE CONTROL METHOD
JP7003463B2 (en) * 2017-07-11 2022-01-20 セイコーエプソン株式会社 Robot control device, robot system, and camera calibration method

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107478223A (en) * 2016-06-08 2017-12-15 南京理工大学 A kind of human body attitude calculation method based on quaternary number and Kalman filtering
CN107063228A (en) * 2016-12-21 2017-08-18 上海交通大学 Targeted attitude calculation method based on binocular vision
CN106885585A (en) * 2016-12-30 2017-06-23 国家测绘地理信息局卫星测绘应用中心 A kind of satellite borne photography measuring system integration calibration method based on bundle adjustment
CN107498558A (en) * 2017-09-19 2017-12-22 北京阿丘科技有限公司 Full-automatic hand and eye calibrating method and device
CN108549322A (en) * 2018-04-11 2018-09-18 广州启帆工业机器人有限公司 Pose synchronization method and device for arc track motion of robot
CN108592950A (en) * 2018-05-17 2018-09-28 北京航空航天大学 A kind of monocular camera and Inertial Measurement Unit are with respect to established angle scaling method
JP2018202608A (en) * 2018-09-28 2018-12-27 キヤノン株式会社 Robot device, control method of robot device, program, and recording medium
CN110202573A (en) * 2019-06-04 2019-09-06 上海知津信息科技有限公司 Full-automatic hand and eye calibrating, working face scaling method and device
CN110497386A (en) * 2019-08-26 2019-11-26 中科新松有限公司 A kind of cooperation Robot Hand-eye relationship automatic calibration device and method
CN111152223A (en) * 2020-01-09 2020-05-15 埃夫特智能装备股份有限公司 Full-automatic robot hand-eye calibration method
CN111347426A (en) * 2020-03-26 2020-06-30 季华实验室 Mechanical arm accurate placement track planning method based on 3D vision
CN111515944A (en) * 2020-03-30 2020-08-11 季华实验室 Automatic calibration method for non-fixed path robot
CN111415417A (en) * 2020-04-14 2020-07-14 大连理工江苏研究院有限公司 Mobile robot topology experience map construction method integrating sparse point cloud

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Jochen Schmidt; Data Selection for Hand-eye Calibration: A Vector Quantization Approach; The International Journal of Robotics Research; 2008-09-30; Vol. 27(9); 1027-1053 *
Jochen Schmidt; Robust Hand-Eye Calibration of an Endoscopic Surgery Robot Using Dual Quaternions; DAGM 2003: Pattern Recognition; 2003-09-12; 548-556 *
3D视觉工坊; Basic principles of hand-eye calibration (手眼标定之基本原理); CSDN (https://blog.csdn.net/Yong_Qi2015/article/details/83960141); 1 *

Also Published As

Publication number Publication date
CN112116664A (en) 2020-12-22

Similar Documents

Publication Publication Date Title
CN106426172B (en) A kind of scaling method and system of industrial robot tool coordinates system
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
CN104842352B (en) Robot system using visual feedback
TWI670153B (en) Robot and robot system
CN106483963B (en) Automatic calibration method of robot system
US11014233B2 (en) Teaching point correcting method, program, recording medium, robot apparatus, imaging point creating method, and imaging point creating apparatus
CN113246135B (en) Robot hand-eye calibration method and device, electronic equipment and storage medium
US20080027580A1 (en) Robot programming method and apparatus with both vision and force
CN113664835B (en) Automatic hand-eye calibration method and system for robot
CN112123341B (en) Robot double-arm coordinated motion control method and device and electronic equipment
CN113370221B (en) Robot TCP calibration system, method, device, equipment and storage medium
CN113664838B (en) Robot positioning placement control method and device, electronic equipment and storage medium
CN112809668A (en) Method, system and terminal for automatic hand-eye calibration of mechanical arm
CN112116664B (en) Method and device for generating hand-eye calibration track, electronic equipment and storage medium
US11577400B2 (en) Method and apparatus for managing robot system
CN114505864B (en) Hand-eye calibration method, device, equipment and storage medium
CN108705530A (en) Method and system for automatically correcting path of industrial robot
CN113814987B (en) Multi-camera robot hand-eye calibration method and device, electronic equipment and storage medium
CN115194769A (en) Coordinate system calibration method and device, robot and storage medium
CN110533727B (en) Robot self-positioning method based on single industrial camera
CN114407012B (en) Robot motion calibration method, apparatus, computer device and storage medium
CN215701709U (en) Configurable hand-eye calibration device
US20230130816A1 (en) Calibration system, calibration method, and calibration apparatus
CN112643718B (en) Image processing apparatus, control method therefor, and storage medium storing control program therefor
CN110672009B (en) Reference positioning, object posture adjustment and graphic display method based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant