CN115351482A - Welding robot control method, welding robot control device, welding robot, and storage medium - Google Patents


Publication number
CN115351482A
Authority
CN
China
Prior art keywords
point cloud
matrix
welding robot
dimensional
target
Prior art date
Legal status
Pending
Application number
CN202211057960.2A
Other languages
Chinese (zh)
Inventor
植美浃
陈烈臻
高建文
张兆彪
陈泓亨
Current Assignee
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Original Assignee
Shenzhen Qianhai Ruiji Technology Co ltd
China International Marine Containers Group Co Ltd
CIMC Containers Holding Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Qianhai Ruiji Technology Co ltd, China International Marine Containers Group Co Ltd, and CIMC Containers Holding Co Ltd
Priority to CN202211057960.2A
Publication of CN115351482A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K37/02 Carriages for supporting the welding or cutting element
    • B23K37/0252 Steering means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B25J11/005 Manipulators for mechanical processing tasks
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems

Abstract

The application provides a control method and device for a welding robot, a welding robot, and a storage medium. The control method of the welding robot comprises the following steps: acquiring original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the tail end of a welding gun of the welding robot; identifying a first position of a corner feature of the calibration block based on the original three-dimensional point cloud data, and recording a first point cloud located at the first position; driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording a second point cloud located at the second position; and compensating a reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud, to obtain a target hand-eye matrix. The method and device can reduce the error of hand-eye calibration compensation for the welding robot.

Description

Welding robot control method, welding robot control device, welding robot, and storage medium
Technical Field
The application relates to the technical field of welding, in particular to a welding robot control method and device, a welding robot and a storage medium.
Background
Current welding approaches mainly comprise manual teaching-programming robot welding and traditional vision-guided robot welding. In the manual teaching-programming mode, the error in the positional relationship between the welding gun and the camera of the welding robot is large due to human factors, and remains large even after hand-eye calibration compensation. Calibration compensation for traditional vision-guided robot welding is still achieved through a two-dimensional camera, which can hardly reflect the actual three-dimensional situation of the welding robot and easily produces large errors. Therefore, the above hand-eye calibration compensation methods for a welding robot may cause large errors.
Disclosure of Invention
An object of the present application is to provide a control method and apparatus for a welding robot, a welding robot, and a storage medium, which can, at least to some extent, reduce the error of hand-eye calibration compensation for the welding robot.
According to an aspect of an embodiment of the present application, there is provided a control method of a welding robot, the method including:
acquiring original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the tail end of a welding gun of a welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data, and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording a second point cloud located at the second position;
and compensating the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix.
According to an aspect of an embodiment of the present application, there is provided a control apparatus of a welding robot, the apparatus including:
the point cloud data acquisition module is used for acquiring original three-dimensional point cloud data of the calibration block through a three-dimensional camera arranged at the tail end of a welding gun of the welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
the identification module is used for identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
the point cloud acquisition module is used for driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording a second point cloud located at the second position;
and the compensation module is used for compensating the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
matching the first point cloud to the second point cloud to obtain the transformation matrix;
sequentially multiplying the inverse matrix of the current pose matrix, the transformation matrix, the current pose matrix and the reference hand-eye matrix to obtain the target hand-eye matrix; the current pose matrix is a pose matrix when the welding robot acquires the original three-dimensional point cloud data of a calibration block, and the pose matrix is a matrix describing the position and the posture of the welding robot.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data;
in the target three-dimensional point cloud data, a first location of a corner feature of the calibration block is identified.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
obtaining the product of the current pose matrix and the reference hand-eye matrix as a target matrix;
and mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the target matrix to obtain the target three-dimensional point cloud data.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
determining a target point pair between the first point cloud and the second point cloud; the target point pair is a correspondence of a point in the first point cloud to a point in the second point cloud;
constructing a rotation and translation matrix through the target point pairs;
transforming the first point cloud to a coordinate system of the second point cloud by using the rotation and translation matrix to obtain a transformed point cloud;
constructing an error function of the transformed point cloud and the second point cloud;
if the error is detected to be smaller than or equal to a preset threshold value according to the error function, determining a transformation matrix based on the rotation and translation matrix;
and if the error is detected to be larger than the preset threshold value according to the error function, updating the first point cloud according to the transformed point cloud, and iterating the rotation and translation matrix between the updated first point cloud and the second point cloud.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
based on the target hand-eye matrix, performing weld extraction on three-dimensional camera data of a weld of a workpiece to be welded to obtain the position of the weld of the workpiece to be welded in a three-dimensional space; the three-dimensional camera data is data of the weld collected by the three-dimensional camera;
and welding the workpieces to be welded based on the positions of the welding seams in the three-dimensional space.
In some embodiments of the present application, based on the foregoing technical solutions, the control device of the welding robot is configured to:
and acquiring original three-dimensional point cloud data of one frame of the calibration block through a three-dimensional camera arranged at the tail end of the welding gun of the welding robot.
According to an aspect of an embodiment of the present application, there is provided a welding robot including: one or more processors; a storage device for storing one or more programs that, when executed by the one or more processors, cause the welding robot to implement the methods provided in the various alternative implementations described above.
According to an aspect of embodiments of the present application, there is provided a computer program medium having stored thereon computer readable instructions, which, when executed by a processor of a computer, cause the computer to perform the method provided in the above various alternative implementations.
According to an aspect of embodiments herein, there is provided a computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations described above.
In the technical solution provided by the embodiments of the application, original three-dimensional point cloud data of a calibration block is collected through a three-dimensional camera arranged at the tail end of a welding gun of a welding robot; a first position of a corner feature of the calibration block is identified, and a first point cloud located at the first position is obtained; the tail end of the welding gun is driven to move toward the first position, a second position actually reached by the tail end of the welding gun after the movement is completed is collected, and a second point cloud located at the second position is obtained; and a reference hand-eye matrix is compensated based on a transformation matrix obtained by matching the first point cloud and the second point cloud, to obtain a target hand-eye matrix. This avoids the large positional error between the welding gun and the camera caused by human factors in the manual teaching mode, and avoids the large error of traditional vision-guided robots, whose two-dimensional cameras can hardly reflect the actual three-dimensional situation. By combining three-dimensional data and compensating the reference hand-eye matrix with the transformation matrix between the first point cloud and the second point cloud, the error of hand-eye calibration compensation for the welding robot is reduced.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned by practice of the application.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1 shows a flow chart of a control method of a welding robot according to a first embodiment of the present application.
Fig. 2 shows a schematic structural diagram of a calibration block according to an embodiment of the present application.
Fig. 3 shows a top view of a calibration block according to an embodiment of the present application.
Fig. 4 is a flowchart illustrating a control method of a welding robot according to a second embodiment of the present application.
Fig. 5 is a schematic structural diagram showing a control device of a welding robot according to a third embodiment of the present application.
Fig. 6 shows a schematic structural diagram of a welding robot according to the fourth embodiment of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these example embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The drawings are merely schematic illustrations of the present application and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more example embodiments. In the following description, numerous specific details are provided to give a thorough understanding of example embodiments of the present application. One skilled in the relevant art will recognize, however, that the subject matter of the present application can be practiced without one or more of the specific details, or with other methods, components, steps, and so forth. In other instances, well-known structures, methods, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 shows a flow chart of a control method of a welding robot according to a first embodiment of the present application. The control method of the welding robot comprises the following steps:
step S101: acquiring original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the tail end of a welding gun of a welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
a calibration block is an object for calibration information. The original three-dimensional point cloud data is data composed of point clouds with three dimensions of an X axis, a Y axis and a Z axis, which are directly acquired by a three-dimensional camera. In order to obtain three-dimensional point cloud data, a plurality of angular points are calibrated by a calibration block, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different. The corner points may be corner points of a specific shape on the calibration block or may be points of interest on the calibration block.
The horizontal direction is, for example, the direction of the plane formed by the X axis and the Y axis, and the vertical direction is, for example, the direction of the plane formed by the X axis and the Z axis. Because the two corner points differ in position in both the horizontal and vertical directions, they can reflect positions in three-dimensional space. Differing in the vertical direction means that their Z-axis coordinates differ; differing in the horizontal direction means that their X-axis or Y-axis coordinates differ. The aim is to prevent the two corner points from coinciding in the horizontal direction, so that the three-dimensional camera can acquire the information of both corner points.
FIG. 2 illustrates the structure of a calibration block in an embodiment. In the calibration block shown in FIG. 2, a calibration plate is arranged in the horizontal direction and a calibration column is arranged in the vertical direction.
FIG. 3 illustrates a top view of the calibration block in one embodiment. A plurality of quadrangles are arranged on the calibration plate, and the vertexes of the quadrangles may be used as corner points; for example, pt0, pt1, pt2 and pt3 may be used as corner points on the calibration plate. The upper surface of the calibration column is trapezoid-shaped and has 5 vertexes, pt4, pt5, pt6, pt7 and pt8, which may be used as the corner points of the calibration column. The calibration block therefore has the following corner points: pt0, pt1, pt2, pt3, pt4, pt5, pt6, pt7 and pt8.
In addition, in another embodiment, the calibration block structure may be adjusted, or the number of corner points may be adjusted. It is sufficient if at least two of the plurality of corner points of the calibration block have different positions in the horizontal direction and the vertical direction.
As an alternative embodiment, the method for acquiring the original three-dimensional point cloud data of the calibration block by the three-dimensional camera arranged at the end of the welding robot welding gun comprises the following steps: and acquiring original three-dimensional point cloud data of one frame of the calibration block through a three-dimensional camera arranged at the tail end of the welding gun of the welding robot.
By collecting only a single frame of original three-dimensional point cloud data, only one frame of image needs to be processed. This is simpler than processing multiple frames and improves the efficiency of hand-eye calibration compensation for the welding robot.
Step S102: identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data, and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
the first position is the position where the corner feature identified from the original three-dimensional point cloud data is located. The first point cloud is a point cloud located at a first location.
Step S103: driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording second point cloud located at the second position;
the target position is the position to which the tip of the welding gun is to be moved. The second position is the position that the tip of the welding gun actually reaches. The second point cloud is a point cloud located at a second location.
Step S104: and compensating the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix.
The transformation matrix is a matrix indicating a transformation between the first point cloud and the second point cloud. The reference hand-eye matrix is a hand-eye matrix of the welding robot set in advance. The target hand-eye matrix is a matrix compensated for the reference hand-eye matrix.
As an optional implementation manner, after compensating the reference hand-eye matrix to obtain the target hand-eye matrix, the method further includes: based on the target hand-eye matrix, performing weld extraction on three-dimensional camera data of a weld of a workpiece to be welded to obtain the position of the weld of the workpiece to be welded in a three-dimensional space; the three-dimensional camera data is data of the weld collected by the three-dimensional camera; and welding the workpieces to be welded based on the positions of the welding seams in the three-dimensional space.
In this way, the welding robot can be assisted in locating the workpiece, and the relative position between the tail end of the welding gun and the optical center of the three-dimensional camera can be calculated rapidly. Computing the compensation from the corner features of the calibration block together with the corner positions actually reached by the tail end of the welding gun is a simple strategy that is easy to implement and avoids introducing additional systematic and computational errors, so that later tasks such as positioning and recognition with the camera module are highly accurate. By exploiting the characteristics of three-dimensional point cloud data, only single-frame sampling is needed, which speeds up the compensation calculation and improves the precision and accuracy of workpiece positioning, weld extraction and welding execution, as well as the efficiency and accuracy of weld recognition.
Fig. 4 is a flowchart illustrating a control method of a welding robot according to a second embodiment of the present application. The control method of the welding robot comprises the following steps:
step S201: acquiring original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the tail end of a welding gun of a welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
step S202: identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data, and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
as an optional implementation, identifying a first location of a corner feature of the calibration block based on the original three-dimensional point cloud data includes: mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data; in the target three-dimensional point cloud data, a first position of a corner feature of the calibration block is identified.
The target three-dimensional point cloud data are the data obtained by mapping the original three-dimensional point cloud data into the welding robot base coordinate system.
As an optional implementation manner, mapping the original three-dimensional point cloud data to the welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data includes: obtaining the product of the current pose matrix and the reference hand-eye matrix as a target matrix; and mapping the original three-dimensional point cloud data to the welding robot base coordinate system based on the target matrix to obtain the target three-dimensional point cloud data.
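The mapping step above can be sketched with homogeneous coordinates. The following is a minimal illustration, not the patent's implementation; the matrix values (an identity reference hand-eye matrix and a pose that translates the tool frame by 1.0 along X) and the function name `transform_points` are hypothetical:

```python
import numpy as np

def transform_points(points, matrix):
    """Apply a 4x4 homogeneous transform to an (N, 3) point array."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])  # (N, 4)
    return (matrix @ homog.T).T[:, :3]

# Hypothetical values: an identity reference hand-eye matrix and a current
# pose matrix that translates the tool frame by 1.0 along X.
tool_pose = np.eye(4)
tool_pose[0, 3] = 1.0
hand_eye = np.eye(4)

target_matrix = tool_pose @ hand_eye  # product of pose matrix and hand-eye matrix
raw_cloud = np.array([[0.0, 0.0, 0.0],
                      [0.5, 0.5, 0.5]])
base_cloud = transform_points(raw_cloud, target_matrix)
print(base_cloud)  # each camera-frame point shifted by 1.0 along X
```

With real data, `tool_pose` would come from the robot controller and `hand_eye` from a prior hand-eye calibration; the structure of the computation is the same.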
As an optional implementation, matching the first point cloud to the second point cloud to obtain the transformation matrix includes: determining a target point pair between the first point cloud and the second point cloud; the target point pair is a correspondence of a point in the first point cloud to a point in the second point cloud; constructing a rotation and translation matrix through the target point pairs; transforming the first point cloud to a coordinate system of the second point cloud by using the rotation and translation matrix to obtain a transformed point cloud; constructing an error function of the transformed point cloud and the second point cloud; if the detected error is smaller than or equal to a preset threshold value according to the error function, determining a transformation matrix based on the rotation and translation matrix; and if the error is detected to be larger than the preset threshold value according to the error function, updating the first point cloud according to the transformed point cloud, and iterating the rotation and translation matrix between the updated first point cloud and the second point cloud.
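The iterative pairing-fitting-checking loop described above is essentially the classical iterative closest point (ICP) scheme. Below is a minimal sketch under stated assumptions: brute-force nearest-neighbour pairing stands in for the patent's target point pairs, the sample clouds are toy data, and the function names are illustrative:

```python
import numpy as np

def best_fit_transform(src, dst):
    """SVD-based rigid transform (rotation + translation) aligning paired points src -> dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)   # 3x3 cross-covariance of centered pairs
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = c_dst - R @ c_src
    return T

def icp(src, dst, max_iter=50, tol=1e-8):
    """Minimal ICP: alternate nearest-point pairing and rigid fitting until the error stalls."""
    T_total = np.eye(4)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        # pair each transformed source point with its nearest target point
        dists = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        pairs = dst[dists.argmin(axis=1)]
        T = best_fit_transform(cur, pairs)      # rotation-and-translation matrix
        cur = (T[:3, :3] @ cur.T).T + T[:3, 3]  # update the source cloud
        T_total = T @ T_total
        err = np.mean(np.linalg.norm(cur - pairs, axis=1))
        if abs(prev_err - err) < tol:           # error change at/below threshold: stop
            break
        prev_err = err
    return T_total

# Hypothetical corner clouds: dst is src shifted by (0.1, -0.2, 0.05)
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
dst = src + np.array([0.1, -0.2, 0.05])
Tran = icp(src, dst)
print(Tran[:3, 3])  # recovers the (0.1, -0.2, 0.05) offset
```

In practice the nine corner pairs are known correspondences, so the pairing step can be skipped and a single `best_fit_transform` call already gives the transformation matrix.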
Step S203: driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording second point cloud located at the second position;
step S204: matching the first point cloud to the second point cloud to obtain the transformation matrix;
step S205: sequentially multiplying the inverse matrix of the current pose matrix, the transformation matrix, the current pose matrix and the reference hand-eye matrix to obtain the target hand-eye matrix; the current pose matrix is a pose matrix when the welding robot acquires the original three-dimensional point cloud data of a calibration block, and the pose matrix is a matrix describing the position and the posture of the welding robot.
The above technical solution is described below with reference to an embodiment in a specific scenario.
First, the three-dimensional camera at the tail end of the welding gun of the welding robot is driven to acquire three-dimensional point cloud data pcd0 of the calibration block, the current pose matrix toolPose of the welding robot is recorded, and pcd0 is converted into the welding robot base coordinate system through the hand-eye matrix handEye to obtain three-dimensional point cloud data pcd1.
Second, the corner features pt0, pt1, …, pt8 on the calibration block are extracted and recorded from the calibration block's three-dimensional point cloud data pcd1. Meanwhile, the tail end of the welding gun is driven to the corresponding corner feature positions on the calibration block, and the positions actually reached, pt0', pt1', …, pt8', are recorded. The transformation relationship between pcd0 and pcd1 is as follows:
pcd1 = transform(pcd0, toolPose * handEye).
and moreover, based on the angular point features pt0, pt1, \ 8230;, pt8 in the calibration block three-dimensional point cloud data and the corresponding three-dimensional space positions pt0', pt1' \ 8230;, pt8' thereof are recorded as point clouds pcd2 and pcd3, respectively. And then matching the pcd2 to the position of the pcd3 by adopting a point cloud registration algorithm, and obtaining a transformation matrix Tran at the moment.
Then, the hand-eye matrix handEye of the robot is compensated through the transformation matrix Tran, so as to obtain the compensated hand-eye matrix handEye'. The compensated hand-eye matrix is:
handEye' = toolPose⁻¹ * Tran * toolPose * handEye.
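As a numerical illustration of this compensation formula (all matrix values below are hypothetical, chosen only to make the arithmetic visible; a real toolPose and handEye would come from the robot and a prior calibration):

```python
import numpy as np

# Hypothetical values: an identity current pose, a reference hand-eye matrix
# translated by (0.3, 0.0, 0.1), and a registration result Tran that shifts
# by (0.01, 0.02, 0.0).
tool_pose = np.eye(4)
hand_eye = np.eye(4)
hand_eye[:3, 3] = [0.3, 0.0, 0.1]
Tran = np.eye(4)
Tran[:3, 3] = [0.01, 0.02, 0.0]

# handEye' = toolPose^-1 * Tran * toolPose * handEye
hand_eye_comp = np.linalg.inv(tool_pose) @ Tran @ tool_pose @ hand_eye
print(hand_eye_comp[:3, 3])  # reference translation shifted by Tran's offset
```

Conjugating Tran by toolPose expresses the base-frame registration residual in the tool frame before it is folded into the hand-eye matrix; with an identity pose the compensation reduces to left-multiplying by Tran.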
and finally, the compensated hand-eye matrix hand eye' assists a welding robot to extract and weld the weld of the workpiece.
In this way, the calibration block three-dimensional point cloud data acquired by the three-dimensional camera are converted into the welding robot base coordinate system, the calibration block corner feature points are picked from those data, and the welding robot is driven to acquire the corresponding corner feature positions actually reached. Point cloud registration is performed on the two groups of data to obtain the transformation matrix converting one into the other, and finally the robot hand-eye matrix is compensated. This can assist the welding robot in quickly determining the position of the tail end of the welding gun relative to the actual weld of the workpiece.
The compensation matrix of the hand-eye transformation matrix is rapidly solved, the precision of the relative position conversion matrix between the three-dimensional camera and the tail end of the welding gun of the welding robot is further improved, and therefore the welding robot is assisted to rapidly and accurately determine the actual point position of the welding gun of the welding robot and the welding seam of the workpiece. The positioning and extraction of the welding line of the intelligent robot welding system can be improved, and the high efficiency and the high precision of robot welding are achieved.
Fig. 5 shows a control apparatus of a welding robot according to a third embodiment of the present application, the control apparatus of the welding robot including:
a point cloud data acquisition module 301, configured to acquire original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the end of a welding gun of the welding robot; the calibration block is calibrated with a plurality of angular points, and at least two angular points in the plurality of angular points have different positions in the horizontal direction and the vertical direction;
an identifying module 302, configured to identify a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data, and record a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
the point cloud acquisition module 303 is configured to drive the welding robot to move with the first position as a target position of the tail end of the welding gun, acquire a second position actually reached by the tail end of the welding gun after the movement is completed, and record a second point cloud located at the second position;
a compensation module 304, configured to compensate the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud, so as to obtain a target hand-eye matrix.
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
matching the first point cloud to the second point cloud to obtain the transformation matrix;
sequentially multiplying the inverse matrix of the current pose matrix, the transformation matrix, the current pose matrix and the reference hand-eye matrix to obtain the target hand-eye matrix; the current pose matrix is a pose matrix when the welding robot acquires the original three-dimensional point cloud data of the calibration block, and the pose matrix is a matrix describing the position and the posture of the welding robot.
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data;
in the target three-dimensional point cloud data, a first position of a corner feature of the calibration block is identified.
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
computing the product of the current pose matrix and the reference hand-eye matrix to obtain a target matrix;
and mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the target matrix to obtain the target three-dimensional point cloud data.
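The mapping just described can be sketched as follows (a minimal illustration with hypothetical matrix values; in practice the current pose matrix comes from the robot controller and the reference hand-eye matrix from the prior calibration):

```python
import numpy as np

# Hypothetical 4x4 homogeneous matrices.
currentPose = np.eye(4)
currentPose[:3, 3] = [0.0, 0.0, 0.5]   # tool 0.5 m above the base origin
refHandEye = np.eye(4)
refHandEye[:3, 3] = [0.0, 0.0, 0.1]    # camera offset 0.1 m from the tool

# Target matrix = current pose matrix * reference hand-eye matrix.
targetMatrix = currentPose @ refHandEye

# Map raw camera-frame points (N, 3) into the welding robot base coordinate system.
raw_points = np.array([[0.0, 0.0, 1.0],
                       [0.1, 0.0, 1.0]])
homog = np.hstack([raw_points, np.ones((len(raw_points), 1))])
target_points = (homog @ targetMatrix.T)[:, :3]
```

With these placeholder translations, a point 1 m in front of the camera lands 1.6 m above the base origin, since the camera and tool offsets compose additively along z.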
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
determining a target point pair between the first point cloud and the second point cloud; the target point pair is a correspondence of a point in the first point cloud to a point in the second point cloud;
constructing a rotation and translation matrix through the target point pairs;
transforming the first point cloud to a coordinate system of the second point cloud by using the rotation and translation matrix to obtain a transformed point cloud;
constructing an error function of the transformed point cloud and the second point cloud;
if the error detected according to the error function is smaller than or equal to a preset threshold, determining the transformation matrix based on the rotation and translation matrix;
and if the error detected according to the error function is larger than the preset threshold, updating the first point cloud according to the transformed point cloud, and iterating the rotation and translation matrix between the updated first point cloud and the second point cloud.
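The iteration this module describes — pair points, estimate a rotation/translation, transform, check an error threshold, repeat — reads as the classical ICP (iterative closest point) scheme. Under that assumption, a minimal NumPy sketch (brute-force nearest-neighbour pairing and SVD-based pose estimation; a production system would more likely use a point cloud library):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst,
    where src and dst are corresponding (N, 3) arrays (Kabsch/SVD method)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(src, dst, threshold=1e-6, max_iter=50):
    """Pair each src point with its nearest dst point, estimate the rotation
    and translation, transform src, and iterate until the error is small."""
    src = src.copy()
    T = np.eye(4)
    for _ in range(max_iter):
        dists = np.linalg.norm(src[:, None] - dst[None], axis=2)
        pairs = dst[dists.argmin(axis=1)]           # target point pairs
        R, t = best_rigid_transform(src, pairs)
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                                # accumulate the transform
        src = src @ R.T + t                         # transformed point cloud
        if np.mean(np.linalg.norm(src - pairs, axis=1)) <= threshold:
            break                                    # error within preset threshold
    return T, src

# Demo: recover a known 10 mm x-offset between two 3x3 corner grids.
first = np.array([[float(i), float(j), 0.0] for i in range(3) for j in range(3)])
second = first + np.array([0.01, 0.0, 0.0])
T, aligned = icp(first, second)
```

The returned T plays the role of the transformation matrix Tran used to compensate the hand-eye matrix; the 3x3 grid stands in for the nine calibration block corner points.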
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
based on the target hand-eye matrix, performing weld extraction on three-dimensional camera data of a weld of a workpiece to be welded to obtain the position of the weld of the workpiece to be welded in a three-dimensional space; the three-dimensional camera data is data of the weld collected by the three-dimensional camera;
and welding the workpieces to be welded based on the positions of the welding seams in the three-dimensional space.
In an exemplary embodiment of the present application, the control device of the welding robot is configured to:
and acquiring original three-dimensional point cloud data of one frame of the calibration block through a three-dimensional camera arranged at the tail end of the welding gun of the welding robot.
A welding robot 40 according to the fourth embodiment of the present application is described below with reference to fig. 6. The welding robot 40 shown in fig. 6 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
As shown in fig. 6, the welding robot 40 takes the form of a general-purpose computing device. The components of the welding robot 40 may include, but are not limited to: at least one processing unit 410, at least one storage unit 420, and a bus 430 that couples various system components including the storage unit 420 and the processing unit 410.
Wherein the storage unit stores program code which can be executed by the processing unit 410 such that the processing unit 410 performs the steps according to various exemplary embodiments of the present invention as described in the description part of the above exemplary methods of the present specification. For example, processing unit 410 may perform various steps as shown in fig. 1.
The storage unit 420 may include readable media in the form of volatile storage units, such as a random access memory unit (RAM) 4201 and/or a cache memory unit 4202, and may further include a read only memory unit (ROM) 4203.
The storage unit 420 may also include a program/utility 4204 having a set (at least one) of program modules 4205, such program modules 4205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 430 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
The welding robot 40 can also communicate with one or more devices that enable a user to interact with the welding robot 40, and/or with any device (e.g., router, modem, etc.) that enables the welding robot 40 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 450. An input/output (I/O) interface 450 is connected to the display unit 440. Also, the welding robot 40 can communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 460. As shown, the network adapter 460 communicates with the other modules of the welding robot 40 via the bus 430. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the welding robot 40, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, to name a few.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present application can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (including a welding robot) execute the method according to the embodiments of the present application.
In an exemplary embodiment of the present application, there is also provided a computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the method described in the above method embodiment section.
According to an embodiment of the present application, there is also provided a program product for implementing the method in the above-described method embodiment, which can employ a portable compact disc read only memory (CD-ROM) and include program code, and can be run on a welding robot. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit according to embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
Moreover, although the steps of the methods in this application are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.

Claims (10)

1. A method of controlling a welding robot, comprising:
acquiring original three-dimensional point cloud data of a calibration block through a three-dimensional camera arranged at the tail end of a welding gun of a welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data, and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the movement is finished, and recording second point cloud located at the second position;
and compensating the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix.
2. The method of claim 1, wherein compensating a reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix comprises:
matching the first point cloud to the second point cloud to obtain the transformation matrix;
sequentially multiplying the inverse matrix of the current pose matrix, the transformation matrix, the current pose matrix and the reference hand-eye matrix to obtain the target hand-eye matrix; the current pose matrix is a pose matrix when the welding robot acquires the original three-dimensional point cloud data of the calibration block, and the pose matrix is a matrix describing the position and the posture of the welding robot.
3. The method of claim 2, wherein identifying a first location of a corner feature of the calibration block based on the original three-dimensional point cloud data comprises:
mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data;
in the target three-dimensional point cloud data, a first position of a corner feature of the calibration block is identified.
4. The method of claim 3, wherein mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the current pose matrix and the reference hand-eye matrix to obtain target three-dimensional point cloud data comprises:
computing the product of the current pose matrix and the reference hand-eye matrix to obtain a target matrix;
and mapping the original three-dimensional point cloud data to a welding robot base coordinate system based on the target matrix to obtain the target three-dimensional point cloud data.
5. The method of claim 2, wherein matching the first point cloud to the second point cloud to obtain the transformation matrix comprises:
determining a target point pair between the first point cloud and the second point cloud; the target point pairs are correspondences of points in the first point cloud and points in the second point cloud;
constructing a rotation and translation matrix through the target point pairs;
transforming the first point cloud to a coordinate system of the second point cloud by using the rotation and translation matrix to obtain a transformed point cloud;
constructing an error function of the transformed point cloud and the second point cloud;
if the error detected according to the error function is smaller than or equal to a preset threshold, determining the transformation matrix based on the rotation and translation matrix;
and if the error detected according to the error function is larger than the preset threshold, updating the first point cloud according to the transformed point cloud, and iterating the rotation and translation matrix between the updated first point cloud and the second point cloud.
6. The method of claim 1, wherein after compensating the reference hand-eye matrix to obtain the target hand-eye matrix, the method further comprises:
based on the target hand-eye matrix, performing weld extraction on three-dimensional camera data of a weld of a workpiece to be welded to obtain the position of the weld of the workpiece to be welded in a three-dimensional space; the three-dimensional camera data is data of the weld collected by the three-dimensional camera;
and welding the workpieces to be welded based on the positions of the welding seams in the three-dimensional space.
7. The method of claim 1, wherein acquiring the original three-dimensional point cloud data of the calibration block through a three-dimensional camera arranged at the tail end of the welding gun of the welding robot comprises:
and acquiring original three-dimensional point cloud data of one frame of the calibration block through a three-dimensional camera arranged at the tail end of the welding gun of the welding robot.
8. A control device for a welding robot, comprising:
the point cloud data acquisition module is used for acquiring original three-dimensional point cloud data of the calibration block through a three-dimensional camera arranged at the tail end of a welding gun of the welding robot; the calibration block is calibrated with a plurality of angular points, and the positions of at least two angular points in the plurality of angular points in the horizontal direction and the vertical direction are different;
the identification module is used for identifying a first position of the corner feature of the calibration block based on the original three-dimensional point cloud data and recording a first point cloud located at the first position; the corner feature is a feature of a corner of the calibration block;
the point cloud acquisition module is used for driving the welding robot to move by taking the first position as a target position of the tail end of the welding gun, acquiring a second position actually reached by the tail end of the welding gun after the welding robot finishes moving, and recording second point cloud located at the second position;
and the compensation module is used for compensating the reference hand-eye matrix based on a transformation matrix obtained by matching the first point cloud and the second point cloud to obtain a target hand-eye matrix.
9. A welding robot, comprising:
one or more processors;
storage means for storing one or more programs that, when executed by the one or more processors, cause the welding robot to implement the method of any one of claims 1 to 7.
10. A computer-readable storage medium having stored thereon computer-readable instructions which, when executed by a processor of a computer, cause the computer to perform the method of any one of claims 1 to 7.
CN202211057960.2A 2022-08-31 2022-08-31 Welding robot control method, welding robot control device, welding robot, and storage medium Pending CN115351482A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211057960.2A CN115351482A (en) 2022-08-31 2022-08-31 Welding robot control method, welding robot control device, welding robot, and storage medium


Publications (1)

Publication Number Publication Date
CN115351482A true CN115351482A (en) 2022-11-18

Family

ID=84004718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211057960.2A Pending CN115351482A (en) 2022-08-31 2022-08-31 Welding robot control method, welding robot control device, welding robot, and storage medium

Country Status (1)

Country Link
CN (1) CN115351482A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115990890A (en) * 2023-03-23 2023-04-21 深圳广成创新技术有限公司 Calibration method and device for manipulator, computer equipment and storage medium
CN117381798A (en) * 2023-12-11 2024-01-12 法奥意威(苏州)机器人系统有限公司 Hand-eye calibration method and device
CN117381798B (en) * 2023-12-11 2024-04-12 法奥意威(苏州)机器人系统有限公司 Hand-eye calibration method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination