CN117584121A - Welding robot path planning method based on point cloud scene understanding - Google Patents

Welding robot path planning method based on point cloud scene understanding

Info

Publication number
CN117584121A
CN117584121A CN202311535958.6A CN202311535958A CN117584121A CN 117584121 A CN117584121 A CN 117584121A CN 202311535958 A CN202311535958 A CN 202311535958A CN 117584121 A CN117584121 A CN 117584121A
Authority
CN
China
Prior art keywords
welding
robot
point cloud
global
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311535958.6A
Other languages
Chinese (zh)
Inventor
刘今越
李怡隆
武宇森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Technology
Original Assignee
Hebei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Technology filed Critical Hebei University of Technology
Priority to CN202311535958.6A priority Critical patent/CN117584121A/en
Publication of CN117584121A publication Critical patent/CN117584121A/en
Pending legal-status Critical Current

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 - Programme-controlled manipulators
    • B25J9/16 - Programme controls
    • B25J9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 - Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B23 - MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K - SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 - Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K37/02 - Carriages for supporting the welding or cutting element

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)
  • Laser Beam Processing (AREA)
  • Numerical Control (AREA)

Abstract

The invention provides a welding robot path planning method based on point cloud scene understanding, which comprises the following steps: step one, performing global three-dimensional reconstruction with a global vision sensor to obtain a point cloud of the workpiece to be welded; step two, denoising, filtering, semantically segmenting the obtained point cloud and extracting its features to obtain the global welding trajectory of the workpiece; step three, planning the welding strategy according to the obtained global welding trajectory and adjusting the robot's welding posture; step four, segmenting the welding path based on a line laser tracking model; and step five, guiding the robot to the initial welding point according to the global welding strategy, accurately identifying the weld seam features with a laser vision sensor to obtain the final welding path, smoothing that path in real time, and fusing it with the robot welding poses from the strategy of step three so that the robot completes the welding operation.

Description

Welding robot path planning method based on point cloud scene understanding
Technical Field
The invention relates to the technical field of welding robots, in particular to a welding robot path planning method based on point cloud scene understanding.
Background
With the continued development of intelligent manufacturing, welding robots have been widely adopted in industrial production for their efficiency, accuracy and stability, greatly improving productivity. However, most industrial welding robots are still programmed by traditional manual teaching. This approach places high demands on operator skill, is cumbersome, requires lengthy repeated teaching, and adapts poorly when the workpiece or other conditions change.
In current welding robot research, autonomous robot programming systems effectively address the low efficiency and large errors of manual teaching: offline programming can complete the programming of welding tasks well and already plays a useful role in the welding field. Nevertheless, complex spatially curved welds and highly adaptive welding remain difficult. To enhance the robot's perception during welding, improving its external sensing is an effective approach; a laser vision sensor built around line structured light can accurately track and extract weld seams. However, such a laser sensor mainly extracts the precise weld position from an image; it provides no effective reference for robot posture planning and lacks perception of the workpiece's global three-dimensional features. Global three-dimensional point cloud reconstruction with additional sensors therefore has to be considered. Trajectory planning of the robot's working pose from a spatial three-dimensional point cloud lays a foundation for applying point clouds to welding. Although these methods provide some basis for autonomous welding operation, extraction of weld target features and global perception of the welding trajectory, they still suffer from single operation scenarios, manual teaching of weld start and end points, and a low degree of intelligence.
Disclosure of Invention
This section is intended to outline some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. Some simplifications or omissions may be made in this section, in the abstract and in the title of the application to avoid obscuring their purpose; such simplifications and omissions should not be used to limit the scope of the invention.
To address these shortcomings, the invention provides a welding robot path planning strategy based on multi-sensor fusion, namely a welding robot path planning method based on point cloud scene understanding. Because the welding trajectory extracted from the globally reconstructed point cloud is not accurate enough on its own, an accurate welding trajectory is obtained with local precise visual sensing, while the welding path is error-compensated with the global planning information to guarantee welding quality during the welding process.
The technical solution adopted to solve the technical problem is as follows: a welding robot path planning method based on point cloud scene understanding, comprising the following steps: step one, performing global three-dimensional reconstruction with a global vision sensor to obtain a point cloud of the workpiece to be welded; step two, denoising, filtering, semantically segmenting the obtained point cloud and extracting its features to obtain the global welding trajectory of the workpiece; step three, planning the welding strategy according to the obtained global welding trajectory and adjusting the robot's welding posture; step four, segmenting the welding path based on a line laser tracking model; and step five, guiding the robot to the initial welding point according to the global welding strategy, accurately identifying the weld seam features with a laser vision sensor to obtain the final welding path, smoothing that path in real time, and fusing it with the robot welding poses from the strategy of step three so that the robot completes the welding operation.
Further, step one includes: for path planning of a welding robot in an industrial scene, a global vision camera is mounted on the robot and used to three-dimensionally reconstruct the workpiece to be welded in the operation scene. The coordinate relationship between the camera and the robot is calibrated to obtain the camera-to-robot transformation, so that the point cloud obtained from the three-dimensional reconstruction is converted into the robot coordinate system.
Further, step two includes: processing the point cloud data obtained from the three-dimensional reconstruction. Because of camera and lighting limitations, the reconstructed cloud is large and contains many stray points. To address this, the point cloud is first down-sampled with a mean filtering method and noise points are then removed with an outlier-removal method, yielding a better-quality weldment point cloud. Feature extraction and semantic segmentation are then performed on the point cloud to highlight the geometric information and features it contains; by understanding the various semantic features, the weld trajectory features of the point cloud and the global welding trajectory are obtained.
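By way of illustration only, the following is a minimal preprocessing sketch assuming the Open3D library, where voxel down-sampling (which averages the points falling into each voxel) stands in for the mean filtering described above and statistical outlier removal stands in for the outlier-removal step; the file name and all parameter values are assumptions, not values prescribed by the invention.

```python
import open3d as o3d

pcd = o3d.io.read_point_cloud("weldment_scan.pcd")   # reconstructed weldment cloud

# Down-sample the dense reconstruction; voxel down-sampling averages the points
# inside each voxel, playing the role of the mean filtering described above.
pcd_down = pcd.voxel_down_sample(voxel_size=0.002)    # 2 mm voxels (assumed)

# Remove stray points that lie far from their neighborhood (outlier removal).
pcd_clean, kept_idx = pcd_down.remove_statistical_outlier(nb_neighbors=20,
                                                          std_ratio=2.0)
```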
Further, step three includes: global welding strategy planning based on the global welding trajectory features obtained in step two. Because the mounting pose and tracking range of the local laser sensor interact with the workpiece size and the curvature of the weld curve, a global welding strategy for the robot must be determined, for example whether the posture has to change during welding and whether multi-segment welding is required. The global welding pose therefore needs strategic planning to ensure that the robot's path planning is reasonable. This consists of two parts. The first part is welding posture adjustment: during welding, a certain angle must be kept between the torch's direction of motion and the tangent of the weld seam, and posture changes must remain smooth and stable. The posture is adjusted with the equivalent angle-axis representation: three consecutive feature points are taken as calculation points, the equivalent axis-angle relation at the middle feature point is computed with the equivalent rotation matrix formula, and the result is evaluated through the robot's forward kinematics. This defines a real-time adjustment method for the torch position and posture during welding and completes the real-time position and posture transformation.
Further, step four includes: the second part of global path planning, welding path segmentation based on a line laser tracking model. The precise weld position during welding must be obtained by laser seam tracking, but the sensor's recognition range is limited; because of the sensor's mounting position and limited recognition range, it may lose the tracking target during tracking, or the torch posture may have to change abruptly, so that seam tracking cannot continue. Therefore, using the relation between the laser sensor's light plane and the global welding path, whether the weld trajectory can be tracked smoothly is determined by solving the corresponding equations; this defines whether multi-segment welding is required and guarantees welding quality.
Further, step five includes: an instruction is sent to guide the robot to the initial welding position according to the obtained global welding strategy. The designed laser sensor then recognizes the weld seam features to obtain accurate seam feature points, which are converted into the robot coordinate system with the sensor calibration relation to yield the final welding path. To keep the welding trajectory smooth and stable, as welding path points are generated in real time a path curve is fitted with the direction-vector projection method proposed by the invention and the welding path is smoothed. After the welding path is obtained, the robot pose information from the global planning strategy is aligned and fused with the welding path information obtained in real time; the welding robot pose is thus obtained in real time and sent to the robot to complete the welding operation.
The welding robot path planning method based on point cloud scene understanding has the following beneficial effects. A global vision sensor perceives the environment and acquires weldment point cloud information. Through environment understanding and global posture planning, the acquired weldment point cloud is processed to obtain rough global weld position information and global robot posture information, realizing global planning of the robot operation. The weld position is then accurately identified by the laser sensor, the extracted welding path is smoothed with a path smoothing and posture transformation algorithm, and real-time posture changes during welding are realized in combination with the robot's global posture information, ensuring stability during the operation and enabling autonomous operation of the welding robot. This solves the problem that, in the current welding robot field, the robot cannot perform autonomous teaching and programming without a model, improves welding efficiency, and lets the welding robot adapt to a variety of unstructured welding requirements.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings used in their description are briefly introduced below. The drawings described below are obviously only some embodiments of the invention; a person skilled in the art could obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of the system of the present invention;
FIG. 3 is a schematic diagram of a working scenario of the present invention;
FIG. 4 is a schematic view of a weldment point cloud reconstruction of the present invention;
FIG. 5 is a schematic view of feature segmentation in accordance with the present invention;
FIG. 6 is a schematic diagram of a weld trace extraction in accordance with the present invention;
FIG. 7 is a schematic diagram of curvilinear welding of the present invention;
FIG. 8 is a schematic view of a weld trace point according to the present invention;
FIG. 9 is a schematic representation of an equivalent angle-axis representation of the present invention;
FIG. 10 is a schematic diagram of the identification range of the laser sensor according to the present invention;
FIG. 11 is a schematic diagram of the laser tracking mutation situation of the present invention;
FIG. 12 is a schematic diagram of laser tracking according to the present invention;
FIG. 13 is a schematic view of an index space according to the present invention;
FIG. 14 is a graph showing the calculation results of the present invention;
FIG. 15 is a flow chart of the real-time trajectory generation of the present invention;
FIG. 16 is a flow chart of weld feature extraction in accordance with the present invention;
FIG. 17 is an experimental view of a curved riser weldment of the present invention;
fig. 18 is an experimental view of a straight riser weldment of the present invention.
Detailed Description
Referring to fig. 1 and 2, a welding robot path planning method based on point cloud scene understanding includes the following steps: step one, performing global three-dimensional reconstruction with a global vision sensor to obtain a point cloud of the workpiece to be welded; step two, denoising, filtering, semantically segmenting the obtained point cloud and extracting its features to obtain the global welding trajectory of the workpiece; step three, planning the welding strategy according to the obtained global welding trajectory and adjusting the robot's welding posture; step four, segmenting the welding path based on a line laser tracking model; and step five, guiding the robot to the initial welding point according to the global welding strategy, accurately identifying the weld seam features with a laser vision sensor to obtain the final welding path, smoothing that path in real time, and fusing it with the robot welding poses from the strategy of step three so that the robot completes the welding operation.
Referring to fig. 3 and 4, step one includes: a global vision camera is mounted on the robot and used to three-dimensionally reconstruct the workpiece to be welded in the operation scene. The coordinate relationship between the camera and the robot is calibrated to obtain the camera-to-robot transformation, so that the point cloud obtained from the three-dimensional reconstruction is converted into the robot coordinate system.
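A minimal sketch of this conversion, assuming the Open3D library and assuming that the camera-to-robot-base homogeneous transform has already been obtained by hand-eye calibration; the file name and the numeric transform below are placeholders.

```python
import numpy as np
import open3d as o3d

# 4x4 homogeneous camera-to-robot-base transform obtained beforehand by
# hand-eye calibration; the numbers below are placeholders, not calibrated values.
T_base_cam = np.array([
    [1.0, 0.0, 0.0, 0.30],
    [0.0, 1.0, 0.0, 0.00],
    [0.0, 0.0, 1.0, 0.50],
    [0.0, 0.0, 0.0, 1.00],
])

# Reconstructed weldment cloud, still expressed in the camera frame.
pcd = o3d.io.read_point_cloud("weldment_scan.pcd")
pcd.transform(T_base_cam)   # the point cloud is now in the robot base coordinate system
```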
Referring to fig. 5 and 6, step two includes: semantic segmentation of the point cloud data obtained from the three-dimensional reconstruction highlights the geometric information and features it contains; for weldments, the extracted semantic information helps to understand the weldment model and extract its welding trajectory. For example, a straight weld can be understood as the intersection line of several spatial planes of the weldment; a general spatially curved weld as the intersection (a higher-degree B-spline curve) of a spatial plane and a spatial curved surface of the weldment; a circular-arc weld as the intersection line of a spatial cylindrical surface and a spatial plane; and an intersecting-line weld as the intersection of two spatial cylindrical surfaces. By understanding these semantic features, the welding trajectory features can be obtained. Feature segmentation of the point cloud combines RANSAC (random sample consensus) feature extraction with cluster-based segmentation: RANSAC feature extraction is iterated several times to segment point clouds with distinct geometric features, such as planar, cylindrical and spherical features; the remaining weldment points, belonging to other complex curved surfaces, are then separated with a clustering segmentation method, completing the weldment point cloud segmentation process. The segmented point cloud is shown in fig. 5 and 6.
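A minimal sketch of this segmentation stage, assuming the Open3D library: RANSAC plane extraction stands in for the general primitive extraction described above (cylindrical and spherical features would need additional RANSAC models), DBSCAN clustering stands in for the cluster-based segmentation, and every threshold is an illustrative placeholder.

```python
import numpy as np
import open3d as o3d

def segment_weldment(pcd, plane_dist=0.003, max_planes=4, min_plane_pts=500):
    """Peel off dominant planes with RANSAC, then cluster the remaining surfaces."""
    planes, rest = [], pcd
    for _ in range(max_planes):
        if len(rest.points) < min_plane_pts:
            break
        model, inliers = rest.segment_plane(distance_threshold=plane_dist,
                                            ransac_n=3, num_iterations=1000)
        if len(inliers) < min_plane_pts:        # no strong planar feature left
            break
        planes.append(rest.select_by_index(inliers))
        rest = rest.select_by_index(inliers, invert=True)
    # Remaining complex curved surfaces are separated by Euclidean (DBSCAN) clustering.
    labels = np.asarray(rest.cluster_dbscan(eps=0.005, min_points=30))
    n_clusters = int(labels.max()) + 1 if labels.size else 0
    clusters = [rest.select_by_index(np.where(labels == i)[0])
                for i in range(n_clusters)]
    return planes, clusters
```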
Referring to fig. 7, 8 and 9, step three includes: the welding process requires a certain angle to be kept between the torch's direction of motion and the tangent of the weld seam, so the torch posture must be adjusted when facing a curved weld. While adjusting the torch posture, posture changes must remain smooth and stable; the posture is adjusted with the equivalent angle-axis representation. Three consecutive feature points are taken as calculation points, the equivalent axis-angle relation at the middle feature point is computed with the equivalent rotation matrix formula, and the result is evaluated through the robot's forward kinematics, defining a real-time adjustment method for the torch pose during welding and completing the real-time position and posture transformation.
As shown in fig. 8, let $p_1$, $p_2$, $p_3$ be three consecutive welding track points with coordinates $(x_1, y_1, z_1)$, $(x_2, y_2, z_2)$ and $(x_3, y_3, z_3)$. The three points determine two direction vectors $\vec{v}_1 = p_2 - p_1$ and $\vec{v}_2 = p_3 - p_2$, and their cross product gives

$$\vec{K} = \vec{v}_1 \times \vec{v}_2$$

where the vector $\vec{K}$ is the equivalent axis of the finite rotation of the robot end pose from vector $\vec{v}_1$ to vector $\vec{v}_2$.
As shown in fig. 9, coordinate system {A} is the tool coordinate system with the robot oriented along the $\vec{v}_1$ direction, coordinate system {B} is the tool coordinate system with the robot oriented along the $\vec{v}_2$ direction, and the robot's motion from the $\vec{v}_1$ direction to the $\vec{v}_2$ direction is represented by the equivalent angle-axis matrix $R_K(\theta)$:

$$R_K(\theta) = \begin{bmatrix} k_x^2 v\theta + c\theta & k_x k_y v\theta - k_z s\theta & k_x k_z v\theta + k_y s\theta \\ k_x k_y v\theta + k_z s\theta & k_y^2 v\theta + c\theta & k_y k_z v\theta - k_x s\theta \\ k_x k_z v\theta - k_y s\theta & k_y k_z v\theta + k_x s\theta & k_z^2 v\theta + c\theta \end{bmatrix}$$

where $c\theta = \cos\theta$, $s\theta = \sin\theta$, $v\theta = 1 - \cos\theta$, $(k_x, k_y, k_z)$ is the unit vector along $\vec{K}$, and the sign of $\theta$ is determined by the right-hand rule. Three consecutive welding points thus yield the intermediate posture transformation matrix, and a complete set of welding postures is finally obtained from these matrices, providing more information for the subsequent global trajectory planning.
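A compact numerical sketch of this posture step, under the assumption that the rotation axis is the normalized cross product $\vec{K}$ above and the rotation angle is the angle between the two segment directions; SciPy's rotation utilities are used to build $R_K(\theta)$, and the function and variable names are illustrative.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose_step_from_points(p1, p2, p3):
    """Equivalent angle-axis rotation R_K(theta) taking direction v1 = p2 - p1 to v2 = p3 - p2."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    v1, v2 = p2 - p1, p3 - p2
    k = np.cross(v1, v2)                       # equivalent rotation axis K = v1 x v2
    k_norm = np.linalg.norm(k)
    if k_norm < 1e-9:                          # collinear points: no posture change needed
        return np.eye(3)
    k /= k_norm
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    theta = np.arccos(np.clip(cos_t, -1.0, 1.0))
    return Rotation.from_rotvec(theta * k).as_matrix()
```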
Referring to fig. 10, 11, 12, 13 and 14, step four includes: during welding, the precise weld position must be obtained by laser seam tracking. By the working principle of the line laser sensor, its recognition range is limited, as shown in fig. 10. Taking a box-type weldment as an example, when the corner fillet of the box is too small, the sensor may lose the tracking target during tracking, or the torch posture may have to change abruptly, because of the sensor's mounting position and limited recognition range, so that seam tracking cannot continue, as shown in fig. 11.
Because the laser sensor has a limited recognition range, in case a the feature point can still be recognized at point p2 while the torch moves from p1 to p2, the laser sensor can keep tracking the seam, and the box's welding trajectory can be treated as one continuous welding path. In case b the feature point cannot be recognized at p2 while the torch moves from p1 to p2, the laser sensor can no longer track the seam, and the box's welding trajectory must be divided into a multi-segment welding path to guarantee welding quality.
It is therefore necessary to examine the global weld trajectory; the laser tracking process is shown in fig. 12. Point O in the figure is the origin of the torch-end TCP coordinate system, and the quadrilateral ABCD is the recognizable range of the laser sensor's light plane. From the transformation between the laser sensor and the torch-end TCP, the laser sensor's weld-recognition condition is satisfied when the welding trajectory intersects the light plane. Let the pose matrix of the current torch TCP point O be ${}^{B}T_{O}$, let the transformation from the laser light plane to the torch TCP be ${}^{O}T_{L}$, and let the four corners A, B, C, D have coordinates $(x_i, y_i, z_i)$ in the light plane. Then, at the current TCP pose, the coordinates $(X_i, Y_i, Z_i)$ of A, B, C, D in the base coordinate system are

$$[X_i \;\; Y_i \;\; Z_i \;\; 1]^{T} = {}^{B}T_{O}\,{}^{O}T_{L}\,[x_i \;\; y_i \;\; z_i \;\; 1]^{T}$$

From these four points, the equation of the light plane ABCD in the base coordinate system {B} can be computed:

$$A_0 X + B_0 Y + C_0 Z + D_0 = 0 \qquad (5)$$

Referring to fig. 13, an index space S is defined in the base coordinate system {B} using the laser plane ABCD; the set of all welding track points inside S is {MN}. Fitting a space curve to the point set {MN} gives the parametric equation of the curve MN:

$$x = x(t), \quad y = y(t), \quad z = z(t) \qquad (6)$$

Combining formulas (5) and (6) gives

$$A_0\,x(t) + B_0\,y(t) + C_0\,z(t) + D_0 = 0$$

whose solutions determine the intersection of the curve MN with the plane ABCD.

If the curve MN and the plane ABCD have an intersection point P(x, y, z) and P ∈ ABCD, the current torch-end TCP pose is reasonable and the tracking trajectory is continuous, as shown in fig. 14 a).

If the curve MN and the plane ABCD have an intersection point P(x, y, z) but P ∉ ABCD, or if no intersection point exists, the current torch-end TCP pose is unreasonable, the tracking trajectory cannot be smooth and continuous, and multi-segment welding planning is required, as shown in fig. 14 b) and c).
In summary, global welding strategy planning is carried out for the welding robot with the global information obtained from environment perception, in order to verify the feasibility of the computed welding postures; at the same time, regions where laser tracking may become unreasonable during tracking are handled as separate segments, ensuring stable and continuous seam tracking by the welding robot in actual welding operations.
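The continuity check described above can be sketched numerically as follows, under several simplifying assumptions: the two transforms, the light-plane corner coordinates and the sampled global path are inputs obtained elsewhere, and the point-in-quadrilateral test is reduced to a bounding box in the plane's local 2D basis; all names are illustrative.

```python
import numpy as np

def tracking_is_continuous(T_base_tcp, T_tcp_laser, quad_xyz, path_points, tol=1e-3):
    """Check whether the global weld path crosses the laser light-plane quad ABCD.

    T_base_tcp, T_tcp_laser : 4x4 homogeneous transforms (base<-TCP, TCP<-light plane).
    quad_xyz   : (4, 3) corners A, B, C, D expressed in the light-plane frame.
    path_points: (n, 3) sampled global weld trajectory in the base frame.
    """
    quad_l = np.asarray(quad_xyz, dtype=float)
    pts = np.asarray(path_points, dtype=float)
    T = T_base_tcp @ T_tcp_laser
    quad = (T @ np.c_[quad_l, np.ones(4)].T).T[:, :3]        # corners in base frame
    normal = np.cross(quad[1] - quad[0], quad[2] - quad[0])   # plane coefficients A0, B0, C0
    d0 = -normal.dot(quad[0])                                 # plane offset D0
    signed = pts @ normal + d0                                # signed side of each path point
    crossing = np.where(np.sign(signed[:-1]) * np.sign(signed[1:]) <= 0)[0]
    if crossing.size == 0:
        return False                       # no intersection: plan multi-segment welding
    i = crossing[0]
    t = signed[i] / (signed[i] - signed[i + 1] + 1e-12)
    p = pts[i] + t * (pts[i + 1] - pts[i])                    # intersection point P
    # Point-in-quadrilateral test, simplified to a bounding box in the plane's 2D basis.
    u = quad[1] - quad[0]
    u /= np.linalg.norm(u)
    w = np.cross(normal / np.linalg.norm(normal), u)
    pu, pv = (p - quad[0]) @ u, (p - quad[0]) @ w
    qu, qv = (quad - quad[0]) @ u, (quad - quad[0]) @ w
    return bool(qu.min() - tol <= pu <= qu.max() + tol and
                qv.min() - tol <= pv <= qv.max() + tol)
```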
Referring to fig. 15 and 16, step five includes: an instruction is sent to guide the robot to the initial welding position according to the obtained global welding strategy. The designed laser sensor then recognizes the weld seam features to obtain accurate seam feature points, which are converted into the robot coordinate system with the sensor calibration relation to yield the final welding path. To keep the welding trajectory smooth and stable, as welding path points are generated in real time a path curve is fitted with the direction-vector projection method proposed by the invention and the welding path is smoothed. The specific processing flow is shown in fig. 15.
After the robot reaches the initial welding position, the weld position extracted globally still deviates considerably because of the limited accuracy of the global camera, so the laser sensor must accurately identify the weld trajectory to keep the welding operation stable. The workflow is as follows: first, raw weld groove data are acquired in real time by the laser sensor; the raw groove contour is then obtained through image processing, smoothing filtering and similar operations; finally, accurate data are obtained by detecting the weld feature points. The specific flow is shown in fig. 16.
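As a rough illustration of this workflow only (not the specific feature detection of fig. 16), the sketch below assumes OpenCV, locates the laser stripe by a column-wise grey centroid, and takes the groove feature point as the largest deviation of the centerline from a straight baseline; this simplification suits a plain V-groove, and all thresholds are placeholders.

```python
import cv2
import numpy as np

def groove_feature_point(img_gray):
    """Locate the laser stripe centerline and return a groove corner (feature) point."""
    blur = cv2.GaussianBlur(img_gray, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    rows = np.arange(img_gray.shape[0], dtype=np.float64)
    cols, centers = [], []
    for c in range(img_gray.shape[1]):
        w = blur[:, c].astype(np.float64) * (mask[:, c] > 0)
        if w.sum() > 0:                      # column-wise grey centroid of the stripe
            cols.append(c)
            centers.append((rows * w).sum() / w.sum())
    cols, centers = np.asarray(cols, dtype=float), np.asarray(centers)
    if cols.size < 2:
        return None                          # no usable stripe in this frame
    # Fit a straight baseline to the centerline; the feature point is taken
    # where the measured centerline deviates most from that baseline.
    a, b = np.polyfit(cols, centers, 1)
    i = int(np.argmax(np.abs(centers - (a * cols + b))))
    return float(cols[i]), float(centers[i])  # pixel coordinates of the feature point
```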
After the weld trajectory is obtained from the laser sensor, isolated noise points still appear in it because of workpiece machining tolerances, surface defects, groove defects and similar factors, and these seriously affect the robot's welding quality, so the weld trajectory points must be smoothed and denoised.
During seam tracking, smoothing and denoising require an expression for the welding trajectory curve; since the welding trajectory is an uncertain spatial curve, it must be fitted quickly in real time to obtain that expression. The laser sensor scans, in sequence, the points $(x_1, y_1, z_1), (x_2, y_2, z_2), (x_3, y_3, z_3), \ldots, (x_n, y_n, z_n)$, and the spatial curve to be fitted is defined as

$$F(x, y) = m_0 + m_1 x + m_2 x^2 + \cdots + m_k x^k + n_0 + n_1 y + n_2 y^2 + \cdots + n_k y^k = z \qquad (7)$$

Fitting a space curve directly to the discrete points involves heavy computation and long run times and is unsuited to real-time tracking. The spatial three-dimensional curve problem is therefore reduced to planar two-dimensional curve problems with a direction projection method: the spatial curve F(x, y, z) is projected onto two specific planes chosen from the direction in which the curve runs; the two projected curves determine the spatial curve uniquely, so combining them immediately recovers the required three-dimensional curve.

The sampling frequency of the laser sensor is very high, so the welding track acquired within a given time window is not a complex curve. The local welding track is therefore approximated as the intersection of two cubic surfaces; that is, both two-dimensional curves obtained by directional projection are cubic curves.
When defining the projection direction of the welding track, the unit vector from the first point $P_0$ to the last point $P_n$ of the group of welding track points returned by the real-time window function is taken as the direction of motion of the welding path:

$$\vec{v} = \frac{P_n - P_0}{\lVert P_n - P_0 \rVert}$$

By judging the three components of the vector $\vec{v}$ in the coordinate system, two of the three coordinate planes of x, y and z are selected for projection, giving the two cubic surfaces to be fitted, as sketched below.
For the curve fitting itself, Gaussian fitting, the least squares method and cubic B-spline fitting were compared: Gaussian fitting is computationally expensive and slow, the least squares method fits scattered data points poorly, while the curve produced by cubic B-spline fitting matches the data well and achieves a good fitting effect. The projected curves are therefore fitted with a cubic B-spline; the parametric expression of a single cubic B-spline segment is

$$P(t) = \frac{1}{6}\begin{bmatrix} t^3 & t^2 & t & 1 \end{bmatrix}\begin{bmatrix} -1 & 3 & -3 & 1 \\ 3 & -6 & 3 & 0 \\ -3 & 0 & 3 & 0 \\ 1 & 4 & 1 & 0 \end{bmatrix}\begin{bmatrix} P_i \\ P_{i+1} \\ P_{i+2} \\ P_{i+3} \end{bmatrix}, \qquad t \in [0, 1]$$

where $P_i$, $P_{i+1}$, $P_{i+2}$, $P_{i+3}$ are four consecutive control points and $t$ is the curve parameter.
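As a practical illustration of this smoothing step, the sketch below fits a smoothing cubic B-spline through the tracked weld points using SciPy's parametric spline routines rather than assembling the single-segment basis matrix by hand; the smoothing factor, output point count and function name are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def smooth_weld_track(points, smoothing=1e-6, n_out=200):
    """Smooth a noisy weld track with a parametric cubic (k=3) B-spline fit."""
    pts = np.asarray(points, dtype=float)
    tck, _ = splprep([pts[:, 0], pts[:, 1], pts[:, 2]], s=smoothing, k=3)
    u = np.linspace(0.0, 1.0, n_out)
    x, y, z = splev(u, tck)
    return np.column_stack([x, y, z])          # smoothed welding path points
```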
and obtaining a smoothed welding track through the fitted B-spline curve, and fusing the smoothed welding track with a welding gesture obtained during global planning to obtain a final welding gesture.
Referring to fig. 17 and 18, experiments verify that the welding robot path planning method based on point cloud scene understanding can be applied to non-standardized welding scenarios, achieves automatic recognition and automatic welding planning, and improves the efficiency of welding operations.
The foregoing description covers only specific process flows and embodiments of the invention and is not intended to limit its scope; equivalent substitutions of components or algorithmic control terms, and equivalent changes and modifications within the scope of the invention, still fall within the coverage of this patent. In addition, the technical features and technical solutions described herein may be freely combined.

Claims (9)

1. A welding robot path planning method based on point cloud scene understanding, characterized in that it comprises the following steps: step one, performing global three-dimensional reconstruction with a global vision sensor to obtain a point cloud of the workpiece to be welded; step two, denoising, filtering, semantically segmenting the obtained point cloud and extracting its features to obtain the global welding trajectory of the workpiece; step three, planning the welding strategy according to the obtained global welding trajectory and adjusting the robot's welding posture; step four, segmenting the welding path based on a line laser tracking model; and step five, guiding the robot to the initial welding point according to the global welding strategy, accurately identifying the weld seam features with a laser vision sensor to obtain the final welding path, smoothing that path in real time, and fusing it with the robot welding poses from the strategy of step three so that the robot completes the welding operation.
2. The welding robot path planning method based on point cloud scene understanding as recited in claim 1, characterized in that in step one: a global vision camera is mounted on the robot and used to three-dimensionally reconstruct the workpiece to be welded in the operation scene; the coordinate relationship between the camera and the robot is calibrated to obtain the camera-to-robot transformation, so that the point cloud obtained from the three-dimensional reconstruction is converted into the robot coordinate system.
3. The welding robot path planning method based on point cloud scene understanding as recited in claim 1, characterized in that in step two: the point cloud data obtained from the three-dimensional reconstruction in step one are processed; because of camera and lighting limitations, the reconstructed cloud is large and contains many stray points; to address this, the point cloud is first down-sampled with a mean filtering method and noise points are then removed with an outlier-removal method, yielding a better-quality weldment point cloud; feature extraction and semantic segmentation are then performed on the point cloud to highlight the geometric information and features it contains, and by understanding the various semantic features, the weld trajectory features of the point cloud and the global welding trajectory features are obtained.
4. The welding robot path planning method based on point cloud scene understanding as recited in claim 1, characterized in that step three performs global welding strategy planning based on the global welding trajectory features obtained in step two; because the mounting pose and tracking range of the local laser sensor interact with the workpiece size and the curvature of the weld curve, a global welding strategy for the robot must be determined, for example whether the posture has to change during welding and whether multi-segment welding is required, so the global welding pose needs strategic planning to ensure that the robot's path planning is reasonable; this consists of two parts, the first of which is welding posture adjustment: during welding a certain angle must be kept between the torch's direction of motion and the tangent of the weld seam, posture changes must remain smooth and stable, and the posture is adjusted with the equivalent angle-axis representation, taking three consecutive feature points as calculation points, computing the equivalent axis-angle relation at the middle feature point with the equivalent rotation matrix formula, and evaluating it through the robot's forward kinematics, thereby defining a real-time adjustment method for the torch position and posture during welding and completing the real-time position and posture transformation.
5. The welding robot path planning method based on point cloud scene understanding as recited in claim 4, characterized in that in step three, the welding process requires a certain angle to be kept between the torch's direction of motion and the tangent of the weld seam, so the torch posture must be adjusted when facing a curved weld; while adjusting the torch posture, posture changes must remain smooth and stable, the posture is adjusted with the equivalent angle-axis representation, and the core transformation matrix is the equivalent angle-axis matrix $R_K(\theta)$.
6. The welding robot path planning method based on point cloud scene understanding as recited in claim 1, characterized in that in step four, the second part of global path planning is welding path segmentation based on a line laser tracking model; the precise weld position during welding must be obtained by laser seam tracking, but the sensor's recognition range is limited, and because of the sensor's mounting position and limited recognition range it may lose the tracking target during tracking or the torch posture may have to change abruptly, so that seam tracking cannot continue.
7. The welding robot path planning method based on point cloud scene understanding as recited in claim 6, characterized in that step four performs global welding strategy planning for the welding robot with the global information obtained from environment perception in order to verify the feasibility of the computed welding postures, and at the same time handles regions where laser tracking may become unreasonable during tracking as separate segments, ensuring stable and continuous seam tracking by the welding robot in actual welding operations.
8. The welding robot path planning method based on point cloud scene understanding as recited in claim 1, characterized in that in step five an instruction is sent to guide the robot to the initial welding position according to the obtained global welding strategy; the designed laser sensor then recognizes the weld seam features to obtain accurate seam feature points, which are converted into the robot coordinate system with the sensor calibration relation to yield the final welding path; and to keep the welding trajectory smooth and stable, as welding path points are generated in real time a path curve is fitted with the direction-vector projection method proposed by the invention and the welding path is smoothed.
9. The welding robot path planning method based on point cloud scene understanding as recited in claim 8, characterized in that in step five, after the welding path is obtained, the robot posture information from the global planning strategy is aligned and fused with the welding path information obtained in real time, and the welding robot pose is finally obtained in real time and sent to the robot to complete the welding operation.
CN202311535958.6A 2023-11-17 2023-11-17 Welding robot path planning method based on point cloud scene understanding Pending CN117584121A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311535958.6A CN117584121A (en) 2023-11-17 2023-11-17 Welding robot path planning method based on point cloud scene understanding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311535958.6A CN117584121A (en) 2023-11-17 2023-11-17 Welding robot path planning method based on point cloud scene understanding

Publications (1)

Publication Number Publication Date
CN117584121A true CN117584121A (en) 2024-02-23

Family

ID=89917496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311535958.6A Pending CN117584121A (en) 2023-11-17 2023-11-17 Welding robot path planning method based on point cloud scene understanding

Country Status (1)

Country Link
CN (1) CN117584121A (en)

Similar Documents

Publication Publication Date Title
CN110227876B (en) Robot welding path autonomous planning method based on 3D point cloud data
CN111805051B (en) Groove cutting method, device, electronic equipment and system
CN112518072B (en) Spatial intersecting curve weld joint structure modeling method based on line structure light vision
Hou et al. A teaching-free welding method based on laser visual sensing system in robotic GMAW
CN113798634B (en) Method, system and equipment for teaching spatial circular weld and tracking weld
CN112288707A (en) Robot weld polishing algorithm based on feature point recognition
Zhou et al. Intelligent guidance programming of welding robot for 3D curved welding seam
Shah et al. A review paper on vision based identification, detection and tracking of weld seams path in welding robot environment
Geng et al. A novel welding path planning method based on point cloud for robotic welding of impeller blades
Ma et al. A fast and robust seam tracking method for spatial circular weld based on laser visual sensor
Ma et al. An efficient and robust complex weld seam feature point extraction method for seam tracking and posture adjustment
Wu et al. A teaching-free welding position guidance method for fillet weld based on laser vision sensing and EGM technology
Zou et al. Research on 3D curved weld seam trajectory position and orientation detection method
Ge et al. An efficient system based on model segmentation for weld seam grinding robot
Pachidis et al. Vision-based path generation method for a robot-based arc welding system
Wang et al. Fuzzy-PI double-layer stability control of an online vision-based tracking system
Fang et al. A vision-based method for narrow weld trajectory recognition of arc welding robots
CN115770988A (en) Intelligent welding robot teaching method based on point cloud environment understanding
CN117584121A (en) Welding robot path planning method based on point cloud scene understanding
CN114237150B (en) Robot weld milling path control method and device based on weld features
Wang et al. Multilayer positioning strategy for tubesheet welding robot based on point cloud model
Yu et al. Multiseam tracking with a portable robotic welding system in unstructured environments
CN114842144A (en) Binocular vision three-dimensional reconstruction method and system
Hao et al. A novel multi-seam extraction method for structured workpieces with medium-thick plates based on DLP vision
Rodionov et al. Methods of automatic correction of technological trajectory of laser welding complex by means of computer vision

Legal Events

Date Code Title Description
PB01 Publication