CN116542914A - Weld joint extraction and fitting method based on 3D point cloud - Google Patents

Info

Publication number: CN116542914A
Application number: CN202310405726.2A
Authority: CN (China)
Prior art keywords: point cloud; point; welding; weld; points
Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 叶锦华, 林炜盛, 黄斯凯
Current assignee: Fuzhou University
Original assignee: Fuzhou University
Application filed by Fuzhou University
Priority date / filing date: 2023-04-17
Publication date: 2023-08-04

Classifications

    • G06T 7/0004: Image analysis; inspection of images; industrial image inspection
    • G06F 17/16: Complex mathematical operations; matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06T 5/70: Image enhancement or restoration; denoising; smoothing
    • G06T 7/194: Segmentation; edge detection involving foreground-background segmentation
    • G06T 7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/62: Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/10028: Image acquisition modality: range image; depth image; 3D point clouds
    • G06T 2207/30108: Subject of image: industrial image inspection
    • G06T 2207/30164: Subject of image: workpiece; machine component
    • Y02P 90/30: Computing systems specially adapted for manufacturing (enabling technologies with a potential contribution to greenhouse gas emissions mitigation)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Quality & Reliability (AREA)
  • Algebra (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a weld joint extraction and fitting method based on a 3D point cloud. A point cloud is acquired with a line-structured-light camera; the point cloud data are preprocessed by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling to extract the point cloud of the welding workpiece; the principal curvature of each point of the workpiece point cloud is computed by fitting a third-order surface with moving least squares; potential weld-seam points whose curvature meets a set threshold are identified; ICP registration against a standard point cloud derived from the CAD model removes points belonging to other contour lines and extracts the required weld-seam point cloud; finally, the weld-point coordinates are transformed into the robot base coordinate system and fitted with a B-spline to obtain the welding trajectory. The method obtains the spatial trajectory of complex weld seams without manual teaching, offers fast extraction and accurate trajectories, and is suitable for weld types such as corner joints and lap joints.

Description

Weld joint extraction and fitting method based on 3D point cloud
Technical Field
The invention belongs to the technical field of automatic welding and particularly relates to a weld extraction and fitting method based on a 3D point cloud.
Background
Welding is one of the most important steps in industrial manufacturing. Because welding is physically demanding work performed in a harsh environment and requires considerable skill from workers, welding automation is of great significance for improving production efficiency and protecting workers' health.
Today, most welding robots used on production lines are programmed by teaching. In teaching programming, an operator manually jogs the robot joints with a teach pendant to move the robot to a predetermined position while the position is recorded and transmitted to the robot controller; the robot can then automatically repeat the task according to the recorded instructions. However, teaching (on-line) programming is cumbersome and inefficient, and its accuracy depends on the operator's visual judgment at the teach pendant. A small share of welding robots are instead programmed off-line. Off-line programming is carried out in a simulation environment on a computer, but its accuracy is not guaranteed and it still requires some correction by teaching.
In practice, the weld seams on a workpiece are often complex. With teaching-programmed welding, a complex seam shape requires repeated teaching operations, which increases the workload and seriously reduces factory productivity. To raise the degree of welding automation, improve production efficiency and shorten production time, automatic weld-seam extraction techniques have developed accordingly: automatically extracting the spatial weld seam provides support for automatic programming and avoids the tedious and time-consuming teaching process. When a three-dimensional space-curve weld seam is complex and irregular, manual teaching requires many path points, its accuracy depends largely on the operator's experience and skill, its efficiency is low, and welding quality cannot be guaranteed. The advantage of automatically extracting and fitting the weld seam is therefore particularly significant.
At present, traditional weld-seam extraction techniques are based on two-dimensional machine vision. However, two-dimensional vision is strongly affected by the background environment: the degree of surface oxidation of the workpiece, the illumination intensity and the viewing angle all influence the extraction result, and the positioning accuracy of two-dimensional vision depends heavily on the camera calibration accuracy. Acquiring point cloud data with a structured-light camera yields high-precision three-dimensional coordinates that represent the surface and shape of the workpiece well and provide a solid basis for obtaining high-precision weld seams. Using a structured-light camera to acquire point clouds and applying them to welding in combination with a robot is therefore an important and promising research direction for welding-robot technology.
Disclosure of Invention
In view of the problems and shortcomings of the prior art, the invention provides a weld joint extraction and fitting method based on a 3D point cloud. It addresses the extraction of corner-joint and lap-joint weld seams, achieves high-precision and fast extraction of the weld trajectory, and accurately locates the start and end points of the weld seam, which is difficult to accomplish with two-dimensional vision methods.
The basic scheme comprises the following steps: acquiring a point cloud with a line-structured-light camera; preprocessing the point cloud data by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling, and extracting the point cloud of the welding workpiece; computing the principal curvature of each point of the workpiece point cloud by fitting a third-order surface with moving least squares; identifying potential weld-seam points whose curvature meets a set threshold; performing ICP registration against a standard point cloud derived from the CAD model, removing points of other contour lines, and extracting the required weld-seam point cloud; and transforming the weld-point coordinates into the robot base coordinate system and fitting a B-spline to obtain the welding trajectory. The method obtains the spatial trajectory of complex weld seams without manual teaching, offers fast extraction and accurate trajectories, and is suitable for weld types such as corner joints and lap joints.
The technical scheme adopted for solving the technical problems is as follows:
A weld joint extraction and fitting method based on a 3D point cloud comprises the following steps:
Step S1: acquiring the transformation matrix T_ctob from the camera coordinate system to the robot base coordinate system;
Step S2: acquiring point cloud data by multi-pose capture;
Step S3: preprocessing the point cloud data: extracting the point cloud of the welding workpiece by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling;
Step S4: identifying potential weld points: screening the point cloud by its curvature feature, computing the principal curvature of each point by fitting a third-order surface with moving least squares, and identifying potential weld points by setting a suitable threshold;
Step S5: extracting the weld seam: performing ICP registration between the point cloud obtained in step S4 and a point cloud sampled from the three-dimensional model of the workpiece weld seam, and determining the actual weld-seam point cloud;
Step S6: fitting the weld trajectory: transforming the weld-seam point cloud into the robot coordinate system with the transformation matrix and fitting a B-spline to the point cloud data to obtain the weld trajectory.
Further, in step S1, the transformation matrix from the camera coordinate system to the robot base coordinate system is obtained by hand-eye calibration of the line-structured-light camera.
Specifically, the camera is mounted on the end of the robot arm in an eye-in-hand configuration and a calibration plate is fixed on the welding table. The camera captures images of the calibration plate at several different robot poses, and the pose transformation matrix T_ctob from the camera coordinate system to the robot base coordinate system is computed from the recorded end-effector poses. Point cloud data are then acquired by multi-pose capture, and the pose matrix at each shot is recorded for the stitching operation of the subsequent preprocessing.
Further, in step S2, point cloud data are acquired from the line-structured-light camera by multi-pose capture.
Further, the step S3 specifically includes the following steps:
Step S31: point cloud stitching: the robot pose matrix recorded for each capture is used to compute the pose transformation matrices, and each point cloud is multiplied by its transformation matrix so that the point clouds captured from multiple angles form, in a common coordinate system, the complete point cloud of the welding workpiece;
Step S32: statistical filtering: a kd-tree is built for neighbor search and the number of neighbors is set to k; the global mean and standard deviation of the neighbor distances over the whole cloud are computed, the mean distance D_i from each point to its k nearest neighbors is computed, and points whose D_i falls outside the distance threshold S_max formed from the global mean and standard deviation are marked as outliers; removing these outliers suppresses system noise;
Step S33: pass-through cropping: a distance range between the camera on the manipulator and the welding workpiece on the welding table is set, and points whose coordinates fall outside this range are discarded to obtain the point cloud of the welding workpiece, i.e. only points satisfying X_l ≤ x_i ≤ X_h, Y_l ≤ y_i ≤ Y_h and Z_l ≤ z_i ≤ Z_h are kept, where X_l, X_h, Y_l, Y_h, Z_l, Z_h are the minimum and maximum of the admissible range along the X, Y and Z axes and x_i, y_i, z_i are the coordinates of the i-th point in the point cloud;
Step S34: uniform sampling: a radius r is set, the point cloud space is partitioned with spheres of radius r, and within each sphere all points are replaced by the single point closest to the sphere center.
Point cloud stitching merges several incomplete raw point clouds containing the workpiece and background into one point cloud containing the complete workpiece; statistical filtering removes system-noise outliers; pass-through cropping removes the background and the welding table to obtain the workpiece point cloud; and uniform sampling reduces the point density without altering the coordinates of the retained points.
Further, step S4 identifies potential weld points using curvature as the feature and computes the curvature of each point by fitting a local third-order surface with moving least squares, specifically comprising the following steps:
Step S41: a kd-tree is constructed and a k-nearest-neighbor search is performed for each point; the normal vector of the point is computed from the search result, and a local coordinate system is built with this normal as the z-axis;
Step S42: a third-order surface is fitted to the points returned by the k-nearest-neighbor search in the local coordinate system, the current point is projected onto the fitted surface, and the curvature at the projected point is computed from the first- and second-order partial derivatives h_u, h_v, h_uu, h_vv and h_uv of the fitted surface h(u, v) with respect to u and v;
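The curvature expression referred to above appears only as a figure in the original filing; assuming the standard result for a Monge patch h(u, v) is intended, the mean curvature H and Gaussian curvature K (from which the principal curvatures follow) would read:

```latex
H = \frac{(1 + h_v^2)\,h_{uu} - 2\,h_u h_v\,h_{uv} + (1 + h_u^2)\,h_{vv}}
         {2\left(1 + h_u^2 + h_v^2\right)^{3/2}},
\qquad
K = \frac{h_{uu}\,h_{vv} - h_{uv}^2}{\left(1 + h_u^2 + h_v^2\right)^{2}},
\qquad
\kappa_{1,2} = H \pm \sqrt{H^2 - K}.
```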
Step S43: steps S41 and S42 are executed for every point to obtain its curvature, and potential weld points are identified by setting a curvature threshold.
Further, the step S5 specifically includes the following steps:
Step S51: the ideal three-dimensional model of the welded workpiece and the three-dimensional model of the marked weld seam are sampled separately to obtain the ideal workpiece point cloud MC_1 and the weld-seam model point cloud MC_2;
Step S52: ICP registration between the actual welding-workpiece point cloud C_0 and the ideal model point cloud MC_1 of the welding workpiece yields a coordinate transformation matrix T; the weld-seam model point cloud MC_2 is transformed with T into the coordinate system of the actual workpiece point cloud, giving the new weld-seam model point cloud MC_2';
Step S53: a distance threshold is set; each potential weld point C_4 obtained in step S4 is traversed, its distance to the nearest point of the new weld-seam model point cloud MC_2' is computed, potential weld points whose distance is below the threshold are kept, and the remaining points are removed to obtain the actual weld-seam point cloud C_5.
Further, in step S6 a cubic B-spline is fitted to the weld-seam point cloud C_5, specifically comprising the following steps:
Step S61: using the transformation matrix T_ctob obtained in step S1, the weld-seam point cloud C_5 is transformed into the robot coordinate system to obtain the point cloud C_6; C_6 is reordered by its X coordinates, and the basis functions of the cubic B-spline are determined;
Step S62: a piecewise cubic B-spline curve is fitted to the reordered point cloud C_6, finally completing the weld-seam extraction and fitting.
Compared with the prior art, the invention performs fully automatic weld-seam extraction and fitting from point cloud data acquired with a structured-light camera: several incomplete raw point clouds containing the workpiece and background are stitched into one point cloud containing the complete workpiece; system-noise outliers are removed by statistical filtering; the background and the welding table are removed by pass-through cropping to obtain the workpiece point cloud; the point density is reduced by uniform sampling without altering the coordinates of the retained points; the curvature of each point is obtained by fitting a third-order surface with moving least squares, and a curvature threshold screens the potential points; ICP registration against the standard point cloud of the three-dimensional model automatically extracts the weld-seam points; and the weld points are transformed into the robot base coordinate system with the transformation matrix and fitted with a cubic B-spline, completing the recognition, extraction and trajectory fitting of the weld seam.
The whole process requires no manual teaching: the weld trajectory can be extracted simply by photographing the welding workpiece with the structured-light camera, which provides a good basis for the robot's subsequent trajectory planning, eliminates the extremely time-consuming manual teaching process, is suitable for industrial production lines, and offers high efficiency, high accuracy and speed.
Drawings
The invention is described in further detail below with reference to the attached drawings and detailed description:
FIG. 1 is a general flow chart of an embodiment of the present invention;
FIG. 2 is a schematic diagram of point cloud data acquisition according to an embodiment of the present invention;
FIG. 3 is a three-dimensional model of a workpiece used in an embodiment of the invention;
fig. 4 is a detailed flowchart of a method for performing weld extraction and fitting according to an embodiment of the present invention.
Detailed Description
In order to make the features and advantages of the present patent more comprehensible, embodiments accompanied with figures are described in detail below:
it should be noted that the following detailed description is illustrative and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments in accordance with the present application. As used herein, the singular is also intended to include the plural unless the context clearly indicates otherwise, and furthermore, it is to be understood that the terms "comprises" and/or "comprising" when used in this specification are taken to specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof.
The processing flow of the 3D-point-cloud-based weld extraction and fitting method provided by this embodiment is shown in Fig. 1 and Fig. 4, and the method comprises the following steps:
Step S1: hand-eye calibration of the line-structured-light camera (for example an RVX-series camera) to obtain the transformation matrix T_ctob from the camera coordinate system to the robot base coordinate system;
Step S2: as shown in Fig. 2, multi-pose capture is performed and point cloud data are acquired from the line-structured-light camera;
Step S3: the point cloud data are preprocessed, and the point cloud of the welding workpiece is obtained by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling;
Step S4: potential weld points are identified: the point cloud is screened by its curvature feature, the principal curvature of each point is computed by fitting a third-order surface with moving least squares, and potential weld points are identified by setting a suitable threshold;
Step S5: the weld seam is extracted: ICP registration is performed between the point cloud obtained in step S4 and a point cloud sampled from the three-dimensional model of the workpiece weld seam shown in Fig. 3, and the actual weld-seam point cloud is determined;
Step S6: the weld trajectory is fitted: the weld-seam point cloud is transformed into the robot coordinate system with the transformation matrix, and a B-spline is fitted to the point cloud data to obtain the weld trajectory.
In the present embodiment, for step S1, the camera is mounted on the end of the robot arm in an eye-in-hand configuration and a calibration plate is fixed on the welding table. The camera captures images of the calibration plate at several different robot poses, and the pose transformation matrix T_ctob from the camera coordinate system to the robot base coordinate system is computed from the recorded end-effector poses. Point cloud data are then acquired by multi-pose capture, and the pose matrix at each shot is recorded for the stitching operation of the subsequent preprocessing.
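Purely as an illustration of this calibration step (not part of the original filing), the eye-in-hand calibration can be sketched with OpenCV's calibrateHandEye; the function and constant names are real OpenCV API, while the variable names and the choice of the Tsai method are assumptions:

```python
import cv2
import numpy as np

def calibrate_eye_in_hand(R_flange2base, t_flange2base, R_target2cam, t_target2cam):
    """Return the fixed 4x4 camera-to-flange transform from per-shot robot flange
    poses and the per-shot calibration-plate poses seen by the camera."""
    R_cam2flange, t_cam2flange = cv2.calibrateHandEye(
        R_flange2base, t_flange2base, R_target2cam, t_target2cam,
        method=cv2.CALIB_HAND_EYE_TSAI)
    T = np.eye(4)
    T[:3, :3] = R_cam2flange
    T[:3, 3] = t_cam2flange.ravel()
    return T

def camera_to_base(T_flange2base, T_cam2flange):
    """T_ctob for one shot: compose the recorded flange pose with the fixed
    camera-to-flange transform."""
    return T_flange2base @ T_cam2flange
```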
In step S3 the point cloud data are preprocessed, and the point cloud of the welding workpiece is obtained from the original multiple point clouds by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling. Specifically: stitching merges several incomplete raw point clouds containing the workpiece and background into one point cloud containing the complete workpiece; statistical filtering removes system-noise outliers; pass-through cropping removes the background and the welding table to obtain the workpiece point cloud; and uniform sampling reduces the point density without altering the coordinates of the retained points.
The implementation of step S3 specifically comprises the following steps:
step S31: performing point cloud splicing, recording a pose matrix of the mechanical arm when acquiring point cloud data in multiple poses, calculating a pose transformation matrix, multiplying the point cloud data by the pose transformation matrix to obtain a complete point cloud C of a welded workpiece obtained by shooting at multiple angles under the same coordinate system 0
Step S32: statistical filtering, introducing C 0 Obtaining a point cloud C for removing system noise outliers 1 : the kd-tree is constructed for neighbor searching, the number of points of the neighbor searching is set to be k, k is set to be 50 in the embodiment, the global distance average value and standard deviation of the whole point cloud are calculated, the average distance of k neighbors of any point is calculated, points, the average distance value of which is not in a distance threshold value formed by the global distance average value and the standard deviation, are marked as outliers, the outliers are removed, and the effect of removing system noise is achieved.
Wherein D is i Is the i-th pointAverage distance of k nearest neighbors, S max Is a distance threshold.
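Since the outlier criterion is given only as a figure in the original filing, the sketch below assumes the usual form S_max = mean + std_ratio * std over the per-point mean k-neighbor distances D_i; scipy's cKDTree is used for the neighbor search and all names are illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_filter(points, k=50, std_ratio=1.0):
    """Remove points whose mean distance D_i to their k nearest neighbors lies
    above the global threshold S_max = mean(D) + std_ratio * std(D)."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # k+1: the first neighbor is the point itself
    d_i = dists[:, 1:].mean(axis=1)          # D_i for every point
    s_max = d_i.mean() + std_ratio * d_i.std()
    return points[d_i <= s_max]              # C_1: system-noise outliers removed
```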
Step S33: cut through, introduce C 1 Obtaining a welding workpiece point cloud C 2 : and setting a rough distance range between a camera on the manipulator and a welding workpiece on the welding table, and eliminating points with coordinate values out of the range to obtain a point cloud of the welding workpiece. X in this application instance l 、X h 、Y l 、Y h 、Z l 、Z h Are respectively 50mm, 200mm, 350mm, 420mm, 0mm and 200mm.
Wherein X is l 、X h 、Y l 、Y h 、Z l 、Z h Respectively representing the minimum value and the maximum value of the value range in three directions of X, Y, Z axis, x i 、y i 、z i Representing the x, y, z coordinates of the ith point in the point cloud.
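The crop itself is an axis-aligned range test; a sketch using the bounds quoted above (values in mm, function and parameter names illustrative):

```python
import numpy as np

def pass_through(points, x_range=(50, 200), y_range=(350, 420), z_range=(0, 200)):
    """Keep only points whose coordinates fall inside the given X/Y/Z ranges."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    mask = ((x >= x_range[0]) & (x <= x_range[1]) &
            (y >= y_range[0]) & (y <= y_range[1]) &
            (z >= z_range[0]) & (z <= z_range[1]))
    return points[mask]  # C_2: welding-workpiece point cloud
```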
Step S34: sampling uniformly, introducing C 2 Obtaining C after reducing the density of the point cloud 3 : the radius r is set to be 1.5mm in the embodiment, the point cloud space is divided by a sphere with the radius r, all points in the sphere are replaced by points closest to the center of the sphere in all points of the current sphere, and the number of points is greatly reduced under the condition that the structure of the point cloud is maintained and the coordinate value of the point cloud is not changed.
Preferably, the implementation of step S4 specifically includes the following steps:
step S41: constructing kd-Tree and introducing C 3 In this embodiment, k in step S4 is set to 20, and the normal vector of each point is calculated by using the searched result, and the local coordinate system is constructed by using the vector as the z axis.
Step S42: and fitting a third-order curved surface to the point searched by the k neighbor under the local coordinate system, projecting the current point onto the curved surface after fitting the third-order curved surface, and calculating the curvature value of the projection point. The curvature calculation method is as follows:
wherein h is u 、h v 、h uu 、h vv 、h uv Representing partial differentiation of curved surfaces in two directions of u and v
Step S43: step S41 and step S42 are carried out on each point to obtain the curvature of each point, and a potential welding point C is identified by setting a curvature threshold value 4 The curvature threshold value is set to 0.8 in this embodiment.
Preferably, the implementation of step S5 specifically includes the following steps:
Step S51: the ideal three-dimensional model of the welded workpiece and the three-dimensional model of the marked weld seam are sampled to obtain the ideal workpiece point cloud MC_1 and the weld-seam model point cloud MC_2;
Step S52: taking C_0 as input, ICP registration between the actual welding-workpiece point cloud C_0 and the ideal model point cloud MC_1 yields a coordinate transformation matrix T; the weld-seam model point cloud MC_2 is transformed with T into the coordinate system of the actual workpiece point cloud, giving the new weld-seam model point cloud MC_2';
Step S53: a distance threshold is set (2 mm in this embodiment); taking C_4 as input, each potential weld point obtained in step S4 is traversed and its distance to the nearest point of the new weld-seam model point cloud MC_2' is computed; points whose distance is below the threshold are kept and the rest are removed, yielding the actual weld-seam point cloud C_5.
Preferably, in step S6 a cubic B-spline is fitted to the weld-seam point cloud C_5; the implementation specifically comprises the following steps:
Step S61: taking C_5 as input, the transformation matrix T_ctob obtained in step S1 is used to transform the weld-seam point cloud C_5 into the robot coordinate system, giving the point cloud C_6; C_6 is reordered by its X coordinates, and the basis functions of the cubic B-spline are determined.
Step S62: a piecewise cubic B-spline curve is fitted to the reordered point cloud C_6, finally completing the weld-seam extraction and fitting.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention and is not intended to limit the invention in any way; any person skilled in the art may modify or adapt the disclosed technical content into equivalent embodiments. However, any simple modification or equivalent variation of the above embodiments made according to the technical substance of the present invention still falls within the protection scope of the technical solution of the present invention.
The present patent is not limited to the above-mentioned best embodiment; any person may derive various other methods for extracting and fitting welds based on a 3D point cloud under the teaching of the present patent, and all equivalent changes and modifications made within the scope of the present application shall be covered by the present patent.

Claims (7)

1. A weld joint extraction and fitting method based on a 3D point cloud is characterized by comprising the following steps:
Step S1: acquiring the transformation matrix T_ctob from the camera coordinate system to the robot base coordinate system;
Step S2: acquiring point cloud data by multi-pose capture;
Step S3: preprocessing the point cloud data: extracting the point cloud of the welding workpiece by point cloud stitching, statistical filtering, pass-through cropping and uniform sampling;
Step S4: identifying potential weld points: screening the point cloud by its curvature feature, computing the principal curvature of each point by fitting a third-order surface with moving least squares, and identifying potential weld points by setting a suitable threshold;
Step S5: extracting the weld seam: performing ICP registration between the point cloud obtained in step S4 and a point cloud sampled from the three-dimensional model of the workpiece weld seam, and determining the actual weld-seam point cloud;
Step S6: fitting the weld trajectory: transforming the weld-seam point cloud into the robot coordinate system with the transformation matrix and fitting a B-spline to the point cloud data to obtain the weld trajectory.
2. The 3D point cloud based weld extraction and fitting method of claim 1, wherein: in step S1, the transformation matrix from the camera coordinate system to the robot base coordinate system is obtained by hand-eye calibration of the line-structured-light camera.
3. The 3D point cloud based weld extraction and fitting method of claim 1, wherein in step S2, point cloud data are acquired from the line-structured-light camera by multi-pose capture.
4. The 3D point cloud based weld extraction and fitting method of claim 1, wherein:
the step S3 specifically comprises the following steps:
Step S31: point cloud stitching: the robot pose matrix recorded for each capture is used to compute the pose transformation matrices, and each point cloud is multiplied by its transformation matrix so that the point clouds captured from multiple angles form, in a common coordinate system, the complete point cloud of the welding workpiece;
Step S32: statistical filtering: a kd-tree is built for neighbor search and the number of neighbors is set to k; the global mean and standard deviation of the neighbor distances over the whole cloud are computed, the mean distance D_i from each point to its k nearest neighbors is computed, and points whose D_i falls outside the distance threshold S_max formed from the global mean and standard deviation are marked as outliers; removing these outliers suppresses system noise;
Step S33: pass-through cropping: a distance range between the camera on the manipulator and the welding workpiece on the welding table is set, and points whose coordinates fall outside this range are discarded to obtain the point cloud of the welding workpiece, i.e. only points satisfying X_l ≤ x_i ≤ X_h, Y_l ≤ y_i ≤ Y_h and Z_l ≤ z_i ≤ Z_h are kept, where X_l, X_h, Y_l, Y_h, Z_l, Z_h are the minimum and maximum of the admissible range along the X, Y and Z axes and x_i, y_i, z_i are the coordinates of the i-th point in the point cloud;
Step S34: uniform sampling: a radius r is set, the point cloud space is partitioned with spheres of radius r, and within each sphere all points are replaced by the single point closest to the sphere center.
5. The 3D point cloud based weld extraction and fitting method of claim 1, wherein:
step S4 identifies potential weld points using curvature as the feature and computes the curvature of each point by fitting a local third-order surface with moving least squares, specifically comprising the following steps:
Step S41: a kd-tree is constructed and a k-nearest-neighbor search is performed for each point; the normal vector of the point is computed from the search result, and a local coordinate system is built with this normal as the z-axis;
Step S42: a third-order surface is fitted to the points returned by the k-nearest-neighbor search in the local coordinate system, the current point is projected onto the fitted surface, and the curvature at the projected point is computed from the first- and second-order partial derivatives h_u, h_v, h_uu, h_vv and h_uv of the fitted surface polynomial with respect to u and v;
Step S43: steps S41 and S42 are executed for every point to obtain its curvature, and potential weld points are identified by setting a curvature threshold.
6. The 3D point cloud based weld extraction and fitting method of claim 1, wherein:
the step S5 specifically comprises the following steps:
Step S51: the ideal three-dimensional model of the welded workpiece and the three-dimensional model of the marked weld seam are sampled separately to obtain the ideal workpiece point cloud MC_1 and the weld-seam model point cloud MC_2;
Step S52: ICP registration between the actual welding-workpiece point cloud C_0 and the ideal model point cloud MC_1 of the welding workpiece yields a coordinate transformation matrix T; the weld-seam model point cloud MC_2 is transformed with T into the coordinate system of the actual workpiece point cloud, giving the new weld-seam model point cloud MC_2';
Step S53: a distance threshold is set; each potential weld point C_4 obtained in step S4 is traversed, its distance to the nearest point of the new weld-seam model point cloud MC_2' is computed, potential weld points whose distance is below the threshold are kept, and the remaining points are removed to obtain the actual weld-seam point cloud C_5.
7. The 3D point cloud based weld extraction and fitting method of claim 1, wherein:
in step S6 a cubic B-spline is fitted to the weld-seam point cloud C_5, specifically comprising the following steps:
Step S61: using the transformation matrix T_ctob obtained in step S1, the weld-seam point cloud C_5 is transformed into the robot coordinate system to obtain the point cloud C_6; C_6 is reordered by its X coordinates, and the basis functions of the cubic B-spline are determined;
Step S62: a piecewise cubic B-spline curve is fitted to the reordered point cloud C_6, finally completing the weld-seam extraction and fitting.
CN202310405726.2A 2023-04-17 2023-04-17 Weld joint extraction and fitting method based on 3D point cloud Pending CN116542914A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310405726.2A CN116542914A (en) 2023-04-17 2023-04-17 Weld joint extraction and fitting method based on 3D point cloud

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310405726.2A CN116542914A (en) 2023-04-17 2023-04-17 Weld joint extraction and fitting method based on 3D point cloud

Publications (1)

Publication Number Publication Date
CN116542914A (en) 2023-08-04

Family

ID=87442644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310405726.2A Pending CN116542914A (en) 2023-04-17 2023-04-17 Weld joint extraction and fitting method based on 3D point cloud

Country Status (1)

Country Link
CN (1) CN116542914A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117226855A (en) * 2023-11-15 2023-12-15 泉州华中科技大学智能制造研究院 Weld polishing track planning method based on three-dimensional point cloud
CN117226855B (en) * 2023-11-15 2024-03-15 泉州华中科技大学智能制造研究院 Weld polishing track planning method based on three-dimensional point cloud
CN117884811A (en) * 2024-01-31 2024-04-16 南京航空航天大学 Three-dimensional structure workpiece weld joint positioning device based on DLP three-dimensional vision
CN118212233A (en) * 2024-05-20 2024-06-18 法奥意威(苏州)机器人系统有限公司 Linear weld joint identification method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN116542914A (en) Weld joint extraction and fitting method based on 3D point cloud
CN104841593B (en) Control method of robot automatic spraying system
JP5981143B2 (en) Robot tool control method
CN114289934B (en) Automatic welding system and method for large structural part based on three-dimensional vision
CN114515924B (en) Automatic welding system and method for tower foot workpiece based on weld joint identification
CN114571153A (en) Weld joint identification and robot weld joint tracking method based on 3D point cloud
CN110171000B (en) Groove cutting method, device and control equipment
CN113920060A (en) Autonomous operation method and device for welding robot, electronic device, and storage medium
CN113146172A (en) Multi-vision-based detection and assembly system and method
KR102096897B1 (en) The auto teaching system for controlling a robot using a 3D file and teaching method thereof
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
Geng et al. A method of welding path planning of steel mesh based on point cloud for welding robot
Xiao et al. An automatic calibration algorithm for laser vision sensor in robotic autonomous welding system
CN116765569A (en) Robot post nail roller surfacing path planning method based on point cloud
CN115018813A (en) Method for robot to autonomously identify and accurately position welding line
CN114842144A (en) Binocular vision three-dimensional reconstruction method and system
González et al. Adaptive edge finishing process on distorted features through robot-assisted computer vision
CN117506931A (en) Groove cutting path planning and correcting equipment and method based on machine vision
CN111435400A (en) Part repairing method and device and 3D printer
JP3450609B2 (en) Offline teaching device for robots
Wu et al. Research on Welding Guidance System of Intelligent Perception for Steel Weldment
CN110060330B (en) Three-dimensional modeling method and device based on point cloud image and robot
Yusen et al. A method of welding path planning of steel mesh based on point cloud for welding robot
Kuryło et al. Design of an automated system for measuring car bodies
CN118314138B (en) Laser processing method and system based on machine vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination