CN112958959A - Automatic welding and detection method based on three-dimensional vision - Google Patents

Automatic welding and detection method based on three-dimensional vision

Info

Publication number
CN112958959A
Authority
CN
China
Prior art keywords
welding
robot
dimensional
dimensional vision
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110171644.7A
Other languages
Chinese (zh)
Inventor
杨涛
彭磊
李晓晓
姜军委
马力
王芳
周翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Chishine Optoelectronics Technology Co ltd
Original Assignee
Xi'an Chishine Optoelectronics Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Chishine Optoelectronics Technology Co ltd filed Critical Xi'an Chishine Optoelectronics Technology Co ltd
Priority to CN202110171644.7A priority Critical patent/CN112958959A/en
Publication of CN112958959A publication Critical patent/CN112958959A/en
Pending legal-status Critical Current


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K37/00 Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an automatic welding and detection method based on three-dimensional vision, comprising the following steps: building an automated welding and detection system based on three-dimensional vision; calibrating the relations between the welding and three-dimensional vision systems and the robot coordinate system; aligning the workpiece coordinate system and the robot coordinate system using three-dimensional vision; extracting welding features to generate welding parameters; welding with the robot; and detecting the weld quality. The invention uses a three-dimensional vision system mounted at the end of the robot to automatically complete the alignment, position-finding, and inspection procedures of the welding process. Compared with a traditional discrete system, efficiency is improved, cost is reduced, the degree of automation of the welding process is greatly increased, and highly automated production of the whole processing flow becomes attainable.

Description

Automatic welding and detection method based on three-dimensional vision
Technical field:
The invention relates to an automatic welding and detection method based on three-dimensional vision, which mainly uses three-dimensional vision technology and robot technology to accomplish automatic welding and weld quality inspection. The invention belongs to the fields of industrial automation and machine vision.
Background art:
In the field of robot automated welding, several processes are usually required: alignment, position finding, welding, and inspection. Common alignment methods are: 1) manual alignment, in which an operator drives the robot end to a specific feature and the relation between the robot coordinate system and the workpiece coordinate system is derived from positioning references on the workpiece; 2) the wire-and-arc method, which judges the relative relation between the robot and the workpiece from a pre-programmed robot program and the current and voltage signals measured as the robot approaches the workpiece, and corrects the workpiece position by sampling data at different positions to complete alignment; 3) laser locating, in which the robot drives a line or point laser to scan the workpiece and alignment is done with the resulting three-dimensional information, or data at specific positions are sampled to correct the workpiece position and complete alignment. Among these methods, manual alignment is inefficient and cannot be automated. The wire-and-arc method requires advance programming, can only correct deviations, fails when conditions are complex or the workpiece pose varies greatly, and is also inefficient. For laser locating, if the laser scan is only used to correct deviations, complex scenes cannot be handled, just as with the wire-and-arc method; if full scanning is used, the limited motion accuracy of the robot makes the scanned point cloud inaccurate, which increases the alignment error, and driving the laser with the robot to scan is very slow.
Position finding of the welding location: when the workpieces are highly consistent, completing the alignment process amounts to completing position finding, and welding can proceed according to a pre-programmed program. When workpiece consistency is poor, the weld seam must also be located. Common methods are: 1) manual teaching, using a manually operated robot to find the welding trajectory; 2) laser locating, using a line laser to locate the weld seam. The manual method consumes labor hours, places high demands on the skills of front-line workers, and does not suit the trend toward welding automation. Laser locating can only locate simple straight weld seams, cannot handle complex weld seams, and is easily disturbed by existing weld spots.
Inspection of the weld surface quality: after welding is completed, it is also necessary to check for missed welds and to assess the weld quality. A common approach is to scan with a line laser or an external 3D camera and then perform the inspection. Line-laser scanning is inefficient, and an external 3D camera is inconvenient to use and adds cost.
The invention aims to use up-to-date 3D vision technology to complete the alignment, position-finding, and inspection procedures at low cost and with high efficiency on a single hardware system, thereby realizing an automated welding and inspection solution.
Summary of the invention:
The invention aims to provide a three-dimensional-vision-based automatic welding and detection method that can complete the alignment, position-finding, and inspection processes of robot welding applications at low cost and with high efficiency.
An automatic welding and detection method based on three-dimensional vision comprises the following steps:
(I) building an automated welding and detection system based on three-dimensional vision
(II) calibrating the relations between the welding and three-dimensional vision systems and the robot coordinate system
(III) aligning the workpiece coordinate system and the robot coordinate system using three-dimensional vision
(IV) extracting welding features to generate welding parameters
(V) welding with the robot
(VI) detecting the weld quality
In step (I), the three-dimensional-vision-based automatic welding and detection system comprises a robot system serving as the motion execution mechanism, a welding system, a three-dimensional vision system, and an upper computer, as shown in Fig. 1.
The robot system is an actuating mechanism for adjusting position and posture; it is a multi-axis industrial robot system comprising a robot body and a robot controller. The welding system comprises different components according to the welding process and is used to complete the entire welding process. The three-dimensional vision system is used to acquire three-dimensional feature information of the workpiece to be welded and is a high-precision 3D camera, where high precision means a measurement precision better than 1 mm. The 3D camera has a depth-map frame rate greater than 1 frame per second and is low-power, compact, and lightweight. The 3D camera and a welding actuator, such as a welding gun, are mounted together at the end of the robot; to meet the above requirements, the 3D camera is preferably a MEMS-based structured-light 3D camera. The upper computer performs feature calculation and generates the control program.
In step (II), the coordinate transformation between the robot and the welding system and the coordinate transformation between the robot and the three-dimensional vision system are calibrated separately; the two calibration processes have no required order. The purpose is to unify the coordinate systems of the welding system and the three-dimensional vision system into the robot coordinate system.
Step (III) comprises the following steps:
1) use the robot system to point the 3D camera at the area where the workpiece is located, at a distance within the camera's working range; 2) capture a point cloud of the workpiece with the 3D camera; 3) register the captured point cloud with the point cloud of the digital three-dimensional model using a feature-based registration method; 4) calculate the transformation between the workpiece coordinate system and the robot coordinate system.
Sources of the digital three-dimensional model include, but are not limited to: reverse scanning, stitching, and point-cloud fusion using the 3D camera at the robot end; modeling with three-dimensional CAD software; and converting an existing model. The designed CAD three-dimensional digital model is preferred when the workpiece conforms well to the original CAD design; otherwise the reverse-engineered digital model is preferred.
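The registration of step (III) can be prototyped with an off-the-shelf point-cloud library. The sketch below is a minimal example assuming the Open3D library; the voxel size, distance tolerances, and function names are illustrative assumptions, not values prescribed by the invention. It performs a coarse feature-based (FPFH + RANSAC) registration followed by ICP refinement between the captured cloud and the digital model cloud.

```python
# Minimal sketch of the step (III) registration, assuming Open3D;
# voxel size and distance thresholds are illustrative placeholders.
import open3d as o3d

def preprocess(pcd, voxel):
    # Downsample, estimate normals, and compute FPFH features.
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
    return down, fpfh

def align_workpiece(scan, model, voxel=2.0):
    # 1) coarse, feature-based global registration (RANSAC over FPFH matches)
    scan_down, scan_fpfh = preprocess(scan, voxel)
    model_down, model_fpfh = preprocess(model, voxel)
    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        scan_down, model_down, scan_fpfh, model_fpfh, True, voxel * 1.5,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))
    # 2) fine registration with point-to-point ICP
    fine = o3d.pipelines.registration.registration_icp(
        scan, model, voxel * 0.5, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return fine.transformation  # maps the camera-frame scan into the model frame
```

Combining this transformation with the hand-eye matrix of step (II) and the robot pose recorded at shooting time yields the workpiece-to-robot relation required by the method.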
In step (IV), after the alignment of step (III) is completed, point clouds are captured in sequence at predefined photographing positions, and welding features are extracted from them. The welding features include at least one of: the trajectory, width, start point, and end point of the weld seam; the radius and center of an arc; the intersection line of planes; the intersection line of a curved surface and a plane; and the intersection line of two curved surfaces.
Step (V) comprises the following steps:
1) use the feature parameters provided in step (IV) to parametrically program the welding trajectory and welding posture of the robot.
In another embodiment of the present invention, the weld-seam features and normal features provided in step (IV) are used to directly calculate the trajectory and posture of the robot with a computer program, generating the robot control parameters.
2) use the information of the robot and its attachments (welding system and three-dimensional vision system), together with the welding trajectory and posture information, to perform an interference check and prevent collisions during welding.
3) welding: switch off the three-dimensional vision system, start the welding system, and control the robot to weld according to the robot control program obtained above; after the process is finished, switch off the welding system and switch the three-dimensional vision system back on.
Step (VI) comprises the following sub-steps:
1) use the three-dimensional vision system at the robot end to capture a three-dimensional point cloud of the area welded in step (V). The photographing positions are calculated from the trajectory of step (V) and the field of view of the three-dimensional vision system, with at least 30% overlap between successive shots; the photographing posture is then calculated from the digital model so that the shooting direction is parallel to the principal normal vector of the workpiece surface at each photographing position; the point clouds of the welding positions are captured in sequence, roughly stitched using the robot poses, and globally optimized with the ICP (Iterative Closest Point) method to obtain the point clouds of all welding positions.
The point cloud of the welding positions is either a single point cloud or several independent point clouds, depending on whether the welding positions are distributed continuously.
2) compare the reverse-engineered point cloud containing the weld-seam information with the point cloud of the original digital three-dimensional model to obtain the weld quality parameters. The reverse-engineered point cloud is registered with the original point cloud; then the distance D from each point of the reverse point cloud to its closest point in the original point cloud is computed; points with D below a threshold are discarded and points above the threshold are retained. The retained points form the weld-seam point cloud. The average distance D_a, the maximum distance D_max, and the average coordinate P(X_a, Y_a, Z_a) of the weld-seam point cloud are calculated segment by segment, and these parameters are compared against empirical thresholds to judge whether the weld quality is acceptable and whether any weld is missing.
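The segment statistics described above can be computed directly from nearest-neighbour distances. The sketch below assumes NumPy/SciPy arrays of scanned and model points; the distance threshold and the number of segments are illustrative assumptions rather than values fixed by the method.

```python
# Minimal sketch of the weld quality parameters; threshold and segment count
# are placeholders for the empirical values mentioned in the text.
import numpy as np
from scipy.spatial import cKDTree

def weld_quality_parameters(scanned_pts, model_pts, dist_threshold=0.5, n_segments=20):
    # Nearest-neighbour distance D from each scanned point to the model cloud.
    tree = cKDTree(model_pts)
    d, _ = tree.query(scanned_pts)
    # Points whose distance exceeds the threshold are taken as weld-seam points.
    seam = scanned_pts[d > dist_threshold]
    seam_d = d[d > dist_threshold]
    if seam.size == 0:
        return None  # no seam points found: possible missed weld
    # Segment the seam (here simply by index order) and compute D_a, D_max, P per segment.
    results = []
    for seg_pts, seg_d in zip(np.array_split(seam, n_segments),
                              np.array_split(seam_d, n_segments)):
        if len(seg_d) == 0:
            continue
        results.append({
            "D_a": float(seg_d.mean()),    # average seam-to-model distance
            "D_max": float(seg_d.max()),   # maximum seam-to-model distance
            "P": seg_pts.mean(axis=0),     # average coordinate (X_a, Y_a, Z_a)
        })
    return results
```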
3) output the detection result for a computer or an operator to decide the subsequent operation.
Advantageous effects of the invention
The invention uses a three-dimensional vision system mounted at the end of the robot to automatically complete the alignment, position-finding, and inspection procedures of the welding process. Compared with a traditional discrete system, efficiency is improved, cost is reduced, the degree of automation of the welding process is greatly increased, and highly automated production of the whole processing flow becomes attainable.
Drawings
FIG. 1 shows the automated welding and inspection system based on three-dimensional vision: 1, welding system; 2, three-dimensional vision system; 3, industrial robot system; 4, upper computer.
FIG. 2 shows the weld inspection flow.
Detailed Description
The invention aims to use three-dimensional vision technology and robot technology to carry out the alignment, position-finding, and inspection processes of welding on a single hardware system at low cost, with high efficiency, and automatically. To this end, the following exemplary technical scheme is provided:
(I) building an automated welding and detection system based on three-dimensional vision
As shown in Fig. 1, the constructed system comprises a robot system 3 serving as the motion execution mechanism, a welding system 1, a three-dimensional vision system 2, and an upper computer 4. The robot system is an actuating mechanism for adjusting position and posture; it is a multi-axis industrial robot system comprising a robot body and a robot controller. The welding system comprises different components according to the welding process and is used to complete the entire welding process. The three-dimensional vision system is used to acquire three-dimensional feature information of the workpiece to be welded and is a high-precision 3D camera, where high precision means a measurement precision better than 1 mm. The 3D camera has a depth-map frame rate greater than 1 frame per second and is low-power, compact, and lightweight. The 3D camera and a welding actuator, such as a welding gun, are mounted together at the end of the robot; to meet the above requirements, the 3D camera is preferably a MEMS-based structured-light 3D camera. The upper computer performs feature calculation and generates the control program.
(II) calibrating the relations between the welding and three-dimensional vision systems and the robot coordinate system
Calibration of the 3D camera and the robot: with the 3D camera mounted on the robot, photograph a calibration board with known coordinate points and record the robot position and posture; keeping the calibration board still, change the robot position and posture several times and photograph the board each time; compute the optimized hand-eye transformation matrix RT by least squares.
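One way to compute the hand-eye matrix RT is OpenCV's calibrateHandEye. The sketch below is a minimal eye-in-hand example under the assumption that each shot yields the board pose in the camera frame (for instance from the 3D data or PnP) and the flange pose from the robot controller; the variable names are placeholders.

```python
# Minimal eye-in-hand calibration sketch, assuming OpenCV >= 4.1;
# robot_poses and board_poses are lists of 4x4 homogeneous matrices.
import cv2
import numpy as np

def hand_eye_calibration(robot_poses, board_poses):
    """robot_poses: flange-in-base matrices recorded at each shot.
    board_poses: board-in-camera matrices estimated from each shot."""
    R_g2b = [T[:3, :3] for T in robot_poses]
    t_g2b = [T[:3, 3].reshape(3, 1) for T in robot_poses]
    R_t2c = [T[:3, :3] for T in board_poses]
    t_t2c = [T[:3, 3].reshape(3, 1) for T in board_poses]
    R, t = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                method=cv2.CALIB_HAND_EYE_TSAI)
    RT = np.eye(4)                 # camera pose expressed in the robot flange frame
    RT[:3, :3], RT[:3, 3] = R, t.ravel()
    return RT
```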
Calibration of the welding actuator and the robot: use the robot to guide the end of the welding system (the welding gun tip) to a fixed spatial point (preferably using a fixed tip as the reference point); change the robot position and posture while keeping the spatial coordinate of the welding-system end unchanged (always aligned with the fixed tip); after repeating this several times, calculate the position of the welding-system end in the robot coordinate system.
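The fixed-point procedure above reduces to a small linear least-squares problem: for every recorded flange pose (R_i, p_i) the unknown tool offset t and the unknown fixed point q satisfy R_i t + p_i = q. A minimal NumPy sketch, assuming the poses are available from the controller as 4x4 matrices:

```python
# Minimal least-squares sketch of the fixed-point (tool center point) calibration.
import numpy as np

def calibrate_tcp(flange_poses):
    """Solve R_i * t + p_i = q for the tool offset t and the fixed point q."""
    A, b = [], []
    for T in flange_poses:
        R, p = T[:3, :3], T[:3, 3]
        A.append(np.hstack([R, -np.eye(3)]))   # unknowns stacked as [t (3), q (3)]
        b.append(-p)
    x, *_ = np.linalg.lstsq(np.vstack(A), np.hstack(b), rcond=None)
    t_tool, q_fixed = x[:3], x[3:]
    return t_tool, q_fixed
```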
(III) aligning the workpiece coordinate system and the robot coordinate system using three-dimensional vision
Photograph the workpiece with the 3D camera at the robot end to obtain a 3D point cloud from one viewing angle. Then register this point cloud with the point cloud of the three-dimensional model of the workpiece to be welded to obtain the transformation matrix between the point cloud and the workpiece's three-dimensional data, i.e., the transformation between the workpiece coordinate system and the robot coordinate system.
Here, the three-dimensional data is a three-dimensional digital model designed in advance with CAD software.
(IV) extracting welding features to generate welding parameters
First, after alignment is completed, capture point clouds at the preset photographing positions; then use the captured point clouds for feature extraction.
Feature extraction means extracting the welding feature parameters according to the specific case; the extraction methods are as follows:
1) for plate-like structures, extract their intersection lines, and the intersection points of those lines, by plane fitting (see the sketch after this list).
2) for tubular structures, fit a cylinder to find the cylinder centerline, and the intersection points and intersection lines of the centerline, the cylinder surface, and other features.
3) for free-form surfaces, calculate the geodesic at the location of maximum curvature of the surface.
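As an illustration of case 1), the sketch below fits a plane to each of two plate clouds and intersects them to obtain the seam line. It assumes the Open3D RANSAC plane fit and that the two plates have already been separated into two point clouds; the distance threshold is an illustrative value.

```python
# Minimal sketch of plate-structure feature extraction, assuming Open3D.
import numpy as np
import open3d as o3d

def fit_plane(pcd, dist=0.5):
    # RANSAC plane fit: returns [a, b, c, d] with a*x + b*y + c*z + d = 0.
    model, _ = pcd.segment_plane(distance_threshold=dist, ransac_n=3, num_iterations=1000)
    return np.asarray(model)

def plane_intersection_line(pcd_a, pcd_b):
    pa, pb = fit_plane(pcd_a), fit_plane(pcd_b)
    n1, d1 = pa[:3], pa[3]
    n2, d2 = pb[:3], pb[3]
    direction = np.cross(n1, n2)
    direction /= np.linalg.norm(direction)
    # Any point satisfying both plane equations lies on the seam line
    # (minimum-norm least-squares solution of the underdetermined system).
    point = np.linalg.lstsq(np.vstack([n1, n2]), np.array([-d1, -d2]), rcond=None)[0]
    return point, direction   # seam line as point + unit direction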
(V) welding with the robot, comprising the following steps:
1) use the feature parameters provided in step (IV) to parametrically program the welding trajectory and welding posture of the robot. This applies when the weld-seam features are relatively simple, such as line segments, arcs, and points.
In another embodiment of the present invention, the weld-seam features and normal features provided in step (IV) are used to directly calculate the trajectory and posture of the robot with a computer program, generating the robot control parameters (a sketch follows below). This scheme suits complex weld seams such as irregular space curves.
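For the simple straight-seam case, the extracted seam line and the surface normals can be turned into tool poses directly. The sketch below uses one possible convention (gun z-axis along the bisector of the two surface normals, x-axis close to the travel direction); the convention and the step size are assumptions made for illustration, not the parameterization prescribed by the invention.

```python
# Minimal sketch: waypoints for a straight seam from `start` to `end` (NumPy arrays).
import numpy as np

def seam_waypoints(start, end, n1, n2, step=5.0):
    """n1, n2: unit normals of the two joined surfaces; returns a list of 4x4 tool poses."""
    travel = end - start
    length = np.linalg.norm(travel)
    travel_dir = travel / length
    z_axis = -(n1 + n2)
    z_axis /= np.linalg.norm(z_axis)          # approach direction: bisector of the normals
    y_axis = np.cross(z_axis, travel_dir)
    y_axis /= np.linalg.norm(y_axis)
    x_axis = np.cross(y_axis, z_axis)         # orthonormal frame, x close to travel direction
    R = np.column_stack([x_axis, y_axis, z_axis])
    poses = []
    for s in np.arange(0.0, length + step, step):
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = start + min(s, length) * travel_dir
        poses.append(T)
    return poses
```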
2) use the information of the robot and its attachments (welding system and three-dimensional vision system), together with the welding trajectory and posture information, to perform an interference check and prevent collisions during welding.
3) welding: switch off the three-dimensional vision system, start the welding system, and control the robot to weld according to the robot control program obtained above; after the process is finished, switch off the welding system and switch the three-dimensional vision system back on.
(VI) detecting the welding quality
1) use the three-dimensional vision system at the robot end to capture a three-dimensional point cloud of the area welded in step (V). The photographing positions are calculated from the trajectory of step (V) and the field of view of the three-dimensional vision system, with at least 30% overlap between successive shots; the photographing posture is then calculated from the digital model so that the shooting direction is parallel to the principal normal vector of the workpiece surface at each photographing position; the point clouds of the welding positions are captured in sequence, roughly stitched using the robot poses, and globally optimized with the ICP (Iterative Closest Point) method to obtain the point clouds of all welding positions (see the sketch below).
The point cloud of the welding positions is either a single point cloud or several independent point clouds, depending on whether the welding positions are distributed continuously.
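The capture-and-stitch step can be approximated as follows. The sketch assumes Open3D point clouds in the camera frame, the recorded flange poses, and the hand-eye matrix from step (II); as a simplification of the ICP-based global optimization described above, it refines each new scan against the previously placed one with pairwise ICP.

```python
# Minimal sketch of stitching inspection scans, assuming Open3D;
# names and the ICP distance are illustrative placeholders.
import numpy as np
import open3d as o3d

def stitch_weld_scans(scans, flange_poses, T_cam2flange, icp_dist=1.0):
    """scans: point clouds in the camera frame; flange_poses: 4x4 flange-in-base matrices."""
    merged = o3d.geometry.PointCloud()
    prev = None
    for pcd, T_flange in zip(scans, flange_poses):
        # Coarse placement from robot kinematics (base <- flange <- camera).
        pcd.transform(T_flange @ T_cam2flange)
        if prev is not None:
            # Local ICP refinement against the previously placed scan.
            reg = o3d.pipelines.registration.registration_icp(
                pcd, prev, icp_dist, np.eye(4),
                o3d.pipelines.registration.TransformationEstimationPointToPoint())
            pcd.transform(reg.transformation)
        merged += pcd
        prev = pcd
    return merged
```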
2) compare the reverse-engineered point cloud containing the weld-seam information with the original point cloud to obtain the weld quality parameters. The reverse-engineered point cloud is registered with the original point cloud; then the distance D from each point of the reverse point cloud to its closest point in the original point cloud is computed; points with D below a threshold are discarded and points above the threshold are retained. The retained points form the weld-seam point cloud; first judge whether any weld is missing; then calculate, segment by segment, the average distance D_a, the maximum distance D_max, and the average coordinate P(X_a, Y_a, Z_a) of the weld-seam point cloud, and judge whether the weld quality is acceptable by comparing these parameters with empirical thresholds. When D_a and D_max both exceed their thresholds, the weld quality is unqualified; when D_a does not exceed its threshold but D_max does, a defect is indicated; when D_a exceeds its threshold but D_max does not, the weld quality is unqualified; otherwise the weld is qualified. If P(X_a, Y_a, Z_a) deviates beyond its threshold, the weld position is off and the weld fails. The flow chart is shown in Fig. 2.
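The pass/fail rules above map onto a small decision function per seam segment. In the sketch below the thresholds are placeholders for the empirical values mentioned in the text, and the expected position is assumed to come from the digital model.

```python
# Minimal sketch of the per-segment quality decision; thresholds are illustrative.
import numpy as np

def judge_segment(D_a, D_max, P, expected_P,
                  thr_Da=1.0, thr_Dmax=2.0, thr_pos=3.0):
    if D_a <= thr_Da and D_max > thr_Dmax:
        return "defect"                          # locally large deviation only
    if D_a > thr_Da:
        return "unqualified"                     # overall deviation too large
    if np.linalg.norm(np.asarray(P) - np.asarray(expected_P)) > thr_pos:
        return "unqualified (position deviation)"
    return "qualified"
```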
3) output the detection result for a computer or an operator to decide the subsequent operation.
In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims or in different embodiments does not indicate that a combination of these measures cannot be used to advantage.
It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
The features of the methods described above and below may be implemented in software and may be executed on a data processing system or other processing tool by executing computer-executable instructions. The instructions may be program code loaded into memory (e.g., RAM) from a storage medium or from another computer via a computer network. Alternatively, the described features may be implemented by hardwired circuitry instead of software, or by a combination of hardwired circuitry and software.

Claims (10)

1. An automatic welding and detection method based on three-dimensional vision, characterized by comprising the following steps:
(I) building an automatic welding and detection system based on three-dimensional vision;
(II) calibrating the relation between the welding system and the robot coordinate system and the relation between the three-dimensional vision system and the robot coordinate system;
(III) aligning the workpiece coordinate system and the robot coordinate system using three-dimensional vision;
(IV) extracting welding features to generate welding parameters;
(V) welding with a robot;
(VI) detecting the weld quality.
2. The three-dimensional-vision automated welding and inspection method of claim 1, wherein in step (I), the three-dimensional-vision-based automated welding and inspection system comprises a robot system serving as the motion execution mechanism, a welding system, a three-dimensional vision system, and an upper computer; the robot system is an actuating mechanism for adjusting position and posture, is a multi-axis industrial robot system, and comprises a robot body and a robot controller.
3. The three-dimensional-vision automated welding and inspection method of claim 1, wherein the three-dimensional vision system is used to obtain three-dimensional feature information of the workpiece to be welded and is a high-precision 3D camera, high precision meaning that the measurement precision is better than 1 mm; the 3D camera has a depth-map frame rate greater than 1 frame per second; the 3D camera is low-power, compact, and lightweight; and the 3D camera and the welding actuator are mounted together at the end of the robot.
4. The three-dimensional-vision automated welding and inspection method of claim 1, wherein the 3D camera is preferably a MEMS-based structured-light 3D camera satisfying the above characteristics; and the upper computer performs feature calculation and generates the control program.
5. The three-dimensional-vision automated welding and inspection method of claim 1, wherein in step (II), the coordinate transformation between the robot and the welding system and the coordinate transformation between the robot and the three-dimensional vision system are calibrated separately, with no required order between the two calibrations, so as to unify the coordinate systems of the welding system and the three-dimensional vision system into the robot coordinate system.
6. The three-dimensional-vision automated welding and inspection method of claim 1, wherein step (III) comprises the following steps:
1) using the robot system to point the 3D camera at the area where the workpiece is located, at a distance within the camera's working range; 2) capturing a point cloud of the workpiece with the 3D camera; 3) registering the captured point cloud with the point cloud of the digital three-dimensional model using a feature-based registration method; 4) calculating the transformation between the workpiece coordinate system and the robot coordinate system.
7. The three-dimensional-vision automated welding and inspection method of claim 1, wherein
sources of the digital three-dimensional model include, but are not limited to: reverse scanning, stitching, and point-cloud fusion using the 3D camera at the robot end; modeling with three-dimensional CAD software; and converting an existing model; the designed CAD three-dimensional digital model is preferred when the workpiece conforms well to the original CAD design, otherwise the reverse-engineered digital model is preferred.
8. The three-dimensional-vision automated welding and inspection method of claim 1, wherein in step (IV), after the alignment of step (III) is completed, point clouds are captured in sequence at predefined photographing positions, and welding features are extracted from the point clouds; the welding features include at least one of: the trajectory, width, start point, and end point of the weld seam; the radius and center of an arc; the intersection line of planes; the intersection line of a curved surface and a plane; and the intersection line of two curved surfaces.
9. The three-dimensional-vision automated welding and inspection method of claim 1, wherein step (V) comprises the following steps:
1) using the feature parameters provided in step (IV) to parametrically program the welding trajectory and welding posture of the robot;
or, in another embodiment, using the weld-seam features and normal features provided in step (IV) to directly calculate the trajectory and posture of the robot with a computer program, generating the robot control parameters;
2) performing an interference check using the information of the robot and its attachments (welding system and three-dimensional vision system) together with the welding trajectory and posture information, so as to prevent collisions during welding;
3) welding: switching off the three-dimensional vision system, starting the welding system, controlling the robot to weld according to the robot control program obtained above, switching off the welding system after the process is finished, and switching the three-dimensional vision system back on.
10. The three-dimensional-vision automated welding and inspection method of claim 1, wherein
step (VI) comprises the following sub-steps:
1) capturing a three-dimensional point cloud of the area welded in step (V) with the three-dimensional vision system at the robot end; the photographing positions are calculated from the trajectory of step (V) and the field of view of the three-dimensional vision system, with at least 30% overlap between successive shots; the photographing posture is then calculated from the digital model so that the shooting direction is parallel to the principal normal vector of the workpiece surface at each photographing position; the point clouds of the welding positions are captured in sequence, roughly stitched using the robot poses, and globally optimized with the ICP (Iterative Closest Point) method to obtain the point clouds of all welding positions;
the point cloud of the welding positions is either a single point cloud or several independent point clouds, depending on whether the welding positions are distributed continuously;
2) comparing the reverse-engineered point cloud containing the weld-seam information with the point cloud of the original digital three-dimensional model to obtain the weld quality parameters; the reverse-engineered point cloud is registered with the original point cloud; then the distance D from each point of the reverse point cloud to its closest point in the original point cloud is computed; points with D below a threshold are discarded and points above the threshold are retained; the retained points form the weld-seam point cloud, whose average distance D_a, maximum distance D_max, and average coordinate P(X_a, Y_a, Z_a) are calculated segment by segment; these parameters are compared against empirical thresholds to judge whether the weld quality is acceptable and whether any weld is missing;
3) outputting the detection result for a computer or an operator to decide the subsequent operation.
CN202110171644.7A 2021-02-08 2021-02-08 Automatic welding and detection method based on three-dimensional vision Pending CN112958959A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110171644.7A CN112958959A (en) 2021-02-08 2021-02-08 Automatic welding and detection method based on three-dimensional vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110171644.7A CN112958959A (en) 2021-02-08 2021-02-08 Automatic welding and detection method based on three-dimensional vision

Publications (1)

Publication Number Publication Date
CN112958959A true CN112958959A (en) 2021-06-15

Family

ID=76275376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110171644.7A Pending CN112958959A (en) 2021-02-08 2021-02-08 Automatic welding and detection method based on three-dimensional vision

Country Status (1)

Country Link
CN (1) CN112958959A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0784631A (en) * 1993-09-14 1995-03-31 Fanuc Ltd Method for correcting robot teaching program
CN103231162A (en) * 2013-04-17 2013-08-07 柳州市自动化科学研究所 Device and method for visual detection of welding quality of robot
CN104400279A (en) * 2014-10-11 2015-03-11 南京航空航天大学 CCD-based method and system for automatic identification and track planning of pipeline space weld seams
CN104690422A (en) * 2015-01-11 2015-06-10 沈阳汇能机器人自动化有限公司 Robot laser welding visual inspection programmed control system and implementing method thereof
CN108453439A (en) * 2018-03-14 2018-08-28 清华大学天津高端装备研究院洛阳先进制造产业研发基地 The robot welding track self-programming system and method for view-based access control model sensing
US20200269340A1 (en) * 2018-07-25 2020-08-27 Tonggao Advanced Manufacturing Technology Co., Ltd. Active Laser Vision Robust Weld Tracking System and Weld Position Detection Method
CN110227876A (en) * 2019-07-15 2019-09-13 西华大学 Robot welding autonomous path planning method based on 3D point cloud data
CN110524581A (en) * 2019-09-16 2019-12-03 西安中科光电精密工程有限公司 A kind of flexible welding robot system and its welding method
CN112223294A (en) * 2020-10-22 2021-01-15 湖南大学 Mechanical arm machining track correction method based on three-dimensional vision

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113533498A (en) * 2021-07-26 2021-10-22 成都盛锴科技有限公司 Welding seam detection positioning method and positioning device of automatic eddy current flaw detection system
CN113681133A (en) * 2021-08-30 2021-11-23 南京衍构科技有限公司 Intelligent welding method of redundant degree of freedom robot with vision
CN113681133B (en) * 2021-08-30 2022-07-08 南京衍构科技有限公司 Intelligent welding method of redundant degree of freedom robot with vision
CN113681119A (en) * 2021-09-13 2021-11-23 上海柏楚电子科技股份有限公司 Data processing method and device for welding seam detection, and welding control method and device
CN113681119B (en) * 2021-09-13 2023-08-15 上海柏楚电子科技股份有限公司 Data processing method and device for weld detection and welding control method and device
CN113780900A (en) * 2021-11-09 2021-12-10 深圳市裕展精密科技有限公司 Welding detection system and method based on edge calculation
CN114324351A (en) * 2021-12-17 2022-04-12 四川阳光坚端铝业有限公司 Aluminum alloy welding quality evaluation method
US11951575B2 (en) 2022-04-08 2024-04-09 Xi'an Chishine Optoelectronics Technology Co., Ltd Automatic welding system and method for large structural parts based on hybrid robots and 3D vision
CN114434059A (en) * 2022-04-08 2022-05-06 西安知象光电科技有限公司 Automatic welding system and method for large structural part with combined robot and three-dimensional vision
CN114434059B (en) * 2022-04-08 2022-07-01 西安知象光电科技有限公司 Automatic welding system and method for large structural part with combined robot and three-dimensional vision
WO2023193362A1 (en) * 2022-04-08 2023-10-12 西安知象光电科技有限公司 Hybrid robot and three-dimensional vision based large-scale structural part automatic welding system and method
CN114841959A (en) * 2022-05-05 2022-08-02 广州东焊智能装备有限公司 Automatic welding method and system based on computer vision
CN114749848A (en) * 2022-05-31 2022-07-15 深圳了然视觉科技有限公司 Steel bar welding automatic system based on 3D vision guide
CN114749848B (en) * 2022-05-31 2024-08-23 深圳了然视觉科技有限公司 Automatic steel bar welding system based on 3D visual guidance
CN117047237B (en) * 2023-10-11 2024-01-19 太原科技大学 Intelligent flexible welding system and method for special-shaped parts
CN117047237A (en) * 2023-10-11 2023-11-14 太原科技大学 Intelligent flexible welding system and method for special-shaped parts
CN118196109A (en) * 2024-05-20 2024-06-14 法奥意威(苏州)机器人系统有限公司 Arc-shaped weld joint identification method and device
CN118196109B (en) * 2024-05-20 2024-09-27 法奥意威(苏州)机器人系统有限公司 Arc-shaped weld joint identification method and device

Similar Documents

Publication Publication Date Title
CN112958959A (en) Automatic welding and detection method based on three-dimensional vision
CN112847353B (en) Multi-segment welding seam track correction method based on offline programming software
CN109903279B (en) Automatic teaching method and device for welding seam movement track
CN114289934B (en) Automatic welding system and method for large structural part based on three-dimensional vision
CN110227876A (en) Robot welding autonomous path planning method based on 3D point cloud data
CN114434059B (en) Automatic welding system and method for large structural part with combined robot and three-dimensional vision
CN114515924B (en) Automatic welding system and method for tower foot workpiece based on weld joint identification
CN110450150B (en) Trajectory tracking control method and trajectory tracking system
US20240075629A1 (en) Autonomous welding robots
CN112620926B (en) Welding spot tracking method and device and storage medium
CN117047237B (en) Intelligent flexible welding system and method for special-shaped parts
CN114161048A (en) Iron tower foot parametric welding method and device based on 3D vision
CN113223071B (en) Workpiece weld joint positioning method based on point cloud reconstruction
CN111496344A (en) V-shaped groove information processing method based on laser sensor
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN117885096B (en) Method and device for controlling welding operation of robot tail end welding gun
CN114888501A (en) Teaching-free programming building component welding device and method based on three-dimensional reconstruction
CN113385869A (en) Robot welding equipment for large square lattice component based on machine vision and welding seam positioning method
CN117300464A (en) Intersecting line weld detection and track optimization system and method based on structured light camera
CN110456729B (en) Trajectory tracking control method and trajectory tracking system
CN114800574B (en) Robot automatic welding system and method based on double three-dimensional cameras
CN116330279A (en) Welding robot parameterized programming method and system based on machine vision and neural network
Wang et al. A path correction method based on global and local matching for robotic autonomous systems
CN116175035B (en) Intelligent welding method for steel structure high-altitude welding robot based on deep learning
CN117532612A (en) Equipment for realizing automatic planning of tail end path of industrial robot by adopting various structured light cameras

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210615