CN114663360A - Welding deformation measuring method based on vision measurement - Google Patents

Welding deformation measuring method based on vision measurement

Info

Publication number
CN114663360A
Authority
CN
China
Prior art keywords
image
welding
workpiece
camera
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210199149.1A
Other languages
Chinese (zh)
Inventor
张白
张巍巍
石明全
陈伟
焦海波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anxinjie Intelligent Robot Ningxia Co ltd
Original Assignee
Anxinjie Intelligent Robot Ningxia Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anxinjie Intelligent Robot Ningxia Co ltd
Priority to CN202210199149.1A
Publication of CN114663360A
Legal status: Pending


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a welding deformation measuring method based on vision measurement, which comprises the following steps: S1, before welding, photographing the spliced but undeformed workpiece with a camera to obtain a first image; S2, after welding, photographing the workpiece at the same position with the camera to obtain a second image; S3, extracting boundary feature points in the first image with a boundary feature point extraction algorithm; and S4, using the same boundary feature point extraction algorithm, taking the boundary feature points of the first image as reference points, searching for boundary feature points in the second image within a set search radius, comparing the pixel distance between each reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation. Because the workpiece is photographed at the same position before and after welding, the welding deformation can be determined by comparative analysis, with high accuracy and efficiency.

Description

Welding deformation measuring method based on vision measurement
Technical Field
The invention relates to the technical field of welding, in particular to a welding deformation measuring method based on visual measurement.
Background
Welding is an accurate, reliable and low-cost method for joining materials. It is widely used in the automotive, shipbuilding, aerospace and mining machinery fields, is one of the key manufacturing processes and is irreplaceable. The change in shape and size of a welded workpiece caused by the uneven temperature field during welding is called welding deformation. In all fusion welding, large residual stresses exist in the weld seam and its heat affected zone; these residual stresses can deform and crack the welded component and reduce its load-bearing capacity. At the same time, stress concentrations caused by pits, excess weld reinforcement and undercut exist at the weld toe, and slag inclusions and microcracks at the weld toe form sites where cracks initiate early. In the prior art, the deformation of the workpiece is usually checked manually after welding is finished, and manual correction is needed when the deformation is too large. In summary, manual inspection of the welded workpiece is not sufficiently accurate.
Disclosure of Invention
The invention aims to provide a welding deformation measuring method based on visual measurement, which can improve the accuracy of measurement.
In order to achieve the above object, the embodiments of the present invention provide the following technical solutions:
the welding deformation measuring method based on the vision measurement comprises the following steps:
S1, before welding, photographing the spliced but undeformed workpiece with a camera to obtain a first image;
S2, after welding, photographing the welded workpiece at the same position with the camera to obtain a second image;
S3, extracting boundary feature points in the first image with a boundary feature point extraction algorithm;
and S4, using the same boundary feature point extraction algorithm, taking each boundary feature point of the first image as a reference point for the search, searching for boundary feature points in the second image within a set search radius, comparing the pixel distance between the reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
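For illustration, the comparison described in steps S3 and S4 can be sketched as follows in Python with NumPy. The sketch assumes the boundary feature points of both images are already available as pixel coordinates, and it interprets the searched boundary feature point as the nearest candidate within the search radius; the function name, parameters and that matching rule are illustrative assumptions, not part of the claimed method.

import numpy as np

def measure_deformation(ref_points, post_points, search_radius):
    """Sketch of step S4: for each boundary feature point of the first image,
    find the nearest boundary feature point of the second image within the
    search radius and report the largest such pixel distance."""
    ref = np.asarray(ref_points, dtype=float)    # (N, 2) pixel coordinates, first image
    post = np.asarray(post_points, dtype=float)  # (M, 2) pixel coordinates, second image
    max_offset = 0.0
    for p in ref:
        d = np.linalg.norm(post - p, axis=1)     # distances to every candidate point
        within = d[d <= search_radius]           # candidates inside the search radius
        if within.size:                          # assumption: unmatched reference points are skipped
            max_offset = max(max_offset, float(within.min()))
    return max_offset                            # maximum pixel distance, taken as the welding deformation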
In a further optimized scheme, in step S1, the workpiece is placed in the operation space of the welding robot, the welding robot exits the photographing space of the camera, and the camera photographs the workpiece to obtain the first image.
In a further optimized scheme, in step S2, the welding robot welds the workpiece; after welding is completed, the welding robot exits the photographing space of the camera, and the camera photographs the welded workpiece at the same position to obtain the second image.
In the above schemes, the welding robot exits the photographing space of the camera before each picture is taken, which avoids interference from the welding robot in the image, allows the boundary contour to be extracted more accurately, and thus improves the measurement accuracy of the welding deformation.
In a further optimized scheme, the boundary feature point extraction algorithm adopts a Hough line detection algorithm, and the intersection points of two or more Hough lines are used as boundary feature points.
In a further optimized scheme, in step S1 the spliced but undeformed workpiece is assembled by bonding with metal glue or by spot welding. Assembling the parts by glue bonding or spot welding simulates the post-welding state: the assembly process does not deform the workpiece, the connection is firm, and the simulated state is therefore close to that of the welded workpiece.
In a further optimized scheme, in step S3 the pose between the camera and the workpiece is obtained from the first image, the boundary feature points to be compared are extracted from the three-dimensional model of the workpiece, these points are transformed into the image coordinate system with the camera calibration matrix, and the transformed points are taken as the boundary feature points extracted from the first image.
The boundary feature points in the first image may be extracted directly from the image or indirectly. In this scheme the boundary feature points are extracted from the three-dimensional model and then transformed into the first image, that is, the boundary feature points of the first image are determined indirectly; this reduces the errors of image-based extraction and improves the accuracy of the boundary feature points.
In a further optimized scheme, the step of obtaining the pose between the camera and the workpiece from the first image comprises: placing a standard ball at a designated position on the workpiece before welding, photographing the workpiece before welding, and obtaining the pose between the camera and the workpiece with the standard ball as reference.
The above method applies when the workpiece is photographed at the same position before and after welding. The embodiments of the invention also provide a method for the case where the positions before and after welding are not the same. Specifically, this welding deformation measuring method based on visual measurement comprises the following steps:
S1, before welding, defining the position of the workpiece at this moment as P1, and photographing the spliced but undeformed workpiece with a camera to obtain a first image;
S2, placing the first independent part of the workpiece at the same position P1 and photographing it with the camera to obtain a second image;
S3, placing the part for welding, defining the position of the workpiece at this moment as P2, photographing the first independent part of the workpiece before welding with the camera to obtain a third image, extracting boundary feature points in the second image with a boundary feature point extraction algorithm, and calculating the transformation matrix between the second image and the third image from the coordinates of the same boundary feature points in the second image and in the third image;
S4, after welding is finished at position P2, photographing the workpiece with the camera to obtain a fourth image;
S5, extracting boundary feature points in the first image with the same boundary feature point extraction algorithm and transforming them, with the transformation matrix, into the camera space corresponding to position P2;
and S6, using the same boundary feature point extraction algorithm, taking the transformed boundary feature points of the first image as reference points for the search, searching for boundary feature points in the fourth image within a set search radius, comparing the pixel distance between each reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
In a further optimized scheme, after the boundary feature point extraction algorithm has extracted the boundary feature points in the first image, they are confirmed manually.
Compared with the prior art, the invention has the following beneficial effects: the workpiece is photographed before and after welding and the welding deformation is determined by comparative analysis; the accuracy is high, and the efficiency is higher than that of manual inspection.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
Fig. 1 is a flowchart of a welding deformation measuring method based on visual measurement in example 1.
Fig. 2 is a flowchart of the welding deformation measuring method based on visual measurement in example 2.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The devices of the embodiments of the invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
Referring to fig. 1, the present embodiment provides a welding deformation measuring method based on visual measurement, including the following steps:
S1, before welding, photographing the spliced but undeformed workpiece with a camera to obtain a first image.
In this step, the spliced but undeformed state means that the components making up the complete workpiece are assembled in a way that does not deform them, so as to simulate the post-welding state. As a preferred embodiment, bonding with metal glue or spot welding can be used: neither causes deformation, and both keep the assembly stable.
When acquiring the image in this step, the workpiece is placed in the operation space of the welding robot; to avoid interference from the welding robot, the robot first exits the photographing space of the camera, and the camera then photographs the workpiece to obtain the first image. The first image is the image of the workpiece simulating the welded state without deformation and serves as the reference image.
S2, after welding, photographing the welded workpiece at the same position with the camera to obtain a second image.
The welding robot welds the workpiece. When the workpiece consists of several parts, they are welded in sequence: for example, parts 1 and 2 are welded first, and part 3 is then welded onto part 2, and so on until all parts are welded.
In this step, after the welding robot finishes welding the workpiece, it exits the photographing space of the camera, and the camera photographs the welded workpiece at the same position to obtain the second image. The second image is the image of the welded workpiece.
The same position here means that the camera keeps the same pose and photographs the same area of the workpiece, which gives higher accuracy.
S3, extracting boundary feature points in the first image.
To improve the accuracy of the boundary feature point extraction result, in a more optimized scheme the position of the workpiece is located before the image is acquired in step S1. Specifically, a standard ball is placed at a designated position on the workpiece, the workpiece is photographed before welding, and the pose between the camera and the workpiece is obtained with the standard ball as reference. The boundary feature points to be compared are then extracted from the three-dimensional model of the workpiece, transformed into the image coordinate system with the camera calibration matrix, and taken as the boundary feature points extracted from the first image.
In this step, a Hough line detection algorithm is used to extract the boundary feature points, and the intersection points of two or more Hough lines are taken as boundary feature points.
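One possible realization of this step, sketched in Python with OpenCV, detects Hough lines on an edge image and collects the pairwise intersections lying inside the image as candidate boundary feature points; the edge and accumulator thresholds and the function name are illustrative assumptions.

import cv2
import numpy as np

def hough_boundary_points(gray, canny_lo=50, canny_hi=150, hough_thresh=120):
    """Sketch: boundary feature points as intersections of detected Hough lines."""
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, hough_thresh)  # (rho, theta) pairs
    points = []
    if lines is None:
        return points
    lines = lines[:, 0, :]
    h, w = gray.shape[:2]
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            r1, t1 = lines[i]
            r2, t2 = lines[j]
            A = np.array([[np.cos(t1), np.sin(t1)],
                          [np.cos(t2), np.sin(t2)]])
            if abs(np.linalg.det(A)) < 1e-6:     # nearly parallel lines have no usable intersection
                continue
            x, y = np.linalg.solve(A, np.array([r1, r2]))
            if 0 <= x < w and 0 <= y < h:        # keep only intersections inside the image
                points.append((float(x), float(y)))
    return points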
Because the distance between the platform on which the workpiece rests and the camera is known, the boundary feature points in the three-dimensional model of the workpiece that correspond to the boundary of the first image are transformed with the camera calibration matrix to obtain the boundary feature points of the first image.
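The projection of model boundary points into the first image can be sketched as follows, assuming the camera intrinsic matrix K, the distortion coefficients and the pose (rvec, tvec) obtained from the standard-ball reference are already known; cv2.projectPoints is used here as one common way to apply the calibration, and all names are illustrative.

import cv2
import numpy as np

def project_model_points(model_points_3d, rvec, tvec, K, dist_coeffs=None):
    """Sketch: map boundary feature points of the workpiece's 3D model into
    first-image pixel coordinates using the calibrated camera."""
    pts = np.asarray(model_points_3d, dtype=np.float32).reshape(-1, 1, 3)
    dist = np.zeros(5) if dist_coeffs is None else dist_coeffs  # assume no distortion if none given
    img_pts, _ = cv2.projectPoints(pts, rvec, tvec, K, dist)
    return img_pts.reshape(-1, 2)               # (N, 2) boundary feature points in the first image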
It is easy to understand that there is no fixed order between steps S2 and S3; step S3 may be executed directly after step S1.
S4, using the same boundary feature point extraction algorithm, taking each boundary feature point of the first image as a reference point for the search, searching for boundary feature points in the second image within a set search radius, comparing the pixel distance between the reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
This measuring method compares the outer-contour feature points of the images taken before and after deformation under the same camera pose, so the welding deformation can be measured quickly; the measurement accuracy is high, and the efficiency is higher than that of manual inspection.
Example 2
Referring to fig. 2, the present embodiment provides another implementation of the welding deformation measuring method based on visual measurement, and the method is more flexible and suitable for more scenes. Specifically, the method comprises the following steps:
S1, before welding, defining the position of the workpiece at this moment as P1, and photographing the spliced but undeformed workpiece with a camera to obtain a first image.
As in step S1 of example 1, the parts may be assembled by bonding with metal glue or by spot welding; the welded state is simulated to obtain a reference image, i.e. the first image.
S2, placing the first independent part of the workpiece at the same position P1 and photographing it with the camera to obtain a second image.
It should be noted that the camera keeps the same pose as in step S1 when taking this picture, so the camera's field of view contains not only the part but also the surrounding environment.
In addition, the workpiece is formed by welding several parts together, and each part is initially independent. During welding the parts are joined one after another: for example, part 1 is welded to part 2, which is then welded to part 3, until all parts have been welded in sequence. The first independent part is therefore part 1, i.e. the first part that has to be placed when welding is prepared.
The essence of this step is also to obtain a reference image, which allows a more accurate coordinate transformation. It is easy to understand that the execution order of steps S1 and S2 is not fixed.
S3, placing the part for welding and defining the position of the workpiece at this moment as P2; photographing the first independent part of the workpiece before welding with the camera to obtain a third image; extracting boundary feature points in the second image with a boundary feature point extraction algorithm; and calculating the transformation matrix between the second image and the third image from the coordinates of the same boundary feature points in the second image and in the third image.
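The transformation-matrix computation of step S3 can be sketched as below, assuming the matched boundary feature points of the second and third images are supplied in corresponding order. The patent does not fix the transform model, so a RANSAC homography (which needs at least four correspondences) is used here as one plausible choice.

import cv2
import numpy as np

def estimate_transform(pts_second, pts_third):
    """Sketch of step S3: transformation mapping boundary feature points of the
    second image (part at P1) onto the same points in the third image (part at P2)."""
    src = np.asarray(pts_second, dtype=np.float32).reshape(-1, 1, 2)
    dst = np.asarray(pts_third, dtype=np.float32).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)  # 3 px reprojection tolerance
    return H                                                     # 3x3 transformation matrix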
In this step the workpiece is placed freely: its position P2 may be the same as P1 or different from it. This makes the method more widely applicable, because the positions before and after welding no longer have to be identical.
It should be noted that although the workpiece may be placed arbitrarily for welding, in order to reduce the computational difficulty and improve accuracy it is preferable that positions P1 and P2 coincide along one side, that is, the workpiece is placed against one reference positioning side.
And S4, after the workpiece is welded at the position P2, the welded workpiece is photographed by a camera to obtain a fourth image.
S5, extracting boundary feature points in the first image with the same boundary feature point extraction algorithm, and transforming them, with the transformation matrix, into the camera space corresponding to position P2.
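Step S5 can be sketched as follows, assuming the transformation matrix H from step S3 is a 3x3 homography as in the previous sketch; the transformed points then serve as the reference points for the radius search of step S6, which proceeds as in the comparison sketch given after the disclosure steps.

import cv2
import numpy as np

def transform_reference_points(first_image_points, H):
    """Sketch of step S5: move the boundary feature points of the first image
    (position P1) into the camera space of position P2 using the matrix H."""
    pts = np.asarray(first_image_points, dtype=np.float32).reshape(-1, 1, 2)
    warped = cv2.perspectiveTransform(pts, H)   # apply the 3x3 transform to the 2D points
    return warped.reshape(-1, 2)                # reference points expressed in the P2 image frame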
S6, using the same boundary feature point extraction algorithm, taking the transformed boundary feature points of the first image as reference points for the search, searching for boundary feature points in the fourth image within a set search radius, comparing the pixel distance between each reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
Likewise, before the camera takes a picture, the welding robot exits the photographing space of the camera and the camera then photographs the workpiece, so that occlusion by the welding robot is avoided.
The boundary feature point extraction algorithm can likewise be a Hough line detection algorithm, with the intersection points of two or more Hough lines taken as boundary feature points.
To improve the accuracy of the selected boundary feature points, the boundary feature points extracted from the first image by the extraction algorithm are confirmed manually.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. The welding deformation measuring method based on the visual measurement is characterized by comprising the following steps of:
S1, before welding, photographing the spliced but undeformed workpiece with a camera to obtain a first image;
S2, after welding, photographing the welded workpiece at the same position with the camera to obtain a second image;
S3, extracting boundary feature points in the first image with a boundary feature point extraction algorithm;
and S4, using the same boundary feature point extraction algorithm, taking each boundary feature point of the first image as a reference point for the search, searching for boundary feature points in the second image within a set search radius, comparing the pixel distance between the reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
2. The method for measuring welding deformation based on visual measurement as claimed in claim 1, wherein in step S1, the workpiece is placed in the working space of the welding robot, the welding robot exits the photographing space of the camera, and the camera photographs the workpiece to obtain the first image.
3. The method for measuring welding deformation based on visual measurement as claimed in claim 1, wherein in step S2, after the welding robot completes welding the workpiece, the welding robot exits the photographing space of the camera, and the camera photographs the welded workpiece at the same position to obtain the second image.
4. The welding deformation measuring method based on vision measurement as claimed in claim 1, wherein the boundary feature point extraction algorithm adopts a Hough line detection algorithm, and the intersection points of two or more Hough lines are used as boundary feature points.
5. The welding deformation measuring method based on the visual measurement is characterized by comprising the following steps of:
S1, before welding, defining the position of the workpiece at this moment as P1, and photographing the spliced but undeformed workpiece with a camera to obtain a first image;
S2, placing the first independent part of the workpiece at the same position P1 and photographing it with the camera to obtain a second image;
S3, placing the part for welding, defining the position of the workpiece at this moment as P2, photographing the first independent part of the workpiece before welding with the camera to obtain a third image, extracting boundary feature points in the second image with a boundary feature point extraction algorithm, and calculating the transformation matrix between the second image and the third image from the coordinates of the same boundary feature points in the second image and in the third image;
S4, after welding is finished at position P2, photographing the welded workpiece with the camera to obtain a fourth image;
S5, extracting boundary feature points in the first image with the same boundary feature point extraction algorithm and transforming them, with the transformation matrix, into the camera space corresponding to position P2;
and S6, using the same boundary feature point extraction algorithm, taking the transformed boundary feature points of the first image as reference points for the search, searching for boundary feature points in the fourth image within a set search radius, comparing the pixel distance between each reference point and the boundary feature point found for it, and taking the maximum pixel distance as the welding deformation.
6. The method for measuring welding deformation based on visual measurement as claimed in claim 5, wherein in step S1, the workpiece is placed in the working space of the welding robot, the welding robot exits the photographing space of the camera, and the camera photographs the workpiece to obtain the first image.
7. The method for measuring welding deformation based on visual measurement of claim 5, wherein in step S4, after the welding robot completes the welding of the workpiece, the welding robot exits the photographing space of the camera, and the camera photographs the welded workpiece to obtain the fourth image.
8. The welding deformation measuring method based on vision measurement as claimed in claim 5, wherein the boundary feature point extraction algorithm adopts a Hough line detection algorithm, and the intersection points of two or more Hough lines are used as boundary feature points.
9. The method of claim 5, wherein the workpiece position P1 coincides with one side of P2.
10. The method for measuring welding deformation based on visual measurement as claimed in claim 5, wherein the boundary feature points extracted from the first image by the boundary feature point extraction algorithm are confirmed manually.
CN202210199149.1A 2022-03-02 2022-03-02 Welding deformation measuring method based on vision measurement Pending CN114663360A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210199149.1A CN114663360A (en) 2022-03-02 2022-03-02 Welding deformation measuring method based on vision measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210199149.1A CN114663360A (en) 2022-03-02 2022-03-02 Welding deformation measuring method based on vision measurement

Publications (1)

Publication Number Publication Date
CN114663360A 2022-06-24

Family

ID=82027730

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210199149.1A Pending CN114663360A (en) 2022-03-02 2022-03-02 Welding deformation measuring method based on vision measurement

Country Status (1)

Country Link
CN (1) CN114663360A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116441775A (en) * 2023-06-02 2023-07-18 安徽工布智造工业科技有限公司 Assembly method for correcting welding thermal deformation of H-shaped steel based on 3D vision
CN116441775B (en) * 2023-06-02 2023-10-20 安徽工布智造工业科技有限公司 Assembly method for correcting welding thermal deformation of H-shaped steel based on 3D vision

Similar Documents

Publication Publication Date Title
CN110524581B (en) Flexible welding robot system and welding method thereof
CN110530877B (en) Welding appearance quality detection robot and detection method thereof
CN107214703B (en) Robot self-calibration method based on vision-assisted positioning
KR102056664B1 (en) Method for work using the sensor and system for performing thereof
WO2011140646A1 (en) Method and system for generating instructions for an automated machine
CN112958959A (en) Automatic welding and detection method based on three-dimensional vision
CN112648934B (en) Automatic elbow geometric form detection method
CN113146172A (en) Multi-vision-based detection and assembly system and method
CN114663360A (en) Welding deformation measuring method based on vision measurement
CN105328304A (en) Welding seam starting point automatic position searching method based on statistics
CN114289934A (en) Three-dimensional vision-based automatic welding system and method for large structural part
CN114473309A (en) Welding position identification method for automatic welding system and automatic welding system
CN106093070A (en) A kind of detection method of line source scanning weld seam
CN114654465A (en) Welding seam tracking and extracting method based on line laser structure optical vision sensing
JP2010025615A (en) Automatic inspection system of spot welding
CN115415694A (en) Welding method, system and device for sheet metal process
CN109483545B (en) Weld joint reconstruction method, intelligent robot welding method and system
CN107451991A (en) A kind of vertical position welding welding track computational methods and system
CN113269729B (en) Assembly body multi-view detection method and system based on depth image contrast
CN114581368A (en) Bar welding method and device based on binocular vision
CN116596883A (en) Metal structural part weld joint identification method, system and equipment based on machine vision
CN116542914A (en) Weld joint extraction and fitting method based on 3D point cloud
CN116912165A (en) Aluminum alloy sheet welding defect detection method based on improved YOLOv5
CN115770988A (en) Intelligent welding robot teaching method based on point cloud environment understanding
CN106447781A (en) Minkowski based and automatic installation oriented collision detection method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination