CN117047354A - Welding system and welding equipment - Google Patents

Welding system and welding equipment

Info

Publication number
CN117047354A
Authority
CN
China
Prior art keywords
welding
image
welded
pixel
gray value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311212940.2A
Other languages
Chinese (zh)
Inventor
艾意其
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Industrial Edge Intelligent Innovation Center Co ltd
Original Assignee
Guangdong Industrial Edge Intelligent Innovation Center Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Industrial Edge Intelligent Innovation Center Co ltd filed Critical Guangdong Industrial Edge Intelligent Innovation Center Co ltd
Priority to CN202311212940.2A
Publication of CN117047354A
Pending legal-status Current

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B23: MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K: SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 37/00: Auxiliary devices or processes, not specially adapted to a procedure covered by only one of the preceding main groups
    • B23K 37/02: Carriages for supporting the welding or cutting element
    • B23K 37/0258: Electric supply or control circuits therefor

Landscapes

  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the present application relate to the technical field of welding and disclose a welding system and welding equipment. The welding system comprises: a welding platform for placing a workpiece to be welded; a light source arranged obliquely toward the welding platform, for obliquely illuminating the surface of the workpiece on the platform so that a shadow forms at the weld seam on that surface; a photographing device for photographing and imaging the surface of the workpiece illuminated by the light source; a welding device for welding the surface of the workpiece on the platform; and a controller configured to perform the following steps: acquiring a first image, captured by the photographing device, of the surface of the workpiece; identifying at least a portion of the shadow region in the first image as a welding route; and controlling the welding device to move along the welding route to weld the weld seam on the surface. In this manner, the embodiments of the present application achieve high-precision automated welding.

Description

Welding system and welding equipment
Technical Field
Embodiments of the present application relate to the technical field of welding, and in particular to a welding system and welding equipment.
Background
At present, most welding work is performed manually. In welding environments that are hazardous to operators, a welding robot is generally required to replace manual labor to complete the welding operation; however, due to limitations of the control system, the welding precision of such robots cannot meet industrial requirements.
Therefore, how to provide a high-precision automatic welding system is a technical problem to be solved.
Disclosure of Invention
In view of the foregoing, embodiments of the present application provide a welding system for realizing high-precision automated welding.
According to one aspect of the embodiments of the present application, there is provided a welding system comprising: a welding platform for placing a workpiece to be welded; a light source arranged obliquely toward the welding platform, for obliquely illuminating the surface of the workpiece on the platform so that a shadow forms at the weld seam on that surface; a photographing device for photographing and imaging the surface of the workpiece illuminated by the light source; a welding device for welding the surface of the workpiece on the platform; and a controller configured to perform the following steps: acquiring a first image, captured by the photographing device, of the surface of the workpiece; identifying at least a portion of the shadow region in the first image as a welding route; and controlling the welding device to move along the welding route to weld the weld seam on the surface.
In an alternative manner, the photographing device is further configured to photograph and image the surface of the welding platform on which the workpiece is placed; before performing the step of controlling the welding device to move along the welding route to weld the weld seam, the controller is further configured to perform the following steps: establishing a first coordinate system in the first image, and determining first coordinates of the welding route relative to the surface of the workpiece in the first image; acquiring a second image, captured by the photographing device, of the surface of the welding platform on which the workpiece is placed; establishing a second coordinate system in the second image, and determining second coordinates of the surface of the workpiece relative to the surface of the welding platform in the second image; determining third coordinates of the welding route relative to the surface of the welding platform based on the first and second coordinates; determining the actual coordinates of the welding route on the welding platform based on the relation between the third coordinates and the actual coordinates of the welding platform; and controlling the welding device to move to a starting point among the actual coordinates of the welding route on the welding platform.
In an alternative manner, the step, performed by the controller, of identifying at least a portion of the shadow region in the first image as the welding route comprises: converting the first image into a grayscale image; determining pixels whose gray value in the grayscale image is greater than a preset gray value as pixels corresponding to the surface shadow of the workpiece; for each such shadow pixel, taking, among the circles centred on that pixel and tangent to a boundary of the grayscale image, the circle with the smallest radius as the inscribed circle corresponding to that pixel; taking the interior of the inscribed circle with the largest radius among all these inscribed circles as the target region; and identifying at least a portion of the shadow region within the target region as the welding route.
In an alternative manner, the step, performed by the controller, of identifying at least a portion of the shadow region in the target region as the welding route comprises: performing brightness extraction on the gray value of each pixel in the target region to obtain the brightness feature intensity of each pixel; replacing the gray value of each pixel with its brightness feature intensity to form a response image; dividing the response image into a plurality of local images; encoding, for each local image, the magnitude relation between the brightness feature intensity of every pixel and that of the central pixel to obtain a first encoded feature vector of the local image, wherein the brightness feature intensity of the central pixel serves as the comparison intensity, pixels whose brightness feature intensity is greater than the comparison intensity are encoded as one class, and pixels whose brightness feature intensity is less than or equal to the comparison intensity are encoded as the other class; inputting the first encoded feature vector of each local image into a trained weld seam recognition model to obtain a second encoded feature vector corresponding to the weld seam; matching, from the first encoded feature vectors, the feature vector segments corresponding to the second encoded feature vector; matching, from the local images, the pixels corresponding to those feature vector segments; and connecting those pixels into the welding route.
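The centre-comparison encoding described above is essentially a local-binary-pattern-style code. A minimal sketch of the per-patch step; the function name and the flattening order are illustrative assumptions, and the weld seam recognition model and segment matching are outside this sketch:

```python
import numpy as np

def encode_patch(patch):
    """Encode one local image: pixels whose brightness feature intensity
    exceeds the centre pixel's (the comparison intensity) become 1, the
    rest (<= centre) become 0, giving a binary first encoded feature vector."""
    h, w = patch.shape
    center = patch[h // 2, w // 2]
    return (patch > center).astype(np.uint8).ravel()
```

For a 3x3 local image this yields a 9-element binary vector, one element per pixel in row-major order.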
In an alternative manner, before determining the pixels whose gray value is greater than the preset gray value as the pixels corresponding to the surface shadow of the workpiece, the controller is further configured to perform the following steps: selecting a plurality of target gray values, and, for each selected target gray value: dividing the pixels of the first image whose gray value is greater than the target gray value into a first region, and the pixels whose gray value is less than or equal to the target gray value into a second region; calculating a first average gray value and a first pixel proportion for the first region, and a second average gray value and a second pixel proportion for the second region; calculating the product of the squared difference between the first and second average gray values, the first pixel proportion, and the second pixel proportion to obtain the between-class variance of the first image; and taking the target gray value whose between-class variance is the largest among all calculated between-class variances as the preset gray value.
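The threshold-selection procedure above is the classical between-class-variance criterion (Otsu's method). A minimal sketch, with illustrative names; the candidate set is assumed to be all 8-bit gray levels:

```python
import numpy as np

def otsu_threshold(gray, candidates=range(256)):
    """Pick the preset gray value by maximising the between-class variance
    w1 * w2 * (m1 - m2)**2 over candidate thresholds, as described above."""
    flat = gray.ravel().astype(float)
    best_t, best_var = None, -1.0
    for t in candidates:
        upper = flat[flat > t]    # "first region": gray values above t
        lower = flat[flat <= t]   # "second region": the remaining pixels
        if upper.size == 0 or lower.size == 0:
            continue              # both regions must be non-empty
        w1, w2 = upper.size / flat.size, lower.size / flat.size
        var = w1 * w2 * (upper.mean() - lower.mean()) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t
```

On a bimodal image the selected threshold falls between the two modes, separating shadow pixels from the brighter metal surface.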
In an alternative manner, the step, performed by the controller, of converting the first image into a grayscale image comprises: processing the first image with a Gaussian filter to obtain the grayscale image.
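A plain-NumPy stand-in for the Gaussian smoothing step; the sigma and radius values are illustrative assumptions, and in practice a library Gaussian filter routine would be used:

```python
import numpy as np

def gaussian_blur(gray, sigma=1.0, radius=2):
    """Separable Gaussian smoothing: build a normalised 1-D kernel, then
    convolve the rows and the columns in turn (edge-replicated borders)."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()  # normalise so flat regions are preserved
    padded = np.pad(gray.astype(float), radius, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)
```

Because the kernel is normalised, a uniform image passes through unchanged, while noise around the weld-seam shadow is attenuated.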
In an alternative manner, the step, performed by the controller, of performing brightness extraction on the gray value of each pixel in the target region comprises: extracting the brightness of the gray value of each pixel in the target region with a Gabor filter to obtain the brightness feature intensity of each pixel.
In an alternative manner, the photographing device is a CCD laser camera.
In an alternative manner, the photographing device is arranged perpendicular to the welding platform.
According to another aspect of embodiments of the present application, there is provided a welding apparatus comprising a welding system according to any of the embodiments.
In the embodiments of the present application, the light source obliquely illuminates the surface of the workpiece on the welding platform so that a shadow forms at the weld seam on the surface; the photographing device photographs and images the illuminated surface; exploiting the pronounced brightness difference between the weld seam and its surroundings, the controller identifies the shadow region of the captured image to obtain the shadow region corresponding to the weld seam, which constitutes the welding route; and the controller controls the welding device to weld along the welding route, thereby achieving high-precision automated welding.
The foregoing is only an overview of the technical solutions of the present application. In order that the technical means of the present application may be understood more clearly and implemented according to the content of the specification, and to make the above and other objects, features and advantages of the present application more readily apparent, specific embodiments of the application are set forth below.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the application. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 illustrates a schematic diagram of a welding system provided by an embodiment of the present application;
FIG. 2 shows a flow chart of a controller for welding according to an embodiment of the present application;
FIG. 3 is a flow chart of a controller for welding according to another embodiment of the present application;
FIG. 4 is a flow chart illustrating the sub-steps of step 2200 in FIG. 2 performed by a controller provided in an embodiment of the present application;
FIG. 5 is a flowchart illustrating the substeps performed by the controller of FIG. 4 of step 2250 in accordance with an embodiment of the present application;
fig. 6 is a flowchart illustrating steps performed by the controller according to an embodiment of the present application before step 2220 in fig. 4.
Reference numerals in the specific embodiments are as follows:
a 100-welding system;
110: welding platform; 120: workpiece to be welded; 130: photographing device; 140: light source; 150: welding device; 160: controller.
Detailed Description
Embodiments of the technical scheme of the present application will be described in detail below with reference to the accompanying drawings. The following examples are only for more clearly illustrating the technical aspects of the present application, and thus are merely examples, and are not intended to limit the scope of the present application.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "comprising" and "having" and any variations thereof in the description of the application and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion.
In the description of embodiments of the present application, the technical terms "first," "second," and the like are used merely to distinguish between different objects and are not to be construed as indicating or implying a relative importance or implicitly indicating the number of technical features indicated, a particular order or a primary or secondary relationship. In the description of the embodiments of the present application, the meaning of "plurality" is two or more unless explicitly defined otherwise.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In the description of the embodiments of the present application, the term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate three cases: A alone, B alone, or both A and B. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects.
In the description of the embodiments of the present application, the term "plurality" means two or more (including two), and similarly, "plural sets" means two or more (including two), and "plural sheets" means two or more (including two).
In the description of the embodiments of the present application, the orientation or positional relationship indicated by the technical terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", "axial", "radial", "circumferential", etc. are based on the orientation or positional relationship shown in the drawings, and are merely for convenience of description and simplification of the description, and do not indicate or imply that the apparatus or element referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the embodiments of the present application.
In the description of the embodiments of the present application, unless explicitly specified and limited otherwise, the terms "mounted," "connected," "secured" and the like should be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally formed; or may be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the embodiments of the present application will be understood by those of ordinary skill in the art according to specific circumstances.
At present, most welding work is performed manually, and in welding environments that are hazardous to operators, such as high temperatures, toxic gases or radiation, a welding robot is generally required to replace manual labor to complete the welding operation. Performing the welding operation manually may pose a safety risk, whereas a welding robot can operate in these harsh environments. However, the control system of a welding robot is relatively simple: control of the welding path is usually based on a simple linear interpolation algorithm, such a simple control algorithm identifies the weld seam with low accuracy, and the resulting welding precision cannot meet industrial requirements.
The present inventors have found in research that the simplicity of the control algorithm can be remedied by applying machine-vision-based image recognition to the control system. Under an obliquely illuminating light source, the workpiece to be welded is photographed: the metal surface of the workpiece forms a high-brightness region because of reflection, while the weld seam forms a low-brightness shadow region because of its depression. In the captured image, the brightness difference between the weld seam and the metal surface of the workpiece therefore allows image recognition to identify the weld seam more accurately and form the welding route. Based on this approach, the present application provides a high-precision automated welding system and welding equipment.
Referring to fig. 1, which illustrates a schematic structural diagram of a welding system according to an embodiment of the present application, a welding system 100 comprises: a welding platform 110 for placing a workpiece 120 to be welded; a light source 140 arranged obliquely toward the welding platform 110, for obliquely illuminating the surface of the workpiece 120 on the welding platform 110 so that a shadow forms at the weld seam on the surface of the workpiece 120; a photographing device 130 for photographing and imaging the surface of the workpiece 120 illuminated by the light source 140; a welding device 150 for welding the surface of the workpiece 120 on the welding platform 110; and a controller 160 for performing the steps shown in fig. 2.
The workpiece 120 to be welded is generally made of a metal material, or is a part whose surface is plated with a metal material. The light source 140 obliquely illuminates the surface of the workpiece 120 on the welding platform 110 so that a shadow forms at the weld seam on the surface, and the photographing device 130 then photographs and images the illuminated surface. The metal surface of the workpiece produces a bright reflection, while the weld seam forms a shadow because of its depression, so an obvious brightness difference appears between the weld seam and its surroundings in the image captured by the photographing device 130.
The photographing device 130 may be configured as a camera with a distortion-correction function, so that images photographed from different angles can, after correction, be used directly in the recognition process of the controller 160.
Referring to fig. 2, a flowchart illustrating steps performed by the controller according to an embodiment of the present application is shown, and as shown in fig. 2, the controller 160 is configured to perform the following steps:
step 2100: acquiring a first image of the surface of a piece to be welded, which is shot by a shooting device;
step 2200: identifying at least a portion of the shadow region in the first image as a welding path;
in the first image captured by the capturing device 130, since a significant difference in brightness is formed between the weld and the surroundings, the controller 160 recognizes the weld as a shadow region in the image. The controller 160 recognizes the weld using a machine vision-based image recognition technique, and recognizes a shadow area corresponding to the weld as a welding route. Because the brightness difference between the weld and the surrounding is obvious, the controller 160 uses the image recognition technology to recognize the weld with good recognition effect, and the obtained welding line is a high-precision welding line.
Step 2300: controlling the welding device to move along the welding route to weld the weld seam on the surface to be welded.
After obtaining the high-precision welding route, the controller 160 controls the welding device 150 to weld along the welding route, thereby achieving high-precision automated welding.
The light source 140 obliquely illuminates the surface of the workpiece 120 on the welding platform 110 so that a shadow forms at the weld seam on the surface; the photographing device 130 then photographs and images the illuminated surface; exploiting the pronounced brightness difference between the weld seam and its surroundings, the controller 160 identifies the shadow region of the captured image to obtain an accurate welding route; and the controller 160 controls the welding device 150 to weld along the welding route, thereby achieving high-precision automated welding.
Referring to fig. 3, a flowchart of welding by a controller according to another embodiment of the present application is shown, and as shown in fig. 3, the controller 160 is configured to perform the following steps:
step 3100: acquiring a first image of the surface of a piece to be welded, which is shot by a shooting device;
step 3200: identifying at least a portion of the shadow region in the first image as a welding path;
since step 3100 to step 3200 are the same as step 2100 to step 2200, the specific implementation of step 3100 to step 3200 can be referred to step 2100 to step 2200, and will not be described here again.
Step 3300: establishing a first coordinate system in the first image, and determining a first coordinate of a welding route relative to the surface of the piece to be welded in the first image;
after the controller 160 recognizes the first image to obtain the welding line, the controller 160 obtains the coordinates of the entire welding line in the first coordinate system with respect to the workpiece 120 to be welded by establishing the coordinate system.
Preferably, the unit coordinates of the first coordinate system established in the first image are set in pixel units.
Step 3400: acquiring a second image, captured by the photographing device, of the surface of the welding platform on which the workpiece is placed;
the photographing device 130 is further configured to photograph and image the surface of the welding platform 110 on which the workpiece 120 to be welded is placed to obtain a second image. The controller 160 is also used to acquire a second image.
Step 3500: establishing a second coordinate system in the second image, and determining a second coordinate of the surface of the piece to be welded relative to the surface of the welding platform in the second image;
after the controller 160 acquires the second image, the controller obtains the coordinates of the entire part 120 to be welded in the second coordinate system with respect to the welding platform 110 by establishing the coordinate system.
Preferably, the unit coordinates of the second coordinate system established in the second image are set in pixel units.
Step 3600: determining a third coordinate of the welding route relative to the welding platform surface based on the first coordinate and the second coordinate;
preferably, in obtaining the coordinate values of the unit coordinates of the first coordinate system set according to the pixel unit and the coordinate values of the unit coordinates of the second coordinate system set according to the pixel unit, the coordinate values of the unit coordinates in the first coordinate system and the coordinate values of the unit coordinates in the second coordinate system may be unified by the pixel unit, so that the coordinate values of the unit coordinates in the two coordinate systems correspond to the same number of pixel units, and then the first coordinate and the second coordinate after the measurement unit of the unified coordinate values are calculated, so as to obtain the third coordinate of the welding route relative to the surface of the welding platform 110.
Step 3700: determining the actual coordinates of the welding route on the welding platform based on the relation between the third coordinates and the actual coordinates of the welding platform;
preferably, after obtaining the third coordinate of the welding line with respect to the surface of the welding platform 110, the third coordinate of the whole welding line with respect to the surface of the welding platform 110 is converted into the actual coordinate of the whole welding line with respect to the surface of the welding platform 110 based on the proportional relation between the pixel unit and the actual physical measurement unit, so that the welding device 150 can be directly moved to the actual coordinate according to the actual physical distance.
It should be noted that, if the unit coordinates of the first and second coordinate systems cannot be converted into each other directly, the proportional relation between each system's unit coordinates and the actual physical measurement unit must be determined. The first coordinates are then converted, according to that relation, into the physical coordinates of the welding route relative to the surface of the workpiece 120, the second coordinates are converted into the physical coordinates of the workpiece 120 relative to the surface of the welding platform 110, and the two are combined to obtain the actual coordinates of the welding route on the welding platform 110, so that the welding device 150 can move directly to the actual coordinates according to actual physical distances.
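Steps 3600 to 3700 reduce to a linear pixel-to-physical mapping once the scale factor (physical length per pixel) is known. A minimal sketch; the function name, the millimetre unit, the single uniform scale factor and the origin offset are assumptions for illustration:

```python
def pixel_to_platform_mm(route_px, scale_mm_per_px, origin_mm=(0.0, 0.0)):
    """Map welding-route coordinates from pixel units to physical platform
    coordinates: each pixel coordinate is scaled by the millimetre-per-pixel
    ratio and offset by the platform-frame position of the image origin."""
    ox, oy = origin_mm
    return [(ox + x * scale_mm_per_px, oy + y * scale_mm_per_px)
            for x, y in route_px]
```

With a calibrated scale of, say, 0.5 mm per pixel, the controller can hand the resulting coordinates to the welding device as physical travel targets.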
Step 3800: the welding device is controlled to move to a starting point among the actual coordinates of the welding line on the welding platform.
A starting point is determined from the actual coordinates so that the controller 160 can control the welding device 150 to move to it; from that starting point, the welding device 150 welds along the actual coordinates of the welding route.
Preferably, the starting point may be set as the point with the largest or smallest abscissa value among the actual coordinates, or as the point with the largest or smallest ordinate value.
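The starting-point rule above (extreme abscissa or ordinate) can be sketched as follows; the function name and parameters are illustrative:

```python
def pick_start(route, axis=0, largest=False):
    """Choose the welding start point as the route point with the smallest
    (or largest) coordinate along the given axis (0 = abscissa, 1 = ordinate)."""
    return (max if largest else min)(route, key=lambda p: p[axis])
```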
Step 3900: and controlling the welding device to move along the welding route so as to weld the welding seam of the surface to be welded.
Step 3900 is the same as step 2300, and thus, the specific implementation of step 3900 can be referred to step 2300 and will not be described herein.
It should be noted that the execution order of steps 3100 to 3300 and steps 3400 to 3500 is not limited: the controller 160 may first acquire the first image and determine the first coordinates and then acquire the second image and determine the second coordinates, or vice versa, or it may perform the two groups of steps simultaneously.
After the controller 160 recognizes the welding route in the first image, the actual coordinates of the welding route relative to the welding platform 110, expressed in the actual physical measurement unit, and the starting point of the route must be determined, so that the controller 160 can control the welding device 150 to move directly to the starting point and then weld according to the actual coordinates of the route. This improves the degree of automation of the welding system 100 and realizes high-precision automated welding.
Fig. 4 is a flowchart illustrating the sub-steps of step 2200 in fig. 2 performed by the controller according to an embodiment of the present application. As shown in fig. 4, step 2200 performed by the controller 160 includes the following steps:
step 2210: converting the first image into a gray scale image;
Preferably, the pixels of the first image have RGB values, and a weighted sum of the RGB values of each pixel yields the grayscale image.
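One common choice of weighting for the RGB-to-grayscale step is the ITU-R BT.601 luma coefficients; the patent does not fix the weights, so these values are an assumption:

```python
import numpy as np

def to_gray(rgb, weights=(0.299, 0.587, 0.114)):
    """Weighted sum of the R, G and B channels of an H x W x 3 image,
    producing an H x W grayscale image."""
    return rgb.astype(float) @ np.array(weights)
```

Since the weights sum to 1, a neutral pixel (equal R, G, B) keeps its value in the grayscale image.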
Step 2220: determining pixel points with gray values larger than a preset gray value in the gray image as pixel points corresponding to surface shadows of the to-be-welded piece;
the weld is a shadow due to the depression, and a significant brightness difference is formed after the shadow is imaged with the surrounding metal material, and after the first image is converted into the gray image, a difference between the gray value of the pixel corresponding to the shadow region and the gray value of the pixel corresponding to the surrounding metal material is very large, so that the region formed by the pixel greater than the preset gray value can be determined as the surface shadow region of the workpiece 120 to be welded.
Preferably, before the pixels whose gray values exceed the preset gray value are determined as the pixels corresponding to the surface shadow of the workpiece to be welded, the gray values above the preset gray value are amplified and those below it are reduced. The pixels corresponding to the surface shadow of the workpiece 120 to be welded are then determined against the preset gray value, improving the accuracy and precision of weld seam recognition.
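A minimal sketch of this pre-amplification step, assuming a simple multiplicative gain (the patent does not specify how the gray values are amplified or reduced):

```python
import numpy as np

def stretch_around(gray: np.ndarray, preset: int, gain: float = 1.2) -> np.ndarray:
    """Amplify gray values above the preset threshold and reduce those
    below it; the `gain` value is an illustrative assumption."""
    g = gray.astype(np.float64)
    out = np.where(g > preset, g * gain, g / gain)
    return np.clip(out, 0, 255).astype(np.uint8)

demo = np.array([[200, 50]], dtype=np.uint8)
print(stretch_around(demo, 100))  # shadow pixels pushed up, background pushed down
```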
Step 2230: taking a circle tangent to the boundary of the gray level image and having the smallest radius as an inscription circle corresponding to each pixel point in a circle formed by taking the center point of each pixel point corresponding to the surface shadow of the piece to be welded as the center of the circle;
step 2240: taking the inner area of the inscribed circle with the largest radius in the inscribed circles corresponding to each pixel point as a target area;
Among the inscribed circles corresponding to the pixels of the surface shadow of the piece to be welded, taking the interior of the one with the largest radius as the target area restricts the image recognition of the weld seam to that area and excludes shadow regions not formed by the weld seam. This reduces the recognition area, which helps improve the accuracy, precision, and efficiency of the controller 160's image recognition of the weld seam.
It should be noted that the shape of the target area is not specifically limited in the embodiment of the present application. The interior of an inscribed circle is preferred because the algorithm can determine a circle quickly: a circle requires only two parameters, a center and a radius, whereas other figures need more. A rectangle, for example, requires the positions of at least three vertices. With fewer parameters to determine, the inscribed circle is found in less time. In actual use, a suitable shape can be selected for the target area according to actual needs.
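Under one reading of steps 2230-2240, each shadow pixel's inscribed circle has a radius equal to that pixel's smallest distance to the image boundary, and the target area is the largest such circle. A sketch of that interpretation (the tangency construction here is an assumption):

```python
import numpy as np

def largest_inscribed_circle(shadow_mask: np.ndarray):
    """For each shadow pixel, take the smallest circle centred on it that
    is tangent to the image boundary (radius = distance to the nearest
    edge); return the centre and radius of the largest such circle."""
    h, w = shadow_mask.shape
    ys, xs = np.nonzero(shadow_mask)
    radii = np.minimum.reduce([ys, h - 1 - ys, xs, w - 1 - xs])
    best = int(np.argmax(radii))
    return (int(ys[best]), int(xs[best])), int(radii[best])

mask = np.zeros((7, 7), dtype=bool)
mask[3, 3] = True   # shadow pixel near the centre of the image
mask[0, 1] = True   # shadow pixel on the border (radius 0)
print(largest_inscribed_circle(mask))  # ((3, 3), 3)
```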
Step 2250: at least a portion of the shadow region in the target region is identified as a welding path.
By exploiting the marked brightness difference between the shadow cast by the recessed weld seam and its surroundings, the region formed by pixels whose gray values exceed the preset gray value can be determined as the shadow region. The inscribed circle containing the shadow region is extracted from the gray image to obtain the target area, and the shadow region corresponding to the weld seam is then identified within that target area. This narrows the recognition range of the controller 160 and improves the accuracy and efficiency of recognition.
Fig. 5 shows a flowchart of the substeps of step 2250 in fig. 4 performed by the controller according to an embodiment of the present application, and as shown in fig. 5, step 2250 performed by the controller 160 includes the following steps:
step 2251: carrying out brightness extraction on the gray value of each pixel point in the target area to obtain the brightness characteristic intensity of each pixel point;
The luminance characteristic intensity includes: a value obtained by extracting the frequency and direction of luminance changes, reflecting the texture features of the weld seam; and a value obtained by extracting the luminance changes at the weld seam edge, reflecting the edge features of the weld seam.
Step 2252: replacing the gray value of each pixel point with the brightness characteristic intensity of the pixel point to form a response image;
step 2253: dividing the response image into a plurality of partial images;
Preferably, the response image is divided into four partial images. In an actual welding scene one image usually contains one weld seam, so four partial images are sufficient to accurately represent the changes in the local features of the shadow region corresponding to the weld seam.
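The four-way split can be sketched as quadrant slicing; how the patent actually partitions the response image (equal quadrants or otherwise) is not stated, so this layout is an assumption:

```python
import numpy as np

def split_quadrants(response: np.ndarray):
    """Divide a response image into four partial images (quadrants)."""
    h2, w2 = response.shape[0] // 2, response.shape[1] // 2
    return [response[:h2, :w2], response[:h2, w2:],
            response[h2:, :w2], response[h2:, w2:]]

img = np.arange(16).reshape(4, 4)
parts = split_quadrants(img)
print(len(parts), parts[0].tolist())  # 4 [[0, 1], [4, 5]]
```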
Step 2254: the method comprises the steps of encoding the magnitude relation of brightness characteristic intensities between all pixel points of a local image and a central pixel point to obtain a first encoding characteristic vector of the local image, wherein the brightness characteristic intensity of the central pixel point is used as comparison intensity, the pixel points corresponding to the brightness characteristic intensity which is larger than the comparison intensity are encoded into one type, and the pixel points corresponding to the brightness characteristic intensity which is smaller than or equal to the comparison intensity are encoded into another type;
the central pixel point can be any pixel point in the local image, preferably, all pixel points in the local image can be traversed as the central pixel point to obtain a plurality of first coding feature vectors, and the first coding feature vectors of the plurality of central pixel points are combined to obtain the first coding feature vector of the local image; the first encoding feature vector of the partial image may also be determined using a pixel point at a center position of the partial image as a center pixel point.
The encoding may be binary, for example: pixels whose luminance characteristic intensity is greater than the comparison intensity are encoded as "1", and pixels whose luminance characteristic intensity is less than or equal to the comparison intensity are encoded as "0".
Encoding the magnitude relation between every pixel of the partial image and the central pixel yields an encoded feature vector describing the distribution of luminance characteristic intensity. The first encoded feature vector so obtained describes the texture features of the gray image well, enabling the welding system 100 to recognize the weld seam from the image with high precision.
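This comparison-and-binary-coding step resembles a local binary pattern; a minimal sketch, assuming the central pixel is the geometric centre of the patch:

```python
import numpy as np

def encode_patch(intensity: np.ndarray, centre: tuple) -> np.ndarray:
    """Encode each pixel as 1 if its luminance characteristic intensity
    exceeds the central pixel's (the comparison intensity), else 0."""
    ref = intensity[centre]
    return (intensity > ref).astype(np.uint8).ravel()

patch = np.array([[5, 9, 1],
                  [4, 6, 8],
                  [2, 6, 7]])
print(encode_patch(patch, (1, 1)))  # centre value 6 -> [0 1 0 0 0 1 0 0 1]
```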
Step 2255: inputting the first coding feature vector corresponding to each partial image into a trained weld joint recognition model for recognition to obtain a second coding feature vector corresponding to the weld joint;
wherein the second coded feature vector is used to describe the texture and shape of the weld.
Inputting the first encoded feature vector of each partial image into the trained weld seam recognition model eliminates the shadow regions not formed by the recess of the weld seam; the resulting second encoded feature vector is the encoded feature vector of the weld seam itself.
Step 2256: matching a feature vector segment corresponding to the second coding feature vector from the first coding feature vector;
step 2257: matching pixel points corresponding to the feature vector segments from the local image;
The pixels corresponding to the feature vector segments are the plurality of pixels corresponding to the weld seam.
Step 2258: and connecting the pixel points corresponding to the feature vector segments into a welding route.
And connecting pixel points corresponding to the feature vector segments, which are obtained by identifying each partial image, to obtain a welding route in the response image.
The response image obtained by luminance extraction from the pixel gray values reflects the texture and shape information of the weld seam. By encoding the partial images of the response image and recognizing the resulting encoded feature vectors, the pixels corresponding to the weld seam can be identified accurately and the shadow regions formed by non-weld-seam features eliminated. Finally, the weld seam pixels are connected into a welding route along which the welding system 100 can weld, realizing high-precision automated welding.
Fig. 6 is a flowchart illustrating steps performed by the controller before step 2220 in fig. 4, where, as shown in fig. 6, the controller 160 is further configured to, before performing step 2220, perform the following steps:
step 2260: selecting a plurality of target gray values;
A plurality of target gray values are selected in order to determine an optimal gray value, such that the pixels in the gray image whose gray values exceed it are exactly the pixels corresponding to the surface shadow of the piece to be welded.
Preferably, all integer values in the interval [0, 255] may be selected as target gray values: the controller 160 traverses the interval and takes every integer value in it as a target gray value.
Preferably, a plurality of target gray values may be selected by bisection, i.e., the interval [0, 255] is repeatedly bisected to extract, for example, {0, 255}, {0, 64, 128, 255}, or {0, 32, 64, 96, 128, 192, 255} as the target gray values.
Step 2270: the following steps are respectively executed for each selected target gray value:
step 2271: dividing pixel points with gray values larger than the target gray value in the first image into a first area, and dividing pixel points with gray values smaller than or equal to the target gray value in the first image into a second area;
step 2272: calculating a first average gray value and a first pixel occupation ratio of all gray values in the first area and a second average gray value and a second pixel occupation ratio of all gray values in the second area;
The average gray value of an area is obtained by summing the gray values of all pixels in the area and dividing by the number of pixels in the area; the pixel occupation ratio of an area is the proportion of its pixels among the pixels of the whole image.
Step 2273: calculating the product of the square of the difference between the first average gray value and the second average gray value, the first pixel occupation ratio and the second pixel occupation ratio to obtain the inter-class variance of the first image;
step 2280: and taking the target gray value corresponding to the inter-class variance with the largest value in all the calculated inter-class variances as a preset gray value.
The larger the inter-class variance, the greater the difference between the gray values of the two areas and the more pixels are correctly classified; that is, the target gray value corresponding to a larger inter-class variance distinguishes more shadow-region pixels, and the pixels of the first area corresponding to the largest inter-class variance correspond accurately to the pixels of the shadow region.
The controller 160 selects a plurality of target gray values and computes the corresponding inter-class variance for each; the target gray value whose inter-class variance is largest becomes the preset gray value. This preset gray value accurately separates the pixels corresponding to the shadow region in the gray image, improving the accuracy of the automated welding performed by the welding system 100.
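Steps 2271 to 2280 amount to Otsu's method: maximizing the between-class variance w1 * w2 * (mu1 - mu2)^2 over the candidate thresholds. A sketch:

```python
import numpy as np

def preset_gray_value(gray: np.ndarray, candidates=range(256)) -> int:
    """Return the candidate threshold maximising the inter-class variance
    w1 * w2 * (mu1 - mu2)**2 of the two areas it splits the image into."""
    g = gray.ravel().astype(np.float64)
    n = g.size
    best_t, best_var = 0, -1.0
    for t in candidates:
        upper = g[g > t]    # "first area": gray values above the threshold
        lower = g[g <= t]   # "second area": the remaining pixels
        if upper.size == 0 or lower.size == 0:
            continue
        w1, w2 = upper.size / n, lower.size / n
        var = w1 * w2 * (upper.mean() - lower.mean()) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

demo = np.array([10, 10, 10, 200, 200], dtype=np.uint8)
print(preset_gray_value(demo))  # 10 (first threshold achieving the maximum)
```

Passing one of the bisection sets mentioned above as `candidates` reproduces the coarser search variant.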
To improve the accuracy and precision with which the controller 160 recognizes the weld, in an alternative manner, the step of converting the first image to a grayscale image performed by the controller 160 includes:
and processing the first image through a Gaussian filter to obtain a gray image.
Preferably, a Gaussian filter is used when weighting the RGB values of the pixels in the first image to obtain the corresponding gray values; a gray image obtained with a Gaussian filter causes relatively little damage to edges, improving the accuracy and precision of weld seam recognition.
In an alternative manner, the step performed by the controller 160 to extract the brightness of the gray value of each pixel in the target area, to obtain the brightness characteristic intensity of each pixel includes:
and extracting the brightness of the gray value of each pixel point in the target area through a Gabor filter to obtain the brightness characteristic intensity of each pixel point.
Preferably, a Gabor filter extracts the luminance of the gray value of each pixel in the target area to obtain its luminance characteristic intensity; the luminance characteristic intensity extracted by a Gabor filter is robust to varying illumination conditions, improving the accuracy and precision of weld seam recognition.
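A sketch of Gabor-based luminance extraction, using the real part of a Gabor kernel and taking the absolute filter response as the luminance characteristic intensity; the kernel size and all filter parameters here are illustrative assumptions:

```python
import numpy as np

def gabor_kernel(ksize=9, sigma=2.0, theta=0.0, lam=4.0, gamma=0.5):
    """Real part of a Gabor kernel (all parameter values are assumptions)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam)

def luminance_intensity(gray: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Absolute Gabor response at each pixel fully covered by the kernel."""
    k = kernel.shape[0]
    h, w = gray.shape
    out = np.zeros((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = abs(np.sum(gray[i:i + k, j:j + k] * kernel))
    return out

kern = gabor_kernel()
print(kern.shape, round(kern[4, 4], 6))  # (9, 9) 1.0 at the centre
```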
In an alternative, the camera 130 is a CCD laser camera.
Preferably, the surface of the workpiece 120 to be welded illuminated by the light source 140 is photographed and imaged by a CCD laser camera, which captures images quickly enough for high-intensity automated welding and images at high resolution, improving the accuracy of the automated welding performed by the welding system 100.
In an alternative, the camera 130 is arranged perpendicular to the welding platform 110.
Preferably, the photographing device 130 is disposed perpendicular to the welding platform 110, so that the images it captures require little correction and can be recognized directly by the controller 160, improving the accuracy and precision of the automated welding of the welding system 100.
According to the embodiment of the application, the light source 140 obliquely illuminates the surface of the workpiece 120 to be welded on the welding platform 110 so that a shadow forms at the weld seam, and the photographing device 130 then photographs and images the illuminated surface. The controller 160 converts the photographed image into a gray image and, exploiting the marked brightness difference between the shadow and its surroundings, cuts out the region formed by pixels whose gray values exceed the preset gray value; identifying the shadow region corresponding to the weld seam within the resulting target area reduces the recognition range of the controller 160 and improves the welding precision and efficiency of the welding system 100. By luminance extraction on the pixel gray values of the target area, the controller 160 obtains a response image reflecting the texture and shape information of the weld seam; encoding the partial images of the response image and recognizing the resulting encoded feature vectors identifies the weld seam pixels accurately and eliminates the shadow regions formed by non-weld-seam features. Finally, the weld seam pixels are connected into a welding route along which the welding system 100 can weld, realizing high-precision automated welding.
According to another aspect of an embodiment of the present application, there is provided a welding apparatus including the welding system according to any one of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present application, and not for limiting the same; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application, and are intended to be included within the scope of the appended claims and description. In particular, the technical features mentioned in the respective embodiments may be combined in any manner as long as there is no structural conflict. The present application is not limited to the specific embodiments disclosed herein, but encompasses all technical solutions falling within the scope of the claims.

Claims (10)

1. A welding system, the welding system comprising:
the welding platform is used for placing a piece to be welded;
the light source is obliquely arranged towards the welding platform and is used for obliquely irradiating the surface of the to-be-welded piece on the welding platform so as to form a shadow at a welding seam on the surface of the to-be-welded piece;
the shooting device is used for shooting and imaging the surface of the to-be-welded piece irradiated by the light source;
the welding device is used for welding the surface of the to-be-welded piece on the welding platform;
a controller for performing the steps of:
acquiring a first image of the surface of the piece to be welded, which is shot by the shooting device;
identifying at least a portion of the shadow region in the first image as a welding route;
and controlling the welding device to move along the welding route so as to weld the welding seam of the surface to be welded.
2. The welding system of claim 1, wherein the camera is further configured to take an image of a surface of a welding platform on which the part to be welded is placed;
the controller is further configured to, before performing the step of controlling the welding device to move along the welding path to weld the weld seam of the surface to be welded, perform the steps of:
establishing a first coordinate system in the first image, and determining a first coordinate of the welding route relative to the surface of the piece to be welded in the first image;
acquiring a second image of the surface of the welding platform, on which the piece to be welded is placed, shot by the shooting device;
establishing a second coordinate system in the second image, and determining a second coordinate of the surface of the piece to be welded relative to the surface of the welding platform in the second image;
determining a third coordinate of the welding route relative to the welding platform surface based on the first coordinate and the second coordinate;
determining the actual coordinates of the welding route on the welding platform based on the relation between the third coordinates and the actual coordinates of the welding platform;
and controlling the welding device to move to the starting point of the actual coordinates of the welding route on the welding platform.
3. The welding system of claim 1, wherein the step of identifying at least a portion of the shadow area in the first image as a welding route performed by the controller comprises the steps of:
converting the first image into a gray scale image;
determining pixel points with gray values larger than a preset gray value in the gray image as pixel points corresponding to the surface shadows of the to-be-welded piece;
taking, among the circles centered on the center point of each pixel corresponding to the surface shadow of the workpiece to be welded, a circle tangent to the boundary of the gray image and having the smallest radius as the inscribed circle corresponding to that pixel;
taking the inner area of the inscribed circle with the largest radius in the inscribed circles corresponding to each pixel point as a target area;
at least a portion of the shadow areas in the target area are identified as welding routes.
4. A welding system according to claim 3, wherein said step of identifying at least a portion of the shadow area in the target area as a welding route performed by the controller comprises the steps of:
carrying out brightness extraction on the gray value of each pixel point in the target area to obtain the brightness characteristic intensity of each pixel point;
replacing the gray value of each pixel point with the brightness characteristic intensity of the pixel point to form a response image;
dividing the response image into a plurality of partial images;
encoding the magnitude relation of the brightness characteristic intensities between all pixel points of the local image and the central pixel point to obtain a first encoding characteristic vector of the local image, wherein the brightness characteristic intensity of the central pixel point is used as comparison intensity, the pixel points corresponding to the brightness characteristic intensity which is larger than the comparison intensity are encoded into one type, and the pixel points corresponding to the brightness characteristic intensity which is smaller than or equal to the comparison intensity are encoded into another type;
inputting the first coding feature vector corresponding to each partial image into a trained weld joint recognition model for recognition to obtain a second coding feature vector corresponding to the weld joint;
matching a feature vector segment corresponding to the second coding feature vector from the first coding feature vector;
matching pixel points corresponding to the feature vector segments from the partial image;
and connecting the pixel points corresponding to the feature vector segments into the welding route.
5. The welding system of claim 3, wherein the controller is further configured to, prior to performing the determining that the pixel in the gray image having a gray value greater than the preset gray value is the pixel corresponding to the surface shadow of the piece to be welded, perform the steps of:
selecting a plurality of target gray values;
the following steps are respectively executed for each selected target gray value:
dividing pixel points with gray values larger than the target gray value in the first image into a first area, and dividing pixel points with gray values smaller than or equal to the target gray value in the first image into a second area;
calculating a first average gray value and a first pixel occupation ratio of all gray values in the first area and a second average gray value and a second pixel occupation ratio of all gray values in the second area;
calculating the product of the square of the difference between the first average gray value and the second average gray value, the first pixel occupation ratio and the second pixel occupation ratio to obtain the inter-class variance of the first image;
and taking the target gray value corresponding to the inter-class variance with the largest value in all the calculated inter-class variances as the preset gray value.
6. The welding system of claim 3, wherein the step of converting the first image to a grayscale image performed by the controller comprises:
and processing the first image through a Gaussian filter to obtain a gray image.
7. The welding system of claim 4, wherein the step of performing, by the controller, a brightness extraction on the gray value of each pixel in the target area to obtain a brightness characteristic intensity of each pixel comprises:
and extracting the brightness of the gray value of each pixel point in the target area through a Gabor filter to obtain the brightness characteristic intensity of each pixel point.
8. The welding system of claim 1, wherein the camera is a CCD laser camera.
9. The welding system of claim 1, wherein the camera is disposed perpendicular to the welding platform.
10. A welding apparatus, the welding apparatus comprising: the welding system of any of claims 1-9.
CN202311212940.2A 2023-09-19 2023-09-19 Welding system and welding equipment Pending CN117047354A (en)

Publications (1): CN117047354A, published 2023-11-14

Family ID: 88669428


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination