CN115810133B - Welding control method based on image processing and point cloud processing and related equipment - Google Patents

Welding control method based on image processing and point cloud processing and related equipment

Info

Publication number
CN115810133B
Authority
CN
China
Prior art keywords
data
point cloud
point
welding
component
Prior art date
Legal status
Active
Application number
CN202310087975.1A
Other languages
Chinese (zh)
Other versions
CN115810133A (en
Inventor
田璐璐
齐株锐
卜磊
雷俊
方舟
靳程锐
刘峻佑
曹秀伟
曹哲
Current Assignee
China Construction Science and Technology Group Co Ltd
Original Assignee
China Construction Science and Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China Construction Science and Technology Group Co Ltd
Priority to CN202310087975.1A
Publication of CN115810133A
Application granted
Publication of CN115810133B

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a welding control method based on image processing and point cloud processing, and related equipment, wherein the method comprises the following steps: acquiring image data and point cloud data of equipment to be welded; acquiring reference category data corresponding to the pixel points in the image data through a trained image feature recognition model; acquiring weight data corresponding to the pixel points according to the reference component category information and the reference welding point category information in the reference category data; performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and acquiring target category data for each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to the pixel points; performing point cloud clustering according to the target category data and the dense point cloud to obtain the target components and the component category data corresponding to each target component; and acquiring the standard component welding area coordinates corresponding to each target component according to its component category, and performing welding control accordingly. The invention is beneficial to improving welding accuracy.

Description

Welding control method based on image processing and point cloud processing and related equipment
Technical Field
The invention relates to the technical field of image processing, in particular to a welding control method and related equipment based on image processing and point cloud processing.
Background
With the development of science and technology, and of automatic control technology in particular, automatic control is applied ever more widely and in many different scenarios. For example, when welding equipment, automatic welding may be implemented based on automatic control techniques.
In the prior art, a standard component is usually constructed in advance to obtain corresponding standard information, including the coordinates of the areas that require welding. The corresponding standard information is determined for the equipment to be welded according to its type, and automatic welding is then performed according to the coordinates of the areas to be welded in that standard information, thereby welding the equipment. The problem with the prior art is that the standard information is obtained with the welding target in an ideal state, i.e. a state in which no errors are considered to be present, for example when the standard component is the object of measurement. In the actual welding process, however, there is usually some difference between the equipment to be welded and the standard component: the shapes may differ, or the welding areas may be offset. These differences affect the welding result and are not conducive to improving welding accuracy.
Accordingly, there is a need for improvement and development in the art.
Disclosure of Invention
The invention mainly aims to provide a welding control method based on image processing and point cloud processing, and related equipment, so as to solve the problem that the prior-art automatic welding scheme, which performs automatic welding directly according to the coordinates of the area to be welded in the standard information corresponding to the equipment to be welded, is not conducive to improving welding accuracy.
In order to achieve the above object, a first aspect of the present invention provides a welding control method based on image processing and point cloud processing, wherein the welding control method based on image processing and point cloud processing includes:
acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point;
acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data comprises target component category information and target welding point category information;
performing point cloud clustering according to the target category data and the dense point cloud to obtain clustered target components and component category data corresponding to the target components, wherein the component category data comprises component categories and actual component welding area coordinates;
and obtaining standard component welding area coordinates corresponding to each target component according to each component category, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Optionally, the acquiring the image data and the point cloud data of the device to be welded includes:
acquiring an image of the equipment to be welded through a preset camera to obtain image data corresponding to the equipment to be welded, wherein the image data comprises an RGB image and a depth image, and the pixel points of the RGB image and the depth image are in one-to-one correspondence;
and carrying out laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
Optionally, the image feature recognition model is trained according to the following steps:
inputting training image data in the image processing training data into the image feature recognition model, performing feature recognition through the image feature recognition model, and acquiring training category data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of training image groups, and each training image group comprises training image data and labeling category data corresponding to each pixel point in the training image data;
and adjusting model parameters of the image feature recognition model according to the training category data and the labeling category data, and continuously executing the step of inputting training image data in the image processing training data into the image feature recognition model until a preset image processing training condition is met, so as to obtain a trained image feature recognition model.
Optionally, the obtaining weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information includes:
acquiring a preset weight level data table, and setting a corresponding weight value for each pixel point, as the weight data, according to the weight level data table, the reference component category information and the reference welding point category information, wherein a pixel point whose reference component category information is a preset component category has a higher weight value than a pixel point whose reference component category information is the interference category, and a pixel point whose reference welding point category information is a welding point has a higher weight value than a pixel point whose reference welding point category information is a non-welding point.
Optionally, the performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtaining, according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model, includes:
performing point cloud smoothing on the point cloud data to obtain a dense point cloud, wherein the dense point cloud comprises original points and newly added points, and the original points are the points in the point cloud data;
acquiring the reference category data and weight data corresponding to each point in the dense point cloud according to the reference category data and weight data corresponding to each pixel point, wherein the reference category data and weight data of any original point are identical to those of the pixel point corresponding to that original point, and the reference category data and weight data of any newly added point are identical to those of the nearest original point corresponding to that newly added point;
and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and the weight data corresponding to each point in the dense point cloud.
Optionally, the point cloud feature recognition model is trained according to the following steps:
inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model, and acquiring training target category data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of training point cloud groups, each training point cloud group comprises training point cloud data and labeling target category data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points and the training reference category data and training weight data corresponding to each point;
and adjusting model parameters of the point cloud feature recognition model according to the training target category data and the labeling target category data, and continuously executing the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until a preset point cloud processing training condition is met, so as to obtain a trained point cloud feature recognition model.
Optionally, the performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates includes:
calculating welding coordinate offset data according to the actual component welding area coordinates and the standard component welding area coordinates;
adjusting a preset welding path corresponding to the equipment to be welded according to the welding coordinate offset data to obtain a corrected welding path;
and welding the equipment to be welded according to the corrected welding path.
The second aspect of the present invention provides a welding control system based on image processing and point cloud processing, wherein the welding control system based on image processing and point cloud processing includes:
the data acquisition module is used for acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
the image processing module is used for acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point;
the weight acquisition module is used for acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
the point cloud processing module is used for performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtaining target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data comprises target component category information and target welding point category information;
the point cloud clustering module is used for carrying out point cloud clustering according to the target category data and the dense point cloud to obtain clustered target components and component category data corresponding to the target components, wherein the component category data comprises component categories and actual component welding area coordinates;
and the control module is used for acquiring, according to each component category, the standard component welding area coordinates corresponding to each target component, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
A third aspect of the present invention provides an intelligent terminal including a memory, a processor, and a welding control program based on image processing and point cloud processing stored in the memory and executable on the processor, wherein the welding control program based on image processing and point cloud processing implements the steps of any one of the welding control methods based on image processing and point cloud processing when executed by the processor.
A fourth aspect of the present invention provides a computer-readable storage medium having stored thereon a welding control program based on image processing and point cloud processing, which when executed by a processor, implements the steps of any one of the above welding control methods based on image processing and point cloud processing.
From the above, in the scheme of the invention, the image data and the point cloud data of the equipment to be welded are obtained, wherein one pixel point in the image data corresponds to one point in the point cloud data; reference category data corresponding to each pixel point in the image data are acquired through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point; weight data corresponding to each pixel point are acquired according to the reference component category information and the reference welding point category information; point cloud smoothing is performed on the point cloud data to obtain a dense point cloud, and target category data corresponding to each point in the dense point cloud are acquired through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data comprises target component category information and target welding point category information; point cloud clustering is performed according to the target category data and the dense point cloud to obtain the clustered target components and the component category data corresponding to each target component, wherein the component category data comprises the component category and the actual component welding area coordinates; and the standard component welding area coordinates corresponding to each target component are obtained according to each component category, and welding control is performed on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Compared with the prior art, the scheme of the invention takes the image data of the equipment to be welded as a reference and performs feature recognition by combining the image data and the point cloud data. Specifically, the result of image feature recognition is used as reference information (i.e., the reference category data); meanwhile, the point cloud data is smoothed to obtain a dense point cloud, and point cloud feature recognition is performed in combination with the reference information to obtain the target category data corresponding to each point in the dense point cloud; point cloud clustering is then performed to determine each target component and its component category data. The component category data includes the actual component welding area coordinates determined by feature recognition, so when welding control is performed on the basis of both the actual component welding area coordinates and the standard component welding area coordinates, the differences between the actual equipment to be welded and the standard components are taken into account, which is conducive to improving welding accuracy.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a welding control method based on image processing and point cloud processing according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a welding control system based on image processing and point cloud processing according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of an internal structure of an intelligent terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted as "when" or "once" or "in response to determining" or "in response to classifying". Similarly, the phrase "if determined" or "if classified to [described condition or event]" may be interpreted as "upon determining" or "in response to determining" or "upon classifying to [described condition or event]" or "in response to classifying to [described condition or event]".
The following description of the embodiments of the present invention will be made more fully hereinafter with reference to the accompanying drawings, in which embodiments of the invention are shown, it being evident that the embodiments described are only some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways other than those described herein, and persons skilled in the art will readily appreciate that the present invention is not limited to the specific embodiments disclosed below.
With the development of science and technology, and of automatic control technology in particular, automatic control is applied ever more widely and in many different scenarios. For example, when welding equipment, automatic welding may be implemented based on automatic control techniques.
In the prior art, a standard component is usually constructed in advance to obtain corresponding standard information, including the coordinates of the areas that require welding. The corresponding standard information is determined for the equipment to be welded according to its type, and automatic welding is then performed according to the coordinates of the areas to be welded in that standard information, thereby welding the equipment. The problem with the prior art is that the standard information is obtained with the welding target in an ideal state, i.e. a state in which no errors are considered to be present, for example when the standard component is the object of measurement. In the actual welding process, however, there is usually some difference between the equipment to be welded and the standard component: the shapes may differ, or the welding areas may be offset. These differences affect the welding result and are not conducive to improving welding accuracy and welding quality.
In order to solve at least one of the above problems, in the present invention, image data and point cloud data of the equipment to be welded are obtained, wherein one pixel point in the image data corresponds to one point in the point cloud data; reference category data corresponding to each pixel point in the image data are acquired through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point; weight data corresponding to each pixel point are acquired according to the reference component category information and the reference welding point category information; point cloud smoothing is performed on the point cloud data to obtain a dense point cloud, and target category data corresponding to each point in the dense point cloud are acquired through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data comprises target component category information and target welding point category information; point cloud clustering is performed according to the target category data and the dense point cloud to obtain the clustered target components and the component category data corresponding to each target component, wherein the component category data comprises the component category and the actual component welding area coordinates; and the standard component welding area coordinates corresponding to each target component are obtained according to each component category, and welding control is performed on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Compared with the prior art, the scheme of the invention takes the image data of the equipment to be welded as a reference and performs feature recognition by combining the image data and the point cloud data. Specifically, the result of image feature recognition is used as reference information (i.e., the reference category data); meanwhile, the point cloud data is smoothed to obtain a dense point cloud, and point cloud feature recognition is performed in combination with the reference information to obtain the target category data corresponding to each point in the dense point cloud; point cloud clustering is then performed to determine each target component and its component category data. The component category data includes the actual component welding area coordinates determined by feature recognition, so when welding control is performed on the basis of both the actual component welding area coordinates and the standard component welding area coordinates, the differences between the actual equipment to be welded and the standard components are taken into account, which is conducive to improving welding accuracy and welding quality.
Meanwhile, in the invention, the collected point cloud data undergoes point cloud smoothing to obtain a dense point cloud, and clustering on the smoothed dense point cloud improves the clustering result, thereby improving the accuracy of component identification.
Exemplary method
As shown in fig. 1, an embodiment of the present invention provides a welding control method based on image processing and point cloud processing, and specifically, the method includes the following steps:
step S100, obtaining image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data.
The equipment to be welded is the workpiece on which welding is to be performed. It should be noted that the equipment to be welded may include a plurality of components, one component may include one or more areas to be welded, and the required devices or other components may be welded onto each component through these welding areas, thereby obtaining the welded equipment. In this embodiment, the equipment to be welded is exemplified by an assembled building module which needs to be welded (such as a steel column, purlin, etc.), but is not limited thereto.
The image data are obtained by image acquisition of the area where the equipment to be welded is located, and the point cloud data are obtained by point cloud acquisition of the same area. It should be noted that a pixel point in the image data represents a specific position in the area where the equipment to be welded is located, and that position is also captured by the point cloud acquisition as a point in the point cloud data. The point cloud data includes a plurality of points and the point cloud information corresponding to each point (e.g., reflection information, coordinate information, color information, etc.).
Specifically, the acquiring the image data and the point cloud data of the equipment to be welded includes: acquiring an image of the equipment to be welded through a preset camera to obtain image data corresponding to the equipment to be welded, wherein the image data comprises an RGB image and a depth image, and the pixel points of the RGB image and the depth image are in one-to-one correspondence; and carrying out laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
In this embodiment, the preset camera includes a pre-calibrated RGB camera and a depth camera, and the pixel points of the images acquired by the two cameras are in one-to-one correspondence. For example, if the first pixel point in the RGB image represents location point A of the target area (i.e., the area where the equipment to be welded is located), the first pixel point in the depth image also represents location point A. Meanwhile, in this embodiment, the target area is scanned with a LiDAR laser scanner to obtain the corresponding point cloud data. The number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image, i.e., one pixel point corresponds to one point cloud point, so that point cloud recognition can be performed in combination with the image information, improving the recognition result. A sketch of this correspondence is given below.
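By way of non-limiting illustration, the correspondence can be sketched as follows, assuming an organized scan whose points are stored in row-major image order; the resolution values, array names and helper functions are assumptions for illustration and not part of the claimed method:

```python
import numpy as np

# Illustrative pixel/point correspondence (assumed organized scan, points stored
# in row-major image order; the resolution values are examples only).
H, W = 480, 640
rgb = np.zeros((H, W, 3), dtype=np.uint8)       # RGB image from the calibrated camera
depth = np.zeros((H, W), dtype=np.float32)      # depth image on the same pixel grid
cloud = np.zeros((H * W, 3), dtype=np.float32)  # LiDAR points, one per pixel

def point_of_pixel(row: int, col: int) -> np.ndarray:
    """Return the point cloud point corresponding to pixel (row, col)."""
    return cloud[row * W + col]

def pixel_of_point(idx: int) -> tuple:
    """Return the (row, col) pixel corresponding to point index idx."""
    return divmod(idx, W)
```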
Step S200, acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point.
In this embodiment, image feature recognition is first performed on the image data, and the reference category data corresponding to each pixel point is determined as reference information, thereby improving the accuracy of the subsequent point cloud feature recognition. Specifically, the trained image feature recognition model performs feature recognition on an input image to determine the reference category data corresponding to each pixel point in it. The reference category data comprises two kinds of information: the reference component category information and the reference welding point category information. The reference component category information indicates what type of component a point belongs to, and is any one of a plurality of preset component categories, which represent the component types on the equipment to be welded (for example, component A, component B, and the like), and an interference category. If a point lies on the background environment or on other surrounding equipment, i.e. does not belong to the equipment to be welded, its reference component category information is the interference category.
The reference welding point category information indicates whether a point is a welding point. It should be noted that, for a given component, the welding area is usually fixed, so an image feature recognition model for recognizing welding points can be trained in advance. For example, for a point in the image data, the corresponding reference category data may be (component A category, welding point), representing that the point belongs to a welding point of component A, as sketched below.
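For illustration only, the reference category data of a pixel point can be represented as a small record; the field names below are assumptions, not terminology fixed by the claims:

```python
from dataclasses import dataclass

# Illustrative per-pixel reference category record (field names are assumptions).
@dataclass
class RefCategory:
    component: str       # a preset component category, or "interference"
    is_weld_point: bool  # True for a welding point, False for a non-welding point

example = RefCategory(component="component_A", is_weld_point=True)  # (component A category, welding point)
```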
In this embodiment, the image feature recognition model is trained according to the following steps: inputting training image data in the image processing training data into the image feature recognition model, performing feature recognition through the image feature recognition model, and acquiring training category data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of training image groups, and each training image group comprises training image data and labeling category data corresponding to each pixel point in the training image data; and adjusting model parameters of the image feature recognition model according to the training category data and the labeling category data, and continuously executing the step of inputting training image data in the image processing training data into the image feature recognition model until a preset image processing training condition is met, so as to obtain a trained image feature recognition model.
The image processing training data are data collected in advance for training the image feature recognition model. The training image data and the labeling category data in the image processing training data have the same data format as the corresponding data used when the image feature recognition model is applied (i.e., the image data input to the model and the reference category data output from the model). Specifically, when the image data input into the model includes a depth image and an RGB image, the training image data likewise includes a training depth image and a training RGB image; the other data correspond in the same way and are not described again here.
The preset image processing training conditions are preset conditions for limiting the training completion, and may include that the training iteration number of the model reaches the preset iteration number corresponding to the model, or the loss value is smaller than the preset loss threshold corresponding to the model, which is not limited specifically herein.
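By way of a non-limiting sketch, the training procedure above can be realized as a per-pixel classifier with two output heads (component category and welding point / non-welding point); the tiny backbone, channel counts and class count below are assumptions for illustration, not the model actually claimed:

```python
import torch
import torch.nn as nn

NUM_COMPONENT_CLASSES = 5  # assumed: preset component categories plus one interference class

# Stand-in backbone: 4 input channels (RGB + depth), per-pixel logits for both heads.
backbone = nn.Sequential(
    nn.Conv2d(4, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, NUM_COMPONENT_CLASSES + 2, kernel_size=1),
)
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images, comp_labels, weld_labels):
    """images: (B,4,H,W); comp_labels: (B,H,W) ints; weld_labels: (B,H,W) in {0,1}."""
    logits = backbone(images)
    loss = (criterion(logits[:, :NUM_COMPONENT_CLASSES], comp_labels)      # component head
            + criterion(logits[:, NUM_COMPONENT_CLASSES:], weld_labels))  # weld-point head
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()  # compared against the preset image processing training condition
```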
Step S300, obtaining weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information.
In this embodiment, a corresponding weight value (i.e., weight data) is set for each pixel point. Since one pixel point corresponds to one point in the point cloud data, the weight value of a pixel point can be used as the weight value of the corresponding point in the point cloud data. Different weights are thus set for the points in the point cloud data, and an attention mechanism is built on these weights during processing: a point with a larger weight value is treated as more important and receives more attention, yielding a better feature recognition result.
When the weights are set, pixel points belonging to components that the user cares more about receive higher weight values. Specifically, the obtaining weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information includes: acquiring a preset weight level data table, and setting a corresponding weight value for each pixel point, as the weight data, according to the weight level data table, the reference component category information and the reference welding point category information, wherein a pixel point whose reference component category information is a preset component category has a higher weight value than a pixel point whose reference component category information is the interference category, and a pixel point whose reference welding point category information is a welding point has a higher weight value than a pixel point whose reference welding point category information is a non-welding point.
The preset weight level data table is a preset table storing weight values or weight levels. It should be noted that the user cares about the different components of the equipment to be welded to different degrees, so their weight values or weight levels differ; for example, if the weight level of component A is higher than that of component B, then when the weight data is determined, a point on component A receives a higher weight value than a point on component B. Meanwhile, within the same component, points in the welding area receive higher weight values than points in non-welding areas. Further, points of the interference category are not objects that need attention in this scheme, so their weight value is the lowest. Setting different weight values for different types of points in this way yields a better feature recognition result. An illustrative table is sketched below.
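A minimal sketch of such a weight level data table follows; the component names and numeric weight values are assumptions chosen only to satisfy the ordering rules stated above:

```python
# Illustrative weight level data table (values are assumptions; they only need to
# respect the ordering rules: preset component > interference, weld > non-weld).
WEIGHT_TABLE = {
    # (reference component category, is welding point) -> weight value
    ("component_A", True): 1.0,
    ("component_A", False): 0.6,
    ("component_B", True): 0.8,
    ("component_B", False): 0.4,
    ("interference", True): 0.1,  # interference points get the lowest weight
    ("interference", False): 0.1,
}

def pixel_weight(component_category: str, is_weld_point: bool) -> float:
    """Look up the weight value for one pixel point."""
    return WEIGHT_TABLE[(component_category, is_weld_point)]
```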
Step S400, performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtaining target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data comprises target component category information and target welding point category information.
In this embodiment, considering that the points in the point cloud data obtained from a point cloud scan are sparse, point cloud smoothing is performed to obtain a dense point cloud, so that better point cloud feature recognition and clustering results can be obtained. The point cloud smoothing may be implemented by interpolation, in which the point cloud information of a newly added point is the mean of the nearby points; other methods may also be used, and no specific limitation is imposed here. The trained point cloud feature recognition model performs feature recognition and classification on the input point cloud, determining the category of each point (i.e., the target category data), and during recognition allocates attention over the points in the point cloud data according to the reference information (i.e., the reference category data and weight data corresponding to each pixel point).
The target component category information corresponding to a point is any one of the plurality of preset component categories and the interference category, and the target welding point category information is either a welding point or a non-welding point.
Specifically, the performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtaining, according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, the target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model, includes: performing point cloud smoothing on the point cloud data to obtain a dense point cloud, wherein the dense point cloud comprises original points and newly added points, and the original points are the points in the point cloud data; acquiring the reference category data and weight data corresponding to each point in the dense point cloud according to the reference category data and weight data corresponding to each pixel point, wherein the reference category data and weight data of any original point are identical to those of the pixel point corresponding to that original point, and the reference category data and weight data of any newly added point are identical to those of the nearest original point corresponding to that newly added point; and acquiring the target category data corresponding to each point in the dense point cloud through the trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each point in the dense point cloud.
The newly added points are the points inserted during the point cloud smoothing process. The nearest original point corresponding to a newly added point is the original point closest to it; when several original points are equally close, any one of them may be selected as the nearest original point. In this embodiment, after the dense point cloud and the reference category data and weight data of each of its points are obtained, these data are input into the trained point cloud feature recognition model to obtain the target category data corresponding to each point in the dense point cloud. A sketch of the densification and label propagation is given below.
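The densification and nearest-original-point propagation described above can be sketched as follows; the midpoint interpolation rule and the neighbour count are assumptions standing in for whatever smoothing method is actually used:

```python
import numpy as np
from scipy.spatial import cKDTree

def densify(points: np.ndarray, categories: np.ndarray, weights: np.ndarray):
    """points: (N,3); categories: (N,) ints; weights: (N,) floats.
    Inserts one midpoint per original point and propagates category/weight
    from the nearest original point (an illustrative interpolation rule)."""
    tree = cKDTree(points)
    _, nbr = tree.query(points, k=2)              # nbr[:, 0] is the point itself
    new_pts = (points + points[nbr[:, 1]]) / 2.0  # midpoint with nearest neighbour
    dense = np.vstack([points, new_pts])
    _, owner = tree.query(new_pts, k=1)           # nearest original point per new point
    dense_categories = np.concatenate([categories, categories[owner]])
    dense_weights = np.concatenate([weights, weights[owner]])
    return dense, dense_categories, dense_weights
```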
Specifically, the point cloud feature recognition model is trained according to the following steps: inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model, and acquiring training target category data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of training point cloud groups, each training point cloud group comprises training point cloud data and labeling target category data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points and the training reference category data and training weight data corresponding to each point; and adjusting model parameters of the point cloud feature recognition model according to the training target category data and the labeling target category data, and continuously executing the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until a preset point cloud processing training condition is met, so as to obtain a trained point cloud feature recognition model.
The training target category data comprises training target component category information and training target welding point category information, and the corresponding labeling target category data comprises labeling target component category information and labeling target welding point category information. The preset point cloud processing training condition is a preset condition for limiting the completion of training the point cloud feature recognition model, specifically, the preset point cloud processing training condition may be set such that the training iteration number of the point cloud feature recognition model reaches a preset iteration number threshold corresponding to the point cloud feature recognition model, or the loss value is smaller than a loss value threshold corresponding to the point cloud feature recognition model, and may further include other conditions, which are not specifically limited herein.
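As a small illustration, the preset point cloud processing training condition described above can be expressed as a simple predicate; the numeric bounds are assumptions:

```python
def training_done(iteration: int, loss: float,
                  max_iters: int = 50_000, loss_threshold: float = 1e-3) -> bool:
    # Stop when the iteration cap is reached or the loss drops below the threshold
    # (illustrative values for the preset point cloud processing training condition).
    return iteration >= max_iters or loss < loss_threshold
```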
Step S500, performing point cloud clustering according to the target category data and the dense point cloud, and acquiring each clustered target component and the component category data corresponding to each target component, wherein the component category data comprises the component category and the actual component welding area coordinates.
When clustering is performed, points of the same type are clustered together as one component or one welding area, and a clustered component is recorded as a target component. The equipment to be welded is thus divided into a plurality of target components, and the component category and actual component welding area coordinates corresponding to each target component are determined separately. The actual component welding area is formed by all welding points in the target component, so its coordinates can be obtained from the coordinates of those welding points. In this embodiment, the actual component welding area coordinates comprise the boundary point coordinates of the actual welding area.
It should be noted that when the equipment to be welded deforms, or the welding points are offset for other reasons, the offset direction and degree may differ from component to component; therefore, in this embodiment, offset calculation and welding control are performed for each target component separately, which is beneficial to further improving welding accuracy. The clustering step is sketched below.
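A sketch of the per-category clustering step follows; DBSCAN is one possible clustering algorithm, and its parameters and the label encoding here are assumptions, not the method fixed by the patent:

```python
import numpy as np
from sklearn.cluster import DBSCAN

INTERFERENCE = 0  # assumed integer label for the interference category

def cluster_components(dense: np.ndarray, comp_class: np.ndarray, is_weld: np.ndarray):
    """dense: (N,3) points; comp_class: (N,) ints; is_weld: (N,) bools."""
    components = []
    for cls in np.unique(comp_class):
        if cls == INTERFERENCE:  # background / surrounding equipment is skipped
            continue
        idx = np.flatnonzero(comp_class == cls)
        cluster_ids = DBSCAN(eps=0.01, min_samples=10).fit_predict(dense[idx])
        for cid in set(cluster_ids) - {-1}:   # -1 marks DBSCAN noise
            member = idx[cluster_ids == cid]  # one clustered target component
            weld_pts = dense[member[is_weld[member]]]
            components.append({"category": int(cls), "weld_area": weld_pts})
    return components
```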
Step S600, obtaining the standard component welding area coordinates corresponding to each target component according to each component category, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Specifically, for each target component, a standard component of the same component category is constructed in advance, and the corresponding standard component welding area coordinates are determined from that standard component. There may be a difference between the standard component welding area coordinates and the actual component welding area coordinates obtained above, and the welding process needs to be adjusted and controlled according to this difference.
In this embodiment, the performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates includes: calculating welding coordinate offset data according to the actual component welding area coordinates and the standard component welding area coordinates; adjusting a preset welding path corresponding to the equipment to be welded according to the welding coordinate offset data to obtain a corrected welding path; and welding the equipment to be welded according to the corrected welding path.
The welding coordinate offset data describes the difference between the standard component welding area coordinates and the actual component welding area coordinates, for example, in which direction and by how much an area boundary is shifted. A sketch of this step follows.
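A minimal sketch of the offset-and-correction step, assuming the actual and standard boundary coordinates are given in corresponding order and that a pure translation suffices (a real system might fit a full rigid transform):

```python
import numpy as np

def weld_offset(actual_boundary: np.ndarray, standard_boundary: np.ndarray) -> np.ndarray:
    """Mean offset between corresponding (N,3) weld-area boundary coordinates."""
    return (actual_boundary - standard_boundary).mean(axis=0)

def correct_path(preset_path: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Shift every waypoint of the preset welding path by the computed offset."""
    return preset_path + offset
```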
In one application scenario, a LiDAR laser point cloud scan is used to obtain the initial point cloud, which contains surrounding equipment, environmental noise points and the like, and a point cloud densification method is used to generate the dense point cloud. Point cloud segmentation may be implemented using a PointNet++ model (i.e., the trained point cloud feature recognition model). PointNet++ is a point-level semantic segmentation model: it assigns classification semantics to every point in the point cloud data, specifically distinguishing whether the current point belongs to a component to be welded or to the environment or other surrounding equipment, and which target component type it belongs to (such as a preset steel column, purlin and the like). In this model, the features of each point are extracted by a shared multi-layer perceptron (MLP), and the features of all points are then combined into a global feature vector by a pooling operation. For a classification task, this vector can be used directly as the feature and classified with an MLP. For a segmentation task, the point features must be concatenated with the global feature, after which each point is classified with an MLP. Model training is completed through manual point cloud labeling, supervised learning, back propagation and gradient descent. A structural sketch follows.
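The structure just described (shared MLP, pooled global feature, concatenation, per-point classification) can be sketched as a plain PointNet-style segmentation network; PointNet++ additionally performs hierarchical grouping, which is omitted here, and the layer sizes and input features are assumptions:

```python
import torch
import torch.nn as nn

class PointSegSketch(nn.Module):
    """PointNet-style per-point segmentation sketch (PointNet++ grouping omitted)."""
    def __init__(self, in_dim: int = 8, num_classes: int = 6):
        super().__init__()
        # in_dim assumed: xyz + colour + reference category + weight, for example
        self.point_mlp = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                       nn.Linear(64, 128), nn.ReLU())
        self.seg_mlp = nn.Sequential(nn.Linear(128 + 128, 128), nn.ReLU(),
                                     nn.Linear(128, num_classes))

    def forward(self, pts: torch.Tensor) -> torch.Tensor:  # pts: (B, N, in_dim)
        feat = self.point_mlp(pts)                   # per-point features (B, N, 128)
        glob = feat.max(dim=1, keepdim=True).values  # pooled global feature (B, 1, 128)
        glob = glob.expand(-1, feat.shape[1], -1)    # broadcast to every point
        return self.seg_mlp(torch.cat([feat, glob], dim=-1))  # per-point class logits
```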
After the category information corresponding to each point in the dense point cloud is obtained, the standard steel structural member model stored in a Building Information Model (BIM) template database is quickly retrieved through category keywords, and a point-level comparison is performed, mainly comparing the position and shape of the welding points in the standard model with those in the actual equipment to be welded. This process can be implemented as automatic locating: once set up, the mechanical arm has high repeatability, but the precision of the workpieces is low, so for each piece of equipment to be welded the actual welding position differs from the standard position and the actual shape differs from the standard shape, and the accumulation of these differences causes welding quality problems. Therefore, the difference between the current equipment to be welded (the target component) and the standard component can be calculated through high-precision point cloud comparison, and the difference coordinates are input into the robot numerical control system so that the welding position undergoes secondary locating (the correct starting point and end point are found). Furthermore, during welding, the real-time weld seam is compared with the standard weld seam using the same point cloud generation and semantic segmentation method, ensuring the accuracy of the welding path. The standard weld seam, like the standard component shape, is stored in the BIM model in the IFC data format; it can be retrieved at any time, is parameterized information, and can be adjusted synchronously with the designed shape of the component. Welding accuracy can thus be further improved through weld seam comparison and adjustment. The point-level comparison is sketched below.
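The point-level comparison against the retrieved standard model can be sketched as follows; the template store and its keying are an assumed interface, and the nearest-point deviation is one simple comparison rule:

```python
import numpy as np
from scipy.spatial import cKDTree

# Assumed interface: category keyword -> (M,3) weld-area points of the BIM standard model.
BIM_TEMPLATES = {}

def compare_to_standard(category: str, actual_weld_pts: np.ndarray) -> np.ndarray:
    """Per-point offsets of the actual weld points from the nearest standard points;
    these difference coordinates would be fed to the robot numerical control system."""
    standard = BIM_TEMPLATES[category]
    _, idx = cKDTree(standard).query(actual_weld_pts)  # nearest standard point per actual point
    return actual_weld_pts - standard[idx]
```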
In the scheme of the invention, the image data of the equipment to be welded is taken as a reference, and feature recognition is performed by combining the image data and the point cloud data. Specifically, the result of image feature recognition is used as reference information (i.e., the reference category data); meanwhile, the point cloud data is smoothed to obtain a dense point cloud, and point cloud feature recognition is performed in combination with the reference information to obtain the target category data corresponding to each point in the dense point cloud; point cloud clustering is then performed to determine each target component and its component category data. The component category data includes the actual component welding area coordinates determined by feature recognition, so when welding control is performed on the basis of both the actual component welding area coordinates and the standard component welding area coordinates, the differences between the actual equipment to be welded and the standard components are taken into account, which is conducive to improving welding accuracy.
Exemplary apparatus
As shown in fig. 2, corresponding to the above welding control method based on image processing and point cloud processing, an embodiment of the present invention further provides a welding control system based on image processing and point cloud processing, where the welding control system based on image processing and point cloud processing includes:
a data acquisition module 710, configured to acquire image data and point cloud data of equipment to be welded, where a pixel point in the image data corresponds to a point in the point cloud data;
the image processing module 720, configured to obtain, according to the image data and through a trained image feature recognition model, reference category data corresponding to each pixel point in the image data, where the reference category data includes reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is either a welding point or a non-welding point;
a weight acquisition module 730, configured to obtain weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
the point cloud processing module 740 is configured to perform a point cloud smoothing process on the point cloud data to obtain a dense point cloud, and obtain target class data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, the reference class data and the weight data corresponding to each pixel point, where the target class data includes target component class information and target welding point class information;
the point cloud clustering module 750, configured to perform point cloud clustering according to the target category data and the dense point cloud, and obtain each clustered target component and the component category data corresponding to each target component, where the component category data includes the component category and the actual component welding area coordinates;
and a control module 760, configured to obtain, according to each component category, the standard component welding area coordinates corresponding to each target component, and perform welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
In this embodiment, the functions of the welding control system based on image processing and point cloud processing and of its modules may refer to the corresponding descriptions in the welding control method based on image processing and point cloud processing, and are not described herein again.
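Claim 5 below specifies how the reference data travel from the sparse cloud to the dense one: each original point keeps the data of its corresponding pixel, and each newly added point inherits the data of its nearest original point. A sketch of that propagation, assuming scipy is available; the midpoint insertion used as the densifying step is a stand-in of ours, since the patent does not fix a particular smoothing algorithm:

```python
import numpy as np
from scipy.spatial import cKDTree

def densify_and_propagate(points, ref_class, weights):
    """points: (N, 3) original cloud; ref_class/weights: per-point data
    inherited from the corresponding image pixels. Returns a denser
    cloud with reference data propagated from nearest original points."""
    # Stand-in upsampling: insert the midpoint of each consecutive pair.
    midpoints = (points[:-1] + points[1:]) / 2.0
    dense = np.vstack([points, midpoints])

    # Each new point takes the class/weight of its nearest original point.
    tree = cKDTree(points)
    _, nearest = tree.query(midpoints)
    dense_class = np.concatenate([ref_class, ref_class[nearest]])
    dense_weights = np.concatenate([weights, weights[nearest]])
    return dense, dense_class, dense_weights

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0]])
dense, cls, w = densify_and_propagate(pts, np.array([1, 1, 0]),
                                      np.array([1.0, 1.0, 0.2]))
print(dense.shape, cls, w)
```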
The manner of dividing the modules of the welding control system based on image processing and point cloud processing is merely illustrative and is not particularly limited.
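The clustering in module 750 can likewise be sketched with an off-the-shelf density-based method. DBSCAN and its parameter values are our assumption; the patent requires only that points be grouped, using the target category data, into clusters that each correspond to one target component:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_components(dense_points, target_class):
    """Group dense-cloud points into target components, per class.
    dense_points: (N, 3); target_class: (N,) integer component classes,
    with interference assumed encoded as -1 and ignored."""
    components = []
    for cls in np.unique(target_class):
        if cls == -1:                      # skip interference points
            continue
        pts = dense_points[target_class == cls]
        labels = DBSCAN(eps=0.05, min_samples=10).fit_predict(pts)
        for lab in set(labels) - {-1}:     # -1 = DBSCAN noise label
            components.append((cls, pts[labels == lab]))
    return components  # list of (component class, member points)
```

Filtering by target component class before spatial clustering keeps points of different component types from merging into one cluster even when they are adjacent.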
Based on the above embodiment, the present invention further provides an intelligent terminal, a functional block diagram of which may be as shown in fig. 3. The intelligent terminal comprises a processor and a memory. The memory stores a welding control program based on image processing and point cloud processing and provides an environment for its operation. When executed by the processor, the welding control program implements the steps of any one of the welding control methods based on image processing and point cloud processing described above. It should be noted that the intelligent terminal may also include other functional modules or units, which are not limited herein.
It will be appreciated by those skilled in the art that the schematic block diagram shown in fig. 3 is merely a block diagram of part of the structure related to the present invention and does not constitute a limitation of the intelligent terminal to which the present invention is applied; in particular, the intelligent terminal may include more or fewer components than shown in the drawings, combine certain components, or have a different arrangement of components.
The embodiment of the invention also provides a computer readable storage medium, wherein the computer readable storage medium stores a welding control program based on image processing and point cloud processing, and when the welding control program based on the image processing and the point cloud processing is executed by a processor, the steps of any welding control method based on the image processing and the point cloud processing provided by the embodiment of the invention are realized.
It should be understood that the sequence numbers of the steps in the above embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not be construed as limiting the implementation of the embodiments of the present invention.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the above-described system is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed system/intelligent terminal and method may be implemented in other manners. For example, the system/intelligent terminal embodiments described above are merely illustrative, e.g., the division of the modules or elements described above is merely a logical functional division, and may be implemented in other ways, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the present invention may implement all or part of the flow of the methods of the above embodiments by instructing the related hardware through a computer program, which may be stored in a computer-readable storage medium; when executed by a processor, the computer program implements the steps of each method embodiment. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. The content of the computer-readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the invention.

Claims (10)

1. A welding control method based on image processing and point cloud processing, the method comprising:
acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and interference categories, and the reference welding point category information is any one of welding points and non-welding points;
acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
performing point cloud smoothing on the point cloud data to obtain dense point cloud, and acquiring target class data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, reference class data and weight data corresponding to each pixel point, wherein the target class data comprises target component class information and target welding point class information;
performing point cloud clustering according to the target category data and the dense point cloud, and acquiring clustered target components and component category data corresponding to the target components, wherein the component category data comprises component categories and actual component welding area coordinates;
and obtaining standard part welding area coordinates corresponding to the target parts according to the part types, and performing welding control on the equipment to be welded according to the actual part welding area coordinates and the standard part welding area coordinates.
2. The welding control method based on image processing and point cloud processing according to claim 1, wherein the acquiring image data and point cloud data of the device to be welded includes:
acquiring an image of the equipment to be welded through a preset camera, and acquiring image data corresponding to the equipment to be welded, wherein the image data comprises an RGB image and a depth image, and pixel points of the RGB image and the depth image are in one-to-one correspondence;
and carrying out laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
3. The welding control method based on image processing and point cloud processing according to claim 1, wherein the image feature recognition model is trained according to the steps of:
inputting training image data in image processing training data into the image feature recognition model, carrying out feature recognition through the image feature recognition model, and acquiring training class data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of groups of training image groups, and each group of training image groups comprises training image data and labeling class data corresponding to each pixel point in the training image data;
And adjusting model parameters of the image feature recognition model according to the training type data and the labeling type data, and continuously executing the step of inputting training image data in the image processing training data into the image feature recognition model until a preset image processing training condition is met, so as to obtain a trained image feature recognition model.
4. The welding control method based on image processing and point cloud processing according to claim 1, wherein the obtaining weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information includes:
and acquiring a preset weight level data table, and setting a corresponding weight value for each pixel point, as the weight data, according to the weight level data table, the reference component category information and the reference welding point category information, wherein the weight value corresponding to a pixel point whose reference component category information is one of the preset component categories is higher than the weight value corresponding to a pixel point whose reference component category information is the interference category, and the weight value corresponding to a pixel point whose reference welding point category information is the welding point is higher than the weight value corresponding to a pixel point whose reference welding point category information is the non-welding point.
5. The welding control method based on image processing and point cloud processing according to claim 1, wherein the performing the point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtaining target class data corresponding to each point in the dense point cloud according to the dense point cloud, the reference class data and the weight data corresponding to each pixel point by a trained point cloud feature recognition model includes:
performing point cloud smoothing on the point cloud data to obtain dense point cloud, wherein the dense point cloud comprises an original point and a newly added point, and the original point is a point in the point cloud data;
acquiring reference class data and weight data corresponding to each point in the dense point cloud according to the reference class data and weight data corresponding to each pixel point, wherein the reference class data and weight data of any one original point are identical to the reference class data and weight data of the pixel point corresponding to the original point, and the reference class data and weight data of any one newly added point are identical to the reference class data and weight data of the nearest original point corresponding to the newly added point;
and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and the weight data corresponding to each point in the dense point cloud.
6. The welding control method based on image processing and point cloud processing as recited in claim 5, wherein said point cloud feature recognition model is trained according to the steps of:
inputting training point cloud data in point cloud processing training data into the point cloud feature recognition model, and acquiring training target class data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of groups of training point cloud groups, each group of training point cloud groups comprises training point cloud data and marking target class data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points, training reference class data corresponding to each point and training weight data;
and adjusting model parameters of the point cloud feature recognition model according to the training target class data and the labeling target class data, and continuously executing the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until preset point cloud processing training conditions are met, so as to obtain a trained point cloud feature recognition model.
7. The welding control method based on image processing and point cloud processing according to claim 1, wherein the welding control of the device to be welded according to the actual part welding area coordinates and the standard part welding area coordinates comprises:
calculating according to the actual part welding area coordinates and the standard part welding area coordinates to obtain welding coordinate offset data;
adjusting a preset welding path corresponding to the equipment to be welded according to the welding coordinate offset data to obtain a corrected welding path;
and welding the equipment to be welded according to the corrected welding path.
8. A welding control system based on image processing and point cloud processing, the system comprising:
the data acquisition module is used for acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
the image processing module is used for acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and interference categories, and the reference welding point category information is any one of welding points and non-welding points;
The weight acquisition module is used for acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
the point cloud processing module is used for carrying out point cloud smoothing processing on the point cloud data to obtain dense point clouds, and obtaining target category data corresponding to all points in the dense point clouds through a trained point cloud feature recognition model according to the dense point clouds, reference category data and weight data corresponding to all pixel points, wherein the target category data comprises target component category information and target welding point category information;
the point cloud clustering module is used for carrying out point cloud clustering according to the target category data and the dense point cloud, and obtaining each clustered target component and component category data corresponding to each target component, wherein the component category data comprises component categories and actual component welding area coordinates;
and the control module is used for acquiring standard component welding area coordinates corresponding to each target component according to each component category, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
9. An intelligent terminal, characterized in that it comprises a memory, a processor and a welding control program based on image processing and point cloud processing stored on the memory and executable on the processor, which when executed by the processor implements the steps of the welding control method based on image processing and point cloud processing according to any one of claims 1-7.
10. A computer-readable storage medium, characterized in that it has stored thereon a welding control program based on image processing and point cloud processing, which when executed by a processor, implements the steps of the welding control method based on image processing and point cloud processing according to any one of claims 1-7.
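Claims 3 and 6 above describe the same supervised pattern for both recognition models: predict on the training data, compare the predictions with the annotations, adjust the model parameters, and repeat until a preset training condition is met. A minimal sketch of that loop follows; PyTorch, the toy network, the cross-entropy loss, and the stopping rule are all our assumptions, since the patent names none of them.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the image feature recognition model:
# per-pixel classification over an assumed count of 5 classes.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(16, 5, 1))
criterion = nn.CrossEntropyLoss()  # assumed loss over per-pixel labels
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train(loader, max_epochs=50, target_loss=0.05):
    """loader yields (images, labels): images (N, 3, H, W) float,
    labels (N, H, W) long with annotated class per pixel."""
    for epoch in range(max_epochs):       # "preset training condition":
        total = 0.0                       # loss threshold or epoch cap
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()               # adjust model parameters
            optimizer.step()
            total += loss.item()
        if total / len(loader) < target_loss:
            break
```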
CN202310087975.1A 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment Active CN115810133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310087975.1A CN115810133B (en) 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310087975.1A CN115810133B (en) 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment

Publications (2)

Publication Number Publication Date
CN115810133A (en) 2023-03-17
CN115810133B (en) 2023-05-23

Family

ID=85487826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310087975.1A Active CN115810133B (en) 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment

Country Status (1)

Country Link
CN (1) CN115810133B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116168032B (en) * 2023-04-25 2023-07-18 浙江工业大学 Point cloud-based method and point cloud-based system for detecting defects of terminating weld joints
CN116168031B (en) * 2023-04-25 2023-08-29 中建科技集团有限公司 Welding code generation method based on three-dimensional image, welding system and related equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111805131A (en) * 2020-09-02 2020-10-23 季华实验室 Weld track real-time positioning method and device, storage medium and terminal
WO2021213223A1 (en) * 2020-04-20 2021-10-28 广东利元亨智能装备股份有限公司 Weld quality inspection method, apparatus and system, and electronic device
CN114227054A (en) * 2022-01-05 2022-03-25 南昌大学 Automatic detection method for tube plate welding seam based on 3D point cloud
CN115035034A (en) * 2022-05-09 2022-09-09 武汉中观自动化科技有限公司 Automatic positioning method and system for thin-wall welding
CN115409808A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Weld joint recognition method and device, welding robot and storage medium
CN115409863A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Workpiece weld extraction method and device, computer readable medium and electronic equipment
CN115409805A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Workpiece weld joint identification method and device, computer readable medium and electronic equipment

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007008598A1 (en) * 2007-02-19 2008-08-21 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Automatic programming of robots to weld stapled profiles onto micropanels using digital image capture
WO2022016152A1 (en) * 2020-07-17 2022-01-20 Path Robotics, Inc. Real time feedback and dynamic adjustment for welding robots


Also Published As

Publication number Publication date
CN115810133A (en) 2023-03-17

Similar Documents

Publication Publication Date Title
CN111563442B (en) Slam method and system for fusing point cloud and camera image data based on laser radar
CN115810133B (en) Welding control method based on image processing and point cloud processing and related equipment
US11878433B2 (en) Method for detecting grasping position of robot in grasping object
CN110135503B (en) Deep learning identification method for parts of assembly robot
CN109685199B (en) Method and apparatus for creating table containing information on pooling type, and test method and test apparatus using the same
CN109934847B (en) Method and device for estimating posture of weak texture three-dimensional object
CN109118473B (en) Angular point detection method based on neural network, storage medium and image processing system
CN110264444B (en) Damage detection method and device based on weak segmentation
CN111105495A (en) Laser radar mapping method and system fusing visual semantic information
JP2015522200A (en) Human face feature point positioning method, apparatus, and storage medium
CN112336342B (en) Hand key point detection method and device and terminal equipment
CN110310305B (en) Target tracking method and device based on BSSD detection and Kalman filtering
CN115496923B (en) Multi-mode fusion target detection method and device based on uncertainty perception
US20230237777A1 (en) Information processing apparatus, learning apparatus, image recognition apparatus, information processing method, learning method, image recognition method, and non-transitory-computer-readable storage medium
WO2024012333A1 (en) Pose estimation method and apparatus, related model training method and apparatus, electronic device, computer readable medium and computer program product
CN112949519A (en) Target detection method, device, equipment and storage medium
CN110276801B (en) Object positioning method and device and storage medium
CN115239760A (en) Target tracking method, system, equipment and storage medium
CN117372604B (en) 3D face model generation method, device, equipment and readable storage medium
CN111723688A (en) Human body action recognition result evaluation method and device and electronic equipment
CN115205806A (en) Method and device for generating target detection model and automatic driving vehicle
CN115335872A (en) Training method of target detection network, target detection method and device
CN111144422A (en) Positioning identification method and system for aircraft component
CN117523428B (en) Ground target detection method and device based on aircraft platform
CN116237937A (en) Visual positioning method, control method of robot, related equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant