CN115810133A - Welding control method based on image processing and point cloud processing and related equipment - Google Patents


Publication number
CN115810133A
Authority
CN
China
Prior art keywords: point cloud, data, point, welding, category
Prior art date
Legal status
Granted
Application number
CN202310087975.1A
Other languages
Chinese (zh)
Other versions
CN115810133B (en)
Inventor
田璐璐
齐株锐
卜磊
雷俊
方舟
靳程锐
刘峻佑
曹秀伟
曹哲
Current Assignee
China Construction Science and Technology Group Co Ltd
Original Assignee
China Construction Science and Technology Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China Construction Science and Technology Group Co Ltd
Priority to CN202310087975.1A
Publication of CN115810133A
Application granted
Publication of CN115810133B
Legal status: Active

Classifications

    • Y: General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02: Technologies or applications for mitigation or adaptation against climate change
    • Y02P: Climate change mitigation technologies in the production or processing of goods
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a welding control method based on image processing and point cloud processing and related equipment, wherein the method comprises the following steps: acquiring image data and point cloud data of equipment to be welded; acquiring reference category data corresponding to pixel points in the image data through a trained image feature recognition model; acquiring weight data corresponding to the pixel points according to the reference component category information and the reference welding point category information in the reference category data; performing point cloud smoothing processing on the point cloud data to obtain a dense point cloud, and acquiring target category data of each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to the pixel points; performing point cloud clustering according to the target category data and the dense point cloud to obtain target components and component category data corresponding to each target component; and acquiring standard component welding area coordinates corresponding to each target component according to each component category, and then performing welding control. The invention is beneficial to improving the welding accuracy.

Description

Welding control method based on image processing and point cloud processing and related equipment
Technical Field
The invention relates to the technical field of image processing, in particular to a welding control method and related equipment based on image processing and point cloud processing.
Background
With the development of scientific technology, especially the development of automatic control technology, the application of automatic control technology is more and more extensive, and the automatic control technology can be applied in different scenes. For example, automatic welding may be achieved based on automatic control techniques when welding the device.
In the prior art, a standard part is usually constructed in advance to obtain corresponding standard information, including the coordinates of the area where welding is to be performed. Corresponding standard information is determined according to the type of the device to be welded, and automatic welding is performed according to the coordinates of the area to be welded in the standard information, so as to weld the device. The problem with the prior art is that the standard information describes the device to be welded in an ideal state (i.e. a state in which it is assumed that there is no error), for example, information obtained when a standard part is taken as the measurement object. However, in the actual welding process, there is usually a certain difference between the device to be welded and the standard part; for example, there may be a shape difference or a positional deviation of the welding area. These differences may affect the welding effect and are not favorable for improving the welding accuracy.
Thus, there is still a need for improvement and development of the prior art.
Disclosure of Invention
The invention mainly aims to provide a welding control method based on image processing and point cloud processing and related equipment, and aims to solve the problem that in the prior art, the scheme of automatically welding directly according to the coordinates of an area needing to be welded in standard information corresponding to equipment to be welded is not favorable for improving the welding accuracy.
In order to achieve the above object, a first aspect of the present invention provides a welding control method based on image processing and point cloud processing, wherein the welding control method based on image processing and point cloud processing includes:
acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is any one of a welding point and a non-welding point;
acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point and the weight data, wherein the target category data comprises target component category information and target welding point category information;
performing point cloud clustering according to the target category data and the dense point cloud, and acquiring each clustered target component and component category data corresponding to each target component, wherein the component category data comprise a component category and an actual component welding area coordinate;
and acquiring standard component welding area coordinates corresponding to each target component according to each component category, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Optionally, the obtaining image data and point cloud data of the device to be welded includes:
acquiring an image of the equipment to be welded through a preset camera, and obtaining image data corresponding to the equipment to be welded, wherein the image data comprises an RGB image and a depth image, and the pixel points of the RGB image and the depth image are in one-to-one correspondence;
and performing laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
Optionally, the image feature recognition model is trained according to the following steps:
inputting training image data in image processing training data into the image feature recognition model, performing feature recognition through the image feature recognition model, and acquiring training category data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of groups of training image groups, and each group of training image group comprises the training image data and label category data corresponding to each pixel point in the training image data;
and adjusting model parameters of the image feature recognition model according to the training type data and the labeling type data, and continuing to execute the step of inputting training image data in the image processing training data into the image feature recognition model until preset image processing training conditions are met, so as to obtain a trained image feature recognition model.
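The training procedure above (forward pass, comparison against the labeled category data, parameter adjustment, repeat until a training condition is met) can be sketched with a toy per-pixel classifier. The linear softmax model, learning rate, and fixed step count below are illustrative assumptions, since the patent does not disclose the model architecture or loss function:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Toy setup: each "pixel" is a feature vector, each label is a category index
# (standing in for the preset component categories plus the interference
# category). A linear softmax layer stands in for the real recognition model.
n_pixels, n_feat, n_classes = 200, 5, 4
X = rng.normal(size=(n_pixels, n_feat))
true_W = rng.normal(size=(n_feat, n_classes))
y = np.argmax(X @ true_W, axis=1)        # synthetic labeled category data

W = np.zeros((n_feat, n_classes))        # model parameters to adjust
losses = []
for step in range(200):                  # "until the training condition is met"
    probs = softmax(X @ W)               # training category data per pixel
    loss = -np.log(probs[np.arange(n_pixels), y] + 1e-12).mean()
    losses.append(loss)
    grad = X.T @ (probs - np.eye(n_classes)[y]) / n_pixels
    W -= 0.5 * grad                      # adjust parameters from the mismatch
```

After training, the cross-entropy loss between the predicted and labeled categories has decreased, which is the role the "preset image processing training condition" plays in the claim.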
Optionally, the acquiring weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information includes:
acquiring a preset weight level data table, and setting a corresponding weight value for each pixel point as the weight data according to the weight level data table, the reference component category information and the reference welding point category information, wherein a pixel point whose reference component category information is one of the preset component categories is assigned a higher weight value than a pixel point whose reference component category information is the interference category, and a pixel point whose reference welding point category information is a welding point is assigned a higher weight value than a pixel point whose reference welding point category information is a non-welding point.
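The ordering constraints in this claim (preset component categories weighted above the interference category, welding points above non-welding points) can be illustrated with a hypothetical weight level data table. The concrete values are invented for illustration only; the patent does not disclose them:

```python
# Hypothetical weight level data table. Only the relative ordering follows
# the claim; the numeric values are assumptions.
WEIGHT_TABLE = {
    ("component", "weld"): 1.0,
    ("component", "non_weld"): 0.6,
    ("interference", "weld"): 0.3,
    ("interference", "non_weld"): 0.1,
}

def pixel_weight(component_info, weld_info):
    """Look up the weight for one pixel from its two-part reference
    category data. Any preset component category (component_A, component_B,
    ...) maps to the same "component" row of the table."""
    key = ("interference" if component_info == "interference" else "component",
           weld_info)
    return WEIGHT_TABLE[key]

# Component pixels outrank interference pixels; weld points outrank non-weld.
assert pixel_weight("component_A", "weld") > pixel_weight("interference", "weld")
assert pixel_weight("component_A", "weld") > pixel_weight("component_A", "non_weld")
```

Collapsing all preset component categories to one table row is a design choice for the sketch; a real table could assign different weights per component category as long as each stays above the interference weight.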
Optionally, the performing point cloud smoothing processing on the point cloud data to obtain dense point cloud, and obtaining target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point, and the weight data, includes:
performing point cloud smoothing processing on the point cloud data to obtain dense point cloud, wherein the dense point cloud comprises an original point and a newly added point, and the original point is a point in the point cloud data;
acquiring reference category data and weight data corresponding to each pixel point in the dense point cloud according to the reference category data and the weight data corresponding to each pixel point, wherein the reference category data and the weight data of any original point are the same as those of a pixel point corresponding to the original point, and the reference category data and the weight data of any newly added point are the same as those of the nearest original point corresponding to the newly added point;
and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and the weight data corresponding to each point in the dense point cloud.
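The rule for newly added points above (each inherits the reference category data and weight of its nearest original point) amounts to nearest-neighbour label propagation. A minimal numpy sketch, using brute-force distances for clarity (a k-d tree would be used at real point cloud sizes):

```python
import numpy as np

def propagate_labels(original_pts, original_labels, new_pts):
    """Assign each newly added (densified) point the label of its nearest
    original point, mirroring how reference category data and weight data
    are carried over to new points in the dense point cloud."""
    # Pairwise squared distances: (n_new, n_orig)
    d2 = ((new_pts[:, None, :] - original_pts[None, :, :]) ** 2).sum(axis=2)
    nearest = d2.argmin(axis=1)
    return original_labels[nearest]

# Two original points with labels 1 and 2; two new points appear near them.
orig = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
labels = np.array([1, 2])
new = np.array([[1.0, 0.0, 0.0], [9.0, 1.0, 0.0]])
print(propagate_labels(orig, labels, new))  # [1 2]
```

The same indexing works for the weight data: replace `original_labels` with the per-point weight array and reuse `nearest`.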
Optionally, the point cloud feature recognition model is trained according to the following steps:
inputting training point cloud data in point cloud processing training data into the point cloud feature recognition model, and acquiring training target class data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of groups of training point cloud sets, each group of training point cloud set comprises the training point cloud data and labeling target class data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points, training reference class data corresponding to each point and training weight data;
and adjusting model parameters of the point cloud feature recognition model according to the training target class data and the labeled target class data, and continuing to execute the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until preset point cloud processing training conditions are met, so as to obtain the trained point cloud feature recognition model.
Optionally, the performing welding control on the device to be welded according to the actual component welding area coordinate and the standard component welding area coordinate includes:
calculating and acquiring welding coordinate offset data according to the actual component welding area coordinates and the standard component welding area coordinates;
adjusting a preset welding path corresponding to the equipment to be welded according to the welding coordinate offset data to obtain a corrected welding path;
and welding the equipment to be welded according to the corrected welding path.
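The three steps above (compute the offset between the standard and actual welding area coordinates, shift the preset path, weld along the corrected path) can be sketched as a rigid translation. Treating the offset as a single translation is an illustrative simplification, since the patent leaves the exact adjustment unspecified:

```python
import numpy as np

def corrected_path(preset_path, standard_area, actual_area):
    """Shift a preset welding path by the offset between the standard-part
    welding area coordinate and the actually recognised one."""
    offset = np.asarray(actual_area) - np.asarray(standard_area)  # welding coordinate offset data
    return np.asarray(preset_path) + offset

# Hypothetical coordinates (mm): the recognised welding area sits 1.5 mm
# off in x, -1.0 mm in y, and 0.2 mm in z from the standard part.
standard = [100.0, 50.0, 0.0]
actual = [101.5, 49.0, 0.2]
path = [[100.0, 50.0, 0.0], [110.0, 50.0, 0.0]]  # preset welding path waypoints
print(corrected_path(path, standard, actual))
```

A fuller implementation might estimate a rotation as well (e.g. a rigid transform fit over several matched welding-area points) rather than a pure translation.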
A second aspect of the present invention provides a welding control system based on image processing and point cloud processing, wherein the welding control system based on image processing and point cloud processing comprises:
a data acquisition module, configured to acquire image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
an image processing module, configured to obtain, according to the image data, reference category data corresponding to each pixel point in the image data through a trained image feature recognition model, where the reference category data includes reference component category information and reference welding point category information, the reference component category information is any one of multiple preset component categories and an interference category, and the reference welding point category information is any one of a welding point and a non-welding point;
a weight obtaining module, configured to obtain weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information;
a point cloud processing module, configured to perform point cloud smoothing processing on the point cloud data to obtain a dense point cloud, and obtain target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point and the weight data, wherein the target category data comprises target component category information and target welding point category information;
the point cloud clustering module is used for carrying out point cloud clustering according to the target category data and the dense point cloud to obtain each clustered target component and component category data corresponding to each target component, wherein the component category data comprise component categories and actual component welding area coordinates;
and a control module, configured to acquire the standard component welding area coordinates corresponding to each target component according to each component category, and perform welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
A third aspect of the present invention provides an intelligent terminal, where the intelligent terminal includes a memory, a processor, and a welding control program based on image processing and point cloud processing, which is stored in the memory and is executable on the processor, and the welding control program based on image processing and point cloud processing is executed by the processor to implement any one of the steps of the welding control method based on image processing and point cloud processing.
A fourth aspect of the present invention provides a computer-readable storage medium, in which a welding control program based on image processing and point cloud processing is stored, and when being executed by a processor, the welding control program based on image processing and point cloud processing implements the steps of any one of the welding control methods based on image processing and point cloud processing.
As can be seen from the above, in the scheme of the present invention, image data and point cloud data of a device to be welded are obtained, wherein one pixel point in the image data corresponds to one point in the point cloud data; reference category data corresponding to each pixel point in the image data is acquired through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is any one of a welding point and a non-welding point; weight data corresponding to each pixel point is acquired according to the reference component category information and the reference welding point category information; point cloud smoothing is performed on the point cloud data to obtain a dense point cloud, and target category data corresponding to each point in the dense point cloud is acquired through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point and the weight data, wherein the target category data comprises target component category information and target welding point category information; point cloud clustering is performed according to the target category data and the dense point cloud, and each clustered target component and the component category data corresponding to each target component are acquired, wherein the component category data comprise a component category and actual component welding area coordinates; and standard component welding area coordinates corresponding to each target component are acquired according to each component category, and welding control is performed on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Compared with the prior art, in the scheme of the invention, the image data of the equipment to be welded is taken as reference, and the image data and the point cloud data are combined for feature recognition. Specifically, the result obtained by image feature recognition is used as reference information (namely the reference category data); meanwhile, the point cloud data is smoothed to obtain a dense point cloud, point cloud feature recognition is carried out in combination with the reference information to obtain the target category data corresponding to each point in the dense point cloud, point cloud clustering is then carried out, and each target component and its component category data are determined. The component category data comprises the actual component welding area coordinates determined according to the feature recognition, and when welding control is performed on the basis of the actual component welding area coordinates and the standard component welding area coordinates, the difference between the actual device to be welded and the standard part can be taken into account, so that the welding accuracy is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flowchart of a welding control method based on image processing and point cloud processing according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a welding control system based on image processing and point cloud processing according to an embodiment of the present invention;
fig. 3 is a schematic block diagram of an internal structure of an intelligent terminal according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when" or "upon" or "in response to a determination" or "in response to a classification". Similarly, the phrase "if it is determined" or "if [a described condition or event] is classified" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon classifying [a described condition or event]" or "in response to classifying [a described condition or event]".
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings of the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
With the development of scientific technology, especially the development of automatic control technology, the application of automatic control technology is more and more extensive, and the automatic control technology can be applied in different scenes. For example, automatic welding may be achieved based on automatic control techniques when welding the device.
In the prior art, a standard part is usually constructed in advance to obtain corresponding standard information, which comprises the coordinates of the area where welding is to be performed. Corresponding standard information is determined according to the type of the device to be welded, and automatic welding is performed according to the coordinates of the area to be welded in the standard information, so as to weld the device. The problem with the prior art is that the standard information describes the device to be welded in an ideal state (i.e., a state in which it is assumed that there is no error), for example, information obtained when a standard part is taken as the measurement object. However, in the actual welding process, there is usually a certain difference between the device to be welded and the standard part; for example, there may be a shape difference or a positional deviation of the welding area. These differences may affect the welding effect and are not favorable for improving the welding accuracy and the welding quality.
In order to solve at least one of the above problems, in the scheme of the invention, image data and point cloud data of a device to be welded are obtained, wherein one pixel point in the image data corresponds to one point in the point cloud data; reference category data corresponding to each pixel point in the image data is acquired through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of a plurality of preset component categories and an interference category, and the reference welding point category information is any one of a welding point and a non-welding point; weight data corresponding to each pixel point is acquired according to the reference component category information and the reference welding point category information; point cloud smoothing is performed on the point cloud data to obtain a dense point cloud, and target category data corresponding to each point in the dense point cloud is acquired through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point and the weight data, wherein the target category data comprises target component category information and target welding point category information; point cloud clustering is performed according to the target category data and the dense point cloud, and each clustered target component and the component category data corresponding to each target component are acquired, wherein the component category data comprise a component category and actual component welding area coordinates; and standard component welding area coordinates corresponding to each target component are acquired according to each component category, and welding control is performed on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
Compared with the prior art, in the scheme of the invention, the image data of the equipment to be welded is taken as reference, and the image data and the point cloud data are combined for feature recognition. Specifically, the result obtained by image feature recognition is used as reference information (i.e., the reference category data); meanwhile, the point cloud data is smoothed to obtain a dense point cloud, point cloud feature recognition is performed in combination with the reference information to obtain the target category data corresponding to each point in the dense point cloud, point cloud clustering is then performed, and each target component and its component category data are determined. The component category data comprises the actual component welding area coordinates determined according to the feature recognition, and when welding control is performed according to the actual component welding area coordinates and the standard component welding area coordinates, the difference between the actual device to be welded and the standard part can be taken into account, thereby improving the welding accuracy and the welding quality.
Meanwhile, in the invention, point cloud smoothing processing is carried out on the collected point cloud data to obtain dense point cloud, and according to the smoothed dense point cloud, the clustering effect is favorably improved, so that the accuracy of component identification is favorably improved.
Exemplary method
As shown in fig. 1, an embodiment of the present invention provides a welding control method based on image processing and point cloud processing, and specifically, the method includes the following steps:
step S100, image data and point cloud data of a device to be welded are obtained, wherein one pixel point in the image data corresponds to one point in the point cloud data.
The device to be welded may include a plurality of parts, one part may include one or more regions to be welded, and a device or other parts that need to be welded may be welded to each part through the welding regions, so as to obtain a welded device. In the present embodiment, the apparatus to be welded is a fabricated building module that needs to be welded (for example, a steel column, a purlin, etc.), which is given here by way of example and not limitation.
The image data is obtained by carrying out image acquisition on the area where the equipment to be welded is located, and the point cloud data is obtained by carrying out point cloud acquisition on the area where the equipment to be welded is located. It should be noted that, a pixel point in the image data represents a specific position in the area where the device to be welded is located, and the point cloud collection is also performed on the position to obtain a point in the point cloud data. The point cloud data includes a plurality of points and point cloud information (such as reflection information, coordinate information, color information, and the like) corresponding to each point.
Specifically, the above obtaining image data and point cloud data of the device to be welded includes: acquiring an image of the equipment to be welded through a preset camera, and acquiring image data corresponding to the equipment to be welded, wherein the image data comprises an RGB (red, green and blue) image and a depth image, and pixel points of the RGB image and the depth image correspond to one another one by one; and performing laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
In this embodiment, the preset cameras include an RGB camera and a depth camera that are calibrated in advance, and the pixel points of the images captured by the two cameras correspond one-to-one. For example, if the first pixel point in the RGB image represents the position point A of the target area (i.e., the area where the device to be welded is located), then the first pixel point in the depth image also represents the position point A. Meanwhile, in this embodiment, the target area is scanned with a LiDAR to obtain the corresponding point cloud data. The number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image, that is, one pixel point corresponds to one point cloud point, so that point cloud recognition can be performed in combination with the image information, thereby improving the recognition effect.
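As an illustrative sketch only (the patent does not specify the camera model), the one-to-one pixel-to-point correspondence described above can be produced by back-projecting each depth pixel into 3D, so that the pixel at row v, column u always maps to row v*W + u of the point cloud array; the intrinsics fx, fy, cx, cy are hypothetical:

```python
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project each depth pixel to a 3D point, so that pixel (v, u)
    in the image corresponds one-to-one to row v*W + u of the cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = rgb.reshape(-1, 3)
    # one row per pixel: (x, y, z, r, g, b)
    return np.hstack([points, colors])

# toy example: a 2x2 depth image with trivial intrinsics
depth = np.array([[1.0, 1.0], [2.0, 2.0]])
rgb = np.zeros((2, 2, 3), dtype=np.uint8)
cloud = depth_to_point_cloud(depth, rgb, fx=1.0, fy=1.0, cx=0.0, cy=0.0)
print(cloud.shape)  # (4, 6): one point per pixel
```

With a real RGB-D setup the intrinsics would come from calibration, consistent with the pre-calibrated RGB and depth cameras of this embodiment.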
Step S200, obtaining reference type data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, where the reference type data includes reference component type information and reference welding point type information, the reference component type information is any one of multiple preset component types and interference types, and the reference welding point type information is any one of a welding point and a non-welding point.
In this embodiment, image feature recognition is performed on the image data, and the reference category data corresponding to each pixel point is determined to serve as reference information, so as to improve the accuracy of the subsequent point cloud feature recognition processing. Specifically, the trained image feature recognition model performs feature recognition processing on an input image to determine the reference category data corresponding to each pixel point in the input image. The reference category data includes two kinds of information: reference component category information and reference welding point category information. The reference component category information indicates which type of component a point belongs to; in this embodiment, it is any one of a plurality of preset component categories, which represent the respective types of components on the device to be welded (such as component A, component B, etc.), and an interference category. If a point lies on the background environment or on other surrounding equipment, i.e., it does not belong to the device to be welded, its corresponding reference component category information is the interference category.
The reference welding point category information indicates whether a point is a welding point. It should be noted that, for a given component, the welding area is usually fixed, so an image feature recognition model for recognizing welding points can be trained in advance. Specifically, for a point in the image data, the corresponding reference category data may be (component A category, welding point), representing that the point belongs to a welding point of component A.
In this embodiment, the image feature recognition model is trained according to the following steps: inputting training image data in image processing training data into the image feature recognition model, performing feature recognition through the image feature recognition model, and acquiring training category data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of groups of training image groups, and each group of training image group comprises the training image data and label category data corresponding to each pixel point in the training image data; and adjusting model parameters of the image feature recognition model according to the training type data and the labeling type data, and continuously executing the step of inputting training image data in the image processing training data into the image feature recognition model until preset image processing training conditions are met to obtain a trained image feature recognition model.
The image processing training data is data which is acquired in advance and used for training an image feature recognition model. The training image data and the label type data in the image processing training data are in the same data format as the corresponding data when the image feature recognition model is used (for example, the image data input to the model and the reference type data output from the model). Specifically, when the image data input to the model includes a depth image and an RGB image, the training image data also includes a training depth image and a training RGB image for training, and other data are similar and will not be described herein again.
The preset image processing training condition is a preset condition for limiting completion of training, and may include that the number of training iterations of the model reaches the preset number of iterations corresponding to the model, or a loss value is smaller than a preset loss threshold corresponding to the model, which is not specifically limited herein.
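The training procedure above (forward pass, compare with the labeled category data, adjust model parameters, stop when a preset iteration count or loss threshold is met) can be sketched with a toy per-pixel softmax classifier standing in for the image feature recognition model; the features, labels, and hyperparameters are all illustrative assumptions, not the patent's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_pixel_classifier(X, y, n_classes, max_iters=500, loss_threshold=0.05, lr=0.5):
    """Toy stand-in for the image feature recognition model: a per-pixel
    softmax classifier trained by gradient descent until a preset iteration
    count or loss threshold (the 'preset training condition') is met."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    b = np.zeros(n_classes)
    loss = float("inf")
    for _ in range(max_iters):
        logits = X @ W + b
        logits -= logits.max(axis=1, keepdims=True)       # numerical stability
        p = np.exp(logits)
        p /= p.sum(axis=1, keepdims=True)
        loss = -np.log(p[np.arange(n), y] + 1e-12).mean() # cross-entropy
        if loss < loss_threshold:                         # training condition met
            break
        grad = p.copy()
        grad[np.arange(n), y] -= 1.0
        grad /= n
        W -= lr * (X.T @ grad)                            # adjust model parameters
        b -= lr * grad.sum(axis=0)
    return W, b, loss

# toy per-pixel features: two linearly separable classes
X = np.vstack([rng.normal(-2, 0.3, (50, 2)), rng.normal(2, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W, b, loss = train_pixel_classifier(X, y, n_classes=2)
pred = (X @ W + b).argmax(axis=1)
print((pred == y).mean())
```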
And step S300, acquiring weight data corresponding to each pixel point according to the reference component type information and the reference welding point type information.
In this embodiment, a corresponding weight value (i.e., weight data) is set for each pixel point. Specifically, since one pixel point corresponds to one point in the point cloud data, the weight value of a pixel point can be used as the weight value of the corresponding point in the point cloud data. Setting a different weight for each point in the point cloud data yields a weight-based attention mechanism during processing: a point with a larger weight value is regarded as more important and receives more attention, leading to a better feature recognition effect.
When the weights are set, pixel points corresponding to the parts that the user pays more attention to receive higher weight values. Specifically, the acquiring of the weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information includes: acquiring a preset weight level data table, and setting a corresponding weight value for each pixel point as the weight data according to the weight level data table, the reference component category information and the reference welding point category information, wherein a pixel point whose reference component category information is a preset component category receives a higher weight value than a pixel point whose reference component category information is the interference category, and a pixel point whose reference welding point category information is a welding point receives a higher weight value than a pixel point whose reference welding point category information is a non-welding point.
The preset weight level data table is a preset table for storing weight value data or weight level data. It should be noted that the user's degree of interest differs for different components in the device to be welded, and therefore the corresponding weight values or weight levels also differ. For example, if the weight level corresponding to component A is higher than that corresponding to component B, then when the weight data is determined, the weight value of a point on component A is higher than that of a point on component B. Meanwhile, within the same component, the weight value of a point in a welding area is higher than that of a point in a non-welding area. Furthermore, points corresponding to the interference category are not objects that need to be focused on in the present scheme, and therefore receive the lowest weight values. In this way, different weight values are set for different types of points, and a better feature recognition effect can be obtained.
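A minimal sketch of the weight assignment described above, assuming a hypothetical weight level data table; the category names and weight values are illustrative, chosen only to satisfy the stated ordering (preset component categories above interference, welding points above non-welding points):

```python
# Hypothetical weight level data table: (component category, welding-point
# category) -> weight value. Values are illustrative assumptions.
WEIGHT_TABLE = {
    ("component_A", "welding_point"): 1.0,
    ("component_A", "non_welding_point"): 0.6,
    ("component_B", "welding_point"): 0.8,
    ("component_B", "non_welding_point"): 0.4,
    ("interference", "non_welding_point"): 0.1,
}

def pixel_weight(component_category, welding_point_category):
    """Look up the weight value for one pixel point; unknown combinations
    fall back to the lowest (interference-level) weight."""
    return WEIGHT_TABLE.get((component_category, welding_point_category), 0.1)

reference = [("component_A", "welding_point"),
             ("interference", "non_welding_point"),
             ("component_B", "welding_point")]
weights = [pixel_weight(c, w) for c, w in reference]
print(weights)  # [1.0, 0.1, 0.8]
```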
Step S400, performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, wherein the target category data includes target component category information and target welding point category information.
In this embodiment, considering that the points in the point cloud data obtained during point cloud scanning are sparse, point cloud smoothing is performed to obtain a dense point cloud, so that better point cloud feature recognition and clustering effects can be obtained. It should be noted that the point cloud smoothing may be implemented by interpolation, in which the point cloud data of a newly added point is the average of the nearby points; other methods may also be used, and this is not specifically limited herein. The trained point cloud feature recognition model performs feature recognition and classification on the input point cloud to determine the category (i.e., target category data) of each point, and during recognition it distributes attention over the points in the point cloud data according to the reference information (i.e., the reference category data and weight data corresponding to each pixel point).
It should be noted that the target component type information corresponding to a point is any one of multiple preset component types and interference types, and the target welding point type information is any one of a welding point and a non-welding point.
Specifically, the performing of point cloud smoothing on the point cloud data to obtain a dense point cloud, and the acquiring of the target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, include: performing point cloud smoothing on the point cloud data to obtain a dense point cloud, wherein the dense point cloud includes original points and newly added points, the original points being the points in the point cloud data; acquiring the reference category data and weight data corresponding to each point in the dense point cloud according to the reference category data and weight data corresponding to each pixel point, wherein the reference category data and weight data of any original point are the same as those of the pixel point corresponding to that original point, and the reference category data and weight data of any newly added point are the same as those of the nearest original point corresponding to that newly added point; and acquiring the target category data corresponding to each point in the dense point cloud through the trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each point in the dense point cloud.
The newly added points are the points inserted during the point cloud smoothing. The nearest original point corresponding to a newly added point is the original point closest to it; when several original points are equally close, any one of them may be selected as the nearest original point. In this embodiment, after the dense point cloud and the reference category data and weight data of each point therein are obtained, they are input into the trained point cloud feature recognition model to obtain the target category data corresponding to each point in the dense point cloud.
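The densification and attribute propagation above can be sketched as follows, using midpoint insertion as a stand-in for the unspecified interpolation method and inheriting each newly added point's reference data from its nearest original point (ties resolved by taking the first):

```python
import numpy as np

def densify(points, attrs):
    """Insert the midpoint between each consecutive pair of original points
    (a simple interpolation stand-in); each newly added point inherits the
    reference category / weight attributes of its nearest original point."""
    new_pts, new_attrs = [], []
    for i in range(len(points) - 1):
        mid = (points[i] + points[i + 1]) / 2.0
        d = np.linalg.norm(points - mid, axis=1)  # distance to every original point
        nearest = int(d.argmin())                 # first minimum wins on ties
        new_pts.append(mid)
        new_attrs.append(attrs[nearest])
    dense = np.vstack([points, np.array(new_pts)])
    return dense, attrs + new_attrs

pts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [4.0, 0.0, 0.0]])
attrs = ["A", "A", "B"]
dense, dense_attrs = densify(pts, attrs)
print(len(dense), dense_attrs)  # 5 ['A', 'A', 'B', 'A', 'A']
```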
Specifically, the point cloud feature recognition model is trained according to the following steps: inputting training point cloud data in point cloud processing training data into the point cloud feature recognition model, and acquiring training target class data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of groups of training point cloud sets, each group of training point cloud set comprises the training point cloud data and labeling target class data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points, training reference class data corresponding to each point and training weight data; and adjusting model parameters of the point cloud feature recognition model according to the training target class data and the labeled target class data, and continuing to execute the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until preset point cloud processing training conditions are met, so as to obtain the trained point cloud feature recognition model.
The training target class data comprises training target component class information and training target welding point class information, and correspondingly, the labeling target class data comprises labeling target component class information and labeling target welding point class information. The preset point cloud processing training condition is a preset condition for limiting completion of training of the point cloud feature recognition model, specifically, the preset point cloud processing training condition may be set such that the training iteration number of the point cloud feature recognition model reaches a preset iteration number threshold corresponding to the point cloud feature recognition model, or a loss value is smaller than a loss value threshold corresponding to the point cloud feature recognition model, and may further include other conditions, which are not specifically limited herein.
Step S500, performing point cloud clustering according to the target category data and the dense point cloud, and acquiring each clustered target component and the component category data corresponding to each target component, wherein the component category data includes a component category and actual component welding area coordinates.
Specifically, when clustering is performed, points of the same type are clustered together to be used as one component or one welding area, and the clustered component is marked as a target component. Specifically, the device to be welded is divided into a plurality of target components, and the component type and the actual component welding area coordinates corresponding to each target component are determined respectively. The actual component welding area is formed by all welding points in the target component, so that the actual component welding area coordinate can be obtained according to the coordinate of the welding points. In this embodiment, the actual component welding area coordinates include boundary point coordinates of an actual welding area.
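A simplified sketch of this clustering step under the assumption that points sharing a target component category form one target component: interference points are discarded, and the actual component welding area coordinates are represented by the bounding box of the cluster's welding points (the embodiment speaks of boundary point coordinates; a bounding box is the simplest such boundary). Category names are illustrative:

```python
from collections import defaultdict
import numpy as np

def cluster_components(points, categories, weld_flags):
    """Group points by target component category; for each cluster compute the
    actual component welding area as the bounding box of its welding points."""
    clusters = defaultdict(list)
    for p, cat, is_weld in zip(points, categories, weld_flags):
        if cat != "interference":            # interference points are not components
            clusters[cat].append((p, is_weld))
    result = {}
    for cat, members in clusters.items():
        weld_pts = np.array([p for p, w in members if w])
        bbox = ((weld_pts.min(axis=0).tolist(), weld_pts.max(axis=0).tolist())
                if len(weld_pts) else None)
        result[cat] = {"n_points": len(members), "weld_area_bbox": bbox}
    return result

pts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [5, 5, 5]])
cats = ["column", "column", "column", "interference"]
weld = [False, True, True, False]
out = cluster_components(pts, cats, weld)
print(out["column"]["weld_area_bbox"])  # ([1, 0, 0], [1, 1, 0])
```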
It should be noted that when the device to be welded deforms or the welding point deviates due to other reasons, the deviation directions and the deviation degrees corresponding to different components may be different, so that in this embodiment, the deviation calculation and the welding control are performed on each target component, which is beneficial to further improving the accuracy of welding.
Step S600, obtaining a standard component welding area coordinate corresponding to each target component according to each component type, and performing welding control on the to-be-welded device according to the actual component welding area coordinate and the standard component welding area coordinate.
Specifically, for each target component, standard components of the same component category are constructed in advance, and corresponding standard component welding area coordinates are determined according to the standard components. And there may be a difference between the obtained standard component welding area coordinates and the obtained actual component welding area coordinates, and the welding process needs to be adjusted and controlled according to the difference.
In this embodiment, the performing of welding control on the device to be welded according to the actual component welding area coordinates and the standard component welding area coordinates includes: calculating welding coordinate offset data according to the actual component welding area coordinates and the standard component welding area coordinates; adjusting a preset welding path corresponding to the device to be welded according to the coordinate offset data to obtain a corrected welding path; and welding the device to be welded according to the corrected welding path.
The welding coordinate offset data describes the difference between the standard component welding area coordinates and the actual component welding area coordinates, for example, in which direction and by how much an area boundary is offset.
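A minimal sketch of the offset calculation and path correction; modeling the welding coordinate offset as the mean translation between corresponding boundary points is an assumption of this sketch, since no particular offset formula is fixed above:

```python
import numpy as np

def correct_welding_path(actual_area, standard_area, preset_path):
    """Compute the offset between the actual and standard welding-area
    boundary coordinates (here: mean translation of corresponding boundary
    points) and shift the preset welding path by that offset."""
    offset = np.mean(np.asarray(actual_area) - np.asarray(standard_area), axis=0)
    corrected = np.asarray(preset_path) + offset
    return offset, corrected

standard = [[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]]
actual = [[0.5, 0.2, 0.0], [10.5, 0.2, 0.0]]   # component shifted by (0.5, 0.2, 0)
path = [[0.0, 0.0, 0.0], [5.0, 0.0, 0.0], [10.0, 0.0, 0.0]]
offset, corrected = correct_welding_path(actual, standard, path)
print(offset)       # the (0.5, 0.2, 0) translation
print(corrected[1]) # the shifted mid-path point
```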
In one application scenario, a LiDAR laser point cloud scan is used to obtain the initial point cloud, and a point cloud densification method is used to generate the dense point cloud; the initial point cloud includes surrounding equipment, environmental noise points, and the like. Point cloud segmentation can be achieved using a PointNet++ model (i.e., the trained point cloud feature recognition model). PointNet++ is a point-level semantic segmentation model, that is, each point in the point cloud data is assigned a classification semantic; specifically, it distinguishes whether the current point belongs to a component to be welded or to the environment or other surrounding equipment, and which target component type the current point specifically belongs to (for example, preset types such as columns, purlins, etc.). In the model, the features of each point are extracted through a shared multi-layer perceptron (MLP), and all the point features are then combined into a global feature vector by a pooling operation. For the classification task, this vector can be used directly as a feature for classification by an MLP. For the segmentation task, the point features are concatenated with the global feature, and each point is then classified by an MLP. Model training is completed by manually labeling point cloud data, supervised learning, back propagation, and gradient descent.
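The shared-MLP-plus-pooling feature extraction described above can be sketched in a few lines of numpy; this single random layer is a toy illustration of the mechanism (shared per-point weights, max-pooling to a global vector, concatenation for segmentation), not the actual PointNet++ architecture:

```python
import numpy as np

rng = np.random.default_rng(1)

def shared_mlp(points, W, b):
    """Apply the same MLP layer to every point (shared weights), ReLU activated."""
    return np.maximum(points @ W + b, 0.0)

def pointnet_features(points, W, b):
    per_point = shared_mlp(points, W, b)       # (N, F) per-point features
    global_feat = per_point.max(axis=0)        # max-pooling -> global feature vector
    # for segmentation, concatenate the global feature back onto each point
    seg_input = np.hstack([per_point, np.tile(global_feat, (len(points), 1))])
    return per_point, global_feat, seg_input

points = rng.normal(size=(8, 3))               # toy cloud of 8 points
W = rng.normal(size=(3, 16))
b = np.zeros(16)
per_point, global_feat, seg_input = pointnet_features(points, W, b)
print(per_point.shape, global_feat.shape, seg_input.shape)  # (8, 16) (16,) (8, 32)
```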
After the category information corresponding to each point in the dense point cloud is obtained, the standard steel structural part model stored in a Building Information Modeling (BIM) template database is quickly retrieved through category keywords, and a point-level comparison is performed, mainly comparing the position and shape differences between the welding points in the standard model and the welding points in the current actual device to be welded. This process can be realized by automatic position finding: although a robotic arm has high repeatability once set up, the precision of the workpiece to be processed is low, so for each device to be welded there are differences between the welding position and the standard position and between the shape of the device and the standard shape, and the superposition of these differences can cause welding quality problems. Therefore, through high-precision point cloud comparison, the difference of the points to be welded between (the target component of) the current device to be welded and the standard component can be calculated, and the difference coordinates are input into the robot numerical control system, so that secondary position finding is performed on the welding position (finding the correct starting point and end point). Furthermore, during welding, the real-time weld seam and the standard weld seam are compared using the same point cloud generation and semantic segmentation method, so as to ensure the accuracy of the welding path. The standard weld seam has the same shape as the standard component; it is stored in the BIM model in the IFC (Industry Foundation Classes) data format and can be retrieved at any time, and since it is parametric information, it can be adjusted synchronously with the design shape of the component. Therefore, welding accuracy can be further improved through weld seam comparison and adjustment.
Therefore, in the scheme of the invention, the image data of the equipment to be welded is taken as a reference, and the image data and the point cloud data are combined for feature recognition. Specifically, the result obtained by image feature recognition is used as reference information (namely reference category data), meanwhile, the point cloud data is smoothed to obtain dense point cloud, point cloud feature recognition is carried out by combining the reference information to obtain target category data corresponding to each point in the dense point cloud, point cloud clustering is further achieved, and each target component and component category data thereof are determined. The part category data comprises actual part welding area coordinates determined according to feature recognition, and when welding control is performed on the basis of the actual part welding area coordinates and the standard part welding area coordinates, the difference between actual equipment to be welded and a standard part can be considered, so that welding accuracy is improved.
Exemplary device
As shown in fig. 2, in correspondence to the welding control method based on image processing and point cloud processing, an embodiment of the present invention further provides a welding control system based on image processing and point cloud processing, where the welding control system based on image processing and point cloud processing includes:
a data obtaining module 710, configured to obtain image data and point cloud data of a device to be welded, where a pixel point in the image data corresponds to one point in the point cloud data;
an image processing module 720, configured to obtain, according to the image data, reference class data corresponding to each pixel point in the image data through a trained image feature recognition model, where the reference class data includes reference component class information and reference welding point class information, the reference component class information is any one of multiple preset component classes and interference classes, and the reference welding point class information is any one of a welding point and a non-welding point;
a weight obtaining module 730, configured to obtain weight data corresponding to each pixel point according to the reference component type information and the reference welding point type information;
the point cloud processing module 740, configured to perform point cloud smoothing on the point cloud data to obtain a dense point cloud, and obtain target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and weight data corresponding to each pixel point, where the target category data includes target component category information and target welding point category information;
a point cloud clustering module 750, configured to perform point cloud clustering according to the target category data and the dense point cloud, and obtain each clustered target component and component category data corresponding to each target component, where the component category data includes a component category and an actual component welding area coordinate;
and the control module 760 is configured to obtain coordinates of a welding area of the standard component corresponding to each target component according to each component type, and perform welding control on the device to be welded according to the coordinates of the welding area of the actual component and the coordinates of the welding area of the standard component.
Specifically, in this embodiment, the specific functions of the welding control system based on image processing and point cloud processing and the modules thereof may refer to the corresponding descriptions in the welding control method based on image processing and point cloud processing, and are not described herein again.
The division method of each module of the welding control system based on the image processing and the point cloud processing is not exclusive, and is not particularly limited herein.
Based on the above embodiment, the present invention further provides an intelligent terminal, a schematic block diagram of which may be as shown in fig. 3. The intelligent terminal includes a processor and a memory; the memory stores a welding control program based on image processing and point cloud processing and provides an environment for its operation. When executed by the processor, the welding control program based on image processing and point cloud processing implements the steps of any one of the above welding control methods based on image processing and point cloud processing. It should be noted that the intelligent terminal may further include other functional modules or units, which are not specifically limited herein.
It will be understood by those skilled in the art that the block diagram shown in fig. 3 is only a block diagram of a part of the structure related to the solution of the present invention, and does not constitute a limitation of the intelligent terminal to which the solution of the present invention is applied, and in particular, the intelligent terminal may include more or less components than those shown in the figure, or combine some components, or have a different arrangement of components.
The embodiment of the invention also provides a computer-readable storage medium, wherein a welding control program based on image processing and point cloud processing is stored on the computer-readable storage medium, and when being executed by a processor, the welding control program based on image processing and point cloud processing realizes the steps of any one of the welding control methods based on image processing and point cloud processing provided by the embodiment of the invention.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the system may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described or recited in any embodiment.
Those of ordinary skill in the art would appreciate that the elements and algorithm steps of the examples described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed system/intelligent terminal and method can be implemented in other ways. For example, the above-described system/intelligent terminal embodiments are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and the actual implementation may be implemented by another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
The integrated modules/units described above may be stored in a computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and can implement the steps of the method embodiments when executed by a processor. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, etc. It should be noted that the contents contained in the computer-readable storage medium can be increased or decreased as required by legislation and patent practice in the jurisdiction.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same. Although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those skilled in the art that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein.

Claims (10)

1. A welding control method based on image processing and point cloud processing is characterized by comprising the following steps:
acquiring image data and point cloud data of equipment to be welded, wherein one pixel point in the image data corresponds to one point in the point cloud data;
acquiring reference category data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference category data comprises reference component category information and reference welding point category information, the reference component category information is any one of multiple preset component categories and interference categories, and the reference welding point category information is any one of a welding point and a non-welding point;
acquiring weight data corresponding to each pixel point according to the reference component type information and the reference welding point type information;
performing point cloud smoothing on the point cloud data to obtain a dense point cloud, and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, the reference category data corresponding to each pixel point and the weight data, wherein the target category data comprises target component category information and target welding point category information;
performing point cloud clustering according to the target category data and the dense point cloud, and acquiring each clustered target component and component category data corresponding to each target component, wherein the component category data comprise a component category and an actual component welding area coordinate;
and acquiring standard component welding area coordinates corresponding to each target component according to each component category, and performing welding control on the equipment to be welded according to the actual component welding area coordinates and the standard component welding area coordinates.
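As an illustrative sketch only (not part of the claimed method), the clustering step of claim 1 can be pictured as grouping labeled points by their predicted component category and taking, for each component, the bounding box of its welding points as the actual component welding-area coordinates. The function name, the box representation, and the toy data are all assumptions.

```python
# Hypothetical sketch of the clustering step: group points by predicted
# component category, then derive each component's welding-area coordinates
# as an axis-aligned bounding box over its welding points.
import numpy as np

def cluster_components(points, categories, weld_flags):
    """Group points by component category; return per-category weld-area boxes."""
    components = {}
    for cat in np.unique(categories):
        mask = (categories == cat) & weld_flags        # welding points of this component
        if not mask.any():
            continue
        weld_pts = points[mask]
        components[int(cat)] = {
            "min": weld_pts.min(axis=0),               # lower corner of weld area
            "max": weld_pts.max(axis=0),               # upper corner of weld area
        }
    return components

points = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 0.0], [5.0, 5.0, 1.0], [6.0, 5.0, 1.0]])
categories = np.array([1, 1, 2, 2])                    # component category per point
weld_flags = np.array([True, True, False, True])       # welding-point flag per point
boxes = cluster_components(points, categories, weld_flags)
```

A production pipeline would typically use a spatial clustering algorithm (e.g. Euclidean clustering) rather than category grouping alone, so that two distinct parts of the same category separate into different clusters.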
2. The image processing and point cloud processing based welding control method according to claim 1, wherein the obtaining of the image data and the point cloud data of the device to be welded includes:
acquiring an image of the device to be welded through a preset camera, and acquiring image data corresponding to the device to be welded, wherein the image data comprises an RGB (red, green and blue) image and a depth image, and pixel points of the RGB image and the depth image are in one-to-one correspondence;
and performing laser point cloud scanning on the equipment to be welded to obtain point cloud data corresponding to the equipment to be welded, wherein the number of points in the point cloud data is equal to the number of pixel points in the RGB image or the depth image.
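The one-to-one pixel/point correspondence required by claim 2 can be illustrated with a minimal pinhole back-projection, in which every depth-image pixel maps to exactly one 3-D point, so the point count equals the pixel count. The intrinsics (`fx`, `fy`, `cx`, `cy`) and the toy depth image are assumed values, not taken from the patent.

```python
# Minimal pinhole back-projection sketch: one 3-D point per depth pixel,
# so len(points) == number of pixels in the depth image.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinate grid
    z = depth
    x = (u - cx) * z / fx                           # back-project along camera rays
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)  # one row per pixel

depth = np.full((4, 4), 2.0)                        # toy 4x4 depth image, 2 m everywhere
pts = depth_to_points(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```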
3. The image processing and point cloud processing based welding control method of claim 1, wherein the image feature recognition model is trained according to the following steps:
inputting training image data in image processing training data into the image feature recognition model, performing feature recognition through the image feature recognition model, and acquiring training category data corresponding to each pixel point in the training image data, wherein the image processing training data comprises a plurality of groups of training image groups, and each group of training image group comprises the training image data and label category data corresponding to each pixel point in the training image data;
and adjusting model parameters of the image feature recognition model according to the training category data and the labeling category data, and continuing to execute the step of inputting training image data in the image processing training data into the image feature recognition model until preset image processing training conditions are met, so as to obtain the trained image feature recognition model.
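The training loop of claim 3 (predict per-pixel categories, compare with labels, adjust parameters, stop when a preset condition is met) can be sketched with a toy stand-in. A real image feature recognition model would be a segmentation network; the linear classifier, features, and loss threshold below are illustrative assumptions only.

```python
# Toy stand-in for the claim-3 loop: predict categories, compare with labels,
# adjust parameters, stop once a preset training condition (loss threshold) holds.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                       # per-pixel features (toy)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)     # label category per pixel

w = np.zeros(3)                                     # model parameters
for step in range(500):
    p = 1.0 / (1.0 + np.exp(-X @ w))                # predicted category probability
    grad = X.T @ (p - y) / len(y)                   # cross-entropy gradient
    w -= 0.5 * grad                                 # parameter adjustment
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    if loss < 0.1:                                  # preset training condition
        break

acc = np.mean((p > 0.5) == y)                       # training accuracy
```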
4. The image processing and point cloud processing-based welding control method according to claim 1, wherein the acquiring of the weight data corresponding to each pixel point according to the reference component category information and the reference welding point category information comprises:
acquiring a preset weight level data table, and setting, according to the weight level data table, the reference component category information and the reference welding point category information, a corresponding weight value for each pixel point as the weight data, wherein pixel points whose reference component category information is one of the preset component categories are assigned higher weight values than pixel points whose reference component category information is the interference category, and pixel points whose reference welding point category information is a welding point are assigned higher weight values than pixel points whose reference welding point category information is a non-welding point.
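A hedged sketch of the weight assignment in claim 4: a preset weight-level table maps component kind and welding-point kind to a weight, keeping preset component categories above the interference category and welding points above non-welding points. The numeric levels and the multiplicative combination are assumptions; the patent only fixes the ordering.

```python
# Hypothetical weight-level table; only the orderings (preset part > interference,
# weld > non-weld) come from the claim, the numbers are invented for illustration.
PART_WEIGHT = {"preset_part": 2.0, "interference": 0.5}
WELD_WEIGHT = {"weld": 3.0, "non_weld": 1.0}

def pixel_weight(part_kind, weld_kind):
    """Combine the two table levels into one per-pixel weight value."""
    return PART_WEIGHT[part_kind] * WELD_WEIGHT[weld_kind]

weights = [pixel_weight(p, s) for p, s in
           [("preset_part", "weld"), ("preset_part", "non_weld"),
            ("interference", "weld"), ("interference", "non_weld")]]
```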
5. The image processing and point cloud processing-based welding control method according to claim 1, wherein the performing point cloud smoothing processing on the point cloud data to obtain dense point cloud, and obtaining target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud, reference category data corresponding to each pixel point, and weight data comprises:
performing point cloud smoothing processing on the point cloud data to obtain dense point cloud, wherein the dense point cloud comprises an original point and a new point, and the original point is a point in the point cloud data;
acquiring reference category data and weight data corresponding to each point in the dense point cloud according to the reference category data and the weight data corresponding to each pixel point, wherein the reference category data and the weight data of any original point are the same as those of the pixel point corresponding to that original point, and the reference category data and the weight data of any newly added point are the same as those of the nearest original point to the newly added point;
and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature recognition model according to the dense point cloud and the reference category data and the weight data corresponding to each point in the dense point cloud.
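The label-propagation rule in claim 5 (each newly added point inherits the reference category and weight of its nearest original point) can be sketched with a brute-force nearest-neighbour search; at scale a KD-tree would replace the linear scan. The data below are illustrative only.

```python
# Claim-5 propagation sketch: every densified (newly added) point copies the
# label of the nearest original point. Brute-force nearest neighbour.
import numpy as np

def propagate_labels(original_pts, original_labels, new_pts):
    out = []
    for p in new_pts:
        d = np.linalg.norm(original_pts - p, axis=1)    # distance to each original point
        out.append(original_labels[int(np.argmin(d))])  # copy nearest original label
    return np.array(out)

orig = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])    # original scan points
labels = np.array([1, 2])                               # their reference categories
new_pts = np.array([[1.0, 0.5, 0.0], [9.0, -0.5, 0.0]]) # densified points
new_labels = propagate_labels(orig, labels, new_pts)
```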
6. The image processing and point cloud processing based welding control method of claim 5, wherein the point cloud feature recognition model is trained according to the following steps:
inputting training point cloud data in point cloud processing training data into the point cloud feature recognition model, and acquiring training target class data corresponding to each point in the training point cloud data through the point cloud feature recognition model, wherein the point cloud processing training data comprises a plurality of groups of training point cloud sets, each group of training point cloud set comprises the training point cloud data and labeling target class data corresponding to each point in the training point cloud data, and the training point cloud data comprises a plurality of points, training reference class data corresponding to each point and training weight data;
and adjusting model parameters of the point cloud feature recognition model according to the training target class data and the labeling target class data, and continuing to execute the step of inputting training point cloud data in the point cloud processing training data into the point cloud feature recognition model until preset point cloud processing training conditions are met, so as to obtain the trained point cloud feature recognition model.
7. The image processing and point cloud processing-based welding control method according to claim 1, wherein the welding control of the device to be welded according to the actual part welding area coordinates and the standard part welding area coordinates includes:
calculating and acquiring welding coordinate offset data according to the actual component welding area coordinate and the standard component welding area coordinate;
adjusting a preset welding path corresponding to the equipment to be welded according to the welding coordinate offset data to obtain a corrected welding path;
and welding the equipment to be welded according to the corrected welding path.
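Claim 7's correction can be sketched as a rigid shift: compute the offset between the actual and standard weld-area coordinates, then translate every waypoint of the preset path by that offset. Representing each weld area by a single reference point, and using a pure translation rather than a full rigid transform, are simplifying assumptions.

```python
# Claim-7 sketch: offset = actual - standard; shift every preset waypoint by it.
import numpy as np

def correct_path(preset_path, actual_area, standard_area):
    offset = np.asarray(actual_area) - np.asarray(standard_area)  # welding coordinate offset
    return np.asarray(preset_path) + offset                       # corrected waypoints

preset_path = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]  # preset welding path (toy)
actual = [5.0, 5.1, 0.2]       # measured weld-area reference point (assumed)
standard = [5.0, 5.0, 0.0]     # nominal weld-area reference point (assumed)
path = correct_path(preset_path, actual, standard)
```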
8. A welding control system based on image processing and point cloud processing, the system comprising:
the device comprises a data acquisition module, a data acquisition module and a data processing module, wherein the data acquisition module is used for acquiring image data and point cloud data of a device to be welded, and one pixel point in the image data corresponds to one point in the point cloud data;
the image processing module is used for acquiring reference class data corresponding to each pixel point in the image data through a trained image feature recognition model according to the image data, wherein the reference class data comprise reference component class information and reference welding point class information, the reference component class information is any one of multiple preset component classes and interference classes, and the reference welding point class information is any one of a welding point and a non-welding point;
the weight acquisition module is used for acquiring weight data corresponding to each pixel point according to the reference component type information and the reference welding point type information;
the point cloud processing module is used for carrying out point cloud smoothing processing on the point cloud data to obtain dense point cloud, and acquiring target category data corresponding to each point in the dense point cloud through a trained point cloud feature identification model according to the dense point cloud, reference category data corresponding to each pixel point and weight data, wherein the target category data comprises target component category information and target welding point category information;
the point cloud clustering module is used for performing point cloud clustering according to the target category data and the dense point cloud to obtain clustered target components and component category data corresponding to each target component, wherein the component category data comprise component categories and actual component welding area coordinates;
and the control module is used for acquiring the coordinates of the welding area of the standard component corresponding to each target component according to each component type and carrying out welding control on the equipment to be welded according to the coordinates of the welding area of the actual component and the coordinates of the welding area of the standard component.
9. An intelligent terminal, characterized in that the intelligent terminal comprises a memory, a processor and a welding control program based on image processing and point cloud processing stored on the memory and operable on the processor, wherein the welding control program based on image processing and point cloud processing is executed by the processor to realize the steps of the welding control method based on image processing and point cloud processing according to any one of claims 1-7.
10. A computer-readable storage medium, wherein a welding control program based on image processing and point cloud processing is stored on the computer-readable storage medium, and when executed by a processor, the welding control program based on image processing and point cloud processing implements the steps of the welding control method based on image processing and point cloud processing as claimed in any one of claims 1-7.
CN202310087975.1A 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment Active CN115810133B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310087975.1A CN115810133B (en) 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment

Publications (2)

Publication Number Publication Date
CN115810133A (en) 2023-03-17
CN115810133B (en) 2023-05-23

Family

ID=85487826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310087975.1A Active CN115810133B (en) 2023-02-09 2023-02-09 Welding control method based on image processing and point cloud processing and related equipment

Country Status (1)

Country Link
CN (1) CN115810133B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100152870A1 (en) * 2007-02-19 2010-06-17 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and device for controlling robots for welding workpieces
CN111805131A (en) * 2020-09-02 2020-10-23 季华实验室 Weld track real-time positioning method and device, storage medium and terminal
WO2021213223A1 (en) * 2020-04-20 2021-10-28 广东利元亨智能装备股份有限公司 Weld quality inspection method, apparatus and system, and electronic device
US20220016776A1 (en) * 2020-07-17 2022-01-20 Path Robotics, Inc. Real time feedback and dynamic adjustment for welding robots
CN114227054A (en) * 2022-01-05 2022-03-25 南昌大学 Automatic detection method for tube plate welding seam based on 3D point cloud
CN115035034A (en) * 2022-05-09 2022-09-09 武汉中观自动化科技有限公司 Automatic positioning method and system for thin-wall welding
CN115409808A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Weld joint recognition method and device, welding robot and storage medium
CN115409863A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Workpiece weld extraction method and device, computer readable medium and electronic equipment
CN115409805A (en) * 2022-08-31 2022-11-29 深圳前海瑞集科技有限公司 Workpiece weld joint identification method and device, computer readable medium and electronic equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116168032A (en) * 2023-04-25 2023-05-26 浙江工业大学 Point cloud-based method and point cloud-based system for detecting defects of terminating weld joints
CN116168032B (en) * 2023-04-25 2023-07-18 浙江工业大学 Point cloud-based method and point cloud-based system for detecting defects of terminating weld joints
CN116168031B (en) * 2023-04-25 2023-08-29 中建科技集团有限公司 Welding code generation method based on three-dimensional image, welding system and related equipment

Also Published As

Publication number Publication date
CN115810133B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
US11878433B2 (en) Method for detecting grasping position of robot in grasping object
CN106599830B (en) Face key point positioning method and device
CN115810133B (en) Welding control method based on image processing and point cloud processing and related equipment
CN111563442A (en) Slam method and system for fusing point cloud and camera image data based on laser radar
CN111160269A (en) Face key point detection method and device
CN112837371A (en) Object grabbing method and device based on 3D matching and computing equipment
CN109492688B (en) Weld joint tracking method and device and computer readable storage medium
CN115171097B (en) Processing control method and system based on three-dimensional point cloud and related equipment
CN110322399B (en) Ultrasonic image adjustment method, system, equipment and computer storage medium
CN110310305B (en) Target tracking method and device based on BSSD detection and Kalman filtering
CN114743259A (en) Pose estimation method, pose estimation system, terminal, storage medium and application
CN111259710B (en) Parking space structure detection model training method adopting parking space frame lines and end points
CN110443245A (en) Localization method, device and the equipment of a kind of license plate area under unrestricted scene
CN112336342A (en) Hand key point detection method and device and terminal equipment
CN115793571A (en) Processing equipment control method and system based on multi-mode data and related equipment
CN110007764B (en) Gesture skeleton recognition method, device and system and storage medium
CN111968102B (en) Target equipment detection method, system, medium and electronic terminal
CN111723688A (en) Human body action recognition result evaluation method and device and electronic equipment
CN116604212A (en) Robot weld joint identification method and system based on area array structured light
CN111275758A (en) Hybrid 3D visual positioning method and device, computer equipment and storage medium
CN115862067A (en) Hand gesture recognition method, device, equipment and storage medium
JPH07146121A (en) Recognition method and device for three dimensional position and attitude based on vision
CN111415384B (en) Industrial image component accurate positioning system based on deep learning
CN117399859B (en) Automobile part welding control method and device
CN117649367B (en) Image orientation correction method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant