CN113762400B - Autonomous extraction method for weld joint position based on naive Bayes classifier - Google Patents


Info

Publication number
CN113762400B
CN113762400B (application CN202111066499.2A)
Authority
CN
China
Prior art keywords: data, weld, interference, steps, threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111066499.2A
Other languages
Chinese (zh)
Other versions
CN113762400A (en)
Inventor
何银水
张弛
肖贺
杜雷恒
温珍平
余卓骅
马国红
袁海涛
熊家凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang University
Original Assignee
Nanchang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang University filed Critical Nanchang University
Priority to CN202111066499.2A priority Critical patent/CN113762400B/en
Publication of CN113762400A publication Critical patent/CN113762400A/en
Application granted granted Critical
Publication of CN113762400B publication Critical patent/CN113762400B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2415 Classification techniques relating to the classification model based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06F 18/24155 Bayesian classification
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/136 Segmentation; Edge detection involving thresholding
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an autonomous weld-position extraction method based on a naive Bayes classifier. Several improved Gabor filters with sharper directional discrimination are designed, and their filtering results are linearly combined into a comprehensive direction feature map of the weld position. Local-threshold autonomous segmentation is then performed on the target; through nearest-neighbour clustering and a supervised procedure, probability density functions are designed that distinguish the target from interference by the thickness, uniformity and compactness of each data class, and a naive Bayes classifier preliminarily separates the weld position from the interference. Finally, the data classes that overlap in the horizontal direction compete on unfolded visual features, further removing interference. The invention provides an extraction method for typical joint weld positions that effectively overcomes the adverse effects of interference such as arc light and spatter, stabilizes weld tracking in laser-vision-based automatic arc welding, and improves welding quality.

Description

Autonomous extraction method for weld joint position based on naive Bayes classifier
Technical Field
The invention relates to the technical field of automatic welding, and in particular to an autonomous extraction method for the weld position based on a naive Bayes classifier.
Background
Visual sensing remains one of the main information-acquisition means in current automatic arc welding, and laser visual sensing in thick-plate automatic arc welding is considered one of the effective ways to detect the position to be welded and the weld-forming characteristics. In this process, the laser beam is projected ahead of the welding gun and scans the area to be welded perpendicular to the welding direction. Extraction of the weld position (laser profile) is therefore a prerequisite for subsequent online intervention in the welding process. However, the uncertainty of the welding process, strong spatter, and the variable profile of thick-plate welds make real-time extraction of a valid weld position challenging.
In weld-position extraction, various filters are applied in the different extraction schemes to suppress high-frequency noise and remove prominent noise in the weld image. For weld images acquired with arc-shielding measures, the weld profile can be obtained by traversing columns for the maximum gray value. For images with stronger interference, thresholding is typically used to further eliminate the interference and reduce the data-processing dimension. Finally, the weld position is extracted by morphological methods, model-matching reconstruction, clustering-based indirect extraction, and the like.
Existing methods target weld profiles with smaller grooves (plate thickness below 40 mm), where the spatial span of the weld profile is small and the weld shape remains essentially unchanged during welding. Moreover, because the arc is shielded, the weld images contain little interference. In laser-vision-based thick-plate automated welding, the laser detection span is large and the weld profile is wide, so the interference is more complex, spatter more easily contaminates the weld profile, and the morphology and position of a multi-pass weld change with the number of weld beads. These conditions make effective extraction of the weld position considerably more challenging.
Disclosure of Invention
The invention provides an autonomous weld-position extraction method based on a naive Bayes classifier (Naive Bayes Classifier), aimed at the problems, in laser-vision-based thick-plate gas-metal-arc automatic welding, of a large weld span, full arc and spatter interference, and changeable weld contours in the weld image. The method first detects the directional features of the weld contour with several improved Gabor filters, initially suppressing arc and spatter interference; it then designs, in a supervised manner, probability density functions that effectively distinguish three visual features of laser-stripe data from interference data, and further suppresses interference on the two-dimensional data with a naive Bayes classifier; finally, interference is removed according to the spatial span of each data cluster and the fluctuation degree of its contour. The effectiveness and anti-interference capability of the proposed algorithm are verified by weld-position extraction tests on thick-plate T-joints and butt joints of different shapes and on thin-plate lap joints.
In order to achieve the above purpose, the present invention provides the following technical solutions: an autonomous extraction method of weld joint positions based on a naive Bayes classifier comprises the following steps:
firstly, designing an improved Gabor filter to primarily inhibit strong arc and splash interference of a welding line image;
secondly, performing local-threshold autonomous segmentation on the filtered image and classifying the data with a nearest-neighbour clustering algorithm; designing, from the resulting data classes, three normally distributed prior probability density functions that distinguish target data from interference data by thickness, uniformity and compactness; and realizing preliminary recognition of the weld contour with a naive Bayes classification algorithm according to the maximum a posteriori probability criterion;
and thirdly, designing a calculation method for the visual features of spatial span and fluctuation degree; the preliminarily identified data classes that overlap in the horizontal direction compete on these visual features, and the weld position is finally extracted accurately.
Further, an improved Gabor filter is designed in the first step to primarily suppress strong arc and splash interference of the weld image, and the specific steps are as follows:
The first step: the improved two-dimensional Gabor filter convolution kernel is constructed as formula (1),
where x′ = x cos θ + y sin θ, y′ = −x sin θ + y cos θ, f is the frequency in pixels, θ is the detection direction, σ is the size of the convolution template, and φ is the phase; specifically, σ = 4, and the parameters n and m are designed as formulas (2) and (3):
The convolution kernel constructed in this way markedly increases the contrast between the region along the set direction and the background, highlighting the directional features of the weld position more effectively;
The second step: for weld-position extraction of the T-joint, formulas (2)a-c are used to detect the web, groove and weld region respectively, and formula (2)b is also used to detect the bottom plate; the set filtering angles are θ1 ∈ [−20°, −5°], θ2 ∈ [−110°, −80°], θ3 ∈ [5°, 15°] and θ4 ∈ [15°, 35°]. For extracting the position of the V-groove weld of the butt joint, formulas (3)a-c are used to detect the plate surface and the left and right groove regions respectively;
The third step: linearly combine the filtering results to generate the comprehensive direction feature map.
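Formula (1) itself is not reproduced in this text, so the sketch below builds a conventional 2-D Gabor kernel from the definitions that are stated (x′, y′, f, θ, φ, σ = 4) and linearly combines several oriented responses into one direction feature map, mirroring the third step; gabor_kernel and direction_feature_map are illustrative names, and the patent's improved kernel would replace the conventional construction used here.

```python
import numpy as np

def gabor_kernel(f, theta, sigma=4.0, phase=0.0, size=21):
    """Conventional 2-D Gabor kernel; the patent's improved variant
    modifies this construction (formula (1), not reproduced here)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xp = x * np.cos(theta) + y * np.sin(theta)    # x' = x cos(theta) + y sin(theta)
    yp = -x * np.sin(theta) + y * np.cos(theta)   # y' = -x sin(theta) + y cos(theta)
    envelope = np.exp(-(xp ** 2 + yp ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * f * xp + phase)

def direction_feature_map(image, bank):
    """Linearly combine the responses of several oriented filters
    into one comprehensive direction feature map (third step)."""
    out = np.zeros(image.shape, dtype=float)
    for f, theta in bank:
        k = gabor_kernel(f, theta)
        pad = k.shape[0] // 2
        padded = np.pad(image.astype(float), pad)
        for i in range(image.shape[0]):           # naive 'same' correlation
            for j in range(image.shape[1]):
                out[i, j] += np.sum(padded[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out
```

One kernel per detection direction (web, groove, weld region, bottom plate) would be instantiated with the angles given in the embodiments.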
Further, in the second step, the filtered image is segmented autonomously by local thresholds and data classification is realized with a nearest-neighbour clustering algorithm, specifically as follows:
The first step: perform local-threshold autonomous segmentation on the comprehensive direction feature map to simplify the data; the local thresholds are determined as follows:
(1) Using a 5×5 window with a step size of 5, smooth the image F in the vertical direction to obtain the average gray value at each height position f(i, j), where i denotes the row and j the column; apply linear filtering twice to the vector I formed by these average gray values, with a filter length greater than 19;
(2) obtain each monotonically increasing interval of the filtered vector I′, record the start and end positions of these intervals, and record the gray values at the corresponding positions;
(3) compute the difference between the gray values at the two ends of each monotone interval, and record the monotone intervals D_i whose difference exceeds G gray levels; these represent regions of abrupt gray-value change;
(4) at each end position of the monotone intervals D_i, compute the difference between the gray values of the vectors I and I′, and again record the monotone intervals D_i′ whose difference exceeds G gray levels; these indicate abrupt changes in the rising regions of the original gray values;
(5) centered on the end position of each monotone interval D_i′, compute the average T_i of the gray values of I over the R positions on each side;
(6) determine the number of T_i: if it is greater than 1, the average of the end positions of the adjacent monotone intervals D_1′ and D_2′ is the end position of the first threshold segmentation of this traversal, with T_1 the threshold used; and so on, until the start position of the last threshold segmentation is the average of the end positions of the adjacent monotone intervals D_{i−1}′ and D_i′, its end position is the end of this traversal, and the corresponding threshold is T_i; if the number equals 1, the traversal uses only the single threshold T_1;
(7) within each threshold-segmentation range, search each 5×5 window for pixels whose gray value exceeds the corresponding threshold; only those pixels are assigned the gray value 255, while the gray values of the other pixels in the column region covered by the window are assigned 0, realizing autonomous local-threshold segmentation;
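Steps (2) and (3) above, finding the monotonically increasing intervals of the smoothed profile and keeping those whose total gray-value rise exceeds G, can be sketched as follows; rising_intervals is an illustrative name, and the twice-filtered vector I′ is assumed to be supplied as a plain sequence of gray values.

```python
def rising_intervals(profile, g=10):
    """Return (start, end) index pairs of monotonically increasing runs
    of `profile` whose total rise exceeds g gray levels (steps (2)-(3))."""
    runs, start = [], 0
    for i in range(1, len(profile)):
        if profile[i] < profile[i - 1]:          # increasing run ends here
            if profile[i - 1] - profile[start] > g:
                runs.append((start, i - 1))
            start = i                            # begin a new run
    if profile[-1] - profile[start] > g:         # close the final run
        runs.append((start, len(profile) - 1))
    return runs
```

Each retained interval marks a gray-value abrupt region; step (4) then re-checks these end positions against the unfiltered vector I.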
and a second step of: and carrying out nearest neighbor clustering on the two-dimensional data subjected to autonomous segmentation of the local threshold, wherein the distance threshold for clustering is n pixels, so that a certain number of data classes are obtained.
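A minimal sketch of the nearest-neighbour clustering step, assuming the segmented foreground is given as (row, col) pixel coordinates; a point joins a cluster when it lies within the distance threshold n of any existing member, and clusters bridged by such a point are merged (nn_cluster is an illustrative name).

```python
def nn_cluster(points, n=2.0):
    """Chain-style nearest-neighbour clustering of 2-D pixel coordinates
    with distance threshold n; returns a list of clusters (lists of points)."""
    clusters = []
    for p in points:
        # clusters containing at least one member within n of p
        hits = [c for c in clusters
                if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= n * n for q in c)]
        if hits:
            merged = [q for c in hits for q in c] + [p]   # merge all touched clusters
            clusters = [c for c in clusters if c not in hits]
            clusters.append(merged)
        else:
            clusters.append([p])
    return clusters
```

With n = 2 pixels, as in embodiment 1, each resulting cluster is one candidate data class for the classifier.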
Further, the three prior probability density functions designed in the second step to distinguish target data from interference data are designed as follows:
The first step: the mean values of the width, width uniformity and compactness of the weld position in the vertical direction are w, u and cp pixels respectively, with variances δ1, δ2 and δ3 pixels; their prior probability density functions are described by formula (4), reconstructed here as the normal densities these parameters define:

P(X_k | C_1) = 1/(√(2π) δ_k) · exp(−(X_k − μ_k)² / (2 δ_k²)), k = 1, 2, 3, with (μ1, μ2, μ3) = (w, u, cp)   (4)

where C_1 denotes the laser-stripe class, X_1 the width attribute of a data class, X_2 its width-uniformity attribute, and X_3 its width-compactness attribute; for the i-th data class, its average width in the vertical direction is the mean of W_ij over its N_i horizontal coordinates, its width-uniformity value is the average deviation of the width from this average width, N_i is the number of horizontal coordinates of the i-th data class, W_ij is the width of the i-th data class at the j-th horizontal coordinate, and n_ij is the number of data points of the i-th data class in the j-th column;
The second step: since the interference data are the opposite event of the weld-position data, and by the maximum a posteriori probability criterion their prior probability density should be smaller, the probability density function is described by formula (5), reconstructed here as the complement:

P(X_k | C_2) = 1 − P(X_k | C_1), k = 1, 2, 3   (5)

where C_2 denotes the interference-data class; in addition, it is agreed that P(C_1) = P(C_2).
Further, in the second step, preliminary recognition of the weld contour is realized with the naive Bayes classification algorithm according to the maximum a posteriori probability criterion, specifically as follows:
The first step: compute, for each data class, its conditional probabilities under C_1 and under C_2;
The second step: decide the category of each data class by the maximum a posteriori probability rule; the decision rule for the weld position is:

P(X_1|C_1) P(X_2|C_1) P(X_3|C_1) > P(X_1|C_2) P(X_2|C_2) P(X_3|C_2)   (6)

If formula (6) is satisfied, the data class is weld-position data; otherwise it is interference data.
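The classification step can be sketched as below. The class-conditional densities for the stripe class are normal, with the means and variances stated in embodiment 1 (w = 3, u = 0.5, cp = 0; δ = 0.4, 0.4, 0.5, treated here as standard deviations); the exact form of the interference density (5) is not fully recoverable from this text, so 1 − P(X_k|C_1) is used purely as an assumption based on the "opposite events" remark. normal_pdf, STRIPE_PARAMS and is_weld_class are illustrative names.

```python
import math

def normal_pdf(x, mean, delta):
    """Normal density N(mean, delta^2), as in formula (4)."""
    return math.exp(-(x - mean) ** 2 / (2 * delta ** 2)) / (math.sqrt(2 * math.pi) * delta)

# (mean, delta) for width, width uniformity, compactness: embodiment-1 values
STRIPE_PARAMS = [(3.0, 0.4), (0.5, 0.4), (0.0, 0.5)]

def is_weld_class(features):
    """Decision rule (6) with P(C1) = P(C2): the data class is weld-position
    data iff the product of stripe densities beats the interference product.
    P(Xk|C2) = 1 - P(Xk|C1) is an assumed reconstruction of formula (5)."""
    p1 = p2 = 1.0
    for x, (mu, d) in zip(features, STRIPE_PARAMS):
        q = normal_pdf(x, mu, d)
        p1 *= q
        p2 *= 1.0 - q
    return p1 > p2
```

A data class whose measured width, uniformity and compactness sit near the stripe means is accepted; one far from all three means is rejected as interference.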
Further, the specific steps of the third step are as follows:
The first step: the spatial span L_k of the k-th data class is defined by formula (7), in terms of the coordinates (x_j^k, y_j^k) of the members of the k-th data class in the image;
The second step: the fluctuation degree of the contour of the k-th data class is defined as:

M_k = N(s_k · s_{k+1} < 0)   (8)

where s_k denotes the slope between consecutive contour points and N(·) denotes the counting operation;
The third step: compute the visual feature L_k / M_k; the larger it is, the more likely the data class belongs to the weld-position data;
The fourth step: data classes whose coordinates overlap in the horizontal direction compete on L_k / M_k; the class with the larger L_k / M_k is regarded as weld-position data, further removing interference.
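The third-step competition can be sketched as follows. The exact form of the span formula (7) is not reproduced in this text, so the span is taken here as the planar extent between the extreme members of the class (an assumption); the fluctuation count follows formula (8), counting sign changes of consecutive profile slopes. span_and_fluctuation is an illustrative name.

```python
import math

def span_and_fluctuation(xs, ys):
    """Visual features of one data class: span (assumed here to be the
    extent between extreme members) and flips (formula (8): number of
    sign changes of consecutive slopes along the profile)."""
    span = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    slopes = [(ys[j] - ys[j - 1]) / (xs[j] - xs[j - 1])
              for j in range(1, len(xs)) if xs[j] != xs[j - 1]]
    flips = sum(1 for a, b in zip(slopes, slopes[1:]) if a * b < 0)
    return span, flips
```

Classes overlapping in the horizontal direction would then compete on span / flips (larger wins), mirroring the fourth step; a zero flip count would need guarding before the division.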
Compared with the prior art, the invention has the following beneficial effects:
(1) The method effectively overcomes the adverse effects of interference such as arc light and spatter, stabilizes weld tracking in laser-vision-based automatic arc welding, and improves welding quality.
(2) An improved Gabor filter design method is provided; the filter highlights the directional features of the detection target more effectively than the traditional Gabor filter and provides a reference for Gabor-filtering-based detection of directional target features.
(3) A naive Bayes classifier is designed that effectively separates the weld position from background interference under laser visual sensing.
Drawings
FIG. 1 is a flow chart of weld location extraction based on a naive Bayes classifier;
FIG. 2 is an acquisition graph of weld position integrated directional characteristics based on a modified Gabor filter; (a) is an original image; (b) a directional profile generated for the modified Gabor filter; (c) a directional profile generated for a conventional Gabor filter;
FIG. 3 is an example of local threshold autonomous segmentation;
FIG. 4 is a classification diagram of data based on a naive Bayes classifier; (a) is a nearest neighbor clustering result; (b) classifying the data;
FIG. 5 is a diagram of a weld position extraction process; (a) preliminary recognition results of weld positions; (b) extracting a weld position;
FIG. 6 is a graph showing the result of the process of extracting the position of a butt joint weld; (a) is an original image; (b) a processing result graph;
FIG. 7 is a graph showing the results of a process for extracting lap joint weld position in accordance with the method of the present invention; (a) is an original image; (b) is a processing result graph.
Detailed Description
The present invention will be described in further detail below with reference to the drawings and embodiments, in order to make its objects, technical solutions and advantages clearer. The specific embodiments described herein serve only to illustrate the technical solution of the invention and are not to be construed as limiting it.
Example 1 autonomous extraction of T-joint weld position
The extraction flow chart is shown in fig. 1, and comprises the following steps:
1. the experiment adopts the GMAW welding method; a 50 mm-thick T-joint and a butt joint are selected, and the welding process is carried out by an automatic welding robot at a welding speed of 300.0 mm/min and a wire-feed speed of 9000.0 mm/min, with the welding current, voltage and wire-feed speed automatically matched;
2. collecting an original welding line image, wherein the image comprises a complete arc area, a filling area, laser rays, a web plate and a bottom plate edge; the original image is shown in fig. 2 (a);
3. directional feature detection based on the improved Gabor filter: formulas (2)a-c are used to detect the web, groove and weld region respectively, and formula (2)b is used to detect the bottom plate; the filtering angles θ1 to θ4 are set to −10°, −110°, 5° and 30° respectively. The direction feature map generated by the improved Gabor filter is shown in fig. 2(b), and that generated by the traditional Gabor filter in fig. 2(c); the improved Gabor filter highlights the directional features of the detection target more effectively than the traditional one;
4. when performing local-threshold autonomous segmentation on the obtained direction feature map, the lengths of the two linear filters are set to 23, the gray-value difference is G = 10, and R = 6; an example of local-threshold autonomous segmentation is shown in fig. 3;
5. when nearest neighbor clustering is carried out on the two-dimensional data, setting a clustering distance threshold value as n=2 pixels; the nearest neighbor clustering result is shown in fig. 4 (a);
6. when data classification is implemented with the naive Bayes classifier, the visual-feature probability density functions describing the width, width uniformity and compactness of the weld position are designed with w = 3 pixels, u = 0.5 pixels, cp = 0 pixels, and variances δ1 = 0.4 pixels, δ2 = 0.4 pixels, δ3 = 0.5 pixels; the data classification result is shown in fig. 4(b);
7. based on the maximum posterior probability criterion, performing preliminary recognition of the weld joint position; the preliminary recognition result of the weld position is shown in fig. 5 (a);
8. performing visual feature competition, and finally extracting weld contour data; the weld position extraction result is shown in fig. 5 (b).
Example 2 autonomous extraction of Butt joint weld position
The autonomous extraction process of the weld position is similar to that of example 1, but the filtering angles θ1 to θ4 are set to −1°, −10°, and 1°, respectively (fig. 6).
Example 3 overlap joint weld position autonomous extraction
The autonomous extraction process of the weld position is similar to that of example 1, but the filtering angles θ1 to θ4 are set to −2°, −80°, 80° and 2°, respectively (fig. 7).
The foregoing is only a preferred embodiment of the present invention, described in specific and detailed terms, which is not therefore to be construed as limiting the scope of the invention. It should be noted that those skilled in the art can make modifications, improvements and substitutions without departing from the spirit of the invention, and these all fall within the scope of the invention. Accordingly, the scope of protection of the present invention is determined by the appended claims.

Claims (3)

1. An autonomous extraction method of weld joint positions based on a naive Bayes classifier, characterized by comprising the following steps:
firstly, designing an improved Gabor filter to primarily inhibit strong arc and splash interference of a welding line image;
secondly, performing local-threshold autonomous segmentation on the filtered image and classifying the data with a nearest-neighbour clustering algorithm; designing, from the resulting data classes, three normally distributed prior probability density functions that distinguish target data from interference data by thickness, uniformity and compactness; and realizing preliminary recognition of the weld contour with a naive Bayes classification algorithm according to the maximum a posteriori probability criterion;
thirdly, designing a calculation method for the visual features of spatial span and fluctuation degree; the preliminarily identified data classes that overlap in the horizontal direction compete on these visual features, and the weld position is finally extracted accurately;
in the first step, the improved Gabor filter is designed to initially suppress the strong arc and spatter interference of the weld image, specifically as follows:
The first step: the improved two-dimensional Gabor filter convolution kernel is constructed as formula (1),
where x′ = x cos θ + y sin θ, y′ = −x sin θ + y cos θ, f is the frequency in pixels, θ is the detection direction, σ is the size of the convolution template, and φ is the phase; the parameters n and m are designed as formulas (2) and (3);
The second step: for weld-position extraction of the T-joint, formulas (2)a-c are used to detect the web, groove and weld region respectively, and formula (2)b is also used to detect the bottom plate; the detection directions are θ1, θ2, θ3 and θ4, where θ1 is the detection direction set for the web, θ2 for the groove, θ3 for the weld region, and θ4 for the bottom plate; for extracting the position of the V-groove weld of the butt joint, formulas (3)a-c are used to detect the plate surface and the left and right groove regions respectively;
The third step: linearly combine the filtering results to generate the comprehensive direction feature map;
the three prior probability density functions designed in the second step to distinguish target data from interference data are designed as follows:
The first step: the mean values of the width, width uniformity and compactness of the weld position in the vertical direction are w, u and cp pixels respectively, with variances δ1, δ2 and δ3 pixels; their prior probability density functions are described by formula (4), reconstructed here as the normal densities these parameters define:

P(X_k | C_1) = 1/(√(2π) δ_k) · exp(−(X_k − μ_k)² / (2 δ_k²)), k = 1, 2, 3, with (μ1, μ2, μ3) = (w, u, cp)   (4)

where C_1 denotes the laser-stripe class, X_1 the width attribute of a data class, X_2 its width-uniformity attribute, and X_3 its width-compactness attribute; for the i-th data class, its average width in the vertical direction is the mean of W_ij over its N_i horizontal coordinates, its width-uniformity value is the average deviation of the width from this average width, N_i is the number of horizontal coordinates of the i-th data class, W_ij is the width of the i-th data class at the j-th horizontal coordinate, and n_ij is the number of data points of the i-th data class in the j-th column;
The second step: since the interference data are the opposite event of the weld-position data, and by the maximum a posteriori probability criterion their prior probability density should be smaller, the probability density function is described by formula (5), reconstructed here as the complement:

P(X_k | C_2) = 1 − P(X_k | C_1), k = 1, 2, 3   (5)

where C_2 denotes the interference-data class; in addition, it is agreed that P(C_1) = P(C_2);
in the second step, preliminary recognition of the weld contour is realized with the naive Bayes classification algorithm according to the maximum a posteriori probability criterion, specifically as follows:
The first step: compute, for each data class, its conditional probabilities under C_1 and under C_2;
The second step: decide the category of each data class by the maximum a posteriori probability rule; the decision rule for the weld position is:

P(X_1|C_1) P(X_2|C_1) P(X_3|C_1) > P(X_1|C_2) P(X_2|C_2) P(X_3|C_2)   (6)

If formula (6) is satisfied, the data class is weld-position data; otherwise it is interference data.
2. The autonomous extraction method of weld locations based on a naive bayes classifier of claim 1, wherein: in the second step, the filtering image is subjected to local threshold autonomous segmentation, and data classification is realized according to a nearest neighbor clustering algorithm, and the specific steps are as follows:
the first step: the local threshold autonomous segmentation is carried out on the comprehensive direction feature diagram, and the specific local threshold determination method comprises the following steps:
(1) vertically aligning images using a 5 gamma 5 window with a step size of 5FSmoothing to obtain average gray values at different height positionsWherein->Representing row (s)/(s)>Represents a column and is associated with a vector of average gray values obtained everywhere +.>Performing linear filtering twice, wherein the length of the filter is larger than 19;
(2) obtaining filtered vectorsThe initial position and the end position of the monotonically increasing interval are recorded, and the gray value of the corresponding position is recorded at the same time;
(3) calculating the difference value of gray values at two ends of each monotonic interval, and recording that the difference value is larger thanMonotone interval of individual pixels->The gray value abrupt region is expressed by this;
(4) calculating monotonic intervalsThe termination positions are respectively in the vector->And->The difference in gray values of (2) is recorded again as being greater thanMonotone interval of individual pixels->Representing abrupt changes in the original gray value rising region;
(5) in each monotone intervalThe end position of (2) is taken as the center, and the left and right sides of the end position are obtainedRThe individual position is +.>Average of gray values of (a)
(6) Determination ofIf the number is greater than 1, then the number is +.>、/>The average position of the end positions of (a) is the end position of the first threshold segmentation of the traversal,/-, and>for the threshold used for the first threshold segmentation, and so on, the starting position of the last threshold segmentation is adjacent monotonic interval +.>、/>The average position of the end positions of the current traversal is the end position of the current traversal, and the corresponding threshold value is +.>The method comprises the steps of carrying out a first treatment on the surface of the If the number is equal to 1, then the current traversal uses only one threshold +.>
(7) within each threshold-segmentation range, searching each 5×5 window for pixel points whose gray value is greater than the corresponding threshold; only these pixel points are assigned the gray value 255, while the other pixel points in the column region covered by the 5×5 window are assigned 0, thereby realizing autonomous segmentation with local thresholds;
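The interval bookkeeping of steps (2)–(3) above can be sketched in Python (an illustrative sketch only: the function name, the plain-list representation of the filtered profile, and the `min_rise` parameter are assumptions, not part of the claimed method):

```python
def monotonic_rising_intervals(v, min_rise):
    """Return (start, end) index pairs of monotonically increasing
    runs in the 1-D gray-value profile v whose total rise exceeds
    min_rise, i.e. candidate abrupt-change regions as in step (3)."""
    intervals = []
    start = None
    for i in range(1, len(v)):
        if v[i] > v[i - 1]:          # still rising: open/extend a run
            if start is None:
                start = i - 1
        elif start is not None:      # run ended at index i - 1
            if v[i - 1] - v[start] > min_rise:
                intervals.append((start, i - 1))
            start = None
    # close a run that reaches the end of the profile
    if start is not None and v[-1] - v[start] > min_rise:
        intervals.append((start, len(v) - 1))
    return intervals
```

For the profile `[0, 1, 2, 3, 2, 2, 3, 8, 9, 1]` with a rise threshold of 2, the runs `(0, 3)` and `(5, 8)` are kept.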
and a second step of: performing nearest-neighbor clustering on the two-dimensional data obtained from the autonomous local-threshold segmentation, with a clustering distance threshold of a preset number of pixels, thereby obtaining a number of data classes.
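The nearest-neighbor clustering of the second step can be sketched as follows (a simplified sequential variant: the claim does not specify the exact procedure, and the function name and the greedy join rule are illustrative assumptions):

```python
import math

def nearest_neighbor_cluster(points, dist_thresh):
    """Greedy nearest-neighbor clustering of 2-D pixel coordinates:
    a point joins the first existing cluster that has a member
    within dist_thresh pixels, otherwise it founds a new cluster."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) <= dist_thresh for q in c):
                c.append(p)
                break
        else:  # no cluster close enough: start a new data class
            clusters.append([p])
    return clusters
```

Note that this single pass never merges two clusters that only become connected by a later point; a production version would add a merge step.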
3. The autonomous extraction method of weld locations based on a naive bayes classifier of claim 1, wherein: the specific steps of the third step are as follows:
the first step: the spatial span of the k-th data class is defined as:

L_k = max_i(x_i^k) - min_i(x_i^k) (7),

wherein (x_i^k, y_i^k) are the coordinates of the data members of the k-th data class in the image;
and a second step of: the degree of fluctuation of the profile of the k-th data class is defined as:

F_k = var(s_i^k) (8),

wherein s_i^k denotes the slope between adjacent data members of the k-th data class and var(·) denotes the variance-taking operation;
and a third step of: computing the visual feature of each data class from its spatial span and degree of fluctuation; the larger the visual feature, the higher the likelihood that the data class belongs to the weld position data;
fourth step: for pairs of data classes whose coordinates overlap in the horizontal direction, the visual features compete, and the data class with the larger visual feature is taken as the weld position data, thereby further removing interference.
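The third-step feature computation and competition can be sketched as follows (illustrative only: the exact forms of equations (7) and (8), and the way span and fluctuation are combined into the visual feature, are assumptions made for this sketch, not the claimed formulas):

```python
def spatial_span(points):
    """Horizontal extent of a data class (in the spirit of eq. (7))."""
    xs = [x for x, _ in points]
    return max(xs) - min(xs)

def fluctuation(points):
    """Variance of the slopes between horizontally adjacent members,
    a proxy for the profile-fluctuation measure of eq. (8)."""
    pts = sorted(points)
    slopes = [(y2 - y1) / (x2 - x1)
              for (x1, y1), (x2, y2) in zip(pts, pts[1:]) if x2 != x1]
    if not slopes:
        return 0.0
    mean = sum(slopes) / len(slopes)
    return sum((s - mean) ** 2 for s in slopes) / len(slopes)

def visual_feature(points):
    """Long, smooth stripes score high; short, jagged interference low."""
    return spatial_span(points) / (1.0 + fluctuation(points))
```

Between two horizontally overlapping data classes, the one with the larger `visual_feature` would be kept as the weld position data.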
CN202111066499.2A 2021-09-13 2021-09-13 Autonomous extraction method for weld joint position based on naive Bayes classifier Active CN113762400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111066499.2A CN113762400B (en) 2021-09-13 2021-09-13 Autonomous extraction method for weld joint position based on naive Bayes classifier

Publications (2)

Publication Number Publication Date
CN113762400A CN113762400A (en) 2021-12-07
CN113762400B true CN113762400B (en) 2023-10-31

Family

ID=78795088

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116823703B (en) * 2023-02-03 2024-04-19 肇庆学院 Structural laser weld image processing method based on Gabor filtering

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110043033A (en) * 2009-10-20 2011-04-27 삼성전기주식회사 Metal pad state detection method using gabor filter
CN106952281A (en) * 2017-05-15 2017-07-14 上海交通大学 A kind of method that weld profile feature recognition and its welding bead are planned in real time
CN109003279A (en) * 2018-07-06 2018-12-14 东北大学 Fundus retina blood vessel segmentation method and system based on K-Means clustering labeling and naive Bayes model
CN110717872A (en) * 2019-10-08 2020-01-21 江西洪都航空工业集团有限责任公司 Method and system for extracting characteristic points of V-shaped welding seam image under laser-assisted positioning
CN110717531A (en) * 2019-09-27 2020-01-21 华东师范大学 Method for detecting classified change type based on uncertainty analysis and Bayesian fusion
CN111968072A (en) * 2020-07-07 2020-11-20 南昌大学 Thick plate T-shaped joint welding position autonomous decision-making method based on Bayesian network

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10811151B2 (en) * 2017-07-05 2020-10-20 Electric Power Research Institute, Inc. Apparatus and method for identifying cracks in a structure using a multi-stage classifier

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Effective extraction of the laser stripe of a V-shaped weld seam against a molten pool background based on orientation saliency; Yu Zhuohua; Hu Yanmei; He Yinshui; Welding (Issue 01); 43-45 *
Underwater weld seam recognition based on a laser vision system; He Yinshui; Yang Jie; Yu Zhuohua; Electric Welding Machine (Issue 11); 15-20 *

Similar Documents

Publication Publication Date Title
Muhammad et al. A robust butt welding seam finding technique for intelligent robotic welding system using active laser vision
Wang et al. A robust weld seam recognition method under heavy noise based on structured-light vision
Muhammad et al. Welding seam profiling techniques based on active vision sensing for intelligent robotic welding
Li et al. Robust welding seam tracking and recognition
CN108765419B (en) Structured light vision welding seam image information self-adaptive extraction method
CN107424144B (en) Laser vision-based weld joint tracking image processing method
CN110717872B (en) Method and system for extracting characteristic points of V-shaped welding seam image under laser-assisted positioning
CN103425988B (en) Real-time positioning and matching method with arc geometric primitives
CN108537808A (en) A kind of gluing online test method based on robot teaching point information
CN111696107A (en) Molten pool contour image extraction method for realizing closed connected domain
CN111127402A (en) Visual detection method for welding quality of robot
CN110814465B (en) Universal method for automatically extracting welding seam contour
Zhang et al. Narrow-seam identification and deviation detection in keyhole deep-penetration TIG welding
CN113674206B (en) Extraction method suitable for characteristic parameters of deep-melting K-TIG welding molten pool and keyhole entrance
CN113762400B (en) Autonomous extraction method for weld joint position based on naive Bayes classifier
CN116833645A (en) Weld autonomous identification and welding method and system based on mobile robot
CN113723494A (en) Laser visual stripe classification and weld joint feature extraction method under uncertain interference source
Lin et al. Intelligent seam tracking of an ultranarrow gap during K-TIG welding: a hybrid CNN and adaptive ROI operation algorithm
CN117058404A (en) Multi-type welding groove feature extraction method based on three-dimensional point cloud
Sultana et al. Lane detection and tracking under rainy weather challenges
He et al. Parameter self-optimizing clustering for autonomous extraction of the weld seam based on orientation saliency in robotic MAG welding
CN113012181B (en) Novel quasi-circular detection method based on Hough transformation
CN113579467A (en) Welding seam identification method and device for welding robot and storage medium
CN117372827A (en) Sonar image statistics enhancement algorithm based on boundary constraint
CN110348363B (en) Vehicle tracking method for eliminating similar vehicle interference based on multi-frame angle information fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant