CN109492688B - Weld joint tracking method and device and computer readable storage medium - Google Patents


Info

Publication number
CN109492688B
CN109492688B
Authority
CN
China
Prior art keywords: weld, image, determining, tracked, pollution degree
Prior art date
Legal status: Active
Application number
CN201811309646.2A
Other languages
Chinese (zh)
Other versions
CN109492688A
Inventor
彭佳勇
戴国政
戴国鸿
Current Assignee
Shenzhen Yibu Zhizao Technology Co ltd
Original Assignee
Shenzhen Yibu Zhizao Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Yibu Zhizao Technology Co ltd filed Critical Shenzhen Yibu Zhizao Technology Co ltd
Priority to CN201811309646.2A
Publication of application CN109492688A
Application granted; publication of grant CN109492688B


Classifications

    • G06F18/241 — Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/214 — Pattern recognition; generating training patterns; bootstrap methods, e.g. bagging or boosting
    • G06V10/40 — Image or video recognition or understanding; extraction of image or video features

Abstract

The invention discloses a welding seam tracking method, which comprises the following steps: acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image; determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network; and determining a target area corresponding to the weld characteristic point in the image to be tracked based on the initial position so as to carry out the next welding according to the target area. The invention also discloses a welding seam tracking device and a computer readable storage medium. The invention improves the accuracy of welding seam tracking in the welding process.

Description

Weld joint tracking method and device and computer readable storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for tracking a weld joint, and a computer-readable storage medium.
Background
Weld seam tracking is one of the key technologies of automatic welding. Common welding robots mostly adopt a teach-and-playback mode of operation, in which machining errors and welding thermal deformation reduce welding quality and cause welding defects; a welding robot with a real-time seam tracking function is the main means of solving these problems. At present, visual sensing methods are widely used for seam tracking. Because a large amount of interference such as arc light and spatter exists in the welding process, the acquired weld images contain much noise, which hinders the extraction of weld features and, in severe cases, even submerges the weld feature information. How to obtain a high-quality image is therefore a key problem to be solved in visually guided seam tracking.
Disclosure of Invention
The main purpose of the present invention is to provide a weld seam tracking method, a weld seam tracking device, and a computer readable storage medium, aiming to solve the technical problem that weld features cannot be accurately extracted, due to noise interference, from the weld images acquired in the existing visually guided seam tracking process.
In order to achieve the above object, the present invention provides a weld tracking method including:
acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image;
determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network;
and determining a target area corresponding to the weld characteristic point in the image to be tracked based on the initial position so as to carry out the next welding according to the target area.
Optionally, the step of acquiring a weld image and determining an initial position corresponding to the weld feature point based on the weld image includes:
acquiring a welding seam image in a welding process, wherein the welding seam image comprises a plurality of frames of welding seam images;
determining an initial position of a weld feature point based on a first frame of weld image, wherein the first frame of weld image is a weld image without noise pollution.
Optionally, the step of determining the image to be tracked corresponding to the weld feature point in the weld image based on the preset pollution degree scoring network includes:
based on a preset pollution degree grading network, grading the pollution degree of the plurality of frames of welding line images, and determining pollution degree scores corresponding to the plurality of frames of welding line images respectively;
judging whether the pollution degree score is smaller than or equal to a preset pollution degree threshold value or not;
and if so, determining the welding seam image corresponding to the pollution degree score as an image to be tracked.
Optionally, before the step of determining the image to be tracked corresponding to the weld feature point in the weld image based on the preset pollution degree scoring network, the method further includes:
acquiring weld image samples with different noise pollution degrees, and performing off-line training on a preset pollution degree scoring network based on the weld image samples so that the preset pollution degree scoring network can output pollution degree scores corresponding to the weld image samples;
and storing the preset pollution degree scoring network after the off-line training.
Optionally, after the step of determining whether the pollution level score is less than or equal to a preset pollution level threshold, the method further includes:
and if the pollution degree score is larger than the preset pollution degree threshold value, determining the welding seam image corresponding to the minimum pollution degree score as the image to be tracked.
Optionally, the step of determining a target region corresponding to the weld feature point in the image to be tracked based on the initial position so as to perform the next welding according to the target region includes:
determining a plurality of areas to be tracked in the image to be tracked based on the initial positions;
scoring the plurality of areas to be tracked based on a preset discrimination network to obtain corresponding tracking scores;
and determining the region to be tracked corresponding to the highest tracking score as a target region corresponding to the weld joint feature point, and determining the position information corresponding to the target region so as to carry out the next welding.
Optionally, before the step of determining a target region corresponding to the weld feature point in the image to be tracked based on the initial position so as to perform the next welding according to the target region, the method further includes:
acquiring a weld image sample in a preset time period, and determining position information of weld characteristic points respectively corresponding to the weld image sample in the preset time period;
dividing the weld image sample in the preset time period into a positive sample and a negative sample based on the position information, and putting the positive sample and the negative sample into a preset discrimination network for off-line training so that the preset discrimination network outputs tracking scores corresponding to the positive sample and the negative sample;
and storing the preset discrimination network after the off-line training.
Optionally, the step of determining the position information corresponding to the target area so as to perform the next welding includes:
and extracting image information corresponding to the target area, and processing the image information corresponding to the target area to determine position information corresponding to the target area so as to perform the next welding based on the position information.
Further, to achieve the above object, the present invention also provides a bead-tracking apparatus including: a memory, a processor, and a weld tracking program stored on the memory and executable on the processor, the weld tracking program when executed by the processor implementing the steps of the weld tracking method as described above.
Further, to achieve the above object, the present invention also provides a computer readable storage medium having stored thereon a weld tracking program, which when executed by a processor, implements the steps of the weld tracking method as described above.
The invention provides a weld seam tracking method. A weld image is first acquired and the initial position of the weld characteristic point is determined from it; an image to be tracked corresponding to the weld characteristic point is then determined in the weld image based on a preset pollution degree scoring network; finally, a corresponding target area is determined in the image to be tracked based on the initial position of the weld characteristic point, and the next welding is carried out according to the target area. By identifying the initial position of the weld characteristic point from the weld image, scoring the pollution degree of the weld images to select the image to be tracked, and then locating the target area where the characteristic point appears on the basis of the initial position, the method eliminates the influence of ambient light such as arc light and spatter on characteristic point extraction, improves the accuracy of that extraction, and achieves accurate seam tracking.
Drawings
FIG. 1 is a schematic diagram of an apparatus in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the weld tracking method of the present invention;
FIG. 3 is a schematic view of a weld tracking process in an embodiment of the present invention;
FIG. 4 is a schematic view of a first frame of a weld image in an embodiment of the present invention;
FIG. 5 is a detailed flowchart of step S20 in FIG. 2;
FIG. 6 is a schematic flow chart of a second embodiment of the weld tracking method of the present invention;
FIG. 7 is a schematic view of an arc-shaped weld in an embodiment of the invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The main solution of the embodiment of the invention is as follows: acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image; determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network; and determining a target area corresponding to the weld characteristic point in the image to be tracked based on the initial position so as to carry out the next welding according to the target area. The technical scheme of the embodiment of the invention solves the technical problem that weld features cannot be accurately extracted, due to noise interference, from the weld images acquired in the existing visually guided seam tracking process.
As shown in fig. 1, fig. 1 is a schematic device structure diagram of a hardware operating environment according to an embodiment of the present invention.
The device of the embodiment of the invention can be a PC, and can also be a mobile terminal device with a display function, such as a smart phone, a tablet computer, a portable computer and the like.
As shown in fig. 1, the apparatus may include: a processor 1001, such as a CPU, a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. Wherein a communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a Display screen (Display), an input unit such as a Keyboard (Keyboard), and the optional user interface 1003 may also include a standard wired interface, a wireless interface. The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory (e.g., a magnetic disk memory). The memory 1005 may alternatively be a storage device separate from the processor 1001.
Optionally, the device may also include a camera, RF (Radio Frequency) circuitry, audio circuitry, a Wi-Fi module, and sensors such as light sensors and motion sensors. Of course, the device may also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
Those skilled in the art will appreciate that the configuration of the device shown in fig. 1 is not intended to be limiting of the device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in FIG. 1, memory 1005, which is one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a weld tracking program.
In the apparatus shown in fig. 1, the network interface 1004 is mainly used for connecting to a backend server and performing data communication with it; the user interface 1003 is mainly used for connecting to a client (user side) and performing data communication with it; and the processor 1001 and the memory 1005 may be provided in the seam tracking apparatus, where the processor 1001 calls the seam tracking program stored in the memory 1005 and performs the following operations:
acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image;
determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network;
and determining a target area corresponding to the weld characteristic point in the image to be tracked based on the initial position so as to carry out the next welding according to the target area.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
acquiring a welding seam image in a welding process, wherein the welding seam image comprises a plurality of frames of welding seam images;
determining an initial position of a weld feature point based on a first frame of weld image, wherein the first frame of weld image is a weld image without noise pollution.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
based on a preset pollution degree grading network, grading the pollution degree of the plurality of frames of welding line images, and determining pollution degree scores corresponding to the plurality of frames of welding line images respectively;
judging whether the pollution degree score is smaller than or equal to a preset pollution degree threshold value or not;
and if so, determining the welding seam image corresponding to the pollution degree score as an image to be tracked.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
acquiring weld image samples with different noise pollution degrees, and performing off-line training on a preset pollution degree scoring network based on the weld image samples so that the preset pollution degree scoring network can output pollution degree scores corresponding to the weld image samples;
and storing the preset pollution degree scoring network after the off-line training.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
and if the pollution degree score is larger than the preset pollution degree threshold value, determining the welding seam image corresponding to the minimum pollution degree score as the image to be tracked.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
determining a plurality of areas to be tracked in the image to be tracked based on the initial positions;
scoring the plurality of areas to be tracked based on a preset discrimination network to obtain corresponding tracking scores;
and determining the region to be tracked corresponding to the highest tracking score as a target region corresponding to the weld joint feature point, and determining the position information corresponding to the target region so as to carry out the next welding.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
acquiring a weld image sample in a preset time period, and determining position information of weld characteristic points respectively corresponding to the weld image sample in the preset time period;
dividing the weld image sample in the preset time period into a positive sample and a negative sample based on the position information, and putting the positive sample and the negative sample into a preset discrimination network for off-line training so that the preset discrimination network outputs tracking scores corresponding to the positive sample and the negative sample;
and storing the preset discrimination network after the off-line training.
Further, processor 1001 may invoke a weld tracking program stored in memory 1005 to also perform the following operations:
and extracting image information corresponding to the target area, and processing the image information corresponding to the target area to determine position information corresponding to the target area so as to perform the next welding based on the position information.
According to the scheme provided by this embodiment, a weld image is first acquired and the initial position of the weld feature point is determined from it; an image to be tracked corresponding to the weld feature point is then determined in the weld image based on a preset pollution degree scoring network; finally, a corresponding target area is determined in the image to be tracked based on the initial position of the weld feature point, and the next welding step is carried out according to the target area. By identifying the initial position of the weld feature point from the weld image, scoring the pollution degree of the weld images to select the image to be tracked, and then locating the target area where the feature point appears on the basis of the initial position, the scheme eliminates the influence of ambient light such as arc light and spatter on feature point extraction, improves the accuracy of that extraction, and achieves accurate seam tracking.
Based on the hardware structure, the embodiment of the welding seam tracking method is provided.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the seam tracking method of the present invention, in which the method includes:
step S10, acquiring a weld image, and determining an initial position corresponding to the weld characteristic point based on the weld image;
Compared with a common light source, laser structured light has the advantages of concentrated energy and low sensitivity to the environment, so it is commonly used to guide seam tracking; and because large laser generators are expensive, small line-structured lasers are currently the usual choice for seam tracking. As shown in fig. 3, which is a schematic view of a weld seam tracking process in an embodiment of the present invention, a sensor is installed in front of the welding gun, with a distance d between the measurement point and the welding point. The sensor continuously detects the position of the weld feature point, that is, the position of the measurement point, during welding, and the welding system adjusts the position of the welding gun according to the three-dimensional coordinates of the feature point, thereby implementing automatic tracking welding.
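Because the sensor measures the seam a distance d ahead of the welding point, the correction derived from a measurement is applied only once the torch reaches that point. A minimal sketch of this lookahead delay, assuming a constant welding speed (the function name and units are illustrative, not from the patent):

```python
def lookahead_delay(d_mm: float, speed_mm_per_s: float) -> float:
    """Time until the torch reaches the point the sensor just measured,
    given a sensor mounted d_mm ahead of the torch and a constant speed."""
    if speed_mm_per_s <= 0:
        raise ValueError("welding speed must be positive")
    return d_mm / speed_mm_per_s
```

For example, with the sensor 20 mm ahead of the torch and a welding speed of 5 mm/s, a feature point measured now is used to steer the torch 4 s later.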
In this embodiment, the weld image acquired in the current welding process, that is, the weld image captured by the sensor, is obtained first; it can be understood that the acquired weld image comprises several frames. As shown in fig. 4, which is a schematic diagram of the first frame of a weld image in an embodiment of the present invention, welding has not yet started when shooting begins, so there is no arc or spatter interference and the first frame is free of noise pollution; the rectangular frame shown in the figure marks the initial position of the weld feature point in the first frame. The region containing the weld feature point is calibrated as A1 = (x1, y1, w, h), where x1 and y1 are the initial position coordinates of the feature point and w and h are the width and height of the rectangular frame. It can be understood that these four parameters may be set to different values in different application scenarios; in this embodiment they are manually set to fixed values.
Step S20, determining an image to be tracked corresponding to the weld joint feature point in the weld joint image based on a preset pollution degree scoring network;
Further, the weld image to be tracked, that is, the image to be tracked, is determined from the obtained frames of weld images through a preset pollution degree scoring network; the target position of the weld feature point is then determined from the image to be tracked. Specifically, as shown in fig. 5, this includes:
step S21, based on a preset pollution degree grading network, carrying out pollution degree grading on the plurality of frames of welding seam images, and determining pollution degree scores corresponding to the plurality of frames of welding seam images respectively;
In this embodiment, the above frames of weld images are scored through a preset pollution degree scoring network to determine whether tracking is required. Specifically, the obtained frames of weld images are input into the preset pollution degree scoring network, which outputs the pollution degree score corresponding to each frame.
It can be understood that the preset pollution degree scoring network in this embodiment is a network stored after offline training. Specifically, a large number of weld image samples with different degrees of noise pollution are collected first, and their pollution degrees may be scored manually, for example on a scale of 0 to 1. The collected samples are then fed into multiple convolutional neural network layers for feature extraction, after which the pollution degree is scored through a fully connected network and softmax; the network finally outputs the pollution degree score corresponding to each weld image sample, completing the offline training of the preset pollution degree scoring network. The trained network is saved so that pollution degree scores of weld images can be identified and output online.
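As a rough illustration of the offline training described above, the sketch below replaces the convolutional feature extractor with a single hand-crafted feature (mean brightness, a crude proxy for arc and spatter glare) and fits a logistic scorer to manually assigned pollution labels in [0, 1]. This is a toy stand-in, not the patent's network; the feature choice and all names are assumptions:

```python
import math

def mean_brightness(image):
    """image: 2-D list of grayscale pixel values in [0, 255]."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels) / 255.0

def train_scorer(samples, labels, lr=0.5, epochs=2000):
    """Offline training: fit score = sigmoid(w * feature + b) to the
    manually labelled pollution scores by stochastic gradient descent."""
    w, b = 0.0, 0.0
    feats = [mean_brightness(img) for img in samples]
    for _ in range(epochs):
        for x, y in zip(feats, labels):
            pred = 1.0 / (1.0 + math.exp(-(w * x + b)))
            grad = pred - y  # gradient of the cross-entropy loss w.r.t. the logit
            w -= lr * grad * x
            b -= lr * grad
    return w, b

def score(image, w, b):
    """Online scoring with the stored (trained) parameters."""
    x = mean_brightness(image)
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

A clean (dark) frame then scores low and a glare-saturated frame scores high, mirroring the role of the stored scoring network.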
Step S22, judging whether the pollution degree score is less than or equal to a preset pollution degree threshold value; if yes, go to step S23; if not, go to step S24;
In this embodiment, the weld image is tracked to determine where a weld feature point may appear during the next welding; therefore, to ensure welding accuracy, a weld image with low noise pollution should be selected for tracking. After the pollution degree score corresponding to a weld image has been determined through the preset pollution degree scoring network, it is first judged whether the score is smaller than or equal to a preset pollution degree threshold, which determines whether that weld image can be used as an image to be tracked.
Step S23, determining the weld image corresponding to the pollution degree score as an image to be tracked;
and step S24, determining the weld image corresponding to the minimum pollution degree score as the image to be tracked.
If the pollution degree score of a weld image is smaller than or equal to the preset pollution degree threshold, that weld image is determined to be an image to be tracked; if the pollution degree scores of all the weld images are larger than the preset pollution degree threshold, the weld image with the minimum pollution degree score is determined to be the image to be tracked.
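The selection rule just described can be sketched as follows (function and variable names are illustrative):

```python
def select_images_to_track(scored_frames, threshold):
    """scored_frames: list of (frame_id, pollution_score) pairs.
    Keep every frame scoring at or below the threshold; if none
    qualifies, fall back to the single least-polluted frame."""
    kept = [f for f, s in scored_frames if s <= threshold]
    if kept:
        return kept
    best_frame, _ = min(scored_frames, key=lambda fs: fs[1])
    return [best_frame]
```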
And step S30, determining a target area corresponding to the weld joint feature point in the image to be tracked based on the initial position so as to carry out the next welding according to the target area.
Further, after the image to be tracked is determined, according to the initial position of the weld characteristic point, the position where the weld characteristic point is likely to appear, namely the area to be tracked, is determined from the image to be tracked, and then the target area where the weld characteristic point is most likely to appear is further determined from the area to be tracked, so that the next welding step can be carried out according to the target area finally.
In this embodiment, a weld image is obtained first and the initial position of the weld feature point is determined from it; an image to be tracked corresponding to the weld feature point is then determined in the weld image based on a preset pollution degree scoring network; finally, a corresponding target area is determined in the image to be tracked based on the initial position of the weld feature point, and the next welding is performed according to the target area. By identifying the initial position of the weld feature point from the weld image, scoring the pollution degree of the weld images to select the image to be tracked, and then locating the target area where the feature point appears on the basis of the initial position, the method eliminates the influence of ambient light such as arc light and spatter on feature point extraction, improves the accuracy of that extraction, and achieves accurate seam tracking.
Further, referring to fig. 6, a second embodiment of the seam tracking method according to the present invention is proposed based on the above embodiment, in this embodiment, the step S30 specifically includes:
step S31, determining a plurality of areas to be tracked in the image to be tracked based on the initial position;
In the first frame of the weld image, which is free of noise pollution, the initial position of the weld feature point is A1 = (x1, y1, w, h). In the first embodiment, a weld image whose pollution degree score is smaller than the preset pollution degree threshold is selected for tracking. Assuming the weld feature point in the image to be tracked is At = (xt, yt, w, h), several regions to be tracked are selected in the next frame of the image to be tracked based on this position; different regions to be tracked can be determined according to the type of welding target. For a straight weld, several positions are generated with a Gaussian random function within the narrow band [yt - Δy, yt + Δy]: the closer a point is to (xt, yt), the greater the probability of generating a sample to be tracked there, but all samples are bounded to [yt - Δy, yt + Δy], where Δy is an empirical statistic. For a weld of approximately arc shape, as shown in fig. 7, a rough mathematical model of the arc-shaped weld, x = fx(t), y = fy(t), is first established; xt and yt are calculated from it, and several regions to be tracked are then generated with a Gaussian random function taking (xt, yt) as the center (mean) and σ1 as the variance. In addition, if no explicit mathematical model can be designed, (xt, yt) can be used directly as the mean and several regions to be tracked generated randomly with a variance σ2 larger than σ1.
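The straight-weld case can be sketched as follows; the standard deviation (half the band width) and the clamping step are assumptions, since the patent only specifies that samples are Gaussian-distributed and bounded to the band:

```python
import random

def straight_weld_candidates(xt, yt, dy, w, h, n, rng=random):
    """Generate n candidate regions (x, y, w, h) for a straight weld:
    y-centres are drawn from a Gaussian around yt, so points nearer
    (xt, yt) are more likely, then clamped to [yt - dy, yt + dy]."""
    regions = []
    for _ in range(n):
        y = rng.gauss(yt, dy / 2.0)        # denser near yt
        y = max(yt - dy, min(yt + dy, y))  # bound to the narrow band
        regions.append((xt, y, w, h))
    return regions
```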
Step S32, scoring the plurality of areas to be tracked based on a preset discrimination network to obtain corresponding tracking scores;
further, after the to-be-tracked region in the to-be-tracked image is determined, the images in all the to-be-tracked regions are subjected to discrimination calculation by using a preset discrimination network, specifically, a tracking score corresponding to each to-be-tracked region is determined, and the higher the score is, the higher the probability that the weld characteristic point appears in the to-be-tracked region is.
Specifically, in this embodiment, the preset discrimination network is a network stored after offline training, and the process of offline training the preset discrimination network is as follows:
A laser structured-light weld tracking system is moved along a weld for welding, and weld image samples are acquired over a preset time period. Assume each frame of the weld image samples is annotated with the position information (xi, yi, w, h) of its weld feature point, where xi and yi are the position coordinates of the feature point and w and h are the width and height of the rectangular frame containing it, i.e. of the target region; a position set of weld feature points is thus obtained. The weld image samples are divided into positive and negative samples according to the feature-point positions: specifically, new frames are randomly generated around the target region corresponding to the weld feature point, samples whose frame overlap rate is greater than a threshold value are taken as positive samples, and samples whose frame overlap rate is less than or equal to the threshold value are taken as negative samples. It can be understood that, in this embodiment, augmentation may further be applied to the positive samples to increase sample diversity and improve the accuracy of the off-line training.
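The positive/negative split described above hinges on a box "overlap rate"; a common choice for that quantity is intersection-over-union (IoU). The sketch below assumes (x, y) is the top-left corner of a w×h box and uses a hypothetical 0.5 threshold, since the patent fixes neither:

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes, (x, y) top-left."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def split_samples(candidate_boxes, target_box, thresh=0.5):
    """Boxes whose overlap with the labelled target region exceeds the
    threshold become positive samples; the rest become negative samples."""
    pos = [b for b in candidate_boxes if iou(b, target_box) > thresh]
    neg = [b for b in candidate_boxes if iou(b, target_box) <= thresh]
    return pos, neg
```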
The positive and negative samples are then input into the preset discrimination network for off-line training, so that the network ultimately outputs tracking scores corresponding to the positive and negative samples; the trained preset discrimination network is stored for determining the target region on line.
Step S33, determining the region to be tracked corresponding to the highest tracking score as the target region corresponding to the weld feature point, and determining the position information corresponding to the target region so as to carry out the next welding.
The tracking scores of the regions to be tracked in the image to be tracked are calculated through the preset discrimination network, one score per region; the region with the highest score is determined as the target region, i.e. the position where the weld feature point should appear during the next welding, and the next welding is performed according to this target region.
Specifically, the image corresponding to the target region is processed with methods such as image processing, threshold segmentation, feature point extraction and laser stripe thinning, so as to determine the position information corresponding to the target region.
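Step S33 reduces to an argmax over discriminator outputs. In this sketch the trained discrimination network is abstracted as a plain `score_fn` callable, which is an assumption for illustration rather than the patent's network interface:

```python
def select_target(regions, score_fn):
    """Score every candidate region with the trained discriminator and
    return the highest-scoring one -- the region where the weld feature
    point most likely appears -- together with its score."""
    if not regions:
        raise ValueError("no candidate regions to score")
    scores = [score_fn(r) for r in regions]
    best = max(range(len(regions)), key=scores.__getitem__)
    return regions[best], scores[best]
```

The returned region would then be handed to the image-processing chain (threshold segmentation, laser stripe thinning, feature point extraction) to recover the exact position information.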
In this embodiment, a plurality of regions to be tracked are determined in the image to be tracked from the determined initial position of the weld feature point, tracking scores for these regions are calculated based on a preset discrimination network, and the highest-scoring region is determined as the target region, so that the next welding is performed according to the position information of the target region; this realizes accurate extraction of the weld feature point and improves the accuracy of weld tracking during welding.
Furthermore, an embodiment of the present invention further provides a computer-readable storage medium, where a seam tracking program is stored on the computer-readable storage medium, and when executed by a processor, the seam tracking program implements the following operations:
acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image;
determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network;
and determining a target area corresponding to the weld characteristic point in the image to be tracked based on the initial position so as to carry out next welding according to the target area.
Further, the weld tracking program when executed by the processor further performs the following operations:
acquiring a welding seam image in a welding process, wherein the welding seam image comprises a plurality of frames of welding seam images;
determining an initial position of a weld feature point based on a first frame of weld image, wherein the first frame of weld image is a weld image without noise pollution.
Further, the weld tracking program when executed by the processor further performs the following operations:
based on a preset pollution degree grading network, grading the pollution degree of the plurality of frames of welding line images, and determining pollution degree scores corresponding to the plurality of frames of welding line images respectively;
judging whether the pollution degree score is smaller than or equal to a preset pollution degree threshold value or not;
and if so, determining the welding seam image corresponding to the pollution degree score as an image to be tracked.
Further, the weld tracking program when executed by the processor further performs the following operations:
acquiring weld image samples with different noise pollution degrees, and performing off-line training on a preset pollution degree scoring network based on the weld image samples so that the preset pollution degree scoring network can output pollution degree scores corresponding to the weld image samples;
and storing the preset pollution degree scoring network after the off-line training.
Further, the weld tracking program when executed by the processor further performs the following operations:
and if the pollution degree score is larger than the preset pollution degree threshold value, determining the welding seam image corresponding to the minimum pollution degree score as the image to be tracked.
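The threshold test and the minimum-score fallback in the operations above can be sketched together; the concrete threshold value is a preset the patent leaves open:

```python
def frames_to_track(frames, pollution_scores, thresh):
    """Keep frames whose pollution degree score is at or below the preset
    threshold; if every frame exceeds it, fall back to the single
    least-polluted frame so tracking can still proceed."""
    kept = [f for f, s in zip(frames, pollution_scores) if s <= thresh]
    if kept:
        return kept
    best = min(range(len(pollution_scores)), key=pollution_scores.__getitem__)
    return [frames[best]]
```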
Further, the weld tracking program when executed by the processor further performs the following operations:
determining a plurality of areas to be tracked in the image to be tracked based on the initial positions;
calculating tracking scores of the plurality of areas to be tracked based on a preset discrimination network to obtain corresponding tracking scores;
and determining the region to be tracked corresponding to the highest tracking score as a target region corresponding to the weld joint feature point, and determining the position information corresponding to the target region so as to carry out the next welding.
Further, the weld tracking program when executed by the processor further performs the following operations:
acquiring a weld image sample in a preset time period, and determining position information of weld characteristic points respectively corresponding to the weld image sample in the preset time period;
dividing the weld image sample in the preset time period into a positive sample and a negative sample based on the position information, and putting the positive sample and the negative sample into a preset discrimination network for off-line training so that the preset discrimination network outputs tracking scores corresponding to the positive sample and the negative sample;
and storing the preset discrimination network after the off-line training.
Further, the weld tracking program when executed by the processor further performs the following operations:
and extracting image information corresponding to the target area, and processing the image information corresponding to the target area to determine position information corresponding to the target area so as to perform the next welding based on the position information.
According to the scheme provided by this embodiment, a weld image is first acquired and the initial position of the weld feature point is determined from it; an image to be tracked corresponding to the weld feature point is then determined among the weld images based on a preset pollution degree scoring network; finally, the corresponding target region is determined in the image to be tracked based on the initial position of the weld feature point, and the next welding step is carried out according to the target region. By identifying the initial position of the weld feature point from the weld image, scoring the pollution degree of the weld images to determine those to be tracked, and then locating the target region where the weld feature point appears based on the initial position, the method eliminates the influence of ambient light such as arc light and spatter on weld feature point extraction, improves the accuracy of feature point extraction, and achieves accurate weld tracking.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A weld tracking method, characterized by comprising the steps of:
acquiring a weld image, and determining an initial position corresponding to a weld characteristic point based on the weld image;
determining an image to be tracked corresponding to the weld characteristic point in the weld image based on a preset pollution degree scoring network;
determining a plurality of areas to be tracked in the image to be tracked based on the initial positions;
calculating tracking scores of the plurality of areas to be tracked based on a preset discrimination network to obtain corresponding tracking scores;
determining a region to be tracked corresponding to the highest tracking score as a target region corresponding to the weld joint feature point, and determining position information corresponding to the target region so as to carry out the next welding;
wherein the step of determining a number of regions to be tracked in the image to be tracked based on the initial position comprises:
if the type of the weld is approximately an arc-shaped weld, determining a target position corresponding to the initial position in the image to be tracked based on an approximate mathematical model of the arc-shaped weld, wherein the image to be tracked is a weld image with a pollution degree score smaller than or equal to a preset pollution degree threshold value;
and generating a plurality of regions to be tracked by taking the target position as a center and utilizing a Gaussian random function.
2. The weld tracking method of claim 1, wherein the step of acquiring the weld image and determining the initial position of the weld feature point corresponding to the weld image comprises:
acquiring a welding seam image in a welding process, wherein the welding seam image comprises a plurality of frames of welding seam images;
determining an initial position of a weld feature point based on a first frame of weld image, wherein the first frame of weld image is a weld image without noise pollution.
3. The weld tracking method according to claim 2, wherein the step of determining the image to be tracked corresponding to the weld feature point in the weld image based on a preset contamination degree scoring network comprises:
based on a preset pollution degree grading network, grading the pollution degree of the plurality of frames of welding line images, and determining pollution degree scores corresponding to the plurality of frames of welding line images respectively;
judging whether the pollution degree score is smaller than or equal to a preset pollution degree threshold value or not;
and if so, determining the welding seam image corresponding to the pollution degree score as an image to be tracked.
4. The weld joint tracking method according to claim 1, wherein before the step of determining the image to be tracked corresponding to the weld joint feature point in the weld joint image based on the preset pollution degree scoring network, the method further comprises:
acquiring weld image samples with different noise pollution degrees, and performing off-line training on a preset pollution degree scoring network based on the weld image samples so that the preset pollution degree scoring network can output pollution degree scores corresponding to the weld image samples;
and storing the preset pollution degree scoring network after the off-line training.
5. The weld tracking method according to claim 3, wherein the step of determining whether the contamination level score is less than or equal to a predetermined contamination level threshold value is followed by:
and if the pollution degree score is larger than the preset pollution degree threshold value, determining the welding seam image corresponding to the minimum pollution degree score as the image to be tracked.
6. The weld joint tracking method according to claim 1, wherein before the step of determining a target region corresponding to the weld joint feature point in the image to be tracked based on the initial position so as to perform the next welding according to the target region, the method further comprises:
acquiring a weld image sample in a preset time period, and determining position information of weld characteristic points respectively corresponding to the weld image sample in the preset time period;
dividing the weld image sample in the preset time period into a positive sample and a negative sample based on the position information, and putting the positive sample and the negative sample into a preset discrimination network for off-line training so that the preset discrimination network outputs tracking scores corresponding to the positive sample and the negative sample;
and storing the preset discrimination network after the off-line training.
7. The weld tracking method according to claim 1, wherein the step of determining the position information corresponding to the target area for the next welding step comprises:
and extracting image information corresponding to the target area, and processing the image information corresponding to the target area to determine position information corresponding to the target area so as to perform the next welding based on the position information.
8. A weld tracking device, comprising: memory, a processor and a weld tracking program stored on the memory and executable on the processor, the weld tracking program when executed by the processor implementing the steps of the weld tracking method according to any one of claims 1 to 7.
9. A computer-readable storage medium, having stored thereon a weld tracking program, which when executed by a processor, performs the steps of the weld tracking method according to any one of claims 1 to 7.
CN201811309646.2A 2018-11-05 2018-11-05 Weld joint tracking method and device and computer readable storage medium Active CN109492688B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811309646.2A CN109492688B (en) 2018-11-05 2018-11-05 Weld joint tracking method and device and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN109492688A CN109492688A (en) 2019-03-19
CN109492688B (en) 2021-07-30

Family

ID=65695169


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110210497B (en) * 2019-05-27 2023-07-21 华南理工大学 Robust real-time weld feature detection method
CN111144483B (en) * 2019-12-26 2023-10-17 歌尔股份有限公司 Image feature point filtering method and terminal
CN111723592B (en) * 2020-06-18 2022-07-19 深圳泰德激光技术股份有限公司 Storage method and system of welding information and computer readable storage medium
CN111702381A (en) * 2020-06-23 2020-09-25 石家庄坚持科技有限公司 Screen slice welding control method and system and terminal equipment
CN112589232B (en) * 2020-12-15 2022-05-20 广东工业大学 Weld joint tracking method and device based on independent deviation correction type deep learning
CN113319411A (en) * 2021-03-04 2021-08-31 湖南大学 Visual positioning method and system and computing equipment

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101159017A (en) * 2007-11-27 2008-04-09 清华大学 Welding line automatic recognition visible sensation method based on partial image texture characteristic matched
CN107424176A (en) * 2017-07-24 2017-12-01 福州智联敏睿科技有限公司 A kind of real-time tracking extracting method of weld bead feature points
CN107798330A (en) * 2017-11-10 2018-03-13 上海电力学院 A kind of weld image characteristics information extraction method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US9818039B2 (en) * 2013-12-17 2017-11-14 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object


Non-Patent Citations (1)

Title
Research on an Automatic Weld Seam Tracking System Based on Active Optical Vision Sensing; Gong Guoji; China Masters' Theses Full-text Database, Information Science and Technology; 2017-02-15; Chapter 4, pp. 37-65; Section 5.1, pp. 66-67 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant