CN116030098B - Weld joint target tracking method and system based on directional characteristic driving - Google Patents
Weld joint target tracking method and system based on directional characteristic driving
- Publication number
- CN116030098B CN116030098B CN202310300244.0A CN202310300244A CN116030098B CN 116030098 B CN116030098 B CN 116030098B CN 202310300244 A CN202310300244 A CN 202310300244A CN 116030098 B CN116030098 B CN 116030098B
- Authority
- CN
- China
- Prior art keywords
- weld
- image
- hog
- target
- current frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 48
- 230000004927 fusion Effects 0.000 claims abstract description 30
- 230000004044 response Effects 0.000 claims abstract description 28
- 238000010586 diagram Methods 0.000 claims abstract description 26
- 238000012549 training Methods 0.000 claims abstract description 14
- 238000003466 welding Methods 0.000 claims description 25
- 238000001914 filtration Methods 0.000 claims description 19
- 238000007781 pre-processing Methods 0.000 claims description 14
- 230000006870 function Effects 0.000 claims description 13
- 238000009432 framing Methods 0.000 claims description 7
- 238000012545 processing Methods 0.000 claims description 7
- 238000004364 calculation method Methods 0.000 claims description 4
- 238000009499 grossing Methods 0.000 claims description 3
- 230000000877 morphologic effect Effects 0.000 claims description 3
- 238000004422 calculation algorithm Methods 0.000 abstract description 16
- 238000000605 extraction Methods 0.000 abstract description 8
- 230000000007 visual effect Effects 0.000 abstract description 5
- 239000011159 matrix material Substances 0.000 description 19
- 125000004122 cyclic group Chemical group 0.000 description 10
- 230000008569 process Effects 0.000 description 9
- 230000000694 effects Effects 0.000 description 3
- 238000005286 illumination Methods 0.000 description 3
- arc light Substances 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 230000009977 dual effect Effects 0.000 description 2
- 239000000428 dust Substances 0.000 description 2
- 239000002184 metal Substances 0.000 description 2
- 239000000203 mixture Substances 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000004064 recycling Methods 0.000 description 2
- 238000005070 sampling Methods 0.000 description 2
- 239000000779 smoke Substances 0.000 description 2
- 230000009466 transformation Effects 0.000 description 2
- 238000004891 communication Methods 0.000 description 1
- 238000004590 computer program Methods 0.000 description 1
- 230000021615 conjugation Effects 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 238000007499 fusion processing Methods 0.000 description 1
- 238000007689 inspection Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 238000011160 research Methods 0.000 description 1
- 230000001131 transforming effect Effects 0.000 description 1
- 230000017105 transposition Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/80—Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention provides a weld target tracking method and system based on directional characteristic driving, belonging to the field of visual target tracking. A plurality of single-frame weld images are acquired, and the target to be tracked is box-selected in the first single-frame weld image; Gabor features in multiple orientations are extracted from the preprocessed weld image of the current frame; HOG features are extracted from the Gabor feature maps of the multiple orientations, and the multi-orientation HOG features are fused with weights; a correlation filter is trained with the weighted-fused HOG features; the weighted-fused HOG features are convolved with the trained correlation filter to obtain the response map of the correlation filter, and the coordinate position with the maximum response value in the response map is the tracked target position in the current-frame weld image. By applying the weighted-fusion feature extraction algorithm that combines Gabor features and HOG features, the method and system guarantee the accuracy and robustness of the target tracking algorithm.
Description
Technical Field
The invention belongs to the technical field of visual target tracking, and particularly relates to a weld target tracking method and system based on directional characteristic driving.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Target tracking is an important part of computer vision and is widely applied in security, surveillance, inspection, smart living, intelligent industry and other fields. Since the concept of target tracking was proposed, the field has made remarkable progress through the joint research efforts of scholars worldwide.
When two plates of a certain thickness are welded, a groove is usually cut at the joint, and welding along the groove joins the plates more firmly. During welding, under the oblique illumination of an auxiliary infrared laser source, a laser stripe at a characteristic angle appears at the weld groove; an industrial camera captures the weld image containing the laser stripe in real time by optical triangulation, and the target is tracked in that image. However, welding is accompanied by interference such as smoke, arc light and metal spatter. Owing to the characteristics of the captured weld images and these disturbances, complex noise such as illumination changes is present, and the target region is not clearly distinguishable from the background. A plain KCF target tracking algorithm can hardly guarantee accuracy and robustness under such conditions, which remains a difficulty in the field of weld target tracking.
Disclosure of Invention
To overcome the above deficiencies of the prior art, the invention provides a weld target tracking method and system based on directional characteristic driving. By applying a weighted-fusion feature extraction algorithm that combines Gabor features and HOG features, the accuracy and robustness of the target tracking algorithm can be guaranteed even under complex noise interference such as illumination changes and when the target region to be tracked is hard to distinguish from the background.
To achieve the above object, one or more embodiments of the present invention provide the following technical solutions:
A first aspect of the invention provides a weld target tracking method based on directional characteristic driving.
A weld target tracking method based on directional characteristic driving comprises the following steps:
step one: acquiring a plurality of consecutive single-frame weld images, and box-selecting the target to be tracked in the first acquired single-frame weld image;
step two: preprocessing the current-frame weld image to obtain a preprocessed current-frame weld image;
step three: extracting Gabor features in multiple orientations from the preprocessed current-frame weld image to obtain Gabor feature maps in multiple orientations;
step four: extracting HOG features from each of the multi-orientation Gabor feature maps to obtain HOG feature maps in multiple orientations, and weight-fusing the multi-orientation HOG features to obtain the weighted-fused HOG features;
step five: training a correlation filter with the weighted-fused HOG features to obtain a trained correlation filter;
step six: convolving the weighted-fused HOG features with the trained correlation filter to obtain the response map of the correlation filter, the coordinate position with the maximum response value in the response map being the tracked target position of the current-frame weld image;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking; if so, taking the next single-frame weld image as the current-frame weld image and cyclically executing steps two to eight, otherwise stopping tracking.
A second aspect of the invention provides a weld target tracking system based on directional characteristic driving.
A weld target tracking system based on directional characteristic driving comprises:
a single-frame weld image acquisition module configured to: acquire a plurality of consecutive single-frame weld images, and box-select the target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocess the current-frame weld image to obtain a preprocessed current-frame weld image;
a Gabor feature map acquisition module configured to: extract Gabor features in multiple orientations from the preprocessed current-frame weld image to obtain Gabor feature maps in multiple orientations;
an HOG feature acquisition module configured to: extract HOG features from each of the multi-orientation Gabor feature maps to obtain HOG feature maps in multiple orientations, and weight-fuse the multi-orientation HOG features to obtain the weighted-fused HOG features;
a training module configured to: train a correlation filter with the weighted-fused HOG features to obtain a trained correlation filter;
a target position tracking module configured to: convolve the weighted-fused HOG features with the trained correlation filter to obtain the response map of the correlation filter, the coordinate position with the maximum response value in the response map being the tracked target position of the current-frame weld image;
an update module configured to: update the target appearance template and the filter model;
a judgment module configured to: judge whether to continue tracking; if so, take the next single-frame weld image as the current-frame weld image and cyclically execute the preprocessing module through the judgment module, otherwise stop tracking.
A third aspect of the invention provides a computer-readable storage medium having a program stored thereon which, when executed by a processor, performs the steps in the weld target tracking method based on directional characteristic driving according to the first aspect of the invention.
A fourth aspect of the invention provides an electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, the processor implementing the steps in the weld target tracking method based on directional characteristic driving according to the first aspect of the invention when the program is executed.
The one or more of the above technical solutions have the following beneficial effects:
the algorithm provided by the invention is more in line with the image characteristics of the groove in the welding process, the Gabor characteristic extraction algorithm is used for extracting the characteristics in different directions before the HOG characteristics are extracted, then the angle which is beneficial to tracking the weld joint is selected according to the angle characteristics of the weld joint groove, and a larger weight is given to the angle in the weighted fusion process, so that the required characteristics can be highlighted, meanwhile, the unnecessary characteristics are desalted as much as possible, the characteristics required in the welding work are extracted more accurately, meanwhile, the algorithm is more stable, the error fluctuation range is smaller, and the accuracy and the robustness are improved.
Additional aspects of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention.
Fig. 1 is a flow chart of a method of a first embodiment.
Fig. 2 is a preprocessing flow chart.
FIG. 3 is a diagram showing the experimental comparison of the filtering method of the present invention with other filtering methods.
Fig. 4 is a visualization of the Gabor feature extraction.
Fig. 5 is a system configuration diagram of the second embodiment.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present invention.
Embodiments of the invention and features of the embodiments may be combined with each other without conflict.
The general idea of the invention is as follows:
The invention effectively addresses the poor target-tracking performance caused by interference such as smoke, arc light and metal spatter and by the weak distinction between the target region and the background. Because the angle of the laser stripe at the weld groove differs from that of the other planar regions, the Gabor feature extraction algorithm is adopted to extract detailed texture features of the image in different orientations, so that the angular characteristics of the laser stripe at the weld groove can be exploited. After HOG feature extraction, when the feature maps of the individual orientations are weighted and fused, the weight coefficient at the groove is increased to highlight the groove region while the remaining background regions are faded out, which guarantees accuracy and robustness of the weld during tracking.
Example 1
The embodiment discloses a weld joint target tracking method based on direction characteristic driving.
As shown in fig. 1, a method for tracking a weld target based on directional characteristic driving includes the following steps:
step one: acquiring a plurality of continuous Shan Zhen weld images, and framing a target to be tracked in the acquired first Shan Zhen weld image;
step two: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
step three: extracting Gabor characteristics of multiple directions from the preprocessed weld joint image of the current frame to obtain Gabor characteristic diagrams of the multiple directions;
step four: extracting HOG features from the Gabor feature graphs in multiple directions respectively to obtain HOG feature graphs in multiple directions, and carrying out weighted fusion on the HOG features in multiple directions to obtain HOG features after weighted fusion;
step five: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
step six: carrying out convolution operation on the HOG characteristics after weighted fusion and the trained relevant filter to obtain a response diagram of the relevant filter, wherein the coordinate position with the maximum response value in the response diagram is the target position tracked by the weld joint image of the current frame;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking, if so, taking the next Shan Zhen weld image as the current frame weld image, and circularly executing the second step to the eighth step, otherwise, stopping tracking.
Specifically:
The plurality of single-frame weld images in step one are the individual frames of the pictures or video captured in real time by the industrial camera. The first weld image frame of the captured video is acquired, and the target to be tracked is box-selected in it.
In step two, the preprocessing flow for the laser stripe at the weld is shown in fig. 2, and the steps are as follows:
(1) First, graying is applied to the weld image to obtain a gray-scale image. For graying, the invention selects the weighted-average method.
(2) After the gray-scale image is obtained, the image is smoothed by filtering to remove image noise. As the smoothing filter, the invention selects Gaussian filtering.
(3) After the smoothing of step (2) has removed the image noise, the image is binarized so that it changes from a 0-255 gray-scale state to a black-or-white binary state. As the binarization method, the invention selects a global binarization method.
(4) After the binarized image is obtained, morphological filtering by opening and closing operations is applied to the image. The invention selects the closing operation for the morphological filtering.
The effect of these four preprocessing steps is shown in fig. 3; the result is a weld image from which noise such as arc light, spatter and illumination has been filtered out as far as possible.
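For reference, a minimal sketch of the four preprocessing steps using OpenCV follows; the Gaussian kernel size, the use of Otsu's method as the global threshold and the structuring-element size are illustrative assumptions, not values fixed by the invention.

```python
import cv2
import numpy as np

def preprocess_weld_image(bgr_image: np.ndarray) -> np.ndarray:
    # (1) Weighted-average graying (cv2 applies the standard BGR weights).
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # (2) Gaussian smoothing to suppress sensor and arc-light noise.
    smoothed = cv2.GaussianBlur(gray, (5, 5), sigmaX=1.0)
    # (3) Global binarization: every pixel becomes either 0 (black) or 255 (white).
    _, binary = cv2.threshold(smoothed, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # (4) Morphological closing to bridge small gaps in the laser stripe.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    return cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
```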
In step three, Gabor features in multiple orientations are extracted from the weld image preprocessed in step two, giving the multi-orientation Gabor feature maps shown in fig. 4; fig. 4(a) visualizes Gabor filters with kernel sizes 5, 7, 9, 11 and 13 and orientations 0°, 30°, 45°, 60°, 90° and 135°, and fig. 4(b) shows the Gabor feature extraction result.
Gabor features describe image texture information, and the frequency and orientation selectivity of Gabor filters makes them particularly suitable for texture representation and discrimination.
The method comprises the following specific steps:
The mathematical expression of the Gabor function is:

$$g(x,y;\lambda,\theta,\psi,\sigma,\gamma)=\exp\!\left(-\frac{x'^{2}+\gamma^{2}y'^{2}}{2\sigma^{2}}\right)\cos\!\left(2\pi\frac{x'}{\lambda}+\psi\right)$$
$$x'=x\cos\theta+y\sin\theta,\qquad y'=-x\sin\theta+y\cos\theta$$

where $\lambda$ represents the wavelength of the filter, $\theta$ represents the tilt angle, $\psi$ represents the phase offset ranging from -180° to 180°, $\sigma$ represents the standard deviation of the Gaussian function, $\gamma$ represents the aspect ratio, $(x,y)$ is the coordinate position of the pixel before rotation, and $(x',y')$ is the coordinate position of the pixel after rotation. The orientation $\theta$ can be set to several angles; in this embodiment $\theta$ is set to 0°, 30°, 45°, 60°, 90° and 135°, the kernel sizes of the filter bank are 5, 7, 9, 11 and 13, and the Gabor feature extraction visualizations are shown in fig. 4. Experimental comparison shows that the orientations required for the images of this example are 45°, 90° and 135°, with a kernel size of 5. The Gabor features in the 45° and 135° orientations are combined into one feature map $G_{1}$, while the Gabor feature $G_{2}$ of the 90° orientation is kept unchanged for the time being. The feature-level model of the obtained Gabor features is denoted $G^{t}=\{G_{1}^{t},G_{2}^{t}\}$, where $t$ denotes the $t$-th frame and $G^{t}$ the feature model.
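The sketch below illustrates this multi-orientation Gabor filtering with OpenCV. The kernel size 5 and the orientations 45°, 90° and 135° follow this embodiment; the wavelength, standard deviation, aspect ratio and phase offset, as well as the pixel-wise maximum used to merge the 45° and 135° responses, are assumptions made for illustration.

```python
import cv2
import numpy as np

def gabor_features(gray: np.ndarray, ksize: int = 5, lam: float = 8.0,
                   sigma: float = 4.0, gamma: float = 0.5, psi: float = 0.0):
    responses = {}
    for angle in (45, 90, 135):
        theta = np.deg2rad(angle)
        kern = cv2.getGaborKernel((ksize, ksize), sigma, theta, lam, gamma, psi,
                                  ktype=cv2.CV_32F)
        responses[angle] = cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kern)
    # Merge the 45° and 135° responses into one oblique-stripe map G1; keep 90° as G2.
    g1 = np.maximum(responses[45], responses[135])
    g2 = responses[90]
    return g1, g2
```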
In step four, HOG features are extracted from each of the multi-orientation Gabor feature maps obtained in step three, giving HOG features in multiple orientations. The Histogram of Oriented Gradients (HOG) feature is a feature descriptor used for object detection in computer vision and image processing; it is built by computing and accumulating histograms of gradient orientations over local regions of the image.
The method comprises the following steps:
After the Gabor feature maps are obtained, HOG features are extracted from them. Let the $x$ direction be horizontal and the $y$ direction vertical, and let $I(x,y)$ be the pixel value at $(x,y)$. The gradient values in the $x$ and $y$ directions are computed as:

$$G_{x}(x,y)=I(x+1,y)-I(x-1,y)$$
$$G_{y}(x,y)=I(x,y+1)-I(x,y-1)$$

where $G_{x}(x,y)$ is the horizontal gradient and $G_{y}(x,y)$ is the vertical gradient of pixel $(x,y)$ in the input image. The gradient magnitude $G(x,y)$ and orientation $\alpha(x,y)$ at pixel $(x,y)$ are expressed as:

$$G(x,y)=\sqrt{G_{x}(x,y)^{2}+G_{y}(x,y)^{2}},\qquad \alpha(x,y)=\arctan\frac{G_{y}(x,y)}{G_{x}(x,y)}$$

After the gradient magnitudes and orientations are accumulated into a histogram of gradient orientations, the magnitudes are normalized to obtain the HOG features $H_{1}$ and $H_{2}$ of the two Gabor feature maps. The feature-level model of the obtained HOG features is denoted $H^{t}=\{H_{1}^{t},H_{2}^{t}\}$, where $t$ denotes the $t$-th frame.
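A sketch of the gradient computation above, with scikit-image's HOG descriptor standing in for the histogram and normalization stage; the cell and block sizes are assumptions, since the invention does not fix them.

```python
import numpy as np
from skimage.feature import hog

def gradients(I: np.ndarray):
    I = I.astype(np.float32)
    Gx = np.zeros_like(I)
    Gy = np.zeros_like(I)
    # Central differences: Gx(x,y) = I(x+1,y) - I(x-1,y), Gy(x,y) = I(x,y+1) - I(x,y-1)
    Gx[:, 1:-1] = I[:, 2:] - I[:, :-2]
    Gy[1:-1, :] = I[2:, :] - I[:-2, :]
    magnitude = np.sqrt(Gx ** 2 + Gy ** 2)
    orientation = np.arctan2(Gy, Gx)
    return magnitude, orientation

def hog_feature_map(gabor_map: np.ndarray) -> np.ndarray:
    # Per-block HOG descriptor of one Gabor feature map (illustrative cell/block sizes).
    return hog(gabor_map, orientations=9, pixels_per_cell=(4, 4),
               cells_per_block=(2, 2), feature_vector=False)
```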
Step five: and carrying out weighted fusion on the HOG characteristics in the multiple directions. Because the angle of the laser stripe at the weld groove is not the same as the angles of other areas of the image, the areas needed by us can be highlighted by giving different weights to the different angles. Therefore, in HOG characteristics in multiple directions, a first weight coefficient is given to the HOG characteristics capable of highlighting the weld groove, and the value range of the first weight coefficient is 0.7-1; the HOG characteristics which are not needed are endowed with a second weight coefficient, and the value range of the second weight coefficient is 0-0.3; and the sum of the first weight coefficient and the second weight coefficient is 1, and finally the weighted and fused HOG characteristic is obtained. The process can fade the unnecessary characteristics as much as possible, and also can filter out partial interference in the welding process, so that the weighted and fused image is more convenient to track.
The weighted fusion calculation formula is:
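A minimal sketch of formula (6), assuming the two HOG feature maps share the same shape; w1 = 0.8 and w2 = 0.2 are example values taken from inside the ranges stated above.

```python
import numpy as np

def fuse_hog(hog_groove: np.ndarray, hog_other: np.ndarray,
             w1: float = 0.8, w2: float = 0.2) -> np.ndarray:
    # w1 weights the groove-highlighting HOG feature, w2 the remaining one; w1 + w2 = 1.
    assert abs((w1 + w2) - 1.0) < 1e-9
    return w1 * hog_groove + w2 * hog_other
```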
Step six: the correlation filter is trained.
Tracking algorithms based on kernel correlation filtering, such as MOSSE, CSK, KCF, BACF and SAMF, introduce correlation filtering from the communications field (a measure of the similarity of two signals) into target tracking; because these trackers build on the basic idea of correlation-filter tracking, they are called correlation filters, a common term in the art. Correlation-filter tracking started from the CSK method proposed by P. Martins et al. in 2012, in which the authors proposed a kernel tracking method based on the circulant matrix, solved the dense-sampling problem elegantly in mathematical terms, and used the Fourier transform to make the detection step fast. Compared with traditional algorithms such as optical flow, Kalman filtering and MeanShift, correlation-filter algorithms track faster and with higher precision. The basic idea of correlation-filter tracking is to design a filter template and correlate it with the target candidate region; the position of the maximum output response is the target position in the current frame. This embodiment selects the KCF target tracking algorithm.
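To illustrate the basic correlation-filtering idea in isolation: correlating a template with a search region through the FFT yields a response whose peak marks the best-matching position. The synthetic data below is purely illustrative and is not part of the invention.

```python
import numpy as np

rng = np.random.default_rng(0)
search = rng.random((64, 64)).astype(np.float32)
template = search[20:36, 30:46].copy()      # plant the "target" at row 20, column 30
padded = np.zeros_like(search)
padded[:16, :16] = template

# Circular cross-correlation via the FFT; the peak should land at the planted offset.
response = np.real(np.fft.ifft2(np.fft.fft2(search) * np.conj(np.fft.fft2(padded))))
print(np.unravel_index(np.argmax(response), response.shape))   # expected: (20, 30)
```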
The training process for the correlation filter is as follows:
6.1. The KCF target tracking algorithm trains the correlation filter by ridge regression. The goal of the training is to find a function $f(z)=w^{T}z$, where $w$ is the filter parameter vector and $T$ denotes matrix transposition, that minimizes the squared error between the samples $x_{i}$ and their regression targets $y_{i}$; the loss function is:

$$\min_{w}\ \sum_{i}\left(f(x_{i})-y_{i}\right)^{2}+\lambda\left\|w\right\|^{2} \qquad (7)$$

where $\lambda$ is the regularization parameter that controls over-fitting. In matrix form this can be written as:

$$\min_{w}\ \left\|Xw-y\right\|^{2}+\lambda\left\|w\right\|^{2} \qquad (8)$$

where $X$ is the circulant matrix composed of the samples $x$, and $y$ is the corresponding label vector composed of the regression targets $y_{i}$. Setting the derivative of the loss function to zero gives:

$$w=\left(X^{H}X+\lambda I\right)^{-1}X^{H}y \qquad (9)$$

where $X^{H}$ is the conjugate transpose of $X$.
6.2. According to the property that a circulant matrix can be diagonalized by the discrete Fourier matrix, inverting the matrix is turned into inverting its eigenvalues; the solution of $w$ can therefore be converted into the frequency domain, the discrete Fourier transform is applied to increase the computation speed, and the result is inversely transformed back to the time domain to obtain the solution with the maximum response.
The circulant matrix $X$ above has the initial sample $x$ as its first row, and every following row is a cyclic shift of the initial sample $x$. $X$ is diagonalized by the discrete Fourier transform matrix $F$ as:

$$X=F\,\mathrm{diag}(\hat{x})\,F^{H}$$

where $\hat{x}$ is the Fourier transform of $x$, $F$ is the matrix form of the discrete Fourier transform, and $F^{H}$ is the conjugate transpose of $F$. Substituting into formula (9) gives:

$$w=F\,\mathrm{diag}\!\left(\frac{\hat{x}^{*}}{\hat{x}^{*}\odot\hat{x}+\lambda}\right)F^{H}y$$

where $\hat{x}^{*}$ denotes the complex conjugate of $\hat{x}$, $\odot$ denotes element-wise multiplication of corresponding entries, and $I$ in formula (9) is the identity matrix. Using the convolution property of circulant matrices again, the frequency-domain solution is obtained:

$$\hat{w}=\frac{\hat{x}^{*}\odot\hat{y}}{\hat{x}^{*}\odot\hat{x}+\lambda}$$
6.3. To address the high computational complexity in a high-dimensional space, a kernel function is introduced, which can be expressed as:

$$\kappa(x,x')=\left\langle\varphi(x),\varphi(x')\right\rangle$$

where $x$ and $x'$ belong to the circulant matrix $X$ in the low-dimensional space, $\varphi(\cdot)$ is the non-linear mapping function, $\langle\cdot,\cdot\rangle$ denotes the inner product, and $\kappa(\cdot,\cdot)$ is the kernel function belonging to the high-dimensional space. The filter parameter $w$ is expressed as a non-linear combination of the mapped samples:

$$w=\sum_{i}\alpha_{i}\,\varphi(x_{i})$$

6.4. Bringing the above into equation (8), the problem of solving for $w$ is converted into the problem of solving for $\alpha$, which gives:

$$\alpha=\left(K+\lambda I\right)^{-1}y$$

where $\alpha$ is a column vector of the same length as $y$ and is the dual-space coefficient vector corresponding to the filter parameter $w$; $K$, with entries $K_{ij}=\kappa(x_{i},x_{j})$, is the kernel matrix of the kernel space and is itself a circulant matrix. Using the diagonalization property of the kernel matrix in the frequency domain, the final frequency-domain solution of the ridge regression problem is obtained:

$$\hat{\alpha}=\frac{\hat{y}}{\hat{k}^{xx}+\lambda}$$

where $\hat{k}^{xx}$ is the Fourier transform of the first row $k^{xx}$ of the kernel matrix $K$, i.e. the kernel auto-correlation of the sample $x$ with itself.
6.5. Let the result of the feature weighted fusion in formula (6) above be the input $z$; the filter then becomes:

$$f(z)=\sum_{i}\alpha_{i}\,\kappa(z,x_{i})$$

Let $K^{z}$ denote the kernel matrix between the training sample and the input $z$; it is a kernel matrix of the kernel space and is a circulant matrix. Applying the Fourier transform to the above with the properties of the circulant matrix gives:

$$\hat{f}(z)=\hat{k}^{xz}\odot\hat{\alpha}$$

where $\hat{k}^{xz}$ is the Fourier transform of $k^{xz}$, the first-row element vector of the kernel matrix $K^{z}$, and $\hat{f}(z)$ is the frequency-domain response containing the outputs of all cyclic shifts of $z$.
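A sketch of the training steps 6.1-6.5 in the frequency domain, using the Gaussian kernel correlation of the standard KCF formulation; it is an illustrative single-channel re-implementation under assumed hyper-parameters (sigma, lambda), not the literal code of the invention.

```python
import numpy as np

def gaussian_correlation(xf: np.ndarray, yf: np.ndarray, sigma: float = 0.5) -> np.ndarray:
    # xf, yf: 2-D FFTs of two feature maps of identical shape.
    n = xf.size
    xx = np.real(np.vdot(xf, xf)) / n                    # ||x||^2 via Parseval's theorem
    yy = np.real(np.vdot(yf, yf)) / n
    xy = np.real(np.fft.ifft2(xf * np.conj(yf)))         # circular cross-correlation
    d = np.maximum(xx + yy - 2.0 * xy, 0) / n
    return np.fft.fft2(np.exp(-d / (sigma ** 2)))        # frequency-domain kernel correlation

def train_filter(x: np.ndarray, y: np.ndarray, lam: float = 1e-4) -> np.ndarray:
    # x: weighted-fused feature map; y: Gaussian-shaped regression target of the same shape.
    xf, yf = np.fft.fft2(x), np.fft.fft2(y)
    kxx_f = gaussian_correlation(xf, xf)
    return yf / (kxx_f + lam)                            # dual coefficients alpha in the frequency domain
```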
Step seven: after the kernel-based correlation filter has been trained, the features of a newly input sample are mapped into the kernel space, and all positions in the sample are evaluated through the kernel function to obtain the corresponding responses. The obtained result is the response map, and the strongest response in the response map is the target position in this frame.
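Continuing the previous sketch (and reusing its gaussian_correlation helper), the detection of step seven recovers the response map with an inverse FFT and takes its peak as the target position; mapping the peak back to image coordinates is omitted here.

```python
import numpy as np

def detect(alpha_f: np.ndarray, x_template: np.ndarray, z: np.ndarray):
    # alpha_f: trained dual coefficients; x_template: target appearance template;
    # z: weighted-fused features of the current search window (all of the same shape).
    xf, zf = np.fft.fft2(x_template), np.fft.fft2(z)
    kxz_f = gaussian_correlation(zf, xf)                 # helper from the training sketch
    response = np.real(np.fft.ifft2(kxz_f * alpha_f))
    peak = np.unravel_index(np.argmax(response), response.shape)
    return peak, response
```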
Step eight: the invention also extends the updating strategy of the KCF target tracking algorithm, and the updating comprises updating the target appearance template and the filter model. The target appearance template part is used for carrying out weighted updating on the HOG characteristics extracted from the current frame image after weighted fusion and the HOG characteristics extracted from the previous frame image after weighted fusion to obtain a target appearance template of the current frame; the updating part of the correlation filter model is to train the HOG characteristic after weighting and fusing the target image in the current frame to obtain the correlation filter, and then to update the correlation filter with the correlation filter obtained by updating the previous frame in a weighting way, so as to obtain the correlation filter of the current frame.
Step nine: judge whether to continue tracking; if so, return to step two, otherwise stop tracking.
Example two
The embodiment discloses a weld joint target tracking system based on direction characteristic driving.
As shown in fig. 5, a weld target tracking system based on directional characteristic driving comprises:
a single-frame weld image acquisition module configured to: acquire a plurality of consecutive single-frame weld images, and box-select the target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocess the current-frame weld image to obtain a preprocessed current-frame weld image;
a Gabor feature map acquisition module configured to: extract Gabor features in multiple orientations from the preprocessed current-frame weld image to obtain Gabor feature maps in multiple orientations;
an HOG feature acquisition module configured to: extract HOG features from each of the multi-orientation Gabor feature maps to obtain HOG feature maps in multiple orientations, and weight-fuse the multi-orientation HOG features to obtain the weighted-fused HOG features;
a training module configured to: train a correlation filter with the weighted-fused HOG features to obtain a trained correlation filter;
a target position tracking module configured to: convolve the weighted-fused HOG features with the trained correlation filter to obtain the response map of the correlation filter, the coordinate position with the maximum response value in the response map being the tracked target position of the current-frame weld image;
an update module configured to: update the target appearance template and the filter model;
a judgment module configured to: judge whether to continue tracking; if so, take the next single-frame weld image as the current-frame weld image and cyclically execute the preprocessing module through the judgment module, otherwise stop tracking.
Example III
An object of the present embodiment is to provide a computer-readable storage medium.
A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps in the weld target tracking method based on directional characteristic driving described in embodiment 1 of the present disclosure.
Example IV
An object of the present embodiment is to provide an electronic apparatus.
An electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, the processor implementing the steps in the directional feature driven weld target tracking method according to embodiment 1 of the present disclosure when the program is executed.
The steps involved in the devices of the second, third and fourth embodiments correspond to those of the first embodiment of the method, and the detailed description of the embodiments can be found in the related description section of the first embodiment. The term "computer-readable storage medium" should be taken to include a single medium or multiple media including one or more sets of instructions; it should also be understood to include any medium capable of storing, encoding or carrying a set of instructions for execution by a processor and that cause the processor to perform any one of the methods of the present invention.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by general-purpose computer means, alternatively they may be implemented by program code executable by computing means, whereby they may be stored in storage means for execution by computing means, or they may be made into individual integrated circuit modules separately, or a plurality of modules or steps in them may be made into a single integrated circuit module. The present invention is not limited to any specific combination of hardware and software.
While the foregoing description of the embodiments of the present invention has been presented in conjunction with the drawings, it should be understood that it is not intended to limit the scope of the invention, but rather, it is intended to cover all modifications or variations within the scope of the invention as defined by the claims of the present invention.
Claims (9)
1. A weld target tracking method based on directional characteristic driving, characterized by comprising the following steps:
step one: acquiring a plurality of consecutive single-frame weld images, and box-selecting the target to be tracked in the first acquired single-frame weld image;
step two: preprocessing the current-frame weld image to obtain a preprocessed current-frame weld image;
step three: extracting Gabor features in multiple orientations from the preprocessed current-frame weld image to obtain Gabor feature maps in multiple orientations;
step four: extracting HOG features from each of the multi-orientation Gabor feature maps to obtain HOG feature maps in multiple orientations, and weight-fusing the multi-orientation HOG features to obtain the weighted-fused HOG features;
wherein, among the multi-orientation HOG features, the HOG feature capable of highlighting the weld groove is given a first weight coefficient whose value range is 0.7-1, the HOG features that are not needed are given a second weight coefficient whose value range is 0-0.3, and the sum of the first weight coefficient and the second weight coefficient is 1, thereby obtaining the weighted-fused HOG features;
step five: training a correlation filter with the weighted-fused HOG features to obtain a trained correlation filter;
step six: convolving the weighted-fused HOG features with the trained correlation filter to obtain the response map of the correlation filter, the coordinate position with the maximum response value in the response map being the tracked target position of the current-frame weld image;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking; if so, taking the next single-frame weld image as the current-frame weld image and cyclically executing steps two to eight, otherwise stopping tracking.
2. The weld target tracking method based on directional characteristic driving according to claim 1, characterized in that the plurality of single-frame weld images are the individual frames of the pictures or of the video captured in real time by an industrial camera.
3. The method for tracking a weld target based on directional characteristic driving according to claim 1, wherein the current frame of the weld image is sequentially subjected to gray-scale processing, smoothing filtering processing, binarization processing, and morphological filtering processing of opening and closing operations to obtain a preprocessed image.
4. The weld target tracking method based on directional characteristic driving according to claim 1, characterized in that the mathematical expression of the Gabor function is:

$$g(x,y;\lambda,\theta,\psi,\sigma,\gamma)=\exp\!\left(-\frac{x'^{2}+\gamma^{2}y'^{2}}{2\sigma^{2}}\right)\cos\!\left(2\pi\frac{x'}{\lambda}+\psi\right)$$
$$x'=x\cos\theta+y\sin\theta,\qquad y'=-x\sin\theta+y\cos\theta$$

wherein $\lambda$ represents the wavelength of the filter, $\theta$ represents the tilt angle, $\psi$ represents the phase offset, $\sigma$ represents the standard deviation of the Gaussian function, $\gamma$ represents the aspect ratio, $(x,y)$ represents the coordinate position of the pixel, and $(x',y')$ represents the coordinate position of the rotated pixel.
5. The weld target tracking method based on directional characteristic driving according to claim 1, characterized in that the pixel value $I(x,y)$ at pixel $(x,y)$ of the Gabor feature map is obtained; the gradient values $G_{x}(x,y)$ and $G_{y}(x,y)$ in the $x$ direction and the $y$ direction are calculated from the pixel values; the gradient magnitude $G(x,y)$ and the gradient direction $\alpha(x,y)$ at pixel $(x,y)$ are calculated from $G_{x}(x,y)$ and $G_{y}(x,y)$; a histogram of gradient directions is calculated from the gradients and the gradient directions, and the amplitudes of the histogram of gradient directions are normalized to obtain the HOG features.
6. The weld target tracking method based on directional characteristic driving according to claim 1, characterized in that updating the target appearance template and the filter model comprises: the features extracted from the weld image of the current frame are weight-averaged with the features extracted from the weld image of the previous frame to update the target appearance template of the current frame; and the correlation filter obtained from the weld image of the current frame is weight-averaged with the correlation filter obtained from the weld image of the previous frame to complete the update of the filter model.
7. A weld target tracking system based on directional characteristic driving, characterized by comprising:
a single-frame weld image acquisition module configured to: acquire a plurality of consecutive single-frame weld images, and box-select the target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocess the current-frame weld image to obtain a preprocessed current-frame weld image;
a Gabor feature map acquisition module configured to: extract Gabor features in multiple orientations from the preprocessed current-frame weld image to obtain Gabor feature maps in multiple orientations;
an HOG feature acquisition module configured to: extract HOG features from each of the multi-orientation Gabor feature maps to obtain HOG feature maps in multiple orientations, and weight-fuse the multi-orientation HOG features to obtain the weighted-fused HOG features;
wherein, among the multi-orientation HOG features, the HOG feature capable of highlighting the weld groove is given a first weight coefficient whose value range is 0.7-1, the HOG features that are not needed are given a second weight coefficient whose value range is 0-0.3, and the sum of the first weight coefficient and the second weight coefficient is 1, thereby obtaining the weighted-fused HOG features;
a training module configured to: train a correlation filter with the weighted-fused HOG features to obtain a trained correlation filter;
a target position tracking module configured to: convolve the weighted-fused HOG features with the trained correlation filter to obtain the response map of the correlation filter, the coordinate position with the maximum response value in the response map being the tracked target position of the current-frame weld image;
an update module configured to: update the target appearance template and the filter model;
a judgment module configured to: judge whether to continue tracking; if so, take the next single-frame weld image as the current-frame weld image and cyclically execute the preprocessing module through the judgment module, otherwise stop tracking.
8. A computer readable storage medium having a program stored thereon, which when executed by a processor, implements the steps of the directional characteristic driven weld target tracking method according to any one of claims 1 to 6.
9. An electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, wherein the processor performs the steps in the direction feature driven weld target tracking method according to any one of claims 1-6 when the program is executed.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310300244.0A CN116030098B (en) | 2023-03-27 | 2023-03-27 | Weld joint target tracking method and system based on directional characteristic driving |
PCT/CN2023/138661 WO2024198528A1 (en) | 2023-03-27 | 2023-12-14 | Target tracking method and system based on direction feature driving |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310300244.0A CN116030098B (en) | 2023-03-27 | 2023-03-27 | Weld joint target tracking method and system based on directional characteristic driving |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116030098A CN116030098A (en) | 2023-04-28 |
CN116030098B true CN116030098B (en) | 2023-06-13 |
Family
ID=86079728
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310300244.0A Active CN116030098B (en) | 2023-03-27 | 2023-03-27 | Weld joint target tracking method and system based on directional characteristic driving |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN116030098B (en) |
WO (1) | WO2024198528A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116030098B (en) * | 2023-03-27 | 2023-06-13 | 齐鲁工业大学(山东省科学院) | Weld joint target tracking method and system based on directional characteristic driving |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103426179A (en) * | 2012-05-17 | 2013-12-04 | 深圳中兴力维技术有限公司 | Target tracking method and system based on mean shift multi-feature fusion |
CN113706580A (en) * | 2021-08-11 | 2021-11-26 | 西安交通大学 | Target tracking method, system, equipment and medium based on relevant filtering tracker |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
MY137246A (en) * | 2002-04-30 | 2009-01-30 | Jfe Steel Corp | Method and instrument for measuring bead cutting shape of electric welded tube |
CN104200237B (en) * | 2014-08-22 | 2019-01-11 | 浙江生辉照明有限公司 | One kind being based on the High-Speed Automatic multi-object tracking method of coring correlation filtering |
CN105224849B (en) * | 2015-10-20 | 2019-01-01 | 广州广电运通金融电子股份有限公司 | A kind of multi-biological characteristic fusion authentication identifying method and device |
CN107767405B (en) * | 2017-09-29 | 2020-01-03 | 华中科技大学 | Nuclear correlation filtering target tracking method fusing convolutional neural network |
CN109753846A (en) * | 2017-11-03 | 2019-05-14 | 北京深鉴智能科技有限公司 | Target following system for implementing hardware and method |
CN109685073A (en) * | 2018-12-28 | 2019-04-26 | 南京工程学院 | A kind of dimension self-adaption target tracking algorism based on core correlation filtering |
CN210334836U (en) * | 2019-01-15 | 2020-04-17 | 深圳大学 | Weld seam line tracking means based on parallel structure light |
CN112700441A (en) * | 2021-01-28 | 2021-04-23 | 中北大学 | Automatic weld searching method based on textural features |
CN114723783A (en) * | 2022-04-01 | 2022-07-08 | 南京信息工程大学 | Multi-feature satellite video target tracking method based on motion estimation |
CN114905507A (en) * | 2022-04-18 | 2022-08-16 | 广州东焊智能装备有限公司 | Welding robot precision control method based on environment vision analysis |
CN116071392A (en) * | 2023-01-31 | 2023-05-05 | 齐鲁工业大学(山东省科学院) | Moving target detection method and system combined with foreground contour extraction |
CN116030098B (en) * | 2023-03-27 | 2023-06-13 | 齐鲁工业大学(山东省科学院) | Weld joint target tracking method and system based on directional characteristic driving |
2023
- 2023-03-27 CN CN202310300244.0A patent/CN116030098B/en active Active
- 2023-12-14 WO PCT/CN2023/138661 patent/WO2024198528A1/en unknown
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103426179A (en) * | 2012-05-17 | 2013-12-04 | 深圳中兴力维技术有限公司 | Target tracking method and system based on mean shift multi-feature fusion |
CN113706580A (en) * | 2021-08-11 | 2021-11-26 | 西安交通大学 | Target tracking method, system, equipment and medium based on relevant filtering tracker |
Also Published As
Publication number | Publication date |
---|---|
CN116030098A (en) | 2023-04-28 |
WO2024198528A1 (en) | 2024-10-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |