CN116030098B - Weld joint target tracking method and system based on directional characteristic driving - Google Patents


Info

Publication number
CN116030098B
CN116030098B (application CN202310300244.0A)
Authority
CN
China
Prior art keywords
weld
image
hog
target
current frame
Prior art date
Legal status
Active
Application number
CN202310300244.0A
Other languages
Chinese (zh)
Other versions
CN116030098A (en)
Inventor
马凤英
孙玉和
纪鹏
罗光欣
陈新明
Current Assignee
Qilu University of Technology
Original Assignee
Qilu University of Technology
Priority date
Filing date
Publication date
Application filed by Qilu University of Technology filed Critical Qilu University of Technology
Priority to CN202310300244.0A priority Critical patent/CN116030098B/en
Publication of CN116030098A publication Critical patent/CN116030098A/en
Application granted granted Critical
Publication of CN116030098B publication Critical patent/CN116030098B/en
Priority to PCT/CN2023/138661 priority patent/WO2024198528A1/en

Links

Images

Classifications

    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; coarse-fine approaches, e.g. multi-scale approaches; using context analysis; selection of dictionaries
    • G06V10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • Y02P90/30 Computing systems specially adapted for manufacturing


Abstract

The invention provides a weld target tracking method and system based on directional feature driving, belonging to the field of visual target tracking. A plurality of single-frame weld images are acquired, and the target to be tracked is framed in the first single-frame weld image; Gabor features in multiple directions are extracted from the preprocessed weld image of the current frame; HOG features are extracted from the Gabor feature maps in the multiple directions, and the HOG features in the multiple directions are weighted and fused; a correlation filter is trained with the weighted-fused HOG features; the weighted-fused HOG features are convolved with the trained correlation filter to obtain the response map of the correlation filter, and the coordinate position with the maximum response value in the response map is the target position tracked in the weld image of the current frame. By applying the weighted-fusion feature extraction algorithm combining Gabor features and HOG features, the method and system can ensure the accuracy and robustness of the target tracking algorithm.

Description

Weld joint target tracking method and system based on directional characteristic driving
Technical Field
The invention belongs to the technical field of visual target tracking, and particularly relates to a weld target tracking method and system based on directional characteristic driving.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Target tracking is an important branch of computer vision and is widely applied in security, surveillance, inspection, smart living, intelligent industry and other fields. Since the concept of target tracking was proposed, the field has made remarkable progress through the joint research efforts of scholars at home and abroad.
When two plates of a certain thickness are welded, a groove is often cut at the joint; welding along the groove joins the plates more firmly. During welding, under the oblique illumination of an auxiliary infrared laser source, laser stripes at a certain angle appear at the weld groove, and an industrial camera captures weld images carrying the laser stripes in real time by the optical triangulation method for target tracking. However, the welding process is subject to various disturbances such as smoke, arc light and metal spatter; owing to the characteristics of the captured weld images and these disturbances, together with complicated noise such as illumination changes, the distinction between the target region and the background region is not obvious. A plain KCF target tracking algorithm can hardly guarantee accuracy and robustness in the face of such conditions, which remain a difficulty in the field of weld target tracking.
Disclosure of Invention
In order to overcome the above deficiencies of the prior art, the invention provides a weld target tracking method and system based on directional feature driving. By applying a weighted-fusion feature extraction algorithm combining Gabor features and HOG features, the accuracy and robustness of the target tracking algorithm can be ensured even under complicated noise interference such as illumination changes and when the target region to be tracked is hard to distinguish from the background.
To achieve the above object, one or more embodiments of the present invention provide the following technical solutions:
A first aspect of the invention provides a weld target tracking method based on directional feature driving.
A weld joint target tracking method based on direction characteristic driving comprises the following steps:
step one: acquiring a plurality of continuous single-frame weld images, and framing the target to be tracked in the first acquired single-frame weld image;
step two: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
step three: extracting Gabor characteristics of multiple directions from the preprocessed weld joint image of the current frame to obtain Gabor characteristic diagrams of the multiple directions;
step four: extracting HOG features from the Gabor feature graphs in multiple directions respectively to obtain HOG feature graphs in multiple directions, and carrying out weighted fusion on the HOG features in multiple directions to obtain HOG features after weighted fusion;
step five: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
step six: carrying out convolution operation on the HOG characteristics after weighted fusion and the trained relevant filter to obtain a response diagram of the relevant filter, wherein the coordinate position with the maximum response value in the response diagram is the target position tracked by the weld joint image of the current frame;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking; if so, taking the next single-frame weld image as the current-frame weld image and executing steps two to eight in a loop; otherwise, stopping tracking.
The second aspect of the invention provides a weld target tracking system based on directional feature driving.
A directional feature driven weld target tracking system comprising:
a single-frame weld image acquisition module configured to: acquire a plurality of continuous single-frame weld images, and frame the target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
the Gabor feature map acquisition module is configured to: extracting Gabor characteristics of multiple directions from the preprocessed weld joint image of the current frame to obtain Gabor characteristic diagrams of the multiple directions;
the HOG feature acquisition module is configured to: extracting HOG features from the Gabor feature graphs in multiple directions respectively to obtain HOG feature graphs in multiple directions, and carrying out weighted fusion on the HOG features in multiple directions to obtain HOG features after weighted fusion;
a training module configured to: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
a target location tracking module configured to: carrying out convolution operation on the HOG characteristics after weighted fusion and the trained relevant filter to obtain a response diagram of the relevant filter, wherein the coordinate position with the maximum response value in the response diagram is the target position tracked by the weld joint image of the current frame;
an update module configured to: updating the target appearance template and the filter model;
a judgment module configured to: judge whether to continue tracking; if so, take the next single-frame weld image as the current-frame weld image and execute the preprocessing module through the judgment module in a loop; otherwise, stop tracking.
A third aspect of the present invention provides a computer readable storage medium having stored thereon a program which when executed by a processor performs the steps in the direction feature driven weld target tracking method according to the first aspect of the present invention.
A fourth aspect of the invention provides an electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, the processor implementing the steps in the weld target tracking method based on directional feature driving according to the first aspect of the invention when executing the program.
The one or more of the above technical solutions have the following beneficial effects:
the algorithm provided by the invention better matches the image characteristics of the groove during welding. Before the HOG features are extracted, the Gabor feature extraction algorithm extracts features in different directions; the angles favorable for tracking the weld are then selected according to the angular characteristics of the weld groove and given larger weights during weighted fusion. In this way the required features are highlighted while unnecessary features are de-emphasized as much as possible, so the features needed in welding work are extracted more accurately; at the same time the algorithm is more stable, the error fluctuation range is smaller, and accuracy and robustness are improved.
Additional aspects of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention.
Fig. 1 is a flow chart of a method of a first embodiment.
Fig. 2 is a preprocessing flow chart.
FIG. 3 is a diagram showing the experimental comparison of the filtering method of the present invention with other filtering methods.
Fig. 4 is a Gabor feature extraction visual view.
Fig. 5 is a system configuration diagram of the second embodiment.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present invention.
Embodiments of the invention and features of the embodiments may be combined with each other without conflict.
The invention provides a general idea:
the invention can effectively solve the problem of poor target tracking performance caused by various disturbances such as smoke, arc light and metal spatter and by the indistinct separation between the target region and the background. Because the angle of the laser stripes at the weld groove differs from that in other planar regions, the Gabor feature extraction algorithm is adopted to extract detailed texture features of the image in different directions, so that the angular characteristics of the laser stripes at the weld groove can be exploited. After HOG feature extraction, when the images at all angles are weighted and fused, the weight coefficient at the groove is increased to highlight the groove region while the remaining background regions are de-emphasized, ensuring accuracy and robustness during weld tracking.
Example 1
The embodiment discloses a weld joint target tracking method based on direction characteristic driving.
As shown in fig. 1, a method for tracking a weld target based on directional characteristic driving includes the following steps:
step one: acquiring a plurality of continuous single-frame weld images, and framing the target to be tracked in the first acquired single-frame weld image;
step two: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
step three: extracting Gabor characteristics of multiple directions from the preprocessed weld joint image of the current frame to obtain Gabor characteristic diagrams of the multiple directions;
step four: extracting HOG features from the Gabor feature graphs in multiple directions respectively to obtain HOG feature graphs in multiple directions, and carrying out weighted fusion on the HOG features in multiple directions to obtain HOG features after weighted fusion;
step five: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
step six: carrying out convolution operation on the HOG characteristics after weighted fusion and the trained relevant filter to obtain a response diagram of the relevant filter, wherein the coordinate position with the maximum response value in the response diagram is the target position tracked by the weld joint image of the current frame;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking; if so, taking the next single-frame weld image as the current-frame weld image and executing steps two to eight in a loop; otherwise, stopping tracking.
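The per-frame loop of steps two through eight can be sketched as below. This is a minimal skeleton under assumed interfaces: `extract`, `train` and `detect` are placeholders for the preprocessing/Gabor/HOG pipeline of steps two to four and the correlation-filter routines of steps five and six, and `eta` is an assumed learning rate for the step-seven updates (the text does not give one).

```python
import numpy as np

def track(frames, extract, train, detect, eta=0.02):
    # Skeleton of steps two-eight. `extract`, `train`, `detect` stand in
    # for the feature pipeline and the correlation-filter training and
    # detection; `eta` is an assumed learning rate.
    template = extract(frames[0])        # features of the first frame
    alpha = train(template)              # step 5: train the filter
    positions = []
    for frame in frames[1:]:
        feat = extract(frame)            # steps 2-4 on the current frame
        positions.append(detect(alpha, template, feat))  # step 6: respond
        template = (1 - eta) * template + eta * feat     # step 7: template
        alpha = (1 - eta) * alpha + eta * train(feat)    # step 7: filter
    return positions
```

The loop ends (step eight) when `frames` is exhausted; in a live system the iteration would instead be gated on a continue-tracking flag.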
Specific:
the plurality of Shan Zhen weld images in step one, including each frame of picture or each frame of picture in video captured in real time by the industrial camera. And acquiring a first frame of welding line image in the captured video, and framing a target to be tracked.
In step two, the preprocessing flow for the laser stripes at the weld is shown in fig. 2; the steps are as follows:
(1) First, the weld image is grayed to obtain a grayscale image. For graying, the invention selects the weighted-average method.
(2) After the grayscale image is obtained, it is smoothed to remove image noise. For smoothing, the invention selects Gaussian filtering.
(3) After the smoothing of step (2) has removed the noise, the image is binarized so that it changes from a 0-255 grayscale state to a black-or-white binary state. For binarization, the invention selects a global thresholding method.
(4) After the binarized image is obtained, morphological filtering with opening or closing operations is applied. The invention selects the closing operation for morphological filtering.
The effect of these four preprocessing steps is shown in fig. 3; the final result is a weld image from which noise such as arc light, spatter and illumination has been filtered out as far as possible.
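The four preprocessing steps can be sketched as below in plain NumPy. This is only an illustrative sketch: the threshold value 128, the 5x5 Gaussian kernel and the 3x3 structuring element are assumptions, and in practice OpenCV's `cv2.cvtColor`, `cv2.GaussianBlur`, `cv2.threshold` and `cv2.morphologyEx` would replace these hand-rolled helpers.

```python
import numpy as np

def to_gray(rgb):
    # (1) weighted-average graying (ITU-R BT.601 weights assumed)
    return rgb @ np.array([0.299, 0.587, 0.114])

def gaussian_kernel(size=5, sigma=1.0):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return k / k.sum()

def conv2d(img, k):
    # naive 'same' convolution with zero padding
    p = k.shape[0] // 2
    padded = np.pad(img, p)
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def preprocess(rgb, thresh=128):
    gray = to_gray(rgb)                           # (1) graying
    smooth = conv2d(gray, gaussian_kernel())      # (2) Gaussian smoothing
    binary = (smooth >= thresh).astype(np.uint8)  # (3) global binarization
    # (4) morphological closing: dilation (max) then erosion (min), 3x3
    win = lambda a, f: np.array([[f(a[i:i + 3, j:j + 3])
                                  for j in range(binary.shape[1])]
                                 for i in range(binary.shape[0])])
    dilated = win(np.pad(binary, 1), np.max)
    return win(np.pad(dilated, 1), np.min)
```

Feeding a synthetic image containing a bright horizontal stripe (a stand-in for the laser stripe) yields a clean binary stripe with the background suppressed.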
In step three, Gabor features in several directions are extracted from the weld image preprocessed in step two, giving the Gabor feature maps in several directions shown in fig. 4: fig. 4(a) is a visualization of Gabor filters with kernel sizes 5, 7, 9, 11 and 13 and angles 0°, 30°, 45°, 60°, 90° and 135°, and fig. 4(b) shows the Gabor feature extraction results.
Gabor features are features that can be used to describe image texture information, and the frequency and direction of Gabor filters are particularly suitable for texture representation and discrimination.
The method comprises the following specific steps:
the mathematical expression of the Gabor function is:
Figure SMS_1
(1)
wherein ,
Figure SMS_8
representing the wavelength of the filtering, +.>
Figure SMS_5
Representing the tilt angle, < +.>
Figure SMS_17
Represents the phase offset, ranging from-180 DEG to 180 DEG,>
Figure SMS_6
represents the standard deviation of the gaussian function, +.>
Figure SMS_15
Representing aspect ratio,/->
Figure SMS_14
and />
Figure SMS_20
The coordinate positions of the pixels are indicated.
Figure SMS_16
and />
Figure SMS_21
Representing the coordinate position of the pixel before rotation; />
Figure SMS_2
,/>
Figure SMS_10
,/>
Figure SMS_4
and />
Figure SMS_13
Representing the coordinate position of the rotated pixel. Wherein->
Figure SMS_19
The value of (2) may be provided with a plurality of angles, in this embodiment +.>
Figure SMS_23
The values of (1) are set to 0 °, 30 °, 45 °, 60 °,90 °, and 135 °, the kernel sizes of the filter bank are 5, 7, 9, 11, and 13, and gabor feature extraction visual views are shown in fig. 4. Experiment comparison tableIt is clear that the angles required for the image of this example are 45 °,90 ° and 135 °, the kernel size being 5. The Gabor features in the 45 DEG and 135 DEG directions are combined into a feature map +.>
Figure SMS_7
Gabor characteristic of the 90 DEG orientation +.>
Figure SMS_11
Is temporarily unchanged. The Gabor characteristic obtained is->
Figure SMS_18
The feature model at the feature level is denoted +.>
Figure SMS_22
, wherein />
Figure SMS_3
Indicate->
Figure SMS_12
Frame (F)>
Figure SMS_9
Representing the feature model.
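Equation (1) and the per-orientation filtering can be sketched as below; only the real part of the complex Gabor response is kept, the parameter values other than kernel size and angle ($\lambda$, $\sigma$, $\psi$, $\gamma$) are illustrative assumptions, and OpenCV's `cv2.getGaborKernel` with `cv2.filter2D` is the usual production path.

```python
import numpy as np

def gabor_kernel(ksize=5, theta=0.0, lam=4.0, psi=0.0, sigma=2.0, gamma=0.5):
    # Real part of the Gabor function of equation (1); lam/psi/sigma/gamma
    # values are assumptions, not taken from the patent text.
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    x_r = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinate x'
    y_r = -x * np.sin(theta) + y * np.cos(theta)   # rotated coordinate y'
    return (np.exp(-(x_r**2 + gamma**2 * y_r**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * x_r / lam + psi))

def gabor_maps(img, angles=(45, 90, 135), ksize=5):
    # One filtered map per orientation, applied in the frequency domain
    # (the same FFT trick the correlation filter uses later).
    out = []
    for a in angles:
        k = gabor_kernel(ksize=ksize, theta=np.deg2rad(a))
        K = np.fft.fft2(k, s=img.shape)
        out.append(np.real(np.fft.ifft2(np.fft.fft2(img) * K)))
    return out
```

The angle set (45°, 90°, 135°) and kernel size 5 match the values the embodiment reports as best for the weld images.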
In step four, HOG features are extracted from each of the Gabor feature maps in the several directions obtained in step three, giving HOG features in several directions. The Histogram of Oriented Gradients (HOG) feature is a feature descriptor used for object detection in computer vision and image processing. HOG features are formed by computing and accumulating histograms of gradient directions over local regions of the image.
The method comprises the following steps:
and extracting HOG features from the Gabor feature map after the Gabor feature map is obtained. Is provided with
Figure SMS_24
The direction is horizontal direction, the>
Figure SMS_25
The direction is the vertical direction; is provided with->
Figure SMS_26
The pixel value at is +.>
Figure SMS_27
Then->
Figure SMS_28
Direction and->
Figure SMS_29
The calculation formula of the direction gradient value is as follows:
Figure SMS_30
(2)
Figure SMS_31
(3)
wherein ,
Figure SMS_33
representing a horizontal gradient>
Figure SMS_35
Representing the vertical gradient +.>
Figure SMS_37
、/>
Figure SMS_34
Respectively represent pixel points in the input image>
Figure SMS_36
A horizontal gradient and a vertical gradient. Pixel dot +.>
Figure SMS_38
Gradient of the part->
Figure SMS_39
And direction->
Figure SMS_32
Expressed as:
Figure SMS_40
(4)
Figure SMS_41
(5)
after the acquired gradient and the gradient direction are calculated to obtain a gradient direction histogram, the amplitude is normalized to obtain HOG characteristics of the two Gabor characteristics
Figure SMS_42
and />
Figure SMS_43
. The feature model of the obtained HOG features on the feature level is expressed as +.>
Figure SMS_44
, wherein />
Figure SMS_45
Indicate->
Figure SMS_46
And (3) a frame.
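Equations (2)-(5) for a single cell can be sketched as below; the 9-bin unsigned-orientation layout is the common HOG convention, assumed here, and a full descriptor would additionally perform block normalization over many cells.

```python
import numpy as np

def hog_cell_histogram(H, n_bins=9):
    # Gradients and orientation histogram of eqs. (2)-(5) for one cell.
    Gx = np.zeros_like(H, dtype=float)
    Gy = np.zeros_like(H, dtype=float)
    Gx[:, 1:-1] = H[:, 2:] - H[:, :-2]     # eq. (2): H(x+1,y) - H(x-1,y)
    Gy[1:-1, :] = H[2:, :] - H[:-2, :]     # eq. (3): H(x,y+1) - H(x,y-1)
    mag = np.hypot(Gx, Gy)                 # eq. (4): gradient magnitude
    ang = np.rad2deg(np.arctan2(Gy, Gx)) % 180.0   # eq. (5), unsigned
    hist = np.zeros(n_bins)
    bins = (ang // (180.0 / n_bins)).astype(int) % n_bins
    np.add.at(hist, bins.ravel(), mag.ravel())     # magnitude-weighted vote
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist       # amplitude normalization
```

For an image whose brightness increases left to right, every interior gradient is horizontal, so all votes land in the first orientation bin.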
Step five: the HOG features in the several directions are weighted and fused. Because the angle of the laser stripes at the weld groove differs from the angles in other regions of the image, the required regions can be highlighted by assigning different weights to the different angles. Therefore, among the HOG features in the several directions, the HOG features that highlight the weld groove are given a first weight coefficient ranging from 0.7 to 1, and the HOG features that are not needed are given a second weight coefficient ranging from 0 to 0.3; the sum of the first and second weight coefficients is 1. This yields the weighted-fused HOG features. The process de-emphasizes the unnecessary features as much as possible and also filters out part of the interference in the welding process, making the weighted-fused image easier to track.
The weighted fusion calculation formula is:

$$H=\beta H_1+(1-\beta)H_2 \quad (6)$$

where $\beta$ is the feature fusion coefficient; here $\beta$ takes the value 0.8.
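Equation (6) amounts to a single line; in this sketch `h1` is assumed to be the groove-highlighting HOG features (from the fused 45°/135° Gabor map) and `h2` the 90° features, with `beta` constrained to the stated first-weight range.

```python
import numpy as np

def fuse_hog(h1, h2, beta=0.8):
    # Eq. (6): beta weights the groove-highlighting features h1,
    # (1 - beta) the remaining features h2; beta = 0.8 per the embodiment.
    assert 0.7 <= beta <= 1.0   # first-weight range stated in the text
    return beta * np.asarray(h1, dtype=float) + (1 - beta) * np.asarray(h2, dtype=float)
```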
Step six: the correlation filter is trained.
Tracking algorithms based on kernelized correlation filtering, such as MOSSE, CSK, KCF, BACF and SAMF, introduce correlation filtering (a measure of the similarity of two signals) from the communications field into target tracking; because these filters follow the basic idea of correlation-filter tracking, they are called correlation filters, a common term in the art. Correlation-filter tracking began with the CSK method proposed by P. Martins in 2012, whose authors proposed a kernelized tracking method based on circulant matrices, solved the dense-sampling problem elegantly in mathematics, and used the Fourier transform to make the detection process fast. Compared with traditional algorithms such as optical flow, Kalman filtering and MeanShift, correlation filtering offers faster tracking and higher precision. The basic idea of correlation-filter tracking is to design a filter template and correlate it with the target candidate region; the position of the maximum output response is the target position of the current frame. This embodiment selects the KCF target tracking algorithm.
The training process for the correlation filter is as follows:
the KCF target tracking algorithm training correlation filter was implemented using ridge regression. The goal of training using ridge regression is to find a function
Figure SMS_50
,/>
Figure SMS_51
For the filter parameters +.>
Figure SMS_52
Representing the transpose of the matrix, let sample +.>
Figure SMS_53
And its regression objective->
Figure SMS_54
The least squares error of (2) and the loss function is:
Figure SMS_55
(7)
wherein ,
Figure SMS_56
to control the regularization parameters of the overfitting. The matrix form can be expressed as:
Figure SMS_57
(8)
in the formula ,
Figure SMS_58
is a cyclic matrix, consisting of samples->
Figure SMS_59
Composition; />
Figure SMS_60
Is->
Figure SMS_61
Corresponding tag vector, by regression target->
Figure SMS_62
Composition; when the value of the loss function is 0, the derivative is available:
Figure SMS_63
(9)
Figure SMS_64
is->
Figure SMS_65
Is->
Figure SMS_66
Transpose (S)>
Figure SMS_67
Is an identity matrix.
6.2. A circulant matrix can be diagonalized by the discrete Fourier matrix, which turns inversion of the matrix into inversion of its eigenvalues. The solution of $w$ can therefore be transferred to the frequency domain, where the discrete Fourier transform is applied to speed up the computation, and then transformed back to the time domain to obtain the solution with maximum response.

$$X=C(x)=\begin{bmatrix} x_1 & x_2 & \cdots & x_n\\ x_n & x_1 & \cdots & x_{n-1}\\ \vdots & \vdots & \ddots & \vdots\\ x_2 & x_3 & \cdots & x_1 \end{bmatrix} \quad (10)$$

The above is the circulant matrix $X$, whose first row is the initial sample $x$; each subsequent row is a cyclic shift of the initial sample. $X$ is diagonalized by the matrix form $F$ of the discrete Fourier transform:

$$X=F\,\mathrm{diag}(\hat{x})\,F^{H} \quad (11)$$

where $\hat{x}$ is the Fourier transform of $x$, $F$ is the matrix form of the discrete Fourier transform, and $F^{H}$ denotes the conjugate transpose of the Fourier-transform matrix $F$. Substituting into equation (9) gives:

$$X^{H}X+\lambda I=F\,\mathrm{diag}\!\left(\hat{x}^{*}\odot\hat{x}+\lambda\right)F^{H} \quad (12)$$

where $\hat{x}^{*}$ denotes the conjugate of $\hat{x}$ and $\odot$ denotes element-wise multiplication of corresponding positions. Using the diagonalization property to invert, one obtains:

$$w=F\,\mathrm{diag}\!\left(\frac{\hat{x}^{*}}{\hat{x}^{*}\odot\hat{x}+\lambda}\right)F^{H}y \quad (13)$$

where $\mathrm{diag}(\cdot)$ is a diagonal matrix and the division is element-wise. Using in turn the convolution property of circulant matrices, one obtains:

$$\hat{w}=\frac{\hat{x}^{*}\odot\hat{y}}{\hat{x}^{*}\odot\hat{x}+\lambda} \quad (14)$$

where $\hat{y}$ is the Fourier transform of $y$.
6.3. To avoid the high computational complexity of working in a high-dimensional space, a kernel function is sought; the formula can be expressed as:

$$\kappa(x_i,x_j)=\left\langle \varphi(x_i),\varphi(x_j)\right\rangle \quad (15)$$

where $x_i$ and $x_j$ belong to the circulant matrix $X$ in the low-dimensional space, $\varphi(\cdot)$ represents the nonlinear transformation function, $\langle\cdot,\cdot\rangle$ represents the inner product operation, and $\kappa$ represents the kernel function, which belongs to the high-dimensional space. Expressing $w$ through the nonlinear transformation $\varphi$ of the samples:

$$w=\sum_{i}\alpha_i\,\varphi(x_i) \quad (16)$$

where $\alpha_i$ denotes the dual-space coefficient of the $i$-th sample.
6.4. Substituting the above into equation (8), the solution for $w$ is converted into a solution for $\alpha$, giving:

$$\alpha=\left(K+\lambda I\right)^{-1}y \quad (17)$$

where $\alpha$ is the column vector of the dual-space coefficients $\alpha_i$ corresponding to the filter parameter $w$, i.e. the solution of the ridge regression problem; $K$, with entries $K_{ij}=\kappa(x_i,x_j)$, is the kernel matrix of the kernel space and is a circulant matrix. Exploiting the property that the kernel matrix can be diagonalized in the frequency domain yields the final solution of the ridge regression problem in the frequency domain:

$$\hat{\alpha}=\frac{\hat{y}}{\hat{k}^{xx}+\lambda} \quad (18)$$

where $k^{xx}$ is the first-row element of the kernel matrix $K$ and $\hat{k}^{xx}$ is its Fourier transform.
6.5. Let the result of the feature-weighted fusion in equation (6) above be the input $z$. The filter response then becomes:

$$f(z)=\left(K^{z}\right)^{T}\alpha \quad (19)$$

Let $K^{z}=C(k^{xz})$; $K^{z}$ is the kernel matrix of the kernel space between the training sample $x$ and the new input $z$, and is a circulant matrix. Fourier transforming the above using the properties of circulant matrices gives:

$$\hat{f}(z)=\hat{k}^{xz}\odot\hat{\alpha} \quad (20)$$

where $k^{xz}$ is the first-row element of the kernel matrix $K^{z}$ and is a vector: it is the kernel correlation of $x$ and $z$, and the matrix $K^{z}$ contains the cyclic shifts of $z$.
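The frequency-domain training of equation (18) and detection of equation (20) can be sketched for a single-channel feature map as below. This is a minimal sketch under stated assumptions: a linear kernel stands in for the kernel correlations $k^{xx}$ and $k^{xz}$ (KCF more commonly uses a Gaussian kernel), and the Gaussian regression label $y$, peaked at the wrapped origin as KCF assumes, is generated by a hypothetical helper rather than taken from the text.

```python
import numpy as np

def gaussian_label(n, sigma=2.0):
    # Regression target y peaked at the (cyclically wrapped) origin.
    d = np.minimum(np.arange(n), n - np.arange(n)).astype(float)
    g = np.exp(-0.5 * (d / sigma) ** 2)
    return np.outer(g, g)

def train_filter(x, y, lam=1e-4):
    # Eq. (18): alpha_hat = y_hat / (k_hat^{xx} + lambda), linear kernel.
    xf = np.fft.fft2(x)
    kxx = np.real(np.fft.ifft2(xf * np.conj(xf))) / x.size
    return np.fft.fft2(y) / (np.fft.fft2(kxx) + lam)

def detect_position(alpha_f, x, z):
    # Eq. (20): f_hat(z) = k_hat^{xz} element-wise alpha_hat; the argmax
    # of the real response map is the tracked position (step six).
    kxz = np.real(np.fft.ifft2(np.fft.fft2(z) * np.conj(np.fft.fft2(x)))) / x.size
    resp = np.real(np.fft.ifft2(np.fft.fft2(kxz) * alpha_f))
    return np.unravel_index(np.argmax(resp), resp.shape)
```

Because every cyclic shift of the training patch is an implicit training sample, cyclically shifting the input shifts the response peak by the same amount, which is what the argmax recovers.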
Step seven: after the kernel function has been trained, the newly input sample features are mapped into the kernel space, and all positions in the sample are evaluated through the kernel function to obtain the corresponding responses; the result is the response map, and the strongest response in the response map is the target position in this frame.
Step eight: the invention also extends the update strategy of the KCF target tracking algorithm; the update covers both the target appearance template and the filter model. For the target appearance template, the weighted-fused HOG features extracted from the current frame image are weighted-updated with the weighted-fused HOG features extracted from the previous frame image to obtain the target appearance template of the current frame. For the correlation-filter model, the correlation filter trained on the weighted-fused HOG features of the target image in the current frame is weighted-updated with the correlation filter obtained from the previous frame's update, giving the correlation filter of the current frame.
Step nine: judge whether to continue tracking; if so, return to step two, otherwise stop tracking.
Embodiment 2
The embodiment discloses a weld joint target tracking system based on direction characteristic driving.
As shown in fig. 5, the weld target tracking system based on directional characteristic driving comprises the following modules:
a single frame weld image acquisition module configured to: acquiring a plurality of consecutive single-frame weld images, and framing a target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
the Gabor feature map acquisition module is configured to: extracting Gabor features in a plurality of directions from the preprocessed current frame weld image to obtain Gabor feature maps in the plurality of directions;
the HOG feature acquisition module is configured to: extracting HOG features from the Gabor feature maps in the plurality of directions respectively to obtain HOG feature maps in the plurality of directions, and performing weighted fusion on the HOG features in the plurality of directions to obtain the weighted-fused HOG features;
a training module configured to: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
a target location tracking module configured to: performing a convolution operation between the weighted-fused HOG features and the trained correlation filter to obtain a response map of the correlation filter, wherein the coordinate position with the maximum response value in the response map is the target position tracked in the current frame weld image;
an update module configured to: updating the target appearance template and the filter model;
a judgment module configured to: judging whether to continue tracking; if so, taking the next single-frame weld image as the current frame weld image and circularly executing the preprocessing module through the judgment module; otherwise, stopping tracking.
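The feature-extraction chain of the modules above (directional Gabor feature maps, per-direction HOG, weighted fusion) can be sketched with NumPy alone. The kernel parameters, the four directions, the heavily reduced single-histogram HOG, and the fusion weights below are all illustrative assumptions, not the patent's values:

```python
import numpy as np

def gabor_kernel(ksize=9, lam=4.0, theta=0.0, psi=0.0, sigma=2.0, gamma=0.5):
    # Real part of the Gabor function; parameter values are assumed.
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + gamma ** 2 * yr ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam + psi)

def filter_image(img, kern):
    # Frequency-domain convolution with circular boundary, for brevity.
    pad = np.zeros_like(img, dtype=float)
    kh, kw = kern.shape
    pad[:kh, :kw] = kern
    return np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(pad)).real

def hog_histogram(feat, bins=9):
    # Heavily reduced stand-in for HOG: one global, magnitude-weighted
    # orientation histogram, L2-normalized.
    gx = np.gradient(feat, axis=1)
    gy = np.gradient(feat, axis=0)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    return hist / (np.linalg.norm(hist) + 1e-12)

def fused_hog(img, thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4),
              weights=(0.85, 0.05, 0.05, 0.05)):
    # Weighted fusion: the direction that highlights the weld groove gets
    # the large first coefficient, per the 0.7-1 / 0-0.3 weight ranges.
    hists = [hog_histogram(filter_image(img, gabor_kernel(theta=t)))
             for t in thetas]
    return sum(w * h for w, h in zip(weights, hists))
```

The weights sum to 1, matching the constraint that the first and second weight coefficients add to 1; in practice the groove-highlighting direction would be chosen per weld geometry.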
Embodiment 3
An object of the present embodiment is to provide a computer-readable storage medium.
A computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps in a method of weld target tracking based on directional characteristic actuation as described in embodiment 1 of the present disclosure.
Embodiment 4
An object of the present embodiment is to provide an electronic apparatus.
An electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, the processor implementing the steps in the directional feature driven weld target tracking method according to embodiment 1 of the present disclosure when the program is executed.
The steps involved in the devices of the second, third and fourth embodiments correspond to those of the first embodiment of the method, and the detailed description of the embodiments can be found in the related description section of the first embodiment. The term "computer-readable storage medium" should be taken to include a single medium or multiple media including one or more sets of instructions; it should also be understood to include any medium capable of storing, encoding or carrying a set of instructions for execution by a processor and that cause the processor to perform any one of the methods of the present invention.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by general-purpose computer means, alternatively they may be implemented by program code executable by computing means, whereby they may be stored in storage means for execution by computing means, or they may be made into individual integrated circuit modules separately, or a plurality of modules or steps in them may be made into a single integrated circuit module. The present invention is not limited to any specific combination of hardware and software.
While the foregoing description of the embodiments of the present invention has been presented in conjunction with the drawings, it should be understood that it is not intended to limit the scope of the invention, but rather, it is intended to cover all modifications or variations within the scope of the invention as defined by the claims of the present invention.

Claims (9)

1. A weld seam target tracking method based on direction characteristic driving, characterized by comprising the following steps:
step one: acquiring a plurality of consecutive single-frame weld images, and framing a target to be tracked in the first acquired single-frame weld image;
step two: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
step three: extracting Gabor features in a plurality of directions from the preprocessed current frame weld image to obtain Gabor feature maps in the plurality of directions;
step four: extracting HOG features from the Gabor feature maps in the plurality of directions respectively to obtain HOG feature maps in the plurality of directions, and performing weighted fusion on the HOG features in the plurality of directions to obtain weighted-fused HOG features;
among the HOG features in the plurality of directions, a first weight coefficient is assigned to the HOG features capable of highlighting the weld groove, the value range of the first weight coefficient being 0.7 to 1; a second weight coefficient is assigned to the unneeded HOG features, the value range of the second weight coefficient being 0 to 0.3; the sum of the first weight coefficient and the second weight coefficient is 1, thereby obtaining the weighted-fused HOG features;
step five: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
step six: performing a convolution operation between the weighted-fused HOG features and the trained correlation filter to obtain a response map of the correlation filter, wherein the coordinate position with the maximum response value in the response map is the target position tracked in the current frame weld image;
step seven: updating the target appearance template and the filter model;
step eight: judging whether to continue tracking; if so, taking the next single-frame weld image as the current frame weld image and circularly executing step two to step eight; otherwise, stopping tracking.
2. The directional characteristic driven weld target tracking method according to claim 1, wherein the plurality of single-frame weld images are individual frame pictures, or individual frames of a video, captured in real time by an industrial camera.
3. The method for tracking a weld target based on directional characteristic driving according to claim 1, wherein the current frame of the weld image is sequentially subjected to gray-scale processing, smoothing filtering processing, binarization processing, and morphological filtering processing of opening and closing operations to obtain a preprocessed image.
4. The directional characteristic driving-based weld target tracking method according to claim 1, wherein the mathematical expression of the Gabor function is:

$g(x, y; \lambda, \theta, \psi, \sigma, \gamma) = \exp\left(-\dfrac{x'^{2} + \gamma^{2} y'^{2}}{2\sigma^{2}}\right)\cos\left(2\pi\dfrac{x'}{\lambda} + \psi\right)$

where $\lambda$ represents the wavelength of the filtering, $\theta$ represents the tilt angle, $\psi$ represents the phase offset, $\sigma$ represents the standard deviation of the Gaussian function, $\gamma$ represents the aspect ratio, and $x$ and $y$ respectively represent the coordinate position of the pixel; $x' = x\cos\theta + y\sin\theta$, $y' = -x\sin\theta + y\cos\theta$, where $x'$ and $y'$ represent the coordinate position of the rotated pixel.
5. The method for tracking a weld target based on directional characteristic driving according to claim 1, wherein the pixel value at a pixel point $(x, y)$ of the Gabor feature image is obtained; gradient values in the $x$ direction and the $y$ direction are calculated based on the pixel value at the pixel point $(x, y)$; the gradient magnitude and gradient direction at the pixel point $(x, y)$ are calculated based on the gradient values in the $x$ direction and the $y$ direction; a gradient direction histogram is calculated based on the gradient magnitude and the gradient direction; and the amplitude of the gradient direction histogram is normalized to obtain the HOG feature.
6. The method for tracking a weld target based on directional characteristic driving according to claim 1, wherein updating the target appearance template and the filter model comprises: performing a weighted update of the features extracted from the current frame weld image with the features extracted from the previous frame weld image to update the target appearance template of the current frame; and performing a weighted update of the correlation filter obtained from the current frame weld image with the correlation filter obtained from the previous frame weld image to complete the updating of the filter model.
7. A weld joint target tracking system based on direction characteristic driving, characterized by comprising:
a single frame weld image acquisition module configured to: acquiring a plurality of consecutive single-frame weld images, and framing a target to be tracked in the first acquired single-frame weld image;
a preprocessing module configured to: preprocessing the current frame welding seam image to obtain a preprocessed current frame welding seam image;
the Gabor feature map acquisition module is configured to: extracting Gabor features in a plurality of directions from the preprocessed current frame weld image to obtain Gabor feature maps in the plurality of directions;
the HOG feature acquisition module is configured to: extracting HOG features from the Gabor feature maps in the plurality of directions respectively to obtain HOG feature maps in the plurality of directions, and performing weighted fusion on the HOG features in the plurality of directions to obtain weighted-fused HOG features;
among the HOG features in the plurality of directions, a first weight coefficient is assigned to the HOG features capable of highlighting the weld groove, the value range of the first weight coefficient being 0.7 to 1; a second weight coefficient is assigned to the unneeded HOG features, the value range of the second weight coefficient being 0 to 0.3; the sum of the first weight coefficient and the second weight coefficient is 1, thereby obtaining the weighted-fused HOG features;
a training module configured to: training a correlation filter by using the weighted and fused HOG characteristics to obtain a trained correlation filter;
a target location tracking module configured to: performing a convolution operation between the weighted-fused HOG features and the trained correlation filter to obtain a response map of the correlation filter, wherein the coordinate position with the maximum response value in the response map is the target position tracked in the current frame weld image;
an update module configured to: updating the target appearance template and the filter model;
a judgment module configured to: judging whether to continue tracking; if so, taking the next single-frame weld image as the current frame weld image and circularly executing the preprocessing module through the judgment module; otherwise, stopping tracking.
8. A computer readable storage medium having a program stored thereon, which when executed by a processor, implements the steps of the directional characteristic driven weld target tracking method according to any one of claims 1 to 6.
9. An electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, wherein the processor performs the steps in the direction feature driven weld target tracking method according to any one of claims 1-6 when the program is executed.