CN118397047B - Target tracking method and system based on recurrent neural network, and electronic device - Google Patents


Info

Publication number
CN118397047B
CN118397047B
Authority
CN
China
Prior art keywords
matching
target area
frame
optical flow
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410815952.2A
Other languages
Chinese (zh)
Other versions
CN118397047A (en)
Inventor
贺璟璐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xunce Technology Co ltd
Original Assignee
Shenzhen Xunce Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xunce Technology Co ltd
Priority to CN202410815952.2A
Publication of CN118397047A
Application granted
Publication of CN118397047B


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image motion analysis, and in particular to a target tracking method and system based on a recurrent neural network, and an electronic device. The method obtains a matching density coefficient for each target area on each frame image; obtains an optical flow interference coefficient for each target area on the frame after the frame to be detected, from the change in the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it; obtains a rotation threshold suppression coefficient from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images; combines the position distribution of the matching optical flow points of each target area between adjacent frame images to obtain an adjusted matching threshold; screens the matching optical flow points against this threshold to obtain the new target area in the next frame image; and takes the next frame image as the new frame to be detected, thereby realizing continuous tracking of the target. By adaptively deriving an accurate matching threshold for the optical flow points, the invention improves the accuracy of target positioning and tracking.

Description

Target tracking method and system based on recurrent neural network, and electronic device
Technical Field
The invention relates to the technical field of image motion analysis, and in particular to a target tracking method and system based on a recurrent neural network, and an electronic device.
Background
With the rapid development of computer vision and deep learning, target tracking has broad application prospects in video surveillance, autonomous driving, intelligent transportation, and other fields. A recurrent neural network (RNN) can capture the temporal dependencies in sequence data and has strong sequence modeling capability; combined with target detection and tracking techniques, it can accurately track a target object across consecutive frame images, and is therefore widely applied to target tracking tasks in video sequences.
In recurrent-neural-network target tracking, factors such as the target's own motion, changes in camera viewing angle, and changes in the distance between the target and the camera cause the target's scale to vary markedly and dynamically, which greatly increases tracking difficulty and may even cause tracking failure. In the prior art, an optical flow field is constructed to obtain the optical flow points of a video sequence for target tracking, but an inaccurate optical flow point matching threshold undermines the effectiveness of the matching: if the threshold is set too high, valid optical flow points may be wrongly discarded, reducing tracking accuracy; conversely, if it is set too low, excessive noise points are introduced, again degrading accuracy. The target then cannot be effectively distinguished from the background, and the tracking effect is poor.
Disclosure of Invention
In order to solve the technical problem of inaccurate target tracking caused by an inaccurate optical flow point matching threshold, the invention provides a target tracking method and system based on a recurrent neural network, and an electronic device. The technical scheme adopted is as follows:
One embodiment of the invention provides a target tracking method based on a recurrent neural network, comprising the following steps:
Acquiring target video training data containing a moving target;
taking the initial frame image of the target video training data as the frame image to be detected, and acquiring the image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the subsequent frame images;
obtaining a matching density coefficient for each target area on each frame image, from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient for each target area on the frame after the frame to be detected, from the change difference of the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it; obtaining a rotation threshold suppression coefficient from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images; adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images, to obtain an adjusted matching threshold;
screening the matching optical flow points against the adjusted matching threshold to obtain the target area in the next frame image; and, based on the recurrent neural network, taking the next frame image as the new frame image to be detected, thereby realizing continuous tracking of the target.
Further, the method for acquiring the matching optical flow points comprises the following steps:
obtaining an optical flow matching distance according to the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the other frame images;
if the optical flow matching distance of a pixel point between a target area on the frame image to be detected and another frame image is smaller than or equal to a preset matching threshold, the corresponding pixel point is a matching optical flow point.
Further, the method for acquiring the optical flow matching distance comprises the following steps:
for each frame image, obtaining the neighborhood gray feature of each pixel point according to the ratio of the gray fluctuation degree to the gray level of the pixels in the neighborhood range of the pixel point;
calculating the sum of squares of the position coordinates of each pixel point as its region position feature; obtaining the group feature of each pixel point according to the distribution quantity of the gray value corresponding to the pixel point;
obtaining the feature set of each pixel point according to its neighborhood gray feature, region position feature, and group feature;
constructing the optical flow field of the target video training data, and obtaining the optical flow matching distance according to the change degree of the feature sets of corresponding pixel points between each target area on the frame image to be detected and the other frame images in the optical flow field.
Further, the method for obtaining the matching density coefficient comprises the following steps:
calculating the Euclidean distance between the matching optical flow points on each frame image as the relative distance;
obtaining a matching density coefficient according to a calculation formula of the matching density coefficient, wherein the calculation formula of the matching density coefficient is as follows:
$$\rho_n^{\,i}=\frac{1}{K_i}\sum_{a=1}^{K_i}\frac{1}{K_i}\sum_{b=1}^{K_i}G\!\left(\frac{D\!\left(p_a^{\,i},p_b^{\,i}\right)}{h}\right)$$
wherein $\rho_n^{\,i}$ represents the matching density coefficient of the $i$-th target area on the $n$-th frame image; $K_i$ represents the number of matching optical flow points in the $i$-th target area; $p_a^{\,i}$ and $p_b^{\,i}$ represent the $a$-th and $b$-th matching optical flow points of the $i$-th target area on the $n$-th frame image; $D(\cdot,\cdot)$ represents the Euclidean distance; $G(\cdot)$ represents a Gaussian kernel function; $h$ represents the bandwidth parameter.
Further, the method for acquiring the optical flow interference coefficient comprises the following steps:
obtaining the density change degree of each target area between two frame images according to the difference of its matching density coefficients between the two frame images;
obtaining the optical flow interference coefficient of each target area on the frame image after the frame to be detected according to the difference between the density change degree of the target area from the frame image to be detected to the second frame image after it and that from the frame image to be detected to the next frame image.
Further, the method for acquiring the rotation threshold suppression coefficient comprises the following steps:
taking the relative distances between the matching optical flow points of each target area on each frame image as distribution characteristics;
obtaining the rotation threshold suppression coefficient according to the fluctuation degree of the differences in the distribution characteristics of each target area's matching optical flow points between different frame images.
Further, the method for obtaining the adjusted matching threshold comprises the following steps:
obtaining an area expansion coefficient according to the position distribution of the matching optical flow points of each target area between adjacent frame images;
obtaining the adjusted matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, the area expansion coefficient, and the preset matching threshold; the optical flow interference coefficient, the area expansion coefficient, and the preset matching threshold are positively correlated with the adjusted matching threshold, and the rotation threshold suppression coefficient is negatively correlated with it.
Further, the method for acquiring the area expansion coefficient comprises the following steps:
constructing the minimum circumscribed rectangle of each target area's matching optical flow points on the frame image to be detected as the first target area; constructing the minimum circumscribed rectangle of each target area's matching optical flow points on the next frame image as the second target area;
multiplying the length and width of the first target area by the density change degree, respectively, to obtain the length and width of a predicted target area; constructing the minimum circumscribed rectangle of the second target area and the predicted target area together as the overall target area;
calculating the ratio of the difference between the areas of the overall target area and the second target area to the area of the second target area, and adding a preset constant to this ratio to obtain the area expansion coefficient.
An embodiment of the present invention further provides a target tracking system based on a recurrent neural network, the system comprising:
a data acquisition module: acquiring target video training data containing a moving target;
an optical flow matching module: taking the initial frame image of the target video training data as the frame image to be detected, and acquiring the image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the subsequent frame images;
a threshold adjustment module: obtaining a matching density coefficient for each target area on each frame image, from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient for each target area on the frame after the frame to be detected, from the change difference of the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it; obtaining a rotation threshold suppression coefficient from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images; adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images, to obtain an adjusted matching threshold;
a target tracking module: screening the matching optical flow points against the adjusted matching threshold to obtain the target area in the next frame image; and, based on the recurrent neural network, taking the next frame image as the new frame image to be detected, thereby realizing continuous tracking of the target.
The invention also provides an electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when executing the computer program, the processor implements the steps of any of the above target tracking methods based on a recurrent neural network.
The invention has the following beneficial effects:
The method takes the initial frame image of the target video training data as the frame image to be detected and acquires the image area of the moving target in it as the target area; a target tracking algorithm generally starts from the initial frame and determines the image area of the moving target in that frame as the reference for subsequent tracking. Matching optical flow points are screened out by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the other frame images; this evaluates the motion of the target across consecutive frame images and removes invalid optical flow points, improving the accuracy and stability of target tracking. A matching density coefficient is obtained for each target area on each frame image from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image, reflecting how the target area changes across consecutive frames. In practical applications, background noise, illumination changes, and other factors may interfere with optical flow matching; the optical flow interference coefficient of each target area on the frame after the frame to be detected, obtained from the change differences of the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it, estimates this degree of interference. Rotation of the target changes the distribution of the optical flow points; a rotation threshold suppression coefficient, obtained from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images, quantifies this change. The preset matching threshold is adjusted according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images; dynamically adjusting the matching threshold ensures that the algorithm maintains high accuracy and stability under different conditions. The matching optical flow points are screened against the adjusted matching threshold to obtain the target area in the next frame image, determining the target's new position; and, based on the recurrent neural network, the next frame image is taken as the new frame image to be detected, realizing continuous tracking: the recurrent neural network can learn the target's motion pattern and predict its position in the new frame. By adaptively deriving an accurate matching threshold for the optical flow points, the invention improves the accuracy of target positioning and tracking.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a target tracking method based on a recurrent neural network according to an embodiment of the present invention;
FIG. 2 is a block diagram of a target tracking system based on a recurrent neural network according to an embodiment of the present invention;
FIG. 3 is a block diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve its intended purpose, the target tracking method and system based on a recurrent neural network and the electronic device provided by the invention are described in detail below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following describes the specific scheme of the target tracking method and system based on a recurrent neural network and the electronic device provided by the invention, with reference to the accompanying drawings.
Referring to fig. 1, a flowchart of a target tracking method based on a recurrent neural network according to an embodiment of the present invention is shown, and the specific method includes:
Step S1: target video training data including a moving target is acquired.
In the embodiment of the invention, factors such as the target's own motion, changes in camera viewing angle, and changes in the distance between the target and the camera cause the target's scale to change during detection, increasing the difficulty of target tracking. In order to accurately track the target's change trend, information about the environment of the moving target during operation is first collected in real time by the acquisition cameras on the moving target; the collected video information is sent in real time to the moving target's electronic control unit (ECU) for calculating and tracking road targets, so that the tracking information of the moving target can be analyzed subsequently. Target video training data containing the moving target is thereby acquired. In one embodiment of the invention, the moving target is a vehicle, and the video training data is the video data of vehicle target tracking information acquired while the vehicle is running.
Step S2: taking the initial frame image of the target video training data as the frame image to be detected, and acquiring the image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the subsequent frame images.
In video target tracking, the initial frame is usually the beginning of the video, where the state and position of the target are relatively stable; it provides an explicit starting point so that the target to be tracked or analyzed can be accurately established, and taking the initial frame image as the frame image to be detected ensures the stability and accuracy of the target area. The initial frame image of the target video training data is therefore taken as the frame image to be detected, and the image area of the moving target in it is acquired as the target area.
It should be noted that, in general, the location and extent of the moving target may be determined by manual annotation or with an annotation tool. In one embodiment of the present invention, a YOLO neural network model may be used to label the target in the target video training data, so as to obtain the image area of the moving target in the frame image to be detected as the target area. The YOLO neural network model is a technical means well known to those skilled in the art and will not be described herein.
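As a concrete illustration of this step, the following minimal sketch uses a YOLO detector to obtain the initial target areas. The embodiment names a YOLO model but no specific version or library, so the ultralytics package, the "yolov8n.pt" weights, and the video path below are assumptions for demonstration only.

```python
# Hedged sketch: the ultralytics package, weights file, and video path
# are assumptions; the embodiment only says a YOLO model may be used.
import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # hypothetical pretrained detector

def detect_target_areas(frame):
    """Return bounding boxes (x1, y1, x2, y2) of moving targets in a frame."""
    results = model(frame, verbose=False)
    return results[0].boxes.xyxy.cpu().numpy()  # one row per detected target

cap = cv2.VideoCapture("target_video.mp4")  # assumed target video training data
ok, frame_to_detect = cap.read()            # initial frame = frame to be detected
target_areas = detect_target_areas(frame_to_detect) if ok else []
```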
The specific construction of an optical flow field is a technical means well known to those skilled in the art and will not be described herein.
In the target video training data, multiple acquisition targets exist; the acquiring target and the targets being detected and tracked can move relative to one another across consecutive frames, the scale of the same target can change across frames, and relative occlusion may also occur, so the size of a target's tracking frame changes from frame to frame. Optical flow analysis matches and tracks pixel points across consecutive video frames, describing the dynamic trend of each pixel's gray-level change between frames; acquiring the matching optical flow points characterizes the motion of the same target across consecutive frames. Matching optical flow points are therefore screened out by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the subsequent frame images.
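For concreteness, the following sketch builds a dense optical flow field between two frames with OpenCV's Farneback method; the choice of algorithm and its parameter values are assumptions, as the embodiment does not prescribe a specific optical flow construction.

```python
import cv2

def dense_optical_flow(prev_frame, next_frame):
    """Per-pixel optical flow field between two consecutive frame images.
    Farneback parameters below are illustrative assumptions."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_frame, cv2.COLOR_BGR2GRAY)
    # args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags
    # result: flow[y, x] = (dx, dy), displacement of pixel (x, y) between frames
    return cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
```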
Preferably, in one embodiment of the present invention, the method for acquiring the matching optical flow points includes:
obtaining an optical flow matching distance according to the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the other frame images;
if the optical flow matching distance of a pixel point between a target area on the frame image to be detected and another frame image is smaller than or equal to a preset matching threshold, the corresponding pixel point is a matching optical flow point.
The objective of optical flow matching is to minimize pixel differences between frames so as to obtain accurate motion estimation; by matching the optical flow points, the algorithm can track the motion trajectories of pixels or feature points in the image and thus understand the target's motion pattern more accurately. It should be noted that, in one embodiment of the present invention, the preset matching threshold is 5; in other embodiments, the preset matching threshold may be set according to the specific situation, which is not limited or described in detail here.
The optical flow matching distance is a key index for measuring the motion similarity of pixels or feature points between frame images. By calculating it, the pixel points that remain consistent between different frame images can be identified, related feature points can be effectively searched and matched across consecutive frame images, and the motion trajectory and speed of the target can be estimated more accurately. The optical flow matching distance is acquired for this screening.
Preferably, in one embodiment of the present invention, the method for acquiring the optical flow matching distance includes:
for each frame image, obtaining the neighborhood gray feature of each pixel point according to the ratio of the gray fluctuation degree to the gray level of the pixels in the neighborhood range of the pixel point; calculating the sum of squares of the position coordinates of each pixel point as its region position feature; obtaining the group feature of each pixel point according to the distribution quantity of the gray value corresponding to the pixel point;
obtaining the feature set of each pixel point according to its neighborhood gray feature, region position feature, and group feature; constructing the optical flow field of the target video training data, and obtaining the optical flow matching distance according to the change degree of the feature sets of corresponding pixel points between each target area on the frame image to be detected and the other frame images in the optical flow field.
It should be noted that, in one embodiment of the present invention, the gray fluctuation degree is obtained by calculating the variance of all gray values in the neighborhood of the pixel point, and the gray level is obtained by calculating the mean of the gray values in the neighborhood. In other embodiments, the range or mean deviation may be calculated to characterize the fluctuation degree, and the gray level may be represented by the mode or count of the pixels' gray values; the specific means are well known to those skilled in the art and will not be described herein.
It should be noted that, in one embodiment of the present invention, the change degree may be analyzed by calculating the Euclidean distance between the feature sets of corresponding pixel points on the frame image to be detected and the subsequent frame image: the greater the Euclidean distance, the greater the change degree. In other embodiments, a DTW distance may be used instead, with a greater DTW distance likewise indicating a greater change degree; the specific means are well known to those skilled in the art and will not be described herein.
In one embodiment of the present invention, the neighborhood range is the 8-neighborhood centered on the pixel point; in other embodiments, the size of the neighborhood range may be set according to the specific situation, which is not limited or described here.
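Putting the above together, a minimal sketch of the feature set and the matching-distance screening might look as follows. The equal weighting of the three features, the Euclidean form of the change degree, and the helper names are assumptions not fixed by the embodiment.

```python
import numpy as np

def feature_set(gray, y, x, hist):
    """Feature set of pixel (y, x): neighborhood gray feature (variance /
    mean over the 8-neighborhood), region position feature (x^2 + y^2),
    and group feature (count of pixels sharing this gray value)."""
    nb = gray[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2].astype(np.float64)
    neighborhood = nb.var() / (nb.mean() + 1e-6)  # fluctuation / gray level
    position = float(x ** 2 + y ** 2)             # sum of squared coordinates
    group = float(hist[gray[y, x]])               # gray-value distribution count
    return np.array([neighborhood, position, group])

def is_matching_optical_flow_point(feat_a, feat_b, threshold=5.0):
    """Optical flow matching distance = change degree of the feature set
    (Euclidean distance assumed); points within the threshold match."""
    return np.linalg.norm(feat_a - feat_b) <= threshold

# hist is the gray-value histogram of the frame, e.g.:
# hist = np.bincount(gray.ravel(), minlength=256)
```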
Step S3: obtaining a matching density coefficient for each target area on each frame image, from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient for each target area on the frame after the frame to be detected, from the change difference of the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it; obtaining a rotation threshold suppression coefficient from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images; and adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images, to obtain an adjusted matching threshold.
The number of matching optical flow points reflects how actively the target area moves between frame images: more optical flow points generally mean the target area undergoes larger displacement or deformation between frames. The relative distance between matching optical flow points reflects their degree of dispersion: if the distances are small, the optical flow points are concentrated in a small region, indicating that the target area moves little or keeps a stable shape between frames, and the density is high. Together these allow a more comprehensive analysis of the target area's motion between frame images; the matching density coefficient of each target area on each frame image is therefore obtained from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image.
Preferably, in one embodiment of the present invention, the method for obtaining the matching density coefficient includes:
calculating the Euclidean distance between the matching optical flow points on each frame image as the relative distance;
obtaining a matching density coefficient according to a calculation formula of the matching density coefficient, wherein the calculation formula of the matching density coefficient is as follows:
$$\rho_n^{\,i}=\frac{1}{K_i}\sum_{a=1}^{K_i}\frac{1}{K_i}\sum_{b=1}^{K_i}G\!\left(\frac{D\!\left(p_a^{\,i},p_b^{\,i}\right)}{h}\right)$$
wherein $\rho_n^{\,i}$ represents the matching density coefficient of the $i$-th target area on the $n$-th frame image; $K_i$ represents the number of matching optical flow points in the $i$-th target area; $p_a^{\,i}$ and $p_b^{\,i}$ represent the $a$-th and $b$-th matching optical flow points of the $i$-th target area on the $n$-th frame image; $D(\cdot,\cdot)$ represents the Euclidean distance; $G(\cdot)$ represents a Gaussian kernel function; $h$ represents the bandwidth parameter in the Gaussian kernel function.
In the formula for the matching density coefficient, $D\!\left(p_a^{\,i},p_b^{\,i}\right)/h$ is the ratio of the Euclidean distance between two matching optical flow points to the bandwidth parameter, and the Gaussian kernel $G$ measures the similarity between the $a$-th and $b$-th matching optical flow points: the smaller the Euclidean distance, the larger the Gaussian kernel value, the greater the similarity, and the larger the matching density coefficient of the target area. The inner average sums the Gaussian kernels between the $a$-th matching optical flow point and all the other matching optical flow points and divides by their number, giving the average similarity of the $a$-th point to all the others; the higher this average similarity across the $i$-th target area, the larger the Gaussian kernel values and the higher the kernel density.
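A direct vectorized reading of this formula, under the reconstruction above (the bandwidth value is an assumption):

```python
import numpy as np

def matching_density_coefficient(points, h=1.0):
    """Average pairwise Gaussian-kernel similarity of one target area's
    matching optical flow points on one frame image; h is the bandwidth."""
    pts = np.asarray(points, dtype=np.float64)  # shape (K, 2)
    diff = pts[:, None, :] - pts[None, :, :]    # pairwise displacements
    dist = np.linalg.norm(diff, axis=-1)        # Euclidean distances D(p_a, p_b)
    kernel = np.exp(-0.5 * (dist / h) ** 2)     # Gaussian kernel G(D / h)
    return kernel.mean()                        # mean over b, then over a
```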
For the same target area, the matching density coefficient normally changes gradually between the frame image to be detected and its adjacent and interval frames, so the density change is relatively gentle. If, however, the density of the corresponding target area drops significantly between adjacent frame images while rising between interval frame images, the reduced density of matching optical flow points, and hence a larger optical flow interference coefficient, is probably caused by illumination changes or image quality problems. The optical flow interference coefficient of each target area on the frame after the frame to be detected is therefore obtained from the change differences of the matching density coefficients between the frame to be detected and the next frame and between the frame to be detected and the second frame after it.
Preferably, in one embodiment of the present invention, the method for acquiring the optical flow interference coefficient includes:
obtaining the density change degree of each target area between two frame images according to the difference of its matching density coefficients between the two frame images;
obtaining the optical flow interference coefficient of each target area on the frame image after the frame to be detected according to the difference between the density change degree of the target area from the frame image to be detected to the second frame image after it and that from the frame image to be detected to the next frame image.
In one embodiment of the invention, the optical flow disturbance coefficient is formulated as:
$$I_i^{\,t+1}=\Delta\rho_{t,t+2}^{\,i}-\Delta\rho_{t,t+1}^{\,i},\qquad \Delta\rho_{n,m}^{\,i}=\left|\rho_m^{\,i}-\rho_n^{\,i}\right|$$
wherein $I_i^{\,t+1}$ represents the optical flow interference coefficient of the $i$-th target area on the frame image $t+1$ after the frame to be detected; $\Delta\rho_{t,t+1}^{\,i}$ represents the density change degree of the $i$-th target area between the frame image to be detected $t$ and the next frame image $t+1$; $\Delta\rho_{t,t+2}^{\,i}$ represents the density change degree of the $i$-th target area between the frame image to be detected $t$ and the second frame image $t+2$ after it; $\Delta\rho_{n,m}^{\,i}$ represents the density change degree of the $i$-th target area between the $n$-th and $m$-th frame images; $\rho_n^{\,i}$ and $\rho_m^{\,i}$ represent the matching density coefficients of the $i$-th target area on the $n$-th and $m$-th frame images.
In the formula for the optical flow interference coefficient, $\Delta\rho_{t,t+2}^{\,i}-\Delta\rho_{t,t+1}^{\,i}$ is the difference between the interval-frame and adjacent-frame density change degrees of each target area: the larger this difference, the lower the density change between the frame to be detected and the next frame and the higher the density change between the frame to be detected and the second frame after it, the more discrete the distribution of the matching optical flow points on the next frame image is likely to be, the more likely an illumination change or image quality problem has occurred, the larger the interference coefficient, and the more optical flow points the target area needs for accurate tracking.
It should be noted that, in other embodiments of the present invention, the density change degree and the optical flow interference coefficient may be constructed by other basic mathematical operations, provided the correlations are preserved: the larger the matching density coefficient of a target area on the $n$-th frame image, the smaller the density change degree, and the larger its matching density coefficient on the $m$-th frame image, the larger the density change degree; likewise, the smaller the adjacent-frame density change degree and the larger the interval-frame density change degree of a target area, the larger the optical flow interference coefficient. A positive correlation means the dependent variable increases as the independent variable increases and decreases as it decreases; the specific relationship may be multiplication, addition, a power of an exponential function, and so on, determined by the practical application. A negative correlation means the dependent variable decreases as the independent variable increases and increases as it decreases; it may be subtraction, division, and so on, likewise determined by the practical application. For example, the normalized difference between the two density change degrees, or their ratio, can be calculated; the specific means are well known to those skilled in the art and will not be described herein.
When a negative correlation is implemented as a ratio, a manually set small constant, for example 0.01, needs to be added to the denominator to avoid the denominator taking the value 0.
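Under the reconstruction above, the computation reduces to a few lines; the absolute-difference form of the density change degree is an assumption consistent with the correlations just described:

```python
def density_change_degree(rho_n, rho_m):
    """Density change degree of a target area between frames n and m,
    from the difference of its matching density coefficients."""
    return abs(rho_m - rho_n)

def optical_flow_interference(rho_t, rho_t1, rho_t2):
    """Interference coefficient on frame t+1: large when the adjacent-frame
    density change is small but the interval-frame change is large."""
    adjacent = density_change_degree(rho_t, rho_t1)  # frame t vs. frame t+1
    interval = density_change_degree(rho_t, rho_t2)  # frame t vs. frame t+2
    return interval - adjacent
```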
Since the optical flow method relies on the similarity between adjacent frames for motion estimation, when the inter-frame correlation changes, the distribution of the optical flow points changes accordingly. Detecting the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images can identify optical-flow periodicity anomalies caused by reduced inter-frame correlation, and helps identify unstable regions or abnormal frames in the video. By adaptively adjusting the rotation threshold suppression coefficient according to the intensity of these distribution-change differences, the optical flow calculation adapts better to changes in the video content; the rotation threshold suppression coefficient is therefore obtained from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images.
Preferably, in one embodiment of the present invention, the method for acquiring the rotation threshold suppression coefficient includes:
taking the relative distances between the matching optical flow points of each target area on each frame image as distribution characteristics;
obtaining the rotation threshold suppression coefficient according to the fluctuation degree of the differences in the distribution characteristics of each target area's matching optical flow points between different frame images.
In one embodiment of the invention, the rotation threshold suppression coefficient is formulated as:
$$\sigma_i^{\,t,t+1}=\mathrm{norm}\!\left(\operatorname{Var}_{a\ne b}\left(r_{a,b}\right)\right),\qquad r_{a,b}=\frac{D\!\left(p_a^{\,t},p_b^{\,t}\right)}{D\!\left(p_a^{\,t+1},p_b^{\,t+1}\right)}$$
wherein $\sigma_i^{\,t,t+1}$ represents the rotation threshold suppression coefficient of the $i$-th target area between the frame image to be detected $t$ and the next frame image $t+1$; $r_{a,b}$ represents the distribution characteristic difference between the $a$-th and $b$-th matching optical flow points of the $i$-th target area; $p_a^{\,t}$ and $p_b^{\,t}$ represent the $a$-th and $b$-th matching optical flow points of the $i$-th target area on the frame image to be detected; $p_a^{\,t+1}$ and $p_b^{\,t+1}$ represent the corresponding matching optical flow points on the next frame image; $\operatorname{Var}$ represents the variance of the distribution characteristic differences over all pairs of matching optical flow points; $D(\cdot,\cdot)$ represents the Euclidean distance; $\mathrm{norm}$ represents a normalization function.
In the formula for the rotation threshold suppression coefficient, $D(p_a^{\,t},p_b^{\,t})$ is the Euclidean distance between the $a$-th and $b$-th matching optical flow points on the frame to be detected, and $D(p_a^{\,t+1},p_b^{\,t+1})$ is the distance between the corresponding points on the next frame; their ratio $r_{a,b}$ is the distribution characteristic difference. The smaller the variance of these ratios, the more consistent the distribution differences of the matching optical flow points and the lower the likelihood of self-rotation; the larger the variance, the more inconsistent the distribution differences, the more likely they are caused by the target's own motion or rotation, and the larger the rotation threshold suppression coefficient.
It should be noted that, in one embodiment of the present invention, the fluctuation degree is characterized by the variance: the larger the variance, the greater the fluctuation. The specific means are well known to those skilled in the art and will not be described herein.
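A sketch under the reconstruction above; the small eps guard (per the earlier note on zero denominators) and the var/(1+var) normalization are assumptions:

```python
import numpy as np

def rotation_threshold_suppression(pts_t, pts_t1, eps=0.01):
    """Normalized variance of the pairwise-distance ratios between the
    matching optical flow points of frames t and t+1 (same point order)."""
    a = np.asarray(pts_t, dtype=np.float64)   # (K, 2) points on frame t
    b = np.asarray(pts_t1, dtype=np.float64)  # (K, 2) corresponding points, t+1
    d_t = np.linalg.norm(a[:, None] - a[None, :], axis=-1)
    d_t1 = np.linalg.norm(b[:, None] - b[None, :], axis=-1)
    iu = np.triu_indices(len(a), k=1)         # each pair once, no self-pairs
    ratios = d_t[iu] / (d_t1[iu] + eps)       # distribution characteristic differences
    var = ratios.var()
    return var / (1.0 + var)                  # assumed normalization to (0, 1)
```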
During target tracking, the target may undergo dynamic changes such as scale change and rotation. The optical flow interference coefficient reflects the scale change of the target area between adjacent frames; when the target area changes scale significantly, the originally set matching threshold may no longer apply and needs to be adjusted on the basis of the interference coefficient. The rotation threshold suppression coefficient measures the rotation of the target area between adjacent frames: when the target rotates, the position distribution of the matching optical flow points changes, which can cause mismatches or missed matches, and introducing the suppression coefficient allows the matching threshold to be adjusted according to the rotation, reducing mismatches and missed matches and improving tracking accuracy. As the target moves through the scene, its position in consecutive frame images changes; analyzing the position distribution of the matching optical flow points adjusts the matching threshold more precisely to the target's actual motion. The preset matching threshold is therefore adjusted according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images, yielding the adjusted matching threshold.
Preferably, in one embodiment of the present invention, the method for acquiring the adjusted matching threshold includes:
obtaining an area expansion coefficient according to the position distribution of the matching optical flow points of each target area between adjacent frame images;
obtaining the adjusted matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, the area expansion coefficient, and the preset matching threshold. The optical flow interference coefficient, the area expansion coefficient, and the preset matching threshold are positively correlated with the adjusted matching threshold, and the rotation threshold suppression coefficient is negatively correlated with it. A positive correlation means the dependent variable increases as the independent variable increases and decreases as it decreases; the specific relationship may be multiplication, addition, a power of an exponential function, and so on, determined by the practical application. A negative correlation means the dependent variable decreases as the independent variable increases and increases as it decreases; it may be subtraction, division, and so on, likewise determined by the practical application.
When the target moves through the image sequence, the corresponding pixel positions change; analyzing the position distribution of the matching optical flow points between adjacent frame images allows the target's size change between frames to be estimated more accurately and its position information to be updated. The area expansion coefficient is therefore obtained from the position distribution of each target area's matching optical flow points between adjacent frame images.
Preferably, in one embodiment of the present invention, the method for acquiring the area expansion coefficient includes:
constructing the minimum circumscribed rectangle of each target area's matching optical flow points on the frame image to be detected as the first target area; constructing the minimum circumscribed rectangle of each target area's matching optical flow points on the next frame image as the second target area;
multiplying the length and width of the first target area by the density change degree, respectively, to obtain the length and width of a predicted target area; constructing the minimum circumscribed rectangle of the second target area and the predicted target area together as the overall target area;
calculating the ratio of the difference between the areas of the overall target area and the second target area to the area of the second target area, and adding a preset constant to this ratio to obtain the area expansion coefficient.
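A sketch of these three steps; anchoring the predicted rectangle at the first target area's top-left corner and the preset constant of 1.0 are assumptions the embodiment does not fix:

```python
import numpy as np

def min_rect(points):
    """Minimum axis-aligned circumscribed rectangle (x1, y1, x2, y2)."""
    pts = np.asarray(points, dtype=np.float64)
    return pts[:, 0].min(), pts[:, 1].min(), pts[:, 0].max(), pts[:, 1].max()

def area_expansion_coefficient(pts_t, pts_t1, density_change, const=1.0):
    bx1, by1, bx2, by2 = min_rect(pts_t)   # first target area (frame t)
    cx1, cy1, cx2, cy2 = min_rect(pts_t1)  # second target area (frame t+1)
    # predicted target area: first rectangle scaled by the density change degree
    px2 = bx1 + (bx2 - bx1) * density_change
    py2 = by1 + (by2 - by1) * density_change
    # overall target area D: minimum rectangle enclosing C and the prediction
    dx1, dy1 = min(cx1, bx1), min(cy1, by1)
    dx2, dy2 = max(cx2, px2), max(cy2, py2)
    s_c = (cx2 - cx1) * (cy2 - cy1)        # area of second target area C
    s_d = (dx2 - dx1) * (dy2 - dy1)        # area of overall target area D
    return const + (s_d - s_c) / s_c
```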
In one embodiment of the invention, the formula for adjusting the match threshold is expressed as:
$$T_i'=\frac{I_i^{\,t+1}}{\sigma_i^{\,t,t+1}}\times\left(1+\frac{S_D-S_C}{S_C}\right)\times T_0$$
wherein $T_i'$ represents the adjusted matching threshold of the $i$-th target area between the frame image to be detected $t$ and the next frame image $t+1$; $1+\frac{S_D-S_C}{S_C}$ is the area expansion coefficient; $S_D$ represents the area of the overall target area $D$; $S_C$ represents the area of the minimum circumscribed rectangle $C$ of the matching optical flow points on the next frame image, i.e., the area of the second target area; $\sigma_i^{\,t,t+1}$ represents the rotation threshold suppression coefficient of the $i$-th target area between the frame to be detected and the next frame; $I_i^{\,t+1}$ represents the optical flow interference coefficient of the $i$-th target area on the next frame image; $T_0$ represents the preset matching threshold.
In the formula for the adjusted matching threshold, $\frac{I_i^{\,t+1}}{\sigma_i^{\,t,t+1}}$ is the ratio of the optical flow interference coefficient of the $i$-th target area on the next frame image to its rotation threshold suppression coefficient between the two frames. The larger the optical flow interference coefficient, the higher the threshold needs to be adjusted to obtain richer matching optical flow points; the larger the rotation threshold suppression coefficient, the more the threshold needs to be reduced to suppress mismatches between the target's optical flow points and the surrounding environment. Therefore, the larger the interference coefficient and the smaller the suppression coefficient, the larger this ratio and the larger the corresponding matching threshold. $\frac{S_D-S_C}{S_C}$ is the ratio of the area of the overall target area $D$ minus the area of the minimum circumscribed rectangle $C$ of the matching optical flow points on the next frame to the area of $C$, i.e., the ratio of the region on the target not yet covered by optical flow matching to the region already covered; adding 1 gives the area expansion coefficient. The smaller this ratio, the larger the existing optical flow matching region, the smaller the area expansion coefficient, and the less the matching threshold needs to be increased; the larger the ratio, the smaller the existing matching region, the larger the area expansion coefficient, and the more the matching threshold needs to be increased.
It should be noted that, in other embodiments of the present invention, the positive correlations may be constructed by other basic mathematical operations such as addition; the specific means are well known to those skilled in the art and will not be described herein.
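Combining the three coefficients with the preset threshold, under the multiplicative reconstruction above (the eps guard follows the earlier note on zero denominators):

```python
def adjusted_matching_threshold(interference, suppression, expansion,
                                preset=5.0, eps=0.01):
    """Adjusted threshold: rises with the optical flow interference and area
    expansion coefficients, falls with the rotation suppression coefficient."""
    return (interference / (suppression + eps)) * expansion * preset
```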
Step S4: screening the matching optical flow points against the adjusted matching threshold to obtain the new target area in the next frame image; and, based on the recurrent neural network, taking the next frame image as the new frame image to be detected, thereby realizing continuous tracking of the target.
Screening the matching optical flow points with the adjusted matching threshold eliminates optical flow points that deviate greatly from the expected motion trajectory, improving the accuracy of target area extraction, reducing mismatches and false detections, and extracting the target area in the next frame image more accurately. The matching optical flow points are therefore screened against the adjusted matching threshold to acquire the new target area in the next frame image.
It should be noted that, in one embodiment of the present invention, the pixel points whose optical flow matching distance between each target area on the frame image to be detected and the other frame images is smaller than or equal to the adjusted matching threshold are the adjusted matching optical flow points; the minimum circumscribed rectangle of the adjusted matching optical flow points is then constructed on the next frame image as the new target area in that image.
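A sketch of this screening step, reusing the names introduced above (all illustrative assumptions):

```python
import numpy as np

def new_target_area(points_t1, match_distances, threshold):
    """Keep points whose optical flow matching distance is within the adjusted
    threshold and circumscribe them to get the new target area on frame t+1."""
    pts = np.asarray(points_t1, dtype=np.float64)
    kept = pts[np.asarray(match_distances) <= threshold]  # adjusted matching points
    x1, y1 = kept[:, 0].min(), kept[:, 1].min()
    x2, y2 = kept[:, 0].max(), kept[:, 1].max()
    return x1, y1, x2, y2  # minimum circumscribed rectangle = new target area
```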
The recurrent neural network introduces the concept of time into the neural network, enabling it to process sequence data. It has memory, retaining and transmitting previous information; by learning target features in historical frames it can adapt to changes and track the target more accurately.
It should be noted that, the specific recurrent neural network is a technical means well known to those skilled in the art, and will not be described herein.
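Purely as an illustration of the idea above, the following PyTorch sketch feeds the tracked box history through an RNN to predict the target's position in the new frame; the architecture, the box-sequence formulation, and all names are assumptions, since the embodiment does not fix the network.

```python
import torch
import torch.nn as nn

class BoxRNN(nn.Module):
    """Predict the next bounding box (x1, y1, x2, y2) from the tracking history."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.RNN(input_size=4, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 4)

    def forward(self, box_history):        # (batch, time, 4)
        out, _ = self.rnn(box_history)     # hidden states over the sequence
        return self.head(out[:, -1])       # box predicted for the next frame

# usage: predicted = BoxRNN()(torch.randn(1, 10, 4))  # 10 past boxes -> next box
```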
In summary, matching optical flow points are screened out by combining optical flow analysis with the gray distribution and position characteristics of pixel points between each target area on the frame image to be detected and the other frame images; a matching density coefficient is obtained for each target area on each frame image from the number of matching optical flow points of each target area between frame images and the relative distances between the matching optical flow points on each frame image; from this, the optical flow interference coefficient of each target area on the frame after the frame to be detected is obtained; a rotation threshold suppression coefficient is obtained from the fluctuation of the distribution-change differences of the matching optical flow points between adjacent frame images; the adjusted matching threshold is obtained by further combining the position distribution of each target area's matching optical flow points between adjacent frame images; the matching optical flow points are screened against it to obtain the new target area in the next frame image; and, based on the recurrent neural network, the next frame image is taken as the new frame image to be detected, realizing continuous tracking of the target. By adaptively deriving an accurate matching threshold for the optical flow points, the invention improves the accuracy of target positioning and tracking.
Based on the same application conception as the target tracking method based on the cyclic neural network provided by the embodiment of the application, the embodiment also provides a target tracking system based on the cyclic neural network, as shown in fig. 2, the system comprises: a data acquisition module 201, an optical flow matching module 202, a threshold adjustment module 203, and a target tracking module 204:
The data acquisition module 201: acquiring target video training data containing a moving target;
The optical flow matching module 202: taking an initial frame image of the target video training data as a frame image to be detected, and acquiring the image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by optical flow analysis according to the gray distribution and position features of the pixel points between each target area on the frame image to be detected and the subsequent frame images;
The threshold adjustment module 203: obtaining a matching density coefficient of each target area on each frame image between the frame images according to the number of matching optical flow points of each target area between the frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient of each target area on the next frame image of the frame to be detected according to the difference in the variation of the matching density coefficients between the frame image to be detected and its next frame image and between the two subsequent frame images; obtaining a rotation threshold suppression coefficient according to the fluctuation of the differences in the distribution of the matching optical flow points between adjacent frame images; adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images to obtain an adjusted matching threshold;
The target tracking module 204: screening the matching optical flow points according to the adjusted matching threshold to obtain the target area in the next frame image; based on the recurrent neural network, taking the next frame image as a new frame image to be detected to realize continuous tracking of the target.
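The module division above can be pictured as the following structural sketch in Python; the patent describes the modules only functionally, so every class and method name here is an illustrative assumption rather than a disclosed API.

```python
class DataAcquisitionModule:            # 201
    def acquire(self, source):          # -> target video training data
        ...

class OpticalFlowMatchingModule:        # 202
    def match(self, frame_to_detect, later_frames):  # -> matching optical flow points
        ...

class ThresholdAdjustmentModule:        # 203
    def adjust(self, matches, preset_threshold):     # -> adjusted matching threshold
        ...

class TargetTrackingModule:             # 204
    def track(self, matches, adjusted_threshold):    # -> new target area, next frame
        ...
```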
It should be noted that the system provided in the foregoing embodiment is illustrated only by the above division of functional modules. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the computer device may be divided into different functional modules to perform all or part of the functions described above.
It should be understood that the system provided in this embodiment is used to perform the above target tracking method based on the recurrent neural network, and therefore has the same beneficial effects as that method.
The invention also provides an electronic device. As shown in fig. 3, the device comprises a memory 301, a processor 302, and a computer program 303 stored in the memory and executable on the processor; when executing the computer program 303, the processor 302 implements the steps of any one of the above target tracking methods based on the recurrent neural network.
It should be noted that the sequence of the above embodiments of the present invention is for description only and does not represent the relative merits of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible and may be advantageous.
In this specification, the embodiments are described in a progressive manner; identical and similar parts of the embodiments may be referred to one another, and each embodiment focuses on its differences from the other embodiments.

Claims (6)

1. A method for target tracking based on a recurrent neural network, the method comprising:
Acquiring target video training data containing a moving target;
taking an initial frame image of the target video training data as a frame image to be detected, and acquiring an image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by optical flow analysis according to the gray distribution and position features of the pixel points between each target area on the frame image to be detected and the subsequent frame images;
obtaining a matching density coefficient of each target area on each frame image between the frame images according to the number of matching optical flow points of each target area between the frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient of each target area on the next frame image of the frame to be detected according to the difference in the variation of the matching density coefficients between the frame image to be detected and its next frame image and between the two subsequent frame images; obtaining a rotation threshold suppression coefficient according to the fluctuation of the differences in the distribution of the matching optical flow points between adjacent frame images; adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images to obtain an adjusted matching threshold;
screening the matching optical flow points according to the adjusted matching threshold to obtain the target area in the next frame image; based on the recurrent neural network, taking the next frame image as a new frame image to be detected to realize continuous tracking of the target;
the method for acquiring the matching density coefficient comprises the following steps:
calculating the Euclidean distances between the matching optical flow points on each frame image as relative distances;
obtaining the matching density coefficient according to the following calculation formula:

$$\rho_i^t=\frac{1}{N_i}\sum_{n=1}^{N_i}\sum_{\substack{m=1 \\ m\neq n}}^{N_i} K\!\left(\frac{\left\|p_{i,n}^t-p_{i,m}^t\right\|_2}{h}\right)$$

wherein $\rho_i^t$ represents the matching density coefficient of the i-th target area on the t-th frame image between frame images; $N_i$ represents the number of matching optical flow points in the i-th target area between frame images; $p_{i,n}^t$ and $p_{i,m}^t$ represent the n-th and m-th matching optical flow points of the i-th target area on the t-th frame image; $\left\|\cdot\right\|_2$ denotes the Euclidean distance; $K(\cdot)$ represents a Gaussian kernel function; and $h$ represents the bandwidth parameter of the Gaussian kernel function;
the method for acquiring the optical flow interference coefficient comprises the following steps:
obtaining the density change degree of each target area between two frame images according to the difference between the matching density coefficients of the target area on the two frame images;
obtaining the optical flow interference coefficient of each target area on the next frame image of the frame to be detected according to the difference between the density change degree of each target area from the frame image to be detected to its next frame image and the density change degree between the two subsequent frame images;
the method for acquiring the rotation threshold suppression coefficient comprises the following steps:
taking the relative distances between the matching optical flow points of each target area on each frame image as a distribution feature;
obtaining the rotation threshold suppression coefficient according to the degree of fluctuation of the differences in the distribution features of the matching optical flow points of each target area between different frame images;
The acquisition method for adjusting the matching threshold value comprises the following steps:
obtaining an area expansion coefficient according to the position distribution of the matching optical flow points of each target area between adjacent frame images;
obtaining the adjusted matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, the area expansion coefficient, and a preset matching threshold; the optical flow interference coefficient, the area expansion coefficient, and the preset matching threshold are positively correlated with the adjusted matching threshold; the rotation threshold suppression coefficient is negatively correlated with the adjusted matching threshold.
2. The target tracking method based on the recurrent neural network as claimed in claim 1, wherein the method for acquiring the matching optical flow points comprises:
obtaining an optical flow matching distance according to the gray distribution and position features of the pixel points between each target area on the frame image to be detected and the other frame images;
if the optical flow matching distance between the pixel points of each target area on the frame image to be detected and those of the other frame images is smaller than or equal to a preset matching threshold, taking the corresponding pixel points as matching optical flow points.
3. The method for tracking the target based on the recurrent neural network according to claim 2, wherein the method for acquiring the optical flow matching distance comprises the following steps:
for each frame image, obtaining the neighborhood gray feature of each pixel point according to the ratio of the gray fluctuation degree to the gray level of the pixel points in the neighborhood of each pixel point;
calculating the sum of the squares of the position coordinates of each pixel point as a regional position feature; obtaining the group feature of each pixel point according to the distribution number of the gray value corresponding to each pixel point;
obtaining a feature set of each pixel point according to its neighborhood gray feature, regional position feature, and group feature;
constructing an optical flow field of the target video training data, and obtaining the optical flow matching distance according to the degree of change of the feature sets of corresponding pixel points between each target area on the frame image to be detected and the other frame images in the optical flow field.
4. The target tracking method based on the recurrent neural network as claimed in claim 1, wherein the area expansion coefficient obtaining method comprises:
constructing the minimum circumscribed rectangle of the matching optical flow points of each target area on the frame image to be detected as a first target area; constructing the minimum circumscribed rectangle of the matching optical flow points of each target area on the next frame image as a second target area;
multiplying the length and the width of the first target area respectively by the density change degree to obtain the length and the width of a predicted target area; constructing the minimum circumscribed rectangle of the second target area and the predicted target area as a whole target area;
calculating the ratio of the difference between the whole target area and the second target area to the second target area, and adding a preset constant to the ratio to obtain the area expansion coefficient.
5. A recurrent neural network-based target tracking system, the system comprising:
a data acquisition module: acquiring target video training data containing a moving target;
an optical flow matching module: taking an initial frame image of the target video training data as a frame image to be detected, and acquiring an image area of the moving target in the frame image to be detected as a target area; screening out matching optical flow points by optical flow analysis according to the gray distribution and position features of the pixel points between each target area on the frame image to be detected and the subsequent frame images;
a threshold adjustment module: obtaining a matching density coefficient of each target area on each frame image between the frame images according to the number of matching optical flow points of each target area between the frame images and the relative distances between the matching optical flow points on each frame image; obtaining an optical flow interference coefficient of each target area on the next frame image of the frame to be detected according to the difference in the variation of the matching density coefficients between the frame image to be detected and its next frame image and between the two subsequent frame images; obtaining a rotation threshold suppression coefficient according to the fluctuation of the differences in the distribution of the matching optical flow points between adjacent frame images; adjusting a preset matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, and the position distribution of the matching optical flow points of each target area between adjacent frame images to obtain an adjusted matching threshold;
the method for acquiring the matching density coefficient comprises the following steps:
calculating the Euclidean distances between the matching optical flow points on each frame image as relative distances;
obtaining the matching density coefficient according to the following calculation formula:

$$\rho_i^t=\frac{1}{N_i}\sum_{n=1}^{N_i}\sum_{\substack{m=1 \\ m\neq n}}^{N_i} K\!\left(\frac{\left\|p_{i,n}^t-p_{i,m}^t\right\|_2}{h}\right)$$

wherein $\rho_i^t$ represents the matching density coefficient of the i-th target area on the t-th frame image between frame images; $N_i$ represents the number of matching optical flow points in the i-th target area between frame images; $p_{i,n}^t$ and $p_{i,m}^t$ represent the n-th and m-th matching optical flow points of the i-th target area on the t-th frame image; $\left\|\cdot\right\|_2$ denotes the Euclidean distance; $K(\cdot)$ represents a Gaussian kernel function; and $h$ represents the bandwidth parameter of the Gaussian kernel function;
the method for acquiring the optical flow interference coefficient comprises the following steps:
obtaining the density change degree of each target area between two frame images according to the difference between the matching density coefficients of the target area on the two frame images;
obtaining the optical flow interference coefficient of each target area on the next frame image of the frame to be detected according to the difference between the density change degree of each target area from the frame image to be detected to its next frame image and the density change degree between the two subsequent frame images;
the method for acquiring the rotation threshold suppression coefficient comprises the following steps:
taking the relative distances between the matching optical flow points of each target area on each frame image as a distribution feature;
obtaining the rotation threshold suppression coefficient according to the degree of fluctuation of the differences in the distribution features of the matching optical flow points of each target area between different frame images;
The acquisition method for adjusting the matching threshold value comprises the following steps:
obtaining an area expansion coefficient according to the position distribution of the matching optical flow points of each target area between adjacent frame images;
obtaining the adjusted matching threshold according to the optical flow interference coefficient, the rotation threshold suppression coefficient, the area expansion coefficient, and a preset matching threshold; the optical flow interference coefficient, the area expansion coefficient, and the preset matching threshold are positively correlated with the adjusted matching threshold; the rotation threshold suppression coefficient is negatively correlated with the adjusted matching threshold;
a target tracking module: screening the matching optical flow points according to the adjusted matching threshold to obtain the target area in the next frame image; based on the recurrent neural network, taking the next frame image as a new frame image to be detected to realize continuous tracking of the target.
6. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the recurrent neural network-based target tracking method as claimed in any one of claims 1-4 when executing the computer program.
CN202410815952.2A 2024-06-24 2024-06-24 Target tracking method and system based on cyclic neural network and electronic equipment Active CN118397047B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410815952.2A CN118397047B (en) 2024-06-24 2024-06-24 Target tracking method and system based on cyclic neural network and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410815952.2A CN118397047B (en) 2024-06-24 2024-06-24 Target tracking method and system based on cyclic neural network and electronic equipment

Publications (2)

Publication Number Publication Date
CN118397047A CN118397047A (en) 2024-07-26
CN118397047B true CN118397047B (en) 2024-09-13

Family

ID=91988177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410815952.2A Active CN118397047B (en) 2024-06-24 2024-06-24 Target tracking method and system based on cyclic neural network and electronic equipment

Country Status (1)

Country Link
CN (1) CN118397047B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118627862A (en) * 2024-08-12 2024-09-10 浙江省城乡规划设计研究院 Old cell update-oriented environmental data acquisition and update space analysis method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555868A (en) * 2019-05-31 2019-12-10 南京航空航天大学 method for detecting small moving target under complex ground background
CN114782499A (en) * 2022-04-28 2022-07-22 杭州电子科技大学 Image static area extraction method and device based on optical flow and view geometric constraint

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104376576B (en) * 2014-09-04 2018-06-05 华为技术有限公司 A kind of method for tracking target and device
CN114842055A (en) * 2022-04-20 2022-08-02 拓元(广州)智慧科技有限公司 Container commodity tracking method based on optical flow
CN117882109A (en) * 2022-07-01 2024-04-12 京东方科技集团股份有限公司 Target tracking method, target tracking system and electronic equipment

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110555868A (en) * 2019-05-31 2019-12-10 南京航空航天大学 method for detecting small moving target under complex ground background
CN114782499A (en) * 2022-04-28 2022-07-22 杭州电子科技大学 Image static area extraction method and device based on optical flow and view geometric constraint

Also Published As

Publication number Publication date
CN118397047A (en) 2024-07-26

Similar Documents

Publication Publication Date Title
CN118397047B (en) Target tracking method and system based on cyclic neural network and electronic equipment
CN110728697B (en) Infrared dim target detection tracking method based on convolutional neural network
US5018215A (en) Knowledge and model based adaptive signal processor
US7982774B2 (en) Image processing apparatus and image processing method
CN110120064B (en) Depth-related target tracking algorithm based on mutual reinforcement and multi-attention mechanism learning
CN108230352B (en) Target object detection method and device and electronic equipment
CN111582349B (en) Improved target tracking algorithm based on YOLOv3 and kernel correlation filtering
CN110909712B (en) Moving object detection method and device, electronic equipment and storage medium
CN113723190A (en) Multi-target tracking method for synchronous moving target
CN111008991B (en) Background-aware related filtering target tracking method
CN111429485B (en) Cross-modal filtering tracking method based on self-adaptive regularization and high-reliability updating
CN112906685A (en) Target detection method and device, electronic equipment and storage medium
CN114708300A (en) Anti-blocking self-adaptive target tracking method and system
CN118115502B (en) Sewage discharge monitoring method and system based on image features
CN117218161B (en) Fish track tracking method and system in fish tank
US8072612B2 (en) Method and apparatus for detecting a feature of an input pattern using a plurality of feature detectors, each of which corresponds to a respective specific variation type and outputs a higher value when variation received by the input pattern roughly matches the respective specific variation type
CN111681266A (en) Ship tracking method, system, equipment and storage medium
CN115861386A (en) Unmanned aerial vehicle multi-target tracking method and device through divide-and-conquer association
CN115797770A (en) Continuous image target detection method, system and terminal considering relative movement of target
JP2024516642A (en) Behavior detection method, electronic device and computer-readable storage medium
CN114764820A (en) Infrared dim target detection and tracking method and system based on contrast
Zhu et al. Visual tracking with dynamic model update and results fusion
CN108346158B (en) Multi-target tracking method and system based on main block data association
Cai et al. A target tracking method based on adaptive occlusion judgment and model updating strategy
JP2002163657A (en) Device and method for recognizing image and recording medium storing program therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant