CN112580679A - Infrared target tracking method, computer device and computer readable storage medium - Google Patents
- Publication number
- CN112580679A (application CN201910941612.3A)
- Authority
- CN
- China
- Prior art keywords
- target
- image
- infrared
- tracking method
- target tracking
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06F18/24155—Bayesian classification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10048—Infrared image
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Probability & Statistics with Applications (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Abstract
The invention belongs to the technical field of image processing, and in particular relates to an infrared target tracking method based on improved particle filtering, together with a computer device and a computer-readable storage medium. The method comprises the following steps: S1, acquiring positive and negative samples from the first frame of the infrared image; S2, constructing a sparse-representation-based generative model (SGM); S3, constructing a guided-filtering-based discriminative classifier (GFDC); S4, performing target prediction using particle filtering; S5, performing weight assignment using the GFDC-SGM method; and S6, updating the parameters online. By combining a generative model and a discriminative model on top of sparse representation and particle filtering, the invention makes full use of the target's own features while incorporating background information, achieving accurate and efficient tracking of infrared target image sequences.
Description
Technical Field
The invention belongs to the technical field of image processing, and in particular relates to an infrared target tracking method based on improved particle filtering, together with a computer device and a computer-readable storage medium.
Background
Infrared target tracking systems offer strong concealment and good real-time performance, and are widely used in infrared telemetry, reconnaissance, navigation, guidance, early warning and related fields. Compared with visible-light detectors, infrared detectors can work around the clock, adapt well to the environment, and have a long operating range. However, when the scene suffers from strong radiation changes, detector motion, target occlusion and similar problems, the target's appearance in the image changes after imaging; when the background is complex and contains several suspected targets, the performance of the tracking algorithm degrades; and small targets are easily buried in background noise, causing the tracker to "drift".
A particle filter operates in two broad steps. The first is prediction: using a given system model, the filter estimates the current state from the state at the previous time. The second is updating: particles with low weights are eliminated, and particles with higher weights spawn more particles, so the algorithm converges toward regions of high weight. In practice the procedure has several stages. First comes initialization: before tracking with a particle filter, the target object must be selected, either by manual delineation or by automatic detection. Next, in the state-transition stage, the posterior probability density under the current observation is estimated from the prior probability at the previous time, yielding a prediction of the target position. Finally, in the weight-computation and resampling stage, low-weight particles are discarded and high-weight ones are replicated; this is the posterior-update process. The transition stage moves the particles from the previous frame to new positions in the current frame, but not all particles are useful; some do not lie where the tracked region has moved. Each particle is therefore scored, i.e., assigned a weight: low-scoring particles are deleted, and high-scoring ones generate more particles, which is the resampling process. Weight computation is thus central to particle filtering, and better weight assignment makes the tracking algorithm more accurate.
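The predict/weight/resample cycle described above can be sketched as a generic bootstrap particle filter. This is a minimal illustration, not the patent's exact model: the random-walk motion model and the Gaussian-likelihood `observe` callable used here are assumptions.

```python
import numpy as np

def particle_filter_step(particles, weights, observe, motion_std=5.0, rng=None):
    """One predict/update/resample cycle of a bootstrap particle filter.

    particles : (N, 2) array of candidate target positions
    weights   : (N,) weights from the previous step (reset after resampling)
    observe   : callable mapping a position to an observation likelihood
    """
    rng = np.random.default_rng() if rng is None else rng
    N = len(particles)
    # Prediction: propagate each particle with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=particles.shape)
    # Update: score every particle against the current observation.
    weights = np.array([observe(p) for p in particles], dtype=float)
    weights = weights / weights.sum()
    # State estimate: weighted mean of the particle cloud.
    estimate = weights @ particles
    # Resampling: duplicate high-weight particles, discard low-weight ones.
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)
    return particles, weights, estimate
```

Iterating this step on successive frames makes the particle cloud, and hence the estimate, converge toward the region of highest observation likelihood.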
Target tracking methods divide into generative and discriminative approaches. A generative tracker first extracts target features and learns an appearance model that represents the target; it then matches the target by searching image regions, and the region that best matches the model is taken as the target. Well-known examples include Kalman filtering and mean-shift. A discriminative tracker instead separates the target from the background: in the current frame the target region is taken as a positive sample and the background region as negative samples, a classifier is trained to distinguish them, and in the next frame the trained classifier locates the optimal region, i.e., the target. Common examples include correlation filtering and deep-learning methods. Generative methods focus on extracting the features of the target region but make little use of background information, while discriminative methods extract the target from the background but make little use of the target's own features.
Disclosure of Invention
In order to overcome the shortcomings of these two families of target tracking methods, an improved particle-filter-based target tracking method is provided.
In order to solve the above technical problems, the invention adopts the following technical scheme. The infrared target tracking method comprises the following steps:
S1, acquiring positive and negative samples from the first frame of the infrared image;
S2, constructing a sparse-representation-based generative model (SGM);
S3, constructing a guided-filtering-based discriminative classifier (GFDC);
S4, performing target prediction using particle filtering;
S5, performing weight assignment using the GFDC-SGM method;
and S6, updating the GFDC and SGM parameters online.
Further, the step S1 specifically includes:
S11, inputting the first frame image of the image sequence to be tracked;
S12, framing the target area to be tracked in the first frame image, and taking the centroid of the framed target area as the initial position of the target to be tracked, i.e., the positive sample;
and S13, framing a background area around the target to be tracked in the first frame image, and taking the framed background area as the negative sample.
Further, the step S2 includes:
S21, converting each positive sample block into a vector to obtain the sparse coefficient vector of each positive sample block;
S22, concatenating the sparse coefficient vectors of the positive samples to form a histogram ρ;
S23, obtaining a weighted histogram from ρ and o;
and S24, calculating the similarity of the sparsity histograms.
As a modification, the step S3 further includes:
S31, obtaining the guided-filtered sample images;
and S32, performing naive Bayes classification on the sample images.
Specifically, the step S4 includes: in the state-transition stage of particle filtering, estimating the posterior probability density of the target in the current frame image from the prior probability of the target in the previous frame image, thereby completing the target position estimation.
As a modification, the step S5 further includes:
in the target prediction process of step S4, constructing an observation model and optimizing the probability densities of the N1 particles to obtain the optimal target state.
As a further improvement, the step S6 specifically includes
S61, GFDC online updating:
in order to distinguish between positive and negative sample templates, the GFDC model is updated online as follows:
wherein λ2 > 0 is the update rate, and the negative-sample template parameters μi^0 and σi^0 are updated with similar rules;
s62, SGM online updating:
in the SGM model, the generative dictionary D is constant within the same sequence; in order to adapt to appearance changes, the histogram ψt in the current frame is updated as follows:
ψt = μψ1 + (1 − μ)ψt−1
where μ is the learning factor, and ψ1 and ψt−1 denote the histograms stored in the first frame and the previous frame, respectively.
Further, after the step S6 the method further comprises:
S7, judging whether the current frame image is the last frame of the image sequence to be tracked; if yes, executing step S8; otherwise, returning to step S4;
and S8, completing the tracking of the infrared target in the image sequence to be tracked.
A computer device having a processor and a memory, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the infrared target tracking method of any of the above.
A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, causes the processor to carry out the steps of the infrared target tracking method of any of the above.
By combining a generative model and a discriminative model on top of sparse representation and particle filtering, the invention makes full use of the target's own features while incorporating background information, achieving accurate and efficient tracking of infrared target image sequences.
Drawings
FIG. 1 is a schematic flow chart of an infrared target tracking method of the present invention;
FIG. 2 is a schematic diagram of extracting positive samples and negative samples of a first frame of image according to an infrared target tracking method of the present invention;
FIG. 3 is a guiding image of GF filtering according to an infrared target tracking method of the present invention;
FIG. 4 is a target image after GF filtering according to an infrared target tracking method of the present invention;
FIG. 5 is an image of a GF-filtered target and a suspected target according to an infrared target tracking method of the present invention;
FIG. 6 is a background image after GF filtering according to an infrared target tracking method of the present invention;
FIG. 7 is a comparison of the tracking results at frame 30 between an infrared target tracking method of the present invention and several other tracking methods;
FIG. 8 is a comparison of the tracking results at frame 130 between an infrared target tracking method of the present invention and several other tracking methods;
FIG. 9 is a comparison of the tracking results at frame 170 between an infrared target tracking method of the present invention and several other tracking methods;
FIG. 10 is a graph illustrating the tracking accuracy curves of an infrared target tracking method according to the present invention compared with other tracking methods;
FIG. 11 is a graph showing the tracking success rate curve of the infrared target tracking method of the present invention compared with other tracking methods.
Detailed Description
The following describes an infrared target tracking method, a computer device and a computer readable storage medium based on improved particle filtering according to the present invention with reference to fig. 1 to 11.
As shown in fig. 1, an infrared target tracking method is provided, which includes the following steps:
s1, acquiring a positive sample and a negative sample of the first frame of infrared image;
the step S1 specifically includes
S11, inputting a first frame image in the image sequence to be tracked;
S12, framing the target area to be tracked in the first frame image, and taking the centroid of the framed target area as the initial position of the target to be tracked, i.e., the positive sample;
and S13, framing a background area around the target to be tracked in the first frame image, and taking the framed background area as the negative sample.
S2, constructing the sparse-representation-based generative model (SGM), which first obtains M sample blocks and converts each of them into a vector yi;
The step S2 includes
S21, converting each positive sample block into a vector to obtain the sparse coefficient vector of each positive sample block; as a preferred embodiment, this step converts each positive sample block into a vector yi ∈ R^(A×1), where A is the number of rows, and obtains the sparse coefficient vector of each positive sample block according to the following formula (1):
where D ∈ R^(A×J) is the dictionary generated from the labeled positive target samples in the first frame, J is the number of columns, βi ∈ R^(J×1) is the sparse coefficient vector, and λ1 is a constraint factor;
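Formula (1) itself is not reproduced in this text (it appears as an image in the original patent). Given the symbols defined above, it is presumably the standard ℓ1-regularized sparse-coding objective with a non-negativity constraint, reconstructed here purely under that assumption:

```latex
\beta_i = \arg\min_{\beta \succeq 0} \; \left\lVert y_i - D\beta \right\rVert_2^2 + \lambda_1 \left\lVert \beta \right\rVert_1
```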
S22, concatenating the sparse coefficient vectors of the positive samples to form the histogram ρ; as a preferred embodiment, this step concatenates the sparse coefficient vectors βi ∈ R^(J×1) of the positive samples according to the following formula (2) to compose the histogram ρ:
ρ = [β1; β2; ...; βM]    (2)
Blocks with large reconstruction errors are considered occluded, and their corresponding sparse coefficient vectors are set to 0.
S23, obtaining the weighted histogram from ρ and o; as a preferred embodiment, o in this step is given by the following formula (3):
where εi denotes the reconstruction error of block yi, and ε0 is a predetermined threshold obtained through extensive experiments.
S24, calculating the similarity of the sparsity histograms; as a preferred embodiment, this step calculates the similarity of the sparsity histograms according to the following formula (4):
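Steps S22 to S24 can be sketched as follows. The occlusion indicator follows formula (3) as described above; the histogram-intersection similarity is an assumed choice, since formula (4) is not reproduced in this text.

```python
import numpy as np

def weighted_sparse_histogram(betas, errors, eps0):
    """S22-S23: concatenate the per-block sparse coefficient vectors into a
    histogram and zero out occluded blocks whose reconstruction error
    exceeds the threshold eps0 (the occlusion indicator o of formula (3)).

    betas  : list of (J,) sparse coefficient vectors, one per positive-sample block
    errors : per-block reconstruction errors eps_i
    eps0   : occlusion threshold (chosen experimentally in the patent)
    """
    o = (np.asarray(errors, dtype=float) < eps0).astype(float)
    return np.concatenate([oi * b for oi, b in zip(o, betas)])

def histogram_similarity(psi, phi):
    """S24: similarity of two sparsity histograms; histogram intersection
    is an assumed stand-in for the unreproduced formula (4)."""
    return float(np.minimum(psi, phi).sum())
```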
S3, constructing a guide filtering-based discriminant classifier (GFDC);
the step S3 further comprises
S31, obtaining the guided-filtered sample images; as a preferred embodiment, this step obtains the guided-filtered sample image according to the following formula (5):
where p is the filtered output image, G is the input guide image, ωk denotes a window, i denotes a pixel in the window, and ak and bk are linear coefficients assumed constant within ωk, computed as follows:
where ε is a regularization parameter, σk^2 and μk are the variance and mean of G in ωk, and |ω| is the number of pixels in ωk.
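A concrete sketch of the guided filtering in S31, assuming the standard formulation the symbol definitions above match: the output is p_i = a_k·G_i + b_k with a_k, b_k constant per window, and the per-window coefficients are averaged before forming the output. The `box_mean` helper, the window radius and the ε value are illustrative choices.

```python
import numpy as np

def box_mean(x, r):
    """Mean over a (2r+1)x(2r+1) box at every pixel, with edge replication,
    computed via 2-D cumulative sums (inclusion-exclusion)."""
    size = 2 * r + 1
    xp = np.pad(x, ((r, r), (r, r)), mode='edge').astype(float)
    c = np.cumsum(np.cumsum(xp, axis=0), axis=1)
    c = np.pad(c, ((1, 0), (1, 0)))          # zero row/column for the sums
    H, W = x.shape
    s = (c[size:size + H, size:size + W] - c[:H, size:size + W]
         - c[size:size + H, :W] + c[:H, :W])
    return s / size ** 2

def guided_filter(G, I, radius=4, eps=1e-2):
    """Guided filter: output p_i = a_k * G_i + b_k per window.

    G : guide image (e.g. the first-frame target template)
    I : input image to be filtered, same shape as G
    """
    mean_G = box_mean(G, radius)
    mean_I = box_mean(I, radius)
    cov_GI = box_mean(G * I, radius) - mean_G * mean_I
    var_G = box_mean(G * G, radius) - mean_G ** 2    # sigma_k^2 of the guide
    a = cov_GI / (var_G + eps)                       # a_k
    b = mean_I - a * mean_G                          # b_k
    # Average the per-window coefficients, then form the linear output.
    return box_mean(a, radius) * G + box_mean(b, radius)
```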
S32, performing naive Bayes classification on the sample images; as a preferred embodiment, this step obtains the naive Bayes classification of a sample according to the following formula (8):
where v = (v1, ..., vt) denotes the features of the sample, vi (i = 1, 2, ..., t) denotes the gray value of the ith pixel, and P(y = 1) = P(y = 0) is the uniform prior over the sample labels. Furthermore, the conditional distribution P(vi | y) is assumed Gaussian with parameters μi^1, σi^1, μi^0, σi^0, where μi^1 (μi^0) and σi^1 (σi^0) are the mean and variance of the positive (negative) sample features.
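The naive Bayes score of S32 can be sketched as a per-pixel Gaussian log-likelihood ratio with equal priors, as the symbol definitions suggest. The log-ratio form and the small eps guard are assumptions, since formula (8) itself is not reproduced here.

```python
import numpy as np

def gaussian_pdf(v, mu, sigma):
    """Per-feature Gaussian density N(v; mu, sigma^2)."""
    return np.exp(-(v - mu) ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def naive_bayes_confidence(v, mu1, s1, mu0, s0, eps=1e-12):
    """Log-ratio naive Bayes score with equal priors P(y=1) = P(y=0).

    v        : (t,) gray values of the sample pixels
    mu1, s1  : per-feature mean/std of the positive class
    mu0, s0  : per-feature mean/std of the negative class
    A positive score favors the target class; eps guards against log(0).
    """
    p1 = gaussian_pdf(np.asarray(v, float), mu1, s1) + eps
    p0 = gaussian_pdf(np.asarray(v, float), mu0, s0) + eps
    return float(np.sum(np.log(p1 / p0)))
```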
S4, performing target prediction by using particle filtering;
the step S4 includes, in the state transition stage of particle filtering, estimating a posterior probability density of the target in the current frame image according to the prior probability of the target in the previous frame image, and completing the target position estimation.
As a preferred embodiment, this step calculates the posterior probability density function p(st|z1:t) according to the following formula (9):
where z1:t is the given image sequence, p(st−1|z1:t−1) is the probability density function of the target in frame t−1, st denotes the target state, p(zt|st) is the observation model, and p(st|st−1) is the state-transition probability density used to compute the prior for frame t.
S5, carrying out weight distribution by using a GFDC-SGM method;
the step S5 further comprises
In the target prediction process of step S4, an observation model is constructed and the probability densities of the N1 particles are optimized to obtain the optimal target state. As a preferred embodiment, this step constructs the observation model according to the following formula (10):
and optimizes the probability density function p(st|z1:t) of the N1 particles st^g (g = 1, 2, ..., N1) according to the following formula (11) to obtain the optimal target state:
the best target state in the current frame is the target position, and for the positive sample candidate template, the confidence value HbHigh, confidence value H for possible background sample templatesbLow. Because the GFDC method takes background information into account, the SGM method can obtain sample templates that are not well differentiated by the appropriate HbAnd (5) further correcting.
S6, updating GFDC and SGM parameters online;
the step S6 specifically includes
S61, GFDC online updating:
to distinguish between positive and negative sample templates, the GFDC model is updated online as follows (12):
wherein λ2 > 0 is the update rate, and the negative-sample template parameters μi^0 and σi^0 are updated with similar rules;
s62, SGM online updating:
in the SGM model, the generative dictionary D is constant within the same sequence; in order to adapt to appearance changes, the histogram ψt in the current frame is updated according to the following expression (13):
ψt = μψ1 + (1 − μ)ψt−1    (13)
where μ is the learning factor, and ψ1 and ψt−1 denote the histograms stored in the first frame and the previous frame, respectively.
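The S62 histogram update is given explicitly above; the S61 Gaussian-parameter update is only hinted at, so the running-average rule below (in the style of compressive tracking) is an assumption, as are the example values of μ and λ2.

```python
import numpy as np

def update_histogram(psi1, psi_prev, mu=0.05):
    """S62: psi_t = mu * psi_1 + (1 - mu) * psi_{t-1}.
    mu is the learning factor (the value here is an assumed example)."""
    return mu * psi1 + (1 - mu) * psi_prev

def update_gaussian_params(mu_old, sigma_old, mu_new, sigma_new, lam2=0.15):
    """Assumed S61 running update of the per-feature Gaussian parameters:
        mu    <- (1 - lam2) * mu_old + lam2 * mu_new
        sigma <- sqrt((1 - lam2) * sigma_old^2 + lam2 * sigma_new^2
                      + lam2 * (1 - lam2) * (mu_old - mu_new)^2)
    lam2 > 0 is the update rate; negative-template parameters would be
    updated with the same rule."""
    mu = (1 - lam2) * mu_old + lam2 * mu_new
    sigma = np.sqrt((1 - lam2) * sigma_old ** 2 + lam2 * sigma_new ** 2
                    + lam2 * (1 - lam2) * (mu_old - mu_new) ** 2)
    return mu, sigma
```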
S7, judging whether the current frame image is the last frame of the image sequence to be tracked, if yes, executing the next step S8; otherwise, executing step S4;
and S8, completing the tracking of the infrared target in the image sequence to be tracked.
A computer device having a processor and a memory, the memory storing a computer program which, when executed by the processor, causes the processor to perform the steps of the infrared target tracking method of any of the above.
A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, causes the processor to carry out the steps of the infrared target tracking method of any of the above.
The invention combines the generating formula and the discriminating type on the basis of adopting sparse representation and particle filtering, fully utilizes the target characteristics and combines background information, and fully combines the target characteristics and the background information to realize accurate and efficient tracking of the infrared target image sequence.
In order to further illustrate the technical scheme and its effects, an infrared detector was used to capture an infrared image sequence of a walking pedestrian; the image size is 640 × 512 pixels and the sequence contains 200 frames, which serve as the experiment described below.
Fig. 2 shows the framed positive target sample and negative background samples; the rectangular frame is 35 × 75 pixels, and the centroid of the framed infrared target image area is taken as the initial position of the infrared target to be tracked.
Fig. 3 is the guide image used in GF filtering to enhance the difference between the background and the target template. To avoid interference from a complex background, the target template selected in the first frame is taken as the guide G; when the filtered target template coincides with the original target template, the guided filter suppresses the influence of suspected targets.
Fig. 4 is the target image after GF filtering, obtained in the embodiment of the present invention by the guided filtering process according to the following formula,
where p is the filtered output image, G is the input guide image, ωk denotes a window, i denotes a pixel in the window, and ak and bk are linear coefficients constant within ωk.
Fig. 5 shows the images of the target and a suspected target after GF filtering, obtained by the same guided filtering process as above.
Fig. 6 is the background image after GF filtering in the embodiment of the present invention, likewise obtained by the guided filtering process above.
Figs. 7 to 9 compare the results of the method proposed in the embodiment of the present invention with other tracking methods in a tracking experiment on the 200-frame image sequence: Fig. 7 shows the tracking result at frame 30, Fig. 8 the result at frame 130, and Fig. 9 the result at frame 170.
FIG. 10 shows the tracking precision curves of the proposed method and several other tracking methods. Precision is defined as the proportion of frames in which the Euclidean distance between the tracking result and the ground-truth target position falls within a set threshold (0 to 50 pixels in this experiment). It can be seen that for thresholds greater than 20 pixels the algorithm provided by the invention attains the highest tracking precision, approximately 0.99.
Fig. 11 shows the tracking success-rate curves of the proposed method and several other tracking methods. A frame is counted as successful when the overlap score S > t0 (t0 ∈ [0,1]), where S is the ratio of the intersection to the union of the ground-truth target box BG and the tracking result BT, i.e., S = (BG ∩ BT)/(BG ∪ BT).
It can be seen that the method provided by the invention achieves a high success rate and a clearly improved performance.
The above description is only an embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes performed by the present specification and drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. An infrared target tracking method is characterized by comprising the following steps:
s1, acquiring a positive sample and a negative sample of the first frame of infrared image;
s2, constructing a generation model based on sparse representation;
s3, constructing a discriminant classifier based on the guide filtering;
s4, performing target prediction by using particle filtering;
s5, carrying out weight distribution by using a GFDC-SGM method;
and S6, updating GFDC and SGM parameters online.
2. The infrared target tracking method according to claim 1, wherein the step S1 specifically includes:
S11, inputting the first frame image of the image sequence to be tracked;
S12, framing the target area to be tracked in the first frame image, and taking the centroid of the framed target area as the initial position of the target to be tracked, i.e., the positive sample;
and S13, framing a background area in the area around the target to be tracked in the first frame image, and taking the framed background area as a negative sample.
3. The infrared target tracking method as claimed in claim 1, wherein the step S2 includes:
S21, converting each positive sample block into a vector to obtain the sparse coefficient vector of each positive sample block;
S22, concatenating the sparse coefficient vectors of the positive samples to form a histogram ρ;
S23, obtaining a weighted histogram from ρ and o, wherein o is defined as follows:
where εi denotes the reconstruction error of block yi, and ε0 is a predetermined threshold obtained through extensive experiments;
and S24, calculating the similarity of the sparsity histogram.
4. The infrared target tracking method as claimed in claim 1, wherein the step S3 further comprises:
S31, obtaining a guided-filtered sample image;
and S32, carrying out naive Bayes classification on the sample image.
5. The infrared target tracking method according to claim 1, wherein the step S4 specifically includes, in a state transition stage of particle filtering, estimating a posterior probability density of a target in a current frame image according to a prior probability of the target in a previous frame image, and completing target position estimation.
6. The infrared target tracking method as claimed in claim 1, wherein the step S5 further comprises:
in the target prediction process of step S4, an observation model is constructed, and the probability densities of the N1 particles are optimized to obtain the optimal target state.
7. The infrared target tracking method according to claim 1, wherein the step S6 specifically includes:
s61, GFDC online updating:
in order to distinguish between positive and negative sample templates, the GFDC model is updated online as follows:
wherein λ2 > 0 is the update rate, and the negative-sample template parameters μi^0 and σi^0 are updated with similar rules;
s62, SGM online updating:
in the SGM model, the generative dictionary D is constant within the same sequence; in order to adapt to appearance changes, the histogram ψt in the current frame is updated as follows:
ψt = μψ1 + (1 − μ)ψt−1
where μ is the learning factor, and ψ1 and ψt−1 denote the histograms stored in the first frame and the previous frame, respectively.
8. The infrared target tracking method of claim 1, further comprising, after the step S6:
S7, judging whether the current frame image is the last frame of the image sequence to be tracked; if yes, executing step S8; otherwise, returning to step S4;
and S8, completing the tracking of the infrared target in the image sequence to be tracked.
9. A computer device having a processor and a memory, the memory storing a computer program which, when executed by the processor, causes the processor to carry out the steps of the infrared target tracking method of any one of claims 1 to 8.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, causes the processor to carry out the steps of the infrared target tracking method as set forth in any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910941612.3A CN112580679A (en) | 2019-09-30 | 2019-09-30 | Infrared target tracking method, computer device and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910941612.3A CN112580679A (en) | 2019-09-30 | 2019-09-30 | Infrared target tracking method, computer device and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112580679A true CN112580679A (en) | 2021-03-30 |
Family
ID=75116278
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910941612.3A Pending CN112580679A (en) | 2019-09-30 | 2019-09-30 | Infrared target tracking method, computer device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112580679A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116563348A (en) * | 2023-07-06 | 2023-08-08 | 中国科学院国家空间科学中心 | Infrared weak small target multi-mode tracking method and system based on dual-feature template |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116563348A (en) * | 2023-07-06 | 2023-08-08 | 中国科学院国家空间科学中心 | Infrared weak small target multi-mode tracking method and system based on dual-feature template |
CN116563348B (en) * | 2023-07-06 | 2023-11-14 | 中国科学院国家空间科学中心 | Infrared weak small target multi-mode tracking method and system based on dual-feature template |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8131011B2 (en) | Human detection and tracking system | |
CN110033473B (en) | Moving target tracking method based on template matching and depth classification network | |
CN109544603B (en) | Target tracking method based on deep migration learning | |
CN110889865B (en) | Video target tracking method based on local weighted sparse feature selection | |
CN111080674B (en) | Multi-target ISAR key point extraction method based on Gaussian mixture model | |
CN113408492A (en) | Pedestrian re-identification method based on global-local feature dynamic alignment | |
CN112634333B (en) | Tracking device method and device based on ECO algorithm and Kalman filtering | |
CN112801051A (en) | Method for re-identifying blocked pedestrians based on multitask learning | |
CN115690152A (en) | Target tracking method based on attention mechanism | |
CN107657627B (en) | Space-time context target tracking method based on human brain memory mechanism | |
CN114897932A (en) | Infrared target tracking implementation method based on feature and gray level fusion | |
CN106971176A (en) | Tracking infrared human body target method based on rarefaction representation | |
CN112580679A (en) | Infrared target tracking method, computer device and computer readable storage medium | |
CN113591607B (en) | Station intelligent epidemic situation prevention and control system and method | |
CN115880332A (en) | Target tracking method for low-altitude aircraft visual angle | |
CN114973305B (en) | Accurate human body analysis method for crowded people | |
Campos et al. | Mobile robot global localization with non-quantized SIFT features | |
CN105654514A (en) | Image target tracking method | |
Guo et al. | Adaptive video object segmentation with online data generation | |
Zhang et al. | Visual tracking via sparse representation based linear subspace model | |
Zhang et al. | Vision-based UAV obstacle avoidance algorithm on the embedded platform | |
CN115061574B (en) | Human-computer interaction system based on visual core algorithm | |
CN116310463B (en) | Remote sensing target classification method for unsupervised learning | |
Jiang et al. | Human posture recognition with convex programming | |
CN116580066B (en) | Pedestrian target tracking method under low frame rate scene and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination |