CN108876820B - Moving target tracking method under occlusion conditions based on mean shift - Google Patents

Moving target tracking method under occlusion conditions based on mean shift

Info

Publication number
CN108876820B
CN108876820B CN201810596691.4A
Authority
CN
China
Prior art keywords
target
model
region
predicted
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810596691.4A
Other languages
Chinese (zh)
Other versions
CN108876820A (en)
Inventor
蔡延光
赵豪
蔡颢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN201810596691.4A priority Critical patent/CN108876820B/en
Publication of CN108876820A publication Critical patent/CN108876820A/en
Application granted granted Critical
Publication of CN108876820B publication Critical patent/CN108876820B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/18: Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/90: Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Multimedia (AREA)
  • Pure & Applied Mathematics (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Probability & Statistics with Applications (AREA)
  • Operations Research (AREA)
  • Algebra (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a mean-shift-based method for tracking a moving target under occlusion. The method mainly comprises the following steps: 1. acquiring the video frame images from the video data; 2. selecting a tracking target, establishing a Kalman filter model, and initializing the initial parameters of the equations; 3. establishing a mean shift model, counting the color values of all pixels in the search window, building a histogram and normalizing it; 4. calculating the position predicted by the Kalman model as the initial iteration position of the MeanShift algorithm and performing an occlusion judgment at that position; 5. if the current frame is not occluded, taking the predicted position as the initial MeanShift iteration position, iterating the mean shift algorithm to the optimal target position, updating the Kalman model parameters, and adaptively changing the window size at the optimal position to serve as the window size of the next frame; 6. if occlusion exists, taking the predicted position as the assumed observation position, updating the Kalman model parameters from it, and outputting the predicted target position.

Description

Moving target tracking method under occlusion conditions based on mean shift
Technical Field
The invention relates to the field of image processing, and in particular to a mean-shift-based method for tracking a moving target under occlusion.
Background
Video-based tracking of a moving target refers to intelligently detecting and following the moving objects present in a video while also extracting their motion parameters. In practice, a tracking algorithm operating in a complex environment usually has to be combined with auxiliary algorithms, such as intelligent target detection and segmentation, trajectory filtering of the moving target, and prediction of the target position, to optimize its effect. The complexity of the environment (object shadows, target occlusion, changes in target appearance, and brightness changes caused by varying illumination) places heavy demands on the algorithm, and the real-time constraints of practical applications add further requirements to the target tracking problem.
Disclosure of Invention
The invention provides a mean-shift-based method for tracking a moving target under occlusion, which effectively solves the problem of tracking a moving target when it is occluded.
In order to solve the technical problems, the invention adopts the following technical scheme: a mean-shift-based method for tracking a moving target under occlusion, comprising the following steps:
s1, acquiring video frame image information in video data;
s2, manually setting an initial search window in any frame, namely selecting a tracking target, establishing a Kalman filter model, and initializing initial parameters of an equation;
s3, establishing a mean shift model, counting color values of all pixel points in a search window, establishing a histogram and normalizing to obtain the model description of the target region, namely obtaining the probability density function of the target region;
s4, calculating the predicted position output by the Kalman filter model and taking it as the initial iteration position of the MeanShift algorithm; performing the occlusion judgment at the position predicted by the Kalman filter: if no occlusion exists, executing step S5; if occlusion exists, executing step S6;
s5, if no occlusion exists, taking the predicted position of the Kalman filter model as the initial iteration position of the MeanShift algorithm, continuously iterating the mean shift algorithm to the optimal target position, updating the Kalman filter model parameters with that position, and adaptively changing the window size at the optimal position to serve as the window size of the next frame;
s6, if occlusion exists, taking the predicted position of the Kalman filter as the assumed observation position, updating the Kalman filter model parameters accordingly, and outputting the predicted target position;
s7, judging whether the video is finished, if so, executing a step S8, otherwise, executing a step S4;
and S8, stopping and ending.
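The control flow of steps S1-S8 can be sketched as follows. This is a hypothetical skeleton, not an implementation defined by the patent: `kalman`, `mean_shift`, `is_occluded`, and `adapt_window` are stand-ins for the models detailed in the later sections.

```python
def track(frames, kalman, mean_shift, is_occluded, adapt_window, window):
    """Occlusion-aware mean shift tracking loop (steps S4-S7)."""
    positions = []
    for frame in frames:
        predicted = kalman.predict()                     # S4: Kalman prediction
        if is_occluded(frame, predicted):                # S4: occlusion judgment
            kalman.update(predicted)                     # S6: prediction as pseudo-observation
            positions.append(predicted)
        else:
            best = mean_shift(frame, predicted, window)  # S5: iterate to the optimum
            kalman.update(best)                          # S5: update filter with observation
            window = adapt_window(frame, best, window)   # S5: window size for next frame
            positions.append(best)
    return positions                                     # S7/S8: loop ends with the video
```

The only structural decision encoded here is the one claimed: the Kalman prediction always runs, and the mean shift refinement is skipped during occlusion, in which case the prediction itself feeds the filter update.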
Further, the step S3 specifically includes:
s31, establishing the target model: the target is selected with a rectangular window; when features are extracted, the region is divided into 11 sub-regions forming a pentagram (five-pointed star) structure, the central region of which is a regular pentagon, with each of the 10 surrounding corners forming one sub-region; the color histogram of each sub-region is counted separately, and the per-region histogram results serve as the feature description of the whole target region; let $x_0$ denote the center coordinates of the target region and $\{x_i\}$, $i = 1, 2, \ldots, n$, the coordinate positions of the pixels in the region; a color histogram is built, giving $m$ statistical feature values; the probability density $q_u$ of the target model, $u = 1, 2, \ldots, m$, is expressed as:

$$q_u = C \sum_{i=1}^{n} k\!\left(\left\|\frac{x_0 - x_i}{h}\right\|^2\right)\, \delta\big[b(x_i) - u\big]$$

$$C = \left[\sum_{i=1}^{n} k\!\left(\left\|\frac{x_0 - x_i}{h}\right\|^2\right)\right]^{-1}$$

wherein $k(x)$ is the kernel profile function and $C$ is the normalization constant of $q_u$; $u$ is the index of the histogram; $\delta[b(x_i) - u]$ judges whether the value at pixel $x_i$ lies in the $u$-th bin of the histogram; $h$ is the kernel bandwidth, which normalizes the pixel coordinates;
s32, describing the candidate region model: in the $t$-th frame, with region center coordinates $f_t$ and $\{z_i\}$, $i = 1, 2, \ldots, n$, denoting the pixels of the candidate region, the probability density of the candidate region model is:

$$p_u(f_t) = C_h \sum_{i=1}^{n} k\!\left(\left\|\frac{f_t - z_i}{h}\right\|^2\right)\, \delta\big[b(z_i) - u\big]$$
s33, similarity measurement: the proposed improved similarity criterion is expressed as:

$$\rho(p, q) = \sum_{i=1}^{11} \omega_i\, \rho_i(p, q)$$

$$\rho_i(p, q) = \sum_{u=1}^{m} \sqrt{p_u^{(i)}\, q_u^{(i)}}$$

$$d_i = \|x_i - x_0\|, \qquad \omega_i = \frac{1/d_i}{\sum_j 1/d_j}$$

where $\rho(p, q)$ represents the similarity between the candidate target model and the target model, and $\rho_i(p, q)$, $i = 1, 2, \ldots, 11$, denotes the similarity of the individual sub-regions; the Bhattacharyya coefficient is used here to describe the similarity; $\omega_i$ denotes the weight coefficient of each sub-region, representing the different contributions of the different regions to the similarity between the whole candidate model and the target model; $x_0$ represents the centroid coordinates of the central region and $x_i$ the centroid coordinates of the other 10 regions, the weight of each sub-region being assigned inversely proportional to the distance between its centroid and the centroid of the central region.
Further, in step S4, the occlusion is determined according to the following condition (shown for region 1):

$$\rho_1(k) < T$$

where $\rho_i(k)$ represents the similarity of the $i$-th region in the $k$-th frame and $T$ is a threshold; when this condition is satisfied, region 1 is considered occluded and the occlusion is assumed to approach from the direction of region 1; the other regions are judged in the same way according to the same condition.
Further, in step S5, the optimal position of the target is calculated by successive iterations; that is, to maximize the similarity function $\rho(p, q)$, a Taylor expansion is applied to the formula in step S33, giving the corresponding approximate expression:

$$\rho(p(f), q) \approx \frac{1}{2} \sum_{u=1}^{m} \sqrt{p_u(f_0)\, q_u} + \frac{1}{2} \sum_{i=1}^{n} w_i\, k\!\left(\left\|\frac{f - z_i}{h}\right\|^2\right)$$

$$w_i = \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(f_0)}}\; \delta\big[b(z_i) - u\big]$$

In the function above, only the second term changes with $f_t$, so only the second term is analyzed; following the article by Yizong Cheng, the following formula can be derived:

$$f_{k+1} = \frac{\sum_{i=1}^{n} z_i\, w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)}$$

in which $f_k$ is the original target center, $f_{k+1}$ is the center point obtained after one mean shift step, and $g(x) = -k'(x)$; if the distance finally moved is smaller than a threshold $\varepsilon$ or the maximum number of iterations is reached, the iteration stops; the shift moves toward the direction in which the similarity of the two models increases fastest, the position after moving is the optimal target position of the current frame, and the resulting center point is then taken as the center of the next frame's iteration; this is repeated until the end.
Compared with the prior art, the beneficial effects are: the proposed mean-shift-based method for tracking a moving target under occlusion adopts an improved region division, the pentagram structure, in the model-building stage, which remedies the classical mean shift algorithm's neglect of spatial features when modeling the target and makes the model description of the target more effective; at the same time, when occlusion occurs, the direction of the occlusion can be judged to a certain extent and the Kalman prediction model is modified accordingly, so that its adaptability to occlusion is stronger.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of color histogram division of the subareas in step S31 according to the present invention.
FIG. 3 is a diagram of simulation results according to an embodiment of the present invention.
Detailed Description
The drawings are for illustration purposes only and are not to be construed as limiting the invention; for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted. The positional relationships depicted in the drawings are for illustrative purposes only and are not to be construed as limiting the invention.
Example 1:
as shown in fig. 1, a method for tracking a moving target under an occlusion condition based on mean shift includes the following steps:
step 1, video frame image information in video data is obtained.
Step 2, manually setting an initial search window in any frame, namely selecting a tracking target, establishing a Kalman filter model, and initializing the initial parameters of the equations. This specifically comprises the following steps:
s21, the state parameters of the moving target are taken as the center position and velocity of the region of interest at the current moment; the state variable of the Kalman predictor is set to $X_k$ and the observation to $Z_k$, expressed respectively as:

$$X_k = [x_k, y_k, v_{x,k}, v_{y,k}] \quad (1)$$

$$Z_k = [x_k, y_k] \quad (2)$$

where $x_k, y_k$ represent the center-point (particle) position of the rectangle containing the target region, and $v_{x,k}, v_{y,k}$ represent the moving speeds along the x and y axes;
s22, the state equation of the Kalman predictor is:

$$X_k = A_{k,k-1} X_{k-1} + W_{k-1} \quad (3)$$

where $A_{k,k-1}$ represents the state transition matrix from time $k-1$ to time $k$, here taken as:

$$A_{k,k-1} = \begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (4)$$

where $\Delta t$ denotes the time interval between two consecutive frames, here 1 frame; $W_{k-1}$ is the process noise added to the system state at time $k-1$, modeled as mutually uncorrelated normal white noise with mean 0;
s23, the observation equation is:

$$Z_k = H_k X_k + V_k \quad (5)$$

where $H_k$ is the observation matrix at time $k$ and $V_k$ is the observation noise at time $k$, taken as mutually uncorrelated normal white (Gaussian) noise with mean 0. The observation matrix is:

$$H_k = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix} \quad (6)$$
s24, the state update equation of the Kalman predictor is:

$$\hat{X}_k = \hat{X}_{k|k-1} + K_k\left(Z_k - H_k \hat{X}_{k|k-1}\right) \quad (7)$$

and the state prediction equation of the Kalman predictor is:

$$\hat{X}_{k|k-1} = A_{k,k-1} \hat{X}_{k-1} \quad (8)$$

$K_k$ is the gain matrix of the Kalman predictor:

$$K_k = P_{k|k-1} H_k^{T} \left(H_k P_{k|k-1} H_k^{T} + R_k\right)^{-1} \quad (9)$$

$$P_{k|k-1} = A_{k,k-1} P_{k-1} A_{k,k-1}^{T} + Q_{k-1} \quad (10)$$

$$P_k = \left(I - K_k H_k\right) P_{k|k-1} \quad (11)$$

where $\hat{X}_{k|k-1}$ is the a priori estimate and $\hat{X}_k$ is its value updated (corrected) according to $Z_k$; $W_{k-1}$ and $V_k$ are mutually uncorrelated normally distributed noises; $Q_k$ and $R_k$ are the covariance matrices of $W_k$ and $V_k$, as follows:

$$Q_k = E\left[W_k W_k^{T}\right] \quad (12)$$

$$R_k = E\left[V_k V_k^{T}\right] \quad (13)$$
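The constant-velocity predictor of steps S21-S24 can be sketched in NumPy as follows. The structure (state, transition, and observation matrices) follows equations (1)-(13); the concrete noise covariances `q`, `r` and the initial `P` are illustrative assumptions, not values stated in the patent.

```python
import numpy as np

class KalmanCV:
    """Constant-velocity Kalman predictor for a 2-D target center."""

    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1e-1):
        self.A = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)  # transition, eq. (4)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)   # observation, eq. (6)
        self.Q = q * np.eye(4)                           # process noise covariance
        self.R = r * np.eye(2)                           # observation noise covariance
        self.x = np.array([x0, y0, 0, 0], dtype=float)   # state [x, y, vx, vy]
        self.P = np.eye(4)                               # initial error covariance

    def predict(self):
        """A priori estimate, eqs. (8) and (10); returns the predicted center."""
        self.x = self.A @ self.x
        self.P = self.A @ self.P @ self.A.T + self.Q
        return self.x[:2].copy()

    def update(self, z):
        """Correction by the observation z, eqs. (7), (9), (11)."""
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)         # gain matrix K_k
        self.x = self.x + K @ (np.asarray(z, dtype=float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2].copy()
```

Feeding the filter a target moving at constant velocity, the predicted center converges to the true trajectory within a few frames, which is the behavior steps S4-S6 rely on during occlusion.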
step 3, establishing a mean shift model, counting color values of all pixel points in a search window, establishing a histogram and normalizing, and obtaining model description of the target area (taking the color histogram as the characteristic of the target) to obtain a density probability function of the target area; the method specifically comprises the following steps:
the establishment of the target model uses an improved description method of the target region characteristics, namely, color histogram statistics of target pixel blocks of sub-regions, and still uses a rectangular window to select the region of the target, as shown in fig. 2, a pentagram-shaped region divided into 11 sub-regions is adopted during the extraction of the characteristics, the central region is a regular pentagram, the sub-regions are respectively numbered as 1-11, then color histograms of the regions are respectively counted, and the result of the color histogram respectively counted by the regions is taken as the characteristic description of the whole target region, so that the description of the spatial characteristic distribution of the target is improved. Assuming the center coordinates of the target area as
Figure GDA00032764292600000610
Figure GDA00032764292600000611
Representing the coordinate position of each pixel in the area, wherein i is 1, 2.. multidot.n, establishing a color histogram to obtain m characteristic values obtained by counting colors; probability density q of the object model u1, 2.. m, expressed as:
Figure GDA0003276429260000071
Figure GDA0003276429260000072
in the above formula, k (x) is a contour function, and C is quNormalized constant coefficient of (a); u is the index of the histogram; delta [ b (z)i-u)]To determine pixel x in the region templateiWhether the luminance value at (b) is located in the u-th interval of the histogram;
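A minimal sketch of the per-sub-region histogram $q_u$ above, assuming a single-channel (grayscale) patch, an Epanechnikov kernel profile, and 16 bins over 8-bit values; none of these specifics are fixed by the patent, which counts color histograms per sub-region.

```python
import numpy as np

def kernel_histogram(patch, m=16):
    """Kernel-weighted, normalized histogram q_u of a 2-D grayscale patch."""
    h, w = patch.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # normalized squared distance to the patch center (bandwidth = half-size)
    r2 = ((ys - cy) / (h / 2.0)) ** 2 + ((xs - cx) / (w / 2.0)) ** 2
    k = np.maximum(1.0 - r2, 0.0)              # Epanechnikov profile k(x)
    bins = (patch.astype(np.int64) * m) // 256 # b(x_i): pixel value -> bin index
    q = np.bincount(bins.ravel(), weights=k.ravel(), minlength=m)
    return q / q.sum()                         # C makes sum(q_u) = 1
```

Pixels near the region center thus dominate the histogram, which is exactly the role the profile function $k$ and bandwidth $h$ play in formulas (14)-(15).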
s32, describing the candidate region model: in the $t$-th frame, with region center coordinates $f_t$ and $\{z_i\}$, $i = 1, 2, \ldots, n$, denoting the pixels of the candidate region, the probability density of the candidate region model is:

$$p_u(f_t) = C_h \sum_{i=1}^{n} k\!\left(\left\|\frac{f_t - z_i}{h}\right\|^2\right)\, \delta\big[b(z_i) - u\big] \quad (16)$$
s33, measuring similarity: a similarity function describes the similarity between the real model and the target candidate model; the Bhattacharyya coefficient is adopted, with the formula:

$$\rho(p, q) = \sum_{u=1}^{m} \sqrt{p_u(f_t)\, q_u} \quad (17)$$

The improved similarity criterion proposed here is expressed as:

$$\rho(p, q) = \sum_{i=1}^{11} \omega_i\, \rho_i(p, q) \quad (18)$$

$$\rho_i(p, q) = \sum_{u=1}^{m} \sqrt{p_u^{(i)}\, q_u^{(i)}} \quad (19)$$

$$d_i = \|x_i - x_0\| \quad (20)$$

$$\omega_i = \frac{1/d_i}{\sum_j 1/d_j} \quad (21)$$

where $\rho(p, q)$ represents the similarity between the candidate target model and the target model, and $\rho_i(p, q)$, $i = 1, 2, \ldots, 11$, the similarity of the individual sub-regions, each described with the Bhattacharyya coefficient. $\omega_i$ denotes the weight coefficient of each region in fig. 2, representing the different contributions of the different regions to the similarity between the whole candidate model and the target model according to equations (20) and (21), where $x_i$ represents the centroid coordinates of regions 1-10 in the figure and $x_0$ the centroid coordinates of region 11 (the central region); the weight of each sub-region is assigned inversely proportional to the distance between its centroid and the centroid of the central region.
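The weighted similarity described above might be sketched as follows. The per-region Bhattacharyya coefficient is standard; the exact weight normalization is an assumption (here the weights of the regions with nonzero centroid distance are normalized to sum to 1), since the original weight formulas survive only as images.

```python
import numpy as np

def bhattacharyya(p, q):
    """rho_i(p, q): Bhattacharyya coefficient of two normalized histograms."""
    return float(np.sum(np.sqrt(p * q)))

def region_weights(centroids, center):
    """omega_i proportional to 1 / ||x_i - x_0||, normalized to sum to 1."""
    d = np.linalg.norm(np.asarray(centroids, float) - np.asarray(center, float),
                       axis=1)
    w = 1.0 / np.maximum(d, 1e-12)   # guard against a zero distance
    return w / w.sum()

def weighted_similarity(p_regions, q_regions, weights):
    """rho(p, q) = sum_i omega_i * rho_i(p, q) over the sub-regions."""
    return sum(w * bhattacharyya(p, q)
               for w, p, q in zip(weights, p_regions, q_regions))
```

Identical candidate and target histograms give a similarity of exactly 1 regardless of the weights, which is why a per-region drop below a threshold can signal occlusion of that region.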
Step 4, calculating the predicted position output by the Kalman filter model and taking it as the initial iteration position of the MeanShift algorithm; performing the occlusion judgment at the Kalman-predicted position: if no occlusion exists, executing S5; if occlusion exists, executing S6.
The occlusion judgment criterion is specifically (shown for region 1):

$$\rho_1(k) < T \quad (22)$$

where $\rho_i(k)$ represents the similarity of the $i$-th region in the $k$-th frame and $T$ is a threshold, taken as 0.8 here and determined according to the specific conditions. When condition (22) is satisfied, region 1 is considered occluded and the occlusion is assumed to approach from the direction of region 1; the other regions are judged in the same way.
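Since formula (22) survives only as an equation image, one plausible reading is a fixed threshold test on each per-region similarity; under that assumption, the per-region occlusion judgment reduces to a few lines:

```python
def occluded_regions(rho_k, T=0.8):
    """Return the numbers (1-based) of regions whose similarity fell below T.

    rho_k: per-region similarities rho_i(k) of the current frame.
    T: threshold, 0.8 in this embodiment.
    """
    return [i for i, r in enumerate(rho_k, start=1) if r < T]
```

The returned region numbers also carry the direction information the patent describes: a low similarity in region 1 suggests the occluder approaches from region 1's side of the pentagram.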
Step 5, if no occlusion exists, taking the predicted position of the Kalman filter model as the initial iteration position of the MeanShift algorithm, continuously iterating the mean shift algorithm to the optimal target position, updating the Kalman filter model parameters with that position, and adaptively changing the window size at the optimal position to serve as the window size of the next frame.
The optimal position of the target is calculated by successive iterations; that is, to maximize the similarity function $\rho(p, q)$, a Taylor expansion is applied to formula (17), giving the corresponding approximate expression:

$$\rho(p(f), q) \approx \frac{1}{2} \sum_{u=1}^{m} \sqrt{p_u(f_0)\, q_u} + \frac{1}{2} \sum_{i=1}^{n} w_i\, k\!\left(\left\|\frac{f - z_i}{h}\right\|^2\right) \quad (23)$$

$$w_i = \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(f_0)}}\; \delta\big[b(z_i) - u\big] \quad (24)$$

In formula (23), only the second term changes with $f_t$, so only the second term is analyzed; following the article by Yizong Cheng, the following formula can be derived:

$$f_{k+1} = \frac{\sum_{i=1}^{n} z_i\, w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)} \quad (25)$$

In the above formula, $f_k$ is the original target center, $f_{k+1}$ is the center point obtained after one Mean shift step, and $g(x) = -k'(x)$; the Mean shift algorithm iterates formula (25), and if the distance finally moved is smaller than a threshold $\varepsilon$ or the maximum number of iterations is reached, the iteration stops; the shift moves toward the direction in which the similarity of the two models increases fastest, the position after moving is the optimal target position of the current frame, and the resulting center point is then taken as the center of the next frame's iteration; this is repeated until the end. In this embodiment $\varepsilon = 0.5$ and the maximum number of iterations is 20.
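The iteration of formula (25) can be sketched as follows for a flat $g(x)$, which is the derivative profile of the Epanechnikov kernel. The bandwidth `h` is an assumed parameter, while `eps = 0.5` and the 20-iteration cap follow this embodiment.

```python
import numpy as np

def mean_shift(positions, weights, f0, h=10.0, eps=0.5, max_iter=20):
    """Iterate f_{k+1} = sum(z_i w_i g) / sum(w_i g) until the shift < eps."""
    positions = np.asarray(positions, dtype=float)   # candidate pixels z_i
    weights = np.asarray(weights, dtype=float)       # w_i from the Taylor expansion
    f = np.asarray(f0, dtype=float)
    for _ in range(max_iter):
        # flat g(x) on the kernel support: pixels within bandwidth h of the center
        mask = np.linalg.norm(positions - f, axis=1) <= h
        w = weights[mask]
        if w.sum() <= 0.0:
            break                                    # no support: keep current center
        f_new = (positions[mask] * w[:, None]).sum(axis=0) / w.sum()
        shift = np.linalg.norm(f_new - f)
        f = f_new
        if shift < eps:
            break                                    # converged within eps
    return f
```

Each step moves the window center to the weight-weighted mean of the in-bandwidth pixels, i.e. uphill on the similarity surface, which is why the iteration terminates either by the small-shift test or by the iteration cap.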
Step 6, if occlusion exists, taking the predicted position of the Kalman filter as the assumed observation position, updating the Kalman filter model parameters accordingly, and outputting the predicted target position.
Step 7, judging whether the video is finished, if so, executing S8, and if not, executing S4;
Step 8, stopping and ending.
It should be understood that the above-described embodiments are merely examples intended to illustrate the present invention clearly and are not intended to limit its embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to list all embodiments exhaustively here. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the protection scope of the claims of the present invention.

Claims (3)

1. A mean-shift-based method for tracking a moving target under occlusion, characterized by comprising the following steps:
s1, acquiring video frame image information in video data;
s2, manually setting an initial search window in any frame, namely selecting a tracking target, establishing a Kalman filter model, and initializing initial parameters of an equation;
s3, establishing a mean shift model, counting color values of all pixel points in a search window, establishing a histogram and normalizing to obtain the model description of the target region, namely obtaining the probability density function of the target region; the step S3 specifically includes:
s31, establishing the target model: the target is selected with a rectangular window; when features are extracted, the region is divided into 11 sub-regions forming a pentagram (five-pointed star) structure, the central region of which is a regular pentagon, with each of the 10 surrounding corners forming one sub-region; the color histogram of each sub-region is counted separately, and the per-region histogram results serve as the feature description of the whole target region; let $x_0$ denote the center coordinates of the target region and $\{x_i\}$, $i = 1, 2, \ldots, n$, the coordinate positions of the pixels in the region; a color histogram is built, giving $m$ statistical feature values; the probability density $q_u$ of the target model, $u = 1, 2, \ldots, m$, is expressed as:

$$q_u = C \sum_{i=1}^{n} k\!\left(\left\|\frac{x_0 - x_i}{h}\right\|^2\right)\, \delta\big[b(x_i) - u\big]$$

$$C = \left[\sum_{i=1}^{n} k\!\left(\left\|\frac{x_0 - x_i}{h}\right\|^2\right)\right]^{-1}$$

wherein $k(x)$ is the kernel profile function and $C$ is the normalization constant of $q_u$; $u$ is the index of the histogram; $\delta[b(x_i) - u]$ judges whether the value at pixel $x_i$ lies in the $u$-th bin of the histogram; $h$ is the kernel bandwidth, which normalizes the pixel coordinates;
s32, describing the candidate region model: in the $t$-th frame, with region center coordinates $f_t$ and $\{z_i\}$, $i = 1, 2, \ldots, n$, denoting the pixels of the candidate region, the probability density of the candidate region model is:

$$p_u(f_t) = C_h \sum_{i=1}^{n} k\!\left(\left\|\frac{f_t - z_i}{h}\right\|^2\right)\, \delta\big[b(z_i) - u\big]$$
s33, similarity measurement: the proposed improved similarity criterion is expressed as:

$$\rho(p, q) = \sum_{i=1}^{11} \omega_i\, \rho_i(p, q)$$

$$\rho_i(p, q) = \sum_{u=1}^{m} \sqrt{p_u^{(i)}\, q_u^{(i)}}$$

$$d_i = \|x_i - x_0\|, \qquad \omega_i = \frac{1/d_i}{\sum_j 1/d_j}$$

where $\rho(p, q)$ represents the similarity between the candidate target model and the target model, and $\rho_i(p, q)$, $i = 1, 2, \ldots, 11$, denotes the similarity of the individual sub-regions; the Bhattacharyya coefficient is used here to describe the similarity; $\omega_i$ denotes the weight coefficient of each sub-region, representing the different contributions of the different regions to the similarity between the whole candidate model and the target model; $x_0$ represents the centroid coordinates of the central region and $x_i$ the centroid coordinates of the other 10 regions, the weight of each sub-region being assigned inversely proportional to the distance between its centroid and the centroid of the central region;
s4, calculating the predicted position output by the Kalman filter model and taking it as the initial iteration position of the MeanShift algorithm; performing the occlusion judgment at the position predicted by the Kalman filter: if no occlusion exists, executing step S5; if occlusion exists, executing step S6;
s5, if no occlusion exists, taking the predicted position of the Kalman filter model as the initial iteration position of the MeanShift algorithm, continuously iterating the mean shift algorithm to the optimal target position, updating the Kalman filter model parameters with that position, and adaptively changing the window size at the optimal position to serve as the window size of the next frame;
s6, if occlusion exists, taking the predicted position of the Kalman filter as the assumed observation position, updating the Kalman filter model parameters accordingly, and outputting the predicted target position;
s7, judging whether the video is finished, if so, executing a step S8, otherwise, executing a step S4;
and S8, stopping and ending.
2. The mean-shift-based method for tracking a moving target under occlusion according to claim 1, wherein in step S4 the occlusion is determined according to the following condition (shown for region 1):

$$\rho_1(k) < T$$

where $\rho_i(k)$ represents the similarity of the $i$-th region in the $k$-th frame and $T$ is a threshold; when this condition is satisfied, region 1 is considered occluded and the occlusion is assumed to approach from the direction of region 1; the other regions are judged in the same way according to the same condition.
3. The method of claim 2, wherein in step S5 the optimal position of the target is calculated by successive iterations; that is, to maximize the similarity function $\rho(p, q)$, a Taylor expansion is applied to the formula in step S33, giving the corresponding approximate expression:

$$\rho(p(f), q) \approx \frac{1}{2} \sum_{u=1}^{m} \sqrt{p_u(f_0)\, q_u} + \frac{1}{2} \sum_{i=1}^{n} w_i\, k\!\left(\left\|\frac{f - z_i}{h}\right\|^2\right)$$

$$w_i = \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(f_0)}}\; \delta\big[b(z_i) - u\big]$$

In the function above, only the second term changes with $f_t$, so only the second term is analyzed, yielding the following formula:

$$f_{k+1} = \frac{\sum_{i=1}^{n} z_i\, w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i\, g\!\left(\left\|\frac{f_k - z_i}{h}\right\|^2\right)}$$

in which $f_k$ is the original target center, $f_{k+1}$ is the center point obtained after one mean shift step, and $g(x) = -k'(x)$; if the distance finally moved is smaller than a threshold $\varepsilon$ or the maximum number of iterations is reached, the iteration stops; the shift moves toward the direction in which the similarity of the two models increases fastest, the position after moving is the optimal target position of the current frame, and the resulting center point is then taken as the center of the next frame's iteration; this is repeated until the end.
CN201810596691.4A 2018-06-11 2018-06-11 Moving target tracking method under shielding condition based on mean shift Active CN108876820B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810596691.4A CN108876820B (en) 2018-06-11 2018-06-11 Moving target tracking method under shielding condition based on mean shift

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810596691.4A CN108876820B (en) 2018-06-11 2018-06-11 Moving target tracking method under shielding condition based on mean shift

Publications (2)

Publication Number Publication Date
CN108876820A CN108876820A (en) 2018-11-23
CN108876820B true CN108876820B (en) 2022-01-25

Family

ID=64337765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810596691.4A Active CN108876820B (en) 2018-06-11 2018-06-11 Moving target tracking method under shielding condition based on mean shift

Country Status (1)

Country Link
CN (1) CN108876820B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111277745B * 2018-12-04 2023-12-05 Beijing Qihoo Technology Co., Ltd. Target person tracking method and device, electronic equipment and readable storage medium
CN109949340A * 2019-03-04 2019-06-28 Hubei Sanjiang Aerospace Wanfeng Technology Development Co., Ltd. Target scale adaptive tracking method based on OpenCV
CN110458862A * 2019-05-22 2019-11-15 Xi'an University of Posts and Telecommunications Moving target tracking method under occluded background
CN110233667A * 2019-06-05 2019-09-13 South China University of Technology VLC dynamic positioning method and system based on mean shift and unscented Kalman filtering
CN110517291A * 2019-08-27 2019-11-29 Nanjing University of Posts and Telecommunications Road vehicle tracking method based on multi-feature-space fusion
CN112070794A * 2020-08-20 2020-12-11 Chengdu Hengchuang Xinxing Technology Co., Ltd. Multi-object tracking method based on dynamic auxiliary target
CN115471139B * 2022-10-31 2023-02-10 Beijing Aobang Sports Event Evaluation Co., Ltd. Comprehensive evaluation system for large-scale mass sports events based on image recognition technology

Citations (1)

Publication number Priority date Publication date Assignee Title
CN101324956A * 2008-07-10 2008-12-17 Shanghai Jiao Tong University Anti-occlusion moving object tracking method based on mean shift

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US9338409B2 (en) * 2012-01-17 2016-05-10 Avigilon Fortress Corporation System and method for home health care monitoring
US9582895B2 (en) * 2015-05-22 2017-02-28 International Business Machines Corporation Real-time object analysis with occlusion handling

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Kalman-based mean-shift anti-occlusion moving target tracking algorithm; Zhao Hao et al.; Electronics World (《电子世界》); 2018-04-23, No. 08; pp. 31-32 *

Similar Documents

Publication Publication Date Title
CN108876820B (en) Moving target tracking method under shielding condition based on mean shift
CN110517288B (en) Real-time target detection tracking method based on panoramic multi-path 4k video images
CN109919053A Deep-learning vehicle parking detection method based on surveillance video
CN112184759A (en) Moving target detection and tracking method and system based on video
CN111260738A (en) Multi-scale target tracking method based on relevant filtering and self-adaptive feature fusion
CN106952294B Video tracking method based on RGB-D data
CN105809716B (en) Foreground extraction method integrating superpixel and three-dimensional self-organizing background subtraction method
CN110717934B (en) Anti-occlusion target tracking method based on STRCF
CN112364865B (en) Method for detecting small moving target in complex scene
CN111582349A (en) Improved target tracking algorithm based on YOLOv3 and kernel correlation filtering
KR20170015299A (en) Method and apparatus for object tracking and segmentation via background tracking
CN109255799B (en) Target tracking method and system based on spatial adaptive correlation filter
CN110782487A (en) Target tracking method based on improved particle filter algorithm
CN110827262A (en) Weak and small target detection method based on continuous limited frame infrared image
CN112164093A (en) Automatic person tracking method based on edge features and related filtering
CN113379789B (en) Moving target tracking method in complex environment
CN108765463B (en) Moving target detection method combining region extraction and improved textural features
CN113205494B (en) Infrared small target detection method and system based on adaptive scale image block weighting difference measurement
Roy et al. A comprehensive survey on computer vision based approaches for moving object detection
KR101690050B1 (en) Intelligent video security system
CN113902694A (en) Target detection method based on dynamic and static combination
CN107871315B (en) Video image motion detection method and device
CN109102520A Moving target detection method combining fuzzy means clustering with Kalman filter tracking
Najafzadeh et al. Object tracking using Kalman filter with adaptive sampled histogram
CN110310303B (en) Image analysis multi-target tracking method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant