CN107392936B - Target tracking method based on meanshift - Google Patents

Target tracking method based on meanshift

Info

Publication number
CN107392936B
Authority
CN
China
Prior art keywords
target
rectangle
pixels
pixel
candidate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710434697.7A
Other languages
Chinese (zh)
Other versions
CN107392936A (en)
Inventor
沈振权
舒伟平
曹后平
田野
刘晓华
黄盛锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ANHUI GUANGZHEN PHOTOELECTRIC TECHNOLOGY Co.,Ltd.
Original Assignee
Guangdong Lite Array Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Lite Array Co., Ltd.
Priority to CN201710434697.7A
Publication of CN107392936A
Application granted
Publication of CN107392936B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G06V20/41 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a target tracking method based on meanshift, which comprises the following steps: 1, initializing a target image and selecting the initial position of a rectangle A_1 containing the tracked target; 2, performing background judgment on all pixels of the target rectangle A_n; 3, calculating the probability density q_u of the target rectangle A_n; 4, calculating the probability density p_u of the candidate target region of the moving target in the (n+1)-th frame; 5, calculating the weight ω_i of each pixel in the candidate target region; 6, calculating the new position y_new of the candidate target region; 7, if ‖y_0 − y_new‖ < ε or the number of iterations exceeds a threshold, stopping the iteration, and otherwise continuing the iterative computation until the candidate target position satisfies the termination condition. The method judges whether the pixels in the target frame belong to the background, and pixels judged to be background do not participate in subsequent calculation, so that the real moving target is modeled more accurately and the tracking effect is improved.

Description

Target tracking method based on meanshift
Technical Field
The invention relates to the technical field of target tracking, in particular to a target tracking method based on meanshift.
Background
In the meanshift target tracking process, all pixels in the rectangular frame containing the target are usually modeled. This creates a problem: not all pixels in the rectangular frame belong to the target; some are background, and their information is included in the target model as well. Especially when the rectangular frame is chosen too large, or when the color difference between background and target is large, the target model contains considerable error. Modeling the real moving target more faithfully is therefore a key step affecting the tracking effect.
Disclosure of Invention
The invention aims to provide a target tracking method based on meanshift, which determines through background modeling whether the pixels in the rectangular frame belong to the background; pixels judged to be background do not participate in subsequent calculation, thereby solving the problems noted in the background art.
In order to achieve the purpose, the invention provides the following technical scheme:
a target tracking method based on meanshift is characterized in that a video shooting is carried out on a target through a shooting tool to obtain a video sequence image of the target, and the target tracking method comprises the following steps:
Step 1, initializing a target image and selecting the initial position of a rectangle A_1 containing the tracked target;
Step 2, denoting the target rectangle of the n-th frame image as A_n and performing background judgment on all pixels of the target rectangle: if a pixel is judged to be background, its indicator function BI_n(x) is recorded as 1, otherwise as 0;
Step 3, for the target rectangle A_n of the n-th frame image, calculating the probability density q_u of the target rectangle using the indicator function BI_n(x) information;
Step 4, for the candidate target region of the moving target in the (n+1)-th frame, calculating the probability density p_u of the candidate target region using the target rectangle position y_0 of the n-th frame image;
Step 5, calculating the weight ω_i of each pixel in the candidate target region;
Step 6, calculating the new position y_new of the candidate target region;
Step 7, if ‖y_0 − y_new‖ < ε or the number of iterations exceeds a threshold, stopping the iteration; otherwise setting y_0 = y_new and returning to Step 4, continuing the iterative computation until the candidate target position satisfies the termination condition.
As a further improvement, in Step 2, the target rectangle of the n-th frame image is denoted A_n, background judgment is performed on all pixels of the target rectangle, and the indicator function BI_n(x) is recorded as 1 for pixels judged to be background and as 0 otherwise; the specific steps are as follows:
Step 21, judging the pixels in the edge portion of the target rectangle A_n: the target rectangle A_n has size w × d, its 4 edges are the pixel regions to be judged, each edge having width h; edge positions 1, 2, 3 and 4 are arranged clockwise, with edge position 1 at the top of the target rectangle A_n;
the pixels at the center of the target rectangle A_n are taken by default to belong to the target, i.e. the indicator function BI_n(x) of the center pixels is set directly to 0;
Step 22, starting from the vertex A at the upper-left corner of the target rectangle A_n, selecting a rectangle a of size 3 × 3 with A as its top-left vertex; this rectangle contains 9 pixels, whose gray distribution is fitted with a Gaussian model, calculating the mean μ and variance σ²:
\mu = \frac{1}{9} \sum_{i=1}^{9} \mathrm{gray}(x_i)

\sigma^2 = \frac{1}{9} \sum_{i=1}^{9} \left( \mathrm{gray}(x_i) - \mu \right)^2
where gray(x) represents the gray value of pixel x;
for all pixels in edge position 1 and edge position 4, whether a pixel belongs to the background is judged by calculating its probability under the Gaussian model, using the formula:
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( -\frac{ \left( \mathrm{gray}(x) - \mu \right)^2 }{ 2\sigma^2 } \right)
where f(x) represents the probability that pixel x belongs to the Gaussian model; the indicator function BI_n(x) can then be calculated by the following formula:
\mathrm{BI}_n(x) = \begin{cases} 1, & f(x) > T \\ 0, & f(x) \le T \end{cases}

where T is a preset probability threshold;
in this way, all pixels in edge positions 1 and 4 can be judged and their corresponding indicator functions BI_n(x) obtained;
Step 23, similarly to Step 22, starting from the vertex B at the upper-right corner of the target rectangle A_n, judging all pixels in edge position 1 and edge position 2; if a pixel in edge position 1 was already judged to be background in Step 22, the judgment of that pixel is omitted;
Step 24, similarly to Step 22, starting from the vertex D at the lower-right corner of the target rectangle A_n, judging the pixels in edge positions 2 and 3; if a pixel in edge position 2 was already judged to be background in Step 23, the judgment of that pixel is omitted;
Step 25, similarly to Step 22, starting from the vertex C at the lower-left corner of the target rectangle A_n, judging the pixels in edge positions 3 and 4; if a pixel in edge position 3 was already judged to be background in Step 24, the judgment of that pixel is omitted;
thus the indicator functions BI_n(x) of all pixels in the target rectangle A_n are obtained by calculation.
As a further improvement, in Step 3, for the target rectangle A_n of the n-th frame image, the probability density q_u of the target rectangle is calculated using the indicator function BI_n(x) information, specifically:
gray-level information is selected as the feature space of the Mean Shift tracker and its gray histogram is computed; the feature space is divided into 32 bins, each bin being recorded as one feature value of the feature space; let x_0 denote the center position coordinate of the target template region, and let {x_i}, i = 1, …, n, be all pixel positions within the target template region that do not belong to the background, i.e. whose indicator function BI_n(x) equals 0; the probability density function of the target template based on the gray feature u = 1, …, m is calculated as follows:
q_u = C_q \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) \delta\left[ b(x_i) - u \right]
where C_q is the normalization constant of the target template,
C_q = \frac{1}{ \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) }
and k(·) is a kernel function.
As a further improvement, in Step 4, for the candidate target region of the moving target in the (n+1)-th frame, the probability density p_u of the candidate target region is calculated using the target rectangle position y_0 of the n-th frame image, specifically:
the calculation starts from the position of the target template in the previous frame, i.e. the n-th frame image; let the center of the candidate target region be y_0, and let {y_i}, i = 1, …, n, be the pixels in the region corresponding in position to the previous-frame pixels {x_i}, i = 1, …, n; the probability density function of the candidate region can then be obtained in the same manner as that of the target template:
p_u(y_0) = C_p \sum_{i=1}^{n} k\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) \delta\left[ b(y_i) - u \right]
as a further improvement, in the step 5, the weight ω of each pixel in the candidate target region is calculatedi
\omega_i = \sum_{u=1}^{m} \sqrt{ \frac{q_u}{p_u(y_0)} } \, \delta\left[ b(y_i) - u \right]
As a further improvement, in Step 6, the new position y_new of the candidate target region is calculated, specifically:
the similarity between the histograms of the target template and of the candidate target region is measured by the Bhattacharyya coefficient; following the principle of maximizing the similarity of the two histograms, the search window is moved toward the true position of the target along the direction of maximum density increase;
where q_u is the target template and p_u the candidate target template, the Bhattacharyya coefficient being defined as follows:
\rho(y_0) = \rho\left[ p(y_0), q \right] = \sum_{u=1}^{m} \sqrt{ p_u(y_0) \, q_u }
will be provided with
Figure BDA0001318308310000044
And obtaining an updating formula of the center position of the candidate target area by derivation after Taylor series expansion:
y_{\mathrm{new}} = \frac{ \sum_{i=1}^{n} y_i \, \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }{ \sum_{i=1}^{n} \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }
where g(x) = −k′(x) and ω_i is the weight of each pixel.
The invention has the following beneficial effects. An ordinary Meanshift tracker directly uses all pixels in the target frame in subsequent calculations, without considering whether they belong to the background. The target tracking method based on meanshift of the present invention judges whether the pixels in the target frame belong to the background, and pixels judged to be background do not participate in subsequent calculation, so that the real moving target is modeled more accurately and the tracking effect is improved.
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Drawings
FIG. 1 is a flow chart of the target tracking method based on meanshift;
FIG. 2 is a schematic structural diagram of the target rectangle A_n in Step 2 of Embodiment 2.
Detailed Description
Embodiment 1: referring to FIG. 1, in the target tracking method based on meanshift provided in this embodiment, a target is video-recorded with a camera to obtain a video sequence of images of the target {P_n(x, y) | n = 1, 2, …, N}, the target tracking method comprising the following steps:
Step 1, initializing a target image and selecting the initial position of a rectangle A_1 containing the tracked target;
Step 2, denoting the target rectangle of the n-th frame image as A_n and performing background judgment on all pixels of the target rectangle: if a pixel is judged to be background, its indicator function BI_n(x) is recorded as 1, otherwise as 0;
Step 3, for the target rectangle A_n of the n-th frame image, calculating the probability density q_u of the target rectangle using the indicator function BI_n(x) information;
Step 4, for the candidate target region of the moving target in the (n+1)-th frame, calculating the probability density p_u of the candidate target region using the target rectangle position y_0 of the n-th frame image;
Step 5, calculating the weight ω_i of each pixel in the candidate target region;
Step 6, calculating the new position y_new of the candidate target region;
Step 7, if ‖y_0 − y_new‖ < ε or the number of iterations exceeds a threshold, stopping the iteration; otherwise setting y_0 = y_new and returning to Step 4, continuing the iterative computation until the candidate target position satisfies the termination condition.
Embodiment 2: referring to FIGS. 1 and 2, the target tracking method based on meanshift provided in this embodiment is described in detail for two adjacent frames of images, the n-th (n ≥ 1) frame and the (n+1)-th frame, showing how target tracking is performed with the meanshift idea; that is, from the position of the target rectangle in the n-th frame, the position of the tracking rectangle in the (n+1)-th frame is calculated.
First, the target is video-recorded with a camera to obtain a video sequence of images of the target {P_n(x, y) | n = 1, 2, …, N}; the target tracking method comprises the following steps:
Step 1, initializing a target image and manually selecting the initial position of a rectangle A_1 containing the tracked target;
Step 2, denoting the target rectangle of the n-th frame image as A_n and performing background judgment on all pixels of the target rectangle: if a pixel is judged to be background, its indicator function BI_n(x) is recorded as 1, otherwise as 0; the specific steps are as follows:
Step 21, judging the pixels in the edge portion of the target rectangle A_n: as shown in FIG. 2, the target rectangle A_n has size w × d, its 4 edges are the pixel regions to be judged, each edge having width h, with h taking the value 10 in this embodiment; edge positions 1, 2, 3 and 4 are arranged clockwise, with edge position 1 at the top of the target rectangle A_n;
the pixels at the center of the target rectangle A_n are taken by default to belong to the target, i.e. the indicator function BI_n(x) of the center pixels is set directly to 0;
Step 22, starting from the vertex A at the upper-left corner of the target rectangle A_n, selecting a rectangle a of size 3 × 3 with A as its top-left vertex; this rectangle contains 9 pixels, whose gray distribution is fitted with a Gaussian model, calculating the mean μ and variance σ²:
\mu = \frac{1}{9} \sum_{i=1}^{9} \mathrm{gray}(x_i)

\sigma^2 = \frac{1}{9} \sum_{i=1}^{9} \left( \mathrm{gray}(x_i) - \mu \right)^2
where gray(x) represents the gray value of pixel x;
for all pixels in edge position 1 and edge position 4, whether a pixel belongs to the background is judged by calculating its probability under the Gaussian model, using the formula:
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( -\frac{ \left( \mathrm{gray}(x) - \mu \right)^2 }{ 2\sigma^2 } \right)
where f(x) represents the probability that pixel x belongs to the Gaussian model; the indicator function BI_n(x) can thus be calculated by the following formula:
\mathrm{BI}_n(x) = \begin{cases} 1, & f(x) > T \\ 0, & f(x) \le T \end{cases}

where T is a preset probability threshold;
in the manner described above, all pixels in edge positions 1 and 4 can be judged and their corresponding indicator functions BI_n(x) obtained;
Step 23, similarly to Step 22, starting from the vertex B at the upper-right corner of the target rectangle A_n, judging all pixels in edge position 1 and edge position 2; if a pixel in edge position 1 was already judged to be background in Step 22, the judgment of that pixel is omitted;
Step 24, similarly to Step 22, starting from the vertex D at the lower-right corner of the target rectangle A_n, judging the pixels in edge positions 2 and 3; if a pixel in edge position 2 was already judged to be background in Step 23, the judgment of that pixel is omitted;
Step 25, similarly to Step 22, starting from the vertex C at the lower-left corner of the target rectangle A_n, judging the pixels in edge positions 3 and 4; if a pixel in edge position 3 was already judged to be background in Step 24, the judgment of that pixel is omitted;
thus the indicator functions BI_n(x) of all pixels in the target rectangle A_n are obtained by calculation;
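For illustration, the background judgment of Steps 21 to 25 can be sketched in Python as follows, assuming a grayscale image stored as a 2-D numpy array. The function name edge_background_indicator and the decision threshold thresh are assumptions introduced here, not part of the patent; only the judgment from the upper-left vertex A (edge positions 1 and 4) is written out, the other three corners following symmetrically as in Steps 23 to 25.

```python
import numpy as np

def edge_background_indicator(img, rect, h=10, thresh=0.01):
    """Sketch of Steps 21-22: mark edge pixels of the target rectangle as
    background (BI = 1) when they fit a Gaussian model estimated from the
    3x3 patch at the upper-left vertex A. `img` is a 2-D grayscale array,
    `rect` = (x, y, w, d) is the target rectangle A_n, `h` is the edge
    width, and `thresh` is an assumed decision threshold on f(x)."""
    x, y, w, d = rect
    BI = np.zeros((d, w), dtype=np.uint8)   # center pixels default to BI = 0

    # Step 22: fit a Gaussian to the 3x3 rectangle a at vertex A.
    patch = img[y:y + 3, x:x + 3].astype(float)
    mu = patch.mean()                        # mean of the 9 gray values
    sigma2 = patch.var()                     # variance of the 9 gray values
    sigma = max(np.sqrt(sigma2), 1e-6)       # guard against a perfectly flat patch

    sub = img[y:y + d, x:x + w].astype(float)
    # f(x): probability of each gray value under the Gaussian model
    prob = np.exp(-(sub - mu) ** 2 / (2 * sigma2 + 1e-12)) / (np.sqrt(2 * np.pi) * sigma)

    # Edge position 1 (top strip) and edge position 4 (left strip):
    # pixels whose probability under the model is high are background.
    BI[:h, :][prob[:h, :] > thresh] = 1      # edge position 1
    BI[:, :h][prob[:, :h] > thresh] = 1      # edge position 4

    # Steps 23-25 repeat the same procedure from vertices B, D and C,
    # skipping pixels already marked as background.
    return BI
```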
Step 3, for the target rectangle A_n of the n-th frame image, calculating the probability density q_u of the target rectangle using the indicator function BI_n(x) information; specifically:
gray-level information is selected as the feature space of the Mean Shift tracker and its gray histogram is computed; the feature space is divided into 32 bins, each bin being recorded as one feature value of the feature space; let x_0 denote the center position coordinate of the target template region, and let {x_i}, i = 1, …, n, be all pixel positions within the target template region that do not belong to the background, i.e. whose indicator function BI_n(x) equals 0; the probability density function of the target template based on the gray feature u = 1, …, m is calculated as follows:
q_u = C_q \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) \delta\left[ b(x_i) - u \right]
where C_q is the normalization constant of the target template,
C_q = \frac{1}{ \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) }
and k(·) is a kernel function.
The kernel function k(·) is used to account for the influence of occlusion and background interference: pixels close to the target center are assigned larger weights, and pixels far from the center of the target template are assigned smaller weights, so as to distinguish the contributions of pixels at different positions in the target region when estimating the target probability density function:
q_u = C_q \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) \delta\left[ b(x_i) - u \right]
where h is the kernel function bandwidth and δ(x) is the Kronecker delta function, which judges whether the gray value of pixel x_i in the target region belongs to the color index value of the u-th bin: if so, it equals 1, otherwise 0;
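As an illustration of this density computation, the following minimal sketch builds the kernel-weighted 32-bin gray histogram. The Epanechnikov kernel profile and the choice of bandwidth are assumptions (the patent only requires a kernel k(·) with bandwidth h); kernel_histogram is an illustrative name, and mask carries the indicator BI_n(x) from Step 2 so that background pixels are excluded.

```python
import numpy as np

def kernel_histogram(img, rect, mask=None, bins=32):
    """Sketch of q_u: a kernel-weighted, 32-bin gray histogram over the
    non-background pixels of `rect` = (x, y, w, d). Returns (q, b), where
    q[u] is the normalized density and b holds each pixel's bin index."""
    x, y, w, d = rect
    sub = img[y:y + d, x:x + w].astype(float)

    # b(x_i): gray feature index of each pixel (32 bins over 0..255)
    b = np.minimum((sub * bins / 256.0).astype(int), bins - 1)

    # squared normalized distance of each pixel from the region center x_0;
    # scaling by the half-axes plays the role of the bandwidth h (an
    # assumption: the patent only requires a kernel with bandwidth h)
    yy, xx = np.mgrid[0:d, 0:w]
    cy, cx = (d - 1) / 2.0, (w - 1) / 2.0
    r2 = ((yy - cy) / (d / 2.0)) ** 2 + ((xx - cx) / (w / 2.0)) ** 2

    k = np.maximum(1.0 - r2, 0.0)            # Epanechnikov profile, k ∝ 1 - r²

    if mask is not None:                     # drop pixels with BI_n(x) = 1
        k = k * (mask == 0)

    q = np.bincount(b.ravel(), weights=k.ravel(), minlength=bins)
    return q / max(q.sum(), 1e-12), b        # normalization C_q = 1 / Σ k(·)

# The candidate density p_u of Step 4 is obtained with the same routine,
# evaluated on frame n+1 in the rectangle centered at y_0.
```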
Step 4, for the candidate target region of the moving target in the (n+1)-th frame, calculating the probability density p_u of the candidate target region using the target rectangle position y_0 of the n-th frame image; specifically:
the calculation starts from the position of the target template in the previous frame, i.e. the n-th frame image; let the center of the candidate target region be y_0, and let {y_i}, i = 1, …, n, be the pixels in the region corresponding in position to the previous-frame pixels {x_i}, i = 1, …, n; the probability density function of the candidate region can then be obtained in the same manner as that of the target template:
p_u(y_0) = C_p \sum_{i=1}^{n} k\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) \delta\left[ b(y_i) - u \right]
Step 5, calculating the weight ω_i of each pixel in the candidate target region:
\omega_i = \sum_{u=1}^{m} \sqrt{ \frac{q_u}{p_u(y_0)} } \, \delta\left[ b(y_i) - u \right]
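The weight computation of Step 5 then reduces to a table lookup; a minimal sketch, assuming the (q, b) output of the histogram routine above:

```python
import numpy as np

def pixel_weights(q, p, b):
    """Sketch of Step 5: ω_i = sqrt(q_u / p_u(y_0)) for the bin u that pixel i
    falls into; the Kronecker delta in the formula selects exactly one term.
    `q` and `p` are the 32-bin densities, `b` holds each pixel's bin index."""
    ratio = np.sqrt(q / np.maximum(p, 1e-12))   # guard against empty bins
    return ratio[b]                             # per-pixel weight lookup
```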
Step 6, calculating the new position y_new of the candidate target region; specifically:
the similarity between the histograms of the target template and of the candidate target region is measured by the Bhattacharyya coefficient; following the principle of maximizing the similarity of the two histograms, the search window is moved toward the true position of the target along the direction of maximum density increase;
where q_u is the target template and p_u the candidate target template, the Bhattacharyya coefficient is defined as follows:
\rho(y_0) = \rho\left[ p(y_0), q \right] = \sum_{u=1}^{m} \sqrt{ p_u(y_0) \, q_u }
will be provided with
Figure BDA0001318308310000084
And obtaining an updating formula of the center position of the candidate target area by derivation after Taylor series expansion:
y_{\mathrm{new}} = \frac{ \sum_{i=1}^{n} y_i \, \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }{ \sum_{i=1}^{n} \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }
where g(x) = −k′(x) and ω_i is the weight of each pixel;
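The Bhattacharyya coefficient of Step 6 is a direct sum over bins; a minimal sketch:

```python
import numpy as np

def bhattacharyya(p, q):
    """Similarity between the candidate density p_u(y_0) and template q_u."""
    return float(np.sum(np.sqrt(p * q)))
```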
Step 7, if ‖y_0 − y_new‖ < ε or the number of iterations exceeds a threshold, stopping the iteration; otherwise setting y_0 = y_new and returning to Step 4, continuing the iterative computation until the candidate target position satisfies the termination condition.
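Putting Steps 4 to 7 together, the center update and stopping test can be sketched as follows. With the Epanechnikov profile assumed above, g(x) = −k′(x) is constant on the kernel support, so y_new reduces to the weighted centroid of the candidate pixels; the interface of hist_fn matches the histogram sketch above, and all names, as well as the boundary handling left out here, are illustrative assumptions.

```python
import numpy as np

def meanshift_iterate(frame, rect_size, y0, q, hist_fn, eps=0.5, max_iters=20):
    """Sketch of Steps 4-7: move the center of the candidate rectangle until
    ||y_0 - y_new|| < eps or max_iters is reached. `rect_size` = (w, d),
    `y0` = (cx, cy) is the current center estimate, `q` the template density,
    and `hist_fn` the kernel-histogram routine sketched above (assumed to
    return (p, b)). Image-boundary handling is omitted for brevity."""
    w, d = rect_size
    y0 = np.asarray(y0, dtype=float)
    for _ in range(max_iters):
        x = int(round(y0[0] - w / 2.0))
        y = int(round(y0[1] - d / 2.0))
        p, b = hist_fn(frame, (x, y, w, d))           # Step 4: p_u(y_0)
        wgt = np.sqrt(q / np.maximum(p, 1e-12))[b]    # Step 5: pixel weights ω_i
        # Step 6: with the Epanechnikov profile, g(x) = -k'(x) is constant on
        # the kernel support, so y_new is the weighted centroid of the pixels.
        yy, xx = np.mgrid[y:y + d, x:x + w]
        inside = (((yy - y0[1]) / (d / 2.0)) ** 2 +
                  ((xx - y0[0]) / (w / 2.0)) ** 2) <= 1.0
        wgt = wgt * inside
        total = max(wgt.sum(), 1e-12)
        y_new = np.array([(wgt * xx).sum() / total, (wgt * yy).sum() / total])
        if np.linalg.norm(y0 - y_new) < eps:          # Step 7: stopping test
            return y_new
        y0 = y_new                                    # set y_0 = y_new, repeat
    return y0
```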
An ordinary Meanshift tracker directly uses all pixels in the target frame in the subsequent calculation, without considering whether they belong to the background. In contrast, the target tracking method based on meanshift of the present invention judges whether the pixels in the target frame belong to the background, and pixels judged to be background do not participate in subsequent calculation, so that the real moving target is modeled more accurately and the tracking effect is improved.
The present invention is not limited to the above embodiments; other target tracking methods based on meanshift obtained by methods identical or similar to those of the above embodiments also fall within the protection scope of the present invention.

Claims (5)

1. A target tracking method based on meanshift, wherein a target is video-recorded with a camera to obtain a video sequence of images of the target, the target tracking method comprising the following steps:
Step 1, initializing a target image and selecting the initial position of a rectangle A_1 containing the tracked target;
Step 2, denoting the target rectangle of the n-th frame image as A_n and performing background judgment on all pixels of the target rectangle: if a pixel is judged to be background, its indicator function BI_n(x) is recorded as 1, otherwise as 0;
Step 3, for the pixels whose indicator function BI_n(x) is recorded as 0, calculating the probability density q_u of the target rectangle A_n of the n-th frame image using the indicator function BI_n(x) information;
Step 4, for the candidate target region of the moving target in the (n+1)-th frame, calculating the probability density p_u of the candidate target region using the target rectangle position y_0 of the n-th frame image;
Step 5, calculating the weight ω_i of each pixel in the candidate target region;
Step 6, calculating the new position y_new of the candidate target region;
Step 7, if ‖y_0 − y_new‖ < ε or the number of iterations exceeds a threshold, stopping the iteration; otherwise setting y_0 = y_new and returning to Step 4, continuing the iterative computation until the candidate target position satisfies the termination condition;
in Step 2, the target rectangle of the n-th frame image is denoted A_n, background judgment is performed on all pixels of the target rectangle, and the indicator function BI_n(x) is recorded as 1 for pixels judged to be background and as 0 otherwise; the specific steps are as follows:
Step 21, judging the pixels in the edge portion of the target rectangle A_n: the target rectangle A_n has size w × d, its 4 edges are the pixel regions to be judged, each edge having width h; edge positions 1, 2, 3 and 4 are arranged clockwise, with edge position 1 at the top of the target rectangle A_n;
the pixels at the center of the target rectangle A_n are taken by default to belong to the target, i.e. the indicator function BI_n(x) of the center pixels is set directly to 0;
Step 22, starting from the vertex A at the upper-left corner of the target rectangle A_n, selecting a rectangle a of size 3 × 3 with A as its top-left vertex; this rectangle contains 9 pixels, whose gray distribution is fitted with a Gaussian model, calculating the mean μ and variance σ²:
\mu = \frac{1}{9} \sum_{i=1}^{9} \mathrm{gray}(x_i)

\sigma^2 = \frac{1}{9} \sum_{i=1}^{9} \left( \mathrm{gray}(x_i) - \mu \right)^2
where gray(x) represents the gray value of pixel x;
for all pixels in edge position 1 and edge position 4, whether a pixel belongs to the background is judged by calculating its probability under the Gaussian model, using the formula:
f(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left( -\frac{ \left( \mathrm{gray}(x) - \mu \right)^2 }{ 2\sigma^2 } \right)
where f(x) represents the probability that pixel x belongs to the Gaussian model; the indicator function BI_n(x) is calculated using the following formula:
\mathrm{BI}_n(x) = \begin{cases} 1, & f(x) > T \\ 0, & f(x) \le T \end{cases}

where T is a preset probability threshold;
in this way, all pixels in edge positions 1 and 4 are judged to obtain the corresponding indicator functions BI_n(x);
Step 23, similarly to Step 22, starting from the vertex B at the upper-right corner of the target rectangle A_n, judging all pixels in edge position 1 and edge position 2; if a pixel in edge position 1 was already judged to be background in Step 22, the judgment of that pixel is omitted;
Step 24, similarly to Step 22, starting from the vertex D at the lower-right corner of the target rectangle A_n, judging the pixels in edge positions 2 and 3; if a pixel in edge position 2 was already judged to be background in Step 23, the judgment of that pixel is omitted;
Step 25, similarly to Step 22, starting from the vertex C at the lower-left corner of the target rectangle A_n, judging the pixels in edge positions 3 and 4; if a pixel in edge position 3 was already judged to be background in Step 24, the judgment of that pixel is omitted;
thus the indicator functions BI_n(x) of all pixels in the target rectangle A_n are obtained by calculation.
2. The meanshift-based target tracking method of claim 1, wherein in Step 3, for the target rectangle A_n of the n-th frame image, the probability density q_u of the target rectangle is calculated using the indicator function BI_n(x) information, specifically:
gray-level information is selected as the feature space of the Mean Shift tracker and its gray histogram is computed; the feature space is divided into 32 bins, each bin being recorded as one feature value of the feature space; let x_0 denote the center position coordinate of the target template region, and let {x_i}, i = 1, …, n, be all pixel positions within the target template region that do not belong to the background, i.e. whose indicator function BI_n(x) equals 0; the formula for calculating the probability density function of the target template based on the gray feature u = 1, …, m is:
q_u = C_q \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) \delta\left[ b(x_i) - u \right]
where C_q is the normalization constant of the target template,
C_q = \frac{1}{ \sum_{i=1}^{n} k\left( \left\| \frac{x_0 - x_i}{h} \right\|^2 \right) }
and k(·) is the kernel function and h is the kernel function bandwidth.
3. The meanshift-based target tracking method of claim 2, wherein in Step 4, for the candidate target region of the moving target in the (n+1)-th frame, the probability density p_u of the candidate target region is calculated using the target rectangle position y_0 of the n-th frame image, specifically:
the calculation starts from the position of the target template in the previous frame, i.e. the n-th frame image; let the center of the candidate target region be y_0, and let {y_i}, i = 1, …, n, be the pixels in the region corresponding in position to the previous-frame pixels {x_i}, i = 1, …, n; the probability density function of the candidate region is obtained in the same manner as that of the target template:
p_u(y_0) = C_p \sum_{i=1}^{n} k\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) \delta\left[ b(y_i) - u \right]
4. The meanshift-based target tracking method of claim 3, wherein in Step 5 the weight ω_i of each pixel in the candidate target region is calculated:
\omega_i = \sum_{u=1}^{m} \sqrt{ \frac{q_u}{p_u(y_0)} } \, \delta\left[ b(y_i) - u \right]
5. The meanshift-based target tracking method of claim 4, wherein in Step 6 the new position y_new of the candidate target region is calculated, specifically:
the similarity between the histograms of the target template and of the candidate target region is measured by the Bhattacharyya coefficient; following the principle of maximizing the similarity of the two histograms, the search window is moved toward the true position of the target along the direction of maximum density increase;
where q_u is the target template and p_u the candidate target template, the Bhattacharyya coefficient is:
\rho(y_0) = \rho\left[ p(y_0), q \right] = \sum_{u=1}^{m} \sqrt{ p_u(y_0) \, q_u }
will be provided with
Figure FDA0002363982840000042
And obtaining an updating formula of the center position of the candidate target area by derivation after Taylor series expansion:
y_{\mathrm{new}} = \frac{ \sum_{i=1}^{n} y_i \, \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }{ \sum_{i=1}^{n} \omega_i \, g\left( \left\| \frac{y_0 - y_i}{h} \right\|^2 \right) }
where g(x) = −k′(x) and ω_i is the weight of each pixel.
CN201710434697.7A 2017-06-09 2017-06-09 Target tracking method based on meanshift Active CN107392936B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710434697.7A CN107392936B (en) 2017-06-09 2017-06-09 Target tracking method based on meanshift

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710434697.7A CN107392936B (en) 2017-06-09 2017-06-09 Target tracking method based on meanshift

Publications (2)

Publication Number Publication Date
CN107392936A CN107392936A (en) 2017-11-24
CN107392936B true CN107392936B (en) 2020-06-05

Family

ID=60332350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710434697.7A Active CN107392936B (en) 2017-06-09 2017-06-09 Target tracking method based on meanshift

Country Status (1)

Country Link
CN (1) CN107392936B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110458862A (en) * 2019-05-22 2019-11-15 西安邮电大学 A kind of motion target tracking method blocked under background
CN111275740B (en) * 2020-01-19 2021-10-22 武汉大学 Satellite video target tracking method based on high-resolution twin network


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101727570A (en) * 2008-10-23 2010-06-09 华为技术有限公司 Tracking method, track detection processing unit and monitor system
CN101783015A (en) * 2009-01-19 2010-07-21 北京中星微电子有限公司 Equipment and method for tracking video
CN102270346A (en) * 2011-07-27 2011-12-07 宁波大学 Method for extracting target object from interactive video
CN103366163A (en) * 2013-07-15 2013-10-23 北京丰华联合科技有限公司 Human face detection system and method based on incremental learning
CN104077779A (en) * 2014-07-04 2014-10-01 中国航天科技集团公司第五研究院第五一三研究所 Moving object statistical method with Gaussian background model and mean value shift tracking combined

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Research on Moving Target Detection and Tracking Methods Based on Panoramic Vision; Zhen Jinglei; China Master's Theses Full-text Database, Information Science and Technology; 2010-06-15 (No. 6); pp. I138-491, Sections 5.3-5.4 *
Mean Shift Target Tracking Algorithm Based on a Discriminative Sequence Table; Jiang Liangwei et al.; Journal of Huazhong University of Science and Technology (Natural Science Edition); 2011-11-30; Vol. 39 (Supplement II); pp. 204-219 *
Research on Moving Target Trajectory Tracking Technology Based on Machine Vision; Guan Chunmiao; China Master's Theses Full-text Database, Information Science and Technology; 2016-02-15 (No. 2); p. I138-1552 *

Also Published As

Publication number Publication date
CN107392936A (en) 2017-11-24

Similar Documents

Publication Publication Date Title
US11763485B1 (en) Deep learning based robot target recognition and motion detection method, storage medium and apparatus
CN110276264B (en) Crowd density estimation method based on foreground segmentation graph
CN108846854B (en) Vehicle tracking method based on motion prediction and multi-feature fusion
CN108198201A (en) A kind of multi-object tracking method, terminal device and storage medium
CN108229475B (en) Vehicle tracking method, system, computer device and readable storage medium
CN107862698A (en) Light field foreground segmentation method and device based on K mean cluster
CN105740945A (en) People counting method based on video analysis
CN112991193B (en) Depth image restoration method, device and computer-readable storage medium
CN107507222B (en) Anti-occlusion particle filter target tracking method based on integral histogram
CN110006444B (en) Anti-interference visual odometer construction method based on optimized Gaussian mixture model
Huang et al. Image-guided non-local dense matching with three-steps optimization
CN113379789B (en) Moving target tracking method in complex environment
CN108596947B (en) Rapid target tracking method suitable for RGB-D camera
CN108765463B (en) Moving target detection method combining region extraction and improved textural features
CN110717934A (en) Anti-occlusion target tracking method based on STRCF
CN115375733A (en) Snow vehicle sled three-dimensional sliding track extraction method based on videos and point cloud data
CN107392936B (en) Target tracking method based on meanshift
CN113643365A (en) Camera pose estimation method, device, equipment and readable storage medium
CN113689459B (en) Real-time tracking and mapping method based on GMM and YOLO under dynamic environment
Sun et al. Adaptive image dehazing and object tracking in UAV videos based on the template updating Siamese network
Xu et al. Head tracking using particle filter with intensity gradient and color histogram
CN103337082A (en) Video segmentation method based on statistical shape prior
CN113516713A (en) Unmanned aerial vehicle self-adaptive target tracking method based on pseudo twin network
CN115035326B (en) Radar image and optical image accurate matching method
CN116883897A (en) Low-resolution target identification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20200706

Address after: 230000 west side of Xianghe North Road, Feidong Economic Development Zone, Feidong County, Hefei City, Anhui Province

Patentee after: ANHUI GUANGZHEN PHOTOELECTRIC TECHNOLOGY Co.,Ltd.

Address before: 523000 Guangdong province Dongguan Yinxing Industrial Zone Qingxi Town Guangdong light array photoelectric technology Co. Ltd.

Patentee before: GUANGDONG LITE ARRAY Co.,Ltd.
