CN106127808A - Anti-occlusion block particle filter target tracking method based on color and local binary pattern feature fusion - Google Patents
Anti-occlusion block particle filter target tracking method based on color and local binary pattern feature fusion - Download PDF | Info
- Publication number
- CN106127808A CN106127808A CN201610454063.3A CN201610454063A CN106127808A CN 106127808 A CN106127808 A CN 106127808A CN 201610454063 A CN201610454063 A CN 201610454063A CN 106127808 A CN106127808 A CN 106127808A
- Authority
- CN
- China
- Prior art keywords
- feature
- particle
- target
- color
- local binary
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20024—Filtering details
Landscapes
- Image Analysis (AREA)
Abstract
Description
Technical Field
The invention relates to the fields of image processing, video processing, and object tracking, and in particular to video-based object tracking.
Background Art
Occlusion is a common difficulty in video object tracking. While moving, a target may encounter several kinds of occlusion: self-occlusion caused by rotation or articulation, mutual occlusion between the target and other pedestrians, and occlusion by obstacles in the surrounding environment. When the target is occluded, extraction of its feature information is disturbed, so the target features are captured incompletely or not at all, which ultimately makes tracking inaccurate or even loses the target.
Particle filtering is an algorithm based on Monte Carlo simulation and Bayesian estimation. It represents the posterior probability density of the state with a set of weighted random samples ("particles") in the state space, and iteratively derives new samples and new weights by Bayesian updating to obtain the posterior density of the state at the next time step. Particle filtering is applicable to target tracking in nonlinear, non-Gaussian systems. In principle, a particle filter tracker both has the potential to handle occlusion and provides a natural framework for feature fusion.
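The weighted-sample recursion described above can be sketched as a minimal bootstrap particle filter. The random-walk motion model, the Gaussian likelihood, and the fixed random seed below are illustrative assumptions for the sketch, not the patent's formulas.

```python
import numpy as np

def particle_filter_step(particles, weights, measurement, motion_std=2.0, obs_std=5.0):
    """One predict-update-estimate-resample cycle of a bootstrap particle filter.

    particles: (N, 2) array of x, y positions; weights: (N,) normalized weights.
    The random-walk motion model and Gaussian likelihood are illustrative choices.
    """
    rng = np.random.default_rng(0)
    # Predict: diffuse particles with the motion model.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: reweight by the likelihood of the measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2 * obs_std ** 2))
    weights = weights / np.sum(weights)
    # Estimate: weighted mean of the particle positions.
    estimate = np.sum(particles * weights[:, None], axis=0)
    # Systematic resampling to avoid weight degeneracy.
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    particles = particles[idx]
    weights = np.full(n, 1.0 / n)
    return particles, weights, estimate
```

After each step the estimate is pulled from the prior particle cloud toward the measurement, and the resampled cloud concentrates around high-likelihood regions.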
Traditional particle filter trackers extract only a color histogram for tracking. The color histogram is a global feature that ignores the target's local structure, so once occlusion occurs the global color feature can no longer describe the target accurately.
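The local binary pattern (LBP) feature that the invention pairs with color can be computed as follows. This is the basic 8-neighbor 3x3 LBP; the 32-bin histogram binning that the patent applies afterwards is an additional step not shown here, and the neighbor ordering is an assumption of this sketch.

```python
import numpy as np

def lbp_image(img):
    """Basic 3x3 local binary pattern: each interior pixel is encoded by
    thresholding its 8 neighbors against the center pixel, clockwise from
    the top-left, giving a code in 0..255."""
    img = img.astype(np.int32)
    c = img[1:-1, 1:-1]                       # centers (interior pixels only)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy: img.shape[0] - 1 + dy, 1 + dx: img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit
    return code
```

Because the code depends only on sign comparisons with the center, it is robust to monotonic illumination changes, which is why it complements the color histogram.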
Summary of the Invention
To overcome the single-feature limitation of existing video target tracking methods and their poor tracking performance, or outright loss of the target, under occlusion, the present invention proposes an anti-occlusion particle filter target tracking method based on the fusion of color and local binary pattern features. The method fuses the color feature and the local binary pattern feature additively through certainty coefficients, according to how well each feature discriminates the target from the background, so that the target is described more effectively. During tracking, the occlusion situation is judged in real time and a tracking mechanism matched to each occlusion situation is adopted, improving the stability and robustness of target tracking under occlusion.
The technical solution adopted by the present invention to solve this problem is:
An anti-occlusion particle filter target tracking method based on the fusion of color and local binary pattern features, comprising the following steps:
Step 1: initialize the target.
Step 2: extract the color integral histogram and local binary pattern (LBP) integral histogram features of the region of interest.
Step 3: compute the feature certainty coefficients of the color and LBP features. Compute the color histogram feature and LBP histogram feature of each particle's rectangle and of each particle's background region; compute the likelihood ratio between each particle's color feature and the color feature of its background region, derive from it the discrimination between the particle's color feature and the background color feature, and from that discrimination compute the certainty coefficient of the particle's color feature; likewise compute the likelihood ratio between each particle's LBP feature and the LBP feature of its background region, derive the discrimination between them, and from it compute the certainty coefficient of the particle's LBP feature.
Step 4: select the tracking method according to the current target state. If the state is normal, track with the particle filter fusing color and LBP features; if the target is partially occluded, track with the block-wise particle filter fusing color and LBP features; if the target is severely occluded, predict the target position by least squares.
Step 5: update the current target state.
Step 6: when the target is in the normal state, update the target's color feature template and LBP feature template as well as the sub-blocks' color and LBP feature templates.
Step 7: resample the particles with systematic resampling.
Step 8: particle propagation: diffuse the resampled particles in the x and y directions to obtain new particles, used as the initial particle distribution in the next frame.
Further, in step 1 the target is initialized as follows: in the first frame the target is selected manually; the height of the tracking box is denoted height, its width width, and the target center is (x1, y1). The color histogram and local binary pattern (LBP) features of the target region are extracted to initialize the target color feature template H = (h1, h2, ..., hn) and LBP feature template G = (g1, g2, ..., gn), where the histograms have n = 32 bins. The target height is divided into three equal horizontal sub-blocks, numbered 1-3 from top to bottom, and the target width into three equal vertical sub-blocks, numbered 4-6 from left to right; the color histogram and LBP features of each sub-block are extracted to initialize the sub-block color feature templates Hi = (h'1, h'2, ..., h'n) and sub-block LBP feature templates Gi = (g'1, g'2, ..., g'n) (i = 1, 2, ..., 6). The particle count p and the particle positions (p_xj, p_yj) (j = 1, 2, ..., p) are initialized, the target state flag Flag is set to 0, and the state flag of every sub-block is set to 0.
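The sub-block division of step 1 can be sketched directly. The (left, top, w, h) box convention and the function name are assumptions of this sketch; the 3 horizontal plus 3 vertical strips follow the description.

```python
def subblocks(x, y, width, height):
    """Split a target box centered at (x, y) into the six sub-blocks used by
    the method: three equal horizontal strips (blocks 1-3, top to bottom)
    and three equal vertical strips (blocks 4-6, left to right).
    Each box is returned as (left, top, w, h)."""
    left, top = x - width / 2, y - height / 2
    horizontal = [(left, top + i * height / 3, width, height / 3) for i in range(3)]
    vertical = [(left + i * width / 3, top, width / 3, height) for i in range(3)]
    return horizontal + vertical
```

Note that the two groups of strips overlap: every pixel of the target belongs to exactly one horizontal strip and one vertical strip, so a partial occlusion from any side leaves some strips intact.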
Further, in step 2 the color integral histogram and LBP integral histogram features of the region of interest are extracted as follows: read the k-th frame image P. The region of interest is the smallest rectangle covering the background regions of all particles. The background region of a particle is a rectangular ring: a larger rectangle centered at the particle position (sized so that, after subtracting the target rectangle, the ring contains the same number of pixels as the target region, cf. the embodiment) minus the target rectangle, where height and width are the height and width of the tracking box. The coordinates of the four vertices A, B, C, D of the region of interest are obtained from the particle coordinates (p_x, p_y) with the min() and max() functions, taking the extremes over all particles. The integral histogram Hin(x, y) of the color feature over the rectangle of interest ABCD is then computed, i.e., the color histogram of all points in the rectangle from image point P(xA, yA) to point P(x, y); likewise the integral histogram Gin(x, y) of the LBP feature, i.e., the LBP histogram of all points in the rectangle from P(xA, yA) to P(x, y).
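An integral histogram as described in step 2 can be sketched as follows; the (top, left, bottom, right) inclusive-rectangle convention and the uint8 binning are assumptions of this sketch.

```python
import numpy as np

def integral_histogram(img, bins=32):
    """Integral histogram of a single-channel uint8 image.

    ih[y, x, b] counts the pixels of bin b inside the inclusive rectangle
    from (0, 0) to (x, y), so any rectangular histogram can later be read
    off with four lookups (inclusion-exclusion)."""
    h, w = img.shape
    binned = (img.astype(np.int32) * bins) // 256   # bin index per pixel
    onehot = np.zeros((h, w, bins), dtype=np.int32)
    ys, xs = np.indices((h, w))
    onehot[ys, xs, binned] = 1
    return onehot.cumsum(axis=0).cumsum(axis=1)

def rect_histogram(ih, top, left, bottom, right):
    """Histogram of img[top:bottom+1, left:right+1] from the integral histogram."""
    total = ih[bottom, right].copy()
    if top > 0:
        total -= ih[top - 1, right]
    if left > 0:
        total -= ih[bottom, left - 1]
    if top > 0 and left > 0:
        total += ih[top - 1, left - 1]
    return total
```

Building the integral histogram once makes every per-particle rectangle histogram a constant-time query, which is what makes evaluating hundreds of particles per frame affordable.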
Further, in step 3 the feature certainty coefficients of the color and LBP features are computed as follows. Using the integral histograms, for each of the p particles j (j = 1, 2, ..., p) extract the color histogram feature HPj = (hp1, hp2, ..., hpn) and the LBP histogram feature GPj = (gp1, gp2, ..., gpn) (n = 32 bins) of the rectangle of width width and height height centered at the particle; the four vertices A', B', C', D' of this rectangle are placed around the particle coordinates (p_xj, p_yj). From the integral histograms,
HPj = Hin(xA', yA') - Hin(xC', yC'-1) - Hin(xB'-1, yB') + Hin(xD'-1, yD'-1),
GPj = Gin(xA', yA') - Gin(xC', yC'-1) - Gin(xB'-1, yB') + Gin(xD'-1, yD'-1),
where Hin(x, y) is the integral histogram of the color feature over the rectangle of interest and Gin(x, y) is the integral histogram of the LBP feature. Likewise extract the color histogram feature BG_HPj = (bh1, bh2, ..., bhn) and the LBP histogram feature BG_GPj = (bg1, bg2, ..., bgn) of the background region of particle j: the background region is the rectangular ring obtained by subtracting the target rectangle from a larger rectangle centered at the particle position, with outer vertices E', F', G', H'. Then
BG_HPj = Hin(xE', yE') - Hin(xG', yG'-1) - Hin(xF'-1, yF') + Hin(xH'-1, yH'-1) - HPj,
BG_GPj = Gin(xE', yE') - Gin(xG', yG'-1) - Gin(xF'-1, yF') + Gin(xH'-1, yH'-1) - GPj.
Compute the likelihood ratio between the color feature of particle j and the color feature of its background region (with smoothing constant ε = 0.001); with T0 the feature discrimination threshold, derive from the likelihood ratio the discrimination between the color feature of particle j and the background color feature, and from that discrimination the feature certainty coefficient β_Cj of the color feature of particle j. In the same way, compute the likelihood ratio between the LBP feature of particle j and the LBP feature of its background region (ε = 0.001), the discrimination between them, and the feature certainty coefficient β_Lj of the LBP feature of particle j.
Further, in step 4, when the target state is normal, particle filter tracking with fused color and LBP features proceeds as follows: for each of the p particles j (j = 1, 2, ..., p), compute the Bhattacharyya coefficient between the particle's color feature HPj = (hp1, hp2, ..., hpn) and the target color feature template H = (h1, h2, ..., hn) (n = 32) and the corresponding Bhattacharyya distance; from the distance, compute the color feature weight of particle j with σ = 0.05, and normalize the color feature weights over all particles. In the same way, compute the Bhattacharyya coefficient and distance between the particle's LBP feature GPj = (gp1, gp2, ..., gpn) and the target LBP feature template G = (g1, g2, ..., gn), compute the LBP feature weight of particle j (σ = 0.05), and normalize the LBP feature weights. With the certainty coefficient β_Cj of the color feature and β_Lj of the LBP feature obtained in the previous step, form the fused color-and-LBP weight of each particle (if the certainty coefficients satisfy β_Cj = β_Lj = 0, set β_Cj = β_Lj = 0.5), normalize the fused weights, and obtain the target center of the current (k-th) frame as the weighted sum of the particle coordinates.
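A sketch of the normal-state weighting under stated assumptions: the Bhattacharyya coefficient/distance and the Gaussian kernel with σ = 0.05 are the standard definitions the text names, and the convex combination w_j = β_C·w_C + β_L·w_L is a reconstruction of the "additive fusion through certainty coefficients" description, since the original formula images are not reproduced in this copy.

```python
import numpy as np

def bhattacharyya_weight(hists, template, sigma=0.05):
    """Gaussian-kernel weight from the Bhattacharyya distance between each
    normalized histogram in `hists` and the template histogram."""
    rho = np.sum(np.sqrt(hists * template), axis=-1)   # Bhattacharyya coefficient
    d = np.sqrt(np.clip(1.0 - rho, 0.0, None))         # Bhattacharyya distance
    return np.exp(-d ** 2 / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def fused_weights(color_hists, lbp_hists, H, G, beta_c, beta_l, sigma=0.05):
    """Per-particle fused weights: beta_c * w_color + beta_l * w_lbp,
    with the 0.5/0.5 fallback when both certainty coefficients are zero."""
    wc = bhattacharyya_weight(color_hists, H, sigma)
    wl = bhattacharyya_weight(lbp_hists, G, sigma)
    wc, wl = wc / wc.sum(), wl / wl.sum()
    both_zero = (beta_c == 0) & (beta_l == 0)
    beta_c = np.where(both_zero, 0.5, beta_c)
    beta_l = np.where(both_zero, 0.5, beta_l)
    w = beta_c * wc + beta_l * wl
    return w / w.sum()
```

A particle whose histograms match the templates gets Bhattacharyya distance near 0 and hence a sharply larger weight than a mismatched particle, which concentrates the state estimate on the target.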
Alternatively, in step 4, when the target is partially occluded, block-wise particle filter tracking with fused color and LBP features proceeds as follows: using the sub-block state flags detected from the occlusion of the target in the previous frame, for each of the p particles j (j = 1, 2, ..., p) compute the color histogram feature HPj_i and the LBP histogram feature GPj_i of each valid sub-block i. Compare the color feature of each valid sub-block of particle j with the corresponding sub-block color feature template Hi and compute the Bhattacharyya coefficient of each valid sub-block i; take the mean over the valid sub-blocks (M of them) as the Bhattacharyya coefficient of the whole particle j, obtain the Bhattacharyya distance, compute the color feature weight of each particle, and normalize the color feature weights. Then compare the LBP features of the valid sub-blocks of particle j with the corresponding sub-block LBP feature templates Gi in the same way: average the sub-block Bhattacharyya coefficients over the M valid sub-blocks, obtain the Bhattacharyya distance, compute the LBP feature weight of each particle, and normalize the LBP feature weights. With the certainty coefficients β_Cj of the color feature and β_Lj of the LBP feature obtained in the previous step, form the fused color-and-LBP weight of each particle (if β_Cj = β_Lj = 0, set β_Cj = β_Lj = 0.5), normalize the fused weights, and obtain the target center of the current (k-th) frame as the weighted sum of the particle coordinates.
Alternatively, in step 4, when the target is severely occluded its position is predicted by least squares: from the target center coordinates (xt, yt) (t = 1, 2, ..., k-1) of all previous frames, fit the linear models xt ≈ a1·t + b1 and yt ≈ a2·t + b2 and solve the resulting least-squares equations for the coefficients a1, a2, b1, b2; the target center (xk, yk) in the k-th frame is then computed from the formulas xk = a1·k + b1, yk = a2·k + b2.
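The least-squares prediction for a severely occluded target can be sketched as follows; the helper name and the use of `np.polyfit` to solve the linear fit are assumptions of this sketch.

```python
import numpy as np

def predict_center(history, k):
    """Predict the frame-k target center by fitting x_t = a1*t + b1 and
    y_t = a2*t + b2 with ordinary least squares.

    history: list of (x, y) centers for frames t = 1 .. k-1."""
    t = np.arange(1, len(history) + 1, dtype=float)
    xs, ys = np.array(history, dtype=float).T
    a1, b1 = np.polyfit(t, xs, 1)   # slope and intercept for x
    a2, b2 = np.polyfit(t, ys, 1)   # slope and intercept for y
    return a1 * k + b1, a2 * k + b2
```

The linear model amounts to assuming roughly constant velocity while the target is invisible, which is why the method only falls back to it when too few sub-blocks survive the occlusion.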
Further, in step 5 the target state is updated as follows: with the current-frame (k-th) target center (xk, yk) obtained in the previous steps, compute the color histogram feature Hacc = (h1', h2', ..., hn') (n = 32) of the target region in the current frame, and let B denote the similarity between the current-frame target color feature and the color feature template H = (h1, h2, ..., hn). Let T1 be the overall similarity threshold of the target. If B is greater than or equal to T1, the target is in the normal state in the current frame: if the target state flag Flag equals 0 it stays unchanged, otherwise it is reset to 0, indicating that the target has emerged from occlusion. If B is less than T1, the target is occluded in the current frame: extract the color histogram feature Hacc_i of each sub-block i (i = 1, 2, ..., 6) of the region at (xk, yk) and compute the similarity Bi between each sub-block's color feature and the corresponding sub-block color feature template Hi. Let T2 be the sub-block color feature similarity threshold: if Bi is less than T2, sub-block i is invalid and its state flag is set to 0; if Bi is greater than or equal to T2, sub-block i is valid and its state flag is set to 1. Count the number M of valid sub-blocks and judge the severity of the occlusion from it: if M is greater than 2, the target is partially occluded in the current frame and Flag is set to 1; if M is less than or equal to 2, the target is severely occluded and Flag is set to 2.
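The state-update decision of step 5 reduces to a few threshold comparisons. The threshold values below follow the embodiment (T1 = 0.8, T2 = 0.9); returning all-valid sub-block flags in the normal branch is an assumption of this sketch.

```python
def update_state(B, sub_sims, T1=0.8, T2=0.9):
    """Occlusion-state decision. Returns (flag, sub_flags) where flag is
    0 = normal, 1 = partially occluded, 2 = severely occluded.

    B: overall similarity between the current color feature and the template;
    sub_sims: similarities of the six sub-blocks to their templates."""
    if B >= T1:
        return 0, [1] * len(sub_sims)            # normal: all sub-blocks usable
    sub_flags = [1 if b >= T2 else 0 for b in sub_sims]
    m = sum(sub_flags)                           # number of valid sub-blocks M
    return (1 if m > 2 else 2), sub_flags
```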
In step 6, the templates are updated as follows: let H be the target color feature template and Hacc the color histogram feature of the target's new coordinate region in the current frame; the template update formula is H = αH + (1 - α)Hacc, where 0.80 ≤ α ≤ 0.99 and the specific value of α is set according to the video. The target's LBP feature template, the sub-block color feature templates, and the sub-block LBP feature templates are updated in the same way.
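The update formula H = αH + (1 - α)Hacc is a one-line exponential moving average; α = 0.9 below matches the value used in the embodiment.

```python
import numpy as np

def update_template(template, current, alpha=0.9):
    """Exponential template update H = alpha*H + (1 - alpha)*H_acc (step 6).
    Applied identically to color, LBP, and sub-block templates."""
    return alpha * np.asarray(template) + (1 - alpha) * np.asarray(current)
```

A large α keeps the template stable against noise; a smaller α adapts faster to appearance change, which is why the patent restricts α to 0.80-0.99 and updates only in the normal (unoccluded) state.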
The beneficial effects of the present invention are mainly as follows: extracting the color and local binary pattern features of the target and fusing them additively through certainty coefficients describes the target features more effectively; judging occlusion in real time and adopting different tracking mechanisms for different occlusion situations improves the stability and robustness of target tracking under occlusion.
Description of the Drawings
Fig. 1 is a flow chart of the anti-occlusion particle filter target tracking method based on color and local binary pattern feature fusion of the present invention.
Fig. 2 is a schematic diagram of the block division method.
Fig. 3 is a schematic diagram of the particle target region and background region.
Fig. 4 is a schematic diagram of the region of interest.
Fig. 5 is a schematic diagram of severe occlusion.
Fig. 6 shows the target tracking results on the test video: (a) results of the traditional particle filter target tracking method ((a)-1 through (a)-8 are frames 15, 28, 45, 63, 92, 102, 113, and 142); (b) results of the proposed anti-occlusion particle filter target tracking method based on color and local binary pattern feature fusion ((b)-1 through (b)-8 are the same frames).
Detailed Description
The present invention is further described below with reference to the drawings and an embodiment.
Referring to Figs. 1-6, an anti-occlusion particle filter target tracking method based on the fusion of color and local binary pattern features comprises the following steps:
Step 1: initialize the target.
Step 2: extract the color integral histogram and local binary pattern (LBP) integral histogram features of the region of interest.
Step 3: compute the feature certainty coefficients of the color and LBP features. Compute the color histogram feature and LBP histogram feature of each particle's rectangle and of each particle's background region; compute the likelihood ratio between each particle's color feature and the color feature of its background region, derive from it the discrimination between the particle's color feature and the background color feature, and from that discrimination compute the certainty coefficient of the particle's color feature; likewise compute the likelihood ratio between each particle's LBP feature and the LBP feature of its background region, derive the discrimination between them, and from it compute the certainty coefficient of the particle's LBP feature.
Step 4: select the tracking method according to the current target state. If the state is normal, track with the particle filter fusing color and LBP features; if the target is partially occluded, track with the block-wise particle filter fusing color and LBP features; if the target is severely occluded, predict the target position by least squares.
Step 5: update the current target state.
Step 6: when the target is in the normal state, update the target's color feature template and LBP feature template as well as the sub-blocks' color and LBP feature templates.
Step 7: resample the particles with systematic resampling.
Step 8: particle propagation: diffuse the resampled particles in the x and y directions to obtain new particles, used as the initial particle distribution in the next frame.
This embodiment is tested on a video from the CAVIAR video library, an MPEG-2 compressed MPG file with a resolution of 384 × 288 pixels and a frame rate of 25 frames per second. The number of particles is set to 300, and the thresholds are T0 = 0.7, T1 = 0.8, T2 = 0.9, α = 0.9.
具体的实施流程包括8个步骤,如图1所示,具体为:The specific implementation process includes 8 steps, as shown in Figure 1, specifically:
(1) Target initialization
In the first frame the target is selected manually with a bounding box. Denote the box height by height, its width by width, and the target center by (x1, y1). Extract the color histogram and LBP histogram of the target region and initialize the target color feature template H=(h1,h2,…,hn) and LBP feature template G=(g1,g2,…,gn), where n=32 is the number of histogram bins. As shown in Fig. 2, divide the target height into three equal horizontal sub-blocks, numbered 1, 2, 3 from top to bottom, and divide the target width into three equal vertical sub-blocks, numbered 4, 5, 6 from left to right. Extract the color and LBP histograms of each sub-block and initialize the sub-block color templates Hi=(h'1,h'2,…,h'n) and sub-block LBP templates Gi=(g'1,g'2,…,g'n) (i=1,2,…,6). Initialize the particle number p, the position (p_xj, p_yj) of each particle (j=1,2,…,p), the target state flag Flag to 0, and the state flag of every sub-block to 0.
(2) Extraction of the color and LBP integral histograms of the region of interest
Read the k-th frame image P. The region of interest is the smallest rectangle that covers all particle background regions. Fig. 3 shows the original image and a particle's background region: the rectangle A'B'C'D' is the target region of width width and height height centered on the particle, and the rectangle E'F'G'H' is a larger region centered on the same particle. The ring-shaped region between A'B'C'D' and E'F'G'H' is the particle's background region; the target region and the background region cover the same area, i.e. they contain the same total number of pixels. Fig. 4 shows the original image and the region of interest: the rectangle ABCD (the hatched area) is the region of interest, and the coordinates of its four vertices A, B, C, D are:
Here min() and max() denote the minimum and maximum functions.
In an integral histogram, the value at each pixel is the color histogram of the rectangular region spanned from the origin at the top-left corner of the image to that pixel. Compute the integral histogram Hin(x,y) of the color feature over the rectangle of interest ABCD, i.e. the color histogram of all points in the rectangle from the image point P(xA,yA) to the point P(x,y); likewise compute the integral histogram Gin(x,y) of the LBP feature, i.e. the LBP histogram of all points in the same rectangle.
(3) Computation of the feature certainty coefficients
A feature describes its target with different strength in different tracking environments. When computing the features of the target region, the degree to which the target as a whole is distinguishable from the background should be taken into account: when a feature differs markedly between target and background, it describes the target well. The invention compares target and background with a log-likelihood ratio, computes a certainty coefficient for each feature, and uses the coefficients to correct the particle weights obtained under the different feature representations, yielding more accurate weights.
Using the integral histograms Hin(x,y) and Gin(x,y) of the region of interest, the color histogram HPj=(hp1,hp2,…,hpn) and LBP histogram GPj=(gp1,gp2,…,gpn) of the width×height region centered on each particle j in the current frame k can be computed quickly. As shown in Fig. 3, the target region of particle j is the rectangle A'B'C'D', whose four vertices A', B', C', D' have the coordinates:
Then the color histogram HPj and the LBP histogram GPj of particle j's rectangle follow from the integral histograms as:
HPj = Hin(xA',yA') - Hin(xC',yC'-1) - Hin(xB'-1,yB') + Hin(xD'-1,yD'-1)  (1)
GPj = Gin(xA',yA') - Gin(xC',yC'-1) - Gin(xB'-1,yB') + Gin(xD'-1,yD'-1)  (2)
The ring-shaped region between the rectangle A'B'C'D' and the rectangle E'F'G'H' is the background region of particle j; the four outer vertices E', F', G', H' of the background region have the coordinates:
Then the color histogram BG_HPj and the LBP histogram BG_GPj of particle j's background region are extracted with the integral histograms as:
BG_HPj = Hin(xE',yE') - Hin(xG',yG'-1) - Hin(xF'-1,yF') + Hin(xH'-1,yH'-1) - HPj  (3)
BG_GPj = Gin(xE',yE') - Gin(xG',yG'-1) - Gin(xF'-1,yF') + Gin(xH'-1,yH'-1) - GPj  (4)
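The four-corner cancellations of formulas (1) through (4) can be sketched as follows. This is a minimal illustration, not the patented implementation: `bin_map` is assumed to hold one quantized feature index per pixel (a color bin or an LBP code, already reduced to 32 values).

```python
import numpy as np

def integral_histogram(bin_map, n_bins=32):
    # I[y, x, b] = number of pixels with bin index b inside the rectangle
    # spanned from the top-left origin to (x, y), inclusive
    one_hot = (bin_map[..., None] == np.arange(n_bins)).astype(np.int64)
    return one_hot.cumsum(axis=0).cumsum(axis=1)

def region_hist(I, x0, y0, x1, y1):
    # Four-corner extraction of the inclusive rectangle [x0..x1] x [y0..y1],
    # the same cancellation as formulas (1)-(4)
    h = I[y1, x1].copy()
    if x0 > 0:
        h -= I[y1, x0 - 1]
    if y0 > 0:
        h -= I[y0 - 1, x1]
    if x0 > 0 and y0 > 0:
        h += I[y0 - 1, x0 - 1]
    return h

def background_hist(I, inner, outer):
    # Ring-shaped background of formulas (3)-(4): outer rectangle minus inner
    return region_hist(I, *outer) - region_hist(I, *inner)
```

Any region histogram then costs four lookups regardless of the region size, which is what makes evaluating hundreds of particle rectangles per frame cheap.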
By computing a log-likelihood ratio function, the target and background feature histograms can be compared quickly and the feature components with the clearest discrimination selected. Compute the likelihood ratio between the color feature of particle j and the color feature of its background region:
Here ε=0.001 ensures that neither the numerator nor the denominator is zero. Let T0 be the feature discrimination threshold; the discrimination degree between particle j's color feature and the background color feature is then:
In the invention, the certainty coefficient of a target feature is the proportion, among all components of the target feature, of components that differ markedly from the background feature; the feature certainty coefficient of the color feature is therefore:
Compute the likelihood ratio between the LBP feature of particle j and the LBP feature of its background region:
The discrimination degree between particle j's LBP feature and the background LBP feature is then:
And the feature certainty coefficient of its LBP feature is:
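The computation of this step can be sketched as follows. Since the formulas themselves appear only as images in the source, the per-bin log-likelihood ratio, the ε=0.001 guard, the threshold T0, and the "proportion of discriminative bins" rule below are reconstructed from the surrounding prose and should be read as one plausible interpretation, not the exact patented formulas.

```python
import numpy as np

def certainty_coefficient(target_hist, bg_hist, t0=0.7, eps=1e-3):
    # Per-bin log-likelihood ratio of target against background;
    # eps = 0.001 keeps numerator and denominator away from zero
    llr = np.abs(np.log(np.maximum(target_hist, eps) /
                        np.maximum(bg_hist, eps)))
    # A bin is "discriminative" when its ratio exceeds the threshold T0;
    # the certainty coefficient is the fraction of such bins
    return float((llr > t0).mean())
```

The same function serves both cues: called once with the color histograms and once with the LBP histograms of a particle, it yields β_Cj and β_Lj.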
(4) Selection of the tracking strategy
Different tracking strategies are selected according to the target state, so that tracking stays stable under occlusion. If the target state flag Flag is 0, the state is normal and the target is tracked with the particle filter fusing color and LBP features; if Flag is 1, the target is partially occluded and it is tracked with the block-based particle filter fusing color and LBP features; if Flag is 2, the target is severely occluded and its position is predicted by the least-squares method.
When the target state is normal, particle filter tracking with color and LBP feature fusion proceeds as follows. Given the color histogram HPj and LBP histogram GPj of particle j computed in the previous step, the Bhattacharyya coefficient is used as the reference measure of similarity between the current particle's features and the target templates: the larger the Bhattacharyya coefficient, the smaller the Bhattacharyya distance and the more similar the two samples; conversely, a small coefficient means low similarity. The Bhattacharyya coefficient BCj between the color feature HPj of particle j and the target color feature template H is computed as:
With the corresponding Bhattacharyya distance, the color feature weight w1(j) of particle j is computed as:
where σ=0.05. The color feature weights of all particles are then normalized:
The Bhattacharyya coefficient BLj between the LBP feature GPj of particle j and the target LBP feature template G is computed as:
With the corresponding Bhattacharyya distance, the LBP feature weight w2(j) of particle j is computed as:
The LBP feature weights of all particles are then normalized:
Using the feature certainty coefficients computed in the previous step, the color and LBP features are fused additively to obtain the weight of each particle:
To keep formula (17) valid, if the certainty coefficients β_Cj and β_Lj are both 0, set β_Cj=β_Lj=0.5. The particle weights are normalized, and the coordinates of all particles, weighted by their weights, give the center coordinates of the target in the current frame.
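The weighting and additive fusion above can be sketched as follows. The mapping from Bhattacharyya distance to weight is not shown in the source text (formulas (12) and (15) are images), so the common Gaussian kernel exp(-d²/(2σ²)) with σ=0.05 is assumed here; the both-zero fallback follows the text above.

```python
import numpy as np

def bhattacharyya(p, q):
    # Similarity of two normalized histograms (formulas (11)/(14))
    return float(np.sum(np.sqrt(p * q)))

def fused_weights(color_hists, lbp_hists, H, G, beta_c, beta_l, sigma=0.05):
    # d^2 = 1 - BC is the squared Bhattacharyya distance; a Gaussian
    # kernel exp(-d^2 / (2 sigma^2)) is assumed for the weights
    w1 = np.array([np.exp(-(1.0 - bhattacharyya(h, H)) / (2 * sigma ** 2))
                   for h in color_hists])
    w2 = np.array([np.exp(-(1.0 - bhattacharyya(g, G)) / (2 * sigma ** 2))
                   for g in lbp_hists])
    w1 /= w1.sum()                      # normalization, formula (13)
    w2 /= w2.sum()                      # normalization, formula (16)
    bc = np.array(beta_c, dtype=float)
    bl = np.array(beta_l, dtype=float)
    zero = (bc == 0) & (bl == 0)
    bc[zero] = bl[zero] = 0.5           # fallback keeping formula (17) valid
    w = bc * w1 + bl * w2               # additive fusion, formula (17)
    return w / w.sum()
```

The target center estimate is then the weighted sum of the particle coordinates under these fused weights.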
When occlusion occurs, the target is tracked block by block. The features of occluded sub-blocks cannot be extracted completely, so features are extracted from the unoccluded sub-blocks to keep the tracking going.
When the target is partially occluded, block-based particle filter tracking with color and LBP feature fusion proceeds as follows. According to the sub-block state flags determined from the occlusion detected in the previous frame, extract the color histogram HPj_i and the LBP histogram GPj_i of each valid sub-block i of particle j. Compare the color histogram HPj_i of each valid sub-block inside particle j's rectangle with the corresponding sub-block color template Hi to compute the Bhattacharyya coefficient of each sub-block's color feature:
Take the mean similarity over the valid sub-blocks as the similarity of the particle as a whole; with M valid sub-blocks, the color Bhattacharyya coefficient of particle j is:
With the corresponding Bhattacharyya distance, compute the color feature weight w1(j) of each particle with formula (12) and normalize the weights with formula (13).
Compare the LBP histogram GPj_i of each valid sub-block inside particle j's rectangle with the corresponding sub-block LBP template Gi to compute the Bhattacharyya coefficient of each sub-block's LBP feature:
Then the LBP Bhattacharyya coefficient of particle j is:
With the corresponding Bhattacharyya distance, compute the LBP feature weight w2(j) of each particle with formula (15) and normalize the weights with formula (16).
Using the feature certainty coefficients computed in the previous step, compute the weight w(j) of each particle with the fusion formula (17), normalize the weights, and weight the coordinates of all particles by their weights to obtain the center coordinates of the target in the current frame.
When the target is severely occluded, the target position is predicted by the least-squares method as follows. If severe occlusion was detected in the previous frame, feature extraction is unreliable, so the position of the target in frame k is predicted from its positions in the previous k-1 frames. Denote the center of the target in a previous frame by (xt, yt), where t=1,2,…,k-1 is the frame number. When the target is severely occluded for only a short time, it is assumed to move approximately in a straight line. By the least-squares principle, the coordinates xt and yt of the target center are modeled as linear functions of the frame number t:
Substituting the known center coordinates (xt, yt) (t=1,2,…,k-1) and solving, the slopes a1, a2 and intercepts b1, b2 of the two lines are computed as:
After the two fitted lines are obtained, the predicted target center (xk, yk) in the current frame is:
xk = a1·k + b1  (27)
yk = a2·k + b2  (28)
This yields the target center position (xk, yk) in frame k.
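The prediction above can be sketched as follows; `centers` is assumed to hold the target centers of frames 1 through k-1 in order.

```python
def predict_center(centers, k):
    # Fit x_t = a1*t + b1 and y_t = a2*t + b2 over t = 1..k-1 by least
    # squares, then extrapolate to frame k (formulas (27)-(28))
    n = len(centers)
    ts = list(range(1, n + 1))

    def fit(vals):
        mt = sum(ts) / n
        mv = sum(vals) / n
        a = (sum((t - mt) * (v - mv) for t, v in zip(ts, vals))
             / sum((t - mt) ** 2 for t in ts))
        return a, mv - a * mt           # slope and intercept

    a1, b1 = fit([c[0] for c in centers])
    a2, b2 = fit([c[1] for c in centers])
    return a1 * k + b1, a2 * k + b2
```

For a target moving at constant velocity, the extrapolated point coincides with the true next center, which is exactly the short-occlusion assumption made above.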
(5) Updating the target state
While the target moves without occlusion, the extracted global features should vary within a limited range or remain unchanged; once occlusion occurs, the features of the occluded part change and thereby affect the features of the whole target. Whether the target is occluded is determined by comparing the global color feature with the color template.
After the center coordinates (xk, yk) of the estimated target position in the current frame have been obtained through the preceding steps, the color feature of the region centered on the new coordinates (xk, yk) is compared with the target color template. If the similarity is above the threshold, the target at the new position matches the tracked target well and particle filter tracking continues; if it is below the threshold, the target features have changed considerably and the target is considered occluded, with the severity of the occlusion still to be determined.
Compute the target color feature histogram Hacc=(h1',h2',…,hn') in the current frame; the similarity B between the current target feature and the color template H is:
Let T1 be the global similarity threshold of the target.
If B ≥ T1, the target is in the normal state in the current frame; if the target state flag Flag already equals 0, no update is needed, otherwise Flag is set to 0, indicating that the target has emerged from occlusion.
If B < T1, the target is occluded in the current frame, and the severity of the occlusion is analyzed by comparing the color features of the current target sub-blocks with the sub-block color templates. Each sub-block's color feature at the current position is compared with the corresponding sub-block template: a high similarity means the sub-block is in the normal state, while a large difference between a sub-block's color feature and its template means that sub-block is occluded. Divide the rectangle centered on the new coordinates (xk, yk) into blocks, extract the color feature histogram Hacc_i of each sub-block i (i=1,2,…,6), compare it with the corresponding sub-block feature Hi in the target color template, and denote the similarity by Bi:
Let T2 be the sub-block similarity threshold. Evaluate the similarity of each sub-block, mark sub-blocks as valid or invalid, and denote the state flag of sub-block i by FlagBi; then:
That is, when Bi < T2, sub-block i is marked invalid; when Bi ≥ T2, sub-block i is marked valid. In the present partition, when any four sub-blocks are occluded, the occluded area covers most of the pixels inside the target box. The threshold on the number of invalid sub-blocks is therefore set to 4, making the threshold on the number of valid sub-blocks 2; as shown in Fig. 5, when sub-blocks 1, 3, 4 and 5 are all invalid, the target is in the severely occluded state. Count the number of valid sub-blocks among all sub-blocks and judge the occlusion severity from it:
That is, when M > 2, the target is partially occluded and the state flag is updated to Flag=1; when M ≤ 2, the target is severely occluded and the state flag is updated to Flag=2.
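The state update above condenses into a small decision function; the return values mirror the Flag encoding (0 normal, 1 partial occlusion, 2 severe occlusion) and the thresholds follow the embodiment (T1=0.8, T2=0.9).

```python
def update_state_flag(overall_sim, subblock_sims, t1=0.8, t2=0.9):
    # 0 = normal, 1 = partial occlusion, 2 = severe occlusion
    if overall_sim >= t1:                          # global template match
        return 0
    m = sum(s >= t2 for s in subblock_sims)        # count of valid sub-blocks
    return 1 if m > 2 else 2                       # severity rule
```

With six sub-blocks, m > 2 means at least half the blocks still match their templates, so block-based tracking remains feasible; otherwise the tracker falls back to least-squares prediction.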
(6) Template updating
The feature templates of the target are initialized when the target is manually selected in the first frame of the video. As time passes the target may change to a greater or lesser extent, so the feature templates need to be updated adaptively. While the target is occluded, the templates are not updated, to avoid contamination by the occluder; the templates are updated only while the target is in the normal state. That is, when Flag=0, the target's feature templates and the sub-block feature templates are updated simultaneously by the template update formula. Let H be the histogram of the target color feature template and Hacc the color feature histogram of the region around the target's new coordinates (xk, yk) in the current frame; the color feature template is updated by:
H = αH + (1-α)Hacc  (33)
where 0.80 ≤ α ≤ 0.99 (α=0.9 in this embodiment). The target's LBP feature template, the sub-block color templates, and the sub-block LBP templates are updated in the same way as the target color template.
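Formula (33) together with the Flag gate can be sketched as follows; since all templates are updated identically, one function serves the target templates and the sub-block templates alike.

```python
import numpy as np

def update_templates(templates, current_hists, flag, alpha=0.9):
    # Formula (33): H = alpha*H + (1 - alpha)*H_acc, applied only
    # while the target is unoccluded (Flag == 0)
    if flag != 0:
        return templates                # occluded: leave templates untouched
    return [alpha * np.asarray(t) + (1 - alpha) * np.asarray(c)
            for t, c in zip(templates, current_hists)]
```

A large α keeps the template stable against momentary appearance noise while still letting it drift with slow changes of the target.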
(7) Particle resampling
Systematic resampling is used to resample the particles: particles with small weights are removed, and particles with large weights are kept or duplicated.
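Systematic resampling can be sketched as follows; the function returns the indices of the particles to keep or duplicate.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    # One random offset u in [0, 1/N), then N evenly spaced pointers
    # u + i/N swept through the weight CDF: low-weight particles die,
    # high-weight particles are duplicated
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cdf = np.cumsum(weights)
    cdf[-1] = 1.0                       # guard against rounding error
    return np.searchsorted(cdf, positions)
```

Because a single random number drives all pointers, the method is cheap and has lower variance than multinomial resampling, which draws one random number per particle.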
(8) Particle propagation
The resampled particles are diffused in the x and y directions to obtain new particles, which serve as the initial particle distribution in the next frame.
Fig. 6 compares, on the test video, the tracking results of the traditional particle filter target tracking method and of the proposed occlusion-resistant particle filter target tracking method based on fusion of color and LBP features. In this scene, the man in the black top is the tracked target, and he is occluded by a woman in a white top while moving. With the traditional particle filter, the tracking box drifts from the target and even follows the wrong object after the target leaves the occluder, whereas with the proposed method the box tracks the target accurately throughout the occlusion and after it. At frames 15, 28 and 45 the target is occluded; both methods box the target person, but the proposed method does so more accurately. At frame 63 part of the body is occluded, and the boxes of both methods still capture the visible part. From frame 102 onward, the tracking box of the traditional particle filter has drifted onto the occluder and remains in a wrong-tracking state until the target is lost, while the proposed method keeps tracking the target person with an accurate box, achieving a good, stable tracking result.
To compare the two methods further, the tracking error of the target center is used to measure their tracking performance. The error is computed as the Euclidean distance:
where (x', y') is the target center measured by the tracking method and (x, y) is the actual target center in each frame, obtained here by manual measurement. The center coordinates and tracking errors of the two methods are computed and compared in Tables 1 and 2. Clearly, with or without occlusion, the proposed occlusion-resistant particle filter based on fusion of color and LBP features has a smaller tracking error than the traditional particle filter. In particular, as the frame number grows and the target passes behind and then emerges from the occluder, the traditional particle filter loses the target, while the error of the proposed method stays within a limited range, demonstrating stable and robust tracking.
Table 1
Table 2
Obviously, many variations of the invention described here are possible without departing from its true spirit and scope. All changes obvious to those skilled in the art are therefore intended to fall within the scope of the appended claims, and the scope of protection of the invention is defined solely by those claims.
Claims (9)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610454063.3A CN106127808B (en) | 2016-06-20 | 2016-06-20 | It is a kind of that particle filter method for tracking target is blocked based on color and the anti-of local binary patterns Fusion Features |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106127808A true CN106127808A (en) | 2016-11-16 |
CN106127808B CN106127808B (en) | 2018-09-07 |
Family
ID=57470445
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610454063.3A Active CN106127808B (en) | 2016-06-20 | 2016-06-20 | It is a kind of that particle filter method for tracking target is blocked based on color and the anti-of local binary patterns Fusion Features |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106127808B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020063711A1 (en) * | 1999-05-12 | 2002-05-30 | Imove Inc. | Camera system with high resolution image inside a wide angle view |
CN101026759A (en) * | 2007-04-09 | 2007-08-29 | 华为技术有限公司 | Visual tracking method and system based on particle filtering |
CN102722702A (en) * | 2012-05-28 | 2012-10-10 | 河海大学 | Multiple feature fusion based particle filter video object tracking method |
CN105279769A (en) * | 2015-07-16 | 2016-01-27 | 北京理工大学 | Hierarchical particle filtering tracking method combined with multiple features |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106504282B (en) * | 2016-11-23 | 2019-09-17 | 浙江大华技术股份有限公司 | A video occlusion detection method and device |
CN106504282A (en) * | 2016-11-23 | 2017-03-15 | 浙江大华技术股份有限公司 | A video occlusion detection method and device |
CN106815860A (en) * | 2017-01-17 | 2017-06-09 | 湖南优象科技有限公司 | A target tracking method based on ordered comparison features |
CN106815860B (en) * | 2017-01-17 | 2019-11-29 | 湖南优象科技有限公司 | A target tracking method based on ordered comparison features |
CN107424173A (en) * | 2017-06-09 | 2017-12-01 | 广东光阵光电科技有限公司 | A target tracking method based on extended local invariant feature description |
CN107424173B (en) * | 2017-06-09 | 2020-06-05 | 广东光阵光电科技有限公司 | Target tracking method based on extended local invariant feature description |
CN107330384A (en) * | 2017-06-19 | 2017-11-07 | 北京协同创新研究院 | A method and device for moving target tracking in video |
CN107527356A (en) * | 2017-07-21 | 2017-12-29 | 华南农业大学 | A video tracking method based on lazy interaction |
CN107527356B (en) * | 2017-07-21 | 2020-12-11 | 华南农业大学 | A video tracking method based on lazy interaction |
CN109033204A (en) * | 2018-06-29 | 2018-12-18 | 浙江大学 | A hierarchical integral histogram visual query method based on the World Wide Web |
CN109033204B (en) * | 2018-06-29 | 2021-10-08 | 浙江大学 | A hierarchical integral histogram visual query method based on the World Wide Web |
CN108985375A (en) * | 2018-07-14 | 2018-12-11 | 李军 | Multi-feature fusion tracking method considering the spatial distribution of particle weights |
CN109389087A (en) * | 2018-10-10 | 2019-02-26 | 惠州学院 | A multi-modal video object feature recognition system |
CN113139986A (en) * | 2021-04-30 | 2021-07-20 | 东风越野车有限公司 | Integrated environment perception and multi-target tracking system |
Also Published As
Publication number | Publication date |
---|---|
CN106127808B (en) | 2018-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106127808B (en) | An anti-occlusion block-based particle filter target tracking method based on fusion of color and local binary pattern features | |
CN107507222B (en) | Anti-occlusion particle filter target tracking method based on integral histogram | |
CN106846359B (en) | Moving target rapid detection method based on video sequence | |
CN103489199B (en) | video image target tracking processing method and system | |
CN106408594B (en) | Video multi-target tracking based on multi-Bernoulli feature covariances | |
CN102646279B (en) | Anti-shielding tracking method based on moving prediction and multi-sub-block template matching combination | |
CN109242884B (en) | Remote sensing video target tracking method based on JCFNet network | |
CN102750711B (en) | A binocular video depth map calculation method based on image segmentation and motion estimation | |
CN110276785B (en) | Anti-shielding infrared target tracking method | |
CN106600625A (en) | Image processing method and device for detecting small-sized living thing | |
CN105809715B (en) | A visual moving object detection method based on inter-frame cumulative transformation matrices | |
CN101527838B (en) | Method and system for feedback-based detection and tracking of video objects | |
CN110827332A (en) | Registration method of SAR image based on convolutional neural network | |
CN108647649A (en) | A method for detecting abnormal behavior in video | |
CN102915545A (en) | OpenCV(open source computer vision library)-based video target tracking algorithm | |
CN109919053A (en) | A deep learning vehicle parking detection method based on surveillance video | |
US20210042935A1 (en) | Object tracker, object tracking method, and computer program | |
CN101303726A (en) | Infrared Human Target Tracking System Based on Particle Dynamic Sampling Model | |
CN104200492A (en) | Automatic detecting and tracking method for aerial video target based on trajectory constraint | |
CN115171218A (en) | Material sample feeding abnormal behavior recognition system based on image recognition technology | |
Guan et al. | Visual hull construction in the presence of partial occlusion | |
CN104200483B (en) | Object detection method based on human body center line in a multi-camera environment | |
Chen et al. | A color-guided, region-adaptive and depth-selective unified framework for Kinect depth recovery | |
CN109978916B (en) | Vibe moving target detection method based on gray level image feature matching | |
CN105631405A (en) | Multistage blocking-based intelligent traffic video recognition background modeling method |
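The documents above revolve around multi-feature particle filter tracking. As an illustrative sketch only, and not the patented method, particle weights can be derived from a fused Bhattacharyya distance over a color histogram and a local binary pattern (LBP) histogram; the simple per-channel color histogram, the basic 8-neighbor LBP, and the fusion weight `alpha` are all assumptions of this sketch, and the patent's block-wise (anti-occlusion) partitioning is omitted:

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Per-channel intensity histogram, L1-normalized (simple stand-in for the color feature)."""
    hists = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
             for c in range(patch.shape[-1])]
    h = np.concatenate(hists).astype(float)
    return h / (h.sum() + 1e-12)

def lbp_histogram(gray):
    """Histogram of basic 8-neighbor local binary pattern codes (texture feature)."""
    g = gray.astype(np.int64)
    H, W = g.shape
    center = g[1:-1, 1:-1]
    code = np.zeros_like(center)
    for k, (dy, dx) in enumerate([(-1, -1), (-1, 0), (-1, 1), (0, 1),
                                  (1, 1), (1, 0), (1, -1), (0, -1)]):
        neighbor = g[1 + dy:H - 1 + dy, 1 + dx:W - 1 + dx]
        code |= (neighbor >= center).astype(np.int64) << k   # set bit k if neighbor >= center
    h = np.bincount(code.ravel(), minlength=256).astype(float)
    return h / (h.sum() + 1e-12)

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(p * q)))

def fused_particle_weights(patches, ref_color, ref_lbp, alpha=0.5, sigma=0.1):
    """Weight candidate patches by a convex fusion of color and LBP Bhattacharyya distances."""
    weights = []
    for patch in patches:
        d_color = np.sqrt(max(0.0, 1.0 - bhattacharyya(color_histogram(patch), ref_color)))
        d_lbp = np.sqrt(max(0.0, 1.0 - bhattacharyya(lbp_histogram(patch.mean(axis=-1)), ref_lbp)))
        d = alpha * d_color + (1.0 - alpha) * d_lbp          # fused distance
        weights.append(np.exp(-d * d / (2.0 * sigma ** 2)))  # Gaussian likelihood
    w = np.asarray(weights)
    return w / w.sum()                                       # normalize for resampling
```

In such schemes `alpha` trades off color (robust to deformation) against texture (robust to illumination change); the normalized weights would then drive the resampling step of the particle filter.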
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||