CN101887587B - Multi-target track method based on moving target detection in video monitoring - Google Patents


Info

Publication number
CN101887587B
CN101887587B (granted publication) · CN2010102212904A (application)
Authority
CN
China
Prior art keywords
target
obj
foreground
agglomerate
blob
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010102212904A
Other languages
Chinese (zh)
Other versions
CN101887587A (en)
Inventor
卢官明
徐腾飞
沈苏彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Post and Telecommunication University
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University filed Critical Nanjing Post and Telecommunication University
Priority to CN2010102212904A priority Critical patent/CN101887587B/en
Publication of CN101887587A publication Critical patent/CN101887587A/en
Application granted granted Critical
Publication of CN101887587B publication Critical patent/CN101887587B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a multi-target tracking method based on moving-object detection for video surveillance. The method comprises the following steps: first, foreground moving targets are detected by background subtraction; next, an incidence matrix is built between the foreground blobs of the current frame and the targets detected in the previous frame, and from it each target's state is judged (target vanished, target unchanged, targets merged, target split, and so on); targets found to be in the split state undergo a secondary tracking step; finally, each target's position, area, kernel-weighted color histogram and other features are updated, achieving the tracking of multiple targets. The method can track partially occluded targets, and multiple targets that merge and later separate, reliably and in real time, improving the stability and robustness of a video surveillance system.

Description

Multi-target tracking method based on moving-object detection in video surveillance
Technical field
The invention belongs to the fields of image processing and computer vision, and specifically relates to the detection and tracking of multiple moving targets in a video surveillance system.
Background technology
Detecting and tracking targets in video sequences is an important research topic in computer vision. It is the basis of subsequent processing such as target classification and behavior analysis, and has important application value in many areas such as intelligent traffic control, intelligent video surveillance, and military guidance.
In recent years many target-tracking methods have been proposed; they fall mainly into four classes:
(1) Region-based tracking: a target template is first built and the target is then tracked against that template; the limitation is that the target must not be occluded, otherwise it is easily lost.
(2) Feature-based tracking: feature information such as the target's color and centroid is extracted and the target is tracked by similarity; multiple cues generally need to be fused.
(3) Contour-based tracking: the target's boundary contour serves as a template and the contour is tracked in the edge images of subsequent frames, but the method is sensitive to contour changes.
(4) Model-based tracking: a precise geometric model is used for tracking and its parameters are updated in time; tracking is accurate but computation is complex.
Because video images exhibit strong spatial correlation and the target's motion state is temporally continuous, tracking between adjacent frames can be realized by computing the overlap area of foreground targets in consecutive frames. For example, Chinese patent publication CN101339608A discloses "a detection-based target tracking method and system". That patent maintains a queue of tracked targets and matches the current frame's detections against the queue by position and scale, realizing target tracking and state updating and improving real-time performance. However, when targets merge or separate, its reliability is low and tracking is unstable.
Summary of the invention
Technical problem: the object of the invention is to disclose a multi-target tracking method based on moving-object detection for video surveillance systems that tracks partially occluded targets, and multiple targets that merge and later separate, reliably and in real time, improving the stability and robustness of the surveillance system.
Technical scheme: the technical scheme of the present invention is as follows. First, foreground moving targets are detected by background subtraction; then an incidence matrix is built between the current frame's foreground blobs and the targets detected in the previous frame, and from it each target's state is judged (target vanished, target unchanged, targets merged, target split, and so on); targets in the split state undergo secondary tracking; finally, each target's position, area, kernel-weighted color histogram and other features are updated, realizing the tracking of multiple targets. The concrete steps are as follows:
The multi-target tracking method based on moving-object detection in video surveillance of the present invention comprises the following steps:
A. Detect moving targets by background subtraction;
B. Compute the probability distribution of each moving target's kernel-weighted color histogram;
C. Build the incidence matrix between the current frame's foreground blobs and the targets detected in the previous frame;
D. Judge the state of each target, and apply secondary tracking to targets in the split state;
E. Update each target's position, area and kernel-weighted color histogram.
In said step A, the concrete steps of detecting moving targets by background subtraction are:
A1. Subtract the background image from the current frame's gray-level image and take the absolute value to obtain a difference image;
A2. Binarize the difference image with an adaptive threshold;
A3. Post-process the binary image: morphological opening and closing, connected-region labeling, filling of "holes", and checks on each connected region's area and on the aspect ratio of its minimum bounding rectangle; after the influence of noise and background disturbance is eliminated, a foreground binarization mask is obtained;
A4. Apply a logical AND of the foreground binarization mask with the current input frame to detect the moving targets.
In said step C, the concrete steps of building the incidence matrix between the current frame's foreground blobs and the previous frame's targets are:
C1. Let Obj_n denote the n-th target detected in the previous frame and Blob_m the m-th foreground blob of the current frame, and compute the overlap area S_mn between Blob_m and Obj_n, where m = 1, 2, ..., M, n = 1, 2, ..., N, and M and N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame;
C2. Build the M-row, N-column incidence matrix S whose elements are S_mn;
C3. Binarize the incidence matrix S, then count the nonzero entries Row_m of each row and Col_n of each column.
In said step D, according to the overlap relations between the previous frame's targets and the current frame's foreground blobs, each target's state is classified into five cases:
D1. If Obj_n overlaps no foreground blob, i.e. Col_n = 0, target Obj_n is judged to have vanished;
D2. If Obj_n overlaps Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1 and Row_m = 1, target Obj_n is judged to keep its original state;
D3. If Obj_n overlaps only Blob_m but Blob_m also overlaps other targets, i.e. Col_n = 1, S_mn = 1 and Row_m > 1, target Obj_n is judged to have merged with other targets;
D4. If Blob_m overlaps no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
D5. If Obj_n overlaps several foreground blobs, i.e. Col_n > 1, target Obj_n is judged to be in the split state.
In said step D5, for a target in the split state (Obj_n overlapping several foreground blobs), secondary tracking combining area and color features is carried out; its concrete steps are:
D51. Delete any foreground blob whose area exceeds 1.5 times the area of Obj_n;
D52. Judge the cause of the split from the mutual distances of the remaining foreground blobs;
D53. If the split was caused by background occlusion, take the bounding rectangle of all the foreground blobs as the target's foreground region;
D54. If the split produced a new separate target, compute the probability distribution of the kernel-weighted color histogram of each candidate foreground blob, compute its Bhattacharyya similarity coefficient ρ against the probability distribution of the kernel-weighted color histogram of the previous frame's target Obj_n, and take the foreground blob Blob_x that maximizes ρ as the current region matching Obj_n.
In said step E, according to the matching between each previous-frame target and the current frame's foreground blobs obtained from the incidence matrix S, the target's position, area and color features are updated as follows:
E1. Obtain the position and area features of Blob_x and use them to update the position and area of Obj_n;
E2. Update the kernel-weighted color histogram of Obj_n with that of Blob_x.
Beneficial effects: experimental results show that the method of the invention can track partially occluded targets, and multiple targets that merge and later separate, reliably and in real time, improving the stability and robustness of a video surveillance system.
Compared with the prior art, the invention has the following advantages:
1) Targets are tracked automatically on the basis of moving targets detected by background subtraction. During tracking, each target's state is classified into five cases: new target appeared, target vanished, target unchanged, targets merged, and target split. Except for the split case, the four other cases rely only on region matching between adjacent frames, which simplifies the algorithm and improves real-time performance while still ensuring effective tracking.
2) When a target is in the split state, secondary tracking is applied according to the cause of the split, combining the target's color, position and area features. A kernel-weighted color histogram describes the target's color information; compared with an ordinary color histogram, it effectively reduces interference from the background and from other separated targets, improving tracking reliability.
Description of drawings
Fig. 1 is a flow chart of the multi-target tracking method based on moving-object detection of the present invention.
Embodiment
The multi-target tracking method based on moving-object detection in video surveillance of the present invention is realized mainly by the following steps:
Step 1: detect moving targets by background subtraction
(1) Subtract the background image from the current frame's gray-level image and take the absolute value to obtain a difference image;
(2) Binarize the difference image with an adaptive threshold;
(3) Post-process the binary image: morphological opening and closing, connected-region labeling, filling of "holes", and checks on each connected region's area and on the aspect ratio of its minimum bounding rectangle; after the influence of noise and background disturbance is eliminated, a foreground binarization mask is obtained;
(4) Apply a logical AND of the foreground binarization mask with the current input frame to detect the moving targets.
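As an illustrative sketch only (the patent does not specify the adaptive-threshold rule; the mean-plus-k-standard-deviations threshold and all function names below are assumptions), steps (1), (2) and (4) of this stage might look as follows in Python. The morphological post-processing of step (3) is omitted here; in practice it would be done with an image-processing library.

```python
import numpy as np

def detect_foreground(frame_gray, background, k=2.0):
    """Steps (1)-(2): absolute difference between the current gray-level
    frame and the background image, then binarization. The adaptive
    threshold (mean + k * std of the difference image) is an assumed,
    illustrative rule, not the patent's."""
    diff = np.abs(frame_gray.astype(np.int16) - background.astype(np.int16))
    threshold = diff.mean() + k * diff.std()
    return (diff > threshold).astype(np.uint8)  # foreground binarization mask

def apply_mask(frame_gray, mask):
    """Step (4): logical AND of the mask with the input frame, keeping
    only the moving-target pixels."""
    return frame_gray * mask
```

For example, a frame that differs from an all-zero background only in a small bright patch yields a mask that selects exactly that patch.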
Step 2: compute the probability distribution of each moving target's kernel-weighted color histogram
A color histogram is an effective characterization of a target: it accurately reflects the target's color information, is invariant to changes in target shape, and is easy to compute, so it is widely used in target tracking.
Suppose R is the rectangular tracking window containing the target, centered at x_0. The probability distribution p = {p(u)}, u = 1, ..., K, of the kernel-weighted color histogram of R is computed by formula (1):

$$p(u) = C \sum_{i=1}^{L} k\!\left(\frac{x_i - x_0}{h}\right)\,\delta\left[b(x_i) - u\right] \qquad (1)$$

In formula (1), C is a normalization coefficient chosen so that $\sum_{u=1}^{K} p(u) = 1$; K is the dimension of the color feature space, taken as K = 8 × 8 × 8 = 512; L is the number of pixels in the rectangular region; x_i is an arbitrary point in the region, and ||x_i − x_0|| is its distance to x_0; h is the kernel bandwidth, taken as half the tracking-window width; b(x_i) is the color level corresponding to x_i, obtained by K-level quantization of the color feature of pixel x_i; δ[·] is the unit impulse function, so δ[b(x_i) − u] equals 1 when the color feature of pixel x_i falls into the u-th color bin; and k(·) is the kernel function, defined as

$$k(x) = \begin{cases} 1 - \|x\|^2, & \|x\| < 1 \\ 0, & \text{otherwise} \end{cases}$$
The kernel expression shows that the region at the center of the target contributes most to the histogram, while the border region, which is more likely to be occluded or influenced by the background, contributes least. Moreover, the kernel weights fall off slowly, so a relatively large area still receives substantial weight, which is very favorable for tracking stability.
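The computation of formula (1) can be sketched in Python as follows (an illustrative sketch under stated assumptions: RGB input, uniform 8-level quantization per channel, and a per-axis normalized distance standing in for the scalar bandwidth h; the function name is invented):

```python
import numpy as np

def kernel_weighted_histogram(patch, n_levels=8):
    """Sketch of formula (1): a kernel-weighted color histogram over a
    rectangular tracking window. Pixels near the window center receive
    the largest weight (kernel k(x) = 1 - ||x||^2 for ||x|| < 1); pixels
    near the border, more likely occluded or background, contribute
    little. Quantizing each channel to `n_levels` gives K = n_levels**3
    bins (512 for n_levels = 8)."""
    h_, w_, _ = patch.shape
    cy, cx = (h_ - 1) / 2.0, (w_ - 1) / 2.0
    ys, xs = np.mgrid[0:h_, 0:w_]
    # normalized squared distance from the window center (bandwidth = half-extent)
    d2 = ((ys - cy) / max(cy, 1e-9)) ** 2 + ((xs - cx) / max(cx, 1e-9)) ** 2
    weights = np.clip(1.0 - d2, 0.0, None)             # k(x), zero outside ||x|| < 1
    q = (patch // (256 // n_levels)).astype(np.int64)  # per-channel quantization
    bins = (q[..., 0] * n_levels + q[..., 1]) * n_levels + q[..., 2]
    hist = np.bincount(bins.ravel(), weights=weights.ravel(),
                       minlength=n_levels ** 3)
    return hist / hist.sum()                           # C normalizes so sum(p) = 1
```

A uniformly colored patch, for instance, produces a distribution concentrated in a single bin and summing to 1.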
Step 3: build the incidence matrix between the current frame's foreground blobs and the previous frame's targets
(1) Let Obj_n denote the n-th target detected in the previous frame and Blob_m the m-th foreground blob of the current frame, and compute the overlap area S_mn between Blob_m and Obj_n:

S_mn = A(Obj_n) ∩ A(Blob_m)   (2)

where m ∈ {1, 2, ..., M}, n ∈ {1, 2, ..., N}; M and N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame; A(·) extracts the target's bounding-rectangle area; and ∩ denotes the overlap area.
(2) Build the M-row, N-column incidence matrix S whose element in row m and column n is S_mn (m = 1, 2, ..., M; n = 1, 2, ..., N):

$$S = \begin{pmatrix} S_{11} & S_{12} & \cdots & S_{1N} \\ S_{21} & S_{22} & \cdots & S_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ S_{M1} & S_{M2} & \cdots & S_{MN} \end{pmatrix} \qquad (3)$$
(3) Binarize the incidence matrix S:

$$S_{mn} = \begin{cases} 1, & S_{mn} > \alpha \cdot \min\bigl(A(Obj\_n),\, A(Blob\_m)\bigr) \\ 0, & \text{otherwise} \end{cases} \qquad (4)$$

In formula (4), α is a coefficient, taken as α = 0.3, and min(·) is the minimum function; S_mn = 1 means that Blob_m effectively overlaps Obj_n, otherwise S_mn = 0.
(4) Count the nonzero entries Row_m of each row and Col_n of each column of the incidence matrix S:

$$Row\_m = \sum_{j=1}^{N} S_{mj} \qquad (5)$$

$$Col\_n = \sum_{i=1}^{M} S_{in} \qquad (6)$$
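Steps (1) through (4) above, i.e. equations (2) through (6), can be sketched as follows (illustrative only; rectangles are assumed to be (x, y, w, h) tuples and the function names are invented):

```python
import numpy as np

def rect_overlap_area(a, b):
    """Area of intersection of two axis-aligned rectangles (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    return ix * iy

def incidence_matrix(prev_objs, cur_blobs, alpha=0.3):
    """Build the M x N overlap matrix S between the current frame's
    foreground blobs and the previous frame's targets (eqs. (2)-(3)),
    binarize it with the alpha * min-area rule of eq. (4), and return
    the nonzero counts Row_m and Col_n of eqs. (5)-(6)."""
    M, N = len(cur_blobs), len(prev_objs)
    S = np.zeros((M, N))
    for m, blob in enumerate(cur_blobs):
        for n, obj in enumerate(prev_objs):
            S[m, n] = rect_overlap_area(blob, obj)        # S_mn, eq. (2)
    areas_obj = np.array([w * h for (_, _, w, h) in prev_objs])
    areas_blob = np.array([w * h for (_, _, w, h) in cur_blobs])
    min_area = np.minimum(areas_blob[:, None], areas_obj[None, :])
    S = (S > alpha * min_area).astype(int)                # eq. (4), alpha = 0.3
    row = S.sum(axis=1)                                   # Row_m, eq. (5)
    col = S.sum(axis=0)                                   # Col_n, eq. (6)
    return S, row, col
```

For instance, one previous target matched by one of two current blobs yields Row = [1, 0] and Col = [1].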
Step 4: judge each target's state according to the overlap relations between the previous frame's targets and the current frame's foreground blobs, and apply secondary tracking to targets in the split state. The concrete steps are:
(1) If Obj_n overlaps no foreground blob, i.e. Col_n = 0, target Obj_n is judged to have vanished;
(2) If Obj_n overlaps Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1 and Row_m = 1, target Obj_n is judged to keep its original state;
(3) If Obj_n overlaps only Blob_m but Blob_m also overlaps other targets, i.e. Col_n = 1, S_mn = 1 and Row_m > 1, target Obj_n is judged to have merged with other targets;
(4) If Blob_m overlaps no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
(5) If Obj_n overlaps J foreground blobs, i.e. Col_n = J with J > 1, target Obj_n is judged to be in the split state, and secondary tracking combining area and kernel-weighted color histogram features is carried out. The task of secondary tracking is to find, among these blobs, the one or several foreground blobs that actually correspond to target Obj_n. Its concrete steps are:
1. Delete any foreground blob whose area exceeds 1.5 times the area of Obj_n; that is, if A(Blob_m) > 1.5 A(Obj_n), set S_mn = 0.
2. For each remaining foreground blob, compute its minimum distance to the other foreground blobs, and take the maximum of these minima: d_max = max_i [ min_j d(Blob_i, Blob_j) ].
3. If d_max ≤ T (T is a distance threshold, taken as T = 10), the split is judged to be caused by background occlusion; take the bounding rectangle of all the foreground blobs as the target's foreground region and update the target.
4. If d_max > T, a new target is judged to have separated from the original one. Compute by formula (1) the probability distribution q = {q(u)}, u = 1, ..., K, of the kernel-weighted color histogram of each candidate foreground blob, compute by formula (7) its Bhattacharyya similarity coefficient ρ against the distribution p(u) of the kernel-weighted color histogram of the previous frame's target Obj_n, and take the foreground blob Blob_x that maximizes ρ as the current region matching Obj_n:

$$\rho = \sum_{u=1}^{K} \sqrt{p(u)\,q(u)} \qquad (7)$$
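The decision rules (1)-(5) of this stage and the Bhattacharyya coefficient of formula (7) can be sketched as follows (illustrative; the state labels and function names are invented, and the distance-based split-cause test of sub-steps 2-3 is omitted):

```python
import numpy as np

def classify_states(S, row, col):
    """Decision rules D1-D5 on the binarized incidence matrix S (M blobs
    x N previous targets). Each old target Obj_n is labelled 'vanished',
    'normal', 'merged' or 'split'; each current blob matching no old
    target is flagged 'new'."""
    M, N = S.shape
    obj_states = []
    for n in range(N):
        if col[n] == 0:
            obj_states.append('vanished')                 # rule (1)
        elif col[n] == 1:
            m = int(np.argmax(S[:, n]))
            obj_states.append('normal' if row[m] == 1 else 'merged')  # (2)/(3)
        else:
            obj_states.append('split')                    # rule (5): secondary tracking
    blob_states = ['new' if row[m] == 0 else 'matched' for m in range(M)]  # rule (4)
    return obj_states, blob_states

def bhattacharyya(p, q):
    """Formula (7): similarity of two normalized color histograms;
    rho = 1 for identical distributions."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))
```

In secondary tracking, `bhattacharyya` would be evaluated between the old target's histogram p and each candidate blob's histogram q, and the blob with the largest ρ retained.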
Step 5: according to the matching between each previous-frame target and the current frame's foreground blobs obtained from the incidence matrix S, update the target's position, area and color features:
(1) Obtain the position and area features of Blob_x and use them to update the position and area of Obj_n;
(2) Update the kernel-weighted color histogram of Obj_n with that of Blob_x.
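Step 5 replaces the target's features with those of the matching blob; a minimal sketch, assuming targets and blobs are plain dictionaries with invented keys (the patent updates by substitution, not blending):

```python
def update_target(obj, blob):
    """Stage 5: once the matching blob Blob_x is known, the target's
    position/area rectangle and its kernel-weighted histogram are
    replaced by those of the blob."""
    obj['rect'] = blob['rect']   # step (1): position and area
    obj['hist'] = blob['hist']   # step (2): kernel-weighted color histogram
    return obj
```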

Claims (2)

  1. A multi-target tracking method based on moving-object detection in video surveillance, characterized in that the method comprises the following steps:
    A. Detect moving targets by background subtraction;
    B. Compute the probability distribution of each moving target's kernel-weighted color histogram;
    C. Build the incidence matrix between the current frame's foreground blobs and the targets detected in the previous frame;
    D. Judge the state of each target, and apply secondary tracking to targets in the split state;
    E. Update each target's position, area and kernel-weighted color histogram;
    In said step A, the concrete steps of detecting moving targets by background subtraction are:
    A1. Subtract the background image from the current frame's gray-level image and take the absolute value to obtain a difference image;
    A2. Binarize the difference image with an adaptive threshold;
    A3. Post-process the binary image: morphological opening and closing, connected-region labeling, filling of "holes", and checks on each connected region's area and on the aspect ratio of its minimum bounding rectangle; after the influence of noise and background disturbance is eliminated, a foreground binarization mask is obtained;
    A4. Apply a logical AND of the foreground binarization mask with the current input frame to detect the moving targets;
    In said step C, the concrete steps of building the incidence matrix between the current frame's foreground blobs and the previous frame's targets are:
    C1. Let Obj_n denote the n-th target detected in the previous frame and Blob_m the m-th foreground blob of the current frame, and compute the overlap area S_mn between Blob_m and Obj_n, where m = 1, 2, ..., M, n = 1, 2, ..., N, and M and N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame;
    C2. Build the M-row, N-column incidence matrix S whose elements are S_mn;
    C3. Binarize the incidence matrix S, then count the nonzero entries Row_m of each row and Col_n of each column;
    In said step D, according to the overlap relations between the previous frame's targets and the current frame's foreground blobs, each target's state is classified into five cases:
    D1. If Obj_n overlaps no foreground blob, i.e. Col_n = 0, target Obj_n is judged to have vanished;
    D2. If Obj_n overlaps Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1 and Row_m = 1, target Obj_n is judged to keep its original state;
    D3. If Obj_n overlaps only Blob_m but Blob_m also overlaps other targets, i.e. Col_n = 1, S_mn = 1 and Row_m > 1, target Obj_n is judged to have merged with other targets;
    D4. If Blob_m overlaps no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
    D5. If Obj_n overlaps several foreground blobs, i.e. Col_n > 1, target Obj_n is judged to be in the split state;
    In said step E, according to the matching between each previous-frame target and the current frame's foreground blobs obtained from the incidence matrix S, the target's position, area and color features are updated as follows:
    E1. Obtain the position and area features of foreground blob Blob_x and use them to update the position and area of Obj_n;
    E2. Update the kernel-weighted color histogram of Obj_n with that of Blob_x.
  2. The multi-target tracking method based on moving-object detection in video surveillance according to claim 1, characterized in that, in said step D5, for a target in the split state, i.e. when Obj_n overlaps several foreground blobs, secondary tracking combining area and color features is carried out; its concrete steps are:
    D51. Delete any foreground blob whose area exceeds 1.5 times the area of Obj_n;
    D52. Judge the cause of the split from the mutual distances of the remaining foreground blobs;
    D53. If the split was caused by background occlusion, take the bounding rectangle of all the foreground blobs as the target's foreground region;
    D54. If the split produced a new separate target, compute the probability distribution of the kernel-weighted color histogram of each candidate foreground blob, compute its Bhattacharyya similarity coefficient ρ against the probability distribution of the kernel-weighted color histogram of the previous frame's target Obj_n, and take the foreground blob Blob_x that maximizes ρ as the current region matching Obj_n.
CN2010102212904A 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring Expired - Fee Related CN101887587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010102212904A CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010102212904A CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Publications (2)

Publication Number Publication Date
CN101887587A CN101887587A (en) 2010-11-17
CN101887587B true CN101887587B (en) 2012-05-23

Family

ID=43073497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010102212904A Expired - Fee Related CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Country Status (1)

Country Link
CN (1) CN101887587B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102081801B (en) * 2011-01-26 2014-04-16 上海交通大学 Multi-feature adaptive fused ship tracking and track detecting method
CN102254396B (en) * 2011-07-06 2014-06-04 通号通信信息集团有限公司 Intrusion detection method and device based on video
CN102930554B (en) * 2011-08-11 2015-06-03 天津市亚安科技股份有限公司 Method and system for accurately capturing target in monitored scene
CN103020577B (en) * 2011-09-20 2015-07-22 佳都新太科技股份有限公司 Moving target identification method based on hog characteristic and system
CN103425960B (en) * 2012-05-25 2017-04-05 信帧机器人技术(北京)有限公司 Fast moving objects method for detecting in a kind of video
CN102842139B (en) * 2012-07-19 2015-09-02 电子科技大学 A kind of acquiring method of target trajectory
CN103123726B (en) * 2012-09-07 2016-09-07 佳都新太科技股份有限公司 A kind of target tracking algorism analyzed based on motor behavior
CN103677734A (en) * 2012-09-25 2014-03-26 中国航天科工集团第二研究院二〇七所 Multi-target data association algorithm based on feature matching matrix
CN103065325B (en) * 2012-12-20 2015-07-29 中国科学院上海微系统与信息技术研究所 A kind of method for tracking target based on the polymerization of color Distance geometry Iamge Segmentation
CN104079798B (en) * 2013-03-25 2017-09-26 日电(中国)有限公司 Image detecting method, device and a kind of video monitoring system
CN103400157B (en) * 2013-07-23 2017-02-08 青岛海信网络科技股份有限公司 Road pedestrian and non-motor vehicle detection method based on video analysis
CN104732187B (en) * 2013-12-18 2017-12-22 杭州华为企业通信技术有限公司 A kind of method and apparatus of image trace processing
CN103729861B (en) * 2014-01-03 2016-06-22 天津大学 A kind of multi-object tracking method
CN103886607B (en) * 2014-01-24 2016-08-17 清华大学深圳研究生院 A kind of detection for disturbance target and suppressing method
CN103927763B (en) * 2014-03-24 2016-08-17 河海大学 A kind of image sequence multiple target tracking track identification processing method
CN104091348B (en) * 2014-05-19 2017-04-05 南京工程学院 The multi-object tracking method of fusion marked feature and piecemeal template
KR102390876B1 (en) * 2015-03-27 2022-04-26 삼성전자주식회사 Method and apparatus for recognizing a uers’s activity by using a accelerometer
CN105118072A (en) * 2015-08-19 2015-12-02 西华大学 Method and device for tracking multiple moving targets
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN106297130A (en) * 2016-08-22 2017-01-04 国家电网公司 Transmission line of electricity video analysis early warning system
CN106327502A (en) * 2016-09-06 2017-01-11 山东大学 Multi-scene multi-target recognition and tracking method in security video
CN106530325A (en) * 2016-10-26 2017-03-22 合肥润客软件科技有限公司 Multi-target visual detection and tracking method
CN106709521A (en) * 2016-12-26 2017-05-24 深圳极视角科技有限公司 Fire pre-warning method and fire pre-warning system based on convolution neural network and dynamic tracking
CN108229459B (en) * 2018-01-04 2020-11-20 北京环境特性研究所 Target tracking method
CN108932730B (en) * 2018-05-31 2021-11-23 哈工大机器人集团(昆山)有限公司 Video multi-target tracking method and system based on data association
CN110136174B (en) * 2019-05-22 2021-06-22 北京华捷艾米科技有限公司 Target object tracking method and device
CN113052853B (en) * 2021-01-25 2023-07-21 广东技术师范大学 Video target tracking method and device in complex environment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1972370A (en) * 2005-11-23 2007-05-30 中国科学院沈阳自动化研究所 Real-time multi-target mark and centroid operation method
CN101621615A (en) * 2009-07-24 2010-01-06 南京邮电大学 Self-adaptive background modeling and moving target detecting method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
卢官明, 郎苏娟. Background modeling and moving object detection based on the YCbCr color space. Journal of Nanjing University of Posts and Telecommunications (Natural Science Edition), 2009, Vol. 29, No. 6, pp. 17-22. *
徐方明, 卢官明. A moving object detection algorithm based on an improved Surendra background-update algorithm. Shanxi Electronic Technology, 2009, No. 5, pp. 39-40. *
杨海波, 姚庆栋, 荆仁杰. A segmentation method for moving objects in image sequences based on blob matching. Journal of Zhejiang University, 2001, Vol. 35, No. 4, pp. 365-369. *

Also Published As

Publication number Publication date
CN101887587A (en) 2010-11-17

Similar Documents

Publication Publication Date Title
CN101887587B (en) Multi-target track method based on moving target detection in video monitoring
CN103281477B (en) Multi-target track method based on multi-level characteristic association
CN110009665B (en) Target detection tracking method in shielding environment
Milan et al. Continuous energy minimization for multitarget tracking
CN102165493B (en) Detection of vehicles in an image
CN102332092B (en) Flame detection method based on video analysis
CN102750708B (en) Affine motion target tracing algorithm based on fast robust feature matching
CN105528794A (en) Moving object detection method based on Gaussian mixture model and superpixel segmentation
CN104091348A (en) Multi-target tracking method integrating obvious characteristics and block division templates
CN105404847A (en) Real-time detection method for object left behind
CN104268583A (en) Pedestrian re-recognition method and system based on color area features
Timofte et al. Combining traffic sign detection with 3D tracking towards better driver assistance
CN103902985B (en) High-robustness real-time lane detection algorithm based on ROI
CN104978567A (en) Vehicle detection method based on scenario classification
CN105260715A (en) Remote-area-oriented small-animal target detecting method
CN104036250A (en) Video pedestrian detecting and tracking method
CN106503748A (en) A kind of based on S SIFT features and the vehicle targets of SVM training aids
CN104809742A (en) Article safety detection method in complex scene
CN110633678A (en) Rapid and efficient traffic flow calculation method based on video images
CN106056078A (en) Crowd density estimation method based on multi-feature regression ensemble learning
Zhan et al. Pedestrian detection and behavior recognition based on vision
Sheng et al. Real-time anti-interference location of vehicle license plates using high-definition video
CN105469054A (en) Model construction method of normal behaviors and detection method of abnormal behaviors
Avery et al. Investigation into shadow removal from traffic images
CN104392466A (en) Detecting and tracking method based on visual salient original target

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120523

Termination date: 20150707

EXPY Termination of patent right or utility model