CN101887587A - Multi-target track method based on moving target detection in video monitoring - Google Patents


Info

Publication number
CN101887587A
CN101887587A (application CN201010221290A)
Authority
CN
China
Prior art keywords
target
obj
foreground
agglomerate
blob
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010221290
Other languages
Chinese (zh)
Other versions
CN101887587B (en)
Inventor
卢官明
徐腾飞
沈苏彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Post and Telecommunication University
Nanjing University of Posts and Telecommunications
Original Assignee
Nanjing Post and Telecommunication University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Post and Telecommunication University filed Critical Nanjing Post and Telecommunication University
Priority to CN2010102212904A (granted as CN101887587B)
Publication of CN101887587A
Application granted
Publication of CN101887587B
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a multi-target tracking method based on moving object detection in video monitoring. The method comprises the following steps: first, foreground moving targets are detected by background subtraction; then an incidence matrix is established between the foreground target blobs of the current frame and the targets detected in the previous frame, from which the state of each target is judged (target disappearance, target unchanged, target merging, target separation, etc.), and a second-stage tracking is applied to targets in the separated state; finally, the positions, areas, kernel-weighted color histograms and other features of the targets are updated, thereby achieving the tracking of multiple targets. The method can track partially occluded targets, as well as multiple targets that merge and then separate, reliably and in real time, improving the stability and robustness of a video monitoring system.

Description

A multi-object tracking method based on moving object detection in video monitoring
Technical field
The invention belongs to the fields of image processing and computer vision, and relates specifically to multi-target detection and tracking in a video monitoring system.
Background technology
The detection and tracking of targets in video sequences is an important research topic in computer vision. It is the basis of subsequent processing such as target classification and behavior analysis, and has important application value in numerous fields such as intelligent traffic control, intelligent video monitoring, and military guidance.
In recent years many target tracking methods have been proposed; they can be grouped into the following four classes:
(1) Region-based tracking: a target template is first established and the target is tracked against the template; the restriction is that the target must not be occluded, otherwise it is easily lost;
(2) Feature-based tracking: feature information such as the color and centroid of the target is extracted and the target is tracked by similarity; multiple kinds of information generally need to be fused;
(3) Contour-template-based tracking: the boundary contour of the target is taken as a template and the target contour is tracked in the edge images of subsequent frames, but this is sensitive to contour changes;
(4) Model-based tracking: a precise geometric model is generally used for tracking, with the model parameters updated in time; it tracks the target accurately but is computationally complex.
Because of the strong spatial correlation in video images and the temporal continuity of target motion, target tracking between consecutive frames can be realized by computing the overlap area of foreground targets in consecutive frames. For example, the Chinese patent with publication number CN101339608A discloses "a detection-based target tracking method and system"; that patent establishes a tracked-target queue and matches the current-frame detection results against the queue by position and scale to realize tracking and state updating, which improves real-time performance. However, when targets merge or separate, its reliability is low and tracking is unstable.
Summary of the invention
Technical problem: the object of the invention is to disclose a moving object detection and multi-object tracking method for video monitoring systems that tracks partially occluded targets, and multiple targets that separate again after merging, reliably and in real time, improving the stability and robustness of the video monitoring system.
Technical scheme: the technical scheme of the invention is as follows. First, foreground moving targets are detected by background subtraction. Then an incidence matrix is established between the foreground target blobs of the current frame and the targets detected in the previous frame, and is used to judge the state of each target (target disappearance, target unchanged, target merging, target separation, etc.); a second-stage tracking is applied to targets in the separated state. Finally, the position, area, kernel-weighted color histogram and other features of each target are updated, realizing the tracking of multiple targets. The concrete steps are as follows:
The multi-object tracking method based on moving object detection in video monitoring of the invention comprises the following steps:
A. Detect moving targets by background subtraction;
B. Compute the probability distribution of the kernel-weighted color histogram of each moving target;
C. Establish the incidence matrix between the foreground target blobs of the current frame and the targets detected in the previous frame;
D. Judge the state of each target, and apply a second-stage tracking to targets in the separated state;
E. Update the position, area and kernel-weighted color histogram features of each target.
In step A, the concrete steps for detecting moving targets by background subtraction are as follows:
A1. Subtract the background image from the current-frame grayscale image and take the absolute value to obtain the difference image;
A2. Binarize the difference image with an adaptive threshold;
A3. Post-process the binary image with morphological open-close operations, connected-region labeling, hole filling, and checks on connected-region area and the aspect ratio of the minimum bounding rectangle; after eliminating the influence of noise and background disturbance, obtain the foreground binary mask;
A4. Apply a logical AND between the foreground binary mask and the current-frame input image to obtain the detected moving targets.
In step C, the concrete steps for establishing the incidence matrix between the foreground blobs of the current frame and the targets detected in the previous frame are as follows:
C1. Let Obj_n and Blob_m denote the n-th target detected in the previous frame and the m-th foreground blob of the current frame respectively, and compute the overlap area S_mn between Blob_m and Obj_n, where m = 1, 2, ..., M, n = 1, 2, ..., N, and M, N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame;
C2. Build the incidence matrix S with M rows and N columns, whose elements are S_mn;
C3. Binarize the incidence matrix S, then compute the number of non-zero entries Row_m in each row and Col_n in each column of S.
In step D, according to the overlap relations between the targets detected in the previous frame and the foreground blobs of the current frame, the state of each target is classified into five cases:
D1. If Obj_n overlaps with no foreground blob, i.e. Col_n = 0, the target Obj_n is judged to have disappeared;
D2. If Obj_n overlaps with Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1, Row_m = 1, the target Obj_n is judged unchanged;
D3. If Obj_n overlaps only with Blob_m, but Blob_m also overlaps with other targets, i.e. Col_n = 1, S_mn = 1, Row_m > 1, the target Obj_n is judged to have merged with other targets;
D4. If Blob_m overlaps with no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
D5. If Obj_n overlaps with several foreground blobs, i.e. Col_n > 1, the target Obj_n is judged to be in the separated state.
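The five-case judgment above can be computed directly from the binarized incidence matrix and its row/column sums. The sketch below is illustrative rather than taken from the patent: the function name, the state labels, and the dict/list return layout are all assumptions.

```python
import numpy as np

def classify_targets(S, Row, Col):
    """Judge the state of every previous-frame target Obj_n, and detect
    new blobs, from the binarised M x N incidence matrix S (cases D1-D5)."""
    M, N = S.shape
    states = {}
    for n in range(N):
        if Col[n] == 0:
            states[n] = "disappeared"            # D1: no overlapping blob
        elif Col[n] == 1:
            m = int(np.argmax(S[:, n]))          # the single overlapping blob
            # D2: one-to-one overlap; D3: that blob also covers other targets
            states[n] = "unchanged" if Row[m] == 1 else "merged"
        else:
            states[n] = "separated"              # D5: Col_n > 1
    # D4: a blob that overlaps no previous-frame target is a new target
    new_targets = [m for m in range(M) if Row[m] == 0]
    return states, new_targets
```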
In step D5, for a target in the separated state, i.e. when Obj_n overlaps with several foreground blobs, a second-stage tracking combining area and color features is performed as follows:
D51. Delete every foreground blob whose area exceeds 1.5 times the area of Obj_n;
D52. Judge the cause of the separation from the mutual distances of the remaining foreground blobs;
D53. If the separation is caused by background occlusion, take the foreground region in the bounding rectangle of all the foreground blobs as the region corresponding to the target;
D54. If the separation has produced a new target, compute the probability distribution of the kernel-weighted color histogram of each candidate foreground blob, compute its Bhattacharyya coefficient ρ against the probability distribution of the kernel-weighted color histogram of the previous-frame target Obj_n, and take the foreground blob Blob_x with the largest ρ as the current target region matching Obj_n.
In step E, according to the matching relations between each previous-frame target and the current-frame foreground blobs obtained from the incidence matrix S, the position, area, color and other features of the target are updated; the concrete steps are:
E1. Obtain the position and area features of Blob_x and use them to update the position and area of Obj_n;
E2. Update the kernel-weighted color histogram of Obj_n with that of Blob_x.
Beneficial effects: experimental results show that the method of the invention can track partially occluded targets, and multiple targets that separate again after merging, reliably and in real time, improving the stability and robustness of a video monitoring system.
Compared with the prior art, the invention has the following advantages:
1) Targets are tracked automatically on the basis of moving targets detected by background subtraction. During tracking, the state of each target is classified into five cases: new target appearing, target disappearing, target unchanged, target merging, and target separating. Except for target separation, the four other cases rely only on region matching between consecutive frames, which simplifies the algorithm and improves real-time performance while still tracking effectively.
2) When a target is in the separated state, a second-stage tracking is performed according to the cause of the separation, combining the color, position, area and other features of the target. A kernel-weighted color histogram is used to describe the color information of the target; compared with an ordinary color histogram, it effectively reduces the interference of the background and of other separated targets, improving tracking reliability.
Description of drawings
Fig. 1 is the flowchart of the multi-object tracking method based on moving object detection of the invention.
Embodiment
The multi-object tracking method based on moving object detection in video monitoring of the invention is realized mainly through the following steps:
Step 1: detect moving targets by background subtraction
(1) Subtract the background image from the current-frame grayscale image and take the absolute value to obtain the difference image;
(2) Binarize the difference image with an adaptive threshold;
(3) Post-process the binary image with morphological open-close operations, connected-region labeling, hole filling, and checks on connected-region area and the aspect ratio of the minimum bounding rectangle; after eliminating the influence of noise and background disturbance, obtain the foreground binary mask;
(4) Apply a logical AND between the foreground binary mask and the current-frame input image to obtain the detected moving targets.
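Steps (1)-(4) can be sketched as below. The patent does not specify the exact form of the adaptive threshold, so a mean-plus-k-standard-deviations rule over the difference image is assumed here; the morphological post-processing of step (3) is omitted for brevity (in practice it would be done with a library such as OpenCV).

```python
import numpy as np

def detect_foreground(frame_gray, background, k=2.5):
    """Steps (1)-(2): absolute difference image + adaptive threshold.
    Step (4): mask the input frame. Step (3) post-processing is omitted."""
    diff = np.abs(frame_gray.astype(np.int16) - background.astype(np.int16))
    thresh = diff.mean() + k * diff.std()     # assumed adaptive threshold
    mask = (diff > thresh).astype(np.uint8)   # binary foreground mask
    foreground = frame_gray * mask            # logical AND with the frame
    return mask, foreground
```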
Step 2: compute the probability distribution of the kernel-weighted color histogram of each moving target
The color histogram is an effective characterization of a target: it accurately reflects the color information of the target, does not change with target shape, and is easy to compute, so it is widely used in target tracking.
Suppose R is the rectangular track window containing the target, centered at x_0. The probability distribution p = {p(u)}, u = 1, ..., K, of the kernel-weighted color histogram of R is computed by formula (1):

p(u) = C · Σ_{i=1}^{L} k((x_i − x_0)/h) · δ[b(x_i) − u]    (1)

In formula (1), C is a normalization coefficient chosen so that Σ_{u=1}^{K} p(u) = 1; K is the dimension of the color feature space, taken as K = 8 × 8 × 8 = 512; L is the number of pixels in the rectangular region; x_i is any point in the region, and ||x_i − x_0|| is the distance from x_i to x_0; h is the kernel bandwidth, taken as half the width of the track window; b(x_i) is the color bin of pixel x_i, obtained by quantizing its color feature into K levels; δ[·] is the unit impulse function, so δ[b(x_i) − u] equals 1 when the color feature of pixel x_i falls into the u-th bin; k(·) is the kernel function, defined as

k(x) = 1 − ||x||²  if ||x|| < 1,  and 0 otherwise.
As can be seen from the kernel expression, the region at the center of the target contributes most, while the border region, which is more likely to be occluded or affected by the background, contributes least. The kernel weights decrease fairly slowly, so a relatively large region still receives large weights, which is very favorable for tracking stability.
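Formula (1) can be implemented with a per-channel quantization and the kernel above. The default `bins = 8` gives K = 8 × 8 × 8 = 512 as in the text; the pixel-indexing scheme and function name are assumptions of this sketch.

```python
import numpy as np

def kernel_weighted_histogram(patch, bins=8):
    """Kernel-weighted color histogram p(u) of formula (1).
    `patch` is an H x W x 3 color array covering the track window R;
    each pixel's vote is weighted by k(x) = 1 - ||x||^2 for ||x|| < 1,
    with bandwidth h taken as half the window size."""
    H, W, _ = patch.shape
    ys, xs = np.mgrid[0:H, 0:W].astype(np.float64)
    # normalised distance from the window centre (h = half window size)
    dy = (ys - (H - 1) / 2) / (H / 2)
    dx = (xs - (W - 1) / 2) / (W / 2)
    r2 = dx**2 + dy**2
    w = np.where(r2 < 1.0, 1.0 - r2, 0.0)          # kernel weights k(.)
    # b(x_i): quantised colour index of each pixel, K = bins**3 features
    q = (patch.astype(np.int64) * bins) // 256     # per-channel level 0..bins-1
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]
    p = np.bincount(idx.ravel(), weights=w.ravel(), minlength=bins**3)
    return p / p.sum()                             # C normalises the sum to 1
```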
Step 3: establish the incidence matrix between the foreground blobs of the current frame and the targets detected in the previous frame
(1) Let Obj_n and Blob_m denote the n-th target detected in the previous frame and the m-th foreground blob of the current frame respectively, and compute the overlap area S_mn between Blob_m and Obj_n:

S_mn = A(Obj_n) ∩ A(Blob_m)    (2)

where m ∈ {1, 2, ..., M}, n ∈ {1, 2, ..., N}; M and N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame; A(·) denotes the bounding rectangle (and its area) of a target, and ∩ denotes the overlap area;
(2) Build the incidence matrix S with M rows and N columns, whose elements are S_mn (m = 1, 2, ..., M; n = 1, 2, ..., N):

S = [S_mn]_{M×N}    (3)

(3) Binarize the incidence matrix S by setting

S_mn = 1 if S_mn > α · min(A(Obj_n), A(Blob_m)), and S_mn = 0 otherwise.    (4)

In formula (4), α is a coefficient, here α = 0.3, and min(·) takes the minimum; S_mn = 1 means that Blob_m effectively overlaps with Obj_n, otherwise S_mn = 0.
(4) Compute the number of non-zero entries Row_m in each row and Col_n in each column of the incidence matrix S:

Row_m = Σ_{j=1}^{N} S_mj    (5)

Col_n = Σ_{i=1}^{M} S_in    (6)
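Formulas (2)-(6) amount to the following sketch, assuming targets and blobs are represented by axis-aligned bounding rectangles (x, y, w, h); the rectangle convention and the function name are illustrative.

```python
import numpy as np

def incidence_matrix(objs, blobs, alpha=0.3):
    """Build and binarise the M x N incidence matrix S of formulas (2)-(6).
    `objs` are previous-frame targets Obj_n, `blobs` the current-frame
    foreground blobs Blob_m, both as rectangles (x, y, w, h)."""
    def overlap(a, b):
        # S_mn: intersection area of the two bounding rectangles, formula (2)
        ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
        iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
        return ix * iy

    M, N = len(blobs), len(objs)
    S = np.zeros((M, N))
    for m, blob in enumerate(blobs):
        for n, obj in enumerate(objs):
            area = overlap(blob, obj)
            # formula (4): keep only "effective" overlaps (alpha = 0.3)
            S[m, n] = 1.0 if area > alpha * min(blob[2] * blob[3],
                                                obj[2] * obj[3]) else 0.0
    Row = S.sum(axis=1)   # formula (5): non-zero entries per row
    Col = S.sum(axis=0)   # formula (6): non-zero entries per column
    return S, Row, Col
```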
Step 4: judge the state of each target, and apply a second-stage tracking to targets in the separated state. According to the overlap relations between the targets detected in the previous frame and the foreground blobs of the current frame, the state of each target is judged as follows:
(1) If Obj_n overlaps with no foreground blob, i.e. Col_n = 0, the target Obj_n is judged to have disappeared;
(2) If Obj_n overlaps with Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1, Row_m = 1, the target Obj_n is judged unchanged;
(3) If Obj_n overlaps only with Blob_m, but Blob_m also overlaps with other targets, i.e. Col_n = 1, S_mn = 1, Row_m > 1, the target Obj_n is judged to have merged with other targets;
(4) If Blob_m overlaps with no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
(5) If Obj_n overlaps with J foreground blobs, i.e. Col_n = J with J > 1, the target Obj_n is judged to be in the separated state, and a second-stage tracking combining area, kernel-weighted color histogram and other features is performed. The task of the second-stage tracking is to pick out accurately, among these blobs, the one or several foreground blobs that correspond to target Obj_n; its concrete steps are as follows:
1. Delete every foreground blob whose area exceeds 1.5 times the area of Obj_n, that is, if A(Blob_m) > 1.5 A(Obj_n) then set S_mn = 0;
2. Compute the maximum, over the remaining blobs, of each blob's minimum distance to the others: d_max = max[min(d(Blob_i, Blob_j))];
3. If d_max ≤ T (T is a distance threshold, here T = 10), the split is judged to be caused by background occlusion; take the foreground region in the bounding rectangle of all the foreground blobs as the region corresponding to the target, and update the target;
4. If d_max > T, a new target is judged to have split off from the original one: compute by formula (1) the probability distribution q = {q(u)}, u = 1, ..., K, of the kernel-weighted color histogram of each candidate foreground blob, compute by formula (7) its Bhattacharyya coefficient ρ against the probability distribution p(u) of the kernel-weighted color histogram of the previous-frame target Obj_n, and take the foreground blob Blob_x with the largest ρ as the current target region matching Obj_n;

ρ = Σ_{u=1}^{K} √(p(u) · q(u))    (7)
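The second-stage tracking of case (5) can be sketched as below. The centre-to-centre distance used for d(Blob_i, Blob_j) is one plausible reading (the patent does not define the blob distance precisely), and the return conventions are assumptions of this sketch; formula (7) appears inline as the Bhattacharyya coefficient.

```python
import numpy as np

def second_stage_tracking(obj_rect, obj_hist, blobs, hists, T=10, ratio=1.5):
    """Two-stage tracking for a separated target Obj_n (steps 1-4 above).
    `blobs` are candidate rectangles (x, y, w, h) overlapping Obj_n,
    `hists` their kernel-weighted histograms, `obj_hist` the target's."""
    area = obj_rect[2] * obj_rect[3]
    # step 1: discard blobs larger than 1.5x the target area
    keep = [i for i, b in enumerate(blobs) if b[2] * b[3] <= ratio * area]

    def dist(a, b):  # gap between rectangle centres (assumed metric)
        return float(np.hypot(a[0] + a[2] / 2 - b[0] - b[2] / 2,
                              a[1] + a[3] / 2 - b[1] - b[3] / 2))

    # step 2: largest of the pairwise nearest-neighbour distances
    d_max = (max(min(dist(blobs[i], blobs[j]) for j in keep if j != i)
                 for i in keep) if len(keep) > 1 else 0.0)
    if d_max <= T:
        # step 3: split caused by background occlusion -> union rectangle
        xs = [blobs[i][0] for i in keep]
        ys = [blobs[i][1] for i in keep]
        xe = [blobs[i][0] + blobs[i][2] for i in keep]
        ye = [blobs[i][1] + blobs[i][3] for i in keep]
        return ('merged', (min(xs), min(ys), max(xe) - min(xs), max(ye) - min(ys)))
    # step 4: genuine separation -> blob maximising Bhattacharyya rho (7)
    rho = [float(np.sum(np.sqrt(obj_hist * hists[i]))) for i in keep]
    return ('matched', keep[int(np.argmax(rho))])
```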
Step 5: according to the matching relations between each previous-frame target and the current-frame foreground blobs obtained from the incidence matrix S, update the position, area, color and other features of the target:
(1) Obtain the position and area features of Blob_x and use them to update the position and area of Obj_n;
(2) Update the kernel-weighted color histogram of Obj_n with that of Blob_x.

Claims (6)

  1. A multi-object tracking method based on moving object detection in video monitoring, characterized in that the method comprises the following steps:
    A. Detect moving targets by background subtraction;
    B. Compute the probability distribution of the kernel-weighted color histogram of each moving target;
    C. Establish the incidence matrix between the foreground target blobs of the current frame and the targets detected in the previous frame;
    D. Judge the state of each target, and apply a second-stage tracking to targets in the separated state;
    E. Update the position, area and kernel-weighted color histogram features of each target.
  2. The multi-object tracking method based on moving object detection in video monitoring according to claim 1, characterized in that in step A, the concrete steps for detecting moving targets by background subtraction are as follows:
    A1. Subtract the background image from the current-frame grayscale image and take the absolute value to obtain the difference image;
    A2. Binarize the difference image with an adaptive threshold;
    A3. Post-process the binary image with morphological open-close operations, connected-region labeling, hole filling, and checks on connected-region area and the aspect ratio of the minimum bounding rectangle; after eliminating the influence of noise and background disturbance, obtain the foreground binary mask;
    A4. Apply a logical AND between the foreground binary mask and the current-frame input image to obtain the detected moving targets.
  3. The multi-object tracking method based on moving object detection in video monitoring according to claim 1, characterized in that in step C, the concrete steps for establishing the incidence matrix between the foreground blobs of the current frame and the targets detected in the previous frame are as follows:
    C1. Let Obj_n and Blob_m denote the n-th target detected in the previous frame and the m-th foreground blob of the current frame respectively, and compute the overlap area S_mn between Blob_m and Obj_n, where m = 1, 2, ..., M, n = 1, 2, ..., N, and M, N are respectively the number of foreground blobs in the current frame and the number of targets detected in the previous frame;
    C2. Build the incidence matrix S with M rows and N columns, whose elements are S_mn;
    C3. Binarize the incidence matrix S, then compute the number of non-zero entries Row_m in each row and Col_n in each column of S.
  4. The multi-object tracking method based on moving object detection in video monitoring according to claim 1, characterized in that in step D, according to the overlap relations between the targets detected in the previous frame and the foreground blobs of the current frame, the state of each target is classified into five cases:
    D1. If Obj_n overlaps with no foreground blob, i.e. Col_n = 0, the target Obj_n is judged to have disappeared;
    D2. If Obj_n overlaps with Blob_m in a one-to-one relationship, i.e. Col_n = 1, S_mn = 1, Row_m = 1, the target Obj_n is judged unchanged;
    D3. If Obj_n overlaps only with Blob_m, but Blob_m also overlaps with other targets, i.e. Col_n = 1, S_mn = 1, Row_m > 1, the target Obj_n is judged to have merged with other targets;
    D4. If Blob_m overlaps with no target of the previous frame, i.e. Row_m = 0, a new target is judged to have appeared;
    D5. If Obj_n overlaps with several foreground blobs, i.e. Col_n > 1, the target Obj_n is judged to be in the separated state.
  5. The multi-object tracking method based on moving object detection in video monitoring according to claim 4, characterized in that in step D5, for a target in the separated state, i.e. when Obj_n overlaps with several foreground blobs, a second-stage tracking combining area and color features is performed as follows:
    D51. Delete every foreground blob whose area exceeds 1.5 times the area of Obj_n;
    D52. Judge the cause of the separation from the mutual distances of the remaining foreground blobs;
    D53. If the separation is caused by background occlusion, take the foreground region in the bounding rectangle of all the foreground blobs as the region corresponding to the target;
    D54. If the separation has produced a new target, compute the probability distribution of the kernel-weighted color histogram of each candidate foreground blob, compute its Bhattacharyya coefficient ρ against the probability distribution of the kernel-weighted color histogram of the previous-frame target Obj_n, and take the foreground blob Blob_x with the largest ρ as the current target region matching Obj_n.
  6. The multi-object tracking method based on moving object detection in video monitoring according to claim 1, characterized in that in step E, according to the matching relations between each previous-frame target and the current-frame foreground blobs obtained from the incidence matrix S, the position, area, color and other features of the target are updated; the concrete steps are:
    E1. Obtain the position and area features of Blob_x and use them to update the position and area of Obj_n;
    E2. Update the kernel-weighted color histogram of Obj_n with that of Blob_x.
CN2010102212904A 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring Expired - Fee Related CN101887587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2010102212904A CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2010102212904A CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Publications (2)

Publication Number Publication Date
CN101887587A true CN101887587A (en) 2010-11-17
CN101887587B CN101887587B (en) 2012-05-23

Family

ID=43073497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2010102212904A Expired - Fee Related CN101887587B (en) 2010-07-07 2010-07-07 Multi-target track method based on moving target detection in video monitoring

Country Status (1)

Country Link
CN (1) CN101887587B (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102081801A (en) * 2011-01-26 2011-06-01 上海交通大学 Multi-feature adaptive fused ship tracking and track detecting method
CN102254396A (en) * 2011-07-06 2011-11-23 北京国铁华晨通信信息技术有限公司 Intrusion detection method and device based on video
CN102842139A (en) * 2012-07-19 2012-12-26 电子科技大学 Method for calculating target locus
CN102930554A (en) * 2011-08-11 2013-02-13 天津市亚安科技股份有限公司 Method and system for accurately capturing target in monitored scene
CN103020577A (en) * 2011-09-20 2013-04-03 佳都新太科技股份有限公司 Moving target identification method based on hog characteristic and system
CN103065325A (en) * 2012-12-20 2013-04-24 中国科学院上海微系统与信息技术研究所 Target tracking method based on color distance of multicolors and image dividing and aggregating
CN103123726A (en) * 2012-09-07 2013-05-29 佳都新太科技股份有限公司 Target tracking algorithm based on movement behavior analysis
CN103400157A (en) * 2013-07-23 2013-11-20 青岛海信网络科技股份有限公司 Road pedestrian and non-motor vehicle detection method based on video analysis
CN103425960A (en) * 2012-05-25 2013-12-04 信帧电子技术(北京)有限公司 Method for detecting fast-moving objects in video
CN103677734A (en) * 2012-09-25 2014-03-26 中国航天科工集团第二研究院二〇七所 Multi-target data association algorithm based on feature matching matrix
CN103729861A (en) * 2014-01-03 2014-04-16 天津大学 Multiple object tracking method
CN103886607A (en) * 2014-01-24 2014-06-25 清华大学深圳研究生院 Detection and suppression method for disturbance target
CN103927763A (en) * 2014-03-24 2014-07-16 河海大学 Identification processing method for multi-target tracking tracks of image sequences
CN104079798A (en) * 2013-03-25 2014-10-01 日电(中国)有限公司 Image detection method and device as well as video monitoring system
CN104091348A (en) * 2014-05-19 2014-10-08 南京工程学院 Multi-target tracking method integrating obvious characteristics and block division templates
CN104732187A (en) * 2013-12-18 2015-06-24 杭州华为企业通信技术有限公司 Method and equipment for image tracking processing
CN105118072A (en) * 2015-08-19 2015-12-02 西华大学 Method and device for tracking multiple moving targets
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN106297130A (en) * 2016-08-22 2017-01-04 国家电网公司 Transmission line of electricity video analysis early warning system
CN106327502A (en) * 2016-09-06 2017-01-11 山东大学 Multi-scene multi-target recognition and tracking method in security video
CN106530325A (en) * 2016-10-26 2017-03-22 合肥润客软件科技有限公司 Multi-target visual detection and tracking method
CN106709521A (en) * 2016-12-26 2017-05-24 深圳极视角科技有限公司 Fire pre-warning method and fire pre-warning system based on convolution neural network and dynamic tracking
CN107636683A (en) * 2015-03-27 2018-01-26 三星电子株式会社 Utilize the movable method and apparatus of accelerometer identification user
CN108229459A (en) * 2018-01-04 2018-06-29 北京环境特性研究所 A kind of method for tracking target
CN108932730A (en) * 2018-05-31 2018-12-04 哈工大机器人(昆山)有限公司 Video multi-target tracking and system based on data correlation
CN110136174A (en) * 2019-05-22 2019-08-16 北京华捷艾米科技有限公司 A kind of target object tracking and device
CN113052853A (en) * 2021-01-25 2021-06-29 广东技术师范大学 Video target tracking method and device in complex environment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1972370A (en) * 2005-11-23 2007-05-30 中国科学院沈阳自动化研究所 Real-time multi-target mark and centroid operation method
CN101621615A (en) * 2009-07-24 2010-01-06 南京邮电大学 Self-adaptive background modeling and moving target detecting method


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Lu Guanming, Lang Sujuan, "Background modeling and moving object detection based on the YCbCr color space", Journal of Nanjing University of Posts and Telecommunications (Natural Science), Vol. 29, No. 6, 2009, pp. 17-22.
Xu Fangming, Lu Guanming, "A moving object detection algorithm based on an improved Surendra background update algorithm", Shanxi Electronic Technology, No. 5, 2009, pp. 39-40.
Yang Haibo, Yao Qingdong, Jing Renjie, "A segmentation method for moving objects in image sequences based on blob matching", Journal of Zhejiang University, Vol. 35, No. 4, 2001, pp. 365-369.

CN103886607B (en) * 2014-01-24 2016-08-17 清华大学深圳研究生院 A kind of detection for disturbance target and suppressing method
CN103886607A (en) * 2014-01-24 2014-06-25 清华大学深圳研究生院 Detection and suppression method for disturbance target
CN103927763A (en) * 2014-03-24 2014-07-16 河海大学 Identification processing method for multi-target tracking tracks of image sequences
CN103927763B (en) * 2014-03-24 2016-08-17 河海大学 A kind of image sequence multiple target tracking track identification processing method
CN104091348A (en) * 2014-05-19 2014-10-08 南京工程学院 Multi-target tracking method integrating obvious characteristics and block division templates
CN104091348B (en) * 2014-05-19 2017-04-05 南京工程学院 The multi-object tracking method of fusion marked feature and piecemeal template
CN107636683A (en) * 2015-03-27 2018-01-26 三星电子株式会社 Utilize the movable method and apparatus of accelerometer identification user
CN105118072A (en) * 2015-08-19 2015-12-02 西华大学 Method and device for tracking multiple moving targets
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN106297130A (en) * 2016-08-22 2017-01-04 国家电网公司 Transmission line of electricity video analysis early warning system
CN106327502A (en) * 2016-09-06 2017-01-11 山东大学 Multi-scene multi-target recognition and tracking method in security video
CN106530325A (en) * 2016-10-26 2017-03-22 合肥润客软件科技有限公司 Multi-target visual detection and tracking method
CN106709521A (en) * 2016-12-26 2017-05-24 深圳极视角科技有限公司 Fire pre-warning method and fire pre-warning system based on convolution neural network and dynamic tracking
CN108229459A (en) * 2018-01-04 2018-06-29 北京环境特性研究所 A kind of method for tracking target
CN108229459B (en) * 2018-01-04 2020-11-20 北京环境特性研究所 Target tracking method
CN108932730A (en) * 2018-05-31 2018-12-04 哈工大机器人(昆山)有限公司 Video multi-target tracking and system based on data correlation
CN110136174A (en) * 2019-05-22 2019-08-16 北京华捷艾米科技有限公司 A kind of target object tracking and device
CN110136174B (en) * 2019-05-22 2021-06-22 北京华捷艾米科技有限公司 Target object tracking method and device
CN113052853A (en) * 2021-01-25 2021-06-29 广东技术师范大学 Video target tracking method and device in complex environment

Also Published As

Publication number Publication date
CN101887587B (en) 2012-05-23

Similar Documents

Publication Publication Date Title
CN101887587B (en) Multi-target track method based on moving target detection in video monitoring
CN103281477B (en) Multi-target track method based on multi-level characteristic association
CN102332092B (en) Flame detection method based on video analysis
CN102165493B (en) Detection of vehicles in an image
Santosh et al. Tracking multiple moving objects using gaussian mixture model
CN103035013B (en) A kind of precise motion shadow detection method based on multi-feature fusion
KR101653278B1 (en) Face tracking system using colar-based face detection method
CN103942536B (en) Multi-target tracking method of iteration updating track model
Timofte et al. Combining traffic sign detection with 3D tracking towards better driver assistance
CN101715111B (en) Method for automatically searching abandoned object in video monitoring
CN104978567A (en) Vehicle detection method based on scenario classification
CN104036250A (en) Video pedestrian detecting and tracking method
CN106529461A (en) Vehicle model identifying algorithm based on integral characteristic channel and SVM training device
CN106503748A (en) A kind of based on S SIFT features and the vehicle targets of SVM training aids
CN106056078A (en) Crowd density estimation method based on multi-feature regression ensemble learning
Sheng et al. Real-time anti-interference location of vehicle license plates using high-definition video
Zhan et al. Pedestrian detection and behavior recognition based on vision
Avery et al. Investigation into shadow removal from traffic images
Abdullah et al. Vehicles detection system at different weather conditions
CN107169439A (en) A kind of Pedestrians and vehicles detection and sorting technique
Ghasemi et al. A real-time multiple vehicle classification and tracking system with occlusion handling
CN105118073A (en) Human body head target identification method based on Xtion camera
CN105989615A (en) Pedestrian tracking method based on multi-feature fusion
CN113221603A (en) Method and device for detecting shielding of monitoring equipment by foreign matters
Borhade et al. Advanced driver assistance system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 2012-05-23

Termination date: 2015-07-07

EXPY Termination of patent right or utility model