CN105844634A - Multi-motion-object video monitoring system and tracking monitoring method thereof - Google Patents


Info

Publication number
CN105844634A
Authority
CN
China
Prior art keywords
target
module
data processing
control unit
video data
Prior art date
Legal status
Granted
Application number
CN201610160824.4A
Other languages
Chinese (zh)
Other versions
CN105844634B (en)
Inventor
冯莹莹
Current Assignee
Fuyang Normal University
Original Assignee
Fuyang Normal University
Priority date
Filing date
Publication date
Application filed by Fuyang Normal University
Priority to CN201610160824.4A
Publication of CN105844634A
Application granted
Publication of CN105844634B
Expired - Fee Related
Anticipated expiration


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence


Abstract

The invention discloses a multi-moving-object video monitoring system and a tracking monitoring method thereof, relating to the technical field of video monitoring. The system comprises a plurality of cameras, a plurality of video data processing units, a central control unit, a user display terminal, a target positioning module for locating moving objects, and a camera control module for controlling the deflection direction and angle of the pan-tilt cameras. Each video data processing unit is connected to the central control unit through Ethernet; the cameras are pan-tilt cameras, each of which is calibrated; the camera control module is connected to the cameras. Each pan-tilt camera is provided with its own video data processing unit to process that camera's image sequence, which reduces the load on the central control unit and makes the system easy to expand. Providing a separate target positioning module and camera control module achieves accurate target positioning and timely camera control.

Description

Multi-moving-object video monitoring system and tracking monitoring method thereof
Technical field
The present invention relates to the technical field of video monitoring, and in particular to a multi-moving-object video monitoring system and a tracking monitoring method thereof.
Background technology
With the development of information acquisition technology, video monitoring systems have been widely applied in many fields and help people obtain information about a monitored scene in real time; for example, parents at home or at work can see their child's activity at kindergarten through video monitoring. In a practical video monitoring system it is nearly impossible for a single fixed camera to monitor an entire scene, especially a large one. A good solution is to form a video surveillance network from multiple rotatable cameras that cooperate to provide continuous, seamless monitoring. However, this places certain requirements on the fusion of information from multiple cameras, the assignment of camera tasks, and the handling of moving targets occluded by obstacles. The prior art suffers, to a greater or lesser extent, from problems such as unsynchronized information from multiple cameras, delayed or even disordered camera control information, inaccurate target positioning, and inappropriate occlusion handling.
Summary of the invention
In view of the above problems, the present invention provides a multi-moving-object video monitoring system and tracking monitoring method in which the cameras are controlled in real time and accurately, and the system is easy to expand.
To solve the above technical problem, the technical solution of the present invention is as follows:
A multi-moving-object video monitoring system includes several cameras and video data processing units, a central control unit, and a user display terminal. Each camera is connected to its corresponding video data processing unit, the video data processing units are connected to the central control unit, and the central control unit is connected to the user display terminal. The system further includes a target positioning module for locating moving targets and a camera control module for controlling the deflection direction and angle of the pan-tilt cameras. The video data processing units are connected to the central control unit through Ethernet. The cameras are pan-tilt cameras, each of which is calibrated, and the camera control module is connected to the cameras. Each video data processing unit includes a target detection module for extracting moving-target features, a target tracking module for finding the candidate target most similar to the moving target, and a communication submodule for exchanging data with the central control unit. The camera is connected to the target detection module, the target detection module is connected to the target tracking module, and the communication submodule is bidirectionally connected to the target tracking module, the target positioning module, and the camera control module. The central control unit includes a communication module connected to the communication submodules, a virtual synchronization module for synchronizing the received data, a data association module for integrating the local information obtained by each camera, and a camera management module for determining the deflection direction and angle each camera needs to track a moving target. The communication module is connected to the communication submodule of each video data processing unit, and the virtual synchronization module, the data association module, and the camera management module are bidirectionally connected to the communication module. The user display terminal displays the motion trajectory of each moving target.
Further, the central control unit may have a cascade structure comprising primary control units and a final-level control unit. Each primary control unit is connected to the video data processing units in its own region, the primary control units of different regions are linked together through the final-level control unit, and the user display terminal is connected to the final-level control unit.
Further, the virtual synchronization module includes a time synchronization module and a feature synchronization module. The time synchronization module is connected to the video data processing units through serial ports and sends a time synchronization signal to each video data processing unit at a certain frequency. The feature synchronization module temporarily stores the received target information; when a data packet arrives, it parses the target information and the packing time in the packet and uses the temporarily stored target information to estimate, at that time point, the target information transmitted by the other video data processing units, thereby synchronizing them.
A multi-moving-object tracking monitoring method comprises the following steps:
Step 1: the cameras are calibrated;
Step 2: each camera acquires the video of its monitored region in real time and transmits it to its video data processing unit;
Step 3: the video data processing unit performs three-frame-difference detection on the image sequence to obtain the feature information of the moving targets;
Step 4: the video data processing unit combines Kalman filtering with the mean-shift algorithm to track the moving targets and obtain candidate target regions;
Step 5: feature matching of the moving targets is performed within the candidate target regions to obtain the candidate targets;
Step 6: the target positioning module determines the position of each candidate target in combination with DEM data;
Step 7: the central control unit synchronizes the candidate target information from the different video data processing units;
Step 8: the central control unit establishes a matching model to achieve target association and track association;
Step 9: the camera management module calculates the deflection direction and angle of each camera from the virtual target information and transmits the data together to all camera control modules;
Step 10: each camera control module receives the control information for the different cameras, extracts only the control information matching itself, and controls its camera's deflection;
wherein step 4 uses Kalman filtering combined with an improved mean-shift algorithm, specifically including:
Step 4.1: automatically initialize a stable new target and determine the initial state vector of the Kalman filter;
Step 4.2: use the initialized Kalman filter to predict the candidate target position in the current frame;
Step 4.3: take the candidate target position predicted by the Kalman filter as the iteration starting point of the mean-shift algorithm and run the mean-shift iteration until convergence to obtain the predicted target position;
Step 4.4: use the predicted position obtained by the mean-shift algorithm as the observation vector of the Kalman filter and update the Kalman filter;
Step 4.5: using the updated Kalman filter, repeat steps 4.2 to 4.4 to calculate the target position in the next frame.
Further, in step 5, if the moving target is detected in the current frame image, the position (x1, y1) of the candidate target obtained by feature matching and the target position (x2, y2) predicted by mean-shift tracking are weighted and averaged to obtain the final target position (x, y); IIR filtering is applied to (x, y), and the resulting target motion state vector in the current frame serves as the observation vector of the Kalman filter. If the moving target is not detected in the current frame image, the number of consecutive frames in which the moving target has been missing is examined: if it exceeds a set threshold, the target is judged to have disappeared; if it does not exceed the threshold, the target is assumed to be occluded, the target position predicted by mean-shift tracking is stored as the true value of the occluded moving target, and that position is used as the Kalman filter's observation vector in the next frame.
Further, the synchronization in step 7 includes time synchronization and feature synchronization. For time synchronization, the central control unit sends a time synchronization signal to each video data processing unit at a certain frequency, and each video data processing unit aligns itself to the central control unit's time. Feature synchronization is accomplished by a chain of target buffer pools. Each target buffer pool consists of a real-feature data buffer and a virtual-feature data buffer: the real-feature data buffer stores the r real feature data items received by the central control unit from a given video data processing unit, with timestamps t, t + k1, …, t + kr, and the virtual-feature data buffer accumulates s interpolated or predicted results, with timestamps t', t' + k'1, …, t' + k's. The n target buffer pools belonging to one video data processing unit form a chain structure. The implementation comprises the following steps:
S101: when the central control unit receives real feature data from any video data processing unit, it traverses all connected video data processing units; for the video data processing unit matching the data source it performs S102, and otherwise it performs S103. Once all connected video data processing units have been processed, it continues with the next round of data reception;
S102: traverse the real-feature data chain and look for the same target number in the target buffer pool chain, updating the real-feature data buffer and the virtual-feature data buffer in first-in-first-out fashion; if no pool with the same target number exists in the target buffer pool chain, add a buffer pool for this target to the chain and initialize it; when finished, return to S101;
S103: traverse the target buffer pool chain, take the detection time of the received data as the interpolation variable and the data in the real-feature buffer as interpolation nodes, run the prediction algorithm to obtain virtual feature data, and update the virtual-feature data buffer in first-in-first-out fashion; when finished, return to S101.
Beneficial effects of the present invention: each pan-tilt camera is equipped with its own video data processing unit to process its image sequence, which lightens the load on the central control unit and makes the system easy to expand; a separate target positioning module and camera control module achieve accurate target positioning and timely camera control; information from different cameras is synchronized, solving the problem of unsynchronized data; and occluded moving targets are handled appropriately, ensuring the continuity of the target motion tracks.
Brief description of the drawings
Fig. 1 is a structural diagram of the present invention;
Fig. 2 is a schematic diagram of the cascade structure of the central control unit;
Fig. 3 is a flow chart of the target tracking algorithm;
Fig. 4 is a flow chart of the moving-target matching process.
Detailed description of the invention
The present invention is explained in further detail below with reference to the accompanying drawings and specific embodiments. The embodiments of the invention are given by way of example and description; they are not exhaustive and do not limit the invention to the forms disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described to better illustrate the principles and practical application of the invention, so that those of ordinary skill in the art can understand the invention and devise various embodiments, with various modifications, suited to particular uses.
Embodiment
The present invention protects a multi-moving-object video monitoring system. As shown in Fig. 1, it includes several cameras 10 and video data processing units 20, a central control unit 30, and a user display terminal 40. Each camera 10 is connected to its corresponding video data processing unit 20, the video data processing units 20 are connected to the central control unit 30, and the central control unit 30 is connected to the user display terminal 40. The system further includes a target positioning module 50 for locating moving targets and a camera control module 60 for controlling the deflection direction and angle of the pan-tilt cameras. The video data processing units 20 are connected to the central control unit 30 through Ethernet. The cameras 10 are pan-tilt cameras, each of which is calibrated, and the camera control module 60 is connected to the cameras 10. Each video data processing unit 20 includes a target detection module 201 for extracting moving-target features, a target tracking module 202 for finding the candidate target most similar to the moving target, and a communication submodule 203 for exchanging data with the central control unit. The camera 10 is connected to the target detection module 201, the target detection module 201 is connected to the target tracking module 202, and the communication submodule 203 is bidirectionally connected to the target tracking module 202, the target positioning module 50, and the camera control module 60. The central control unit 30 includes a communication module 301 connected to the communication submodules, a virtual synchronization module 302 for synchronizing the received data, a data association module 303 for integrating the local information obtained by each camera, and a camera management module 304 for determining the deflection direction and angle each camera needs to track a moving target. The communication module 301 is connected to the communication submodule 203 of each video data processing unit, and the virtual synchronization module 302, the data association module 303, and the camera management module 304 are bidirectionally connected to the communication module 301. The user display terminal 40 displays the motion trajectory of each moving target.
The virtual synchronization module 302 includes a time synchronization module and a feature synchronization module. The time synchronization module is connected to the video data processing units through serial ports and sends a time synchronization signal to each video data processing unit at a certain frequency. The feature synchronization module temporarily stores the received target information; when a data packet arrives, it parses the target information and the packing time in the packet and uses the temporarily stored target information to estimate, at that time point, the target information transmitted by the other video data processing units, thereby synchronizing them.
The central control unit may also have a cascade structure. As shown in Fig. 2, it includes primary control units 30 and a final-level control unit 70. Each primary control unit 30 is connected to the video data processing units 20 in its own region, the primary control units 30 of different regions are linked together through the final-level control unit 70, and the user display terminal 40 is connected to the final-level control unit 70. The final-level control unit 70 is similar in structure to a primary control unit 30 and includes a final-level communication module 701 for connecting to the primary control units, a final-level virtual synchronization module 702 for synchronizing the received data, and a final-level data association module 703 for integrating the local information obtained by each camera. The final-level virtual synchronization module 702 and the final-level data association module 703 are bidirectionally connected to the final-level communication module 701. The final-level control unit 70 only needs to perform virtual synchronization and data association on the overlapping scenes observed across regions.
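The division of labour in this cascade, where the final-level unit reconciles only targets reported from overlapping regions, can be sketched as follows. This is an illustrative sketch under assumed names and a toy association rule (average the positions of a target id reported by more than one region); none of these identifiers come from the patent.

```python
# Sketch of the cascaded control units of Fig. 2: each primary control unit
# aggregates its own region's targets; the final-level unit merges reports
# and reconciles targets seen from more than one region (the "overlap" case).
# Class and method names are illustrative assumptions.

class PrimaryControlUnit:
    def __init__(self, region):
        self.region = region
        self.targets = {}                 # target id -> (x, y) position

    def report(self):
        return {self.region: dict(self.targets)}

class FinalControlUnit:
    def __init__(self, primaries):
        self.primaries = primaries

    def merged_view(self):
        """Collect all reports; average positions of targets seen twice."""
        seen = {}
        for p in self.primaries:
            for region, targets in p.report().items():
                for tid, pos in targets.items():
                    seen.setdefault(tid, []).append((region, pos))
        view = {}
        for tid, hits in seen.items():
            if len(hits) == 1:
                view[tid] = hits[0][1]    # only one region saw it
            else:                         # overlap region: associate reports
                view[tid] = tuple(sum(c) / len(hits)
                                  for c in zip(*(pos for _, pos in hits)))
        return view

a, b = PrimaryControlUnit("east"), PrimaryControlUnit("west")
a.targets = {"car1": (10.0, 5.0)}
b.targets = {"car1": (12.0, 5.0), "car2": (0.0, 0.0)}
view = FinalControlUnit([a, b]).merged_view()
```

Here "car1" lies in the overlap of the two regions, so only it requires cross-region association, which mirrors why the final-level unit's workload stays small.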
The present invention also protects a multi-moving-object tracking monitoring method, comprising the following steps:
Step 1: the cameras are calibrated;
Step 2: each camera acquires the video of its monitored region in real time and transmits it to its video data processing unit;
Step 3: the video data processing unit performs three-frame-difference detection on the image sequence to obtain the feature information of the moving targets;
Step 4: the video data processing unit combines Kalman filtering with the mean-shift algorithm to track the moving targets and obtain candidate target regions;
Step 5: feature matching of the moving targets is performed within the candidate target regions to obtain the candidate targets;
Step 6: the target positioning module determines the position of each candidate target in combination with DEM data;
Step 7: the central control unit synchronizes the candidate target information from the different video data processing units;
Step 8: the central control unit establishes a matching model to achieve target association and track association;
Step 9: the camera management module calculates the deflection direction and angle of each camera from the virtual target information and transmits the data together to all camera control modules;
Step 10: each camera control module receives the control information for the different cameras, extracts only the control information matching itself, and controls its camera's deflection;
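Step 3's three-frame differencing can be sketched on one-dimensional grey-level "frames" as follows: a pixel is flagged as moving only when it differs from both of the two preceding frames, which suppresses the ghost left behind by two-frame differencing. The threshold value and the toy frames are assumptions for illustration, not values from the patent.

```python
# Three-frame difference detection (step 3), sketched in 1-D.
THRESH = 20  # absolute grey-level difference treated as motion (assumed)

def three_frame_diff(f_prev2, f_prev, f_cur, thresh=THRESH):
    """Return a binary motion mask: AND of two consecutive frame differences."""
    d1 = [abs(a - b) > thresh for a, b in zip(f_prev, f_prev2)]
    d2 = [abs(a - b) > thresh for a, b in zip(f_cur, f_prev)]
    return [x and y for x, y in zip(d1, d2)]

# A bright blob moves one pixel to the right across three frames.
f1 = [10, 200, 10, 10, 10]
f2 = [10, 10, 200, 10, 10]
f3 = [10, 10, 10, 200, 10]
mask = three_frame_diff(f1, f2, f3)
```

Only the blob's position in the middle frame survives the AND, so the mask localizes the target in the frame being processed rather than smearing it across two positions.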
wherein step 4 uses Kalman filtering combined with an improved mean-shift algorithm; as shown in Fig. 3, it specifically includes:
Step 4.1: automatically initialize a stable new target and determine the initial state vector of the Kalman filter;
Step 4.2: use the initialized Kalman filter to predict the candidate target position in the current frame;
Step 4.3: take the candidate target position predicted by the Kalman filter as the iteration starting point of the mean-shift algorithm and run the mean-shift iteration until convergence to obtain the predicted target position;
Step 4.4: use the predicted position obtained by the mean-shift algorithm as the observation vector of the Kalman filter and update the Kalman filter;
Step 4.5: using the updated Kalman filter, repeat steps 4.2 to 4.4 to calculate the target position in the next frame.
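The loop in steps 4.1 to 4.5 can be sketched per coordinate axis as follows. This is an illustrative sketch, not the patented implementation: the mean-shift step is reduced to a weighted-centroid search over sample points (standing in for the histogram-based kernel search over an image), the filter is a plain constant-velocity Kalman filter, and all numeric parameters are assumptions.

```python
# Predict (4.2) -> mean-shift refine (4.3) -> Kalman update (4.4), repeated
# per frame (4.5). One scalar filter per axis; parameters are assumed.

class ScalarKalman:
    """Constant-velocity Kalman filter for one coordinate axis."""
    def __init__(self, pos, q=1e-2, r=1.0):
        self.x = [pos, 0.0]                    # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]
        self.q, self.r = q, r                  # process / measurement noise

    def predict(self, dt=1.0):                 # step 4.2
        x, v = self.x
        self.x = [x + v * dt, v]
        p00, p01 = self.P[0]; p10, p11 = self.P[1]
        self.P = [[p00 + dt * (p10 + p01 + dt * p11) + self.q, p01 + dt * p11],
                  [p10 + dt * p11, p11 + self.q]]
        return self.x[0]

    def update(self, z):                       # step 4.4: z = mean-shift peak
        k0 = self.P[0][0] / (self.P[0][0] + self.r)
        k1 = self.P[1][0] / (self.P[0][0] + self.r)
        resid = z - self.x[0]
        self.x = [self.x[0] + k0 * resid, self.x[1] + k1 * resid]
        p00, p01 = self.P[0]
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [self.P[1][0] - k1 * p00, self.P[1][1] - k1 * p01]]
        return self.x[0]

def mean_shift_1d(start, samples, bandwidth=3.0, iters=20):
    """Step 4.3: iterate a weighted centroid from the Kalman prediction."""
    pos = start
    for _ in range(iters):
        w = [(s, max(0.0, 1 - ((s - pos) / bandwidth) ** 2)) for s in samples]
        total = sum(wt for _, wt in w)
        if total == 0:
            break
        new = sum(s * wt for s, wt in w) / total
        if abs(new - pos) < 1e-3:              # convergence test
            break
        pos = new
    return pos

# Track a target moving at +2 px/frame along one axis.
kf = ScalarKalman(pos=0.0)
for frame in range(1, 6):
    true = 2.0 * frame
    pred = kf.predict()                        # 4.2
    samples = [true - 1, true, true + 1]       # pixels of the detected blob
    peak = mean_shift_1d(pred, samples)        # 4.3
    est = kf.update(peak)                      # 4.4
```

The point of the combination is visible here: the Kalman prediction supplies a good mean-shift starting point even as the target moves, and the converged mean-shift position in turn corrects the filter.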
In step 5 above, if the moving target is detected in the current frame image, the position (x1, y1) of the candidate target obtained by feature matching and the target position (x2, y2) predicted by mean-shift tracking are weighted and averaged to obtain the final target position (x, y).
Weighted average formula: (x, y) = α(x1, y1) + (1 − α)(x2, y2), where α is a weight with a value between 0 and 1.
IIR filtering is applied to the final target position (x, y), and the resulting target motion state vector in the current frame is used as the observation vector of the Kalman filter.
In the IIR filtering formula, β and γ are update factors with β > 0 and γ > 1; the subscript k − 1 denotes the target motion state in the previous frame and the subscript k the state in the current frame, where {x, y} is the target position, (ẋ, ẏ) the target velocity, and (ẍ, ÿ) the target acceleration.
If the moving target is not detected in the current frame image, the number of consecutive frames in which the moving target has been missing is examined: if it exceeds a set threshold, the target is judged to have disappeared; if it does not exceed the threshold, the target is assumed to be occluded, the target position predicted by mean-shift tracking is stored as the true value of the occluded moving target, and that position is used as the Kalman filter's observation vector in the next frame, as shown in Fig. 4.
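The detection/prediction fusion and the occlusion counter described above can be sketched as follows. The weighted-average form and the frame-count threshold follow the text; the concrete numbers (the weight and the miss limit) and all names are assumptions for the example, and the IIR smoothing stage is omitted.

```python
# Step-5 fusion and occlusion handling, sketched with assumed parameters.
ALPHA = 0.6        # weight on the detected position, 0 < ALPHA < 1 (assumed)
MISS_LIMIT = 5     # frames a target may go undetected before it is dropped

def fuse(detected, predicted, alpha=ALPHA):
    """Weighted average of the matched detection and the mean-shift prediction."""
    (x1, y1), (x2, y2) = detected, predicted
    return (alpha * x1 + (1 - alpha) * x2,
            alpha * y1 + (1 - alpha) * y2)

class TargetState:
    def __init__(self):
        self.missed = 0          # consecutive frames with no detection
        self.alive = True

    def step(self, detected, predicted):
        """Return the observation fed to the next frame's Kalman filter."""
        if detected is not None:
            self.missed = 0
            return fuse(detected, predicted)
        # Not detected: assume occlusion until MISS_LIMIT is exceeded.
        self.missed += 1
        if self.missed > MISS_LIMIT:
            self.alive = False   # target judged to have disappeared
            return None
        return predicted         # keep the prediction as the "true" value

t = TargetState()
obs = t.step((10.0, 20.0), (12.0, 22.0))   # detected frame: weighted average
hidden = [t.step(None, (13.0, 23.0)) for _ in range(6)]
```

While occluded, the track is carried forward by the prediction alone, which is what keeps the motion trajectory continuous across the obstacle.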
The specific tracking and matching process uses a three-layer linked-list structure. The nodes of the first-layer list contain the positions and size parameters of the moving regions obtained by segmenting the three-frame-difference detection result, together with their feature information. The second-layer list obtains new targets by primary feature matching against the first-layer list; its nodes contain the information of the first-layer nodes plus the target's velocity and a stability flag. A target in the second-layer list that reaches the stable-tracking condition is transferred to the third-layer list; if a target in the second-layer list is judged to have disappeared, it is deleted. The third-layer list performs secondary feature matching against the first-layer list: a first-layer node that cannot be matched to any third-layer node is a new candidate target and is added to the second-layer list, while a first-layer node that can be matched to a third-layer node is used to update that node's state. The primary feature matching is a weighted sum of nearest-neighbour matching on the target part centroid positions and regional colour feature matching; the secondary feature matching is a weighted sum of inter-target colour feature matching and shape feature matching.
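The flow between the three layers can be condensed as follows. This is a simplified sketch: plain nearest-neighbour matching on centroids stands in for the patent's weighted colour/shape feature terms, a single missed match deletes a candidate, and the stability threshold and match gate are assumed values.

```python
# Three-layer matching lists: layer 1 = per-frame detections, layer 2 =
# candidate targets awaiting stability, layer 3 = stably tracked targets.
STABLE_AFTER = 3   # consecutive matches before promotion to layer 3 (assumed)
GATE = 5.0         # maximum centroid distance for a match (assumed)

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def step(detections, candidates, tracked):
    """One frame of matching layer-1 detections into layers 2 and 3."""
    unmatched = list(detections)
    # Secondary matching: layer 3 (tracked) consumes detections first.
    for tgt in tracked:
        best = min(unmatched, key=lambda d: dist(d, tgt["pos"]), default=None)
        if best is not None and dist(best, tgt["pos"]) <= GATE:
            tgt["pos"] = best            # state update from the layer-1 node
            unmatched.remove(best)
    # Primary matching: remaining detections update or create candidates.
    for cand in candidates:
        best = min(unmatched, key=lambda d: dist(d, cand["pos"]), default=None)
        if best is not None and dist(best, cand["pos"]) <= GATE:
            cand["pos"], cand["hits"] = best, cand["hits"] + 1
            unmatched.remove(best)
        else:
            cand["hits"] = -1            # mark as disappeared (simplified)
    candidates[:] = [c for c in candidates if c["hits"] >= 0]
    # Promote stable candidates to layer 3; leftovers become new candidates.
    tracked += [c for c in candidates if c["hits"] >= STABLE_AFTER]
    candidates[:] = [c for c in candidates if c["hits"] < STABLE_AFTER]
    candidates += [{"pos": d, "hits": 1} for d in unmatched]

cands, tracks = [], []
for frame_pos in [(0, 0), (1, 0), (2, 0), (3, 0)]:
    step([frame_pos], cands, tracks)
```

A detection that matches nothing enters layer 2, survives a few frames of consistent matches, and only then reaches layer 3, which is the mechanism that filters out spurious one-frame detections.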
The synchronization in step 7 above includes time synchronization and feature synchronization. For time synchronization, the central control unit sends a time synchronization signal to each video data processing unit at a certain frequency, and each video data processing unit aligns itself to the central control unit's time. Feature synchronization is accomplished by a chain of target buffer pools. Each target buffer pool consists of a real-feature data buffer and a virtual-feature data buffer: the real-feature data buffer stores the r real feature data items received by the central control unit from a given video data processing unit, with timestamps t, t + k1, …, t + kr, and the virtual-feature data buffer accumulates s interpolated or predicted results, with timestamps t', t' + k'1, …, t' + k's. The n target buffer pools belonging to one video data processing unit form a chain structure. The implementation comprises the following steps:
S101: when the central control unit receives real feature data from any video data processing unit, it traverses all connected video data processing units; for the video data processing unit matching the data source it performs S102, and otherwise it performs S103. Once all connected video data processing units have been processed, it continues with the next round of data reception;
S102: traverse the real-feature data chain and look for the same target number in the target buffer pool chain, updating the real-feature data buffer and the virtual-feature data buffer in first-in-first-out fashion; if no pool with the same target number exists in the target buffer pool chain, add a buffer pool for this target to the chain and initialize it; when finished, return to S101;
S103: traverse the target buffer pool chain, take the detection time of the received data as the interpolation variable and the data in the real-feature buffer as interpolation nodes, run the prediction algorithm to obtain virtual feature data, and update the virtual-feature data buffer in first-in-first-out fashion; when finished, return to S101.
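A single target buffer pool from this scheme can be sketched as two FIFO buffers plus an interpolation step. The sketch uses plain linear extrapolation over (timestamp, value) pairs as the "prediction algorithm" of S103; the pool sizes and all names are assumptions, and the chain of n pools per unit is reduced to one pool.

```python
# One target buffer pool for step 7's feature synchronization: a FIFO of r
# real samples from one video data processing unit plus a FIFO of s virtual
# (interpolated) samples evaluated at another unit's detection times.
from collections import deque

class TargetBufferPool:
    def __init__(self, r=4, s=4):
        self.real = deque(maxlen=r)      # (timestamp, value), FIFO of size r
        self.virtual = deque(maxlen=s)   # interpolated samples, FIFO of size s

    def push_real(self, t, value):
        """S102: first-in-first-out update of the real-feature buffer."""
        self.real.append((t, value))

    def predict(self, t):
        """S103: interpolate/extrapolate the feature at detection time t."""
        if len(self.real) < 2:
            return self.real[-1][1] if self.real else None
        (t0, v0), (t1, v1) = self.real[-2], self.real[-1]
        value = v0 + (v1 - v0) * (t - t0) / (t1 - t0)  # linear prediction
        self.virtual.append((t, value))  # FIFO update of the virtual buffer
        return value

pool = TargetBufferPool()
for t, x in [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0)]:
    pool.push_real(t, x)
```

Calling `pool.predict(2.5)` then yields the target's estimated feature value at a time for which no real sample exists, which is exactly the "virtual" data the central control unit compares across cameras.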
Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in this and related arts without creative work shall fall within the protection scope of the present invention.

Claims (6)

1. A multi-moving-object video monitoring system, including several cameras and video data processing units, a central control unit, and a user display terminal, each camera being connected to its corresponding video data processing unit, the video data processing units being connected to the central control unit, and the central control unit being connected to the user display terminal, characterized in that:
the system further includes a target positioning module for locating moving targets and a camera control module for controlling the deflection direction and angle of the pan-tilt cameras; the video data processing units are connected to the central control unit through Ethernet; the cameras are pan-tilt cameras, each of which is calibrated; and the camera control module is connected to the cameras;
each video data processing unit includes a target detection module for extracting moving-target features, a target tracking module for finding the candidate target most similar to the moving target, and a communication submodule for exchanging data with the central control unit; the camera is connected to the target detection module, the target detection module is connected to the target tracking module, and the communication submodule is bidirectionally connected to the target tracking module, the target positioning module, and the camera control module;
the central control unit includes a communication module connected to the communication submodules, a virtual synchronization module for synchronizing the received data, a data association module for integrating the local information obtained by each camera, and a camera management module for determining the deflection direction and angle each camera needs to track a moving target; the communication module is connected to the communication submodule of each video data processing unit, and the virtual synchronization module, the data association module, and the camera management module are bidirectionally connected to the communication module; and
the user display terminal displays the motion trajectory of each moving target.
2. The multi-moving-object video monitoring system according to claim 1, characterized in that: the central control unit has a cascade structure including primary control units and a final-level control unit; each primary control unit is connected to the video data processing units in its own region; the primary control units of different regions are linked together through the final-level control unit; and the user display terminal is connected to the final-level control unit.
3. The multi-moving-object video monitoring system according to claim 1, characterized in that: the virtual synchronization module includes a time synchronization module and a feature synchronization module;
the time synchronization module is connected to the video data processing units through serial ports and sends a time synchronization signal to each video data processing unit at a certain frequency; and
the feature synchronization module temporarily stores the received target information; when a data packet arrives, it parses the target information and the packing time in the packet and uses the temporarily stored target information to estimate, at that time point, the target information transmitted by the other video data processing units, thereby synchronizing them.
4. A multi-moving-object tracking and monitoring method, characterized by comprising the following steps:
Step 1: camera calibration;
Step 2: the cameras acquire video of the monitored region in real time and transmit it to their respective video data processing units;
Step 3: each video data processing unit performs three-frame difference detection on its image sequence to obtain the feature information of the moving targets;
Step 4: each video data processing unit combines Kalman filtering with the mean-shift algorithm to track the moving targets and obtain candidate target regions;
Step 5: feature matching of the moving targets is performed within the candidate target regions to derive the candidate targets;
Step 6: the target positioning module determines the position of each candidate target in combination with DEM data;
Step 7: the central control unit synchronizes the candidate target information from the different video data processing units;
Step 8: the central control unit establishes a matching model to achieve target association and track association;
Step 9: the camera management module calculates the deflection direction and angle of each camera from the virtual target information and transmits the data to every camera control module;
Step 10: each camera control module receives the control information for the different cameras, extracts only the control information that matches its own camera, and controls the camera's deflection;
wherein step 4 uses extended Kalman filtering and an improved mean-shift algorithm, specifically comprising:
Step 4.1: automatically initialize each stable new target and determine the initial state vector of the Kalman filter;
Step 4.2: use the initialized Kalman filter to predict the candidate target position in the current frame;
Step 4.3: taking the candidate target position predicted by the Kalman filter as the iteration starting point of the mean-shift algorithm, run the mean-shift iteration until convergence to obtain the predicted target position;
Step 4.4: use the predicted position obtained by the mean-shift algorithm as the observation vector of the Kalman filter and update the filter;
Step 4.5: with the updated Kalman filter, repeat steps 4.2 to 4.4 to calculate the target position in the next frame.
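Steps 4.1 to 4.5 amount to a predict/correct loop in which mean-shift supplies the observation. A minimal Python sketch under stated simplifications: a constant-velocity motion model, a fixed-gain (alpha-beta) stand-in for the full Kalman update, and a placeholder `mean_shift` that simply reads the target from the frame; all names and gains are hypothetical, not the patent's implementation:

```python
def predict(state):
    # Constant-velocity motion model: position advances by velocity each frame.
    x, y, vx, vy = state
    return (x + vx, y + vy, vx, vy)

def update(state, z, alpha=0.85, beta=0.005):
    # Fixed-gain (alpha-beta) correction standing in for the full Kalman update.
    x, y, vx, vy = state
    zx, zy = z
    rx, ry = zx - x, zy - y                       # innovation
    return (x + alpha * rx, y + alpha * ry, vx + beta * rx, vy + beta * ry)

def mean_shift(start, frame):
    # Placeholder: a real implementation iterates the weighted histogram
    # centroid from `start` until convergence; here the frame supplies it.
    return frame["target"]

def track(initial_state, frames):
    state = initial_state                         # step 4.1
    trajectory = []
    for frame in frames:
        state = predict(state)                    # step 4.2
        z = mean_shift((state[0], state[1]), frame)  # step 4.3
        state = update(state, z)                  # step 4.4
        trajectory.append((state[0], state[1]))   # step 4.5: loop continues
    return trajectory
```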
5. The multi-moving-object tracking and monitoring method according to claim 4, wherein in step 5: if a moving target is detected in the current frame image, the candidate target position (x1, y1) obtained by feature matching is retained and weighted-averaged with the target position (x2, y2) predicted by mean-shift tracking to obtain the final target position (x, y); the final target position (x, y) is IIR-filtered and used as the observation vector of the Kalman filter to obtain the target motion state vector at the current frame; if the moving target is not detected in the current frame image, the number of consecutive frames in which the target has disappeared is counted: if it exceeds a set threshold, the target is judged to have disappeared, and if it does not exceed the threshold, the target is assumed to be occluded, in which case the target position predicted by mean-shift tracking is saved as the actual value of the moving target during occlusion and used as the observation vector of the Kalman filter for the next frame.
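The fusion and disappearance logic of this claim can be sketched as follows. The 0.6/0.4 weighting and the frame-count threshold are assumed values, and the IIR filtering step is omitted for brevity:

```python
def fuse_position(match_pos, ms_pos, w_match=0.6):
    # Weighted average of the feature-matching fix (x1, y1) and the
    # mean-shift prediction (x2, y2); w_match is an assumed weight.
    (x1, y1), (x2, y2) = match_pos, ms_pos
    return (w_match * x1 + (1 - w_match) * x2,
            w_match * y1 + (1 - w_match) * y2)

class OcclusionTracker:
    def __init__(self, max_missed=5):
        self.max_missed = max_missed   # disappearance threshold (frames)
        self.missed = 0

    def step(self, match_pos, ms_pos):
        """Process one frame; return (status, observation for the Kalman filter)."""
        if match_pos is not None:                 # target detected this frame
            self.missed = 0
            return "tracked", fuse_position(match_pos, ms_pos)
        self.missed += 1
        if self.missed > self.max_missed:         # gone too long: target lost
            return "lost", None
        return "occluded", ms_pos                 # assume occlusion, keep prediction
```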
6. The multi-moving-object tracking and monitoring method according to claim 4, wherein the synchronization in step 7 includes time synchronization and feature synchronization;
time synchronization is achieved by the central control unit sending a time synchronization signal to each video data processing unit at a fixed frequency, with each video data processing unit taking the time of the central control unit as the reference;
feature synchronization is accomplished by a chain of target buffer pools, each target buffer pool consisting of a real feature data buffer and a virtual feature data buffer; the real feature data buffer stores the R real feature data items that the central control unit has obtained from a given video data processing unit, denoted t, t+k1, ..., t+k(R-1), and the virtual feature data buffer accumulates the S results of interpolation or prediction, denoted t', t'+k'1, ..., t'+k'(S-1); the n target buffer pools belonging to that video data processing unit form a chain structure; the implementation comprises the following steps:
S101: after the central control unit receives real feature data from any video data processing unit, it traverses all connected video data processing units; for the video data processing unit matching the data source it performs S102, otherwise it performs S103; once all connected video data processing units have been processed, it proceeds to receive the next round of data;
S102: traverse the real feature data chain to find the matching target number in the target buffer pool chain, and update the real feature data buffer and the virtual feature buffer in first-in-first-out fashion; if the target number is not present in the target buffer pool chain, add a buffer pool for that target to the chain and initialize it; when finished, return to S101;
S103: traverse the target buffer pool chain, taking the detection time of this data as the interpolation independent variable and the data in the real feature buffer as interpolation nodes; run the prediction algorithm to obtain virtual feature data, update the virtual feature data buffer in first-in-first-out fashion, and return to S101 when finished.
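The buffer-pool chain of S101 to S103 can be sketched roughly as follows: `dispatch` plays the role of S101, and a nearest-node lookup stands in for the actual interpolation/prediction algorithm. All class names, buffer sizes, and data layouts here are assumptions for illustration:

```python
from collections import deque

class TargetBufferPool:
    """Per-target buffers: R real feature samples and S virtual (predicted)
    samples, both maintained first-in-first-out."""
    def __init__(self, target_id, R=4, S=4):
        self.target_id = target_id
        self.real = deque(maxlen=R)      # (timestamp, feature) from the source unit
        self.virtual = deque(maxlen=S)   # interpolated/predicted samples

    def push_real(self, t, feature):     # S102: FIFO update of real data
        self.real.append((t, feature))

    def predict_virtual(self, t):        # S103: stand-in for interpolation
        if not self.real:
            return None
        # Nearest real node as a crude "prediction" at detection time t.
        t0, f0 = min(self.real, key=lambda s: abs(s[0] - t))
        self.virtual.append((t, f0))
        return f0

def dispatch(pools, source_id, t, target_id, feature):
    """S101: route one real-feature packet across all units' pool chains."""
    for unit_id, chain in pools.items():
        if unit_id == source_id:                      # S102 branch
            pool = chain.setdefault(target_id, TargetBufferPool(target_id))
            pool.push_real(t, feature)
        else:                                         # S103 branch
            for pool in chain.values():
                pool.predict_virtual(t)
```

Here `pools` maps each video data processing unit to its chain (a dict of target buffer pools), so one received packet both refreshes the source unit's real buffer and triggers virtual-data generation for every other unit.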
CN201610160824.4A 2016-03-18 2016-03-18 Multi-moving-object tracking and monitoring method Expired - Fee Related CN105844634B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610160824.4A CN105844634B (en) 2016-03-18 2016-03-18 A kind of multiple mobile object tracking monitor method

Publications (2)

Publication Number Publication Date
CN105844634A true CN105844634A (en) 2016-08-10
CN105844634B CN105844634B (en) 2019-04-05

Family

ID=56588110

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610160824.4A Expired - Fee Related CN105844634B (en) 2016-03-18 2016-03-18 A kind of multiple mobile object tracking monitor method

Country Status (1)

Country Link
CN (1) CN105844634B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101068342A (en) * 2007-06-05 2007-11-07 西安理工大学 Video frequency motion target close-up trace monitoring method based on double-camera head linkage structure
CN101465033A (en) * 2008-05-28 2009-06-24 丁国锋 Automatic tracking recognition system and method
CN101572804A (en) * 2009-03-30 2009-11-04 浙江大学 Multi-camera intelligent control method and device
CN101610396A (en) * 2008-06-16 2009-12-23 北京智安邦科技有限公司 Intellective video monitoring device module and system and method for supervising thereof with secret protection
CN101883261A (en) * 2010-05-26 2010-11-10 中国科学院自动化研究所 Method and system for abnormal target detection and relay tracking under large-range monitoring scene
CN102223522A (en) * 2011-05-20 2011-10-19 杭州数尔电子有限公司 Intelligent automatic tracker module
CN102289822A (en) * 2011-09-09 2011-12-21 南京大学 Method for tracking moving target collaboratively by multiple cameras
CN103237192A (en) * 2012-08-20 2013-08-07 苏州大学 Intelligent video monitoring system based on multi-camera data fusion
CN104125433A (en) * 2014-07-30 2014-10-29 西安冉科信息技术有限公司 Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure
CN104301669A (en) * 2014-09-12 2015-01-21 重庆大学 Suspicious target detection tracking and recognition method based on dual-camera cooperation

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106651908A (en) * 2016-10-13 2017-05-10 北京科技大学 Multi-moving-target tracking method
CN106651908B (en) * 2016-10-13 2020-03-31 北京科技大学 Multi-moving-target tracking method
CN107067440A (en) * 2017-03-28 2017-08-18 苏州工业园区服务外包职业学院 A kind of image calibration method and system
CN107067440B (en) * 2017-03-28 2020-05-19 苏州工业园区服务外包职业学院 Image calibration method and system
CN108805900A (en) * 2017-05-03 2018-11-13 杭州海康威视数字技术股份有限公司 A kind of determination method and device of tracking target
CN108805900B (en) * 2017-05-03 2021-04-16 杭州海康威视数字技术股份有限公司 Method and device for determining tracking target
CN109841022A (en) * 2018-11-29 2019-06-04 广州紫川物联网科技有限公司 A kind of target motion track detecting/warning method, system and storage medium
CN111343416A (en) * 2018-12-18 2020-06-26 华为技术有限公司 Distributed image analysis method, system and storage medium
CN111343416B (en) * 2018-12-18 2021-06-01 华为技术有限公司 Distributed image analysis method, system and storage medium
CN109982044A (en) * 2019-04-03 2019-07-05 大连海事大学 A kind of tracking of the target localization and tracking system based on CCTV Sensor Network
CN109982044B (en) * 2019-04-03 2021-12-10 大连海事大学 Tracking method of target positioning and tracking system based on CCTV sensor network
CN115619867A (en) * 2022-11-18 2023-01-17 腾讯科技(深圳)有限公司 Data processing method, device, equipment, storage medium and program product

Also Published As

Publication number Publication date
CN105844634B (en) 2019-04-05

Similar Documents

Publication Publication Date Title
CN105844634A (en) Multi-motion-object video monitoring system and tracking monitoring method thereof
CN110068335B (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110222581B (en) Binocular camera-based quad-rotor unmanned aerial vehicle visual target tracking method
Strydom et al. Visual odometry: autonomous uav navigation using optic flow and stereo
CN107945265B (en) Real-time dense monocular SLAM method and system based on on-line study depth prediction network
CN108242079B (en) VSLAM method based on multi-feature visual odometer and graph optimization model
CN109211241B (en) Unmanned aerial vehicle autonomous positioning method based on visual SLAM
CN105809687B (en) A kind of monocular vision ranging method based on point information in edge in image
Mademlis et al. Autonomous unmanned aerial vehicles filming in dynamic unstructured outdoor environments [applications corner]
CN103874193B (en) A kind of method and system of mobile terminal location
US11483540B2 (en) Method and corresponding system for generating video-based 3-D models of a target such as a dynamic event
US10322819B2 (en) Autonomous system for taking moving images from a drone, with target tracking and improved target location
Valenti et al. Autonomous quadrotor flight using onboard RGB-D visual odometry
CN106056075A (en) Important person identification and tracking system in community meshing based on unmanned aerial vehicle
CN107504969A (en) Four rotor-wing indoor air navigation aids of view-based access control model and inertia combination
CN109901621A (en) A kind of the batch unmanned plane close/intra system and formation method of desired guiding trajectory
WO2023019740A1 (en) Cooperative transportation method and system based on multiple agents
CN109828658A (en) A kind of man-machine co-melting long-range situation intelligent perception system
CN109076173A (en) Image output generation method, equipment and unmanned plane
CN108496353A (en) Image processing method and unmanned plane
CN102937443A (en) Target rapid positioning system and target rapid positioning method based on unmanned aerial vehicle
CN109189100A (en) A kind of the quadrotor drone group control system and method for view-based access control model positioning
Lee et al. Autonomous feature following for visual surveillance using a small unmanned aerial vehicle with gimbaled camera system
CN110533719A (en) Augmented reality localization method and device based on environmental visual Feature point recognition technology
CN108985193A (en) A kind of unmanned plane portrait alignment methods based on image detection

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190405