CN101344965A - Tracking system based on binocular camera shooting - Google Patents

Tracking system based on binocular camera shooting

Info

Publication number
CN101344965A
Authority
CN
China
Prior art keywords
coordinate
module
camera
target
feature point
Prior art date
Application number
CNA2008100424910A
Other languages
Chinese (zh)
Inventor
赵宇明
胡福乔
蔡岭
Original Assignee
上海交通大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海交通大学 filed Critical 上海交通大学
Priority to CNA2008100424910A priority Critical patent/CN101344965A/en
Publication of CN101344965A publication Critical patent/CN101344965A/en


Abstract

The invention relates to a fully automatic target detection and tracking system in the field of computer vision. An input module collects the digital images captured by a binocular camera as the system input and passes them to a feature extraction module, which analyzes one of the images to obtain a set of feature points used in subsequent processing. By matching the feature points between the two images, the disparity between them is computed; combined with the previously calibrated intrinsic and extrinsic camera parameters, the coordinates of the feature points in the camera coordinate system are obtained. Using the known relationship between the world coordinate system and the camera coordinate system, the world coordinates of the feature points are then derived. A clustering module groups the feature points into sets that represent target positions, and a trajectory analysis module estimates the target positions over time to obtain the motion trajectory of each target. The invention can effectively and robustly detect targets in a designated area, track them, and compute their motion trajectories.

Description

Tracking system based on binocular camera shooting
Technical field
The present invention relates to a tracking system in the field of image recognition technology, and in particular to a tracking system based on binocular cameras.
Background technology
With the spread of digital cameras, digital images have come to play an increasingly important role in production and daily life. In security monitoring in particular, digital images play a vital role in tasks such as target recognition and target tracking. However, variations in illumination and viewing angle at the monitored scene affect monitoring accuracy, which limits the application of fully automatic tracking based on image techniques. Real-world monitoring scenes vary widely, and different scenes differ greatly in factors such as ambient illumination, camera angle, target occlusion and shadows. Even in the same outdoor scene, these conditions differ from moment to moment. Surveillance systems based on a single camera are often very sensitive to these factors, yielding low accuracy or slow detection when tracking in real scenes, so such trackers have difficulty completing tracking tasks in real scenes. Common approaches to this problem are dynamic background modeling, or large-sample machine learning aimed at a specific target class; these two approaches work to some extent for periodic background changes and single targets, but cannot handle drastically changing backgrounds or multiple complex targets.
A search of the prior art found that Tao Zhao, Manoj Aggarwal, Rakesh Kumar and Harpreet Sawhney, in "Real-time Wide Area Multi-Camera Stereo Tracking" (IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2005), propose a system that fuses single-camera and multi-camera processing: each single camera detects and tracks human bodies from its own viewpoint, and a multi-camera fusion module combines all partial tracks of the same person into a global track. Volumetric segmentation and tracking are applied to handle the motion of many people against a complex background, and a fusion method with spatio-temporal constraints performs the overall multi-camera processing.
The deficiency of that system is the following: although it can detect and track multiple human targets, the algorithm requires many cameras (12 pairs of stereo cameras in the paper), which makes the hardware cost high. Moreover, the algorithm has no way to compute the speed and direction of motion of a target, and it can only detect and track human bodies rather than arbitrary objects, such as detecting and tracking vehicles on a highway.
Summary of the invention
The object of the present invention is to overcome the shortcomings of prior-art trackers whose performance degrades when the monitored scene changes, and to provide a fully automatic binocular target tracking system which automatically computes the distance of image feature points from the cameras using the images captured simultaneously by the two cameras, recovers the real-world coordinates of the target and thereby estimates its motion trajectory. It can be applied to target tracking in real scenes and to the estimation of target density and direction of motion.
The present invention is achieved through the following technical solution. The invention comprises the following modules: an input module, a feature extraction module, a disparity estimation module, a world coordinate computation module, a target clustering module and a trajectory analysis module. The input module collects the images captured by the binocular camera system as the system input and passes the left and right images to the feature extraction module. The feature extraction module extracts feature points from the input image as the objects of subsequent processing. The disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system according to the camera parameters. The world coordinate computation module converts the coordinates of the feature points from the camera coordinate system into real-world coordinates. The target clustering module aggregates multiple feature points, according to their coordinates, into sets that each represent a real object in space. The trajectory analysis module combines the positions of the current target sets and the historical target sets to produce the motion trajectory of each target in real space.
The input module is responsible for collecting the digital images of the binocular camera system; the digital images may be images obtained by a digital camera or a digital scanner, or a frame of the image sequence provided by a digital video camera.
The feature extraction module computes, for the neighborhood of each pixel of the input image, the eigenvalues of a matrix formed from that neighborhood; when an eigenvalue exceeds a preset threshold, the point is regarded as a feature point of the image.
The disparity estimation module searches the other input image for the point corresponding to each extracted feature point. Because the left and right cameras view the scene from different angles, the same physical point has different image coordinates in the left and right images, and this difference reflects its spatial position in the camera coordinate system. By matching the coordinate positions of a feature point in the two images and combining them with the camera parameters, its position in the camera coordinate system can be estimated accurately.
The world coordinate computation module: after the coordinates of the extracted feature points in the camera coordinate system have been obtained, a further step converts them into the world coordinate system. The mapping between the two coordinate systems is obtained from a set of point correspondences established before the system runs; through this mapping, the coordinates of the feature points in the world coordinate system are also known.
The target clustering module aggregates all feature points in the world coordinate system into several sets according to their positions and heights; each set corresponds to the actual position of a target in space.
The trajectory analysis module: once the spatial coordinates of a target in every frame are expressed as a set, these discrete position points are fitted by a model to determine the real motion trajectory of the whole target in space.
The present invention adopts the method for computer vision in tracking, experimental analysis can determine that the motion of target in the space and the track that calculates meet substantially.Therefore this method can be carried out Tracking Recognition to the target in the captured area of space by the binocular camera shooting head.
The present invention first takes as input the two digital images captured by the binocular cameras, and determines feature points by computing the eigenvalues of a matrix over each pixel neighborhood. Using the disparity computed between the two images, the coordinates of a feature point in the image coordinate system are converted into coordinates in the camera coordinate system, and these are further transformed into world coordinates using the world coordinate system calibrated in advance. The clustering module aggregates the feature points into discrete sets that represent objects in real space, and finally the trajectory analysis model connects the positions of a target at different times into a motion trajectory.
The present invention can not only detect targets and track their motion trajectories under drastic illumination changes, but can also obtain the speed of each target and the target density in the region. Compared with ordinary detection and tracking methods, it obtains the trajectories of spatial targets more accurately and more stably, while the detection speed still satisfies normal applications. These characteristics give it broad application prospects in public-area surveillance and in pedestrian and vehicle flow statistics. The present invention builds on the theory of computer stereo vision and incorporates knowledge from pattern recognition and optimization methods, so that the motion trajectories of spatial targets are obtained stably under conditions such as changing illumination and complex backgrounds.
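As a hedged illustration of how the speed and density mentioned above could be derived from the system's output (the patent itself gives no formulas; the frame rate, region area and trajectory values below are assumed placeholders):

import numpy as np

fps = 25.0                      # assumed camera frame rate
region_area_m2 = 120.0          # assumed area of the monitored region in square metres

# Hypothetical trajectory: world-coordinate positions (x, y) of one target over consecutive frames.
trajectory = np.array([[0.0, 0.0], [0.1, 0.02], [0.21, 0.05], [0.33, 0.07]])

# Speed: displacement between consecutive frames multiplied by the frame rate.
steps = np.diff(trajectory, axis=0)
speeds = np.linalg.norm(steps, axis=1) * fps          # metres per second, one value per frame gap
print("mean speed (m/s):", speeds.mean())

# Density: number of currently tracked targets divided by the region area.
num_targets = 7                                       # assumed count from the clustering module
print("target density (targets/m^2):", num_targets / region_area_m2)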
Description of drawings
Fig. 1 is a system architecture diagram of the present invention;
Fig. 2 is the processing flow chart of the embodiment of the invention;
Fig. 3 is a schematic diagram of an application example of the present invention.
Embodiment
The embodiments of the present invention are described in detail below with reference to the accompanying drawings. The present embodiment is implemented on the premise of the technical solution of the present invention and gives a detailed implementation, but the protection scope of the present invention is not limited to the following embodiment.
As shown in Fig. 1, the present embodiment comprises: an input module, a feature extraction module, a disparity estimation module, a world coordinate computation module, a target clustering module and a trajectory analysis module, wherein:
The input module collects the images captured by the binocular camera system as the system input and passes the left and right images to the feature extraction module;
The feature extraction module extracts feature points from the input image as the objects of subsequent processing;
The disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system according to the camera parameters;
The world coordinate computation module converts the coordinates of the feature points from the camera coordinate system into real-world coordinates;
The target clustering module aggregates multiple feature points, according to their coordinates, into sets that each represent a real object in space;
The trajectory analysis module combines the positions of the current target sets and the historical target sets to produce the motion trajectory of each target in real space.
The input module collects the digital images of the binocular camera system; the digital images may be images obtained by a digital camera or a digital scanner, or a frame of the image sequence provided by a digital video camera. Each pixel value of the captured image is stored in order in the memory cells of the corresponding memory region; if the input image is a color image, it is split into the three channels R, G and B, which are stored separately.
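A minimal sketch of this input step, assuming OpenCV is used for acquisition and that "left.png" and "right.png" stand in for the frames delivered by the binocular cameras (the file names and the use of OpenCV are assumptions, not part of the patent):

import cv2
import numpy as np

# Read one frame from each camera of the binocular pair (placeholder file names).
left_bgr = cv2.imread("left.png")
right_bgr = cv2.imread("right.png")

# For a colour image the three channels are separated and each kept as its own
# contiguous block of pixel values, as the description above states
# (OpenCV loads images in B, G, R channel order).
b, g, r = cv2.split(left_bgr)
channels = {"R": np.ascontiguousarray(r),
            "G": np.ascontiguousarray(g),
            "B": np.ascontiguousarray(b)}

# Grayscale versions are convenient for the later feature extraction step.
left_gray = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
right_gray = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)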
The feature extraction module computes, for the neighborhood of each pixel of the input image, the eigenvalues of a matrix formed from that neighborhood; when an eigenvalue exceeds a preset threshold, the point is regarded as a feature point of the image.
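One common realization of this criterion is the minimum-eigenvalue (Shi-Tomasi) corner response, which computes the eigenvalues of the gradient covariance matrix over each pixel neighborhood and keeps the points whose smaller eigenvalue exceeds a threshold. The sketch below uses OpenCV for that purpose; the neighborhood size and the threshold are assumed values, not taken from the patent:

import cv2
import numpy as np

gray = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

# Minimum eigenvalue of the 2x2 gradient covariance matrix over a 5x5 neighborhood of each pixel.
min_eig = cv2.cornerMinEigenVal(gray, blockSize=5, ksize=3)

# A pixel is treated as a feature point when its eigenvalue response exceeds a preset threshold.
threshold = 0.01 * min_eig.max()             # assumed relative threshold
ys, xs = np.where(min_eig > threshold)
feature_points = np.stack([xs, ys], axis=1)  # (x, y) image coordinates of the feature points
print("number of feature points:", len(feature_points))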
The disparity estimation module searches the other input image for the point corresponding to each extracted feature point, matches the coordinate positions of the feature point in the two images and, combining them with the camera parameters, estimates its position in the camera coordinate system accurately. The matching criterion is the contrast difference between the neighborhood matrices centred on the candidate points, computed with the NCC (normalized cross-correlation) algorithm. The difference between the positions of the same real-world point in the two images is the disparity; given the disparity and the intrinsic and extrinsic camera parameters obtained by calibrating the two cameras in advance, the x, y image coordinates of the feature point and its disparity are converted into coordinate values in the camera coordinate system.
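The following sketch illustrates, under simplifying assumptions that the patent does not state (rectified images so that the corresponding point lies on the same row, a purely horizontal search range, and made-up values for the focal length, principal point and baseline), how an NCC match yields a disparity and how the disparity is turned into camera-frame coordinates:

import numpy as np
import cv2

# Assumed calibration values (placeholders, not from the patent).
fx, fy = 800.0, 800.0        # focal lengths in pixels
cx, cy = 320.0, 240.0        # principal point
baseline = 0.12              # distance between the two cameras in metres

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-9
    return float((a * b).sum() / denom)

def match_disparity(left, right, x, y, half=5, max_disp=64):
    """Search the right image along the same row for the best NCC match (rectified assumption)."""
    patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float32)
    best_d, best_score = 0, -1.0
    for d in range(0, max_disp):
        xr = x - d
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float32)
        score = ncc(patch, cand)
        if score > best_score:
            best_score, best_d = score, d
    return best_d

def camera_coords(x, y, d):
    """Triangulate: Z = fx * B / d, then back-project the pixel to X and Y."""
    Z = fx * baseline / max(d, 1e-6)
    X = (x - cx) * Z / fx
    Y = (y - cy) * Z / fy
    return np.array([X, Y, Z])

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)
x, y = 200, 150                               # one hypothetical feature point
d = match_disparity(left, right, x, y)
print("disparity:", d, "camera-frame coordinates:", camera_coords(x, y, d))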
The world coordinate computation module: after the extracted feature points have obtained their coordinates in the camera coordinate system, it takes the camera coordinates of each feature point and, using the relationship between the world coordinate system calibrated in advance and the camera coordinate system, projects the camera coordinates of the feature points into the world coordinate system.
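A minimal sketch of this projection, assuming the prior calibration is expressed as a rotation matrix R and translation vector t mapping camera coordinates to world coordinates (the values below are placeholders; the patent only says the mapping is obtained from correspondences collected before the system runs):

import numpy as np

# Assumed result of the prior calibration between the camera and world coordinate systems.
R = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, -1.0, 0.0]])     # placeholder rotation
t = np.array([0.0, 0.0, 2.5])        # placeholder translation, e.g. camera height above the ground

def camera_to_world(points_cam):
    """Apply the calibrated rigid transform X_world = R @ X_cam + t to an (N, 3) array."""
    return points_cam @ R.T + t

points_cam = np.array([[0.3, -0.1, 4.2],
                       [0.4, -0.1, 4.3]])   # hypothetical feature points in camera coordinates
print(camera_to_world(points_cam))

If only point correspondences are available, R and t could be recovered with a standard rigid-alignment procedure such as the Kabsch/Umeyama method; the patent does not specify which procedure is used.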
The target clustering module aggregates all feature points in the world coordinate system into several sets according to their positions and heights; each set corresponds to the actual position of a target in space.
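The patent does not name a particular clustering algorithm; as one hedged illustration, density-based clustering of the ground-plane coordinates of points whose height falls in an assumed target range could look like this (scikit-learn's DBSCAN, the height band and the distance threshold are all assumptions):

import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical feature points in world coordinates (x, y on the ground plane, z = height).
points_world = np.array([[0.1, 0.2, 1.5], [0.2, 0.3, 1.6], [0.15, 0.25, 0.9],
                         [3.1, 2.9, 1.4], [3.2, 3.0, 1.7], [3.0, 2.8, 1.2],
                         [9.0, 9.0, 0.05]])                  # last point lies almost on the ground

# Keep points whose height is plausible for the targets of interest (assumed band).
mask = (points_world[:, 2] > 0.3) & (points_world[:, 2] < 2.2)
candidates = points_world[mask]

# Cluster on the ground-plane coordinates; each resulting cluster stands for one target.
labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(candidates[:, :2])
for lab in set(labels) - {-1}:
    centroid = candidates[labels == lab, :2].mean(axis=0)
    print(f"target {lab}: position {centroid}, points {np.sum(labels == lab)}")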
The trajectory analysis module: once the spatial coordinates of a target in every frame are expressed as a set, these discrete position points are fitted by a model to determine the real motion trajectory of the whole target in space.
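The patent leaves the trajectory model unspecified; a minimal sketch of one simple choice, greedy nearest-neighbour association of target positions between consecutive frames with an assumed gating distance, is shown below:

import numpy as np

def link_tracks(frames, max_jump=0.8):
    """Greedily associate per-frame target positions into trajectories.

    frames: list of (N_i, 2) arrays, the target positions detected in each frame.
    max_jump: assumed maximum displacement (metres) of a target between two frames.
    """
    tracks = [[tuple(p)] for p in frames[0]]
    for detections in frames[1:]:
        unused = list(range(len(detections)))
        for track in tracks:
            if not unused:
                break
            last = np.array(track[-1])
            dists = [np.linalg.norm(detections[j] - last) for j in unused]
            k = int(np.argmin(dists))
            if dists[k] < max_jump:
                track.append(tuple(detections[unused[k]]))
                unused.pop(k)
        tracks.extend([tuple(detections[j])] for j in unused)   # unmatched detections start new tracks
    return tracks

# Hypothetical per-frame target positions produced by the clustering module.
frames = [np.array([[0.0, 0.0], [5.0, 5.0]]),
          np.array([[0.2, 0.1], [5.1, 5.0]]),
          np.array([[0.4, 0.2], [5.2, 5.1]])]
for i, tr in enumerate(link_tracks(frames)):
    print(f"track {i}: {tr}")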
Fig. 2 shows the processing flow chart of the present embodiment. First the input module reads the images captured by the left and right cameras; the feature extraction module computes feature points in one of the images and then searches for the corresponding points in the other image. The disparity estimation module uses the difference between the positions of the two matched points in the two images, combined with the camera parameters, to compute the coordinate position of the feature point in the camera coordinate system. The world coordinate computation module converts the coordinates of the feature points from the camera coordinate system into real-world coordinates; the target clustering module, using the previously matched relationship between the camera coordinate system and the world coordinate system, clusters the feature points transformed into world coordinates into several sets by a clustering algorithm. Finally, the trajectory module produces the real trajectory of each target's motion from its past positions and its current position in space.
As shown in Fig. 3, the present embodiment first reads the images captured by the left and right cameras and computes feature points in one of them. These feature points are matched against the other image to compute their coordinates in the camera coordinate system. The points are then projected into the world coordinate system, where several clusters of points are obtained by projection; after the clusters are marked with different gray levels according to their depth, the positions of the targets in the monitored space are obtained. By processing every frame, the tracking module obtains the real motion trajectory of each target. The main window in Fig. 3 shows the left and right images captured by the binocular cameras, with the extracted feature points displayed in real time during operation; the separate "Cam" window shows, in real time, the monitored ground area, the projections of the feature points onto the ground, the clustering results and the tracking trajectories of the feature points.

Claims (7)

1. A system capable of detecting targets in a designated area and tracking them, characterized in that it comprises: an input module, a feature extraction module, a disparity estimation module, a world coordinate computation module, a target clustering module and a trajectory analysis module, wherein:
said input module collects the images captured by the binocular camera system as the system input and passes the left and right images to the feature extraction module;
said feature extraction module extracts feature points from the input image as the objects of subsequent processing;
said disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system according to the camera parameters;
said world coordinate computation module converts the coordinates of the feature points from the camera coordinate system into real-world coordinates;
said target clustering module aggregates multiple feature points, according to their coordinates, into sets that each represent a real object in space;
said trajectory analysis module combines the positions of the current target sets and the historical target sets to produce the motion trajectory of each target in real space.
2. The tracking system with binocular matching according to claim 1, characterized in that said input module collects the digital images of the binocular camera system; the digital images may be images obtained by a digital camera or a digital scanner, or a frame of the image sequence provided by a digital video camera; each pixel value of the captured image is stored in order in the memory cells of the corresponding memory region; if the input image is a color image, it is split into the three channels R, G and B, which are stored separately.
3. The tracking system with binocular matching according to claim 1, characterized in that said feature extraction module computes, for the neighborhood of each pixel of the input image, the eigenvalues of a matrix formed from that neighborhood; when an eigenvalue exceeds a preset threshold, the point is regarded as a feature point of the image.
4. The tracking system with binocular matching according to claim 1, characterized in that said disparity estimation module searches the other input image for the point corresponding to each extracted feature point, matches the coordinate positions of the feature point in the two images and, combining them with the camera parameters, estimates its position in the camera coordinate system accurately; the matching criterion is the contrast difference between the neighborhood matrices centred on the candidate points, computed with the NCC algorithm; the difference between the positions of the same real-world point in the two images is the disparity; given the disparity and the intrinsic and extrinsic camera parameters obtained by calibrating the two cameras in advance, the x, y image coordinates of the feature point and its disparity are converted into coordinate values in the camera coordinate system.
5. The tracking system with binocular matching according to claim 1, characterized in that said world coordinate computation module, after the extracted feature points have obtained their coordinates in the camera coordinate system, takes the camera coordinates of each feature point and, using the relationship between the world coordinate system calibrated in advance and the camera coordinate system, projects the camera coordinates of the feature points into the world coordinate system.
6. The tracking system with binocular matching according to claim 1, characterized in that said target clustering module aggregates all feature points in the world coordinate system into several sets according to their positions and heights; each set corresponds to the actual position of a target in space.
7. The tracking system with binocular matching according to claim 1, characterized in that, in said trajectory analysis module, once the spatial coordinates of a target in every frame are expressed as a set, these discrete position points are fitted by a model to determine the real motion trajectory of the whole target in space.
CNA2008100424910A 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting CN101344965A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNA2008100424910A CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2008100424910A CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Publications (1)

Publication Number Publication Date
CN101344965A true CN101344965A (en) 2009-01-14

Family

ID=40246964

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2008100424910A CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Country Status (1)

Country Link
CN (1) CN101344965A (en)


Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604448A (en) * 2009-03-16 2009-12-16 北京中星微电子有限公司 A kind of speed-measuring method of moving target and system
CN101604448B (en) * 2009-03-16 2015-01-21 北京中星微电子有限公司 Method and system for measuring speed of moving targets
CN101877796B (en) * 2009-04-28 2013-07-24 海信集团有限公司 Optical parallax acquiring method, device and system
WO2011006382A1 (en) * 2009-07-17 2011-01-20 深圳泰山在线科技有限公司 A method and terminal equipment for action identification based on marking points
CN101877174B (en) * 2009-09-29 2012-07-25 杭州海康威视软件有限公司 Vehicle speed measurement method, supervisory computer and vehicle speed measurement system
CN102034247B (en) * 2010-12-23 2013-01-02 中国科学院自动化研究所 Motion capture method for binocular vision image based on background modeling
CN102034247A (en) * 2010-12-23 2011-04-27 中国科学院自动化研究所 Motion capture method for binocular vision image based on background modeling
CN102175251A (en) * 2011-03-25 2011-09-07 江南大学 Binocular intelligent navigation system
CN102214000A (en) * 2011-06-15 2011-10-12 浙江大学 Hybrid registration method and system for target objects of mobile augmented reality (MAR) system
CN102506815A (en) * 2011-11-10 2012-06-20 河北汉光重工有限责任公司 Multi-target tracking and passive distance measuring device based on image recognition
CN102622767B (en) * 2012-03-05 2014-07-30 广州乐庚信息科技有限公司 Method for positioning binocular non-calibrated space
CN102622767A (en) * 2012-03-05 2012-08-01 广州乐庚信息科技有限公司 Method for positioning binocular non-calibrated space
CN102798456A (en) * 2012-07-10 2012-11-28 中联重科股份有限公司 Method, device and system for measuring working range of engineering mechanical arm frame system
CN102798456B (en) * 2012-07-10 2015-01-07 中联重科股份有限公司 Method, device and system for measuring working range of engineering mechanical arm frame system
CN102819847A (en) * 2012-07-18 2012-12-12 上海交通大学 Method for extracting movement track based on PTZ mobile camera
CN103083089A (en) * 2012-12-27 2013-05-08 广东圣洋信息科技实业有限公司 Virtual scale method and system of digital stereo-micrography system
CN103083089B (en) * 2012-12-27 2014-11-12 广东圣洋信息科技实业有限公司 Virtual scale method and system of digital stereo-micrography system
CN104182747A (en) * 2013-05-28 2014-12-03 株式会社理光 Object detection and tracking method and device based on multiple stereo cameras
CN103337076B (en) * 2013-06-26 2016-09-21 深圳市智美达科技股份有限公司 There is range determining method and device in video monitor object
CN103337076A (en) * 2013-06-26 2013-10-02 深圳市智美达科技有限公司 Method and device for determining appearing range of video monitoring targets
CN103595916A (en) * 2013-11-11 2014-02-19 南京邮电大学 Double-camera target tracking system and implementation method thereof
CN104754733B (en) * 2013-12-31 2019-03-05 南京理工大学 Dynamic wireless network control system node location prediction technique
CN104754733A (en) * 2013-12-31 2015-07-01 南京理工大学 Node position prediction method of dynamic wireless network control system
CN103826071A (en) * 2014-03-11 2014-05-28 深圳市中安视科技有限公司 Three-dimensional camera shooting method for three-dimensional identification and continuous tracking
WO2015135323A1 (en) * 2014-03-14 2015-09-17 华为技术有限公司 Camera tracking method and device
CN104915965A (en) * 2014-03-14 2015-09-16 华为技术有限公司 Camera tracking method and device
CN104317391B (en) * 2014-09-24 2017-10-03 华中科技大学 A kind of three-dimensional palm gesture recognition exchange method and system based on stereoscopic vision
CN104317391A (en) * 2014-09-24 2015-01-28 华中科技大学 Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
CN104408718A (en) * 2014-11-24 2015-03-11 中国科学院自动化研究所 Gait data processing method based on binocular vision measuring
CN104408718B (en) * 2014-11-24 2017-06-30 中国科学院自动化研究所 A kind of gait data processing method based on Binocular vision photogrammetry
CN105898265A (en) * 2014-12-18 2016-08-24 陆婷 Novel stereo video-based human body tracking method
CN104539909A (en) * 2015-01-15 2015-04-22 安徽大学 Video monitoring method and video monitoring server
CN104820434A (en) * 2015-03-24 2015-08-05 南京航空航天大学 Velocity measuring method of ground motion object by use of unmanned plane
CN105160649A (en) * 2015-06-30 2015-12-16 上海交通大学 Multi-target tracking method and system based on kernel function unsupervised clustering
CN105072312A (en) * 2015-07-23 2015-11-18 柳州正高科技有限公司 Method for predicting image moving direction in dynamic video
CN106375654B (en) * 2015-07-23 2020-09-01 韩华泰科株式会社 Apparatus and method for controlling web camera
CN106375654A (en) * 2015-07-23 2017-02-01 韩华泰科株式会社 Apparatus and method for controlling network camera
CN105930766A (en) * 2016-03-31 2016-09-07 深圳奥比中光科技有限公司 Unmanned plane
CN105979203A (en) * 2016-04-29 2016-09-28 中国石油大学(北京) Multi-camera cooperative monitoring method and device
CN106657600A (en) * 2016-10-31 2017-05-10 维沃移动通信有限公司 Image processing method and mobile terminal
CN106657600B (en) * 2016-10-31 2019-10-15 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN106907988A (en) * 2017-02-27 2017-06-30 北京工业大学 The micro- visual modeling method of basic data matrix
CN106907988B (en) * 2017-02-27 2019-03-22 北京工业大学 The micro- visual modeling method of basic data matrix
CN107292916B (en) * 2017-08-08 2020-10-27 阔地教育科技有限公司 Target association method, storage device and direct recording and broadcasting interactive terminal
CN107292916A (en) * 2017-08-08 2017-10-24 阔地教育科技有限公司 Target association method, storage device, straight recorded broadcast interactive terminal
CN107707821A (en) * 2017-09-30 2018-02-16 努比亚技术有限公司 Modeling method and device, bearing calibration, terminal, the storage medium of distortion parameter
CN107707821B (en) * 2017-09-30 2020-11-06 努比亚技术有限公司 Distortion parameter modeling method and device, correction method, terminal and storage medium
CN107958461A (en) * 2017-11-14 2018-04-24 中国航空工业集团公司西安飞机设计研究所 A kind of carrier aircraft method for tracking target based on binocular vision
CN108109176A (en) * 2017-12-29 2018-06-01 北京进化者机器人科技有限公司 Articles detecting localization method, device and robot
CN108257146B (en) * 2018-01-15 2020-08-18 新疆大学 Motion trail display method and device
CN108257146A (en) * 2018-01-15 2018-07-06 新疆大学 Movement locus display methods and device
WO2020020160A1 (en) * 2018-07-25 2020-01-30 北京市商汤科技开发有限公司 Image parallax estimation
CN108877269A (en) * 2018-08-20 2018-11-23 清华大学 A kind of detection of intersection vehicle-state and V2X broadcasting method
CN108877269B (en) * 2018-08-20 2020-10-27 清华大学 Intersection vehicle state detection and V2X broadcasting method
CN111918023A (en) * 2020-06-29 2020-11-10 北京大学 Monitoring target tracking method and device


Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
RJ01 Rejection of invention patent application after publication

Open date: 20090114

C12 Rejection of a patent application after its publication