CN101344965A - Tracking system based on binocular camera shooting - Google Patents


Info

Publication number
CN101344965A
Authority
CN
China
Prior art keywords
module
coordinate system
camera
target
feature
Prior art date
Application number
CN 200810042491
Other languages
Chinese (zh)
Inventor
胡福乔
岭 蔡
赵宇明
Original Assignee
上海交通大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海交通大学
Priority to CN 200810042491
Publication of CN101344965A


Abstract

The invention relates to a fully automatic target detection and tracking system in the field of computer vision. An input module collects the digital images captured by a binocular camera as the system input; the images are passed to a feature extraction module, which analyzes one image to obtain feature points as the objects of subsequent processing. By matching the feature points of the two images, the disparity between them is computed, and, combined with the camera's known intrinsic and extrinsic parameters, the feature points' coordinates in the camera coordinate system can be calculated; further, through the relationship between the world coordinate system and the camera coordinate system, the feature points' world coordinates are obtained. A clustering module groups the feature points into sets that express target positions, while a trajectory analysis module estimates the target positions over a time sequence to obtain the targets' motion trajectories. The invention can effectively and robustly detect targets in a designated area, track them, and compute their motion trajectories.

Description

Tracking system based on binocular camera shooting

Technical Field

The present invention relates to a tracking system in the technical field of image recognition, and specifically to a tracking system based on binocular imaging.

Background Art

With the popularity of digital cameras, digital images play an increasingly important role in production and daily life. In security surveillance in particular, digital images play an important part in target recognition and target tracking. However, changes in illumination and viewing angle in the monitored scene affect the accuracy of surveillance, which limits fully automatic tracking applications based on image techniques. Real-world surveillance scenes vary enormously: different scenes differ greatly in ambient illumination, camera angle, target occlusion, shadows and other factors, and even within a single outdoor scene these conditions change from moment to moment. Surveillance systems based on a single camera are often very sensitive to these factors, and in real scenes their tracking accuracy is low or their detection speed slow, so such tracking systems can hardly accomplish tracking tasks in real scenes. The usual remedies are dynamic background modeling or machine learning over large samples of a specific target; however, these two approaches only help with periodic background changes and a single target, and cannot cope with drastic background changes or multiple complex targets.

A literature search of the prior art found that Tao Zhao, Manoj Aggarwal, Rakesh Kumar and Harpreet Sawhney, in "Real-time Wide Area Multi-Camera Stereo Tracking", IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), 2005, proposed a system that fuses single cameras and multiple cameras: each single camera detects and tracks people within its own field of view, and a multi-camera fusion module combines all local tracks of the same person to form a global track. In the detection stage, a stereo segmentation and tracking method handles multiple people moving against a complex background, and a fusion method with spatio-temporal constraints integrates the results from the multiple cameras.

The shortcoming of the above system is that, although it can detect and track multiple human targets, the algorithm requires many cameras (12 stereo camera pairs in the paper), which results in high hardware cost. Moreover, the algorithm has no way to compute the speed and direction of motion of a target, and it only works for human bodies and not for arbitrary objects, such as the detection and tracking of cars on a road.

Summary of the Invention

The purpose of the present invention is to overcome the shortcomings of the prior art with respect to scene changes in surveillance, and to provide a fully automatic binocular target tracking system that, from the images captured simultaneously by the two cameras, automatically computes the distance from feature points to the cameras, recovers the targets' real-world coordinates and thereby analyzes the targets' motion trajectories; it can be applied to target tracking in real scenes and to estimating target density and direction of motion.

The present invention is realized by the following technical solution. The invention comprises the following modules: an input module, a feature extraction module, a disparity estimation module, a world coordinate system calculation module, a target clustering module and a trajectory analysis module. The input module collects the images captured by the binocular camera system as the system input, and the left and right images obtained are fed to the feature extraction module. The feature extraction module extracts feature points from the input image as the objects of subsequent processing. The disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system from the camera parameters. The world coordinate system calculation module converts the feature-point coordinates from the camera coordinate system into real-world coordinates. The target clustering module aggregates the feature points, according to their coordinates, into sets that each represent a real object in space. The trajectory analysis module combines the positions of the current target set and historical target sets to produce the target's motion trajectory in real space.

The input module is responsible for collecting the digital images of the binocular camera system; a digital image here is an image acquired by a digital camera or a digital scanner, or one frame of the image sequence provided by a digital video camera.

The feature extraction module computes, for each pixel of the input image, the eigenvalues of the matrix formed from its neighborhood; when the value exceeds a preset threshold, the pixel is regarded as a feature point of the image.
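The patent does not name a specific operator for this neighborhood matrix. The following Python sketch assumes it is the 2x2 gradient structure tensor summed over a small window (a Shi-Tomasi-style corner response); the window size and threshold are illustrative values, not taken from the patent.

```python
import numpy as np
from scipy.ndimage import convolve

def detect_feature_points(gray, win=5, threshold=1e4):
    """Flag pixels whose neighborhood matrix has a large minimum eigenvalue.

    Sketch only: the 'neighborhood matrix' is assumed to be the gradient
    structure tensor summed over a win x win window; win and threshold
    are illustrative values.
    """
    gy, gx = np.gradient(gray.astype(np.float64))
    k = np.ones((win, win))
    Sxx = convolve(gx * gx, k, mode="nearest")
    Syy = convolve(gy * gy, k, mode="nearest")
    Sxy = convolve(gx * gy, k, mode="nearest")

    # Smaller eigenvalue of [[Sxx, Sxy], [Sxy, Syy]] at every pixel.
    lambda_min = (Sxx + Syy) / 2.0 - np.sqrt(((Sxx - Syy) / 2.0) ** 2 + Sxy ** 2)

    ys, xs = np.nonzero(lambda_min > threshold)
    return list(zip(xs.tolist(), ys.tolist()))  # feature points as (x, y) image coordinates
```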

The disparity estimation module searches the other input image for the point corresponding to each extracted feature point. Because of the difference in viewing angle between the left and right cameras, the same point has different image coordinates in the left and right images, and this difference reflects its spatial position in the camera coordinate system. By matching the feature point's coordinate positions in the two images and combining them with the camera parameters, its position in the camera coordinate system can be estimated accurately.
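As a rough illustration of this matching step, the sketch below searches along the same image row for the best normalized cross-correlation (NCC) score; NCC is the criterion named later in the description, while the rectified-epipolar assumption, window size and disparity range are assumptions added here.

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def match_feature(left, right, x, y, half=7, max_disp=64):
    """Find the right-image pixel matching feature (x, y) in the left image.

    Sketch only: the search is restricted to the same image row (rectified
    epipolar geometry is assumed, which the patent does not state), and the
    window size / disparity range are illustrative values.
    """
    h, w = left.shape
    if not (half <= y < h - half and half <= x < w - half):
        return None, 0.0                       # feature too close to the border
    ref = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    best_d, best_score = None, -1.0
    for d in range(0, max_disp):
        xr = x - d
        if xr - half < 0:
            break
        cand = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(np.float64)
        score = ncc(ref, cand)
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score                  # disparity in pixels and its NCC score
```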

After the extracted feature points have been given coordinates in the camera coordinate system, the world coordinate system calculation module must go one step further and convert them into the world coordinate system. Once the two coordinate systems have been matched before the system runs, the mapping between them is available, and through this mapping the coordinates of the feature points in the world coordinate system are also known.
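The patent only states that a mapping between the two coordinate systems is obtained by matching them before the system runs. A minimal sketch, assuming that mapping is a rigid transform (rotation R and translation t):

```python
import numpy as np

def camera_to_world(points_cam, R, t):
    """Map Nx3 camera-coordinate points into the world coordinate system.

    Assumes the pre-established camera-to-world relationship is a rigid
    transform (3x3 rotation R, 3-vector translation t); the patent only
    says a mapping is obtained beforehand.
    """
    points_cam = np.asarray(points_cam, dtype=np.float64)
    return points_cam @ R.T + t
```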

The target clustering module aggregates all feature points in the world coordinate system into several sets according to their height and position; these sets correspond to the true positions of the targets in space.
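No clustering algorithm is specified in the patent; purely as an illustration, the sketch below uses a simple greedy distance-based grouping over the ground-plane position with a height cut-off. The coordinate convention (third component = height) and all numeric parameters are assumptions.

```python
import numpy as np

def cluster_targets(points_world, radius=0.5, min_points=5, min_height=0.3):
    """Group world-coordinate feature points into candidate targets.

    Sketch only: greedy grouping of points whose ground-plane distance to a
    cluster centre is below `radius`; radius / min_points / min_height are
    illustrative values (metres), and height is assumed to be the z axis.
    """
    pts = np.asarray(points_world, dtype=np.float64)
    pts = pts[pts[:, 2] > min_height]          # drop points too close to the ground
    unassigned = list(range(len(pts)))
    clusters = []
    while unassigned:
        members = [unassigned.pop()]
        changed = True
        while changed:
            changed = False
            centre = pts[members, :2].mean(axis=0)
            for i in list(unassigned):
                if np.linalg.norm(pts[i, :2] - centre) < radius:
                    members.append(i)
                    unassigned.remove(i)
                    changed = True
        if len(members) >= min_points:
            clusters.append(pts[members].mean(axis=0))   # one centroid per target
    return clusters
```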

The trajectory analysis module works as follows: after the target's spatial coordinates in each frame have been expressed by a set, these discrete position points are turned, by means of a model, into the target's true motion trajectory in space.
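The "model" that turns per-frame positions into a trajectory is not specified. As one plausible reading, the sketch below associates each existing track with the nearest detection in the current frame, subject to a distance gate; the gate value is an assumption.

```python
import numpy as np

def update_tracks(tracks, detections, max_jump=1.0):
    """Extend each track with the nearest detection from the current frame.

    Sketch of the per-frame association step only: plain nearest-neighbour
    matching with a distance gate (max_jump, metres, illustrative value);
    unmatched detections start new tracks.
    """
    detections = [np.asarray(d, dtype=np.float64) for d in detections]
    used = set()
    for track in tracks:                       # each track is a list of positions
        last = track[-1]
        best_i, best_d = None, max_jump
        for i, det in enumerate(detections):
            if i in used:
                continue
            d = np.linalg.norm(det[:2] - last[:2])
            if d < best_d:
                best_i, best_d = i, d
        if best_i is not None:
            track.append(detections[best_i])
            used.add(best_i)
    tracks.extend([[det] for i, det in enumerate(detections) if i not in used])
    return tracks
```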

The present invention applies computer vision methods to tracking; experimental analysis shows that the computed trajectories agree well with the actual motion of targets in space. The method can therefore track and identify targets in the spatial region covered by the binocular cameras.

The present invention first takes as input the two digital images captured by the binocular camera, and decides feature points from the eigenvalues of the matrix computed over each pixel's neighborhood. For a feature point in the image coordinate system, its coordinates in the camera coordinate system are obtained by computing its disparity between the two images, and these coordinates can be converted into world coordinates through the world coordinate system calibrated in advance. The clustering module aggregates the feature points into discrete sets that represent objects in real space, and finally the trajectory analysis model connects the target's positions at different times into a motion trajectory.

The present invention can not only detect targets and track their motion trajectories under drastic illumination changes, but can also obtain the targets' speed of motion and the target density of a region. Compared with common detection and tracking methods, it obtains the trajectories of targets in space more accurately and more stably, while its detection speed meets the needs of normal applications; these characteristics give it broad application prospects in public-space surveillance and in pedestrian and vehicle flow statistics. The invention is based on the theory of computer stereo vision and incorporates pattern recognition, optimization methods and other theoretical tools, so that the motion trajectories of targets in space are obtained stably under changing illumination, complex backgrounds and other conditions.

Brief Description of the Drawings

Fig. 1 is a block diagram of the system structure of the present invention;

Fig. 2 is a processing flowchart of an embodiment of the present invention;

Fig. 3 is a schematic diagram of an application example of the present invention.

Detailed Description of the Embodiments

Embodiments of the present invention are described in detail below with reference to the drawings. The embodiment is implemented on the premise of the technical solution of the present invention, and detailed implementations are given, but the scope of protection of the present invention is not limited to the following embodiment.

As shown in Fig. 1, this embodiment comprises an input module, a feature extraction module, a disparity estimation module, a world coordinate system calculation module, a target clustering module and a trajectory analysis module, wherein:

the input module collects the images captured by the binocular camera system as the system input, and the left and right images obtained are fed to the feature extraction module;

the feature extraction module extracts feature points from the input image as the objects of subsequent processing;

the disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system from the camera parameters;

the world coordinate system calculation module converts the feature-point coordinates from the camera coordinate system into real-world coordinates;

the target clustering module aggregates the feature points, according to their coordinates, into sets that each represent a real object in space;

the trajectory analysis module combines the positions of the current target set and historical target sets to produce the target's motion trajectory in real space.

The input module is responsible for collecting the digital images of the binocular camera system; a digital image is an image acquired by a digital camera or a digital scanner, or one frame of the image sequence provided by a digital video camera. Each pixel value of the captured image is stored in order in the memory cells of the corresponding memory area; if the input image is a color image, it is split into the three channels R, G and B, which are stored separately.

The feature extraction module computes, for each pixel of the input image, the eigenvalues of the matrix formed from its neighborhood; when the value exceeds a preset threshold, the pixel is regarded as a feature point of the image.

The disparity estimation module searches the other input image for the point corresponding to each extracted feature point and matches the feature point's coordinate positions in the two images; combined with the camera parameters, its position in the camera coordinate system can be estimated accurately. The matching criterion is the NCC score computed between the patches centered on the candidate feature points, and the difference between the positions of the same real-world point in the two images is the disparity. Once the disparity is obtained, it is combined with the intrinsic and extrinsic parameters obtained by calibrating the two cameras in advance, and the feature point's x and y image coordinates and disparity are converted into coordinates in the camera coordinate system.
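The patent gives no explicit formula for this conversion; a standard rectified-stereo triangulation is one way to realize it. The sketch below assumes rectified cameras with focal lengths fx, fy, principal point (cx, cy) and baseline B taken from the prior calibration the text refers to, with depth Z = fx * B / d.

```python
def image_to_camera(x, y, disparity, fx, fy, cx, cy, baseline):
    """Convert an image point plus disparity into camera-frame coordinates.

    Sketch only: assumes rectified cameras; fx, fy, cx, cy and baseline come
    from the prior stereo calibration, and disparity must be non-zero.
    """
    Z = fx * baseline / disparity
    X = (x - cx) * Z / fx
    Y = (y - cy) * Z / fy
    return X, Y, Z
```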

After the extracted feature points have been given coordinates in the camera coordinate system, the world coordinate system calculation module combines these camera-frame coordinates with the pre-calibrated relationship between the world and camera coordinate systems, and projects the feature points from the camera coordinate system into the world coordinate system.

The target clustering module aggregates all feature points in the world coordinate system into several sets according to their height and position; these sets correspond to the true positions of the targets in space.

The trajectory analysis module works as follows: after the target's spatial coordinates in each frame have been expressed by a set, these discrete position points are turned, by means of a model, into the target's true motion trajectory in space.

Fig. 2 shows the processing flow of the system in this embodiment. The input module first reads the two images captured by the left and right cameras; the feature extraction module computes feature points in one of the images and then searches for the corresponding points in the other. The disparity estimation module uses the difference between the positions of each matched pair in the two images, combined with the camera parameters, to compute the feature point's coordinates in the camera coordinate system. The world coordinate system calculation module then converts the feature-point coordinates from the camera coordinate system into real-world coordinates. The target clustering module, using the relationship between the camera and world coordinate systems matched in advance, clusters the feature points converted to world coordinates into a number of sets with a clustering algorithm. Finally, the trajectory module gives each target's true trajectory of motion in space from its previous and current positions.
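Putting the modules together, one pass of the flow in Fig. 2 could look like the glue code below, composed from the sketches above; it assumes those sketch functions are in scope, and the `camera` dictionary bundling the pre-calibrated intrinsics and baseline is a convenience added here.

```python
def process_frame(left, right, R, t, tracks, camera):
    """One illustrative pass of the Fig. 2 pipeline, built from the sketches above."""
    points_cam = []
    for (x, y) in detect_feature_points(left):
        d, _score = match_feature(left, right, x, y)
        if d:                                  # skip unmatched or zero-disparity points
            points_cam.append(image_to_camera(x, y, d,
                                              camera["fx"], camera["fy"],
                                              camera["cx"], camera["cy"],
                                              camera["baseline"]))
    if not points_cam:
        return tracks
    points_world = camera_to_world(points_cam, R, t)   # camera frame -> world frame
    targets = cluster_targets(points_world)            # one centroid per detected target
    return update_tracks(tracks, targets)              # extend trajectories over time
```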

As shown in Fig. 3, this embodiment first reads in the two images captured by the left and right cameras, then computes feature points in one of the images, matches these feature points against the other image, and computes their coordinates in the camera coordinate system. The points are then projected into the world coordinate system, where the projections form clusters; after the different clusters are marked with different shades of gray, the positions of the targets in the monitored space are obtained. By processing every frame, the tracking module obtains the targets' true motion trajectories. The main window in Fig. 3 displays the left and right images captured by the binocular camera, showing the extracted feature points in real time, while the pop-up window "Cam" displays, in real time, the ground surveillance area, the projections of the feature points onto the ground, and the clustering results and tracking trajectories.

Claims (7)

1. A system capable of detecting and tracking targets in a designated area, characterized in that it comprises an input module, a feature extraction module, a disparity estimation module, a world coordinate system calculation module, a target clustering module and a trajectory analysis module, wherein: the input module collects the images captured by the binocular camera system as the system input, and the left and right images obtained are fed to the feature extraction module; the feature extraction module extracts feature points from the input image as the objects of subsequent processing; the disparity estimation module computes, for the feature points extracted by the feature extraction module, their spatial coordinates in the camera coordinate system from the camera parameters; the world coordinate system calculation module converts the feature-point coordinates from the camera coordinate system into real-world coordinates; the target clustering module aggregates the feature points, according to their coordinates, into sets that each represent a real object in space; and the trajectory analysis module combines the positions of the current target set and historical target sets to produce the target's motion trajectory in real space.
2. The tracking system with binocular matching according to claim 1, characterized in that the input module is responsible for collecting the digital images of the binocular camera system; a digital image is an image acquired by a digital camera or a digital scanner, or one frame of the image sequence provided by a digital video camera; each pixel value of the captured image is stored in order in the memory cells of the corresponding memory area, and if the input image is a color image, it is split into the three channels R, G and B, which are stored separately.
3. The tracking system with binocular matching according to claim 1, characterized in that the feature extraction module computes, for each pixel of the input image, the eigenvalues of the matrix formed from its neighborhood; when a preset threshold is exceeded, the pixel is regarded as a feature point of the image.
4. The tracking system with binocular matching according to claim 1, characterized in that the disparity estimation module searches the other input image for the point corresponding to each extracted feature point and matches the feature point's coordinate positions in the two images; combined with the camera parameters, its position in the camera coordinate system can be estimated accurately; the matching criterion is the NCC score computed between the patches centered on the candidate feature points, and the difference between the positions of the same real-world point in the two images is the disparity; once the disparity is obtained, it is combined with the intrinsic and extrinsic parameters obtained by calibrating the two cameras in advance, and the feature point's x and y image coordinates and disparity are converted into coordinates in the camera coordinate system.
5. The tracking system with binocular matching according to claim 1, characterized in that, after the extracted feature points have been given coordinates in the camera coordinate system, the world coordinate system calculation module combines these camera-frame coordinates with the pre-calibrated relationship between the world and camera coordinate systems and projects the feature points from the camera coordinate system into the world coordinate system.
6. The tracking system with binocular matching according to claim 1, characterized in that the target clustering module aggregates all feature points in the world coordinate system into several sets according to their height and position, these sets corresponding to the true positions of the targets in space.
7. The tracking system with binocular matching according to claim 1, characterized in that the trajectory analysis module, after the target's spatial coordinates in each frame have been expressed by a set, determines the target's true motion trajectory in space from these discrete position points by means of a model.
CN 200810042491 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting CN101344965A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200810042491 CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 200810042491 CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Publications (1)

Publication Number Publication Date
CN101344965A true CN101344965A (en) 2009-01-14

Family

ID=40246964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200810042491 CN101344965A (en) 2008-09-04 2008-09-04 Tracking system based on binocular camera shooting

Country Status (1)

Country Link
CN (1) CN101344965A (en)


Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101604448B (en) * 2009-03-16 2015-01-21 北京中星微电子有限公司 Method and system for measuring speed of moving targets
CN101604448A (en) * 2009-03-16 2009-12-16 北京中星微电子有限公司 Method and system for measuring speed of moving targets
CN101877796B (en) * 2009-04-28 2013-07-24 海信集团有限公司 Optical parallax acquiring method, device and system
WO2011006382A1 (en) * 2009-07-17 2011-01-20 深圳泰山在线科技有限公司 A method and terminal equipment for action identification based on marking points
CN101877174B (en) * 2009-09-29 2012-07-25 杭州海康威视软件有限公司 Vehicle speed measurement method, supervisory computer and vehicle speed measurement system
CN102034247A (en) * 2010-12-23 2011-04-27 中国科学院自动化研究所 Motion capture method for binocular vision image based on background modeling
CN102034247B (en) * 2010-12-23 2013-01-02 中国科学院自动化研究所 Motion capture method for binocular vision image based on background modeling
CN102175251A (en) * 2011-03-25 2011-09-07 江南大学 Binocular intelligent navigation system
CN102214000A (en) * 2011-06-15 2011-10-12 浙江大学 Hybrid registration method and system for target objects of mobile augmented reality (MAR) system
CN102506815A (en) * 2011-11-10 2012-06-20 河北汉光重工有限责任公司 Multi-target tracking and passive distance measuring device based on image recognition
CN102622767B (en) * 2012-03-05 2014-07-30 广州乐庚信息科技有限公司 Method for positioning binocular non-calibrated space
CN102622767A (en) * 2012-03-05 2012-08-01 广州乐庚信息科技有限公司 Method for positioning binocular non-calibrated space
CN102798456B (en) * 2012-07-10 2015-01-07 中联重科股份有限公司 Method, device and system for measuring working range of engineering mechanical arm frame system
CN102798456A (en) * 2012-07-10 2012-11-28 中联重科股份有限公司 Method, device and system for measuring working range of engineering mechanical arm frame system
CN102819847A (en) * 2012-07-18 2012-12-12 上海交通大学 Method for extracting movement track based on PTZ mobile camera
CN103083089A (en) * 2012-12-27 2013-05-08 广东圣洋信息科技实业有限公司 Virtual scale method and system of digital stereo-micrography system
CN103083089B (en) * 2012-12-27 2014-11-12 广东圣洋信息科技实业有限公司 Virtual scale method and system of digital stereo-micrography system
CN104182747A (en) * 2013-05-28 2014-12-03 株式会社理光 Object detection and tracking method and device based on multiple stereo cameras
CN103337076B (en) * 2013-06-26 2016-09-21 深圳市智美达科技股份有限公司 There is range determining method and device in video monitor object
CN103337076A (en) * 2013-06-26 2013-10-02 深圳市智美达科技有限公司 Method and device for determining appearing range of video monitoring targets
CN103595916A (en) * 2013-11-11 2014-02-19 南京邮电大学 Double-camera target tracking system and implementation method thereof
CN104754733A (en) * 2013-12-31 2015-07-01 南京理工大学 Node position prediction method of dynamic wireless network control system
CN104754733B (en) * 2013-12-31 2019-03-05 南京理工大学 Dynamic wireless network control system node location prediction technique
CN103826071A (en) * 2014-03-11 2014-05-28 深圳市中安视科技有限公司 Three-dimensional camera shooting method for three-dimensional identification and continuous tracking
CN104915965A (en) * 2014-03-14 2015-09-16 华为技术有限公司 Camera tracking method and device
WO2015135323A1 (en) * 2014-03-14 2015-09-17 华为技术有限公司 Camera tracking method and device
CN104317391B (en) * 2014-09-24 2017-10-03 华中科技大学 A kind of three-dimensional palm gesture recognition exchange method and system based on stereoscopic vision
CN104317391A (en) * 2014-09-24 2015-01-28 华中科技大学 Stereoscopic vision-based three-dimensional palm posture recognition interactive method and system
CN104408718A (en) * 2014-11-24 2015-03-11 中国科学院自动化研究所 Gait data processing method based on binocular vision measuring
CN104408718B (en) * 2014-11-24 2017-06-30 中国科学院自动化研究所 A kind of gait data processing method based on Binocular vision photogrammetry
CN105898265A (en) * 2014-12-18 2016-08-24 陆婷 Novel stereo video-based human body tracking method
CN104539909A (en) * 2015-01-15 2015-04-22 安徽大学 Video monitoring method and video monitoring server
CN104820434A (en) * 2015-03-24 2015-08-05 南京航空航天大学 Velocity measuring method of ground motion object by use of unmanned plane
CN105160649A (en) * 2015-06-30 2015-12-16 上海交通大学 Multi-target tracking method and system based on kernel function unsupervised clustering
CN106375654A (en) * 2015-07-23 2017-02-01 韩华泰科株式会社 Apparatus and method for controlling network camera
CN105072312A (en) * 2015-07-23 2015-11-18 柳州正高科技有限公司 Method for predicting image moving direction in dynamic video
CN105930766A (en) * 2016-03-31 2016-09-07 深圳奥比中光科技有限公司 Unmanned plane
CN105979203A (en) * 2016-04-29 2016-09-28 中国石油大学(北京) Multi-camera cooperative monitoring method and device
CN106657600B (en) * 2016-10-31 2019-10-15 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN106657600A (en) * 2016-10-31 2017-05-10 维沃移动通信有限公司 Image processing method and mobile terminal
CN106907988A (en) * 2017-02-27 2017-06-30 北京工业大学 The micro- visual modeling method of basic data matrix
CN106907988B (en) * 2017-02-27 2019-03-22 北京工业大学 The micro- visual modeling method of basic data matrix
CN108109176A (en) * 2017-12-29 2018-06-01 北京进化者机器人科技有限公司 Articles detecting localization method, device and robot
CN108257146A (en) * 2018-01-15 2018-07-06 新疆大学 Movement locus display methods and device
WO2020020160A1 (en) * 2018-07-25 2020-01-30 北京市商汤科技开发有限公司 Image parallax estimation

Similar Documents

Publication Publication Date Title
D’Orazio et al. A review of vision-based systems for soccer video analysis
US7583815B2 (en) Wide-area site-based video surveillance system
CN100531373C (en) Video frequency motion target close-up trace monitoring method based on double-camera head linkage structure
CN101141633B (en) Moving object detecting and tracing method in complex scene
Senior et al. Appearance models for occlusion handling
Mason et al. Using histograms to detect and track objects in color video
Park et al. Exploring weak stabilization for motion feature extraction
CN101739551B (en) Method and system for identifying moving objects
US7944454B2 (en) System and method for user monitoring interface of 3-D video streams from multiple cameras
Cohen et al. Detecting and tracking moving objects for video surveillance
Lipton Local application of optic flow to analyse rigid versus non-rigid motion
US20060028552A1 (en) Method and apparatus for stereo, multi-camera tracking and RF and video track fusion
Kong et al. A viewpoint invariant approach for crowd counting
US7787011B2 (en) System and method for analyzing and monitoring 3-D video streams from multiple cameras
Bleiweiss et al. Fusing time-of-flight depth and color for real-time segmentation and tracking
US9904852B2 (en) Real-time object detection, tracking and occlusion reasoning
EP2895986B1 (en) Methods, devices and systems for detecting objects in a video
US8170278B2 (en) System and method for detecting and tracking an object of interest in spatio-temporal space
Brown View independent vehicle/person classification
Gurghian et al. Deeplanes: End-to-end lane position estimation using deep neural networks
KR20120016479A (en) Camera tracking monitoring system and method using thermal image coordinates
CN101095149B (en) Image comparison apparatus and method
CN101389004A (en) Moving target classification method based on on-line study
Hu et al. Moving object detection and tracking from video captured by moving camera
CN101854516A (en) Video monitoring system, video monitoring server and video monitoring method

Legal Events

Date Code Title Description
C06 Publication
C10 Request of examination as to substance
C12 Rejection of an application for a patent