CN104504688A - Method and system based on binocular stereoscopic vision for passenger flow density estimation - Google Patents

Method and system based on binocular stereoscopic vision for passenger flow density estimation

Info

Publication number
CN104504688A
CN104504688A (application CN201410749992.8A)
Authority
CN
China
Prior art keywords
image
human body target
pixel
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410749992.8A
Other languages
Chinese (zh)
Inventor
朱秋煜
徐建忠
袁赛
王国威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2014-12-10
Publication date
2015-04-08
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN201410749992.8A
Publication of CN104504688A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30242 Counting objects in image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and a system for passenger flow density estimation based on binocular stereoscopic vision. The system comprises binocular parallel cameras, a DSP (Digital Signal Processor), a communication module, a video output module and the like, and is used to estimate passenger flow density. The method comprises the following steps: (1) calibrating the left and right cameras to obtain their internal and external parameters, and calculating the accurate principal point difference of the left and right cameras; (2) collecting the left and right images in real time, carrying out position correction, and applying an adaptive-window stereo matching method to obtain the current frame disparity image; (3) segmenting the foreground human targets to obtain a binary foreground human target image; (4) generating a two-dimensional mapping image of the foreground human targets; (5) calculating the degree of crowding and obtaining the density estimate through the nonlinear relationship between the degree of crowding and the number of people. The method is unaffected by scene illumination changes, shadows, perspective effects and occlusion, the equipment is simple, and the density estimation accuracy is high.

Description

Method and system for passenger flow density estimation based on binocular stereo vision
Technical field
The present invention relates to the application of computer vision technology, and in particular to a method and system for passenger flow density estimation based on binocular stereo vision.
Background art
Passenger flow density refers to how densely passengers are distributed over a plane and can be expressed as the number of passengers per unit area; it is used for passenger flow statistics and management in crowded places such as stations, halls and squares. At present, computer-vision-based methods have become the main technical approach and a research hotspot in the field of passenger flow statistics. These methods fall into two categories: methods based on monocular vision and methods based on binocular stereo vision.
Monocular methods use a single camera to capture the passenger flow scene and involve steps such as image acquisition, moving object detection, head detection and tracking, and passenger flow counting. They are low-cost and easy to deploy. However, because they have difficulty handling illumination changes, perspective effects, occlusion and shadows, foreground targets are hard to detect and are segmented with errors, so the counting accuracy is poor.
Passenger flow counting methods based on binocular stereo vision use a binocular camera to capture the passenger flow video. Using the depth perception principle of stereo vision, the depth information of targets is obtained from the binocular video images, and this depth information is then used to detect and segment the foreground targets and finally to estimate the density. Although such methods can, in principle, effectively handle the effects of illumination changes, perspective effects, occlusion and shadows, their camera calibration is cumbersome, and their disparity matching and target segmentation algorithms are computationally heavy and imprecise, so the resulting density estimation systems have poor real-time performance and adaptability and low estimation accuracy.
Summary of the invention
The object of the present invention is to address the problems in existing computer-vision-based passenger flow density estimation techniques and to provide a method and system for passenger flow density estimation based on binocular stereo vision that is not affected by scene illumination changes, shadows, perspective effects or occlusion, uses simple equipment and a relatively light algorithm, and can effectively improve the accuracy and real-time performance of passenger flow density estimation.
In order to achieve the above object, the technical solution adopted by the present invention is as follows:
A system for passenger flow density estimation based on binocular stereo vision, characterized in that:
The system consists of binocular parallel cameras, a DSP processor, a communication module, a video output module, a host computer, a video monitor and the like. The DSP processor is connected to the binocular parallel cameras, the communication module and the video output module respectively; the communication module is connected to the host computer; and the video output module is connected to the video monitor. The binocular parallel camera module comprises a left camera A1 and a right camera A2, which capture the left and right images respectively. The communication module comprises a network interface and an RS-485 interface for data communication with the host computer. The video output module outputs the passenger flow video image to the video monitor for real-time monitoring.
A method that uses the above system for passenger flow density estimation based on binocular stereo vision to count the passengers in a plane, with the following concrete steps:
(1) Calibrate the left and right cameras to obtain their intrinsic and extrinsic parameters, and compute the principal point difference of the left and right cameras from the disparities of the calibration points;
(2) Acquire the left and right images in real time, correct their positions according to the principal point difference, and perform stereo matching with an adaptive-window stereo matching method to obtain the current frame disparity image;
(3) Segment the foreground human targets from the current frame disparity image and the background frame disparity image to obtain a binary foreground human target image;
(4) Compute the two-dimensional ground-plane position corresponding to each pixel in the foreground human target image and generate a two-dimensional mapping image;
(5) Take the ratio of the foreground human target area on the ground plane to the background area to obtain the crowding degree, and obtain the estimated number of people per unit area from the nonlinear relationship between the crowding degree and the number of people.
Compared with existing passenger flow density estimation methods based on monocular image processing, the method for passenger flow density estimation based on binocular stereo vision of the present invention has the following features and advantages: the present invention uses a binocular camera and the principle of stereo vision to obtain the depth information of objects, which solves the sensitivity to light and the interference from shadows that are common in monocular vision techniques; mapping the targets onto a two-dimensional plane is used to segment the foreground human targets, which effectively overcomes perspective effects and occlusion and achieves the purpose of density estimation. The proposed method is suitable for public places such as stations, exhibition halls and squares that often experience short-term rush hours; it is not affected by scene illumination changes, shadows, perspective effects or occlusion, requires simple equipment, and achieves a high density estimation accuracy.
Brief description of the drawings
Fig. 1 is a schematic diagram of the hardware system of the method for passenger flow density estimation based on binocular stereo vision of the present invention.
Fig. 2 is a flow chart of the method for passenger flow density estimation based on binocular stereo vision of the present invention.
Fig. 3 is a schematic flow chart of an embodiment of the method for passenger flow density estimation based on binocular stereo vision of the present invention.
Detailed description of the embodiments
The method and system for passenger flow density estimation based on binocular stereo vision of the present invention are described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the above system for passenger flow density estimation based on binocular stereo vision is characterized in that:
The system consists of binocular parallel cameras, a DSP processor, a communication module, a video output module, a host computer, a video monitor and the like. The DSP processor is connected to the binocular parallel cameras, the communication module and the video output module respectively, together forming the passenger flow density estimation unit; the communication module is connected to the host computer, and the video output module is connected to the video monitor;
The binocular parallel camera module comprises a left camera A1 and a right camera A2, which capture the left and right images respectively;
The communication module comprises a network interface and an RS-485 interface, each used for data communication with the host computer;
The video output module outputs the passenger flow video image to the video monitor for real-time monitoring.
The cameras are installed overhead, tilted forward and downward; during installation, the tilt angle α between the camera optical axis and the camera mounting pole is adjusted so that the passenger flow density of the desired region can be estimated. The smaller the angle, the larger the density estimation region.
As shown in Fig. 2 and Fig. 3, the above method for passenger flow density estimation based on binocular stereo vision uses the above system to estimate the crowd density of a region and comprises the following steps:
(1) Calibrate the left and right cameras to obtain their intrinsic and extrinsic parameters; compute the principal point difference of the left and right cameras from the disparities of the calibration points, and compute the background frame disparity image. The concrete steps are as follows:
(1-1) Using Zhang Zhengyou's camera calibration algorithm, set up a checkerboard calibration board; the left and right cameras capture at least three calibration images in the image pixel coordinate system from different directions, and the corner points in each calibration image are extracted as feature points;
(1-2) Use the feature points in the above calibration images to solve for the homography matrices, and then obtain the intrinsic and extrinsic parameters of the cameras from the homography matrices, including the principal point coordinates u_0, v_0, the rotation matrix R and the translation vector t; use the calibration results to compute the coordinates of each calibration point in the camera coordinate system x_c y_c z_c;
(1-3) Compute the principal point difference of the left and right cameras from the coordinates of the above calibration points in the left and right calibration images. The principal point difference in the horizontal direction d_u is computed as:
d_u = d − b·f / z_c    (1)
where z_c is the z-axis value of the calibration point in the camera coordinate system, d is the coordinate difference of the calibration point along the horizontal direction of the image pixel coordinate system, i.e. the disparity of the calibration point, b is the baseline length of the left and right cameras, and f is their focal length; the principal point differences computed from all calibration points are averaged to obtain the principal point difference of the left and right cameras;
The mean of the v-coordinate differences of all calibration points in the pixel coordinate system is taken as the principal point difference in the vertical direction d_v;
(1-4) Compute the scene background disparity image. Steps (1-1) to (1-3) are illustrated by the sketch below.
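The following is an editorial sketch, not part of the original disclosure, of the calibration and principal point difference computation of steps (1-1) to (1-3), written in Python with OpenCV under the reconstructed equation (1); the checkerboard geometry, square size and baseline are assumed values, and the board is assumed to be detected in every view by both cameras.

```python
import cv2
import numpy as np

# Editorial sketch of steps (1-1) to (1-3): Zhang Zhengyou calibration of each
# camera and estimation of the principal point difference (d_u, d_v).
# PATTERN, SQUARE and the baseline are assumptions, not values from the patent.

PATTERN = (9, 6)     # inner corners of the checkerboard (assumption)
SQUARE = 25.0        # checkerboard square size in mm (assumption)

def calibrate(images):
    """Zhang-style calibration from a list of grayscale checkerboard views."""
    obj = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    obj[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE
    obj_pts, img_pts = [], []
    for img in images:
        found, corners = cv2.findChessboardCorners(img, PATTERN)
        if found:
            obj_pts.append(obj)
            img_pts.append(corners)
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, images[0].shape[::-1], None, None)
    return K, dist, rvecs, tvecs, obj_pts, img_pts

def principal_point_difference(left_imgs, right_imgs, baseline_mm):
    """Average d_u = d - b*f/z_c over all corners, per the reconstructed equation (1)."""
    Kl, _, rl, tl, objl, ipl = calibrate(left_imgs)
    Kr, _, _, _, _, ipr = calibrate(right_imgs)
    f = 0.5 * (Kl[0, 0] + Kr[0, 0])           # shared focal length in pixels (assumption)
    du, dv = [], []
    for rvec, tvec, obj, cl, cr in zip(rl, tl, objl, ipl, ipr):
        R, _ = cv2.Rodrigues(rvec)
        zc = (R @ obj.T + tvec).T[:, 2]       # z_c of each corner in the left camera frame
        d = cl[:, 0, 0] - cr[:, 0, 0]         # horizontal disparity of each corner, in pixels
        du.append(d - baseline_mm * f / zc)   # equation (1): d_u = d - b*f/z_c
        dv.append(cl[:, 0, 1] - cr[:, 0, 1])  # vertical coordinate difference for d_v
    return float(np.mean(np.concatenate(du))), float(np.mean(np.concatenate(dv)))
```

The returned (d_u, d_v) pair is what step (2-1) uses to translate the left image before matching.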
(2) Acquire the left and right images in real time, correct their positions according to the principal point difference, and perform stereo disparity matching with the adaptive-window matching method to obtain the current frame disparity image, specifically as follows:
(2-1) Acquire the left and right images from the left and right cameras in real time; according to the principal point difference of the left and right cameras described in step (1), translate the left image by (d_u, d_v) to correct the effect of the principal point difference on stereo matching, in preparation for the accurate disparity computation of the next step;
(2-2) Obtain the current frame disparity image. First, take one of the left and right images as the reference image and the other as the matching image. An adaptive window centred on a pixel of the reference image is used as the matching unit; then, within the disparity range in the matching image, a window of the same size is taken around each candidate pixel, and the window correlation measure between the pixel to be matched in the reference image and each candidate is computed in turn, denoted C(d). The correlation measure is:
C(d) = Σ_W [I_r(x+d, y) − Ī_r]·[I_l(x, y) − Ī_l] / sqrt( Σ_W [I_r(x+d, y) − Ī_r]² · Σ_W [I_l(x, y) − Ī_l]² )    (2)
where (x, y) are image pixel coordinates, I_r and I_l are the grey values of the right and left images respectively, Ī_r and Ī_l are the mean grey values within the windows centred on the matched pixels in the right and left images, the sums run over the matching window W, and d is the offset along the epipolar line within the disparity matching range. The correlation measure values C(d) are interpolated with a quadratic form, and the offset d at which the correlation measure is maximal is the resulting disparity;
Disparity computation is performed for all pixels of the image with the above region matching method to obtain the current frame disparity image. The initial adaptive window of each pixel is 4% of the image width wide and 3 rows high; the grey-level variance of the pixels within the window is then computed, and if it is below a given threshold the window contains too little structural information, so the window width is enlarged by 50% and the variance is computed and tested again, until the variance exceeds the threshold or the window width reaches 1/5 of the image width. The variance threshold is determined experimentally. This matching procedure is illustrated by the sketch below.
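A minimal editorial sketch of the adaptive-window matching of step (2-2), assuming the reconstructed correlation measure of equation (2) and the window-growing rule just described; it is written for clarity rather than speed, and the variance threshold and disparity range are assumed values.

```python
import numpy as np

def zncc(ref_win, cand_win):
    """Zero-mean normalized correlation between two equal-size windows (equation (2))."""
    a = ref_win - ref_win.mean()
    b = cand_win - cand_win.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def adaptive_window(img, x, y, var_thresh=50.0, init_frac=0.04, max_frac=0.2, rows=3):
    """Grow the window width by 50% until its grey variance exceeds var_thresh
    or the width reaches max_frac of the image width."""
    h, w = img.shape
    half_w = max(1, int(init_frac * w) // 2)
    half_h = rows // 2
    while True:
        win = img[max(0, y - half_h):y + half_h + 1, max(0, x - half_w):x + half_w + 1]
        if win.var() >= var_thresh or 2 * half_w + 1 >= max_frac * w:
            return half_w, half_h
        half_w = int(half_w * 1.5) + 1

def match_pixel(left, right, x, y, d_max=64, var_thresh=50.0):
    """Disparity of reference (left) pixel (x, y) with sub-pixel quadratic interpolation."""
    half_w, half_h = adaptive_window(left, x, y, var_thresh)
    h, w = left.shape
    if y - half_h < 0 or y + half_h >= h or x - half_w < 0 or x + half_w >= w:
        return 0.0
    ref = left[y - half_h:y + half_h + 1, x - half_w:x + half_w + 1].astype(np.float64)
    scores = []
    for d in range(min(d_max, x - half_w) + 1):   # candidate shifted left in the right image
        cand = right[y - half_h:y + half_h + 1,
                     x - d - half_w:x - d + half_w + 1].astype(np.float64)
        scores.append(zncc(ref, cand))
    d = int(np.argmax(scores))
    if 0 < d < len(scores) - 1:                   # quadratic interpolation around the peak
        c0, c1, c2 = scores[d - 1], scores[d], scores[d + 1]
        if c0 - 2 * c1 + c2 != 0:
            d = d + 0.5 * (c0 - c2) / (c0 - 2 * c1 + c2)
    return float(d)
```

Sweeping match_pixel over every pixel of the corrected reference image gives the current frame disparity image used in step (3).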
(3) Segment the foreground human targets from the current frame disparity image and the background frame disparity image to obtain the binary foreground human target image:
(3-1) Compute the background subtraction of the disparity. The difference obtained by subtracting the current frame disparity from the background frame disparity is denoted D_k(x, y) and is given by:
D_k(x, y) = d_b(x, y) − d_k(x, y)    (3)
where (x, y) are the pixel coordinates, d_k(x, y) is the disparity of the current frame (frame k), d_b(x, y) is the background frame disparity, and D_k(x, y) is the computed disparity difference;
(3-2) Compute the binary foreground human target image B_k(x, y) as:
B_k(x, y) = 1 if |D_k(x, y)| > T, and B_k(x, y) = 0 otherwise    (4)
where T is the threshold on the disparity difference, determined experimentally; a sketch of this segmentation step follows;
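The segmentation of step (3) reduces to a per-pixel difference and threshold. Below is a short editorial sketch under the reconstructed equations (3) and (4); the absolute-value comparison and the default value of T are assumptions.

```python
import numpy as np

def segment_foreground(current_disp, background_disp, T=3.0):
    """Binary foreground human target image from the reconstructed equations (3)-(4):
    D_k = d_b - d_k, then B_k = 1 where |D_k| > T and 0 elsewhere."""
    diff = background_disp.astype(np.float32) - current_disp.astype(np.float32)
    return (np.abs(diff) > T).astype(np.uint8)
```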
(4) Compute the two-dimensional ground-plane position corresponding to each pixel in the foreground human target image and generate the two-dimensional mapping image, specifically as follows:
(4-1) Map a point (u, v) of the foreground human target image, given in the image pixel coordinate system, to the point (x, y) of the image coordinate system expressed in physical units (for example millimetres). The mapping is:
x = (u − u_0)·dx,  y = (v − v_0)·dy    (5)
where dx and dy are the physical sizes of a pixel along the x and y axes of the image coordinate system, u and v are the pixel coordinates in the image pixel coordinate system, and u_0 and v_0 are the principal point coordinates of the image;
(4-2) Map the values in the image coordinate system to the x-axis, y-axis and z-axis values in the camera coordinate system. The relationship is:
x_c = x·z_c / f,  y_c = y·z_c / f    (6)
where x_c, y_c and z_c are the coordinates of the point along the x, y and z axes of the camera coordinate system, f is the focal length, and z_c is obtained from:
z_c = b·f / d    (7)
where d is the disparity of the pixel described in step (2), b is the baseline length of the left and right cameras, and f is their focal length;
(4-3) Map the values in the camera coordinate system to the world coordinate system to obtain the two-dimensional ground-plane position corresponding to each pixel of the foreground human target image. The computation is:
[x_w, y_w, z_w]^T = R^(−1) · ([x_c, y_c, z_c]^T − t)    (8)
where (x_w, y_w, z_w) are the world coordinates of the point, R is the camera rotation matrix and t is the camera translation vector, both determined during the camera calibration of the first step;
(4-4) Form the two-dimensional mapping image of the distribution on the x_w y_w ground plane from the world coordinate positions of all foreground human target points; the mapping image is indexed by the x_w y_w coordinate values. For foreground human target points whose z_w is non-zero, the grey value of the mapping image is set to 1; for foreground human target points whose z_w is zero, the grey value of the mapping image is set to 0. This ground-plane mapping is illustrated by the sketch below.
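An editorial sketch of step (4) under the reconstructed relations (5) to (8): every foreground pixel is back-projected through the disparity and rasterised into a ground-plane occupancy grid. The intrinsic parameters, baseline, grid size, cell size and the placement of the grid origin are assumptions.

```python
import numpy as np

def map_to_ground_plane(foreground, disparity, fx, fy, u0, v0, dx, dy,
                        R, t, baseline_mm, grid_shape=(200, 200), cell_mm=100.0):
    """Project foreground pixels onto the x_w-y_w ground plane (equations (5)-(8))
    and mark the occupied cells of a two-dimensional mapping grid."""
    grid = np.zeros(grid_shape, np.uint8)
    R_inv = np.linalg.inv(R)
    vs, us = np.nonzero(foreground)
    for u, v in zip(us, vs):
        d = disparity[v, u]
        if d <= 0:
            continue
        zc = baseline_mm * fx / d                  # equation (7), z_c in mm
        x = (u - u0) * dx                          # equation (5), image coordinates in mm
        y = (v - v0) * dy
        xc = x * zc / (fx * dx)                    # equation (6); fx*dx is the focal length in mm
        yc = y * zc / (fy * dy)
        pw = R_inv @ (np.array([xc, yc, zc]) - t.ravel())   # equation (8)
        ix = int(pw[0] // cell_mm) + grid_shape[1] // 2     # grid origin at the centre (assumption)
        iy = int(pw[1] // cell_mm) + grid_shape[0] // 2
        if 0 <= ix < grid_shape[1] and 0 <= iy < grid_shape[0]:
            grid[iy, ix] = 1                       # mark the ground cell of this foreground point
    return grid
```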
(5) Take the ratio of the foreground human target area on the ground plane to the background area to obtain the crowding degree, and obtain the estimated number of people per unit area from the nonlinear relationship between the crowding degree and the number of people, specifically as follows:
(5-1) Map the density estimation area of the background image onto the x_w y_w ground plane to obtain a background mapping area; the actual foreground human target image is also mapped onto the background mapping area, and the ratio of the foreground human target area to the background mapping area is defined as the crowding degree:
y = S_f / S_b    (9)
where S_f is the foreground human target area in the mapping image, S_b is the background mapping area, and y is the crowding degree;
(5-2) Fit the relationship between the crowding degree and the number of people per unit area in the actual scene with a quadratic polynomial:
N = a_2·y² + a_1·y + a_0    (10)
where N is the number of people per unit area and a_2, a_1 and a_0 are the coefficients of the quadratic polynomial, calibrated in advance from experimental pairs of the crowding degree and the number of people per unit area in the actual scene. The unit area is usually one square metre, i.e. the crowd density is estimated as the number of people per square metre. A sketch of this step follows.
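Step (5) is a ratio followed by a pre-calibrated quadratic. The editorial sketch below assumes the reconstructed equations (9) and (10); the calibration pairs in the example are invented purely for illustration.

```python
import numpy as np

def crowding_degree(foreground_grid, background_grid):
    """Equation (9): y = S_f / S_b, foreground area over the monitored ground area."""
    s_b = float(np.count_nonzero(background_grid))
    s_f = float(np.count_nonzero(foreground_grid & background_grid))
    return s_f / s_b if s_b else 0.0

def fit_density_model(crowding_samples, people_per_m2_samples):
    """Calibrate the quadratic of equation (10) from measured (crowding, density) pairs."""
    return np.polyfit(crowding_samples, people_per_m2_samples, 2)   # [a_2, a_1, a_0]

def estimate_density(crowding, coeffs):
    """Equation (10): people per square metre from the crowding degree."""
    return float(np.polyval(coeffs, crowding))

# Invented calibration pairs, for illustration only: crowding degree vs people per m^2.
coeffs = fit_density_model([0.05, 0.15, 0.30, 0.45, 0.60],
                           [0.2, 0.7, 1.6, 2.8, 4.1])
print(estimate_density(0.25, coeffs))
```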
Repeating steps (2) to (5) yields a real-time crowd density estimate.

Claims (7)

1. A system for passenger flow density estimation based on binocular stereo vision, characterized in that: the system consists of binocular parallel cameras, a DSP processor, a communication module, a video output module, a host computer, a video monitor and the like; the DSP processor is connected to the binocular parallel cameras, the communication module and the video output module respectively; the communication module is connected to the host computer; the video output module is connected to the video monitor; the binocular parallel camera module comprises a left camera A1 and a right camera A2, which capture the left and right images respectively; the communication module comprises a network interface and an RS-485 interface for data communication with the host computer; and the video output module outputs the passenger flow video image to the video monitor for real-time monitoring.
2. A method for passenger flow density estimation based on binocular stereo vision, which uses the system for passenger flow density estimation based on binocular stereo vision according to claim 1 to count the passengers in a plane, characterized in that the method comprises the following steps:
(1) Calibrate the left and right cameras to obtain their intrinsic and extrinsic parameters, and compute the principal point difference of the left and right cameras from the disparities of the calibration points;
(2) Acquire the left and right images in real time, correct their positions according to the principal point difference, and perform stereo matching with an adaptive-window stereo matching method to obtain the current frame disparity image;
(3) Segment the foreground human targets from the current frame disparity image and the background frame disparity image to obtain a binary foreground human target image;
(4) Compute the two-dimensional ground-plane position corresponding to each pixel in the foreground human target image and generate a two-dimensional mapping image;
(5) Take the ratio of the foreground human target area on the ground plane to the background area to obtain the crowding degree, and obtain the estimated number of people per unit area from the nonlinear relationship between the crowding degree and the number of people.
3. The method for passenger flow density estimation based on binocular stereo vision according to claim 2, characterized in that, in the above step (1), the left and right cameras are calibrated to obtain their intrinsic and extrinsic parameters, and the principal point difference of the left and right cameras is computed from the disparities of the calibration points, with the following concrete steps:
(1-1) Using Zhang Zhengyou's camera calibration algorithm, set up a checkerboard calibration board; the left and right cameras capture at least three calibration images in the image pixel coordinate system from different directions, and the corner points in each calibration image are extracted as feature points;
(1-2) Use the feature points in the above calibration images to solve for the homography matrices, and then obtain the intrinsic and extrinsic parameters of the cameras from the homography matrices, including the principal point coordinates u_0, v_0, the rotation matrix R and the translation vector t; use the calibration results to compute the coordinates of each calibration point in the camera coordinate system x_c y_c z_c;
(1-3) Compute the principal point difference of the left and right cameras from the coordinates of the above calibration points in the left and right calibration images. The principal point difference in the horizontal direction d_u is computed as:
d_u = d − b·f / z_c    (1)
where z_c is the z-axis value of the calibration point in the camera coordinate system, d is the coordinate difference of the calibration point along the horizontal direction of the image pixel coordinate system, i.e. the disparity of the calibration point, b is the baseline length of the left and right cameras, and f is their focal length; the principal point differences computed from all calibration points are averaged to obtain the principal point difference of the left and right cameras.
4. The method for passenger flow density estimation based on binocular stereo vision according to claim 3, characterized in that, in the above step (2), the left and right images are acquired in real time, position correction is performed according to the principal point difference, and stereo disparity matching is performed with the adaptive-window matching method to obtain the current frame disparity image, specifically as follows:
(2-1) Acquire the left and right images from the left and right cameras in real time; according to the principal point difference of the left and right cameras described in step (1), translate the left image by (d_u, d_v) to correct the effect of the principal point difference on stereo matching, in preparation for the accurate disparity computation of the next step;
(2-2) Obtain the current frame disparity image. First, take one of the left and right images as the reference image and the other as the matching image. An adaptive window centred on a pixel of the reference image is used as the matching unit; then, within the disparity range in the matching image, a window of the same size is taken around each candidate pixel, and the window correlation measure between the pixel to be matched in the reference image and each candidate is computed in turn, denoted C(d). The correlation measure is:
C(d) = Σ_W [I_r(x+d, y) − Ī_r]·[I_l(x, y) − Ī_l] / sqrt( Σ_W [I_r(x+d, y) − Ī_r]² · Σ_W [I_l(x, y) − Ī_l]² )    (2)
where (x, y) are image pixel coordinates, I_r and I_l are the grey values of the right and left images respectively, Ī_r and Ī_l are the mean grey values within the windows centred on the matched pixels in the right and left images, the sums run over the matching window W, and d is the offset along the epipolar line within the disparity matching range; the correlation measure values C(d) are interpolated with a quadratic form, and the offset d at which the correlation measure is maximal is the resulting disparity.
5. The method for passenger flow density estimation based on binocular stereo vision according to claim 4, characterized in that, in the above step (3), the foreground human targets are segmented from the current frame disparity image and the background frame disparity image to obtain the binary foreground human target image:
(3-1) Compute the background subtraction of the disparity. The difference obtained by subtracting the current frame disparity from the background frame disparity is denoted D_k(x, y) and is given by:
D_k(x, y) = d_b(x, y) − d_k(x, y)    (3)
where (x, y) are the pixel coordinates, d_k(x, y) is the disparity of the current frame (frame k), d_b(x, y) is the background frame disparity, and D_k(x, y) is the computed disparity difference;
(3-2) Compute the binary foreground human target image B_k(x, y) as:
B_k(x, y) = 1 if |D_k(x, y)| > T, and B_k(x, y) = 0 otherwise    (4)
where T is the threshold on the disparity difference, determined experimentally.
6. The method for passenger flow density estimation based on binocular stereo vision according to claim 5, characterized in that, in the above step (4), the two-dimensional ground-plane position corresponding to each pixel in the foreground human target image is computed and the two-dimensional mapping image is generated, specifically as follows:
(4-1) Map a point (u, v) of the foreground human target image, given in the image pixel coordinate system, to the point (x, y) of the image coordinate system expressed in physical units (for example millimetres). The mapping is:
x = (u − u_0)·dx,  y = (v − v_0)·dy    (5)
where dx and dy are the physical sizes of a pixel along the x and y axes of the image coordinate system, u and v are the pixel coordinates in the image pixel coordinate system, and u_0 and v_0 are the principal point coordinates of the image;
(4-2) Map the values in the image coordinate system to the x-axis, y-axis and z-axis values in the camera coordinate system. The relationship is:
x_c = x·z_c / f,  y_c = y·z_c / f    (6)
where x_c, y_c and z_c are the coordinates of the point along the x, y and z axes of the camera coordinate system, f is the focal length, and z_c is obtained from:
z_c = b·f / d    (7)
where d is the disparity of the pixel described in step (2), b is the baseline length of the left and right cameras, and f is their focal length;
(4-3) Map the values in the camera coordinate system to the world coordinate system to obtain the two-dimensional ground-plane position corresponding to each pixel of the foreground human target image. The computation is:
[x_w, y_w, z_w]^T = R^(−1) · ([x_c, y_c, z_c]^T − t)    (8)
where (x_w, y_w, z_w) are the world coordinates of the point, R is the camera rotation matrix and t is the camera translation vector, both determined during the camera calibration of the first step;
(4-4) Form the two-dimensional mapping image of the distribution on the x_w y_w ground plane from the world coordinate positions of all foreground human target points; the mapping image is indexed by the x_w y_w coordinate values. For foreground human target points whose z_w is non-zero, the grey value of the mapping image is set to 1; for foreground human target points whose z_w is zero, the grey value of the mapping image is set to 0.
7. The method for passenger flow density estimation based on binocular stereo vision according to claim 6, characterized in that, in the above step (5), the ratio of the foreground human target area on the ground plane to the background area is taken to obtain the crowding degree, and the estimated number of people per unit area is obtained from the nonlinear relationship between the crowding degree and the number of people, specifically as follows:
(5-1) Map the density estimation area of the background image onto the x_w y_w ground plane to obtain a background mapping area; the actual foreground human target image is also mapped onto the background mapping area, and the ratio of the foreground human target area to the background mapping area is defined as the crowding degree:
y = S_f / S_b    (9)
where S_f is the foreground human target area in the mapping image, S_b is the background mapping area, and y is the crowding degree;
(5-2) Fit the relationship between the crowding degree and the number of people per unit area in the actual scene with a quadratic polynomial:
N = a_2·y² + a_1·y + a_0    (10)
where N is the number of people per unit area and a_2, a_1 and a_0 are the coefficients of the quadratic polynomial, calibrated in advance from experimental pairs of the crowding degree and the number of people per unit area in the actual scene.
CN201410749992.8A 2014-12-10 2014-12-10 Method and system based on binocular stereoscopic vision for passenger flow density estimation Pending CN104504688A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410749992.8A CN104504688A (en) 2014-12-10 2014-12-10 Method and system based on binocular stereoscopic vision for passenger flow density estimation


Publications (1)

Publication Number Publication Date
CN104504688A (en) 2015-04-08

Family

ID=52946082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410749992.8A Pending CN104504688A (en) 2014-12-10 2014-12-10 Method and system based on binocular stereoscopic vision for passenger flow density estimation

Country Status (1)

Country Link
CN (1) CN104504688A (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1731456A (en) * 2005-08-04 2006-02-08 浙江大学 Bus passenger traffic statistical method based on stereoscopic vision and system therefor
CN101714293A (en) * 2009-12-16 2010-05-26 上海交通投资信息科技有限公司 Stereoscopic vision based acquisition method of congestion degree of bus passenger flow

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104902258A (en) * 2015-06-09 2015-09-09 公安部第三研究所 Multi-scene pedestrian volume counting method and system based on stereoscopic vision and binocular camera
CN106485735A (en) * 2015-09-01 2017-03-08 南京理工大学 Human body target recognition and tracking method based on stereovision technique
CN106503605A (en) * 2015-09-01 2017-03-15 南京理工大学 Human body target recognition methods based on stereovision technique
CN105225482A (en) * 2015-09-02 2016-01-06 上海大学 Based on vehicle detecting system and the method for binocular stereo vision
CN105225482B (en) * 2015-09-02 2017-08-11 上海大学 Vehicle detecting system and method based on binocular stereo vision
CN105277169B (en) * 2015-09-25 2017-12-22 安霸半导体技术(上海)有限公司 Binocular distance-finding method based on image segmentation
CN105277169A (en) * 2015-09-25 2016-01-27 安霸半导体技术(上海)有限公司 Image segmentation-based binocular range finding method
CN105551291A (en) * 2016-02-16 2016-05-04 深圳市特维视科技有限公司 Bus taking guide system
CN105551291B (en) * 2016-02-16 2018-06-05 深圳市特维视科技有限公司 Transit riding guidance system
CN106127137A (en) * 2016-06-21 2016-11-16 长安大学 A kind of target detection recognizer based on 3D trajectory analysis
CN106327473A (en) * 2016-08-10 2017-01-11 北京小米移动软件有限公司 Method and device for acquiring foreground images
CN106446789A (en) * 2016-08-30 2017-02-22 电子科技大学 Pedestrian real-time detection method based on binocular vision
CN108230351A (en) * 2016-12-15 2018-06-29 上海杰轩智能科技有限公司 Sales counter evaluation method and system based on binocular stereo vision pedestrian detection
CN106897698B (en) * 2017-02-24 2019-12-06 常州常工电子科技股份有限公司 Classroom people number detection method and system based on machine vision and binocular collaborative technology
CN106897698A (en) * 2017-02-24 2017-06-27 常州常工电子科技股份有限公司 Classroom number detection method and system based on machine vision Yu binocular coordination technique
CN111279392A (en) * 2017-11-06 2020-06-12 三菱电机株式会社 Cluster density calculation device, cluster density calculation method, and cluster density calculation program
CN111279392B (en) * 2017-11-06 2023-12-15 三菱电机株式会社 Cluster density calculation device, cluster density calculation method, and computer-readable storage medium
CN108960096B (en) * 2018-06-22 2021-08-17 深圳市恒天伟焱科技股份有限公司 Human body identification method based on stereoscopic vision and infrared imaging
CN108960096A (en) * 2018-06-22 2018-12-07 杭州晶智能科技有限公司 Human body recognition method based on stereoscopic vision and infrared imaging
CN109447042A (en) * 2018-12-17 2019-03-08 公安部第三研究所 The system and method for top-type passenger flow monitor processing is realized based on stereovision technique
CN111486802A (en) * 2020-04-07 2020-08-04 东南大学 Rotating shaft calibration method based on self-adaptive distance weighting
CN111486802B (en) * 2020-04-07 2021-04-06 东南大学 Rotating shaft calibration method based on self-adaptive distance weighting
CN111664798A (en) * 2020-04-29 2020-09-15 深圳奥比中光科技有限公司 Depth imaging method and device and computer readable storage medium
CN112668525A (en) * 2020-12-31 2021-04-16 深圳云天励飞技术股份有限公司 People flow counting method and device, electronic equipment and storage medium
CN112668525B (en) * 2020-12-31 2024-05-07 深圳云天励飞技术股份有限公司 People flow counting method and device, electronic equipment and storage medium
CN114120372A (en) * 2022-01-24 2022-03-01 深圳爱莫科技有限公司 Space passenger flow heat distribution method and system based on human body detection and identification
CN115880687A (en) * 2023-02-09 2023-03-31 北京东方瑞丰航空技术有限公司 Method, device, equipment and medium for automatically generating infrared characteristics of target object


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150408