CN107818287A - Passenger flow statistics device and system - Google Patents

Passenger flow statistics device and system

Info

Publication number
CN107818287A
CN107818287A (application CN201610822384.4A)
Authority
CN
China
Prior art keywords
pedestrian
image
current frame
frame image
pedestrian target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610822384.4A
Other languages
Chinese (zh)
Other versions
CN107818287B (en)
Inventor
张杨
张盼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to CN201610822384.4A priority Critical patent/CN107818287B/en
Publication of CN107818287A publication Critical patent/CN107818287A/en
Application granted granted Critical
Publication of CN107818287B publication Critical patent/CN107818287B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/53 - Recognition of crowd images, e.g. recognition of crowd congestion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 - Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques

Abstract

The present invention provides a passenger flow statistics device and system, relating to image processing technology, for improving the accuracy of passenger flow statistics. The passenger flow statistics device of the present invention includes: an image acquisition module for obtaining a current frame image; a pedestrian target area image determining module for determining, from the current frame image and an obtained background image, the pedestrian target area image corresponding to the current frame image; a pedestrian target distribution determining module for determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images; a pedestrian density and pixel weight determining sub-module for determining the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and a passenger flow counting module for determining the passenger count in the current frame image from the pedestrian target area image, the pedestrian density and the pixel weights. The present invention is mainly used in passenger flow statistics technology.

Description

Passenger flow statistics device and system
Technical field
The present invention relates to image processing technology, and more particularly to a passenger flow statistics device and system.
Background technology
Passenger flow is an important measure of how busy a public place is, and accurate passenger flow statistics provide important real-time guidance for managing public places. For example, if the real-time passenger flow of a subway station or shopping mall can be grasped accurately, the number of staff can be adjusted dynamically and a reasonable marketing strategy can be formulated.
With the development of information technology, video surveillance has been widely adopted and is developing toward intelligent automation. Compared with counting passengers manually or by human observation of video, passenger flow statistics based on video analysis has great advantages in terms of statistical accuracy and resource cost.
The invention patent application with application number 201410380811.9 describes a computer-vision-based passenger flow statistics method, which counts bus passengers using human detection based on HOG (Histogram of Oriented Gradients) features and a feature point tracking algorithm. The invention patent application with application number 201010122671.7 also analyses the HOG features and texture features of video to detect targets, uses a cooperative strategy to track the targets, and thereby achieves passenger flow statistics.
However, the inventors found in the course of making the present invention that passenger flow statistics methods in the prior art rely on detection based on the HOG features of the video. Owing to the defects of HOG features, when the detection environment changes, pedestrians occlude each other severely, or lighting conditions are poor, the accuracy of passenger flow statistics using the prior art declines accordingly.
Content of the invention
In view of this, the present invention provides a passenger flow statistics device and system to improve the accuracy of passenger flow statistics.
In order to solve the above technical problem, in a first aspect, the present invention provides a passenger flow statistics device, including:
an image acquisition module for obtaining a current frame image;
a pedestrian target area image determining module for determining, from the current frame image and an obtained background image, a pedestrian target area image corresponding to the current frame image;
a pedestrian target distribution determining module for determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images;
a pedestrian density and pixel weight determining sub-module for determining a pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution;
a passenger flow counting module for determining the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
Wherein, the pedestrian target area image determining module includes:
a foreground image acquisition sub-module for obtaining a foreground image according to the difference between the current frame image and the background image;
a pedestrian target area image acquisition sub-module for performing morphological filtering on the foreground image to obtain the pedestrian target area image.
Wherein, the foreground image acquisition sub-module includes:
a pixel marking unit for, for each pixel in the current frame image, marking a first pixel as a background point if the difference between the pixel value of the first pixel and the pixel value of the second pixel corresponding to the first pixel in the background image lies within a preset range, and otherwise marking the first pixel as a foreground point;
an image acquisition unit for obtaining the foreground image according to the background points and foreground points marked in the current frame image.
Wherein, the pedestrian target distribution determining module includes:
an image scaling sub-module for enlarging the current frame image by different amplification coefficients;
an image traversal sub-module for performing window traversal on the enlarged current frame image using the pedestrian target classifier;
a first coordinate acquisition sub-module for, during window traversal, if a first window is identified as containing a pedestrian target, obtaining a first coordinate of the pedestrian target in the first window and the current amplification coefficient of the current frame image;
a second coordinate acquisition sub-module for scaling the first coordinate by the current amplification coefficient to obtain the second coordinate corresponding to the first coordinate in the current frame image;
a pedestrian target distribution acquisition sub-module for determining the pedestrian target distribution in the current frame image according to the pedestrian target points corresponding to the one or more second coordinates obtained.
Wherein, the pedestrian target distribution determining module further includes:
a first size acquisition sub-module for obtaining a first size of the pedestrian target in the first window;
a second size acquisition sub-module for scaling the first size by the current amplification coefficient to obtain the second size corresponding to the first size in the current frame image.
Wherein, the pedestrian density and pixel weight determining sub-module includes:
a count acquisition sub-module for obtaining, for each pedestrian target point in the current frame image, the number of first windows identified as containing a pedestrian target in which that pedestrian target point appears;
a pedestrian density calculation sub-module for calculating the integral of that number over a predetermined region centred on each pixel in the current frame image, to obtain the pedestrian density corresponding to each pixel, the pixels including both the pedestrian target points and non-pedestrian target points;
a size determining sub-module for determining, according to the second coordinates and second sizes corresponding to any two pedestrian target points, the estimated size of each non-pedestrian target point among the pixels of the current frame image;
a weight value acquisition sub-module for determining the weight values of each pedestrian target point and each non-pedestrian target point according to the second size corresponding to each pedestrian target point in the current frame image and the estimated size corresponding to each non-pedestrian target point.
Wherein, the passenger flow counting module includes:
a pixel determining sub-module for determining the pixels in the current frame image that are positively correlated with the number of pedestrians;
a weighted sum determining sub-module for determining the weighted sum of the pixels positively correlated with the number of pedestrians according to the pedestrian target area image, the pedestrian density and the pixel weights;
a passenger count calculation sub-module for obtaining a regression coefficient and determining the passenger count in the current frame image using the regression coefficient and the weighted sum.
Wherein, the weighted sum determining sub-module determines the weighted sum according to the following equation:
n0 = Σ_{i=0..I} Σ_{j=0..J} Bij · Cij · Wij
wherein (i, j) denotes the coordinate, in the current frame image, of a pixel positively correlated with the number of pedestrians; (I, J) denotes the size of the current frame image; Bij denotes the pixel value, in the pedestrian target area image, of the pixel positively correlated with the number of pedestrians; Cij denotes the pedestrian density corresponding to the pixel positively correlated with the number of pedestrians; and Wij denotes the weight value corresponding to the pixel positively correlated with the number of pedestrians.
Wherein, the passenger count calculation sub-module includes:
a regression coefficient calculation unit for obtaining the regression coefficient according to the following process: (a) take the product of an initial regression coefficient and the weighted sum as the first passenger count; (b) calculate the product of the i-th regression coefficient and the weighted sum, and take that product as the i-th passenger count; (c) if that product is equal to the (i-1)-th passenger count, take the i-th regression coefficient as the regression coefficient; otherwise repeat (b)-(c) until the product obtained in (b) equals the (i-1)-th passenger count; where i = 2, 3, 4, ...;
a passenger count calculation unit for determining the passenger count in the current frame image using the regression coefficient and the weighted sum.
In a second aspect, the present invention provides a passenger flow statistics system, including:
an image acquisition device for obtaining a current frame image;
a passenger flow statistics device for determining, from the current frame image and an obtained background image, a pedestrian target area image corresponding to the current frame image; determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images; determining a pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and determining the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
Wherein, the system further includes:
a passenger count processing device for presenting the passenger count to the user.
The above technical solutions of the present invention have the following beneficial effects:
In the embodiments of the present invention, a current frame image is obtained, and the pedestrian target area image corresponding to it is obtained. Meanwhile, the pedestrian target distribution in the current frame image is determined using a pedestrian target classifier trained on HOG features, the pedestrian density and pixel weights corresponding to the current frame image are determined based on that distribution, and the passenger count in the current frame image is then determined from the pedestrian target area image, the pedestrian density and the pixel weights. As can be seen from the above, in the embodiments of the present invention, in addition to the HOG features of the current frame image, the pedestrian density and pixel weights determined from the pedestrian target distribution are used, so that the positions of pedestrians in the image and their occlusion relationships are fully taken into account, which improves the accuracy of passenger flow statistics.
Brief description of the drawings
Fig. 1 is a flowchart of the passenger flow statistics method of Embodiment 1 of the present invention;
Fig. 2 is an architecture diagram of a system using the passenger flow statistics method of Embodiment 2 of the present invention;
Fig. 3 is a flowchart of the passenger flow statistics method of Embodiment 2 of the present invention;
Fig. 4 is a schematic diagram of the passenger flow statistics device of Embodiment 3 of the present invention;
Fig. 5 is a schematic diagram of the passenger flow statistics system of Embodiment 4 of the present invention;
Fig. 6 is another schematic diagram of the passenger flow statistics system of Embodiment 4 of the present invention;
Fig. 7 is a schematic diagram of the passenger flow statistics equipment of Embodiment 5 of the present invention.
Embodiments
The embodiments of the present invention are described in further detail below in conjunction with the drawings and examples. The following embodiments are used to illustrate the present invention, but do not limit its scope.
As shown in Fig. 1, the passenger flow statistics method of Embodiment 1 of the present invention includes:
Step 101: obtain a current frame image.
In a specific application, real-time surveillance data are captured with a camera to obtain a sequence of video frame images. Each frame image in the obtained video sequence can serve as the current frame image and is processed as follows.
Step 102: determine the pedestrian target area image corresponding to the current frame image from the current frame image and an obtained background image.
In a specific application, a background model is established for the obtained sequence of video frame images using Gaussian background modelling, so as to adapt to the continuous change of the external environment. In the modelling process, by statistically analysing a large number of samples of each pixel of the pre-acquired video images over a long period, background modelling of a complex dynamic scene can be achieved and the background image of the video scene obtained, so that the method adapts to changes of the scene.
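As a non-authoritative sketch, the Gaussian background modelling step could be realised with OpenCV's mixture-of-Gaussians background subtractor roughly as follows; the function names are existing OpenCV APIs, but the parameter values and the choice of MOG2 as the concrete Gaussian model are assumptions made only for illustration:

```python
import cv2

# Assumed sketch: a mixture-of-Gaussians background model updated frame by
# frame, so the estimated background adapts to gradual scene and lighting changes.
bg_model = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                              detectShadows=False)

def update_background(frame):
    """Feed one video frame to the model and return the current background estimate."""
    bg_model.apply(frame)                      # updates the per-pixel Gaussian mixtures
    return bg_model.getBackgroundImage()
```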
The background image obtained by Gaussian background modelling and the current frame image are then differenced to obtain a foreground image. Specifically, for each pixel in the current frame image, if the difference between the pixel value of a first pixel and the pixel value of the second pixel corresponding to the first pixel in the background image lies within a preset range, the first pixel is marked as a background point; otherwise, the first pixel is marked as a foreground point. A background point may be marked as 0 and a foreground point as 1. The foreground image can then be obtained from the background points and foreground points marked in the current frame image.
After the foreground image is obtained, a morphological filtering operation of erosion followed by dilation is performed on it with a structuring element of a certain shape, to avoid interference from image noise and thereby obtain the pedestrian target region.
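A minimal sketch of the differencing and morphological filtering described above is given below; the threshold defining the preset range and the structuring-element size are illustrative assumptions rather than values specified in this application:

```python
import cv2
import numpy as np

def pedestrian_target_region(frame_gray, background_gray, thresh=15, ksize=5):
    # Mark foreground points (1) where the frame/background difference falls
    # outside the preset range [-thresh, thresh]; background points stay 0.
    diff = frame_gray.astype(np.int16) - background_gray.astype(np.int16)
    foreground = (np.abs(diff) > thresh).astype(np.uint8)
    # Morphological filtering: erode first, then dilate, to suppress image noise.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (ksize, ksize))
    return cv2.dilate(cv2.erode(foreground, kernel), kernel)   # binary image B
```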
In the embodiment of the present invention, the binary image B representing the obtained pedestrian target region Ω can be expressed as:
Bij = 1 if (i, j) ∈ Ω, and Bij = 0 otherwise,
where (i, j) denotes the position of a pixel in the current frame image.
Step 103: determine the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images.
In the embodiment of the present invention, a number of video images are used as a training sample set; a large number of pedestrian positive samples and negative samples are cropped from the video images, their HOG features are extracted, and off-line training is performed with an SVM (Support Vector Machine) classifier to obtain the pedestrian target classifier.
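Purely as an illustrative sketch of the off-line training step (the detection window size, kernel type and sample handling are assumptions, not details fixed by this application):

```python
import cv2
import numpy as np

hog = cv2.HOGDescriptor()   # default 64x128 detection window

def train_pedestrian_classifier(positive_imgs, negative_imgs):
    """Extract HOG features from cropped pedestrian / non-pedestrian samples
    and train a linear SVM offline to obtain the pedestrian target classifier."""
    feats, labels = [], []
    for img, label in [(p, 1) for p in positive_imgs] + [(n, -1) for n in negative_imgs]:
        sample = cv2.resize(img, (64, 128))
        feats.append(hog.compute(sample).flatten())
        labels.append(label)
    svm = cv2.ml.SVM_create()
    svm.setType(cv2.ml.SVM_C_SVC)
    svm.setKernel(cv2.ml.SVM_LINEAR)
    svm.train(np.float32(feats), cv2.ml.ROW_SAMPLE, np.int32(labels))
    return svm
```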
In this process, the pedestrian target classifier is used to traverse the current frame image at different amplification coefficients. That is, in the embodiment of the present invention, in order to reduce missed detections as far as possible, a multi-scale detection mechanism is introduced when performing sliding-window target detection on the current frame image: the current frame image is progressively enlarged using different amplification coefficients and detection is performed on each enlarged image.
Specifically, the current frame image is enlarged by different amplification coefficients, and window traversal is performed on the enlarged current frame image using the pedestrian target classifier. During window traversal, if a first window is identified as containing a pedestrian target, a first coordinate of the pedestrian target in the first window and the current amplification coefficient of the current frame image are obtained. The first coordinate is then scaled by the current amplification coefficient to obtain the second coordinate corresponding to the first coordinate in the current frame image, and the pedestrian target distribution in the current frame image is determined from the pedestrian target points corresponding to the one or more second coordinates obtained.
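The following sketch illustrates the multi-scale traversal and the mapping of first coordinates and sizes back into the current frame image; the amplification coefficients, window size, stride and the is_pedestrian predicate are assumed placeholders, not values taken from this application:

```python
import cv2

def detect_pedestrian_targets(frame, is_pedestrian, coeffs=(1.0, 1.25, 1.5, 2.0),
                              win=(64, 128), stride=16):
    """Return pedestrian target points (second coordinates) and second sizes,
    expressed in the coordinates of the original current frame image."""
    targets = []                                    # (x, y, width, height)
    for c in coeffs:                                # current amplification coefficient
        enlarged = cv2.resize(frame, None, fx=c, fy=c)
        h, w = enlarged.shape[:2]
        for y in range(0, h - win[1] + 1, stride):
            for x in range(0, w - win[0] + 1, stride):
                if is_pedestrian(enlarged[y:y + win[1], x:x + win[0]]):
                    # scale the first coordinate and first size back by 1/c
                    targets.append((int(x / c), int(y / c),
                                    int(win[0] / c), int(win[1] / c)))
    return targets
```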
Thus, in the embodiment of the present invention, the pedestrian target distribution includes the position of each pedestrian target (i.e. each pedestrian target point) and may further include the size (or scale) of each pedestrian target.
Accordingly, this embodiment may further include: obtaining a first size of the pedestrian target in the first window, and scaling the first size by the current amplification coefficient to obtain the second size corresponding to the first size in the current frame image.
Step 104: determine the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution.
In the embodiment of the present invention, the pixels in the current frame image can be divided into the above-mentioned pedestrian target points and the remaining non-pedestrian target points, which are not regarded as pedestrian targets.
In the embodiment of the present invention, for each pedestrian target point in the current frame image, the number of first windows identified as containing a pedestrian target in which that pedestrian target point appears is obtained; the integral of that number over a predetermined region centred on each pixel in the current frame image is then calculated to obtain the pedestrian density corresponding to each pixel. The size of the predetermined region can be set arbitrarily.
Specifically, denote by ρ the number of first windows identified as containing a pedestrian target in which a given pedestrian target point of the current frame image appears. If a pedestrian target point lies in exactly one first window identified as containing a pedestrian target, its ρ value is 1; if it lies in two such first windows at the same time, its ρ value is 2. Then, the integral of the ρ values over a certain region centred on each pixel of the current frame image (including both pedestrian target points and non-pedestrian target points) is calculated; this integral represents the pedestrian density of that pixel. From the pedestrian density of each pixel, the pedestrian density map corresponding to the current frame image is obtained.
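Under the above definition of ρ, the density map could be built roughly as follows; the neighbourhood size is an arbitrary example, since the text allows the predetermined region to be set freely:

```python
import cv2
import numpy as np

def pedestrian_density_map(shape, target_points, detection_windows, region=15):
    """rho stores, at each pedestrian target point, how many windows identified
    as containing a pedestrian cover that point; the density of every pixel is
    then the integral (sum) of rho over a region x region neighbourhood."""
    rho = np.zeros(shape, np.float32)
    for (px, py) in target_points:
        rho[py, px] = sum(1 for (x, y, w, h) in detection_windows
                          if x <= px < x + w and y <= py < y + h)
    # An unnormalised box filter sums rho over the neighbourhood of each pixel.
    return cv2.boxFilter(rho, -1, (region, region), normalize=False)
```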
For a rectangular neighbourhood centred on a given pixel, the more pedestrian targets are detected within it, the larger the pedestrian density value of that pixel in the pedestrian density map, and conversely the smaller.
Based on the actual imaging characteristic that pedestrian targets near the camera appear larger, different weights are assigned to pedestrian targets at different positions in the current frame image according to the position and size of each pedestrian target, yielding the pixel weights corresponding to the current frame image and thus the weight map of the current frame image. In the weight map, pedestrian target points near the camera have smaller weights and those farther away have larger weights, which eliminates the influence of distance on the relationship between pixel features and the number of pedestrians.
Step 105: determine the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
In this step, the pixels in the current frame image that are positively correlated with the number of pedestrians are determined based on the analysis of historical video images; the weighted sum of these pixels is then determined according to the pedestrian target area image, the pedestrian density and the pixel weights; a regression coefficient is obtained, and the passenger count in the current frame image is determined using the regression coefficient and the weighted sum.
As can be seen from the above, in the embodiment of the present invention, when performing passenger flow statistics, in addition to the HOG features of the current frame image, the pedestrian density and pixel weights determined from the pedestrian target distribution are used, so that the positions of pedestrians in the image and their occlusion relationships are fully taken into account, which improves the accuracy of passenger flow statistics.
In practical applications, the passenger flow statistics method of the embodiments of the present invention can be applied to passenger flow statistics for subways, bus stations and the like. In Embodiment 2 of the present invention, the implementation of the passenger flow statistics method is described by taking passenger flow statistics for a subway as an example.
In Embodiment 2 of the present invention, as shown in Fig. 2, the system to which the passenger flow statistics method is applied includes a camera 201, an embedded device 202 and a PC 203, where the embedded device may also be arranged inside the PC. The camera 201 is installed at the top of a subway passageway and shoots diagonally downward, so as to capture the scene to be detected as completely as possible. The input of the embedded device 202 is connected to the video output of the camera 201 to obtain pedestrian information in the subway passageway. The embedded device 202 analyses the video images obtained by the camera, obtains passenger flow statistics information, and transmits it to the PC 203 through its wireless communication module. The PC 203 counts and analyses the passenger flow information in real time, and when the passenger flow value at a certain moment exceeds a set threshold, the PC 203 sends a crowd dispersion prompt to the subway staff.
When the camera starts, it immediately begins to obtain video images of the subway passageway; the embedded device performs passenger flow statistics and sends the data to the PC; the PC finally analyses and stores the passenger flow data and controls the management of subway transportation, realising intelligent subway management. The embedded device is the core and determines the statistical accuracy.
As shown in Fig. 3, the passenger flow statistics method of Embodiment 2 of the present invention includes:
Step 301: acquire a sequence of video images with a camera.
Step 302: establish a background model for the acquired video sequence images using Gaussian background modelling to obtain a background image.
During background modelling, by statistically analysing a large number of samples of each pixel in the video sequence images over a long period, background modelling of a complex dynamic scene can be achieved and the background image of the subway scene in the video obtained, so as to adapt to the continuous changes of the subway passageway scene and illumination.
Step 303: obtain a pedestrian target classifier.
In this step, video images of several subway passageway scenes are used as a training sample set; a large number of pedestrian positive samples and negative samples are cropped from the video images, their HOG features are extracted, and off-line training is performed with an SVM classifier to obtain a pedestrian target classifier, which is stored in the embedded device for subsequent pedestrian target detection.
In the above process, steps 301 and 302 may be performed either before or after step 303.
Step 304: for the current frame image in the video sequence images, determine the pedestrian target area image corresponding to the current frame image.
For the current frame image in the video, the pedestrian area image Ω in the subway passageway, i.e. the foreground image, is obtained from the difference between the current frame image and the background image and is represented as a binary image: pixels inside the pedestrian area have the value 1 in the foreground image, and pixels outside the pedestrian area have the value 0. A filtering process of erosion followed by dilation is then applied to the foreground image to avoid interference from noise in the image, finally giving the binary pedestrian target area image B that represents the pedestrian area.
The binary image B can be expressed as:
Bij = 1 if (i, j) ∈ Ω, and Bij = 0 otherwise,
where (i, j) denotes the position of a pixel in the current frame image.
Step 305: determine the pedestrian target distribution in the current frame image using the pedestrian target classifier.
In this step, window traversal of the current frame image over a multi-scale position space is performed in the embedded device using the trained pedestrian classifier. In order to reduce missed detections as far as possible, a multi-scale detection mechanism is introduced when performing sliding-window target detection on the current frame image: the current frame image is progressively enlarged using different amplification coefficients and then detected.
If, in the current frame image at a certain amplification coefficient, a detection window is identified as containing a pedestrian target, the position and scale of the pedestrian target are recorded and mapped back into the current frame image according to that amplification coefficient, giving the position and scale of the pedestrian target in the current frame image.
In the above process, steps 304 and 305 have no fixed order of execution and may also be performed simultaneously.
Step 306: determine the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution.
Specifically, denote by ρ the number of first windows identified as containing a pedestrian target in which a given pedestrian target point of the current frame image appears. If a pedestrian target point lies in exactly one first window identified as containing a pedestrian target, its ρ value is 1; if it lies in two such first windows at the same time, its ρ value is 2. Then, the integral of the ρ values over a certain region centred on each pixel of the current frame image (including both pedestrian target points and non-pedestrian target points) is calculated; this integral represents the pedestrian density of that pixel. From the pedestrian density of each pixel, the pedestrian density map corresponding to the current frame image is obtained.
For a rectangular neighbourhood centred on a given pixel, the more pedestrian targets are detected within it, the larger the pedestrian density value of that pixel in the pedestrian density map, and conversely the smaller.
Based on the actual imaging characteristic that pedestrian targets near the camera appear larger, different weights are assigned to pedestrian targets at different positions in the current frame image according to the position and size of each pedestrian target, yielding the pixel weight map W corresponding to the current frame image. In the weight map W, regions near the camera have smaller weights and regions farther away have larger weights, which eliminates the influence of distance on the relationship between pixel features and the number of pedestrians.
Specifically, according to the coordinates and sizes corresponding to any two pedestrian target points in the current frame image, the estimated size of each non-pedestrian target point among the pixels of the current frame image is determined; then, according to the size corresponding to each pedestrian target point in the current frame image and the estimated size corresponding to each non-pedestrian target point, the weight values of each pedestrian target point and each non-pedestrian target point are determined.
Since pedestrians are usually upright in the video image, the vertical coordinate and vertical (longitudinal) size of a pedestrian target in the current frame image can be used to represent its position and size in the current frame image.
Suppose the position and size of pedestrian target point 1 in the current frame image are (y1, a1) and those of pedestrian target point 2 are (y2, a2); then the estimated size a0 corresponding to a non-pedestrian target point at position y0 in the current frame image satisfies formula (1).
From the weight of each pixel, the weight map W of the current frame image is constructed. To obtain an accurate relationship between pixel features and the number of pedestrians and to eliminate the influence caused by distance, the weight corresponding to each pixel of the weight map W of the video image is inversely proportional to the size of the target at that point.
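Since formula (1) is not reproduced in this text, the sketch below assumes that the estimated size a0 at row y0 is obtained by linear interpolation between the two known targets (y1, a1) and (y2, a2), and that the weight of a pixel is simply the reciprocal of the size estimated at its row; both choices are assumptions made only for illustration:

```python
import numpy as np

def weight_map(shape, p1, p2):
    """Build a weight map W whose value at each pixel is inversely proportional
    to the estimated pedestrian size at that row (larger targets near the
    camera get smaller weights)."""
    (y1, a1), (y2, a2) = p1, p2                    # vertical coordinate and vertical size
    rows = np.arange(shape[0], dtype=np.float32)
    est_size = a1 + (a2 - a1) * (rows - y1) / float(y2 - y1)   # assumed linear model
    est_size = np.clip(est_size, 1.0, None)        # keep estimated sizes positive
    weights = 1.0 / est_size                       # inverse proportionality
    return np.tile(weights[:, None], (1, shape[1]))
```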
Step 307: combine the pedestrian target area image, the pedestrian density map and the weight map to obtain the weighted sum of the pixels in the current frame image that are positively correlated with the number of pedestrians. This weighted sum can be expressed by the following formula (2):
n0 = Σ_{i=0..I} Σ_{j=0..J} Bij · Cij · Wij    (2)
where (i, j) denotes the position of a pixel in the current frame image, (I, J) denotes the size of the current frame image, and Bij, Cij and Wij denote the values of that pixel in the pedestrian target area image B, the pedestrian density map and the weight map W, respectively.
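With the binary pedestrian target area image B, the pedestrian density map C and the weight map W all of the same size (I, J), formula (2) reduces to an element-wise product followed by a sum over all pixels, for example:

```python
import numpy as np

def weighted_sum(B, C, W):
    """n0 = sum over all pixels (i, j) of Bij * Cij * Wij, as in formula (2)."""
    return float(np.sum(B.astype(np.float32) * C * W))
```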
Step 308: determine the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
In this step, the relationship between the weighted sum n of historical video images and the actual passenger count N is observed using a linear regression method to obtain a regression coefficient. For the current frame image, its passenger count N0 can then be expressed as the product of the regression coefficient and its weighted sum n0, as in formula (3).
In the embodiment of the present invention, the regression coefficient can be determined in the following way.
To fully represent the relationship between the pixel weighted sum and the number of pedestrians in the current frame image, the regression coefficients corresponding to different passenger counts Ni are obtained by observing historical video images of the subway passageway. The detailed process is as follows:
(a) take the product of an initial regression coefficient and the weighted sum as the first passenger count;
(b) calculate the product of the i-th regression coefficient and the weighted sum, and take that product as the i-th passenger count;
(c) if that product is equal to the (i-1)-th passenger count, take the i-th regression coefficient as the regression coefficient; otherwise repeat steps (b)-(c) until the product obtained in step (b) equals the (i-1)-th passenger count; where i = 2, 3, 4, ....
Specifically, for the pixel weighted sum n0 calculated for the current frame image, a regression coefficient is first selected at random as the initial value and a passenger count Nt2 is calculated. Next, the regression coefficient corresponding to Nt2 is selected and a passenger count Nt3 is calculated. If Nt3 equals Nt2, that coefficient is used as the regression coefficient of this step; otherwise the calculation is iterated until Ntn equals Ntn-1, at which point the regression coefficient corresponds to the passenger count Ntn, and the passenger count in the current frame image is obtained as the product of that regression coefficient and the weighted sum, as in formula (4).
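A sketch of this fixed-point iteration follows. The table mapping a passenger count to the regression coefficient learned from historical video images is assumed to exist; its name, its structure and the rounding of the product to an integer count are illustrative assumptions:

```python
def estimate_passenger_count(n0, coeff_for_count, initial_coeff, max_iter=100):
    """Iterate count_i = round(coeff_i * n0), then look up the coefficient
    associated with count_i; stop when two successive counts agree."""
    coeff, prev_count = initial_coeff, None
    for _ in range(max_iter):                      # safeguard against non-convergence
        count = int(round(coeff * n0))             # product of coefficient and weighted sum
        if count == prev_count:
            return count, coeff                    # converged: N_tn equals N_tn-1
        prev_count = count
        coeff = coeff_for_count.get(count, coeff)  # coefficient for this passenger count
    return prev_count, coeff
```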
Step 309: judge whether every frame image in the video sequence has been processed; if so, the flow ends; otherwise, the corresponding passenger count is obtained for each remaining frame image according to the process of steps 304-308 above.
As can be seen from the above, the passenger flow statistics method of the embodiments of the present invention has the following advantages: (1) it adapts to changes of the background and illumination of the video scene; (2) it can count the passenger flow in the monitored scene and makes up for the inaccuracy of HOG detection caused by severe occlusion, so it offers high statistical accuracy and good robustness; (3) the system consists of a front end and a back end: the front end processes the surveillance video, obtains the processing results and transmits the processed data to the back end, and the back end collects, counts and analyses the results sent to it, forming an effective solution that is easy to apply.
As shown in Fig. 4, the passenger flow statistics device of Embodiment 3 of the present invention includes:
an image acquisition module 401 for obtaining a current frame image; a pedestrian target area image determining module 402 for determining, from the current frame image and an obtained background image, the pedestrian target area image corresponding to the current frame image; a pedestrian target distribution determining module 403 for determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images; a pedestrian density and pixel weight determining sub-module 404 for determining the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and a passenger flow counting module 405 for determining the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
Wherein, the pedestrian target area image determining module 402 includes:
a foreground image acquisition sub-module for obtaining a foreground image according to the difference between the current frame image and the background image; and a pedestrian target area image acquisition sub-module for performing morphological filtering on the foreground image to obtain the pedestrian target area image.
Specifically, the foreground image acquisition sub-module includes: a pixel marking unit for, for each pixel in the current frame image, marking a first pixel as a background point if the difference between the pixel value of the first pixel and the pixel value of the second pixel corresponding to the first pixel in the background image lies within a preset range, and otherwise marking the first pixel as a foreground point; and an image acquisition unit for obtaining the foreground image according to the background points and foreground points marked in the current frame image.
Wherein, the pedestrian target distribution determining module 403 includes:
an image scaling sub-module for enlarging the current frame image by different amplification coefficients; an image traversal sub-module for performing window traversal on the enlarged current frame image using the pedestrian target classifier; a first coordinate acquisition sub-module for, during window traversal, if a first window is identified as containing a pedestrian target, obtaining a first coordinate of the pedestrian target in the first window and the current amplification coefficient of the current frame image; a second coordinate acquisition sub-module for scaling the first coordinate by the current amplification coefficient to obtain the second coordinate corresponding to the first coordinate in the current frame image; and a pedestrian target distribution acquisition sub-module for determining the pedestrian target distribution in the current frame image according to the pedestrian target points corresponding to the one or more second coordinates obtained.
In addition, to further improve the accuracy of passenger flow statistics, the pedestrian target distribution determining module 403 further includes: a first size acquisition sub-module for obtaining a first size of the pedestrian target in the first window; and a second size acquisition sub-module for scaling the first size by the current amplification coefficient to obtain the second size corresponding to the first size in the current frame image.
Specifically, the pedestrian density and pixel weight determining sub-module includes:
a count acquisition sub-module for obtaining, for each pedestrian target point in the current frame image, the number of first windows identified as containing a pedestrian target in which that pedestrian target point appears; a pedestrian density calculation sub-module for calculating the integral of that number over a predetermined region centred on each pixel in the current frame image, to obtain the pedestrian density corresponding to each pixel, where the pixels include both the pedestrian target points and non-pedestrian target points; a size determining sub-module for determining, according to the second coordinates and second sizes corresponding to any two pedestrian target points, the estimated size of each non-pedestrian target point among the pixels of the current frame image; and a weight value acquisition sub-module for determining the weight values of each pedestrian target point and each non-pedestrian target point according to the second size corresponding to each pedestrian target point in the current frame image and the estimated size corresponding to each non-pedestrian target point.
Wherein, the passenger flow counting module 405 includes:
a pixel determining sub-module for determining the pixels in the current frame image that are positively correlated with the number of pedestrians; a weighted sum determining sub-module for determining the weighted sum of the pixels positively correlated with the number of pedestrians according to the pedestrian target area image, the pedestrian density and the pixel weights; and a passenger count calculation sub-module for obtaining a regression coefficient and determining the passenger count in the current frame image using the regression coefficient and the weighted sum.
Specifically, the weighted sum determining sub-module determines the weighted sum according to the following equation:
n0 = Σ_{i=0..I} Σ_{j=0..J} Bij · Cij · Wij
where (i, j) denotes the coordinate, in the current frame image, of a pixel positively correlated with the number of pedestrians; (I, J) denotes the size of the current frame image; Bij denotes the pixel value, in the pedestrian target area image, of the pixel positively correlated with the number of pedestrians; Cij denotes the pedestrian density corresponding to the pixel positively correlated with the number of pedestrians; and Wij denotes the weight value corresponding to the pixel positively correlated with the number of pedestrians.
In practical applications, the passenger count calculation sub-module includes: a regression coefficient calculation unit for obtaining the regression coefficient according to the following process: (a) take the product of an initial regression coefficient and the weighted sum as the first passenger count; (b) calculate the product of the i-th regression coefficient and the weighted sum, and take that product as the i-th passenger count; (c) if that product is equal to the (i-1)-th passenger count, take the i-th regression coefficient as the regression coefficient; otherwise repeat (b)-(c) until the product obtained in (b) equals the (i-1)-th passenger count; where i = 2, 3, 4, ....
For the working principle of the device of the present invention, reference may be made to the description of the foregoing method embodiments.
As can be seen from the above, in the embodiment of the present invention, when performing passenger flow statistics, in addition to the HOG features of the current frame image, the pedestrian density and pixel weights determined from the pedestrian target distribution are used, so that the positions of pedestrians in the image and their occlusion relationships are fully taken into account, which improves the accuracy of passenger flow statistics.
As shown in Fig. 5, the passenger flow statistics system of Embodiment 4 of the present invention includes:
an image acquisition device 501 for obtaining a current frame image;
a passenger flow statistics device 502 for determining, from the current frame image and an obtained background image, the pedestrian target area image corresponding to the current frame image; determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images; determining the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and determining the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
Wherein, the image acquisition device includes a camera.
As shown in Fig. 6, to further improve the user experience, the system may further include a passenger count processing device 503 for presenting the passenger count to the user. For example, the passenger count processing device may be a device such as a PC.
In a specific application, the passenger flow statistics device may also be provided inside the passenger count processing device, in which case the passenger count processing device has functions such as passenger flow analysis, statistics and processing.
As can be seen from the above, in the embodiments of the present invention, in addition to the HOG features of the current frame image, the pedestrian density and pixel weights determined from the pedestrian target distribution are used, so that the positions of pedestrians in the image and their occlusion relationships are fully taken into account, which improves the accuracy of passenger flow statistics.
As shown in Fig. 7, Embodiment 5 of the present invention further provides passenger flow statistics equipment that can implement the flow of the embodiment shown in Fig. 1 or Fig. 3 of the present invention. The passenger flow statistics equipment may be a PC (personal computer), a tablet computer, various smart devices and the like. As shown in Fig. 7, the passenger flow statistics equipment may include a housing 701, a processor 702, a memory 703, a circuit board 704 and a power supply circuit 705, where the circuit board 704 is arranged in the space enclosed by the housing 701, and the processor 702 and the memory 703 are arranged on the circuit board 704; the power supply circuit 705 supplies power to each circuit or device of the passenger flow statistics equipment; the memory 703 stores executable program code; and the processor 702 runs a program corresponding to the executable program code by reading the executable program code stored in the memory 703, so as to perform the following steps: obtain a current frame image; determine the pedestrian target area image corresponding to the current frame image from the current frame image and an obtained background image; determine the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images; determine the pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and determine the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
In the several embodiments provided in the present application, it should be understood that the disclosed methods and apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely schematic; the division of the units is only a division of logical functions, and there may be other divisions in actual implementation, e.g. multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be electrical, mechanical or in other forms.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device or the like) to perform some of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above are preferred embodiments of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (11)

  1. A passenger flow statistics device, characterized in that it includes:
    an image acquisition module for obtaining a current frame image;
    a pedestrian target area image determining module for determining, from the current frame image and an obtained background image, a pedestrian target area image corresponding to the current frame image;
    a pedestrian target distribution determining module for determining the pedestrian target distribution in the current frame image using a pedestrian target classifier trained on HOG features of sample images;
    a pedestrian density and pixel weight determining sub-module for determining a pedestrian density and pixel weights corresponding to the current frame image based on the pedestrian target distribution; and
    a passenger flow counting module for determining the passenger count in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weights.
  2. The device according to claim 1, characterized in that the pedestrian target area image determining module includes:
    a foreground image acquisition sub-module for obtaining a foreground image according to the difference between the current frame image and the background image; and
    a pedestrian target area image acquisition sub-module for performing morphological filtering on the foreground image to obtain the pedestrian target area image.
  3. The device according to claim 2, characterized in that the foreground image acquisition sub-module includes:
    a pixel marking unit for, for each pixel in the current frame image, marking a first pixel as a background point if the difference between the pixel value of the first pixel and the pixel value of the second pixel corresponding to the first pixel in the background image lies within a preset range, and otherwise marking the first pixel as a foreground point; and
    an image acquisition unit for obtaining the foreground image according to the background points and foreground points marked in the current frame image.
  4. The device according to claim 1, characterized in that the pedestrian target distribution determining module includes:
    an image scaling sub-module for enlarging the current frame image by different amplification coefficients;
    an image traversal sub-module for performing window traversal on the enlarged current frame image using the pedestrian target classifier;
    a first coordinate acquisition sub-module for, during window traversal, if a first window is identified as containing a pedestrian target, obtaining a first coordinate of the pedestrian target in the first window and the current amplification coefficient of the current frame image;
    a second coordinate acquisition sub-module for scaling the first coordinate by the current amplification coefficient to obtain the second coordinate corresponding to the first coordinate in the current frame image; and
    a pedestrian target distribution acquisition sub-module for determining the pedestrian target distribution in the current frame image according to the pedestrian target points corresponding to the one or more second coordinates obtained.
  5. The device according to claim 4, characterized in that the pedestrian target distribution determining module further includes:
    a first size acquisition sub-module for obtaining a first size of the pedestrian target in the first window; and
    a second size acquisition sub-module for scaling the first size by the current amplification coefficient to obtain the second size corresponding to the first size in the current frame image.
  6. The device according to claim 5, characterized in that the pedestrian density and pixel weight determining sub-module includes:
    a count acquisition sub-module for obtaining, for each pedestrian target point in the current frame image, the number of first windows identified as containing a pedestrian target in which that pedestrian target point appears;
    a pedestrian density calculation sub-module for calculating the integral of that number over a predetermined region centred on each pixel in the current frame image, to obtain the pedestrian density corresponding to each pixel, the pixels including both the pedestrian target points and non-pedestrian target points;
    a size determining sub-module for determining, according to the second coordinates and second sizes corresponding to any two pedestrian target points, the estimated size of each non-pedestrian target point among the pixels of the current frame image; and
    a weight value acquisition sub-module for determining the weight values of each pedestrian target point and each non-pedestrian target point according to the second size corresponding to each pedestrian target point in the current frame image and the estimated size corresponding to each non-pedestrian target point.
  7. 7. device according to claim 1, it is characterised in that the passenger flow demographics module includes:
    Pixel determination sub-module, for determine in the current frame image with the positively related pixel of pedestrian's number;
    Weighted sum determination sub-module, for according to the pedestrian target area image, the pedestrian density and the pixel weight It is determined that the described and positively related pixel of pedestrian's number weighted sum;
    Passenger flow number calculating sub module, institute is determined for obtaining regression coefficient, and using the regression coefficient and the weighted sum State the passenger flow number in current frame image.
  8. 8. device according to claim 7, it is characterised in that the weighted sum determination sub-module determines according to the following equation The weighted sum:
    <mrow> <msub> <mi>n</mi> <mn>0</mn> </msub> <mo>=</mo> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>i</mi> <mo>=</mo> <mn>0</mn> </mrow> <mi>I</mi> </munderover> <munderover> <mo>&amp;Sigma;</mo> <mrow> <mi>j</mi> <mo>=</mo> <mn>0</mn> </mrow> <mi>J</mi> </munderover> <msub> <mi>B</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <msub> <mi>C</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> <msub> <mi>W</mi> <mrow> <mi>i</mi> <mi>j</mi> </mrow> </msub> </mrow>
    Wherein, (i, j) represents described and coordinate of the positively related pixel of pedestrian's number in current frame image;(I, J) is represented The size of the current frame image;BijRepresent described and the positively related pixel of pedestrian's number is in pedestrian's target area image Pixel value;CijRepresent the pedestrian density corresponding with the positively related pixel of pedestrian's number;WijRepresent described and pedestrian's number Weighted value corresponding to positively related pixel.
  9. The device according to claim 7, wherein the passenger flow number calculating submodule comprises:
    Regression coefficient computing unit, for obtaining the regression coefficient through the following process: (a) taking the product of an initial regression coefficient and the weighted sum as the first passenger flow number; (b) calculating the product of the i-th regression coefficient and the weighted sum and taking that product as the i-th passenger flow number; (c) if the product equals the (i-1)-th passenger flow number, taking the i-th regression coefficient as the regression coefficient; otherwise repeating (b)-(c) until the product obtained in (b) equals the (i-1)-th passenger flow number; wherein i = 2, 3, 4, ...;
    Passenger flow number computing unit, for determining the passenger flow number in the current frame image using the regression coefficient and the weighted sum.
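Claim 9 only fixes the stopping rule (iterate until the estimated count stops changing); it does not say how each successive regression coefficient is produced. The sketch below therefore takes the update rule as an injected callback and is an assumption-laden illustration, not the patented procedure.

```python
def find_regression_coefficient(weighted_sum_value, next_coefficient,
                                initial_coefficient=1.0, tol=1e-6,
                                max_iter=100):
    """Iterate coefficients per claim 9: stop once the i-th passenger flow
    number equals the (i-1)-th one (here: agrees within a small tolerance).

    next_coefficient -- assumed callback proposing the next coefficient from
                        the previous count estimate (the claim leaves this
                        update rule unspecified)."""
    prev_count = initial_coefficient * weighted_sum_value     # step (a)
    coefficient = initial_coefficient
    for _ in range(max_iter):
        coefficient = next_coefficient(prev_count)            # assumed update
        count = coefficient * weighted_sum_value              # step (b)
        if abs(count - prev_count) <= tol:                    # step (c)
            return coefficient
        prev_count = count
    return coefficient
```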
  10. A passenger flow statistics system, comprising:
    Image collection device, for obtaining a current frame image;
    Passenger flow statistic device, for determining, from the current frame image and an obtained background image, the pedestrian target area image corresponding to the current frame image; determining the pedestrian target distribution situation in the current frame image using a pedestrian target classifier trained on HOG features of sample images; determining the pedestrian density and pixel weight corresponding to the current frame image based on the pedestrian target distribution situation; and determining the passenger flow number in the current frame image according to the pedestrian target area image, the pedestrian density and the pixel weight.
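To show how the pieces of claim 10 fit together, here is a rough end-to-end sketch. It substitutes OpenCV's MOG2 background subtractor and its pre-trained HOG people detector for the background image and the classifier the patent trains on its own sample images, uses uniform pixel weights instead of the size-based weights of claim 6, and reuses the pedestrian_density helper from the claim-6 sketch above; every such substitution is an assumption made only for illustration.

```python
import cv2
import numpy as np

back_sub = cv2.createBackgroundSubtractorMOG2()
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def count_passengers(frame, regression_coefficient):
    # Pedestrian target area image: foreground of the current frame.
    B = (back_sub.apply(frame) > 0).astype(np.float32)

    # Pedestrian target distribution: HOG-based pedestrian detections.
    rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))

    # Per-pixel pedestrian density from the detections (claim-6 sketch).
    C = pedestrian_density(frame.shape[:2], [tuple(r) for r in rects])

    # Uniform weights stand in for the size-based weights of claim 6.
    W = np.ones_like(C)

    n0 = float((B * C * W).sum())        # weighted sum of claim 8
    return regression_coefficient * n0   # passenger flow number
```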
  11. The system according to claim 10, wherein the system further comprises:
    Passenger flow number processing unit, for presenting the passenger flow number to a user.
CN201610822384.4A 2016-09-13 2016-09-13 Passenger flow statistics device and system Active CN107818287B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610822384.4A CN107818287B (en) 2016-09-13 2016-09-13 Passenger flow statistics device and system

Publications (2)

Publication Number Publication Date
CN107818287A true CN107818287A (en) 2018-03-20
CN107818287B CN107818287B (en) 2022-02-18

Family

ID=61600446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610822384.4A Active CN107818287B (en) 2016-09-13 2016-09-13 Passenger flow statistics device and system

Country Status (1)

Country Link
CN (1) CN107818287B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015066618A1 (en) * 2013-11-01 2015-05-07 The Florida International University Board Of Trustees Context based algorithmic framework for identifying and classifying embedded images of follicle units
CN104318263A (en) * 2014-09-24 2015-01-28 南京邮电大学 Real-time high-precision people stream counting method
CN104463204A (en) * 2014-12-04 2015-03-25 四川九洲电器集团有限责任公司 Target quantity statistical method
CN105809206A (en) * 2014-12-30 2016-07-27 江苏慧眼数据科技股份有限公司 Pedestrian tracking method
CN105740819A (en) * 2016-01-29 2016-07-06 中国科学院信息工程研究所 Integer programming based crowd density estimation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Lempitsky, Victor and Andrew Zisserman: "Learning to count objects in images", Advances in Neural Information Processing Systems 23 (2010) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111046747A (en) * 2019-11-21 2020-04-21 北京金山云网络技术有限公司 Crowd counting model training method, crowd counting method, device and server
CN111046747B (en) * 2019-11-21 2023-04-18 北京金山云网络技术有限公司 Crowd counting model training method, crowd counting method, device and server
CN112446922A (en) * 2020-11-24 2021-03-05 厦门熵基科技有限公司 Pedestrian reverse judgment method and device for channel gate

Also Published As

Publication number Publication date
CN107818287B (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN109697435B (en) People flow monitoring method and device, storage medium and equipment
CN102542289B (en) Pedestrian volume statistical method based on plurality of Gaussian counting models
CN104794737B (en) A kind of depth information Auxiliary Particle Filter tracking
CN110020592A (en) Object detection model training method, device, computer equipment and storage medium
CN105512627A (en) Key point positioning method and terminal
CN105550678A (en) Human body motion feature extraction method based on global remarkable edge area
US10255673B2 (en) Apparatus and method for detecting object in image, and apparatus and method for computer-aided diagnosis
CN109754009B (en) Article identification method, article identification device, vending system and storage medium
CN106097393A (en) A kind of based on multiple dimensioned and adaptive updates method for tracking target
CN104781849A (en) Fast initialization for monocular visual simultaneous localization and mapping (SLAM)
CN104766320A (en) Bernoulli smoothing weak target detection and tracking method under thresholding measuring
CN108961330A (en) The long measuring method of pig body and system based on image
CN104463240B (en) A kind of instrument localization method and device
CN110307903A (en) A kind of method of the contactless temperature dynamic measurement of poultry privileged site
CN110503662A (en) Tracking and Related product
CN109684986A (en) A kind of vehicle analysis method and system based on automobile detecting following
CN106569946A (en) Mobile terminal performance testing method and system
CN107818287A (en) A kind of passenger flow statistic device and system
CN110930384A (en) Crowd counting method, device, equipment and medium based on density information
CN114219936A (en) Object detection method, electronic device, storage medium, and computer program product
CN105632003B (en) The real-time estimating method and device of a kind of clearance port queuing time
CN104123569B (en) Video person number information statistics method based on supervised learning
CN113327269A (en) Unmarked cervical vertebra movement detection method
CN108765463A (en) A kind of moving target detecting method calmodulin binding domain CaM extraction and improve textural characteristics
Cetindag et al. Transfer Learning Methods for Using Textural Features in Histopathological Image Classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant