CN105023231A - Bus data acquisition method based on video recognition and cell phone GPS - Google Patents
Bus data acquisition method based on video recognition and cell phone GPS
- Publication number
- CN105023231A (application number CN201510435604.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- target
- bus
- public transport
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02W—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
- Y02W30/00—Technologies for solid waste management
- Y02W30/50—Reuse, recycling or recovery technologies
- Y02W30/82—Recycling of waste of electrical or electronic equipment [WEEE]
Landscapes
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a bus data acquisition method based on video recognition and cell phone GPS. The method comprises the steps of acquiring bus video data and cell phone GPS data; obtaining bus operating routes and user boarding and alighting data from the bus video data; obtaining user travel path data from the cell phone GPS data; and matching the user travel path data against the bus operating routes and the boarding and alighting data to obtain bus trip origin-destination (OD) data, real-time bus load factor data and route passenger flow data. By combining video recognition with cell phone GPS, the invention reduces the consumption of human and material resources, overcomes the inaccuracy that arises when cell phone positioning is matched against bus positioning alone, and broadens the range of bus data that can be surveyed. Bus trip OD data, real-time bus load factor data and route passenger flow data can be obtained accurately, providing data support for bus route planning, bus resource allocation and schedule optimization, and supplying important data for real-time bus dispatching.
Description
Technical field
The invention belongs to the technical field of public transport data surveying, and in particular relates to a bus data acquisition method based on video recognition and cell phone GPS.
Background art
With social and economic development and ever faster urbanization, traffic problems are becoming increasingly severe. Public transport is an effective means of relieving urban traffic problems, so rational planning and dispatching of bus operations is particularly important. A well-designed bus operation system improves service quality and attracts citizens to travel by bus, which reduces traffic congestion; it also lowers the operating cost of the bus company; and by reducing private car trips it helps relieve environmental problems and supports sustainable development.
Passenger travel data and vehicle load factors are important data foundations for public transport planning and dispatching. Existing acquisition methods fall mainly into two kinds: manual surveys, and derivation from bus IC card swipe records. The former is costly, error-prone because of manual collection, and in particular lags behind events, so real-time data cannot be obtained. The main problem of the latter is that IC card records do not capture alighting information, so alighting stops must be inferred with probabilistic derivation models, which inevitably introduces errors or even outright mistakes.
Summary of the invention
The object of the invention is to solve the problems of the prior art, such as the heavy workload and errors of public transport data surveys and the limited types of data that can be surveyed, by providing a bus data acquisition method based on video recognition and cell phone GPS.
The technical solution of the invention is a bus data acquisition method based on video recognition and cell phone GPS, comprising the following steps:
A. acquiring bus video data and cell phone GPS data;
B. obtaining bus operating routes and user boarding and alighting data from the bus video data;
C. obtaining user travel path data from the cell phone GPS data;
D. matching the user travel path data against the bus operating routes and the user boarding and alighting data to obtain bus trip OD data, real-time bus load factor data and route passenger flow data.
Further, in step B the user boarding and alighting data are obtained from the bus video data through the following sub-steps:
B1. extracting the target motion region by combining the inter-frame difference method and the background difference method;
B2. segmenting targets with a contour-marked watershed algorithm;
B3. recognizing the targets segmented in step B2 with a boundary-contour-based target recognition algorithm;
B4. tracking the targets recognized in step B3 with a target tracking method, and counting according to the target motion characteristics to obtain the user boarding and alighting data.
Further, step B1 extracts the target motion region with the inter-frame difference method and the background difference method through the following sub-steps:
B11. processing the current frame of the bus video data with the inter-frame difference method and the background difference method to obtain an inter-frame difference image and a background difference image;
B12. applying an OR operation to the inter-frame difference image and the background difference image to obtain an OR image;
B13. dividing both the inter-frame difference image and the OR image into grids;
B14. counting the non-zero pixels in each small grid cell of the inter-frame difference image and comparing the count against a set threshold; if the count is below the threshold, the cell is judged to be a noise cell; if the count is greater than or equal to the threshold, the cell is judged to be a normal cell;
B15. setting to zero the pixels of the OR image in the cells that correspond to noise cells of the inter-frame difference image, thereby extracting the target motion region image.
Further, step B2 segments targets with the contour-marked watershed algorithm through the following sub-steps:
B21. applying contour segmentation to the target motion region;
B22. applying watershed flooding to the targets;
B23. applying morphological processing to the target motion region image;
B24. extracting the contour of each colour block and recording each candidate target to obtain the target segmentation image.
Further, step B3 recognizes the segmented targets with the boundary-contour-based target recognition algorithm through the following sub-steps:
B31. screening the targets and removing those that do not match the expected target features and positions;
B32. performing front/rear recognition and left/right recognition on the targets to obtain the target recognition result.
Further, in step D the user travel path data are matched against the bus operating routes and the user boarding and alighting data to obtain the bus trip OD data, real-time bus load factor data and route passenger flow data through the following sub-steps:
D1. matching the user travel path data against the bus operating routes to obtain a set of candidate bus trip paths;
D2. matching the time at which each path in the set arrives at each bus stop against the time at which the bus on the corresponding operating route arrives at that stop, to obtain the bus trip OD;
D3. obtaining the real-time load factor of each bus from the user boarding and alighting data, and combining it with the bus trip OD obtained in step D2 to obtain the route passenger flow data.
The invention has the following beneficial effects: by combining video recognition with cell phone GPS, the invention reduces the consumption of human and material resources, solves the inaccuracy that arises when cell phone positioning is matched against bus positioning alone, and broadens the range of bus data that can be surveyed. Bus trip OD data, real-time bus load factor data and route passenger flow data can be obtained accurately, providing data support for bus network planning, bus resource allocation and schedule optimization, and at the same time supplying important data for real-time bus dispatching.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the bus data acquisition method based on video recognition and cell phone GPS according to the invention.
Detailed description of the embodiments
In order to make the object, technical solution and advantages of the invention clearer, the invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only intended to explain the invention and not to limit it.
Fig. 1 shows a schematic flow chart of the bus data acquisition method based on video recognition and cell phone GPS according to the invention. The method comprises the following steps:
A. acquiring bus video data and cell phone GPS data;
B. obtaining bus operating routes and user boarding and alighting data from the bus video data;
C. obtaining user travel path data from the cell phone GPS data;
D. matching the user travel path data against the bus operating routes and the user boarding and alighting data to obtain bus trip OD data, real-time bus load factor data and route passenger flow data.
In step A, the bus video data can be obtained by filming the door regions of the bus. Since the doors are the only passage for boarding and alighting passengers, the door step region is used as the monitored region: moving targets can be detected there, boarding or alighting behaviour can be judged from the target trajectories, and the targets can then be counted. When filming, the camera is mounted at a tilt. A tilted view captures most of the body features of a person, avoids partial occlusion, and ensures that the target features remain continuous across the video sequence.
In step B, since the planned bus operating route and its stops are fixed, the invention obtains the bus operating route from the stop-arrival behaviour of the bus observed in the video data and the time at which each stop is reached. Because some stops close the doors after the waiting passengers have boarded and then have to reopen them for a few late passengers, the invention uses the effective door-opening moment as the criterion for stop arrival. The effective door-opening moment is determined by comparing the time elapsed since the previous door opening with a threshold: if the elapsed time is below the threshold, the corresponding moment is not an effective door-opening moment; if the elapsed time is greater than or equal to the threshold, it is an effective door-opening moment.
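A minimal sketch of this effective door-opening rule, assuming door openings arrive as timestamps in seconds; the function name, the 20 s gap and the list-based interface are illustrative choices, not values taken from the patent:

```python
def effective_opening_times(door_open_times, min_gap_s=20.0):
    """Keep only 'effective' door openings: openings that follow the previous
    opening by at least min_gap_s seconds.  Closer openings are treated as
    re-openings at the same stop and discarded."""
    effective = []
    last = None
    for t in sorted(door_open_times):
        if last is None or (t - last) >= min_gap_s:
            effective.append(t)
        last = t  # the gap is measured from the most recent opening either way
    return effective


# Openings at 0 s and 8 s belong to the same stop; 310 s is the next stop.
print(effective_opening_times([0.0, 8.0, 310.0]))  # -> [0.0, 310.0]
```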
Because the head features of individual targets vary widely, a counting method based purely on head features is rather limited. Viewed from the video sequence, the position features of a target during boarding or alighting are better suited as criteria for target detection and counting. In the invention, the user boarding and alighting data are obtained from the bus video data through the following sub-steps:
B1. extracting the target motion region with the inter-frame difference method and the background difference method;
B2. segmenting targets with a contour-marked watershed algorithm;
B3. recognizing the targets segmented in step B2 with a boundary-contour-based target recognition algorithm;
B4. tracking the targets recognized in step B3 with a target tracking method, and counting according to the target motion characteristics to obtain the user boarding and alighting data.
In step B1, the invention extracts the target motion region with the inter-frame difference method and the background difference method. The computation is simple, the detection effect is good, and compared with either algorithm alone the target region is extracted quickly and accurately with less background interference.
The inter-frame difference method stores the two most recent frames in separate buffers according to the parity of the frame number and obtains a difference image as the difference of the two frames. Since the purpose of the differencing is to extract the moving region and the differencing itself already suppresses most static content, the invention binarizes the difference image with a fixed threshold. The inter-frame difference is used here to extract the main target region: its insensitivity to the background removes background interference from the target, so the threshold can be chosen relatively low, and it can be adjusted to suit different applications. Obtaining a difference image by the inter-frame difference method is a common technique for those skilled in the art and is not described further.
Because the bus stops at different stops, the filming environment keeps changing. The invention therefore also applies the average background method to the captured video to provide an initial background, and to re-initialize it whenever background extraction fails: the method accumulates the pixel values of successive video frames and takes the mean value at each pixel as the background value of that pixel, so that the initial background is extracted successfully. After the initial background has been extracted, it is dynamically updated using the non-foreground regions of each frame.
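The average background method and its update are standard operations; the sketch below, assuming 8-bit grayscale frames and OpenCV, shows one common way to realize them. The learning rate alpha and the use of cv2.accumulateWeighted are implementation choices, not prescribed by the patent:

```python
import cv2
import numpy as np

def init_background(frames):
    """Mean of a list of grayscale frames, used as the initial background."""
    acc = np.zeros(frames[0].shape, dtype=np.float32)
    for f in frames:
        acc += f.astype(np.float32)
    return acc / len(frames)

def update_background(background, frame, foreground_mask, alpha=0.05):
    """Blend the current frame into the float32 background only where no
    foreground was detected, so moving passengers do not pollute the model."""
    cv2.accumulateWeighted(frame.astype(np.float32), background, alpha,
                           mask=cv2.bitwise_not(foreground_mask))
    return background
```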
The background difference method basically guarantees the completeness of target detection, and on that basis the inter-frame difference method is used to remove background noise. Combining background difference with inter-frame difference avoids the defects of background change and incomplete target extraction. The invention extracts the target motion region with the inter-frame difference method and the background difference method through the following sub-steps:
B11. processing the current frame of the bus video data with the inter-frame difference method and the background difference method to obtain an inter-frame difference image and a background difference image;
B12. applying an OR operation to the inter-frame difference image and the background difference image to obtain an OR image;
B13. dividing both the inter-frame difference image and the OR image into grids;
B14. counting the non-zero pixels in each small grid cell of the inter-frame difference image and comparing the count against a set threshold; if the count is below the threshold, the cell is judged to be a noise cell; if the count is greater than or equal to the threshold, the cell is judged to be a normal cell;
B15. setting to zero the pixels of the OR image in the cells that correspond to noise cells of the inter-frame difference image, thereby extracting the target motion region image.
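As a concrete illustration of sub-steps B11-B15, the sketch below combines the two binary difference images and suppresses noise cells on a grid; the thresholds, the 16x16 cell size and the OpenCV-based interface are illustrative assumptions:

```python
import cv2

def motion_region(prev_frame, cur_frame, background, diff_thresh=15,
                  grid=16, min_pixels=20):
    """Steps B11-B15 on 8-bit grayscale images: OR the inter-frame and
    background differences, then zero every grid cell whose inter-frame
    difference contains too few moving pixels."""
    # B11: binary inter-frame and background difference images
    frame_diff = cv2.threshold(cv2.absdiff(cur_frame, prev_frame),
                               diff_thresh, 255, cv2.THRESH_BINARY)[1]
    back_diff = cv2.threshold(cv2.absdiff(cur_frame, cv2.convertScaleAbs(background)),
                              diff_thresh, 255, cv2.THRESH_BINARY)[1]
    # B12: the OR image keeps every pixel flagged by either method
    or_img = cv2.bitwise_or(frame_diff, back_diff)

    # B13-B15: cells with too few inter-frame difference pixels count as noise
    h, w = frame_diff.shape
    for y in range(0, h, grid):
        for x in range(0, w, grid):
            if cv2.countNonZero(frame_diff[y:y + grid, x:x + grid]) < min_pixels:
                or_img[y:y + grid, x:x + grid] = 0
    return or_img
```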
In step B2, to solve the problem of over-segmentation, the parts of the image where the pixel values change most strongly are detected first and taken as candidate targets, and the regions around the detected contour lines are grown to segment the targets. To cope with the severe front/back and left/right adhesion between targets during peak hours, the invention proposes an improved watershed algorithm: on top of the traditional watershed algorithm, markers derived from contour segmentation avoid blind growth of the initial regions and reduce the number of spurious targets, while the algorithm's ability to separate adhering targets is preserved. The invention segments targets with the contour-marked watershed algorithm through the following sub-steps:
B21. applying contour segmentation to the target motion region;
B22. applying watershed flooding to the targets;
B23. applying morphological processing to the target motion region image;
B24. extracting the contour of each colour block and recording each candidate target to obtain the target segmentation image.
In step B21, the purpose of the contour segmentation of the target motion region is to pick out abrupt pixel changes, not to extract complete contours, and also to reduce the influence of spurious targets. The invention can therefore use a relatively high threshold (30 to 60) when the Canny algorithm is used for the contour segmentation. Applying the Canny algorithm to the target motion region is a common technique for those skilled in the art and is not described further.
In step B23, the morphological processing of the target motion region image comprises dilation and erosion operations. Erosion removes one or more layers of pixels along the object boundary; if there are holes inside a connected region, erosion enlarges them. Dilation expands the object boundary by one layer.
In step B24, when the colour-block contours are extracted, the colour blocks also need to be screened and the blocks covering an excessively large region are rejected.
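The patent derives its watershed markers from the Canny contours above. The sketch below instead uses the standard distance-transform marker recipe, purely to show how cv2.watershed separates adhering blobs once markers are available; it is not the contour-marked variant proposed by the invention:

```python
import cv2
import numpy as np

def split_touching_blobs(motion_mask):
    """Marker-based watershed on a binary motion mask (uint8, values 0/255)."""
    kernel = np.ones((3, 3), np.uint8)
    sure_bg = cv2.dilate(motion_mask, kernel, iterations=3)

    # peaks of the distance transform are almost certainly inside a person
    dist = cv2.distanceTransform(motion_mask, cv2.DIST_L2, 5)
    sure_fg = cv2.threshold(dist, 0.5 * dist.max(), 255, cv2.THRESH_BINARY)[1]
    sure_fg = sure_fg.astype(np.uint8)
    unknown = cv2.subtract(sure_bg, sure_fg)

    # one marker label per sure-foreground blob; 0 marks the unknown band
    markers = cv2.connectedComponents(sure_fg)[1] + 1
    markers[unknown == 255] = 0

    color = cv2.cvtColor(motion_mask, cv2.COLOR_GRAY2BGR)
    return cv2.watershed(color, markers)   # watershed lines are labelled -1
```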
In step B3, the invention recognizes the segmented targets with the boundary-contour-based target recognition algorithm through the following sub-steps:
B31. screening the targets and removing those that do not match the expected target features and positions;
B32. performing front/rear recognition and left/right recognition on the targets to obtain the target recognition result.
In step B31, the screening rules are: remove targets whose contour length is less than 300; remove targets whose bounding box is wider or taller than half the image; and remove targets whose left vertex is closer than one quarter of the image width to the left image boundary, or whose right vertex is closer than one quarter of the image width to the right image boundary.
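A sketch of these screening rules, assuming the candidate targets are OpenCV contours; the constants 300, 1/2 and 1/4 follow the values quoted above, while the function name and interface are illustrative:

```python
import cv2

def screen_targets(contours, img_w, img_h, min_perimeter=300):
    """Drop candidate contours that cannot plausibly be a boarding passenger."""
    kept = []
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if cv2.arcLength(c, True) < min_perimeter:
            continue                              # contour shorter than 300 px
        if w > img_w / 2 or h > img_h / 2:
            continue                              # box larger than half the frame
        if x < img_w / 4 or (img_w - (x + w)) < img_w / 4:
            continue                              # hugging the left/right border
        kept.append(c)
    return kept
```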
In step B32, when the vertical extent of a target is close to the image height, the invention uses front/rear recognition, specifically: first the mean pixel values of the target are computed; then the targets whose pixel-value variance is less than 10 and whose grey value is close to 10 are selected; the target positions are evaluated and the targets are divided into a front target and a rear target according to the midline position; finally the upper and lower boundaries of the front and rear targets are determined from the midline position.
Left/right recognition of the targets is then carried out on the basis of the front/rear recognition result.
When the front/rear recognition result is one target, all candidate targets are sorted in increasing order of the X value of their vertices and the minimum X value is recorded; all candidate targets are then sorted in increasing order of vertex X value plus the corresponding width, and the maximum value H is recorded; the vertical axes corresponding to the minimum X value and the maximum H value are marked as the left and right boundaries. Whether two targets stand side by side is judged from the distance between the X value and the H value: if the distance is greater than half the image width there are two side-by-side targets, otherwise there is one. If the result is one target, the candidates are sorted by X value, X plus width, Y value and Y plus height; the minimum X value is the left boundary of the target, the maximum X plus width is the right boundary, the minimum Y value is the upper boundary and the maximum Y plus height is the lower boundary. If the result is two targets, the dividing line nearest the midline of the two boundaries is sought: the distances between the left and right vertices of the targets and the midline are calculated and examined in increasing order; if a distance belongs to a left vertex, the vertical axis of that vertex is taken as the boundary of the right target, and if it belongs to a right vertex, the vertical axis of that vertex is taken as the boundary of the left target; this is repeated until the boundaries of both the left and the right target have been determined. With the left and right boundaries determined, the candidate targets whose vertex X values lie between the boundaries of the region of target 1 are sorted by Y value and by Y plus height; the minimum Y value and the maximum Y plus height are the upper and lower boundaries of target 1, and the upper and lower boundaries of target 2 are determined in the same way.
When the front/rear recognition result is two targets, the left/right boundaries are first analysed within the front target region: the candidate targets of the front region, i.e. those whose feature-point Y values lie between Y1 and Y2, are selected, and the targets in this region are determined with the left/right boundary method described above. The left/right boundaries within the rear target region are analysed in the same way: the targets whose feature-point Y values lie between Y3 and Y4 are assigned to the rear target region, and their boundaries are again determined with the left/right boundary method.
In step B4, the invention extracts features only from the recognized target regions, which reduces interference from the background and from other targets, keeps the operation simple and makes tracking efficient. The left vertex of the recognized target region is chosen as the tracking feature point: since step B3 guarantees that the targets have been recognized, the core of tracking is to capture how the spatial distance of the same target changes between frames, and because this point lies on the target's contour line it is unique to each target and reflects the change of the target's spatial position during motion. The invention tracks the targets recognized in step B3 as follows:
First, the number N of targets recognized in step B3 and the left-vertex position of each target are recorded.
Then, after the target recognition of a new frame has been completed, it is judged whether the number of targets in the new frame has increased relative to the previous frame.
When the number of targets in the new frame is unchanged or has decreased relative to the previous frame, the horizontal and vertical distances between the target feature points of the two frames are computed first; when the distances are below the thresholds, the detections are judged to be the same target and the target position is updated. The update strategy is to compare the distances of the current and previous positions from the initial position and to keep, as the latest position, the position whose distance is larger. Targets of the previous frame that cannot be matched are judged to have finished tracking. Finally, for targets of the new frame that cannot be matched, the distances between the target and the front and rear boundaries are computed; if the threshold condition is met, the target is judged to be a new target, otherwise it is judged to be an interfering target.
When the number of targets in the new frame has increased relative to the previous frame, the horizontal and vertical distances between the target feature points of the two frames are computed first; the nearest detection whose distance is below the threshold is judged to be the same target, and the target position is updated with the same strategy as above. Targets of the previous frame that cannot be matched are judged to have finished tracking, and targets of the new frame that cannot be matched are judged to be new targets when the distance between the target and the front and rear boundaries meets the threshold condition.
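The frame-to-frame matching just described can be sketched as a nearest-feature-point assignment. Tracks and detections are reduced to (x, y) left-vertex tuples; the distance thresholds, the track bookkeeping and all names below are illustrative assumptions rather than the patent's exact procedure:

```python
def update_tracks(tracks, detections, max_dx=40, max_dy=60):
    """tracks: list of (track_id, (x, y), trajectory); detections: list of (x, y).
    Returns (updated_tracks, finished_tracks)."""
    taken, updated, finished = set(), [], []
    for tid, (tx, ty), traj in tracks:
        best, best_d = None, None
        for i, (dx, dy) in enumerate(detections):
            if i in taken or abs(dx - tx) > max_dx or abs(dy - ty) > max_dy:
                continue
            d = abs(dx - tx) + abs(dy - ty)
            if best is None or d < best_d:
                best, best_d = i, d
        if best is None:
            finished.append((tid, (tx, ty), traj))       # lost: tracking ends
        else:
            taken.add(best)
            updated.append((tid, detections[best], traj + [detections[best]]))
    # unmatched detections start new tracks (new targets)
    next_id = max((t[0] for t in tracks), default=-1) + 1
    for i, det in enumerate(detections):
        if i not in taken:
            updated.append((next_id, det, [det]))
            next_id += 1
    return updated, finished
```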
The counting method is based on complete target trajectories, which takes the complexity of passenger movement during peak hours into account and improves the accuracy of the algorithm. Counting according to the target motion characteristics is specifically: first, the start and end positions of a track whose tracking has ended are extracted, and the vertical distance between the start and the end is computed. If the distance is greater than the threshold and positive, the alighting count is increased by 1; if its absolute value is greater than the threshold and the distance is negative, the boarding count is increased by 1. If the vertical distance is below the threshold, the distance between the start position of the target and the upper image boundary is evaluated: if that distance is greater than the threshold, the target is not counted; if the vertical distance is positive but below the threshold and the distance between the end point and the upper boundary is below the threshold, the alighting count is increased by 1; if the vertical distance is negative with an absolute value below the threshold and the distance between the start point and the upper boundary is below the threshold, the boarding count is increased by 1.
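A sketch of this counting rule applied to each finished track; whether a positive vertical displacement means alighting or boarding depends on how the camera is mounted, and the two thresholds are illustrative assumptions:

```python
def classify_trip(start_y, end_y, top_y=0, disp_thresh=80, edge_thresh=40):
    """Classify a finished track as 'board', 'alight' or 'ignore' from the
    vertical displacement of its left-vertex feature point (image y grows
    downward; here a positive displacement is treated as alighting)."""
    dy = end_y - start_y
    if abs(dy) >= disp_thresh:
        return "alight" if dy > 0 else "board"
    # small displacement: use closeness of the endpoints to the upper boundary
    if dy > 0 and abs(end_y - top_y) < edge_thresh:
        return "alight"
    if dy < 0 and abs(start_y - top_y) < edge_thresh:
        return "board"
    return "ignore"


counts = {"board": 0, "alight": 0, "ignore": 0}
for s, e in [(30, 150), (160, 35), (100, 120)]:        # example tracks
    counts[classify_trip(s, e)] += 1
print(counts)  # -> {'board': 1, 'alight': 1, 'ignore': 1}
```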
In step C, obtaining the user travel path data from the cell phone GPS data is a common technique for those skilled in the art and is not described further.
In step D, the invention matches the user travel path data against the bus operating routes and the user boarding and alighting data to obtain the bus trip OD data, real-time bus load factor data and route passenger flow data through the following sub-steps:
D1. matching the user travel path data against the bus operating routes to obtain a set of candidate bus trip paths;
D2. matching the time at which each path in the set arrives at each bus stop against the time at which the bus on the corresponding operating route arrives at that stop, to obtain the bus trip OD;
D3. obtaining the real-time load factor of each bus from the user boarding and alighting data, and combining it with the bus trip OD obtained in step D2 to obtain the route passenger flow data.
In step D1, the invention matches the user travel path data against the bus operating routes and screens the user travel paths to obtain the set of candidate bus trip paths.
In step D2, the invention matches the time at which each path in the set arrives at each bus stop against the time at which the bus on the corresponding operating route arrives at that stop. If the stop-arrival times of one or more bus operating routes match the path successfully, the path is a bus trip path. The bus trip paths obtained by this screening yield the bus trip OD.
In step D3, the real-time boarding and alighting counts of each bus are obtained from the user boarding and alighting data, which gives the real-time load factor of the bus. Combined with the bus trip OD obtained in step D2, the real-time passenger flow data of each bus route are obtained.
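A sketch of step D as described above: a running on-board count per stop gives the real-time load factor, and a candidate path counts as a bus trip when its stop arrival times agree with the bus's arrival times within a tolerance. The vehicle capacity, the 120 s tolerance and all names are assumptions made for illustration:

```python
def real_time_load(boarded, alighted, capacity=60):
    """Stop-by-stop load factors from the video boarding/alighting counts."""
    on_board, loads = 0, []
    for b, a in zip(boarded, alighted):
        on_board += b - a
        loads.append(on_board / capacity)
    return loads


def matches_route(user_arrivals, bus_arrivals, tol_s=120):
    """True when the user's GPS arrival time at every stop lies within tol_s
    seconds of the bus's arrival time at the same stop."""
    return len(user_arrivals) == len(bus_arrivals) and all(
        abs(u - b) <= tol_s for u, b in zip(user_arrivals, bus_arrivals))


print(real_time_load([12, 8, 5], [0, 3, 9]))   # -> [0.2, 0.2833..., 0.2166...]
```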
Those of ordinary skill in the art will appreciate that the embodiments described here are intended to help the reader understand the principle of the invention, and it should be understood that the scope of protection of the invention is not limited to these particular statements and embodiments. Those of ordinary skill in the art can make various other specific variations and combinations that do not depart from the essence of the invention on the basis of the technical teaching disclosed here, and such variations and combinations remain within the scope of protection of the invention.
Claims (6)
1. A bus data acquisition method based on video recognition and cell phone GPS, characterized by comprising the following steps:
A. acquiring bus video data and cell phone GPS data;
B. obtaining bus operating routes and user boarding and alighting data from the bus video data;
C. obtaining user travel path data from the cell phone GPS data;
D. matching the user travel path data against the bus operating routes and the user boarding and alighting data to obtain bus trip OD data, real-time bus load factor data and route passenger flow data.
2. The bus data acquisition method based on video recognition and cell phone GPS as claimed in claim 1, characterized in that in step B the user boarding and alighting data are obtained from the bus video data through the following sub-steps:
B1. extracting the target motion region by combining the inter-frame difference method and the background difference method;
B2. segmenting targets with a contour-marked watershed algorithm;
B3. recognizing the targets segmented in step B2 with a boundary-contour-based target recognition algorithm;
B4. tracking the targets recognized in step B3 with a target tracking method, and counting according to the target motion characteristics to obtain the user boarding and alighting data.
3. The bus data acquisition method based on video recognition and cell phone GPS as claimed in claim 2, characterized in that step B1 extracts the target motion region with the inter-frame difference method and the background difference method through the following sub-steps:
B11. processing the current frame of the bus video data with the inter-frame difference method and the background difference method to obtain an inter-frame difference image and a background difference image;
B12. applying an OR operation to the inter-frame difference image and the background difference image to obtain an OR image;
B13. dividing both the inter-frame difference image and the OR image into grids;
B14. counting the non-zero pixels in each small grid cell of the inter-frame difference image and comparing the count against a set threshold; if the count is below the threshold, the cell is judged to be a noise cell; if the count is greater than or equal to the threshold, the cell is judged to be a normal cell;
B15. setting to zero the pixels of the OR image in the cells that correspond to noise cells of the inter-frame difference image, thereby extracting the target motion region image.
4. The bus data acquisition method based on video recognition and cell phone GPS as claimed in claim 2, characterized in that step B2 segments targets with the contour-marked watershed algorithm through the following sub-steps:
B21. applying contour segmentation to the target motion region;
B22. applying watershed flooding to the targets;
B23. applying morphological processing to the target motion region image;
B24. extracting the contour of each colour block and recording each candidate target to obtain the target segmentation image.
5. The bus data acquisition method based on video recognition and cell phone GPS as claimed in claim 2, characterized in that step B3 recognizes the segmented targets with the boundary-contour-based target recognition algorithm through the following sub-steps:
B31. screening the targets and removing those that do not match the expected target features and positions;
B32. performing front/rear recognition and left/right recognition on the targets to obtain the target recognition result.
6. The bus data acquisition method based on video recognition and cell phone GPS as claimed in claim 1, characterized in that in step D the user travel path data are matched against the bus operating routes and the user boarding and alighting data to obtain the bus trip OD data, real-time bus load factor data and route passenger flow data through the following sub-steps:
D1. matching the user travel path data against the bus operating routes to obtain a set of candidate bus trip paths;
D2. matching the time at which each path in the set arrives at each bus stop against the time at which the bus on the corresponding operating route arrives at that stop, to obtain the bus trip OD;
D3. obtaining the real-time load factor of each bus from the user boarding and alighting data, and combining it with the bus trip OD obtained in step D2 to obtain the route passenger flow data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510435604.3A CN105023231B (en) | 2015-07-23 | 2015-07-23 | Public transport data capture method based on video identification and cellphone GPS |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105023231A true CN105023231A (en) | 2015-11-04 |
CN105023231B CN105023231B (en) | 2018-07-17 |
Family
ID=54413179
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510435604.3A Expired - Fee Related CN105023231B (en) | 2015-07-23 | 2015-07-23 | Public transport data capture method based on video identification and cellphone GPS |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105023231B (en) |
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004062763A (en) * | 2002-07-31 | 2004-02-26 | Nec Soft Ltd | Bus traffic control system and bus traffic control method |
JP2004227207A (en) * | 2003-01-22 | 2004-08-12 | Kyosan Electric Mfg Co Ltd | System, terminal, program, and method for managing bus travelling |
JP2004227241A (en) * | 2003-01-22 | 2004-08-12 | Kyosan Electric Mfg Co Ltd | Bus travelling schedule preparing device |
CN101256686A (en) * | 2008-03-26 | 2008-09-03 | 河北工业大学 | Device and method for collecting public traffic vehicle passenger flow |
CN101615207A (en) * | 2009-07-10 | 2009-12-30 | 重庆大学 | A kind of method of obtaining bus stations with bus-waiting and bus-IC-card-holding passengers |
CN101673423A (en) * | 2009-08-26 | 2010-03-17 | 深圳市飞瑞斯科技有限公司 | Bus passenger number statistical system capable of analyzing by means of video |
CN101976500A (en) * | 2010-10-25 | 2011-02-16 | 中国科学院深圳先进技术研究院 | Method and system for analyzing traffic network |
CN102097002A (en) * | 2010-11-22 | 2011-06-15 | 东南大学 | Method and system for acquiring bus stop OD based on IC card data |
CN102013163A (en) * | 2010-11-25 | 2011-04-13 | 广州通易科技有限公司 | Method for bus origin-destination (OD) investigation by using mobile phone base station data and operating vehicle global position system (GPS) data |
CN102339488A (en) * | 2011-04-29 | 2012-02-01 | 重庆市科学技术研究院 | Radio frequency identification (RFID) technology-based public transport passenger flow information acquisition system and method |
CN103177561A (en) * | 2011-12-26 | 2013-06-26 | 北京掌城科技有限公司 | Method and system for generating bus real-time traffic status |
CN102592339A (en) * | 2012-02-21 | 2012-07-18 | 重庆市科学技术研究院 | System and method for acquiring bus passenger flow information |
CN102622798A (en) * | 2012-03-28 | 2012-08-01 | 东南大学 | Passenger flow statistical analysis system |
CN103593974A (en) * | 2013-11-06 | 2014-02-19 | 福建工程学院 | Bus passenger capacity collection method based on locating information |
CN103810851A (en) * | 2014-01-23 | 2014-05-21 | 广州地理研究所 | Mobile phone location based traffic mode identification method |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106683404A (en) * | 2016-12-06 | 2017-05-17 | 华南理工大学 | Method for obtaining bus passenger flow OD (origin-destination) matrix through mobile phone location technology |
CN106683404B (en) * | 2016-12-06 | 2019-10-18 | 华南理工大学 | A method of bus passenger flow OD is obtained by Mobile Location Technology |
CN108242146A (en) * | 2016-12-27 | 2018-07-03 | 北京亿阳信通科技有限公司 | Based on mass transit card data analysis passenger ride website it is timely between method and system |
CN108242146B (en) * | 2016-12-27 | 2020-10-16 | 北京亿阳信通科技有限公司 | Method and system for analyzing passenger bus station and time based on bus card data |
CN107357422A (en) * | 2017-06-28 | 2017-11-17 | 深圳先进技术研究院 | Video camera projection interaction touch control method, device and computer-readable recording medium |
CN107357422B (en) * | 2017-06-28 | 2023-04-25 | 深圳先进技术研究院 | Camera-projection interactive touch control method, device and computer readable storage medium |
CN108922178A (en) * | 2018-07-01 | 2018-11-30 | 北京工业大学 | The real-time load factor calculation method of public transit vehicle based on public transport multi-source data |
CN117132948A (en) * | 2023-10-27 | 2023-11-28 | 南昌理工学院 | Scenic spot tourist flow monitoring method, system, readable storage medium and computer |
CN117132948B (en) * | 2023-10-27 | 2024-01-30 | 南昌理工学院 | Scenic spot tourist flow monitoring method, system, readable storage medium and computer |
Also Published As
Publication number | Publication date |
---|---|
CN105023231B (en) | 2018-07-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102542289B (en) | Pedestrian volume statistical method based on plurality of Gaussian counting models | |
CN105336169B (en) | A kind of method and system that traffic congestion is judged based on video | |
Lai et al. | Image-based vehicle tracking and classification on the highway | |
CN105023231A (en) | Bus data acquisition method based on video recognition and cell phone GPS | |
CN104268519B (en) | Image recognition terminal and its recognition methods based on pattern match | |
CN103150559B (en) | Head recognition and tracking method based on Kinect three-dimensional depth image | |
CN104978567B (en) | Vehicle checking method based on scene classification | |
CN110517288A (en) | Real-time target detecting and tracking method based on panorama multichannel 4k video image | |
CN100573618C (en) | A kind of traffic intersection four-phase vehicle flow detection method | |
CN104517095B (en) | A kind of number of people dividing method based on depth image | |
CN103049787A (en) | People counting method and system based on head and shoulder features | |
CN103150903B (en) | Video vehicle detection method for adaptive learning | |
CN107491720A (en) | A kind of model recognizing method based on modified convolutional neural networks | |
Pan et al. | Traffic surveillance system for vehicle flow detection | |
CN102289948A (en) | Multi-characteristic fusion multi-vehicle video tracking method under highway scene | |
CN101567097B (en) | Bus passenger flow automatic counting method based on two-way parallactic space-time diagram and system thereof | |
CN103577875A (en) | CAD (computer-aided design) people counting method based on FAST (features from accelerated segment test) | |
CN107644528A (en) | A kind of vehicle queue length detection method based on vehicle tracking | |
CN105844229A (en) | Method and system for calculating passenger crowdedness degree | |
CN103383733A (en) | Lane video detection method based on half-machine study | |
EP2813973B1 (en) | Method and system for processing video image | |
CN102842037A (en) | Method for removing vehicle shadow based on multi-feature fusion | |
CN103646257A (en) | Video monitoring image-based pedestrian detecting and counting method | |
CN105893962A (en) | Method for counting passenger flow at airport security check counter | |
CN104063692A (en) | Method and system for pedestrian positioning detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20180717 Termination date: 20190723 |