CN110033479A - Traffic flow parameter real-time detection method based on Traffic Surveillance Video - Google Patents
Traffic flow parameter real-time detection method based on traffic surveillance video
- Publication number
- CN110033479A (application CN201910299470.5A)
- Authority
- CN
- China
- Prior art keywords
- vehicle
- time
- traffic
- video
- real
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention discloses a traffic flow parameter real-time detection method based on traffic surveillance video, comprising: video pre-calibration: calibrating the types and positions of vehicles; target detection: training an SSD-based deep learning model for vehicle target detection with the pre-calibrated data; coordinate mapping: solving the mapping relation between the surveillance video image coordinate system and the world coordinate system; vehicle target tracking: tracking moving vehicles in real time with a kernelized correlation filter tracking algorithm combined with the vehicle detection model; index selection and calculation: setting timers on calibration regions to obtain time indices, and combining the vehicle detection results, the tracking results of the vehicle tracking algorithm and the timer readings, converted through the coordinate mapping, to obtain the real-time detection results of the traffic flow parameters. The invention obtains traffic flow parameters directly from traffic surveillance video and can detect multiple traffic flow parameters accurately, in real time, in a single pass.
Description
Technical field
The present invention relates to the technical field of computer vision, and in particular to a traffic flow parameter real-time detection method based on traffic surveillance video.
Background art
Traffic parameters provide data support for intelligent transportation systems and help such systems realize their full potential, so the research and analysis of traffic video is a very active field. If researchers could obtain the flow, density, speed and other parameters of the traffic stream directly from video, it would be of great significance for the development of intelligent transportation systems; however, existing traffic parameter detection methods find it difficult to detect flow, density and speed in real time in a single pass. Traditional methods are mainly video-based traffic flow parameter detection built on background modeling; these methods are prone to misjudgment under occlusion, illumination changes and other environmental interference. Machine learning methods, by contrast, replace the analysis of inter-pixel dynamics with target recognition and feature extraction over the vehicle sample space, and therefore have the advantage of robustness to interference. Many deep learning object detection base models capable of accurate real-time detection have been proposed in succession, laying the foundation for traffic parameter detection technology to develop in an intelligent, networked, self-learning direction. The existing detection approaches are as follows:
Detection based on magnetic frequency: the most widely used variant is detection based on electromagnetic induction loops, composed of a loop coil sensor embedded under the road surface, a signal detection and processing unit, and a feeder line. Its principle is that the detection unit, the loop coil and the feeder form a tuned circuit; whether a vehicle has passed is detected through the change in the circuit's resonant frequency, yielding parameters such as flow, occupancy and approximate speed. However, this method requires installing additional equipment, and the performance of the coil under the road surface is strongly affected by pavement quality.
Detection based on wave frequency: wave-frequency vehicle detection emits electromagnetic waves (microwave, ultrasonic, infrared, etc.) toward vehicles and senses the response. Among these, ultrasonic detectors are the variant most used on highways; a probe and control unit mounted directly above or obliquely above the road judges whether a vehicle is passing from the difference between the transmitted wave and the returned echo. Overhead installation has many advantages over in-road installation, but detection is easily affected by weather, pedestrians and traffic volume, and the detection accuracy is poor.
Detection based on video: video-based traffic flow detection methods include optical flow with target detection, frame differencing and background subtraction; video-based speed detection includes methods based on image sequences and on motion vector clustering; video-based density detection mainly includes combining an online SVM classifier with background modeling, or replacing a recorded vehicle count with the ratio of vehicle pixels to the whole image. These traditional video-based methods rely mainly on background modeling to detect the traffic flow parameters; they are prone to misjudgment under environmental interference and are not accurate enough, and on the other hand their computational cost is high, leaving real-time performance to be improved.
Summary of the invention
To solve the above problems in the prior art, the object of the present invention is to provide a traffic flow parameter real-time detection method based on traffic surveillance video, which obtains traffic flow parameters directly from the video and can detect multiple traffic flow parameters accurately, in real time, in a single pass.
To achieve the above object, the technical solution adopted by the present invention is a traffic flow parameter real-time detection method based on traffic surveillance video, comprising the following steps:
S10, video pre-calibration: calibrating the types and positions of vehicles in the traffic surveillance video;
S20, target detection: training, with the pre-calibrated data, an SSD-based deep learning model for vehicle target detection by transfer learning and offline training, for identifying the various vehicles in the traffic surveillance video and their positions in the video frame;
S30, coordinate mapping: solving the mapping relation between the surveillance video image coordinate system and the world coordinate system with an automatic camera parameter calibration method based on vanishing point detection;
S40, vehicle target tracking: tracking moving vehicles in real time with a kernelized correlation filter tracking algorithm combined with the vehicle target detection model;
S50, index selection and calculation: setting timers on calibration regions to obtain the corresponding time indices, and combining the vehicle target detection results, the tracking results of the vehicle tracking algorithm and the readings of the corresponding timers, converted through the image-to-world coordinate mapping, to obtain the real-time detection results of the traffic flow parameters.
As a preferred embodiment, step S10 is as follows:
Traffic surveillance video covering multiple angles and multiple time periods within a certain span is collected, and one picture is saved every fixed number of frames to obtain a picture set; the types and position coordinates of the vehicles in the video are calibrated with the picture annotation tool labelImg, and the picture set is divided into a training set, a validation set and a test set with an automatic split script.
As another preferred embodiment, step S20 is as follows:
A pre-trained SSD base model built on the VGG network is downloaded, and the detection classes are defined as the vehicle types; transfer training is performed on the base model with the training set, the hyperparameters of the base model are tuned with the validation set, and the model's performance is observed on the test set until it meets the requirement, completing the offline learning.
As another preferred embodiment, step S30 is as follows:
Let the detected vanishing point of the road-surface boundaries be given; the surveillance video image coordinate system of a vehicle is then mapped to the world coordinate system.
In the mapping formula, x and z are respectively the coordinates of any point on the road plane along the road's transverse and advancing directions, and u and v are the coordinates of that point in the two-dimensional image; θ and d are respectively the angle between the surveillance camera and the road surface and the distance from the camera to the intersection of its optical axis with the road surface, which determine the mapping between video coordinates and world coordinates; C is a translation constant and can be ignored. The pitch angle θ between the surveillance camera and the road surface can be calibrated by automatically detecting the vanishing points of the lane markings and the lampposts, and the parameter d is calibrated by measuring distance against the standard lane markings of the highway as a reference.
As another preferred embodiment, step S40 is as follows:
The kernelized correlation filter tracking algorithm uses the HOG features of the picture; during tracking it trains a target detector, uses it to check whether the predicted position in the next frame is the target, and then uses the new detection result to update the training set and in turn the detector. Using the KCF tracker built into OpenCV, a KCF tracker is initialized when each detected vehicle object is instantiated; the tracker receives a frame and the target's coordinate position, and when the newest frame is loaded it computes the target's location in that frame. From the vehicle tracking results it is judged whether a vehicle between frames appeared in the previous frame or has newly entered, completing the instantaneous vehicle count in the traffic surveillance video frame.
As another preferred embodiment, in step S50, the image-to-world coordinate mapping first yields the primary indicators of road length, vehicle movement and vehicle passing counts; from these primary indicators, the traffic density, space occupancy and traffic flow are calculated over the entire surveillance frame, the average speed is calculated per detected vehicle, and the time headway and time occupancy are calculated over the calibration region.
As another preferred embodiment, step S50 is as follows:
For each detected vehicle, according to the vehicle target detection result, its position coordinates in the image, ((u_min, v_min), (u_max, v_max)), are recorded in real time and mapped to world coordinates ((x_min, z_min), (x_max, z_max)), where the x-axis is the road's transverse direction and the z-axis is the vehicle's advancing direction. A timer is set at the same time, recording in real time the time t each vehicle takes from entering the frame to leaving it. For each of the m lanes in the frame, the vehicle detection and tracking results are combined to record in real time the per-lane vehicle count n_k (k = 1, 2, …, m) in each frame, giving the total vehicle count N in the frame; at regular intervals the total count N_p of vehicles that passed the leading edge of the frame is computed. From the coordinate mapping, the world coordinates z_s and z_e of the start and end of the road in the advancing direction are obtained, giving the total road length in the frame l = z_e − z_s. The per-lane density, space occupancy, traffic flow and average speed are then computed, with traffic flow = 360·N_p (veh/h, for a 10 s counting interval) and average speed = (½(z_{i,max} + z_{i,min}) − z_s)/t, where z_{i,max} and z_{i,min} are respectively the z coordinates of the upper-right and lower-left corners of the i-th vehicle in world coordinates, z_{i,max} − z_{i,min} is the length of the i-th vehicle along its advancing direction, and ½(z_{i,max} + z_{i,min}) is the position coordinate of the vehicle's center point in the advancing direction.
As another preferred embodiment, step S50 further includes:
Based on a calibration line in the traffic surveillance frame, a separate detection of the calibration region edge and a separate timer are set, detecting the number of vehicles crossing the calibration line and their crossing times. For every 2 vehicles crossing the detection zone edge, the front-to-front passing interval between the two is computed, giving the time headway; for every M vehicles crossing the zone edge, the interval ΔT_i from each vehicle's front entering to its rear leaving is computed and the total time T_s is recorded, the time occupancy being the ratio of the summed ΔT_i to T_s.
As another preferred embodiment, the method further comprises the following step:
S60, docking the traffic flow parameter calculation results and the real-time video detection results to an intelligent traffic monitoring interface, displaying the traffic flow parameter detection results in real time and supporting the real-time adjustment of traffic measures.
The beneficial effects of the present invention are: the invention trains pre-calibrated multi-angle traffic video data with the SSD object detection algorithm, which is based on multi-scale feature maps, obtaining a vehicle detection deep learning model that detects in real time the type of each vehicle and its position coordinates in the video; it computes the conversion between video coordinates and real-world coordinates with a camera self-calibration method based on vanishing point detection, so as to detect the road length, vehicle movement, etc.; it tracks vehicles entering the frame with the kernelized correlation filter tracking algorithm combined with the vehicle target detection algorithm; combining the vehicle tracking results with timers on default or manually calibrated regions, it times the vehicles entering the frame and computes the time occupancy; meanwhile, it records each vehicle's entering time and the time difference to the next entering vehicle, computing the time headway. Relying only on traffic surveillance video, the invention detects multiple traffic flow parameters accurately, in real time, in a single pass, without installing or maintaining additional magnetic-frequency or wave-frequency detection equipment, and is more robust to environmental change. In terms of detection performance, the accuracy is >95% for traffic flow, >97% for time occupancy, >90% for density, >90% for average speed, and >95% for space occupancy.
Brief description of the drawings
Fig. 1 is the flow diagram of the embodiment of the present invention.
Specific embodiment
The embodiment of the present invention is described in detail with reference to the accompanying drawing.
Embodiment:
As shown in Fig. 1, in a specific implementation of the traffic flow parameter real-time detection method based on traffic surveillance video, the hardware may comprise a video acquisition module, an intelligent traffic parameter real-time detection module, a traffic flow parameter computing module and an intelligent traffic monitoring module; the intelligent traffic parameter real-time detection module in turn comprises three main parts: the target detection model, the target tracking algorithm and the coordinate mapping method.
The overall workflow of the traffic flow parameter real-time detection method is as follows. First, real-time traffic data acquisition is completed from the traffic surveillance video; compared with traditional detection methods based on magnetic frequency and wave frequency, this reduces the cost of installing and maintaining additional equipment. The traffic video data are input to the intelligent traffic flow parameter real-time detection module, where the offline-trained target detection module directly identifies vehicle targets in the surveillance video; combined with the vehicle tracking algorithm and the coordinate mapping method, the detection results in the calibration region are used to compute each traffic flow parameter in real time. Compared with other current video-based methods, detection based on a deep learning model is less affected by environmental change and more accurate, and tracking based on the kernelized correlation filter is faster and more real-time than traditional optical flow methods. Then, the traffic flow parameter computing module combines the detection results, the tracking results and the auxiliary calibration regions and timers to calculate each traffic flow parameter. Finally, the computed traffic data are docked to the intelligent traffic monitoring module and displayed in real time on the monitoring interface, so that traffic measures can be scheduled promptly according to the real-time traffic flow situation.
The method specifically includes the following steps:
Step 1, video pre-calibration: calibrating the types and positions of vehicles in the traffic surveillance video. Several hours of traffic video covering multiple angles and multiple time periods are collected, and one picture is saved every 30 frames to obtain a picture set. The labelImg tool is used to calibrate the type of each vehicle in the video and the coordinates of the lower-left and upper-right corners of its bounding box, ((x_min, y_min), (x_max, y_max)), stored as XML files in VOC format; an automatic split script divides the picture set into a training set, a validation set and a test set.
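The frame sampling and dataset split of Step 1 can be sketched as follows. The 30-frame sampling interval is from the text; the 70/15/15 split ratio is an assumption, since the patent only specifies an automatic split script:

```python
import random

def sample_frames(frames, step=30):
    """Keep every `step`-th frame (the text saves one picture every 30 frames)."""
    return frames[::step]

def split_dataset(items, train=0.7, val=0.15, seed=0):
    """Shuffle and split into train/val/test sets.
    The 70/15/15 ratio is an assumption; the patent does not specify one."""
    items = list(items)
    random.Random(seed).shuffle(items)   # deterministic shuffle for reproducibility
    n = len(items)
    n_train, n_val = int(n * train), int(n * val)
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]

frames = list(range(300))                      # stand-in for 300 decoded video frames
images = sample_frames(frames)                 # every 30th frame -> 10 pictures
tr, va, te = split_dataset([f"img_{i}.jpg" for i in images])
```

The split script would then move the annotated pictures (and their VOC XML files) into the three directories the training pipeline expects.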
Step 2, target detection: training, with the pre-calibrated data, the SSD-based deep learning model for vehicle target detection by transfer learning and offline training, for identifying the various vehicles in the traffic surveillance video and their positions in the frame. Specifically, the transfer training of the SSD-based vehicle target detection model proceeds as follows: first, the VOC-format files are converted to the tfrecord format, whose binary files read faster; then, a pre-trained VGG-based SSD model is downloaded and the detection classes are defined as the vehicle categories, such as car, bus and truck; finally, on the basis of the pre-trained model, transfer training is performed with the training set data from Step 1, the hyperparameters of the model are tuned with the validation set, and the model's performance is observed on the test set until it meets the requirement, completing the offline learning. With the trained model, an input traffic video can be detected in real time, yielding each vehicle's type and the position coordinates of its detection box.
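labelImg writes annotations in the Pascal VOC XML format mentioned above; before the tfrecord conversion, each file must be parsed back into class names and corner coordinates. A minimal sketch of that parsing (the sample XML below is illustrative, not taken from the patent's data):

```python
import xml.etree.ElementTree as ET

# Illustrative labelImg output for one sampled frame (hypothetical values).
VOC_XML = """<annotation>
  <filename>frame_000030.jpg</filename>
  <object>
    <name>car</name>
    <bndbox><xmin>120</xmin><ymin>80</ymin><xmax>260</xmax><ymax>170</ymax></bndbox>
  </object>
</annotation>"""

def parse_voc(xml_text):
    """Return (class, (xmin, ymin), (xmax, ymax)) for each annotated object."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        name = obj.findtext("name")
        bb = obj.find("bndbox")
        boxes.append((name,
                      (int(bb.findtext("xmin")), int(bb.findtext("ymin"))),
                      (int(bb.findtext("xmax")), int(bb.findtext("ymax")))))
    return boxes

boxes = parse_voc(VOC_XML)
```

Each tuple corresponds to one vehicle calibration ((x_min, y_min), (x_max, y_max)) from Step 1.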
Step 3, coordinate mapping: solving the mapping relation between the surveillance video image coordinate system and the world coordinate system with an automatic camera parameter calibration method based on vanishing point detection. For the typical road surveillance camera configuration, mounted above the road with the optical axis approximately in the same plane as the road's advancing direction and at a certain pitch angle to the road surface, a self-calibration method based on vanishing point detection is adopted. Let the detected vanishing point of the road-surface boundaries be given; the video coordinate system of a vehicle is then mapped to the world coordinate system according to the calibration formula, in which x and z are respectively the coordinates of any point on the road plane along the road's transverse and advancing directions, u and v are the coordinates of that point in the two-dimensional image, θ and d are respectively the pitch angle between the camera and the road surface and the distance from the camera to the intersection of its optical axis with the road surface, which determine the mapping between video and world coordinates, and C is a translation constant that can be ignored. The pitch angle θ of the camera can be calibrated by automatically detecting the vanishing points of the lane markings and the lampposts, and the parameter d is calibrated by measuring distance against the standard lane markings of the highway as a reference.
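The patent's exact mapping formula is not reproduced here, but the variables it defines (pitch θ, distance d along the optical axis to the road, vanishing point) fit the standard pinhole ground-plane back-projection; the sketch below uses that standard model, with focal length f and principal point (cu, cv) as additional assumed intrinsics:

```python
import math

def image_to_world(u, v, f, cu, cv, theta, d):
    """Back-project pixel (u, v) to road-plane coordinates (x, z).
    theta: camera pitch below the horizon (rad); d: distance along the optical
    axis from the camera to the road surface, so camera height h = d*sin(theta).
    This is one standard pinhole ground-plane model, not the patent's exact
    (unreproduced) formula."""
    h = d * math.sin(theta)
    delta = theta + math.atan((v - cv) / f)   # depression angle of the pixel ray
    z = h / math.tan(delta)                   # distance ahead along the road
    r = h / math.sin(delta)                   # slant range from camera to ground point
    x = (u - cu) * r / f                      # lateral offset across the road
    return x, z

# The principal point maps to where the optical axis meets the road: z = d*cos(theta).
x, z = image_to_world(320, 240, 1000.0, 320.0, 240.0, math.pi / 4, 10.0)
```

The vanishing-point calibration of Step 3 would supply θ (from the detected lane-marking and lamppost vanishing points) and d (from the standard lane-marking lengths), after which this mapping converts every detection box corner to world coordinates.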
Step 4, vehicle target tracking: tracking moving vehicles in real time with the kernelized correlation filter (KCF) tracking algorithm combined with the vehicle target detection model. The KCF algorithm uses the HOG features of the picture; during tracking it trains a target detector, uses it to check whether the predicted position in the next frame is the target, and then uses the new detection result to update the training set and in turn the target detector. Using the KCF tracker built into OpenCV, a KCF tracker is initialized when each vehicle object detected in Step 2 is instantiated; the tracker receives a frame and the target's coordinate position, and when the newest frame is loaded it computes the target's location in that frame. From the vehicle tracking results it is judged which vehicles appeared in the previous frame and which have newly entered, completing an accurate instantaneous count of the vehicles in the video frame.
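The bookkeeping that decides "appeared in the previous frame vs. newly entered" can be sketched without OpenCV: match each detection to the tracked boxes by overlap, and treat unmatched detections as new vehicles (which, in the real pipeline, would each get a fresh KCF tracker). The IoU threshold of 0.3 is an assumption:

```python
def iou(a, b):
    """Intersection-over-union of two (xmin, ymin, xmax, ymax) boxes."""
    ix = max(0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

class VehicleCounter:
    """Assign IDs across frames: a detection overlapping a tracked box is the
    same vehicle; otherwise it has newly entered the frame."""
    def __init__(self, iou_thresh=0.3):
        self.tracks = {}          # id -> last known box
        self.next_id = 0
        self.iou_thresh = iou_thresh

    def update(self, detections):
        new_tracks = {}
        for box in detections:
            match = next((tid for tid, tb in self.tracks.items()
                          if iou(box, tb) >= self.iou_thresh
                          and tid not in new_tracks), None)
            if match is None:     # newly entered vehicle -> new ID (new tracker)
                match = self.next_id
                self.next_id += 1
            new_tracks[match] = box
        self.tracks = new_tracks  # vehicles with no detection have left the frame
        return len(new_tracks)    # instantaneous vehicle count in this frame

counter = VehicleCounter()
n1 = counter.update([(0, 0, 10, 10), (50, 50, 60, 60)])   # two vehicles enter
n2 = counter.update([(1, 0, 11, 10)])                     # first moves, second leaves
```

In the full pipeline, `detections` come from the SSD model each frame, and the per-ID boxes feed the coordinate mapping and the per-lane counts n_k of Step 5.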
Step 5, index selection and calculation: setting timers on calibration regions to obtain the corresponding time indices, and combining the vehicle target detection results, the tracking results of the vehicle tracking algorithm and the readings of the corresponding timers, converted through the image-to-world coordinate mapping, to obtain the real-time detection results of the traffic flow parameters. Combining the results of Steps 2–4, the traffic flow indices are calculated: based on the entire frame, traffic density, space occupancy and traffic flow; based on each detected vehicle, average speed; based on the calibration region, time headway and time occupancy. The calculation of each index is as follows:
For each detected vehicle, its position coordinates in the image, ((u_min, v_min), (u_max, v_max)), are recorded in real time according to the vehicle target detection result and mapped to world coordinates ((x_min, z_min), (x_max, z_max)), where the x-axis is the road's transverse direction and the z-axis is the vehicle's advancing direction. A timer is set at the same time, recording in real time the time t each vehicle takes from entering the frame to leaving it. For each of the m lanes in the frame, the vehicle detection and tracking results are combined to record in real time the per-lane vehicle count n_k (k = 1, 2, …, m) in each frame, giving the total vehicle count N in the frame; every 10 s the total count N_p of vehicles that passed the leading edge of the frame is computed. From the coordinate mapping, the world coordinates z_s and z_e of the start and end of the road in the advancing direction are obtained, giving the total road length in the frame l = z_e − z_s. The per-lane density, space occupancy, traffic flow and per-vehicle speed are then computed as n_k/l, Σ_i(z_{i,max} − z_{i,min})/l, 360·N_p (veh/h) and (½(z_{i,max} + z_{i,min}) − z_s)/t respectively, where z_{i,max} and z_{i,min} are respectively the z coordinates of the upper-right and lower-left corners of the i-th vehicle in world coordinates, z_{i,max} − z_{i,min} is the length of the i-th vehicle along its advancing direction, and ½(z_{i,max} + z_{i,min}) is the position coordinate of the vehicle's center point in the advancing direction.
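The frame-level parameters of Step 5 can be sketched as one function over the mapped world-coordinate boxes. The formula shapes follow the text (flow = 360·N_p for a 10 s interval; speed from the vehicle's center point); the density and occupancy expressions are the standard per-length definitions the patent describes but whose typeset formulas are not reproduced:

```python
def traffic_parameters(boxes_world, N_p, z_s, z_e, t_transit, interval_s=10.0):
    """boxes_world: [((x_min, z_min), (x_max, z_max)), ...] per detected vehicle,
    in world coordinates; N_p: vehicles past the frame edge in the last interval;
    t_transit: time a vehicle needs to cross the frame (from its timer)."""
    l = z_e - z_s                                    # visible road length
    N = len(boxes_world)
    density = N / l                                  # vehicles per unit road length
    occupancy = sum(zmax - zmin                      # summed vehicle lengths / road length
                    for (_, zmin), (_, zmax) in boxes_world) / l
    flow = (3600.0 / interval_s) * N_p               # veh/h; equals 360*N_p for 10 s
    speeds = [(0.5 * (zmin + zmax) - z_s) / t_transit  # center-point position / transit time
              for (_, zmin), (_, zmax) in boxes_world]
    return density, occupancy, flow, speeds

# Two vehicles on a 100 m visible stretch (illustrative numbers).
boxes = [((0.0, 10.0), (3.0, 15.0)), ((0.0, 40.0), (3.0, 44.0))]
density, occupancy, flow, speeds = traffic_parameters(boxes, N_p=2,
                                                      z_s=0.0, z_e=100.0,
                                                      t_transit=5.0)
```

Per-lane density n_k/l follows the same shape with the lane's own count substituted for N.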
In addition, based on a calibration line in the frame, a separate detection of the calibration region edge and a separate timer are set, detecting the number of vehicles crossing the calibration line and their crossing times. For every 2 vehicles crossing the detection zone edge, the front-to-front passing interval between the two is computed, giving the time headway. For every 5 vehicles crossing the zone edge, the interval ΔT_i between each vehicle's front entering and its rear leaving is computed and the total time T_s is recorded; the time occupancy is then ΣΔT_i/T_s.
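The two calibration-region indices reduce to short computations over the timer readings; a sketch, using the standard definitions the text describes (the lost formula presumably expresses the same ΣΔT_i/T_s ratio):

```python
def mean_headway(arrival_times):
    """Mean front-to-front interval between successive vehicles at the
    calibration line (the text computes one interval per pair of vehicles)."""
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    return sum(gaps) / len(gaps)

def time_occupancy(presence_intervals, total_time):
    """Fraction of the observation window T_s during which the detection zone
    is occupied: sum of per-vehicle dwell times dT_i over T_s."""
    return sum(b - a for a, b in presence_intervals) / total_time

headway = mean_headway([0.0, 2.0, 5.0, 9.0])        # fronts cross at t = 0, 2, 5, 9 s
occ = time_occupancy([(0.0, 1.0), (4.0, 6.0)], 10.0)  # zone occupied 3 s of 10 s
```

Here `arrival_times` are the timer readings when each front crosses the calibration line, and each `presence_interval` pairs a vehicle's front-entering and rear-leaving times.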
Step 6, docking the intelligent traffic monitoring interface: the traffic flow parameter calculation results and the real-time video detection results are docked to a Qt intelligent traffic monitoring interface, which displays the traffic flow parameter detection results in real time and supports the real-time adjustment of traffic measures.
The above embodiment only expresses a specific implementation of the present invention; its description is relatively specific and detailed, but it should not therefore be construed as limiting the scope of the patent. It should be pointed out that, for those of ordinary skill in the art, various modifications and improvements can be made without departing from the inventive concept, and these all fall within the protection scope of the present invention.
Claims (9)
1. A traffic flow parameter real-time detection method based on traffic surveillance video, comprising the following steps:
S10, video pre-calibration: calibrating the vehicle types and positions in the traffic surveillance video;
S20, target detection: training an SSD-based deep learning model for vehicle target detection through transfer learning and offline training with the pre-calibrated data, used to identify the various vehicles in the traffic surveillance video and their positions in the video picture;
S30, coordinate mapping: using a camera automatic-calibration method based on vanishing-point detection to solve the mapping relation between the surveillance-video image coordinate system and the world coordinate system;
S40, vehicle target tracking: using a kernelized correlation filter (KCF) tracking algorithm, combined with the deep learning model for vehicle target detection, to track vehicle travel in real time;
S50, index selection and calculation: setting timers for the calibrated regions to obtain the corresponding time indices, combining the vehicle target detection results, the tracking results of the vehicle-tracking algorithm and the timing results of the corresponding timers, and converting between surveillance-video image coordinates and world coordinates to obtain real-time detection results of the traffic flow parameters.
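The five steps of claim 1 can be sketched as a per-frame processing loop (an illustrative skeleton only; the `detect`, `track` and `to_world` callables stand in for the SSD model, the KCF trackers and the vanishing-point coordinate mapping described in the claims, and are assumptions of this sketch):

```python
from dataclasses import dataclass, field

@dataclass
class Pipeline:
    detect: callable      # S20: frame -> list of image-space boxes
    track: callable       # S40: (frame, boxes) -> list of (track_id, box)
    to_world: callable    # S30: image-space box -> world-space box
    timers: dict = field(default_factory=dict)  # S50: per-track entry times

    def step(self, frame, t):
        boxes = self.detect(frame)
        tracks = self.track(frame, boxes)
        world = {tid: self.to_world(b) for tid, b in tracks}
        for tid in world:                 # start a timer for newly entered vehicles
            self.timers.setdefault(tid, t)
        # dwell time so far for each vehicle currently in the picture
        dwell = {tid: t - t0 for tid, t0 in self.timers.items() if tid in world}
        return world, dwell
```

The returned world-space boxes and dwell times are the raw inputs from which step S50 derives density, occupancy, flow and speed.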
2. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 1, characterized in that step S10 is specifically as follows:
acquiring traffic surveillance video covering multiple angles and multiple time periods within a certain time span, saving one picture every fixed number of frames to obtain a picture set; calibrating the types and position coordinates of the vehicles in the video with the picture annotation tool labelImg; and dividing the picture set into a training set, a validation set and a test set with an automatic splitting script.
3. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 2, characterized in that step S20 is specifically as follows:
downloading a pre-trained SSD base model built on the VGG backbone and customizing the detection classes to the vehicle types; performing transfer training on the base model with the training set; tuning the hyper-parameters of the base model with the validation set; and observing the performance of the base model on the test set until the performance meets the requirement, completing offline learning of the model.
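Transfer training of a pre-trained VGG-based SSD as in claim 3 is typically driven by a configuration of roughly this shape (illustrative only — the class count, learning rate and checkpoint path are assumptions, and the exact fields depend on the training framework chosen):

```text
model {
  ssd {
    num_classes: 4          # e.g. car, bus, truck, motorcycle (assumed classes)
    ...                     # VGG feature extractor, anchor settings, etc.
  }
}
train_config {
  fine_tune_checkpoint: "pretrained/ssd_vgg/model.ckpt"   # hypothetical path
  batch_size: 16
  optimizer { ... learning_rate: 0.001 ... }              # tuned on the validation set
}
```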
4. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 3, characterized in that step S30 is specifically as follows:
with the detected vanishing point of the road-surface boundary lines as reference, the surveillance-video image coordinates of a vehicle are mapped to the world coordinate system:
in the above formula, x and z are respectively the three-dimensional coordinates of any point in the road plane along the road-surface transverse direction and the direction of advance, and u and v are the coordinates of that point in the two-dimensional image; θ and d are respectively the angle between the surveillance camera and the road surface and the distance from the camera to the intersection of its optical axis with the road surface, which together determine the mapping relation between video coordinates and world coordinates; C is a translation constant and can be ignored. The angle θ between the surveillance camera and the road surface can be calibrated through automatic detection of the lane-line vanishing point and the lamppost vanishing point, and the parameter d is calibrated by measuring distance with the standard lane markings of the highway as a reference.
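The patent's own mapping formula is not reproduced in this text, but a standard single-vanishing-point ground-plane mapping of the same kind can be sketched as follows. Symbols follow the claim (θ is the camera tilt, d the distance along the optical axis to the road surface); the focal length f in pixels is an added assumption, image coordinates are taken relative to the principal point with v positive downward, and with that convention the road vanishing point sits at v0 = −f·tan θ, which is how θ can be recovered from vanishing-point detection. This is a sketch of the technique, not the patent's exact formula:

```python
import math

def image_to_world(u, v, theta, d, f):
    """Map an image point (u, v) lying on the road plane to world
    coordinates (x, z): x across the road, z along the direction of
    advance.  Derived from a pinhole camera at height h = d*sin(theta),
    tilted down by theta toward the road."""
    h = d * math.sin(theta)                      # camera height above the road
    s, c = math.sin(theta), math.cos(theta)
    z = h * (f * c - v * s) / (v * c + f * s)    # distance along the road
    x = u * (z * c + h * s) / f                  # lateral offset on the road
    return x, z
```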
5. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 4, characterized in that step S40 is specifically as follows:
the kernelized correlation filter tracking algorithm uses the HOG features of the picture and trains an object detector; during tracking, the object detector is used to judge whether the predicted position in the next frame is the target, and the new detection result is then used to update the training set and thereby the object detector. Using the KCF tracker built into OpenCV, a KCF tracker is initialized for each detected vehicle object when it is instantiated; the KCF tracker receives a frame and the target's coordinate position, and when the newest frame is loaded it computes the target's location in the new frame. From the tracking results it is judged, between frames, whether a vehicle already appeared in the previous frame or has newly entered, completing the instantaneous count of vehicles in the traffic surveillance picture.
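The "appeared before vs. newly entered" judgment in claim 5 amounts to associating current detections with the previous frame's tracks. A minimal IoU-based sketch of that counting step follows (the KCF prediction itself, available in OpenCV as a TrackerKCF object, is abstracted away here; the threshold 0.3 is an assumption):

```python
def iou(a, b):
    """Intersection-over-union of two boxes (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def count_new(prev_boxes, cur_boxes, thresh=0.3):
    """Detections overlapping no previous-frame box count as new entries."""
    new = [c for c in cur_boxes
           if all(iou(c, p) < thresh for p in prev_boxes)]
    return len(new)
```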
6. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 5, characterized in that in step S50, the primary indicators of road length, vehicle movement and vehicle passage count are first obtained by converting between surveillance-video image coordinates and world coordinates; from these primary indicators, vehicle density, space occupancy and traffic flow are calculated over the entire surveillance picture, the average speed is calculated for each detected vehicle, and the time headway and time occupancy are calculated based on the calibrated region.
7. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 6, characterized in that step S50 is specifically as follows:
for each detected vehicle, the position coordinates ((umin, vmin), (umax, vmax)) of the vehicle in the picture are recorded in real time from the vehicle target detection results and mapped to world-coordinate positions ((xmin, zmin), (xmax, zmax)), where the x-axis is the road-surface transverse direction and the z-axis is the vehicle's direction of advance; meanwhile, a timer is set to record in real time the time t each vehicle takes from entering the picture to leaving it. For each of the m lanes in the picture, the detection and tracking results are combined to record in real time the per-lane vehicle count nk (k = 1, 2, ..., m) in one frame and to calculate the total vehicle count N in the picture, and at regular intervals the total count Np of vehicles passing the leading edge of the picture is calculated once. From the coordinate-mapping results, the world coordinates zs and ze of the start and end of the road in the picture along the direction of advance are obtained, giving the total road length in the picture l = ze − zs. The per-lane density, space occupancy, traffic flow and average speed are calculated respectively as follows:
traffic flow = 360·Np (vehicles/h); average speed = [1/2(zi,max + zi,min) − zs]/t, where zi,max and zi,min respectively denote the z-coordinate values of the upper-right and lower-left corners of the i-th vehicle in the world coordinate system, zi,max − zi,min represents the length of the i-th vehicle along its direction of advance, and 1/2(zi,max + zi,min) represents the position coordinate of the vehicle's center point in the direction of advance.
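The per-picture indicators of claim 7 can be computed from the mapped world boxes as in this sketch (the factor 360 in the claim implies Np is counted over a 10-second interval, which this example assumes; function and parameter names are illustrative):

```python
def traffic_parameters(world_boxes, z_s, z_e, Np, dwell_times):
    """world_boxes: per-vehicle ((xmin, zmin), (xmax, zmax)) world coordinates;
    dwell_times: per-vehicle time t from entering the picture to leaving it;
    Np: vehicles passing the picture's leading edge in the last 10 s."""
    l = z_e - z_s                                   # road length in the picture
    N = len(world_boxes)
    density = N / l                                 # vehicles per unit road length
    space_occ = sum(zmax - zmin                     # summed vehicle lengths / l
                    for (_, zmin), (_, zmax) in world_boxes) / l
    flow = 360 * Np                                 # veh/h, given a 10 s interval
    speeds = [((zmin + zmax) / 2 - z_s) / t         # center-point travel / dwell t
              for ((_, zmin), (_, zmax)), t in zip(world_boxes, dwell_times)]
    return density, space_occ, flow, speeds
```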
8. The traffic flow parameter real-time detection method based on traffic surveillance video according to claim 7, characterized in that step S50 further comprises:
based on the calibration line in the traffic surveillance picture, separately providing a calibrated-region-edge detector and a timer to detect the number of vehicles passing the calibration line and the time at which each vehicle passes; for every 2 vehicles passing the detection-zone edge, calculating once the interval between the times at which their fronts pass as the time headway; for every M vehicles passing the detection-zone edge, calculating the interval ΔTi from the moment each vehicle's front enters to the moment its rear leaves, and recording the total time Ts,
9. The traffic flow parameter real-time detection method based on traffic surveillance video according to any one of claims 1 to 8, characterized by further comprising the following step:
S60, delivering the traffic-flow-parameter calculation results and the real-time video detection results to an intelligent traffic monitoring interface, which displays the traffic-flow-parameter detection results in real time so that traffic-control measures can be adjusted in real time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910299470.5A CN110033479B (en) | 2019-04-15 | 2019-04-15 | Traffic flow parameter real-time detection method based on traffic monitoring video |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910299470.5A CN110033479B (en) | 2019-04-15 | 2019-04-15 | Traffic flow parameter real-time detection method based on traffic monitoring video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110033479A true CN110033479A (en) | 2019-07-19 |
CN110033479B CN110033479B (en) | 2023-10-27 |
Family
ID=67238407
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910299470.5A Active CN110033479B (en) | 2019-04-15 | 2019-04-15 | Traffic flow parameter real-time detection method based on traffic monitoring video |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110033479B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009245042A (en) * | 2008-03-31 | 2009-10-22 | Hitachi Ltd | Traffic flow measurement device and program |
CN104200657A (en) * | 2014-07-22 | 2014-12-10 | 杭州智诚惠通科技有限公司 | Traffic flow parameter acquisition method based on video and sensor |
CN107918765A (en) * | 2017-11-17 | 2018-04-17 | 中国矿业大学 | A kind of Moving target detection and tracing system and its method |
CN108629973A (en) * | 2018-05-11 | 2018-10-09 | 四川九洲视讯科技有限责任公司 | Road section traffic volume congestion index computational methods based on fixed test equipment |
CN108734959A (en) * | 2018-04-28 | 2018-11-02 | 扬州远铭光电有限公司 | A kind of embedded vision train flow analysis method and system |
CN108831161A (en) * | 2018-06-27 | 2018-11-16 | 深圳大学 | A kind of traffic flow monitoring method, intelligence system and data set based on unmanned plane |
CN109584558A (en) * | 2018-12-17 | 2019-04-05 | 长安大学 | A kind of traffic flow statistics method towards Optimization Control for Urban Traffic Signals |
Non-Patent Citations (3)
Title |
---|
UNNIKRISHNAN KIZHAKKEMADAM SREEKUMAR等: "Real-Time Traffic Pattern Collection and Analysis Model for Intelligent Traffic Intersection", 2018 IEEE INTERNATIONAL CONFERENCE ON EDGE COMPUTING (EDGE), pages 30 - 31 * |
FENG Yingying et al.: "Research on Moving-Target Tracking Methods in Intelligent Surveillance Video", pages 30 - 31 *
ZHANG Jieying et al.: "Traffic flow detection system based on video image processing", Video Engineering, no. 06, 17 June 2008 (2008-06-17) *
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110555423A (en) * | 2019-09-09 | 2019-12-10 | 南京东控智能交通研究院有限公司 | Multi-dimensional motion camera-based traffic parameter extraction method for aerial video |
CN110555423B (en) * | 2019-09-09 | 2021-12-21 | 南京东控智能交通研究院有限公司 | Multi-dimensional motion camera-based traffic parameter extraction method for aerial video |
CN110807924A (en) * | 2019-11-04 | 2020-02-18 | 吴钢 | Multi-parameter fusion method and system based on full-scale full-sample real-time traffic data |
CN111161545A (en) * | 2019-12-24 | 2020-05-15 | 北京工业大学 | Intersection region traffic parameter statistical method based on video |
CN111310736A (en) * | 2020-03-26 | 2020-06-19 | 上海同岩土木工程科技股份有限公司 | Rapid identification method for unloading and piling of vehicles in protected area |
CN111429484A (en) * | 2020-03-31 | 2020-07-17 | 电子科技大学 | Multi-target vehicle track real-time construction method based on traffic monitoring video |
CN111429484B (en) * | 2020-03-31 | 2022-03-15 | 电子科技大学 | Multi-target vehicle track real-time construction method based on traffic monitoring video |
CN111462249A (en) * | 2020-04-02 | 2020-07-28 | 北京迈格威科技有限公司 | Calibration data acquisition method, calibration method and device for traffic camera |
CN111599173A (en) * | 2020-05-12 | 2020-08-28 | 杭州云视通互联网科技有限公司 | Vehicle information automatic registration method, computer equipment and readable storage medium |
CN111613061A (en) * | 2020-06-03 | 2020-09-01 | 徐州工程学院 | Traffic flow acquisition system and method based on crowdsourcing and block chain |
CN111613061B (en) * | 2020-06-03 | 2021-11-02 | 徐州工程学院 | Traffic flow acquisition system and method based on crowdsourcing and block chain |
CN111753797A (en) * | 2020-07-02 | 2020-10-09 | 浙江工业大学 | Vehicle speed measuring method based on video analysis |
CN112464854A (en) * | 2020-12-09 | 2021-03-09 | 北京四方继保工程技术有限公司 | Method and system for accurately judging state of mechanical isolation switch based on deep learning |
CN112632208A (en) * | 2020-12-25 | 2021-04-09 | 际络科技(上海)有限公司 | Traffic flow trajectory deformation method and device |
CN112837541A (en) * | 2020-12-31 | 2021-05-25 | 遵义师范学院 | Intelligent traffic vehicle flow management method based on improved SSD |
CN112837541B (en) * | 2020-12-31 | 2022-04-29 | 遵义师范学院 | Intelligent traffic vehicle flow management method based on improved SSD |
CN112907978A (en) * | 2021-03-02 | 2021-06-04 | 江苏集萃深度感知技术研究所有限公司 | Traffic flow monitoring method based on monitoring video |
CN112991742A (en) * | 2021-04-21 | 2021-06-18 | 四川见山科技有限责任公司 | Visual simulation method and system for real-time traffic data |
CN113139495A (en) * | 2021-04-29 | 2021-07-20 | 姜冬阳 | Tunnel side-mounted video traffic flow detection method and system based on deep learning |
US11645906B2 (en) | 2021-04-29 | 2023-05-09 | Tetenav, Inc. | Navigation system with traffic state detection mechanism and method of operation thereof |
CN113380035A (en) * | 2021-06-16 | 2021-09-10 | 山东省交通规划设计院集团有限公司 | Road intersection traffic volume analysis method and system |
CN113762139A (en) * | 2021-09-03 | 2021-12-07 | 万申科技股份有限公司 | Machine vision detection system and method for 5G + industrial Internet |
CN113762139B (en) * | 2021-09-03 | 2023-07-25 | 万申科技股份有限公司 | Machine vision detection system and method for 5G+ industrial Internet |
Also Published As
Publication number | Publication date |
---|---|
CN110033479B (en) | 2023-10-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110033479A (en) | Traffic flow parameter real-time detection method based on Traffic Surveillance Video | |
CN110322702A (en) | A kind of Vehicular intelligent speed-measuring method based on Binocular Stereo Vision System | |
CN103559791B (en) | A kind of vehicle checking method merging radar and ccd video camera signal | |
CN102768804B (en) | Video-based traffic information acquisition method | |
CN103456172B (en) | A kind of traffic parameter measuring method based on video | |
CN110285793A (en) | A kind of Vehicular intelligent survey track approach based on Binocular Stereo Vision System | |
CN109615870A (en) | A kind of traffic detection system based on millimetre-wave radar and video | |
CN106056100A (en) | Vehicle auxiliary positioning method based on lane detection and object tracking | |
CN109871776B (en) | All-weather lane line deviation early warning method | |
CN111563469A (en) | Method and device for identifying irregular parking behaviors | |
CN101571997A (en) | Method and device for fusion processing of multi-source traffic information | |
CN106096525A (en) | A kind of compound lane recognition system and method | |
CN104616502A (en) | License plate identification and positioning system based on combined type vehicle-road video network | |
CN107909601A (en) | A kind of shipping anti-collision early warning video detection system and detection method suitable for navigation mark | |
WO2023240805A1 (en) | Connected vehicle overspeed early warning method and system based on filtering correction | |
CN103164958B (en) | Method and system for vehicle monitoring | |
CN107985189A (en) | Towards driver's lane change Deep Early Warning method under scorch environment | |
CN114170580A (en) | Highway-oriented abnormal event detection method | |
CN107516423A (en) | A kind of vehicle heading detection method based on video | |
CN113592905A (en) | Monocular camera-based vehicle running track prediction method | |
CN115273005A (en) | Visual navigation vehicle environment perception method based on improved YOLO algorithm | |
CN106205135A (en) | A kind of detection method of vehicle behavior that turns around violating the regulations, Apparatus and system and a kind of ball machine | |
CN117198057A (en) | Experimental method and system for road side perception track data quality inspection | |
Wu et al. | Vehicle Classification and Counting System Using YOLO Object Detection Technology. | |
CN112466121A (en) | Speed measuring method based on video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||