CN114937358B - Highway multi-lane traffic flow statistics method


Info

Publication number
CN114937358B
Authority
CN
China
Prior art keywords
background
moving vehicle
frame
virtual detection
video
Prior art date
Legal status
Active
Application number
CN202210549529.3A
Other languages
Chinese (zh)
Other versions
CN114937358A
Inventor
程婧雅
马志强
宝财吉拉呼
李雷孝
万剑雄
Current Assignee
Inner Mongolia University of Technology
Original Assignee
Inner Mongolia University of Technology
Priority date: 2022-05-20
Filing date: 2022-05-20
Publication date: 2023-04-21
Application filed by Inner Mongolia University of Technology
Priority to CN202210549529.3A
Publication of CN114937358A
Application granted
Publication of CN114937358B
Status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/0104 - Measuring and analyzing of parameters relative to traffic conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G06V20/41 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 - Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/017 - Detecting movement of traffic to be counted or controlled identifying vehicles
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/01 - Detecting movement of traffic to be counted or controlled
    • G08G1/056 - Detecting movement of traffic to be counted or controlled with provision for distinguishing direction of travel
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 - Detecting or categorising vehicles
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 - Road transport of goods or passengers
    • Y02T10/10 - Internal combustion engine [ICE] based vehicles
    • Y02T10/40 - Engine management systems

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a multi-lane expressway traffic flow statistics method, applied in the technical field of computer vision, comprising the following steps: acquiring and preprocessing expressway video data to obtain video frames; performing background modeling and updating on the video frames to obtain a background video frame model; segmenting moving vehicle targets from the video frames by a background difference method; dynamically tracking the moving vehicle targets by a dynamic centroid distance tracking method; and performing flow statistics on the moving vehicle targets in different directions by a double virtual detection line method. The method reduces the influence of noise and camera shake on the video data through optimal gray-scale processing of the video data and median-filter sharpening of edge pixels; by adopting the dynamic centroid distance tracking method with a preset minimum distance threshold and a preset frame-number threshold, it improves the timeliness with which newly appearing vehicles and disappearing vehicles are recognized during tracking; and it realizes traffic flow statistics in different directions through the double virtual detection line method.

Description

Highway multi-lane traffic flow statistics method
Technical Field
The invention relates to the technical field of computer vision, and in particular to a method for counting multi-lane traffic flow on expressways.
Background
Expressway traffic flow statistics concern the total number of vehicles passing through a specified road section per unit time. Detecting and counting expressway traffic flow allows an intelligent transportation system to describe the load on each expressway in a timely manner for traffic control and intelligent scheduling, and current video-based traffic flow statistics methods mainly combine vehicle detection, vehicle tracking, and vehicle counting.
Expressway traffic flow statistics is usually divided into three steps: vehicle identification, vehicle tracking, and traffic flow counting. Because of the environment in which expressways are located, the video captured by a camera is typically affected by noise and camera shake, which degrades the accuracy of vehicle identification, vehicle tracking, and traffic flow statistics performed on that video. In addition, existing vehicle tracking often suffers from poor real-time performance and accuracy and cannot recognize in time when a new vehicle appears or an old vehicle disappears, and existing traffic flow statistics cannot count traffic separately for different directions of travel.
Therefore, how to provide a multi-lane expressway traffic flow statistics method that avoids the influence of noise and camera shake in expressway video, tracks vehicles in real time, recognizes the appearance of new vehicles and the disappearance of old vehicles in time, and counts traffic flow in different directions is a problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides a multi-lane expressway traffic flow statistics method. The acquired expressway video data are subjected to optimal gray-scale processing, and the background difference image obtained through background updating and the background difference method is subjected to edge-pixel sharpening by a median filtering method, which reduces the influence of noise and camera shake on the video data and improves the accuracy of subsequent vehicle identification, vehicle tracking, and traffic flow statistics. To improve the real-time performance of vehicle tracking, a dynamic centroid distance tracking method is adopted to dynamically track the vehicles. A minimum distance threshold is preset, and when the threshold condition is satisfied, the bounding box with the minimum distance is associated and tracked, which improves the accuracy of vehicle tracking. A frame-number threshold is preset, and bounding boxes that cannot be associated within the preset number of adjacent frames are identified as newly appearing vehicles or disappeared vehicles, which improves the timeliness and accuracy of recognizing the appearance of new vehicles and the disappearance of old vehicles during tracking. With the double virtual detection line method, the driving direction of each vehicle is identified from the order in which it crosses the two virtual detection lines while the traffic flow is counted, so that traffic flow statistics in different directions are realized.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
a highway multi-lane traffic flow statistical method, comprising:
step (1): and acquiring expressway video data, and preprocessing the video data to obtain video frames.
Step (2): performing background modeling and updating on the video frames to obtain a background video frame model; a background differencing method is applied to segment the moving vehicle object from the video frame.
Step (3): dynamic centroid distance tracking is applied to dynamically track moving vehicle targets.
Step (4): and carrying out flow statistics on the moving vehicle targets in different directions by using a double virtual detection line method.
Optionally, in step (1), the preprocessing is optimal gray-scale processing;
the optimal gray scale processing formula is as follows:
Gray=(R*28+G*61+B*11)/100;
wherein Gray is the gray-scale pixel value; R is the red channel, G is the green channel, and B is the blue channel.
Optionally, in step (2), the background modeling and updating are specifically: taking the first frame of the video frames as a background image, and then continuously inputting video frames for background modeling and updating to obtain the background video frame model.
Optionally, in step (2), the background difference method is specifically: differencing the input video frame with the obtained background video frame model to obtain a background difference image, and performing binarization processing on the background difference image to segment the moving vehicle target;
the background difference formula is as follows:
L_i(x,y) = |I_i(x,y) - B_i(x,y)|
wherein L_i(x,y) is the resulting background difference image; I_i(x,y) is the current frame image in the video frames; B_i(x,y) is the background image in the background video frame model;
the binarization processing formula is as follows:
T_i(x,y) = 255 if L_i(x,y) > T, and T_i(x,y) = 0 otherwise
wherein T is the binarization threshold; T_i(x,y) is the moving vehicle target and is also the background difference image conforming to the binarization threshold T.
Optionally, after the input video frame is differenced with the obtained background video frame model to obtain the background difference image, the method further includes: performing edge-pixel sharpening on the background difference image by a median filtering method;
the median filtering method has the following formula:
g(x,y) = med{h(x-k, y-t), (k,t) ∈ w};
wherein g(x,y) is the gray value obtained after median filtering; h(x,y) is the original gray value; w is the window template, for which different shapes can be selected; k and t are the dimensions of the window template; med is the median filtering function.
Optionally, in step (3), the dynamic centroid distance tracking method is specifically: a minimum distance threshold is preset; the distances between the centroid position of the bounding box of the moving vehicle target in the current frame and the centroid positions of all bounding boxes in the previous frame are calculated; and when the distance satisfies the preset minimum distance threshold, the bounding box with the minimum distance in the previous frame is associated with the bounding box of the moving vehicle target in the current frame, so that the moving vehicle target is dynamically tracked.
Optionally, the dynamic centroid distance tracking method further comprises: a frame-number threshold is preset, and when the bounding box of a moving vehicle target cannot be associated within the preset number of adjacent frames, the moving vehicle target is judged to be a newly appearing vehicle or a disappeared vehicle.
Optionally, in step (4), the double virtual detection line method is specifically: two virtual detection lines perpendicular to the lanes are set in the middle of the video frame, the distance between the two lines being greater than the body length of a moving vehicle target; a pixel value change threshold is preset; when no moving vehicle target is passing over a virtual detection line, its state is set to S = 1; when a moving vehicle target passes over the line and the change in the line's pixel values exceeds the preset pixel value change threshold, the state becomes S = 0; when the moving vehicle target drives off the line, the state returns to S = 1; when the state of a virtual detection line changes from S = 1 to S = 0 and back to S = 1, a passing moving vehicle target is detected and the counter is incremented by one; and the driving direction of the moving vehicle target is identified from the order in which it crosses the two virtual detection lines.
Compared with the prior art, the invention provides a multi-lane expressway traffic flow statistics method. The acquired expressway video data are subjected to optimal gray-scale processing, and the background difference image obtained through background updating and the background difference method is subjected to edge-pixel sharpening by a median filtering method, which reduces the influence of noise and camera shake on the video data and improves the accuracy of subsequent vehicle identification, vehicle tracking, and traffic flow statistics. To improve the real-time performance of vehicle tracking, a dynamic centroid distance tracking method is adopted to dynamically track the vehicles. A minimum distance threshold is preset, and when the threshold condition is satisfied, the bounding box with the minimum distance is associated and tracked, which improves the accuracy of vehicle tracking. A frame-number threshold is preset, and bounding boxes that cannot be associated within the preset number of adjacent frames are identified as newly appearing vehicles or disappeared vehicles, which improves the timeliness and accuracy of recognizing the appearance of new vehicles and the disappearance of old vehicles during tracking. With the double virtual detection line method, the driving direction of each vehicle is identified from the order in which it crosses the two virtual detection lines while the traffic flow is counted, so that traffic flow statistics in different directions are realized.
Drawings
In order to explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the description of the embodiments or of the prior art are briefly introduced below. It is apparent that the drawings described below are only embodiments of the present invention, and other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic view of the overall framework of the present invention.
Fig. 2 is a schematic diagram of a vehicle identification process according to the present invention.
Fig. 3 is a schematic flow chart of implementing dynamic tracking in vehicle tracking according to the present invention.
Fig. 4 is a schematic flow chart for identifying the appearance of a new vehicle and the disappearance of an old vehicle in time in the vehicle tracking of the present invention.
Fig. 5 is a schematic diagram of a flow statistics process of a vehicle according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
The embodiment of the invention discloses a multi-lane expressway traffic flow statistics method, comprising the following steps:
step (1): and obtaining the video data of the expressway, and performing optimal gray scale processing on the video data to obtain a video frame.
The optimal gray scale processing formula is as follows:
Gray=(R*28+G*61+B*11)/100;
wherein Gray is the gray-scale pixel value; R is the red channel, G is the green channel, and B is the blue channel; 28, 61, and 11 are the percentage weights of the red, green, and blue channels in the optimal gray-scale processing, obtained through repeated experiments.
The optimal gray-scale processing removes over-bright and over-dark pixels from the image, improves the visual quality and clarity of the image, and improves the vehicle detection accuracy; it also makes the image easier for a computer to process, increases the computation speed, reduces the computational cost, and speeds up the method as a whole.
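As an illustration of this step (not part of the claimed method), the weighted conversion above can be sketched as follows; the sketch assumes frames are given as H x W x 3 uint8 arrays in RGB channel order, and the function name and the integer arithmetic are illustrative choices rather than details taken from the patent:

```python
import numpy as np

def optimal_gray(frame_rgb: np.ndarray) -> np.ndarray:
    """Weighted grayscale conversion: Gray = (R*28 + G*61 + B*11) / 100."""
    r = frame_rgb[..., 0].astype(np.uint32)
    g = frame_rgb[..., 1].astype(np.uint32)
    b = frame_rgb[..., 2].astype(np.uint32)
    gray = (r * 28 + g * 61 + b * 11) // 100  # stays within 0..255 for uint8 input
    return gray.astype(np.uint8)
```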
Step (2): take the first frame of the video frames as the background image, and then continuously input video frames for background modeling and updating, so as to accurately obtain a background that matches the actual expressway scene and obtain the background video frame model.
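The patent fixes the initial background (the first frame) but leaves the update rule open; the sketch below uses a simple running-average update purely as an assumed scheme, and the class name and the alpha value are illustrative:

```python
import numpy as np

class BackgroundModel:
    """Background video frame model: initialised from the first gray frame and
    updated as new frames arrive. The running-average rule is an assumption,
    not an update scheme specified by the patent."""

    def __init__(self, first_gray: np.ndarray, alpha: float = 0.05):
        self.bg = first_gray.astype(np.float32)
        self.alpha = alpha  # illustrative update rate

    def update(self, gray: np.ndarray) -> np.ndarray:
        # Blend the current frame into the background so gradual scene changes are absorbed.
        self.bg = (1.0 - self.alpha) * self.bg + self.alpha * gray.astype(np.float32)
        return self.bg.astype(np.uint8)
```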
Apply the background difference method: difference the input video frame with the obtained background video frame model to obtain a background difference image, and perform edge-pixel sharpening on the background difference image by a median filtering method, so that a more complete and clearer foreground moving vehicle target is obtained, over-bright and over-dark spots are removed, and the accuracy and effectiveness of vehicle detection are improved.
The median filtering method has the following formula:
g(x,y) = med{h(x-k, y-t), (k,t) ∈ w};
wherein g(x,y) is the gray value obtained after median filtering; h(x,y) is the original gray value; w is the window template, for which different shapes can be selected, and for expressway videos of different resolutions, different window sizes give different median filtering effects; k and t are the dimensions of the window template; med is the median filtering function.
Binarization processing is then performed on the median-filtered background difference image to segment the moving vehicle target.
The background difference formula is as follows:
L_i(x,y) = |I_i(x,y) - B_i(x,y)|
wherein L_i(x,y) is the resulting background difference image; I_i(x,y) is the current frame image in the video frames; B_i(x,y) is the background image in the background video frame model.
The binarization processing formula is as follows:
T_i(x,y) = 255 if L_i(x,y) > T, and T_i(x,y) = 0 otherwise
wherein T is the binarization threshold; T_i(x,y) is the moving vehicle target and is also the background difference image conforming to the binarization threshold T.
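Putting the background difference, median filtering, and binarization formulas together, the detection of one frame might look like the following sketch; OpenCV is assumed to be available, and the threshold T = 30 and the 5 x 5 median window are illustrative values, since the patent does not fix them:

```python
import cv2
import numpy as np

def segment_moving_vehicles(gray: np.ndarray, background: np.ndarray,
                            T: int = 30, ksize: int = 5) -> np.ndarray:
    """Return a binary mask of moving-vehicle pixels for one gray frame."""
    diff = cv2.absdiff(gray, background)                      # L_i(x,y) = |I_i(x,y) - B_i(x,y)|
    diff = cv2.medianBlur(diff, ksize)                        # median filtering of the difference image
    _, mask = cv2.threshold(diff, T, 255, cv2.THRESH_BINARY)  # T_i(x,y): 255 where the difference exceeds T
    return mask
```

Connected components of the resulting mask then give the bounding boxes whose centroids feed the tracking step.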
Step (3): dynamically track the moving vehicle target by the dynamic centroid distance tracking method. The dynamic centroid distance tracking method is specifically: a minimum distance threshold is preset; the distances between the centroid position of the bounding box of the moving vehicle target in the current frame and the centroid positions of all bounding boxes in the previous frame are calculated; and when the distance satisfies the preset minimum distance threshold (the distance is greater than the preset minimum distance threshold), the bounding box with the minimum distance in the previous frame is associated with the bounding box of the moving vehicle target in the current frame, so that the moving vehicle target is dynamically tracked and the accuracy of distance tracking is improved.
The dynamic centroid distance tracking method further comprises: a frame-number threshold is preset, and when the bounding box of a moving vehicle target cannot be associated within the preset number of adjacent frames (that is, the moving vehicle target does not satisfy the minimum distance threshold in the preset number of adjacent frames), the moving vehicle target is judged to be a newly appearing vehicle or a disappeared vehicle, which improves the accuracy and timeliness of recognizing the appearance of new vehicles and the disappearance of old vehicles during tracking.
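One frame of the dynamic centroid distance tracking can be sketched as below. The track bookkeeping (dictionaries, default thresholds, track identifiers) is illustrative, and the gating condition d <= max_dist follows the usual nearest-neighbour convention as an assumption; the patent only states that the association must conform to the preset minimum distance threshold, without fixing numeric values:

```python
import numpy as np

def track_step(tracks: dict, centroids: list, max_dist: float = 50.0, max_missed: int = 5) -> dict:
    """Associate current-frame bounding-box centroids with existing tracks.

    tracks: {track_id: {"centroid": (x, y), "missed": frames_without_match}}
    centroids: (x, y) centroids of the current frame's bounding boxes.
    max_dist is the assumed distance threshold, max_missed the frame-number threshold.
    """
    unmatched = set(range(len(centroids)))
    for tid, tr in list(tracks.items()):
        if unmatched:
            # nearest current centroid to this track's last known centroid
            j = min(unmatched, key=lambda k: np.hypot(centroids[k][0] - tr["centroid"][0],
                                                      centroids[k][1] - tr["centroid"][1]))
            d = np.hypot(centroids[j][0] - tr["centroid"][0],
                         centroids[j][1] - tr["centroid"][1])
            if d <= max_dist:                       # associate the minimum-distance box
                tr["centroid"], tr["missed"] = centroids[j], 0
                unmatched.discard(j)
                continue
        tr["missed"] += 1                           # no association in this frame
        if tr["missed"] > max_missed:               # old vehicle judged to have disappeared
            del tracks[tid]
    for j in unmatched:                             # unassociated boxes are treated as new vehicles
        tracks[max(tracks, default=-1) + 1] = {"centroid": centroids[j], "missed": 0}
    return tracks
```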
Step (4): perform flow statistics on moving vehicle targets in different directions by the double virtual detection line method, which is specifically: two virtual detection lines perpendicular to the lanes are set in the middle of the video frame, the distance between the two lines being greater than the body length of a moving vehicle target; a pixel value change threshold is preset; when no moving vehicle target is passing over a virtual detection line, its state is set to S = 1; when a moving vehicle target passes over the line and the change in the line's pixel values exceeds the preset pixel value change threshold, the state becomes S = 0; when the moving vehicle target drives off the line, the state returns to S = 1; when the state of a virtual detection line changes from S = 1 to S = 0 and back to S = 1, a passing moving vehicle target is detected and the counter is incremented by one; and the driving direction of the moving vehicle target is identified from the order in which it crosses the two virtual detection lines.
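The double virtual detection line logic can be sketched as a small state machine. How the two per-line state cycles are combined into a single count and direction is not spelled out, so the convention below (count when the second line completes its S = 1 -> 0 -> 1 cycle, read the direction from whichever line was triggered first) is an assumption, as are the line names A and B and the change threshold:

```python
import numpy as np

class DualLineCounter:
    """Traffic counter over two virtual detection lines A and B, where A is
    assumed to be the line met first by traffic moving 'down' the frame."""

    def __init__(self, change_threshold: float = 40.0):
        self.change_threshold = change_threshold  # preset pixel value change threshold
        self.state = {"A": 1, "B": 1}             # S = 1: no vehicle on the line
        self.first = None                         # which line was triggered first
        self.count = {"down": 0, "up": 0}

    def _occupied(self, line_pixels: np.ndarray, bg_pixels: np.ndarray) -> bool:
        # A line is occupied when its pixel values differ from the background
        # by more than the preset change threshold.
        change = np.mean(np.abs(line_pixels.astype(np.int16) - bg_pixels.astype(np.int16)))
        return change > self.change_threshold

    def step(self, pixels: dict, bg: dict) -> None:
        """pixels / bg: {"A": pixel row on line A, "B": pixel row on line B}."""
        for name in ("A", "B"):
            occ = self._occupied(pixels[name], bg[name])
            if self.state[name] == 1 and occ:        # S: 1 -> 0, vehicle enters the line
                self.state[name] = 0
                if self.first is None:
                    self.first = name
            elif self.state[name] == 0 and not occ:  # S: 0 -> 1, vehicle leaves the line
                self.state[name] = 1
                if self.first is not None and name != self.first:
                    # full cycle on the second line: count one vehicle and record its direction
                    self.count["down" if self.first == "A" else "up"] += 1
                    self.first = None
```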
The invention discloses a multi-lane expressway traffic flow statistics method. The acquired expressway video data are subjected to optimal gray-scale processing, and the background difference image obtained through background updating and the background difference method is subjected to edge-pixel sharpening by a median filtering method, which reduces the influence of noise and camera shake on the video data and improves the accuracy of subsequent vehicle identification, vehicle tracking, and traffic flow statistics. To improve the real-time performance of vehicle tracking, a dynamic centroid distance tracking method is adopted to dynamically track the vehicles. A minimum distance threshold is preset, and when the threshold condition is satisfied, the bounding box with the minimum distance is associated and tracked, which improves the accuracy of vehicle tracking. A frame-number threshold is preset, and bounding boxes that cannot be associated within the preset number of adjacent frames are identified as newly appearing vehicles or disappeared vehicles, which improves the timeliness and accuracy of recognizing the appearance of new vehicles and the disappearance of old vehicles during tracking. With the double virtual detection line method, the driving direction of each vehicle is identified from the order in which it crosses the two virtual detection lines while the traffic flow is counted, so that traffic flow statistics in different directions are realized.
In the present specification, the embodiments are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for the identical or similar parts the embodiments may be referred to one another. Since the device disclosed in an embodiment corresponds to the method disclosed in the same embodiment, its description is relatively brief, and for relevant details reference may be made to the description of the method.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (5)

1. A method for highway multi-lane traffic statistics, comprising:
step (1): acquiring expressway video data, and preprocessing the video data to obtain video frames;
step (2): performing background modeling and updating on the video frames to obtain a background video frame model; dividing a moving vehicle target from the video frame by using a background difference method;
step (3): dynamically tracking the moving vehicle target by using a dynamic centroid distance tracking method;
step (4): carrying out flow statistics on the moving vehicle targets in different directions by using a double virtual detection line method;
in the step (3), the dynamic centroid distance tracking method specifically includes: presetting a minimum distance threshold, calculating the distance between the mass center position of the boundary frame of the moving vehicle target of the current frame and the mass center positions of all boundary frames in the previous frame, and under the condition that the distance accords with the preset minimum distance threshold, associating the boundary frame with the minimum distance in the previous frame with the boundary frame of the moving vehicle target of the current frame;
the dynamic centroid distance tracking method further comprises: a frame number threshold is preset, and when the boundary frames of the moving vehicle targets cannot be associated within the adjacent preset frame number threshold, the moving vehicle targets are judged to be the appearance of new vehicles or the disappearance of old vehicles;
in the step (4), the double virtual detection line method specifically includes: setting two virtual detection lines perpendicular to the lane and located in the middle of the video frame, the distance between the two virtual detection lines being greater than the body length of the moving vehicle target; presetting a pixel value change threshold; setting a state S = 1 in which the moving vehicle target does not pass over a virtual detection line; when the moving vehicle target passes over the virtual detection line and the change in the pixel values of the virtual detection line is greater than the preset pixel value change threshold, the state becomes S = 0; when the moving vehicle target drives off the virtual detection line, the state becomes S = 1 again; when the state of the virtual detection line changes from S = 1 to S = 0 to S = 1, a passing moving vehicle target is detected and the counter is incremented by one; and identifying the driving direction of the moving vehicle target according to the order in which the moving vehicle target passes over the double virtual detection lines.
2. The method according to claim 1, wherein in the step (1), the preprocessing is optimal gray scale processing;
the optimal gray scale processing formula is as follows:
Gray=(R*28+G*61+B*11)/100;
wherein Gray is the gray-scale pixel value; R is the red channel, G is the green channel, and B is the blue channel.
3. The method of claim 1, wherein in step (2), the background modeling and updating is specifically: taking the first frame of the video frames as a background image, and then continuously inputting the video frames for background modeling and updating to obtain the background video frame model.
4. The highway multi-lane traffic flow statistics method according to claim 1, wherein in the step (2), the background difference method is specifically: differencing the input video frame with the obtained background video frame model to obtain a background difference image, and performing binarization processing on the background difference image to segment the moving vehicle target;
the background difference method formula is as follows:
L_i(x,y) = |I_i(x,y) - B_i(x,y)|
wherein L_i(x,y) is the resulting background difference image; I_i(x,y) is the current frame image in the video frames; B_i(x,y) is the background image in the background video frame model;
the binarization processing formula is as follows:
T_i(x,y) = 255 if L_i(x,y) > T, and T_i(x,y) = 0 otherwise
wherein T is the binarization threshold; T_i(x,y) is the moving vehicle target and is also the background difference image conforming to the binarization threshold T.
5. The method according to claim 4, wherein after the input video frame is differenced with the obtained background video frame model to obtain the background difference image, the method further comprises: performing edge-pixel sharpening on the background difference image by a median filtering method;
the formula of the median filtering method is as follows:
g(x,y)=med{h(x-k,y-t),(k,t)∈w};
wherein g(x,y) is the gray value obtained after median filtering; h(x,y) is the original gray value; w is the window template, for which different shapes can be selected; k and t are the dimensions of the window template; med is the median filtering function.
CN202210549529.3A, filed 2022-05-20 (priority date 2022-05-20): Highway multi-lane traffic flow statistics method, granted as CN114937358B (Active).

Priority Applications (1)

Application number: CN202210549529.3A (CN114937358B); priority date: 2022-05-20; filing date: 2022-05-20; title: Highway multi-lane traffic flow statistics method

Applications Claiming Priority (1)

Application number: CN202210549529.3A (CN114937358B); priority date: 2022-05-20; filing date: 2022-05-20; title: Highway multi-lane traffic flow statistics method

Publications (2)

Publication Number Publication Date
CN114937358A CN114937358A (en) 2022-08-23
CN114937358B true CN114937358B (en) 2023-04-21

Family

Family ID: 82865476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210549529.3A Active CN114937358B (en) 2022-05-20 2022-05-20 Highway multi-lane traffic flow statistics method

Country Status (1)

Country Link
CN (1) CN114937358B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117994741A (en) * 2024-01-03 2024-05-07 广东智视云控科技有限公司 Vehicle speed detection method, system and storage medium based on video monitoring

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104599502A (en) * 2015-02-13 2015-05-06 重庆邮电大学 Method for traffic flow statistics based on video monitoring
CN110136453A (en) * 2019-06-14 2019-08-16 内蒙古工业大学 Traffic flow detecting method based on the part LK difference optical flow method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100533482C (en) * 1999-11-03 2009-08-26 特许科技有限公司 Image processing techniques for a video based traffic monitoring system and methods therefor
CN101510358B (en) * 2009-03-20 2010-08-25 吉林大学 Method and apparatus for processing real time statistical vehicle flowrate using video image
CN104183142B (en) * 2014-08-18 2017-03-15 安徽科力信息产业有限责任公司 A kind of statistical method of traffic flow based on image vision treatment technology
CN105427626B (en) * 2015-12-19 2018-03-02 长安大学 A kind of statistical method of traffic flow based on video analysis
CN109102702A (en) * 2018-08-24 2018-12-28 南京理工大学 Vehicle speed measuring method based on video encoder server and Radar Signal Fusion
CN110310494A (en) * 2019-05-21 2019-10-08 同济大学 A kind of DETECTION OF TRAFFIC PARAMETERS method and system based on video image
CN111951178B (en) * 2020-07-07 2024-04-30 中国人民解放军93114部队 Image processing method and device for remarkably improving image quality and electronic equipment

Also Published As

Publication number Publication date
CN114937358A 2022-08-23


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant