CN111833598B - Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway - Google Patents


Info

Publication number
CN111833598B
CN111833598B (application CN202010406068.5A)
Authority
CN
China
Prior art keywords
target vehicle
vehicle
target
speed
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010406068.5A
Other languages
Chinese (zh)
Other versions
CN111833598A (en)
Inventor
刘海青
王胜利
秦媚
黄滕
孙昱平
侯国阳
薛婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University of Science and Technology filed Critical Shandong University of Science and Technology
Priority to CN202010406068.5A priority Critical patent/CN111833598B/en
Publication of CN111833598A publication Critical patent/CN111833598A/en
Application granted granted Critical
Publication of CN111833598B publication Critical patent/CN111833598B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/10Simultaneous control of position or course in three dimensions
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/012Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Abstract

The invention discloses an automatic traffic incident monitoring method and system using unmanned aerial vehicles on a highway. The method comprises the following steps: acquiring real-time video data collected while the unmanned aerial vehicle automatically cruises over the current road section; performing video image processing on the real-time video data to obtain a video image sequence; extracting a target vehicle from the video image sequence with a dynamic target extraction algorithm to obtain the category information and contour information of the target vehicle; calculating the position of the target vehicle in the next frame with a target tracking algorithm, based on the category and contour information of the target vehicle in the current frame, to obtain the contour information of the same target vehicle; calculating the actual distance traveled by the target vehicle between adjacent frames from the contour information of the same target vehicle in two adjacent video frames; calculating the speed of the target vehicle from that distance; and judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle. The invention expands the event detection range and makes the types of detectable events more comprehensive.

Description

Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway
Technical Field
The invention relates to the technical field of road traffic state detection and management, in particular to an unmanned aerial vehicle traffic event automatic monitoring method and system for an expressway.
Background
With rapid socioeconomic growth and ever-increasing traffic volume, highway construction has developed quickly. By the end of 2018, the total operating mileage of expressways in China had reached 140,000 kilometers, ranking first in the world. Expressways improve transport efficiency between cities, but they also bring serious safety problems. According to statistics, about 200,000 traffic accidents occur in China every year, causing huge losses of life and property. Among accidents on expressways, those caused by unlawful driving behavior such as speeding and illegal stopping account for about 25% of the total. In particular, extremely dangerous driving behavior such as reversing or driving against traffic occurs at specific locations such as ramps, for example after a driver misses an exit.
Existing methods for automatically monitoring traffic events on expressways rely mainly on video equipment, such as roadside fixed-point speed-measuring devices and high-point video surveillance equipment. Fixed-point speed-measuring equipment can automatically monitor only a few event types, such as overspeed and low speed, and its coverage is small. High-point video surveillance equipment can capture comprehensive traffic information, but it is operated by manual inspection, consumes a large amount of manpower, and cannot realize all-weather, fully automatic event monitoring.
Therefore, how to solve the problems of the conventional highway event monitoring method, such as small monitoring range, few event monitoring types, high labor consumption, incapability of realizing all-weather full-automatic event detection, and the like, is a technical problem to be solved in the field.
Disclosure of Invention
The invention aims to provide an automatic traffic incident monitoring method and system for unmanned aerial vehicles on highways, which are used for improving the incident detection range, enabling the incident detection types to be more comprehensive, reducing the manpower consumption and realizing all-weather automatic detection of the highway incidents.
In order to achieve the purpose, the invention provides an automatic traffic incident monitoring method for unmanned aerial vehicles on a highway, which comprises the following steps:
acquiring real-time video data acquired when the unmanned aerial vehicle automatically cruises at the current road section;
performing video image processing on the real-time video data to obtain a video image sequence;
extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle;
calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures;
calculating the actual distance between adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures;
calculating the speed of the target vehicle according to the actual distance between the adjacent frames of the target vehicle;
judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, illegal stopping, wrong-way driving, and frequent lane changing.
Optionally, after determining the driving behavior of the target vehicle according to the vehicle speed, the method further includes:
and sending the real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle to a data display terminal for display.
Optionally, extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain category information and contour information of the target vehicle, and specifically including:
extracting a target vehicle in the video image sequence by using a trained target detection framework to obtain the category information and the rectangular outline of the target vehicle; the rectangular outline comprises the length l and the width h of the circumscribed rectangle of the target vehicle and one corner point coordinate (x, y);
calculating the area S and the aspect ratio μ of the target vehicle, taking the length and width of the rectangular outline of the vehicle target as the actual length and width of the target vehicle:
S = l × h (1)
μ = l / h (2)
calculating the centroid coordinates of the target vehicle, taking the center of the rectangular outline of the target vehicle as its centroid:
x₀ = x + l / 2 (3)
y₀ = y + h / 2 (4)
and establishing a vehicle sample feature library according to the scene in the real-time video data and the category information of the target vehicle, and updating the trained target detection framework by using the data in the vehicle sample feature library and a deep learning algorithm.
Optionally, the method includes calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures, and specifically includes:
establishing a vehicle model for the region of the target vehicle of the current frame;
predicting the centroid coordinates of the target vehicle in the next frame from the centroid coordinates of the target vehicle in the current frame: P′_{t+1}(x_{t+1}, y_{t+1}) = f(Δx, Δy)·P_t(x_t, y_t) + w; wherein P_t(x_t, y_t) is the centroid coordinate of the target vehicle in the current frame, P′_{t+1}(x_{t+1}, y_{t+1}) is the predicted centroid coordinate of the target vehicle in the next frame, f(Δx, Δy) is the state transfer function, and w is the system process noise;
extracting the actual centroid coordinates of the next frame of the target vehicle by using a dynamic target extraction algorithm;
calculating a Euclidean distance between an actual centroid coordinate of the target vehicle and a predicted centroid coordinate of the target vehicle;
searching the image area with the minimum Euclidean distance by using the vehicle model;
and matching the moving vehicle target in the image area with the minimum Euclidean distance, determining that the next frame of target vehicle and the current frame of target vehicle are the same vehicle, and obtaining the contour information of the next frame of target vehicle.
Optionally, the moving vehicle target matching is performed in the image region with the minimum euclidean distance, and it is determined that the next frame target vehicle and the current frame target vehicle are the same vehicle, specifically including:
according to the area S_i and aspect ratio μ_i of the rectangular outline of the i-th target vehicle in the current frame, and the area S_j and aspect ratio μ_j of the rectangular outline of the j-th target vehicle in the next frame, judging whether an overlapping area S_ij exists;
when an overlapping area exists and the conditions
S_ij > K′
μ_i < K″
μ_j < K″
are satisfied, the i-th target vehicle of the current frame and the j-th target vehicle of the next frame are considered to be the same target vehicle between two adjacent frames; wherein K′ is the overlap area threshold and K″ is the aspect ratio threshold.
Optionally, the actual distance between adjacent frames of the target vehicle is calculated according to the contour information of the same target vehicle in the two adjacent frames of the video pictures; the method specifically comprises the following steps:
establishing a camera coordinate system and an image plane coordinate system;
realizing the conversion between image coordinates and camera coordinates according to the perspective projection principle; the projection formula is:
x = f·X_c / Z_c,  y = f·Y_c / Z_c
where (X_c, Y_c, Z_c) are the camera coordinates, (x, y) are the image plane coordinates, namely the coordinates of any point of the target vehicle, and f is the focal length of the camera;
realizing the conversion between the camera coordinates and the world coordinates according to the rigid-body transformation formula:
[X_c, Y_c, Z_c]ᵀ = R·[X_w, Y_w, Z_w]ᵀ + T = L_W·[X_w, Y_w, Z_w, 1]ᵀ
where R is the rotation matrix, T is the translation matrix, and L_W represents the matrix formed by combining the rotation matrix and the translation matrix;
and calculating the actual distance of the adjacent frames of the same target vehicle according to the world coordinates of the same target vehicle in the adjacent frames.
Optionally, calculating the vehicle speed of the target vehicle according to the actual distance between adjacent frames of the target vehicle specifically includes:
decomposing the actual distance into horizontal and vertical components, and calculating the horizontal velocity v_x and the vertical velocity v_y respectively;
calculating the movement speed v_r of the target vehicle between adjacent frames from v_x and v_y using the formula
v_r = √(v_x² + v_y²)
performing background compensation on the movement speed according to the flight speed of the unmanned aerial vehicle to obtain the absolute speed of the target vehicle: v_α = v_r + v_e, where v_α is the absolute speed of the vehicle, v_r is the movement speed of the vehicle, and v_e is the flight speed of the unmanned aerial vehicle;
and taking the absolute speed of the target vehicle as the vehicle speed of the target vehicle.
Optionally, the determining the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle specifically includes:
when the vehicle speed exceeds a highest speed limit threshold value, determining that the driving behavior of the target vehicle is overspeed;
when the vehicle speed is lower than the lowest speed limit threshold value, determining that the running behavior of the target vehicle is low speed;
when the vehicle speed is zero, determining that the driving behavior of the target vehicle is illegal stopping;
when the vehicle speed is a negative value, determining that the driving behavior of the target vehicle is wrong-way driving;
the method for judging whether the driving behavior of the target vehicle is frequent lane changing specifically comprises the following steps:
sampling for multiple times and recording the position points of the target vehicle;
performing linear fitting on the position points to obtain a reference straight line;
calculating the distances from the position points to the reference straight line and taking an average value r;
when r is greater than the threshold value r_K, determining that the driving behavior of the target vehicle is frequent lane changing.
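As an illustration, the lane-change test above can be sketched in Python with an ordinary least-squares line fit; the sampling interval and the threshold r_K are deployment parameters that the patent does not fix:

```python
def frequent_lane_change(points, r_k):
    """Fit a reference straight line y = a*x + b to the sampled
    position points by least squares, average the perpendicular
    distances of the points to that line, and flag frequent lane
    changing when the mean distance r exceeds the threshold r_k."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    a = sxy / sxx              # slope of the fitted reference line
    b = my - a * mx            # intercept
    # perpendicular distance from each point (x, y) to y = a*x + b
    r = sum(abs(a * p[0] - p[1] + b) / (a * a + 1) ** 0.5
            for p in points) / n
    return r > r_k
```

A vehicle weaving across lanes produces points that scatter around the fitted line, so r grows; a vehicle holding its lane keeps r near zero.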
The invention also provides an automatic traffic incident monitoring system for the unmanned aerial vehicle on the highway, which comprises the following components: the system comprises a data acquisition and communication unit, a data analysis and event judgment unit and a background management unit;
the data acquisition and communication unit comprises an unmanned aerial vehicle and an unmanned aerial vehicle base station, the unmanned aerial vehicle is in communication connection with the unmanned aerial vehicle base station, and a camera is arranged on the unmanned aerial vehicle and is used for acquiring real-time video data of the unmanned aerial vehicle cruising at the current road section;
the data analysis and event discrimination unit comprises a processor arranged on the road side, the processor is in communication connection with the unmanned aerial vehicle base station, and the processor is used for:
performing video image processing on the real-time video data to obtain a video image sequence;
extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle;
calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures;
calculating the actual distance between the adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures;
calculating the speed of the target vehicle according to the actual distance between the adjacent frames of the target vehicle;
judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, illegal stopping, wrong-way driving, and frequent lane changing;
the background management unit is respectively connected with the unmanned aerial vehicle base station and the processor, and is used for receiving and displaying real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle.
Optionally, the unmanned aerial vehicle base station is equipped with a weather monitoring device to monitor the weather conditions of the road section in real time, determine whether to dispatch the unmanned aerial vehicle for an inspection flight, and provide data support for weather condition statistics;
the background management unit comprises a data display platform, an equipment management platform, a road section management platform and a law enforcement management platform;
the data display platform dynamically displays a system profile, real-time flight positions of the unmanned aerial vehicles, working states of the unmanned aerial vehicles, base station distribution of the unmanned aerial vehicles, real-time traffic event reporting conditions, traffic event statistical conditions, traffic flow states and video monitoring states by using big data visualization and GIS application technology;
the road section management platform is used for adding, deleting, modifying and querying monitored road sections, for planning, managing and monitoring the cruising route of the unmanned aerial vehicle in real time, and for providing data support for optimizing the cruising route of the unmanned aerial vehicle;
the equipment management platform is used for carrying out online management on the unmanned aerial vehicle, unmanned aerial vehicle base station equipment, communication equipment, processing unit equipment and network equipment, and comprises equipment basic information maintenance, equipment fault alarm and personnel authority management;
the law enforcement management platform is used for auditing and enforcing the traffic abnormal events collected on site.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the highway event detection method and system based on unmanned aerial vehicle cruising on the highway provided by the invention have the advantages of higher flexibility and more comprehensive and accurate detection event types. The monitoring of the coverage of the whole road section can be realized, the labor cost is greatly reduced, and the monitoring efficiency is improved. The method has a promoting effect on the construction of intelligent traffic and intelligent cities.
And through unmanned aerial vehicle's cruise, cause certain frightening effect to driver psychological aspect. Meanwhile, law enforcement personnel call nearby unmanned aerial vehicles according to the circumstances, can learn the scene condition of accident fast, all have very big help to accident personnel's rescue, the evidence collection of scene of accident etc..
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
Fig. 1 is a flowchart of an automatic traffic event monitoring method for an unmanned aerial vehicle on a highway according to an embodiment of the present invention;
fig. 2 is a structural diagram of an automatic traffic event monitoring system for unmanned aerial vehicles on a highway provided by an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention aims to provide an automatic traffic incident monitoring method and system for unmanned aerial vehicles on highways, which are used for improving the incident detection range, enabling the incident detection types to be more comprehensive, reducing the manpower consumption and realizing all-weather automatic detection of the highway incidents.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
As shown in fig. 1, the method for automatically monitoring traffic events of unmanned aerial vehicles on highways according to the embodiment includes:
step 101: and acquiring real-time video data acquired when the unmanned aerial vehicle automatically cruises at the current road section.
In this embodiment, the unmanned aerial vehicle patrols a specific road section and collects real-time video of vehicle operation on the highway, so that the basic data are more timely and the detection range is wide, laying a foundation for comprehensively monitoring abnormal traffic events.
Step 102: and carrying out video image processing on the real-time video data to obtain a video image sequence.
The video image sequence may be stored in the form of a data set, which makes subsequent retrieval and processing of the video images more convenient.
Step 103: and extracting the target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle.
The step 103 specifically includes:
extracting a target vehicle in the video image sequence by using a trained target detection frame to obtain the category information and the rectangular outline of the target vehicle; the rectangular outline comprises the length l and the width h of a circumscribed rectangle of the target vehicle and one corner point coordinate (x, y);
calculating the area S and the aspect ratio μ of the target vehicle, taking the length and width of the rectangular outline of the vehicle target as the actual length and width of the target vehicle:
S = l × h (1)
μ = l / h (2)
calculating the centroid coordinates of the target vehicle, taking the center of the rectangular outline of the target vehicle as its centroid:
x₀ = x + l / 2 (3)
y₀ = y + h / 2 (4)
and establishing a vehicle sample feature library according to the scene in the real-time video data and the category information of the target vehicle, and updating the trained target detection framework by using the data in the vehicle sample feature library and a deep learning algorithm.
By establishing vehicle sample feature libraries for different scenes and vehicle types, the method realizes rapid extraction of vehicle targets based on deep learning, and the extracted targets are accurate and consistent with the actual situation.
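Equations (1)-(4) amount to a few arithmetic operations on the detected bounding rectangle. A minimal Python sketch, assuming (x, y) is the top-left corner of the circumscribed rectangle:

```python
def box_features(x, y, l, h):
    """Compute the features of a detected vehicle's circumscribed
    rectangle, following equations (1)-(4): area S, aspect ratio mu,
    and the centroid taken as the rectangle center.

    (x, y) : corner point coordinate of the rectangle (assumed top-left)
    l, h   : length and width of the rectangle in pixels
    """
    S = l * h               # equation (1): area
    mu = l / h              # equation (2): aspect ratio
    x0 = x + l / 2.0        # equation (3): centroid x
    y0 = y + h / 2.0        # equation (4): centroid y
    return S, mu, (x0, y0)
```

These per-box features are exactly what the later tracking and matching steps consume.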
It should be noted that, in the present embodiment, a deep learning-based method is adopted for target extraction, and other dynamic target extraction algorithms are also applicable to the present invention as long as the target vehicle extraction can be achieved.
Step 104: and calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures.
In the embodiment, a target model is established for a moving vehicle in a video image at a certain moment, the position where the moving vehicle appears at the next moment is predicted, the area range of target matching is narrowed, and then target accurate matching is performed. The method specifically comprises the following steps:
establishing a vehicle model for the region of the target vehicle of the current frame;
predicting the centroid coordinates of the target vehicle in the next frame from the centroid coordinates of the target vehicle in the current frame: P′_{t+1}(x_{t+1}, y_{t+1}) = f(Δx, Δy)·P_t(x_t, y_t) + w; wherein P_t(x_t, y_t) is the centroid coordinate of the target vehicle in the current frame, P′_{t+1}(x_{t+1}, y_{t+1}) is the predicted centroid coordinate of the target vehicle in the next frame, f(Δx, Δy) is the state transfer function, and w is the system process noise;
extracting the actual centroid coordinates of the next frame of the target vehicle by using a dynamic target extraction algorithm;
calculating a Euclidean distance between an actual centroid coordinate of the target vehicle and a predicted centroid coordinate of the target vehicle;
searching the image area with the minimum Euclidean distance by using the vehicle model;
and matching the moving vehicle target in the image area with the minimum Euclidean distance, determining that the next frame of target vehicle and the current frame of target vehicle are the same vehicle, and obtaining the contour information of the next frame of target vehicle.
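A minimal Python sketch of the prediction-and-matching step above. The state transfer function f(Δx, Δy) is modeled here as a simple constant-displacement motion model; that choice, like the zero default for the noise term w, is an assumption of this sketch rather than the patent's exact formulation:

```python
import math

def predict_centroid(p_t, delta, w=(0.0, 0.0)):
    """Predict the next-frame centroid P'_{t+1} from the current
    centroid P_t, using a constant-displacement model for the
    state transfer function plus the process-noise term w."""
    return (p_t[0] + delta[0] + w[0], p_t[1] + delta[1] + w[1])

def nearest_detection(predicted, detections):
    """Pick the detected centroid with the minimum Euclidean
    distance to the prediction, narrowing the image area in which
    precise target matching is then performed."""
    return min(detections,
               key=lambda d: math.hypot(d[0] - predicted[0],
                                        d[1] - predicted[1]))
```

Restricting the precise matching to the nearest-prediction region is what keeps per-frame tracking cheap compared with searching the whole image.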
In this embodiment, the method for matching the moving vehicle target in the image region with the minimum euclidean distance may be:
according to the area S of the rectangular outline of the ith target vehicle of the current frameiAnd aspect ratio μiAnd the area S of the rectangular outline of the jth target vehicle in the next framejAnd aspect ratio μjJudging whether an overlapping area S existsij
When there is an overlapping area, and the condition is satisfied:
Sij>K'
ui<K″
uj<K″
the ith target vehicle of the current frame and the jth target vehicle of the next frame are considered to be the same target vehicle between two adjacent frames; wherein K' is an overlap area threshold; k "is the aspect ratio threshold.
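The overlap test can be sketched as follows; the thresholds K′ and K″ are tuning parameters, and the boxes are assumed to be axis-aligned rectangles given as (x, y, l, h) with (x, y) the top-left corner:

```python
def same_vehicle(box_i, box_j, k_overlap, k_ratio):
    """Decide whether two boxes from consecutive frames are the
    same vehicle: their overlap area S_ij must exceed K' and both
    aspect ratios must stay below K''."""
    xi, yi, li, hi = box_i
    xj, yj, lj, hj = box_j
    # overlap rectangle of the two axis-aligned boxes
    ox = max(0.0, min(xi + li, xj + lj) - max(xi, xj))
    oy = max(0.0, min(yi + hi, yj + hj) - max(yi, yj))
    s_ij = ox * oy
    return s_ij > k_overlap and li / hi < k_ratio and lj / hj < k_ratio
```

The aspect-ratio bound rejects matches where the detector's box has been badly deformed (e.g. by partial occlusion) even when the boxes overlap.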
It should be noted that, besides the target tracking algorithm adopted in the embodiment, other methods that can realize vehicle target tracking can also be applied to the present invention.
Step 105: and calculating the actual distance between the adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures.
The relevant data obtained in the above steps, such as coordinates, length, width, and area, are all image data processed based on the video image, and the image data is not real geographic scene data, so in order to calculate the running speed of the target vehicle in the actual road more accurately, step 105 may specifically include:
establishing a camera coordinate system and an image plane coordinate system;
realizing the conversion between image coordinates and camera coordinates according to the perspective projection principle; the projection formula is:
x = f·X_c / Z_c,  y = f·Y_c / Z_c
where (X_c, Y_c, Z_c) are the camera coordinates, (x, y) are the image plane coordinates, namely the coordinates of any point of the target vehicle, and f is the focal length of the camera;
realizing the conversion between the camera coordinates and the world coordinates according to the rigid-body transformation formula:
[X_c, Y_c, Z_c]ᵀ = R·[X_w, Y_w, Z_w]ᵀ + T = L_W·[X_w, Y_w, Z_w, 1]ᵀ
where R is the rotation matrix, T is the translation matrix, and L_W represents the matrix formed by combining the rotation matrix and the translation matrix;
and calculating the actual distance of the adjacent frames of the same target vehicle according to the world coordinates of the same target vehicle of the adjacent frames.
Through the above coordinate conversion, the image data can be mapped to real geographic-scene data, so that the calculated actual distance travelled by the same vehicle between adjacent frames (or adjacent moments) better reflects reality and provides a more accurate and effective data basis for the subsequent judgment of the driving behavior of the vehicle.
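As an illustration of the two conversions (perspective projection, then rigid-body transformation), the following sketch back-projects an image point to world coordinates. It assumes a pinhole camera and a known depth `z_c` (e.g. the UAV flight height for a near-nadir camera); all function names and the calling convention are hypothetical:

```python
import numpy as np

def image_to_camera(u, v, f, z_c):
    """Back-project an image-plane point (u, v) to camera coordinates
    using the pinhole model x = f*X/Z, y = f*Y/Z; the depth z_c must
    be supplied externally (e.g. UAV flight height)."""
    return np.array([u * z_c / f, v * z_c / f, z_c])

def camera_to_world(p_cam, R, T):
    """Invert the rigid-body transform P_cam = R @ P_world + T."""
    return R.T @ (p_cam - T)

def actual_distance(pt1_img, pt2_img, f, z_c, R, T):
    """World-space distance between the same vehicle point observed
    in two adjacent frames."""
    w1 = camera_to_world(image_to_camera(*pt1_img, f, z_c), R, T)
    w2 = camera_to_world(image_to_camera(*pt2_img, f, z_c), R, T)
    return float(np.linalg.norm(w1 - w2))
```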
Step 106: calculating the speed of the target vehicle according to the actual distance between adjacent frames of the target vehicle.
Specifically, the actual distance is first decomposed along the horizontal and vertical directions, and the horizontal velocity v_x and the vertical velocity v_y are calculated respectively.
According to the horizontal velocity v_x and the vertical velocity v_y, the moving speed v_r of the target vehicle between adjacent frames is calculated by using the formula

v_r = sqrt(v_x^2 + v_y^2)

Background compensation is then carried out on the moving speed according to the flight speed of the unmanned aerial vehicle to obtain the absolute speed of the target vehicle: v_α = v_r + v_e, wherein v_α is the absolute speed of the vehicle, v_r is the moving speed of the vehicle, and v_e is the flight speed of the unmanned aerial vehicle.
The absolute speed of the target vehicle is taken as the vehicle speed of the target vehicle.
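A minimal sketch of the speed calculation above, following the text's scalar compensation formula v_α = v_r + v_e (a vector sum would be more rigorous in general; the function name and arguments are assumptions):

```python
import math

def vehicle_speed(dx, dy, frame_dt, uav_speed):
    """Speed of the target vehicle from the world-space displacement
    (dx, dy) between two frames captured frame_dt seconds apart.
    uav_speed is the UAV flight speed, used as the background
    compensation term in v_a = v_r + v_e from the text."""
    v_x = dx / frame_dt            # horizontal velocity component
    v_y = dy / frame_dt            # vertical velocity component
    v_r = math.hypot(v_x, v_y)     # moving speed between frames
    return v_r + uav_speed         # absolute speed after compensation
```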
Step 107: judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, violation, retrograde motion, and frequent lane change.
Specifically, the step 107 specifically includes:
when the vehicle speed exceeds a highest speed limit threshold value, determining that the driving behavior of the target vehicle is overspeed;
when the vehicle speed is lower than the lowest speed limit threshold value, determining that the driving behavior of the target vehicle is low speed;
when the vehicle speed is zero, determining that the driving behavior of the target vehicle is illegal parking (violation);
when the vehicle speed is a negative value, determining that the driving behavior of the target vehicle is retrograde (wrong-way driving);
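The speed-threshold rules above can be illustrated as one classifier (hypothetical names; negative speed is checked before zero and before the low-speed test so the rules do not shadow each other):

```python
def classify_behavior(speed, v_max, v_min):
    """Threshold rules from the text: overspeed, retrograde
    (negative speed), illegal parking (zero speed) and low speed."""
    if speed > v_max:
        return "overspeed"
    if speed < 0:
        return "retrograde"
    if speed == 0:
        return "violation"   # illegal parking
    if speed < v_min:
        return "low speed"
    return "normal"
```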
the method for judging whether the driving behavior of the target vehicle is frequent lane changing specifically comprises the following steps:
sampling for multiple times and recording the position points of the target vehicle;
performing linear fitting on the position points to obtain a reference straight line;
calculating the distances from the position points to the reference straight line and taking an average value r;
when r is greater than the threshold value r_K, determining that the driving behavior of the target vehicle is frequent lane changing.
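The lane-change test above (sample positions, fit a reference line, threshold the mean point-to-line distance r) might be sketched as follows (hypothetical names; assumes the fitted reference line is not vertical in the chosen coordinates):

```python
import numpy as np

def frequent_lane_change(points, r_k):
    """Flag frequent lane changing per the text: least-squares fit a
    reference line y = a*x + b to the sampled position points, compute
    the mean perpendicular distance r to that line, and compare it
    with the threshold r_K."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    a, b = np.polyfit(x, y, 1)                       # reference straight line
    r = np.mean(np.abs(a * x - y + b) / np.sqrt(a * a + 1.0))
    return float(r) > r_k
```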
Therefore, compared with the event types that can be judged in the prior art, the present invention realizes the judgment of abnormal driving behaviors such as vehicle overspeed, low speed, wrong-way driving, illegal parking and frequent lane changing, as well as abnormal traffic events such as foreign matter on the road, so the monitored event types are more comprehensive.
Step 108: sending the real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle to a data display terminal for display.
As shown in fig. 2, the embodiment further provides a system corresponding to the above method for automatically monitoring a traffic event of a highway unmanned aerial vehicle, and the system includes a data acquisition and communication unit 201, a data analysis and event discrimination unit 202, and a background management unit 203.
The data acquisition and communication unit 201 includes an unmanned aerial vehicle and an unmanned aerial vehicle base station in communication connection with each other; a camera is arranged on the unmanned aerial vehicle and is used for acquiring real-time video data while the unmanned aerial vehicle cruises the current highway section.
In this embodiment, the unmanned aerial vehicle acquires real-time video data of the current road segment through automatic cruise and transmits it to the unmanned aerial vehicle base station through a wireless communication module; the base station then forwards the data to the data analysis and event judgment unit 202 through a network communication module.
As an optional implementation, the unmanned aerial vehicle base station can also be provided with weather monitoring equipment to monitor the weather conditions of the road section in real time, judge whether to execute an unmanned aerial vehicle out-of-cabin inspection event, and provide data support for weather condition statistics.
The data analysis and event discrimination unit 202 includes a processor disposed at the roadside, the processor is in communication connection with the unmanned aerial vehicle base station, and the processor is configured to:
performing video image processing on the real-time video data to obtain a video image sequence;
extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle;
calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures;
calculating the actual distance between the adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures;
calculating the speed of the target vehicle according to the actual distance between the adjacent frames of the target vehicle;
judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, violation, retrograde motion, and frequent lane change.
The processor can be an industrial personal computer, a server, an embedded processor or other computing equipment meeting the data processing performance requirements.
The traffic event automatic monitoring method for the unmanned aerial vehicle on the highway provided by the embodiment is executed in the processor, and is not described herein again.
The background management unit 203 is respectively connected with the unmanned aerial vehicle base station and the processor, and is used for receiving and displaying real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle.
The background management unit is connected with the processor on the highway site through a communication network, receives the acquired data and the event judgment results from the road site, and sends control instruction information to the road site. The background management unit mainly provides means for data display, system maintenance and law enforcement management for the high-speed traffic police, and comprises a data display platform, an equipment management platform, a road section management platform and a law enforcement management platform.
The data display platform dynamically displays the system profile, the real-time flight positions of the unmanned aerial vehicles, the working states of the unmanned aerial vehicles, the distribution of the unmanned aerial vehicle base stations, real-time traffic event reporting conditions, traffic event statistics, traffic flow states and video monitoring states by using big data visualization and GIS application technologies, and provides a basis for traffic control.
The road section management platform is used for carrying out addition, deletion, modification and check on monitored road sections, planning, managing and monitoring a cruising route of the unmanned aerial vehicle in real time and providing data support for optimizing the cruising route for the unmanned aerial vehicle;
The equipment management platform is used for the online management of the unmanned aerial vehicles, the unmanned aerial vehicle base station equipment, the communication equipment, the processing unit equipment and the network equipment, including equipment basic information maintenance, equipment fault alarm and personnel authority management. Meanwhile, law enforcement personnel can call a nearby unmanned aerial vehicle according to the accident situation to rapidly learn the situation of the accident scene.
The law enforcement management platform is used for auditing and enforcing the abnormal traffic events collected on site. According to the video image information of the vehicle target, manual auditing is carried out to determine the illegal vehicle, the violation type, the time and the punishment standard, so that law enforcement accuracy is ensured, and the related information is transmitted to the traffic control platform for law enforcement.
For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, the specific embodiments and the application range may be changed according to the idea of the present invention. In view of the above, the content of this specification should not be construed as limiting the present invention.

Claims (6)

1. A highway unmanned aerial vehicle traffic incident automatic monitoring method is characterized by comprising the following steps:
acquiring real-time video data acquired when the unmanned aerial vehicle automatically cruises at the current road section;
performing video image processing on the real-time video data to obtain a video image sequence;
extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle;
calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures;
calculating the actual distance between the adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures;
calculating the speed of the target vehicle according to the actual distance between the adjacent frames of the target vehicle;
judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, violation, retrograde motion and frequent lane change;
the extracting of the target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle specifically comprises the following steps:
extracting a target vehicle in the video image sequence by using a trained target detection framework to obtain the category information and the rectangular outline of the target vehicle; the rectangular outline comprises the length l and the width h of the circumscribed rectangle of the target vehicle and a corner-point coordinate (x, y);
calculating the area S and the aspect ratio μ of the target vehicle by taking the length and the width of the rectangular outline of the vehicle target as the actual length and width of the target vehicle:
S = l × h (1)
μ = l / h (2)
calculating the centroid coordinates of the target vehicle by taking the center of the rectangular outline of the target vehicle as the centroid of the target vehicle:
x_c = x + l/2 (3)
y_c = y + h/2 (4)
establishing a vehicle sample feature library according to scenes in the real-time video data and the category information of the target vehicle, and updating a trained target detection framework by using data in the vehicle sample feature library through a deep learning algorithm;
calculating the position of the next frame of target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures, and specifically comprises the following steps:
establishing a vehicle model for the region of the target vehicle of the current frame;
predicting the centroid coordinates of the target vehicle in the next frame according to the centroid coordinates of the target vehicle in the current frame: P'_{t+1}(x_{t+1}, y_{t+1}) = f(Δx, Δy)·P_t(x_t, y_t) + w; wherein P_t(x_t, y_t) denotes the centroid coordinates of the target vehicle in the current frame, P'_{t+1}(x_{t+1}, y_{t+1}) denotes the predicted centroid coordinates of the target vehicle in the next frame, f(Δx, Δy) is the state transfer function, and w is the noise of the system process;
extracting the actual centroid coordinates of the next frame of the target vehicle by using a dynamic target extraction algorithm;
calculating the Euclidean distance between the actual centroid coordinate of the target vehicle and the predicted centroid coordinate of the target vehicle;
searching the image area with the minimum Euclidean distance by using the vehicle model;
matching moving vehicle targets in the image area with the minimum Euclidean distance, determining that the next frame of target vehicle and the current frame of target vehicle are the same vehicle, and obtaining the contour information of the next frame of target vehicle;
performing moving vehicle target matching in the image area with the minimum Euclidean distance, and determining that the next frame target vehicle and the current frame target vehicle are the same vehicle, specifically comprising:
according to the area S_i and aspect ratio μ_i of the rectangular outline of the i-th target vehicle in the current frame, and the area S_j and aspect ratio μ_j of the rectangular outline of the j-th target vehicle in the next frame, judging whether an overlapping area S_ij exists;
when there is an overlapping area and the conditions
S_ij > K'
μ_i < K''
μ_j < K''
are satisfied, considering the i-th target vehicle of the current frame and the j-th target vehicle of the next frame to be the same target vehicle between two adjacent frames; wherein K' is the overlap area threshold and K'' is the aspect ratio threshold;
calculating the speed of the target vehicle according to the actual distance between adjacent frames of the target vehicle specifically comprises the following steps:
decomposing the actual distance along the horizontal and vertical directions, and calculating the horizontal velocity v_x and the vertical velocity v_y respectively;
according to the horizontal velocity v_x and the vertical velocity v_y, calculating the moving speed v_r of the target vehicle between adjacent frames by using the formula
v_r = sqrt(v_x^2 + v_y^2)
carrying out background compensation on the moving speed according to the flight speed of the unmanned aerial vehicle to obtain the absolute speed of the target vehicle: v_α = v_r + v_e, wherein v_α is the absolute speed of the vehicle, v_r is the moving speed of the vehicle, and v_e is the flight speed of the unmanned aerial vehicle;
taking the absolute speed of the target vehicle as the vehicle speed of the target vehicle;
the method for judging whether the driving behavior of the target vehicle is frequent lane changing specifically comprises the following steps:
sampling for multiple times and recording the position points of the target vehicle;
performing linear fitting on the position points to obtain a reference straight line;
calculating the distances from the position points to the reference straight line and taking an average value r;
when r is greater than the threshold value r_K, determining that the driving behavior of the target vehicle is frequent lane changing.
2. The method according to claim 1, further comprising, after said determining the driving behavior of the target vehicle based on the vehicle speed:
and sending the real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle to a data display terminal for display.
3. The method according to claim 1, wherein calculating the actual distance between adjacent frames of the target vehicle according to the contour information of the same target vehicle in two adjacent frames of video pictures specifically comprises:
establishing a camera coordinate system and an image plane coordinate system;
realizing the conversion between image coordinates and camera coordinates according to the perspective projection principle; the image conversion formula is:
x = f·X_c/Z_c,  y = f·Y_c/Z_c
wherein (X_c, Y_c, Z_c) are the camera coordinates; (x, y) are the image-plane coordinates, namely the coordinates of any point of the target vehicle; and f is the focal length of the camera;
the conversion between camera coordinates and world coordinates is realized according to a rigid-body transformation formula:
[X_c, Y_c, Z_c]^T = R·[X_w, Y_w, Z_w]^T + T = L_W·[X_w, Y_w, Z_w, 1]^T
wherein R is the rotation matrix, T is the translation matrix, and L_W represents the matrix formed by combining the rotation matrix and the translation matrix;
and calculating the actual distance of the adjacent frames of the same target vehicle according to the world coordinates of the same target vehicle in the adjacent frames.
4. The method according to claim 1, wherein the step of determining the driving behavior of the target vehicle according to the vehicle speed and the profile information of the target vehicle comprises:
when the vehicle speed exceeds a highest speed limit threshold value, determining that the driving behavior of the target vehicle is overspeed;
when the vehicle speed is lower than the lowest speed limit threshold value, determining that the driving behavior of the target vehicle is low speed;
when the vehicle speed is zero, determining that the driving behavior of the target vehicle is illegal parking (violation);
and when the vehicle speed is a negative value, determining that the driving behavior of the target vehicle is retrograde.
5. An automatic traffic event monitoring system for unmanned highway vehicles, the system comprising: the system comprises a data acquisition and communication unit, a data analysis and event judgment unit and a background management unit;
the data acquisition and communication unit comprises an unmanned aerial vehicle and an unmanned aerial vehicle base station, the unmanned aerial vehicle is in communication connection with the unmanned aerial vehicle base station, and a camera is arranged on the unmanned aerial vehicle and used for acquiring real-time video data of the unmanned aerial vehicle cruising at the current road section;
the data analysis and event discrimination unit comprises a processor arranged on the road side, the processor is in communication connection with the unmanned aerial vehicle base station, and the processor is used for:
performing video image processing on the real-time video data to obtain a video image sequence;
extracting a target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle;
calculating the position of the next frame of the target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures;
calculating the actual distance between the adjacent frames of the target vehicle according to the contour information of the same target vehicle in the two adjacent frames of video pictures;
calculating the speed of the target vehicle according to the actual distance between the adjacent frames of the target vehicle;
judging the driving behavior of the target vehicle according to the vehicle speed and the contour information of the target vehicle; the driving behavior includes: overspeed, low speed, violation, retrograde motion and frequent lane change;
the extracting of the target vehicle in the video image sequence by using a dynamic target extraction algorithm to obtain the category information and the contour information of the target vehicle specifically comprises the following steps:
extracting a target vehicle in the video image sequence by using a trained target detection framework to obtain the category information and the rectangular outline of the target vehicle; the rectangular outline comprises the length l and the width h of the circumscribed rectangle of the target vehicle and a corner-point coordinate (x, y);
calculating the area S and the aspect ratio μ of the target vehicle by taking the length and the width of the rectangular outline of the vehicle target as the actual length and width of the target vehicle:
S = l × h (1)
μ = l / h (2)
calculating the centroid coordinates of the target vehicle by taking the center of the rectangular outline of the target vehicle as the centroid of the target vehicle:
x_c = x + l/2 (3)
y_c = y + h/2 (4)
establishing a vehicle sample feature library according to scenes in the real-time video data and the category information of the target vehicle, and updating a trained target detection framework by using data in the vehicle sample feature library through a deep learning algorithm;
calculating the position of the next frame of target vehicle by using a target tracking algorithm according to the category information and the contour information of the target vehicle of the current frame to obtain the contour information of the same target vehicle in two adjacent frames of video pictures, and specifically comprises the following steps:
establishing a vehicle model for the region of the target vehicle of the current frame;
predicting the centroid coordinates of the target vehicle in the next frame according to the centroid coordinates of the target vehicle in the current frame: P'_{t+1}(x_{t+1}, y_{t+1}) = f(Δx, Δy)·P_t(x_t, y_t) + w; wherein P_t(x_t, y_t) denotes the centroid coordinates of the target vehicle in the current frame, P'_{t+1}(x_{t+1}, y_{t+1}) denotes the predicted centroid coordinates of the target vehicle in the next frame, f(Δx, Δy) is the state transfer function, and w is the noise of the system process;
extracting the actual centroid coordinates of the next frame of the target vehicle by using a dynamic target extraction algorithm;
calculating the Euclidean distance between the actual centroid coordinate of the target vehicle and the predicted centroid coordinate of the target vehicle;
searching the image area with the minimum Euclidean distance by using the vehicle model;
matching moving vehicle targets in the image area with the minimum Euclidean distance, determining that the next frame of target vehicle and the current frame of target vehicle are the same vehicle, and obtaining the contour information of the next frame of target vehicle;
performing moving vehicle target matching in the image area with the minimum Euclidean distance, and determining that the next frame target vehicle and the current frame target vehicle are the same vehicle, specifically comprising:
according to the area S_i and aspect ratio μ_i of the rectangular outline of the i-th target vehicle in the current frame, and the area S_j and aspect ratio μ_j of the rectangular outline of the j-th target vehicle in the next frame, judging whether an overlapping area S_ij exists;
when there is an overlapping area and the conditions
S_ij > K'
μ_i < K''
μ_j < K''
are satisfied, considering the i-th target vehicle of the current frame and the j-th target vehicle of the next frame to be the same target vehicle between two adjacent frames; wherein K' is the overlap area threshold and K'' is the aspect ratio threshold;
the background management unit is respectively connected with the unmanned aerial vehicle base station and the processor, and is used for receiving and displaying real-time video data acquired by the unmanned aerial vehicle and the driving behavior of the target vehicle;
calculating the speed of the target vehicle according to the actual distance between adjacent frames of the target vehicle specifically comprises the following steps:
decomposing the actual distance along the horizontal and vertical directions, and calculating the horizontal velocity v_x and the vertical velocity v_y respectively;
according to the horizontal velocity v_x and the vertical velocity v_y, calculating the moving speed v_r of the target vehicle between adjacent frames by using the formula
v_r = sqrt(v_x^2 + v_y^2)
carrying out background compensation on the moving speed according to the flight speed of the unmanned aerial vehicle to obtain the absolute speed of the target vehicle: v_α = v_r + v_e, wherein v_α is the absolute speed of the vehicle, v_r is the moving speed of the vehicle, and v_e is the flight speed of the unmanned aerial vehicle;
taking the absolute speed of the target vehicle as the vehicle speed of the target vehicle;
the method for judging whether the driving behavior of the target vehicle is frequent lane changing specifically comprises the following steps:
sampling for multiple times and recording the position points of the target vehicle;
performing linear fitting on the position points to obtain a reference straight line;
calculating the distances from the position points to the reference straight line and taking an average value r;
when r is greater than the threshold value r_K, determining that the driving behavior of the target vehicle is frequent lane changing.
6. The automatic traffic event monitoring system for unmanned aerial vehicles on expressways according to claim 5, wherein a weather monitoring device is carried by the unmanned aerial vehicle base station to monitor weather conditions of road sections in real time, judge whether to execute unmanned aerial vehicle outbound inspection events and provide data support for weather condition statistics;
the background management unit comprises a data display platform, an equipment management platform, a road section management platform and a law enforcement management platform;
the data display platform dynamically displays a system profile, real-time flight positions of the unmanned aerial vehicles, working states of the unmanned aerial vehicles, base station distribution of the unmanned aerial vehicles, real-time traffic event reporting conditions, traffic event statistical conditions, traffic flow states and video monitoring states by using big data visualization and GIS application technology;
the road section management platform is used for carrying out addition, deletion, modification and check on monitored road sections, planning, managing and monitoring a cruising route of the unmanned aerial vehicle in real time and providing data support for optimizing the cruising route for the unmanned aerial vehicle;
the equipment management platform is used for carrying out online management on the unmanned aerial vehicle, unmanned aerial vehicle base station equipment, communication equipment, processing unit equipment and network equipment, and comprises equipment basic information maintenance, equipment fault alarm and personnel authority management;
the law enforcement management platform is used for auditing and enforcing the traffic abnormal events collected on site.
CN202010406068.5A 2020-05-14 2020-05-14 Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway Active CN111833598B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010406068.5A CN111833598B (en) 2020-05-14 2020-05-14 Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010406068.5A CN111833598B (en) 2020-05-14 2020-05-14 Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway

Publications (2)

Publication Number Publication Date
CN111833598A CN111833598A (en) 2020-10-27
CN111833598B true CN111833598B (en) 2022-07-05

Family

ID=72913805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010406068.5A Active CN111833598B (en) 2020-05-14 2020-05-14 Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway

Country Status (1)

Country Link
CN (1) CN111833598B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112735164B (en) * 2020-12-25 2022-08-05 北京智能车联产业创新中心有限公司 Test data construction method and test method
CN112818753A (en) * 2021-01-11 2021-05-18 精英数智科技股份有限公司 Pit falling object detection method, device and system
CN113505638B (en) * 2021-05-27 2024-04-02 中国科学院深圳先进技术研究院 Method and device for monitoring traffic flow and computer readable storage medium
WO2022246850A1 (en) * 2021-05-28 2022-12-01 深圳市大疆创新科技有限公司 Traffic scenario monitoring method based on aerial survey data
CN113470374B (en) * 2021-06-30 2022-09-16 平安科技(深圳)有限公司 Vehicle overspeed monitoring method and device, computer equipment and storage medium
CN113850170A (en) * 2021-09-17 2021-12-28 泰州市雷信农机电制造有限公司 Reverse-running identification system based on block chain storage

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622886A (en) * 2012-03-23 2012-08-01 长安大学 Video-based method for detecting illegal lane-change incidents of vehicles
CN103150908A (en) * 2013-02-05 2013-06-12 长安大学 Video-based average vehicle speed detection method
CN104008387A (en) * 2014-05-19 2014-08-27 山东科技大学 Lane line detection method based on feature point piecewise linear fitting
CN106643749A (en) * 2016-09-14 2017-05-10 北京航空航天大学 Dangerous driving behavior detection method based on smartphones
CN107563310A (en) * 2017-08-16 2018-01-09 电子科技大学 Illegal lane-change detection method
WO2018057393A1 (en) * 2016-09-23 2018-03-29 Pcms Holdings, Inc. Methods and apparatus for improved navigation notification based on localized traffic flow
CN108366099A (en) * 2018-01-23 2018-08-03 深圳市北航旭飞科技有限公司 Control method, system and device based on a panoramic unmanned aerial vehicle, and panoramic unmanned aerial vehicle
CN108596129A (en) * 2018-04-28 2018-09-28 武汉盛信鸿通科技有限公司 Vehicle line-crossing detection method based on intelligent video analysis technology
CN109299674A (en) * 2018-09-05 2019-02-01 重庆大学 Illegal lane-change detection method for tunnels based on vehicle lights
CN109949578A (en) * 2018-12-31 2019-06-28 上海眼控科技股份有限公司 Automatic auditing method for illegal vehicle line-pressing based on deep learning
CN110807930A (en) * 2019-11-07 2020-02-18 中国联合网络通信集团有限公司 Dangerous vehicle early warning method and device
CN110992683A (en) * 2019-10-29 2020-04-10 山东科技大学 Intersection blind-area early warning method and system based on dynamic image perception

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101870293B (en) * 2009-04-24 2013-06-05 南京理工大学 Vehicle driving state evaluation method based on lane-change behavior detection
US9922567B2 (en) * 2011-07-21 2018-03-20 Bendix Commercial Vehicle Systems Llc Vehicular fleet management system and methods of monitoring and improving driver performance in a fleet of vehicles
CN103593678B (en) * 2013-10-16 2016-09-28 长安大学 Dynamic vehicle load distribution detection method for long-span bridges
CN103942533A (en) * 2014-03-24 2014-07-23 河海大学常州校区 Urban traffic illegal behavior detection method based on a video monitoring system
CN104282020B (en) * 2014-09-22 2017-11-17 中海网络科技股份有限公司 Vehicle speed detection method based on target trajectory
CN205003849U (en) * 2015-07-29 2016-01-27 崔忠光 Aerial intelligent evidence-collection monitoring system
CN105511495B (en) * 2016-02-15 2019-02-01 国家电网公司 UAV intelligent inspection control method and system for power lines
CN107688764B (en) * 2016-08-03 2020-04-10 浙江宇视科技有限公司 Method and device for detecting vehicle violations
CN108230667A (en) * 2016-12-14 2018-06-29 贵港市瑞成科技有限公司 Vehicle violation behavior detection method
CN107103292A (en) * 2017-04-12 2017-08-29 湖南源信光电科技股份有限公司 Traffic flow statistics method based on moving-vehicle tracking
CN207095552U (en) * 2017-06-06 2018-03-13 苏腾飞 UAV-based water-affairs monitoring system
US10953880B2 (en) * 2017-09-07 2021-03-23 Tusimple, Inc. System and method for automated lane change control for autonomous vehicles
CN109102702A (en) * 2018-08-24 2018-12-28 南京理工大学 Vehicle speed measurement method based on fusion of video and radar signals
CN109544909B (en) * 2018-10-29 2021-06-04 华蓝设计(集团)有限公司 Method for analyzing driver lane-change behavior based on aerial-video vehicle trajectories
CN109902677B (en) * 2019-01-30 2021-11-12 深圳北斗通信科技有限公司 Vehicle detection method based on deep learning
CN109887281B (en) * 2019-03-01 2021-03-26 北京云星宇交通科技股份有限公司 Method and system for monitoring traffic incidents
CN109948582B (en) * 2019-03-28 2021-03-02 湖南大学 Intelligent vehicle wrong-way driving detection method based on tracking trajectory analysis
CN110188689B (en) * 2019-05-30 2022-11-29 重庆大学 Virtual driving target collision detection method based on real-scene modeling
CN110444014A (en) * 2019-07-01 2019-11-12 淮阴工学院 Rear-end collision early warning method based on a reverse ST-MRF vehicle tracking algorithm
CN110597245B (en) * 2019-08-12 2020-11-20 北京交通大学 Automatic driving lane-change planning method based on quadratic programming and neural networks
CN110570664B (en) * 2019-09-23 2023-04-07 山东科技大学 Automatic detection system for highway traffic incidents
CN111026156A (en) * 2019-12-17 2020-04-17 北京京东乾石科技有限公司 Inspection system, method, control device, equipment and storage medium
CN111145545B (en) * 2019-12-25 2021-05-28 西安交通大学 Deep-learning-based UAV monitoring system and method for road traffic behavior

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Lane marking detection using image features and line fitting model; Danilo Cáceres Hernández; 2017 10th International Conference on Human System Interactions (HSI); 2017-08-18; 234-238 *
Vehicle detection and tracking based on single-line LiDAR; Liu Wei; China Masters' Theses Full-text Database, Engineering Science & Technology II; 2020-01-15; C035-418 *
Research on a machine-vision-based lane keeping assist system; Zhao Wenming; Digital Technology & Application; 2017-11-15; 63-64 *
Research on vision-based lane-level localization algorithms; Liu Yaqun; China Masters' Theses Full-text Database, Information Science & Technology; 2016-08-15; I138-926 *
Road vehicle and pedestrian detection based on vehicle-mounted video; Tang Shi; China Masters' Theses Full-text Database, Engineering Science & Technology II; 2018-08-15; C035-24 *
Research on the influence of multi-lane freeway operation modes on traffic safety in diverging areas; Duan Li; China Doctoral Dissertations Full-text Database, Engineering Science & Technology II; 2016-07-15; C034-23 *

Also Published As

Publication number Publication date
CN111833598A (en) 2020-10-27

Similar Documents

Publication Publication Date Title
CN111833598B (en) Automatic traffic incident monitoring method and system for unmanned aerial vehicle on highway
CN109686088B (en) Traffic video alarm method, equipment and system
CN109255955B (en) Highway accident monitoring method and system based on foundation and empty foundation cooperative sensing
CN112099040A (en) Whole-course continuous track vehicle tracking system and method based on laser radar network
CN114489122B (en) UAV and matching airport-based automatic highway inspection method and system
CN112633722A (en) Vehicle-mounted road safety risk assessment system and method
CN113487877A (en) Road vehicle illegal parking monitoring method
CN114363316A (en) Intelligent networking monitoring and supervision system for cross-regional road infrastructure
CN114494998B (en) Intelligent analysis method and system for vehicle data
CN116631187B (en) Intelligent acquisition and analysis system for case on-site investigation information
CN113392817A (en) Vehicle density estimation method and device based on multi-row convolutional neural network
CN111275957A (en) Traffic accident information acquisition method, system and camera
CN116229396B (en) High-speed pavement disease identification and warning method
CN109934161B (en) Vehicle identification and detection method and system based on convolutional neural network
CN114333331B (en) Method and system for identifying vehicle passing information and vehicle weight of multi-lane bridge
Yin et al. Detecting illegal pickups of intercity buses from their gps traces
CN113538968B (en) Method and apparatus for outputting information
Pan et al. Identifying Vehicles Dynamically on Freeway CCTV Images through the YOLO Deep Learning Model.
CN114724356A (en) GIS (geographic information system) highway accident early warning method and system based on meteorological data integration
CN112419713A (en) Urban traffic monitoring system based on cloud computing
CN113128847A (en) Entrance ramp real-time risk early warning system and method based on laser radar
CN112215070A (en) Unmanned aerial vehicle aerial photography video traffic flow statistical method, host and system
KR20220071822A (en) Identification system and method of illegal parking and stopping vehicle numbers using drone images and artificial intelligence technology
CN111028510B (en) Comprehensive emergency management platform architecture for cloud computing
CN113936456B (en) Street-crossing traffic identification and feature analysis method based on millimeter wave radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant