CN114758297A - Traffic incident detection method and system based on fusion of radar and video - Google Patents

Traffic incident detection method and system based on fusion of radar and video Download PDF

Info

Publication number
CN114758297A
Authority
CN
China
Prior art keywords
vehicle
abnormal
radar
data
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210421476.7A
Other languages
Chinese (zh)
Inventor
梁昭伟
孙昊
范栋男
安泽萍
黄群龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Highway Engineering Consultants Corp
Original Assignee
China Highway Engineering Consultants Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Highway Engineering Consultants Corp filed Critical China Highway Engineering Consultants Corp
Priority to CN202210421476.7A priority Critical patent/CN114758297A/en
Publication of CN114758297A publication Critical patent/CN114758297A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/23Clustering techniques
    • G06F18/232Non-hierarchical techniques
    • G06F18/2321Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions


Abstract

The invention discloses a traffic incident detection method and system based on the fusion of radar and video, relating to the technical field of traffic incident detection. The method comprises the following steps: acquiring radar data and video data, and constructing a vehicle detection data set; training a YOLOv5 network on the vehicle detection data set to obtain an abnormal target detection network; detecting suspicious vehicles with the abnormal target detection network; further processing the suspicious-vehicle information to judge whether a suspicious vehicle is an abnormal vehicle; and, once an abnormal vehicle is determined, fusing the radar data to judge the abnormal event. By fusing radar and video data, the method completes the detection of abnormal events in a tunnel. Compared with manual monitoring and cloud processing, it greatly reduces labor cost, avoids oversights in monitoring caused by operator fatigue, and avoids the large latency introduced by processing data in the cloud.

Description

Traffic incident detection method and system based on fusion of radar and video
Technical Field
The invention relates to the technical field of traffic incident detection, in particular to a traffic incident detection method and system based on fusion of radar and video.
Background
With the rapid rise of artificial intelligence, image processing, and sensor technologies in recent years, expressway monitoring systems combining several of these advanced technologies have emerged. Most of them rely primarily on vision techniques to monitor various types of traffic events. However, an expressway tunnel covers almost every complex condition found in a road traffic environment, and vision alone cannot meet the reliability and completeness demanded of an intelligent tunnel monitoring system, which must collect traffic information, environmental information, and so on for the monitored tunnel section efficiently and comprehensively. Vision has a short detection range, is easily affected by the external environment, and therefore cannot acquire complete tunnel information.
Millimeter-wave radar sensors offer high measurement accuracy and long range and cope robustly with severe weather, but they are easily affected by clutter: in a closed tunnel environment some areas may not be detected at all, so radar is not suitable for use in a tunnel on its own. Combining radar and vision for traffic event detection is therefore an urgent problem for those skilled in the art.
Disclosure of Invention
In view of the above, the present invention provides a traffic incident detection method and system based on the fusion of radar and video, so as to solve the problems in the background art.
In order to achieve the purpose, the invention adopts the following technical scheme: a traffic incident detection method based on the fusion of radar and video is characterized by comprising the following specific steps:
acquiring radar data and video data, and constructing a vehicle detection data set;
training a YOLOv5 network by using the vehicle detection data set to obtain an abnormal target detection network;
detecting suspicious vehicles through the abnormal target detection network;
further processing the information of the suspicious vehicle, and judging whether the suspicious vehicle is an abnormal vehicle;
and after the abnormal vehicle is determined, fusing the radar data to judge an abnormal event.
Optionally, the method further includes compiling the YOLOv5 network and the weights after training the YOLOv5 network, and generating a corresponding engine file.
Optionally, the abnormal vehicle is determined as follows: the vehicle position data in the suspicious-vehicle information is read, and K-means clustering is performed on the vehicle position data; according to the elbow rule, when a cluster center accumulates more than six suspicious-vehicle positions, the suspicious vehicle is considered an abnormal vehicle.
Optionally, the radar data is fused to judge the abnormal event as follows: if a vehicle speed in the radar data of the current detection period is positive, a wrong-way driving event exists in that period; when the average vehicle speed within the detection period is below 20 m/s, a traffic congestion event exists; if, after an abnormal vehicle is determined in the current detection period, an acceleration value in the radar data exceeds 15 m/s², a traffic accident event is considered to exist; after an abnormal vehicle is determined, if its position is within the emergency lane area, an emergency-lane parking event is considered to exist; if its position is not within the emergency lane area, an illegal parking event is considered to exist.
Optionally, the video data is processed into an average frame image, and the average frame image is inferred by using an abnormal target detection network to determine whether the vehicle is a suspicious vehicle.
Optionally, the method for constructing the vehicle detection data set includes: and extracting the categories of buses, trucks, automobiles, motorcycles and pedestrians from the coco data set, collecting the fire data set, converting the labels of the fire data set into the format of the coco data set, and splicing the six types of images into a vehicle detection data set.
On the other hand, the invention provides a traffic incident detection system based on the fusion of radar and video, comprising a data acquisition module, a model training module, a suspicious vehicle detection module, an abnormal vehicle detection module and a radar information fusion module; wherein,
the data acquisition module is used for acquiring radar data and video data and constructing a vehicle detection data set;
the model training module is used for training a YOLOv5 network by using the vehicle detection data set to obtain an abnormal target detection network;
the suspicious vehicle detection module is used for detecting suspicious vehicles through the abnormal target detection network;
the abnormal vehicle detection module is used for further processing the information of the suspicious vehicle and judging whether the suspicious vehicle is an abnormal vehicle;
and the radar information fusion module is used for fusing the radar data to judge the abnormal event after the abnormal vehicle is determined.
Optionally, the system further includes a preprocessing module, configured to process the video data into an average frame image, and process the radar data into a communication format.
Compared with the prior art, the traffic incident detection method and system based on the fusion of radar and video disclosed by the invention have the following beneficial technical effects:
(1) the detection of abnormal events in the tunnel (traffic accidents, illegal parking, wrong-way driving, fire, traffic congestion, pedestrian intrusion, emergency-lane parking, and the like) is completed using only the collected radar and video information as input, fused together;
(2) compared with manual monitoring and cloud processing, the method greatly reduces labor cost, avoids oversights in monitoring caused by operator fatigue, and avoids the large latency introduced by processing data in the cloud;
(3) the detection results serve as an important basis for macroscopic linkage control, traffic dispersion and abnormal-event response, so that the whole tunnel can operate in an optimal state, enabling intelligent, diversified management of expressway tunnels and fully exploiting their convenience, speed and safety.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
FIG. 1 is a flow chart of a method of the present invention;
FIG. 2 is a flow chart of determining the abnormal event type by fusing radar and vision information according to the present invention;
FIG. 3 is a system block diagram of the present invention;
FIG. 4 is a communication coordination diagram of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Embodiment 1 of the invention discloses a traffic event detection method based on the fusion of radar and video; as shown in FIG. 1, the method comprises the following steps:
s1, acquiring radar data and video data, and constructing a vehicle detection data set;
the method specifically comprises the identification of vehicles, fire disasters and pedestrians, wherein the vehicles comprise buses (buses), trucks (trucks) and cars (car). Firstly, a data set is constructed, bus, truck, car, motorcycle and person categories are extracted from a coco data set (Microsoft Common Objects in Context), then a fire data set is collected, the labels of the fire data set are converted into the format of the coco data set, and the six types of images are spliced into a vehicle detection data set.
S2, training a YOLOv5 network by using a vehicle detection data set to obtain an abnormal target detection network;
the YOLOv5 network is used for the detection of suspicious vehicles, fires, pedestrians, etc. Firstly, adjusting a configuration file of network training, adjusting prediction types to be 6 types, namely person, car, motorcycle, bus, truck and fire, then retraining a YOLOv5 network, compiling the YOLOv5 network and weights after training the YOLOv5, and generating a corresponding engine file, so that an abnormal event detection result can be output more quickly.
S3, detecting suspicious vehicles through an abnormal target detection network;
the method comprises the steps that traffic accidents, illegal parking, traffic jam and emergency lane parking are always accompanied by parking of abnormal vehicles, and the vehicles are always in a running state in actual traffic of a tunnel, so that information of normal vehicles can be blurred out through an average frame processing method, information of the abnormal vehicles is highlighted, then an abnormal target detection network is used for reasoning an average frame, as long as the vehicles appearing in an average frame image are considered as suspicious vehicles, after the suspicious vehicles are detected, detection results are written into a json file, and the next step of processing is waited.
In order to suppress interference from normal-vehicle information and highlight suspicious vehicles, this embodiment samples 10 video frames per second and averages them every two seconds, generating 10 average-frame images every 20 seconds. Each 20 seconds is therefore set as one detection period, so any abnormal event occurring in the tunnel is detected within 20 seconds.
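The averaging step can be sketched as follows, assuming frames arrive as arrays of equal shape. The constants follow the embodiment (10 frames/s, a 2-second window, so 20 frames per average image); the function name is illustrative only.

```python
import numpy as np

FPS_SAMPLED = 10     # frames sampled per second (per the embodiment)
AVG_WINDOW_S = 2     # seconds averaged into one "average frame"
FRAMES_PER_AVG = FPS_SAMPLED * AVG_WINDOW_S  # 20 frames per average image

def average_frames(frames):
    """Average consecutive windows of frames. Moving vehicles blur away
    in the mean image, while stationary objects remain sharp."""
    frames = np.asarray(frames, dtype=np.float32)
    n = len(frames) // FRAMES_PER_AVG
    windows = frames[: n * FRAMES_PER_AVG].reshape(
        n, FRAMES_PER_AVG, *frames.shape[1:])
    return windows.mean(axis=1).astype(np.uint8)
```

Over a 20-second period (200 sampled frames) this yields the 10 average-frame images that the detection network infers on.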
S4, further processing the information of the suspicious vehicle, and judging whether the suspicious vehicle is an abnormal vehicle;
the suspicious vehicles highlighted in the average frame have three conditions, namely, vehicles at a position far away from the camera are recorded in the average frame slowly when moving on a video picture due to the problem of the view angle of the installation of the monitoring video; secondly, the vehicle is recorded in an average frame due to the fact that the vehicle drives slowly; and thirdly, recording the abnormal parking of the vehicle in the average frame.
To decide whether a suspicious vehicle in the average frame is an abnormal vehicle, the vehicle position information in the suspicious-vehicle result file is read and K-means clustering is performed on it. According to the elbow rule, when a cluster center accumulates more than six suspicious-vehicle positions, the suspicious vehicle is considered abnormal, and the radar information must then be fused for the next judgment of the specific abnormal event.
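A minimal sketch of this judgment is given below: positions are clustered, and any cluster whose center accumulates more than six positions (a vehicle detected at the same spot across many average frames) is flagged. The deterministic farthest-point initialisation and the function names are simplifying assumptions, not the patent's exact elbow-rule procedure.

```python
import numpy as np

ABNORMAL_COUNT = 6  # per the method: more than 6 positions at one
                    # cluster centre marks an abnormal (stopped) vehicle

def kmeans(points, k, iters=20):
    """Minimal Lloyd's K-means with deterministic farthest-point
    initialisation (a stand-in for the full clustering procedure)."""
    pts = np.asarray(points, dtype=float)
    centres = [pts[0]]
    while len(centres) < k:
        dist = np.min([np.linalg.norm(pts - c, axis=1) for c in centres], axis=0)
        centres.append(pts[dist.argmax()])
    centres = np.array(centres)
    for _ in range(iters):
        d = np.linalg.norm(pts[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centres[j] = pts[labels == j].mean(axis=0)
    return centres, labels

def abnormal_clusters(points, k):
    """Centres of clusters holding more than ABNORMAL_COUNT positions."""
    centres, labels = kmeans(points, k)
    counts = np.bincount(labels, minlength=k)
    return [tuple(centres[j]) for j in range(k) if counts[j] > ABNORMAL_COUNT]
```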
And S5, after the abnormal vehicle is determined, the radar data is fused to judge the abnormal event.
Observation of the data shows that traffic accidents, illegal parking, traffic congestion and emergency-lane parking in tunnels are usually accompanied by an abnormally stopped vehicle. The detection approach for these events is therefore to identify suspicious vehicles through abnormal parking, combine the radar detection information to complete the radar-vision fusion, and finally output the abnormal event type. Fire events, pedestrian intrusion events and the like are judged by the video detection algorithm alone, while wrong-way driving events are judged from the speed direction in the radar detection information.
Radar information is not used for pedestrian intrusion and fire events; these are detected from video information alone. When the abnormal target detection network detects the person category, a pedestrian intrusion event is reported; when it detects the fire category, a fire event is reported. After an abnormal vehicle is determined, the radar information must be fused to judge the abnormal event. FIG. 2 shows the flow of determining the abnormal event type by fusing radar and vision information.
If a vehicle speed in the radar data of the current detection period is positive, a wrong-way driving event exists in that period. When the average vehicle speed within the detection period is below 20 m/s, a traffic congestion event is considered to exist. If, after an abnormal vehicle is determined in the current detection period, an acceleration value in the radar data exceeds 15 m/s², a traffic accident event is considered to exist. After an abnormal vehicle is determined, if its position is within the emergency lane area, an emergency-lane parking event is reported; otherwise, an illegal parking event is reported.
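These fusion rules can be sketched as a small rule-based classifier. The thresholds come from the embodiment; the sign convention (positive speed = wrong-way), the use of absolute speeds for the congestion average, and the ordering of the parking rules are assumptions made for this illustration.

```python
CONGESTION_SPEED = 20.0  # m/s: mean speed below this -> congestion
ACCIDENT_ACCEL = 15.0    # m/s^2: |acceleration| above this -> accident

def classify_event(speeds, accels, abnormal_vehicle, in_emergency_lane):
    """Rule-based radar fusion for one detection period.
    speeds/accels are per-vehicle radar readings; positive speed is
    assumed to mean travel against the lane direction."""
    events = []
    if any(v > 0 for v in speeds):
        events.append("wrong-way driving")
    if speeds and sum(abs(v) for v in speeds) / len(speeds) < CONGESTION_SPEED:
        events.append("traffic congestion")
    if abnormal_vehicle:
        if any(abs(a) > ACCIDENT_ACCEL for a in accels):
            events.append("traffic accident")
        elif in_emergency_lane:
            events.append("emergency-lane parking")
        else:
            events.append("illegal parking")
    return events
```

For example, an abnormal vehicle outside the emergency lane with ordinary accelerations and low average speed would be reported as congestion plus illegal parking.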
Embodiment 2 of the invention discloses a traffic incident detection system based on the fusion of radar and video, comprising a data acquisition module, a model training module, a suspicious vehicle detection module, an abnormal vehicle detection module and a radar information fusion module; wherein,
the data acquisition module is used for acquiring radar data and video data and constructing a vehicle detection data set;
the model training module is used for training the YOLOv5 network by using a vehicle detection data set to obtain an abnormal target detection network;
the suspicious vehicle detection module is used for detecting suspicious vehicles through an abnormal target detection network;
the abnormal vehicle detection module is used for further processing the information of the suspicious vehicle and judging whether the suspicious vehicle is an abnormal vehicle;
and the radar information fusion module is used for fusing radar data to judge the abnormal event after the abnormal vehicle is determined.
The system further comprises a preprocessing module for processing the video data into average-frame images and the radar data into a communication format; the average-frame images and the parsed radar results are placed into shared memory.
The system further comprises a dropped-object detection module for checking the average-frame images stored in the shared memory and judging whether they contain dropped objects.
To run stably and achieve real-time online detection, the modules are divided into a client and a server that cooperate in a client-server (CS) mode. The preprocessing module, which computes the average frames, parses the radar data and writes both into shared memory, acts as the client; the remaining modules act as the server.
FIG. 4 shows the communication coordination of the system. The modules of the traffic event detection system exchange information over TCP; the client and server run in parallel in multiple threads, and trigger conditions schedule each module's access to the memory space and file information.
The system takes every 20 seconds as one detection period and performs one round of detection and reporting per period. In the figure, T is the current detection period, and T-1 and T+1 are the previous and next periods. In period T-1, once the client's preprocessing module has finished the average frames and radar parsing and placed the results in the memory space, it sends a completion signal to the server for anomaly detection, then continues computing average frames and parsing radar data for period T. During the current period T, the preprocessing of that period runs while the preprocessing results of period T-1 are inferred: the server infers and reports the period-T-1 results by calling the suspicious vehicle detection module, the dropped-object detection module, the abnormal vehicle detection module and the radar information fusion module in turn. After inference the server is idle and waits for the next client signal. When the preprocessing of period T completes, another completion signal is sent to the server, and that signal is processed in period T+1. To prevent the incomplete writes and reads that would occur if the client and server accessed a single shared memory simultaneously, the system opens two shared memories that the client and server use alternately.
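The alternating use of two shared memories is a classic double-buffer (ping-pong) scheme. The sketch below shows the buffer-swapping logic in-process; it is a hypothetical stand-in for the real OS shared-memory segments and TCP completion signals.

```python
class DoubleBuffer:
    """Two alternating buffers: the preprocessing client writes period T
    into one buffer while the server reads period T-1 from the other,
    so neither side ever sees a half-written result."""

    def __init__(self):
        self.buffers = [None, None]
        self.write_idx = 0

    def write(self, data):
        self.buffers[self.write_idx] = data  # client fills current buffer
        self.write_idx ^= 1                  # flip for the next period

    def read(self):
        # server reads the buffer most recently completed by the client
        return self.buffers[self.write_idx ^ 1]
```

After each `write` (one detection period's preprocessing result), `read` returns that completed result while the next `write` targets the other buffer.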
Compared with manual monitoring and cloud processing, the method and system greatly reduce labor cost, avoid oversights in monitoring caused by operator fatigue, and avoid the large latency introduced by processing data in the cloud. They buy precious time for accident handling and rescue work, and the detection results serve as an important basis for macroscopic linkage control, traffic dispersion and abnormal-event response, ensuring that the whole tunnel operates in an optimal state, enabling intelligent, diversified management of expressway tunnels and fully exploiting their convenience, speed and safety. Such an intelligent expressway monitoring system can strongly promote the healthy, sustainable development of the national expressway tunnel network and the economic construction and management of surrounding cities.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A traffic incident detection method based on the fusion of radar and video is characterized by comprising the following specific steps:
radar data and video data are obtained, and a vehicle detection data set is constructed;
training a YOLOv5 network by using the vehicle detection data set to obtain an abnormal target detection network;
detecting suspicious vehicles through the abnormal target detection network;
further processing the information of the suspicious vehicle, and judging whether the suspicious vehicle is an abnormal vehicle;
and after the abnormal vehicle is determined, fusing the radar data to judge an abnormal event.
2. The method of claim 1, further comprising compiling the YOLOv5 network and its weights after training to generate a corresponding engine file.
3. The traffic event detection method based on the fusion of radar and video of claim 1, wherein the abnormal vehicle is determined as follows: the vehicle position data in the suspicious-vehicle information is read, and K-means clustering is performed on the vehicle position data; according to the elbow rule, when a cluster center accumulates more than six suspicious-vehicle positions, the suspicious vehicle is considered an abnormal vehicle.
4. The traffic incident detection method based on the fusion of radar and video of claim 1, wherein the radar data is fused to judge the abnormal event as follows: if a vehicle speed in the radar data of the current detection period is positive, a wrong-way driving event exists in that period; when the average vehicle speed within the detection period is below 20 m/s, a traffic congestion event exists; if, after an abnormal vehicle is determined in the current detection period, an acceleration value in the radar data exceeds 15 m/s², a traffic accident event is considered to exist; after an abnormal vehicle is determined, if its position is within the emergency lane area, an emergency-lane parking event is considered to exist; if its position is not within the emergency lane area, an illegal parking event is considered to exist.
5. The method of claim 1, wherein the video data is processed into an average frame image, and the average frame image is inferred by an abnormal target detection network to determine whether the vehicle is a suspicious vehicle.
6. The method for detecting the traffic incident based on the fusion of the radar and the video as claimed in claim 1, wherein the manner of constructing the vehicle detection data set is as follows: and extracting the categories of buses, trucks, automobiles, motorcycles and pedestrians from the coco data set, collecting the fire data set, converting the labels of the fire data set into the format of the coco data set, and splicing the six types of images into a vehicle detection data set.
7. A traffic incident detection system based on the fusion of radar and video, characterized by comprising a data acquisition module, a model training module, a suspicious vehicle detection module, an abnormal vehicle detection module and a radar information fusion module; wherein,
the data acquisition module is used for acquiring radar data and video data and constructing a vehicle detection data set;
the model training module is used for training a YOLOv5 network by using the vehicle detection data set to obtain an abnormal target detection network;
the suspicious vehicle detection module is used for detecting suspicious vehicles through the abnormal target detection network;
the abnormal vehicle detection module is used for further processing the information of the suspicious vehicle and judging whether the suspicious vehicle is an abnormal vehicle;
and the radar information fusion module is used for fusing the radar data to judge the abnormal event after the abnormal vehicle is determined.
8. The system of claim 7, further comprising a preprocessing module for processing the video data into an average frame image and the radar data into a communication format.
CN202210421476.7A 2022-04-21 2022-04-21 Traffic incident detection method and system based on fusion of radar and video Pending CN114758297A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210421476.7A CN114758297A (en) 2022-04-21 2022-04-21 Traffic incident detection method and system based on fusion of radar and video

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210421476.7A CN114758297A (en) 2022-04-21 2022-04-21 Traffic incident detection method and system based on fusion of radar and video

Publications (1)

Publication Number Publication Date
CN114758297A true CN114758297A (en) 2022-07-15

Family

ID=82330756

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210421476.7A Pending CN114758297A (en) 2022-04-21 2022-04-21 Traffic incident detection method and system based on fusion of radar and video

Country Status (1)

Country Link
CN (1) CN114758297A (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115049993A (en) * 2022-08-17 2022-09-13 成都考拉悠然科技有限公司 Vehicle abnormal stop monitoring method based on deep learning
CN115527364A (en) * 2022-08-25 2022-12-27 西安电子科技大学广州研究院 Traffic accident tracing method and system based on radar vision data fusion
CN115527364B (en) * 2022-08-25 2023-11-21 西安电子科技大学广州研究院 Traffic accident tracing method and system based on radar data fusion
CN116630866A (en) * 2023-07-24 2023-08-22 中电信数字城市科技有限公司 Abnormal event monitoring method, device, equipment and medium for audio-video radar fusion
CN116630866B (en) * 2023-07-24 2023-10-13 中电信数字城市科技有限公司 Abnormal event monitoring method, device, equipment and medium for audio-video radar fusion


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination