WO2023019761A1 - 一种面向混合交通流的路网运行状态检测系统及方法 - Google Patents
一种面向混合交通流的路网运行状态检测系统及方法 (A road network operation state detection system and method for mixed traffic flow)
- Publication number
- WO2023019761A1 (PCT/CN2021/129854)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- traffic
- information
- vehicle
- road
- Prior art date
Links
- 238000001514 detection method Methods 0.000 title claims abstract description 37
- 238000000034 method Methods 0.000 title abstract description 25
- 238000004891 communication Methods 0.000 claims abstract description 82
- 230000004927 fusion Effects 0.000 claims abstract description 53
- 230000003993 interaction Effects 0.000 claims abstract description 15
- 238000004458 analytical method Methods 0.000 claims abstract description 10
- 238000000605 extraction Methods 0.000 claims description 22
- 230000008447 perception Effects 0.000 claims description 16
- 238000012545 processing Methods 0.000 claims description 13
- 230000001133 acceleration Effects 0.000 claims description 10
- 238000010276 construction Methods 0.000 claims description 7
- 238000007405 data analysis Methods 0.000 claims description 7
- 238000013527 convolutional neural network Methods 0.000 claims description 6
- 238000011897 real-time detection Methods 0.000 abstract 1
- 230000008569 process Effects 0.000 description 9
- 238000010586 diagram Methods 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 6
- 239000013598 vector Substances 0.000 description 6
- 230000008859 change Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 230000000694 effects Effects 0.000 description 3
- 238000007476 Maximum Likelihood Methods 0.000 description 2
- 230000002159 abnormal effect Effects 0.000 description 2
- 230000005856 abnormality Effects 0.000 description 2
- 238000013528 artificial neural network Methods 0.000 description 2
- 230000005540 biological transmission Effects 0.000 description 2
- 238000004364 calculation method Methods 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 2
- 230000007812 deficiency Effects 0.000 description 2
- 238000011161 development Methods 0.000 description 2
- 239000000284 extract Substances 0.000 description 2
- 238000007499 fusion processing Methods 0.000 description 2
- 238000013507 mapping Methods 0.000 description 2
- 230000035515 penetration Effects 0.000 description 2
- 230000001953 sensory effect Effects 0.000 description 2
- 230000001502 supplementing effect Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 238000012826 global research Methods 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000006855 networking Effects 0.000 description 1
- 238000011160 research Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0125—Traffic data processing
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0116—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
- H04W4/44—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
Definitions
- the invention belongs to the technical field of intelligent transportation, and in particular relates to a road network operation state detection system and method for mixed traffic flow.
- Networked vehicles can use vehicle-to-vehicle communication technology to obtain real-time status information of the vehicle ahead, thereby assisting human drivers in driving decisions, improving road traffic stability, and increasing the overall capacity of the road network.
- the mixed traffic flow state in which networked vehicles and non-networked vehicles are randomly mixed in different proportions is an inevitable stage of development, and this stage will last for a long time. Research on road network operation state detection systems for the mixed traffic flow environment is therefore particularly important.
- the existing work provides road traffic operation status detection systems applied to traffic environments composed entirely of connected vehicles; it does not take into account the different characteristics of the mixed traffic flow composed of connected and non-connected vehicles, such as perception methods, car-following modes and inter-vehicle distance control, nor does it fully consider connected-vehicle communication data under the different penetration rates found on mixed-flow roads. Such systems cannot achieve accurate perception and accurate identification of the operating status in a mixed traffic environment, and their versatility and compatibility are weak.
- the present invention provides a road network operation state detection system and method for mixed traffic flow.
- the technical problem to be solved in the present invention is realized through the following technical solutions:
- the present invention provides a mixed traffic flow-oriented road network operating state detection system, including a perception module, a communication module and a cloud control platform module, wherein,
- the perception module is used to collect traffic information data of connected vehicles and non-connected vehicles in the observation section in real time;
- the communication module is used to perform information interaction between networked vehicles, between networked vehicles and roadsides, and transmit the traffic information data to the cloud control platform module;
- the cloud control platform module is used for data fusion and analysis of the traffic information data within a predetermined time period, so as to judge the vehicle operation condition and obtain the evolution law of the road network operation situation.
- the sensing module includes a vehicle sensing unit and a roadside sensing unit, wherein,
- the on-vehicle sensing unit is arranged on the connected vehicle, and is used to collect the operation information of the connected vehicle and the operation information of adjacent non-connected vehicles; the operation information of the adjacent non-connected vehicles includes at least the orientation, distance and position information data, acceleration and deceleration information data and path data of the non-connected vehicles;
- the roadside sensing unit is arranged on the observed road section and is used to collect traffic information on the observed road section, and the traffic information on the observed road section at least includes road video data, road picture data and vehicle information in the observed road section.
- the communication module includes a vehicle communication unit and a roadside communication unit, wherein,
- the vehicle communication unit is arranged on the networked vehicle, and is used for information interaction with the vehicle communication unit on other networked vehicles, the roadside communication unit and the cloud control platform module;
- the roadside communication unit is arranged on the observed road section, and is used for information interaction with the on-board communication unit on the connected vehicle passing through the observed road section and the cloud control platform module.
- the vehicle-mounted communication unit is any one of WiFi, DSRC, LoRa, Bluetooth, LTE-V and 4G/5G/6G; the roadside communication unit is any one of WiFi, Bluetooth, LTE-V and 4G/5G/6G.
- the cloud control platform module includes a cloud platform communication unit and a data processing unit, wherein,
- the cloud platform communication unit can be connected to the on-vehicle communication unit and the roadside communication unit, and is used for receiving different types of traffic information data measured by different sensors on the connected vehicles and at the roadside;
- the data processing unit is used to extract and fuse data features from the different types of traffic information data, and obtain the evolution law of road network operation situation according to the fusion result.
- the data processing unit includes a point cloud data feature extraction subunit, an image data feature extraction subunit, a data fusion subunit, a traffic fundamental diagram construction subunit and a data analysis subunit, wherein,
- the point cloud data feature extraction subunit is used to establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
- the image data feature extraction subunit is used to extract image feature information from the image-type data in the traffic information data using a convolutional neural network;
- the data fusion subunit is used to fuse the three-dimensional point cloud data and the corresponding image feature information to obtain a fusion result of specific traffic data;
- the traffic fundamental diagram construction subunit is used to construct a traffic fundamental diagram according to the fusion results of multiple sets of specific traffic data;
- the data analysis subunit is used for judging the operation status of vehicles according to the traffic fundamental diagram and the traffic congestion index standard, identifying the operation status of the road network, and obtaining the evolution law of the road network operation situation.
- Another aspect of the present invention provides a road network operating state detection method for mixed traffic flow, including:
- S1 Real-time collection of traffic information data of networked vehicles and non-networked vehicles in the observation section;
- S2 Perform information interaction between connected vehicles, between connected vehicles and the roadside, and transmit the traffic information data to the cloud control platform;
- S3 Using the cloud control platform to perform data fusion and analysis on the traffic information data within a predetermined period of time, so as to judge the vehicle operation condition and obtain the evolution rule of the road network operation situation.
- said S1 includes:
- S11 Collect the operation information of the connected vehicle, the operation information of adjacent non-connected vehicles, and the traffic information of the road section where the vehicle is located; the operation information of the adjacent non-connected vehicles includes at least the orientation, distance and position information data, acceleration and deceleration information data and path data of the non-connected vehicles;
- S12 Collect traffic information on the observed road section, the traffic information on the observed road section at least includes road video data, road image data, and vehicle information in the observed road section.
- said S3 includes:
- S31 Receive different types of traffic information data measured by connected vehicles and different roadside sensors
- S32 Perform data feature extraction and fusion on the different types of traffic information data, and obtain the evolution rule of the road network operation situation according to the fusion result.
- said S32 includes:
- S321 Establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
- S322 Extract image feature information from the image-type data in the traffic information data using a convolutional neural network;
- S323 Fuse the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result of specific traffic data;
- S324 Construct a traffic fundamental diagram according to the fusion results of multiple sets of specific traffic data;
- S325 Judge vehicle running conditions according to the traffic fundamental diagram and traffic congestion index standards, identify the road network running state, and obtain the evolution law of the road network running situation.
- the road network operation state detection system for mixed traffic flow of the present invention considers the different characteristics of a mixed traffic flow randomly composed of networked and non-networked vehicles. When non-networked vehicles without communication capability are present, the perception and communication of networked vehicles and roadside equipment on the road can collect and transmit their position, proportion, speed changes, driving path and inter-vehicle spacing information, supplementing road state information, and the accuracy of real-time collection of road network state information is improved by combining networked-vehicle communication data.
- for the different degrees of intelligence and connectivity present in mixed traffic environments, the road network operating state detection system for mixed traffic flow of the present invention identifies the intelligence level and running status of mixed-flow roads through information fusion, feature extraction and analysis of characteristic data, and judges event information such as whether there is congestion or an obstacle ahead, which improves the openness, versatility and compatibility of the road network running status detection system.
- the cloud control platform module of the present invention can carry out holographic traffic information fusion and overcome the deficiency of a single information source.
- the state information collected and exchanged from multiple directions and dimensions is fused to eliminate redundancy, make up for missing data, check anomalies and improve the accuracy of traffic state recognition, which supports real-time decision-making and control at the assisted-driving end of networked vehicles and the manual-driving end of non-networked vehicles for safe driving.
- Fig. 1 is a block diagram of a road network operating state detection system oriented to mixed traffic flow provided by an embodiment of the present invention
- FIG. 2 is a specific structural diagram of a mixed traffic flow-oriented road network operating state detection system provided by an embodiment of the present invention
- Fig. 3 is a schematic structural diagram of a data processing unit provided by an embodiment of the present invention.
- Fig. 4 is a schematic flow chart of a road network operation state detection method oriented to mixed traffic flow provided by an embodiment of the present invention.
- FIG. 1 is a block diagram of a mixed traffic flow-oriented road network operation status detection system provided by an embodiment of the present invention.
- the road network operation status detection system includes a perception module 1, a communication module 2 and a cloud control platform module 3, wherein the perception module 1 is used for real-time collection of traffic information data of networked and non-networked vehicles in the observed road section; the communication module 2 is used for information interaction between connected vehicles and between connected vehicles and the roadside, and transmits the traffic information data to the cloud control platform module 3; and the cloud control platform module 3 is used to fuse and analyse the traffic information data within a predetermined time period, so as to judge the vehicle operation situation and obtain the evolution law of the road network operation situation.
- FIG. 2 is a specific structural diagram of a mixed traffic flow-oriented road network operation status detection system provided by an embodiment of the present invention.
- the perception module 1 of this embodiment includes a vehicle-mounted sensing unit 11 and a roadside sensing unit 12, wherein the vehicle-mounted sensing unit 11 is arranged on a networked vehicle and is used to collect the operation information of the networked vehicle and of adjacent non-networked vehicles; the operation information of the networked vehicle includes its position, speed, acceleration and deceleration, vehicle type and so on, and the operation information of the adjacent non-networked vehicles includes at least their orientation, distance and position information data, acceleration and deceleration information data and route data.
- the roadside sensing unit 12 is arranged on the observed road section, and is used for collecting traffic information on the observed road section, and the traffic information on the observed road section includes at least road video data, road picture data and vehicle information in the observed road section.
- the vehicle-mounted perception unit 11 of this embodiment may include millimeter-wave radar, GPS, IMU inertial navigation equipment, a vehicle-mounted camera and so on, and can obtain in real time driving information such as the position, vehicle type, speed, acceleration and surrounding images of the current connected vehicle; it can also obtain information such as the vehicle type, orientation, distance and position information, acceleration and deceleration, and path data of adjacent non-networked vehicles.
- the roadside sensing unit 12 may include Internet of Things sensors such as microwave, geomagnetic and vibration sensors, as well as roadside radar, cameras, video detectors and other sensors, and is used to collect, from the roadside, the speed, density, position and image information of vehicles in the observed road section.
- the video detector can trigger the camera when a vehicle enters its capture range, after which video processing derives traffic information such as the share of connected vehicles, signal status, traffic weather, road congestion and emergencies.
- the roadside perception unit 12 can also obtain information such as the vehicle type, orientation, distance, position information, acceleration and deceleration, and route data of non-networked vehicles within the detection range.
- in the mixed traffic flow of this embodiment, both connected and non-connected vehicles are present; connected vehicles are randomly distributed, and non-connected vehicles cannot exchange data with the roadside equipment or the cloud control platform module 3.
- the vehicle-mounted sensing unit 11 installed on a networked vehicle can obtain the orientation, distance and position information data, acceleration and deceleration information data, path data and so on of its adjacent non-networked vehicles, making perception more comprehensive and accurate with low latency.
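- As one concrete illustration of this vehicle-side perception, the short sketch below (not taken from the patent; the function and parameter names are assumptions for illustration) converts a radar range/bearing measurement of an adjacent non-connected vehicle into map coordinates using the ego vehicle's GPS position and heading:

```python
import math

def neighbor_global_position(ego_x, ego_y, ego_heading_rad, rel_range_m, rel_bearing_rad):
    """Project a radar range/bearing measurement of an adjacent non-connected
    vehicle into the map frame of the ego (connected) vehicle."""
    theta = ego_heading_rad + rel_bearing_rad      # absolute bearing of the neighbour
    return (ego_x + rel_range_m * math.cos(theta),
            ego_y + rel_range_m * math.sin(theta))

# Neighbour detected 25 m away, 5 degrees to the left of the ego heading.
print(neighbor_global_position(100.0, 50.0, math.radians(30.0), 25.0, math.radians(5.0)))
```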
- the communication module 2 includes a vehicle-mounted communication unit 21 and a roadside communication unit 22, wherein the vehicle-mounted communication unit 21 is arranged on the connected vehicle and is used for information interaction with the vehicle-mounted communication units on other connected vehicles, the roadside communication unit 22 and the cloud control platform module 3; the roadside communication unit 22 is set on the observed road section and is used for information interaction with the vehicle-mounted communication units 21 on networked vehicles passing through the observed road section and with the cloud control platform module 3.
- Both the on-vehicle communication unit 21 and the roadside communication unit 22 use low-latency, highly reliable information communication methods to communicate between vehicles and between vehicles and roads.
- the real-time communication capability of networked vehicles underpins traffic information interaction and can effectively improve the stability and safety of vehicle operation.
- the vehicle communication unit 21 of the present embodiment can be any one of WiFi, DSRC, LoRa, Bluetooth, LTE-V and 4G/5G/6G; the roadside communication unit 22 is any one of WiFi, Bluetooth, LTE-V and 4G/5G/6G.
- the vehicle-mounted communication unit 21 is directly connected to the in-vehicle CAN (Controller Area Network) bus, and realizes communication between vehicles and between vehicles and the road through different communication protocols.
- the on-vehicle communication unit 21 uses DSRC technology to communicate with the roadside communication unit 22 through microwave devices, a Bluetooth sensor network connects vehicles on the road with one another, and cellular communication, for example via LTE-V/4G/5G/6G base stations, is used to interact with the cloud control platform module 3 and share the vehicle-side road traffic information collected by the vehicle-mounted sensing unit 11.
- the roadside communication unit 22 is connected and communicated with the vehicle-mounted communication unit 21 , and the roadside communication unit 22 can also transmit the location information of non-networked vehicles within the detection range to the cloud control platform module 3 .
- the addition of the on-vehicle communication unit 21 and the roadside communication unit 22 effectively provides real-time vehicle-to-vehicle and vehicle-to-road information services as well as vehicle monitoring and management.
- although a non-connected vehicle has no communication function, information such as its position and speed changes can be collected by the sensors of adjacent connected vehicles and transmitted by the vehicle communication unit 21 to other connected vehicles, roadside equipment and the cloud control platform module 3.
- intelligent networked vehicles at different penetration rates can communicate in real time and collect road condition information around the vehicle to assist operation at the assisted-driving end of connected vehicles and the manual-driving end of non-connected vehicles, adapt the car-following strategy of mixed-traffic vehicles, improve traffic stability and reduce traffic accidents.
- the driving terminal of the connected vehicle can adjust the driving strategy of the vehicle according to the acquired road condition information and vehicle distribution information around the vehicle, while the driving terminal of the non-connected vehicle can adjust the operation of the vehicle according to the driving conditions of other connected vehicles.
- the cloud control platform module 3 of the present embodiment includes a cloud platform communication unit 31 and a data processing unit 32, wherein the cloud platform communication unit 31 can be connected to the vehicle communication unit 21 and the roadside communication unit 22 for receiving different types of traffic information data measured by different sensors on networked vehicles and at the roadside;
- the data processing unit 32 is used to extract and fuse data features from the different types of traffic information data, and obtain the evolution rule of the road network operation situation according to the fusion results.
- FIG. 3 is a schematic structural diagram of a data processing unit provided by an embodiment of the present invention.
- the data processing unit 32 includes a point cloud data feature extraction subunit 321, an image data feature extraction subunit 322, a data fusion subunit 323, a traffic fundamental diagram construction subunit 324 and a data analysis subunit 325, wherein the point cloud data feature extraction subunit 321 is used to establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data; the image data feature extraction subunit 322 is used to extract image feature information from the image-type data in the traffic information data using a convolutional neural network; the data fusion subunit 323 is used to fuse the three-dimensional point cloud data and the corresponding image feature information to obtain the fusion result of specific traffic data; the traffic fundamental diagram construction subunit 324 is used to construct a traffic fundamental diagram from the fusion results of multiple groups of specific traffic data; and the data analysis subunit 325 is used to judge the vehicle operation situation according to the traffic fundamental diagram and the traffic congestion index standard, identify the road network operation status, and obtain the evolution law of the road network operation situation.
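- To make the division of labour among sub-units 321-325 easier to follow, the sketch below outlines the processing chain; the five callables are placeholders standing in for concrete implementations of each sub-unit and are assumptions for illustration, not interfaces defined by the patent:

```python
def detect_road_network_state(scatter_data, image_data,
                              extract_point_cloud, extract_image_features,
                              fuse, build_fundamental_diagram, analyse):
    """Processing chain of the cloud-side data processing unit; the five
    callables correspond to sub-units 321-325 described above."""
    cloud = extract_point_cloud(scatter_data)        # sub-unit 321: 3-D point cloud
    feats = extract_image_features(image_data)       # sub-unit 322: CNN image features
    fused = fuse(cloud, feats)                       # sub-unit 323: data fusion
    diagram = build_fundamental_diagram([fused])     # sub-unit 324: fundamental diagram
    return analyse(diagram)                          # sub-unit 325: state judgement

# Trivial stand-ins just to show the call order; real implementations would
# follow the descriptions of sub-units 321-325.
state = detect_road_network_state(
    scatter_data=[(0.0, 12.5, 3.2)], image_data=None,
    extract_point_cloud=lambda s: s,
    extract_image_features=lambda i: [],
    fuse=lambda c, f: c,
    build_fundamental_diagram=lambda groups: groups,
    analyse=lambda d: "smooth",
)
print(state)  # -> smooth
```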
- the point cloud data feature extraction subunit 321 obtains the position data of vehicles in the observed road collected at different time points, constructs a three-dimensional coordinate system, and treats the position data at the different time points as point cloud data, represented in the three-dimensional coordinate system as a set of three-dimensional points, where each point is a vector of its coordinates and carries the position information of a vehicle.
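- A minimal sketch of this step, assuming positions arrive as (time, x, y) samples in a local metric frame (the array layout is an illustrative choice, not one prescribed by the patent):

```python
import numpy as np

def build_position_point_cloud(samples):
    """Turn per-timestamp vehicle positions into a 3-D point cloud.
    samples: iterable of (t, x, y) tuples collected at different time points.
    Returns an (N, 3) array; each row is the coordinate vector of one point
    and carries the position information of a vehicle at that instant."""
    return np.asarray([(x, y, t) for (t, x, y) in samples], dtype=float)

cloud = build_position_point_cloud([
    (0.0, 12.5, 3.2),   # t = 0 s
    (1.0, 18.0, 3.3),   # t = 1 s
    (2.0, 23.8, 3.1),   # t = 2 s
])
print(cloud.shape)  # (3, 3)
```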
- the image data feature extraction subunit 322 obtains image data of vehicles in the observed road collected at different time points, that is, obtains images that can reflect vehicle position information at the time points corresponding to the above-mentioned point cloud data.
- the image data collected by the camera is passed through a convolutional neural network to extract image feature information; by establishing correspondences between the different coordinate systems, the point cloud coordinate data corresponding to the image features is obtained from a mapping matrix, and this point cloud coordinate data also contains the position information of the vehicles.
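- The correspondence between the two coordinate systems can be illustrated with a 3x4 mapping (projection) matrix; the sketch below maps point-cloud coordinates into the image plane so that image features and point-cloud points can be paired. The matrix values are toy assumptions:

```python
import numpy as np

def project_points(points_xyz, mapping_matrix):
    """Map 3-D point-cloud coordinates into the image plane with a 3x4
    mapping (projection) matrix, so image features can be paired with the
    point-cloud coordinates they correspond to."""
    pts_h = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])  # homogeneous coords
    uvw = pts_h @ mapping_matrix.T
    return uvw[:, :2] / uvw[:, 2:3]   # normalise by the third component

# Toy mapping matrix: unit focal length, no rotation or offset.
M = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
print(project_points(np.array([[2.0, 1.0, 5.0]]), M))  # -> [[0.4 0.2]]
```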
- the data fusion subunit 323 is used to fuse the above two sets of point cloud coordinate data. The specific process is as follows: a symmetric function is used to aggregate the information of each coordinate vector in the two sets of point cloud coordinate data to realize data feature extraction; data fusion then follows the entropy weight method: for the extracted feature vectors, the probability distribution of each feature value is calculated by the maximum likelihood method, the probabilities attributable to the different data sources (i.e. the point cloud data or the image data) are calculated with the Bayesian formula, the Shannon entropy is calculated to obtain the mutual information between the different data sources, the weights corresponding to the different data sources are then calculated from the mutual information, and the fused result is obtained according to these weights.
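- The fusion step is described only at a high level; the sketch below is one simplified reading of it, in which the element-wise maximum plays the role of the symmetric aggregation function and an inverse-Shannon-entropy weighting stands in for the maximum-likelihood/Bayes/mutual-information weighting of the entropy weight method. It illustrates the idea, not the patent's exact computation:

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def entropy_weight_fuse(feat_a, feat_b):
    """Fuse two feature vectors describing the same quantity, e.g. vehicle
    positions recovered from the point cloud and from the image features.
    Element-wise max serves as the symmetric (permutation-invariant)
    aggregation; normalised magnitudes stand in for per-feature probability
    distributions; a lower Shannon entropy (a more informative source) is
    given a larger fusion weight."""
    pooled = [np.maximum.reduce(np.atleast_2d(np.asarray(f, float)))
              for f in (feat_a, feat_b)]                       # symmetric aggregation
    probs = [np.abs(p) / np.abs(p).sum() for p in pooled]      # stand-in distributions
    ents = np.array([shannon_entropy(p) for p in probs])
    inv = 1.0 / (ents + 1e-9)
    weights = inv / inv.sum()                                  # low entropy -> high weight
    fused = weights[0] * pooled[0] + weights[1] * pooled[1]
    return fused, weights

fused, w = entropy_weight_fuse([[11.9, 3.1, 0.2]], [[12.2, 3.0, 0.1]])
print(fused, w)
```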
- by fusing data of different formats obtained in different ways, the accuracy is higher: redundancy is eliminated, missing data is made up for, abnormal data is checked, and the accuracy and stability of the system are improved.
- the above process only illustrates, by way of example, the fusion of vehicle location information collected by different devices; for other information such as traffic flow, vehicle density and headway, the same process can also be used for data fusion to obtain more accurate data, and the specific process is not repeated here.
- the traffic fundamental diagram construction subunit 324 analyzes the traffic flow characteristics from the various fused characteristic data of the mixed traffic flow (for example, traffic volume, vehicle speed, density, headway, etc.) and constructs the traffic fundamental diagram with a cellular automaton model. Subsequently, the data analysis subunit 325 judges the operation status (smooth, mild congestion, moderate congestion or severe congestion) according to the traffic congestion index standard, identifies the road network operation status, and obtains the evolution law of the road network operation situation, such as location, traffic volume, degree of congestion, whether there is a sudden change in vehicle speed, and whether the flow is likely to become smooth or congested.
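- The text names a cellular automaton model without specifying one; the classic Nagel-Schreckenberg automaton is used below purely as an example of how (density, flow) samples for a fundamental diagram could be generated:

```python
import numpy as np

def nasch_flow(density, length=200, steps=400, v_max=5, p_slow=0.3, seed=0):
    """Nagel-Schreckenberg cellular automaton on a ring road; returns one
    (density, flow) sample for the fundamental diagram."""
    rng = np.random.default_rng(seed)
    n_cars = max(1, int(density * length))
    pos = np.sort(rng.choice(length, n_cars, replace=False))
    vel = np.zeros(n_cars, dtype=int)
    moved = 0
    for _ in range(steps):
        gaps = (np.roll(pos, -1) - pos - 1) % length  # empty cells to the next car
        vel = np.minimum(vel + 1, v_max)              # acceleration
        vel = np.minimum(vel, gaps)                   # no collisions
        vel[rng.random(n_cars) < p_slow] -= 1         # random slowdown
        vel = np.maximum(vel, 0)
        pos = (pos + vel) % length
        moved += vel.sum()
    return n_cars / length, moved / (steps * length)  # density, mean flow

# Sweep densities to trace the flow-density (fundamental) diagram.
for rho in (0.1, 0.3, 0.6):
    print(nasch_flow(rho))
```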
- through its computing and communication functions, the cloud control platform module 3 of this embodiment combines the perception module and the communication module with the cloud control platform module to construct an integrated platform for perception, computation and communication.
- the road network operation state detection system for mixed traffic flow of this embodiment considers the different characteristics of a mixed traffic flow randomly composed of networked and non-networked vehicles. When non-networked vehicles without communication capability are present, the perception and communication of networked vehicles and roadside equipment on the road can collect and transmit their position, proportion, speed changes, driving path and inter-vehicle spacing information, supplementing road state information, and the accuracy of real-time collection of road network state information is improved by combining networked-vehicle communication data.
- for the different degrees of intelligence and connectivity in the mixed traffic environment, the system identifies the intelligence level and operating status of mixed-flow roads through information fusion, feature extraction and analysis of characteristic data, and judges event information such as whether there is congestion or an obstacle ahead, which improves the openness, versatility and compatibility of the road network operation status detection system.
- this embodiment provides a road network operation state detection method for mixed traffic flow.
- the method for detecting the running state of the road network includes:
- S1 Real-time collection of traffic information data of networked vehicles and non-networked vehicles in the observation section;
- S2 Perform information interaction between connected vehicles, between connected vehicles and the roadside, and transmit the traffic information data to the cloud control platform;
- S3 Using the cloud control platform to perform data fusion and analysis on the traffic information data within a predetermined period of time, so as to judge the vehicle operation condition and obtain the evolution rule of the road network operation situation.
- the S1 includes:
- S11 Collect the operation information of the connected vehicle, the operation information of adjacent non-connected vehicles, and the traffic information of the road section where the vehicle is located; the operation information of the adjacent non-connected vehicles includes at least the orientation, distance and position information data, acceleration and deceleration information data and path data of the non-connected vehicles;
- S12 Collect traffic information on the observed road section, the traffic information on the observed road section at least includes road video data, road image data, and vehicle information in the observed road section.
- said S3 includes:
- S31 Receive different types of traffic information data measured by connected vehicles and different roadside sensors
- S32 Perform data feature extraction and fusion on the different types of traffic information data, and obtain the evolution rule of the road network operation situation according to the fusion result.
- said S32 includes:
- S321 Establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
- S322 Extract image feature information from the image-type data in the traffic information data using a convolutional neural network;
- S323 Fuse the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result of specific traffic data;
- S324 Construct a traffic fundamental diagram according to the fusion results of multiple sets of specific traffic data;
- S325 Judge vehicle running conditions according to the traffic fundamental diagram and traffic congestion index standards, identify the road network running state, and obtain the evolution law of the road network running situation.
- taking the fusion analysis of vehicle position data as an example, the position data of vehicles in the observed road collected at different time points is first obtained, a three-dimensional coordinate system is constructed, and the position data at the different time points is treated as point cloud data, represented in the three-dimensional coordinate system as a set of three-dimensional points, where each point is a vector of its coordinates and carries the position information of a vehicle.
- next, image data of vehicles in the observed road collected at different time points is obtained, i.e. images reflecting vehicle position information at the time points corresponding to the above point cloud data; the image data collected by the camera is passed through a convolutional neural network to extract image feature information, the point cloud coordinate data corresponding to the image features is obtained from a mapping matrix, and this point cloud coordinate data also contains the position information of the vehicles.
- the above two sets of point cloud coordinate data are then fused. The specific process is as follows: a symmetric function is used to aggregate the information of each coordinate vector in the two sets of point cloud coordinate data to realize data feature extraction; data fusion then follows the entropy weight method: for the extracted feature vectors, the probability distribution of each feature value is calculated by the maximum likelihood method, the probabilities attributable to the different data sources (that is, the point cloud data or the image data) are calculated with the Bayesian formula, the Shannon entropy is calculated to obtain the mutual information between the different data sources, the weights corresponding to the different data sources are then calculated from the mutual information, and the fused result is calculated according to these weights.
- the above process only illustrates, by way of example, the fusion of vehicle location information collected by different devices; for other information such as traffic flow, vehicle density and headway, the same process can also be used for data fusion to obtain more accurate data, and the specific process is not repeated here.
- then, based on the various fused characteristic data of the mixed traffic flow (for example, traffic volume, vehicle speed, density, headway, etc.), the traffic flow characteristics are analyzed and the traffic fundamental diagram is constructed with a cellular automaton model.
- the operation status (smooth, mild congestion, moderate congestion or severe congestion) is then judged according to the traffic congestion index standard, the road network operation status is identified, and the evolution law of the road network operation situation is obtained, such as location, traffic volume, degree of congestion, whether there is a sudden change in vehicle speed, and whether the flow is likely to become smooth or congested.
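- The congestion index standard and its thresholds are not specified in the text; the sketch below shows the shape of such a classification with illustrative cut points on an assumed 0-10 index scale:

```python
def classify_state(congestion_index):
    """Map a traffic congestion index onto the four operating states used in
    the text. The 0-10 scale and the cut points are illustrative assumptions,
    not values prescribed by any particular standard."""
    if congestion_index < 4:
        return "smooth"
    if congestion_index < 6:
        return "mild congestion"
    if congestion_index < 8:
        return "moderate congestion"
    return "severe congestion"

print([classify_state(i) for i in (2.0, 5.5, 9.1)])
```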
- the method for detecting the road network operating status in this embodiment can perform holographic traffic information fusion to overcome the deficiency of a single information source.
- the state information collected and exchanged from multiple directions and dimensions is fused to eliminate redundancy, make up for missing data, check anomalies and improve the accuracy of traffic state recognition, which supports real-time decision-making and control at the assisted-driving end of networked vehicles and the manual-driving end of non-networked vehicles for safe driving.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Traffic Control Systems (AREA)
Abstract
A road network operation state detection system and method for mixed traffic flow. The system comprises a perception module (1), a communication module (2) and a cloud control platform module (3), wherein the perception module (1) is used to collect, in real time, traffic information data of connected and non-connected vehicles within an observed road section; the communication module (2) is used for information interaction between connected vehicles and between connected vehicles and the roadside, and transmits the traffic information data to the cloud control platform module (3); and the cloud control platform module (3) is used to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation. The system takes into account the different characteristics of mixed traffic flow: the operation data of non-connected vehicles are collected and relayed by connected vehicles and roadside equipment on the road, supplementing road state information, and the accuracy of real-time detection and identification of road network state information is improved by combining connected-vehicle communication data.
Description
The present invention belongs to the technical field of intelligent transportation, and in particular relates to a road network operation state detection system and method for mixed traffic flow.
With the rapid development of intelligent driving technology laying the foundation for the Internet-of-Vehicles environment, intelligent vehicles have become a global research hotspot. Connected vehicles can use vehicle-to-vehicle communication to obtain real-time status information of the vehicle ahead, thereby assisting human drivers in driving decisions, improving road traffic stability and increasing the overall capacity of the road network. In the process of popularizing the Internet of Vehicles, the mixed traffic flow state in which connected and non-connected vehicles travel randomly mixed in different proportions is an inevitable stage of development, and this stage will last for a long time; research on road network operation state detection systems for the mixed traffic flow environment is therefore particularly important.
As an emerging technical field, road state detection and recognition for intelligent transportation involves a broad and deep cross-fusion of disciplines and industries such as urban road traffic, vehicles, communication, computing and information technology. Through vehicle-to-vehicle and vehicle-to-road information interaction and sharing, cooperative communication between vehicles, and between vehicles and infrastructure, can be fully realized, thereby improving the accuracy of road network operation state detection, relieving road congestion, increasing traffic efficiency, and ensuring traffic safety and stability.
As vehicles equipped with driver-assistance systems enter road use, vehicle driving behaviour, the ways state information is perceived and collected, and the road environment all change. However, existing work provides road traffic operation state detection systems intended for traffic environments composed entirely of connected vehicles; it does not take into account the different characteristics of mixed traffic flow composed of connected and non-connected vehicles, such as perception methods, car-following modes and headway control, nor does it fully consider connected-vehicle communication data under different penetration rates on mixed-flow roads. Such systems therefore cannot achieve accurate perception and accurate identification of the operating state in a mixed traffic environment, and their versatility and compatibility are weak.
Summary of the invention
In order to solve the above problems in the prior art, the present invention provides a road network operation state detection system and method for mixed traffic flow. The technical problem to be solved by the present invention is realized through the following technical solutions:
The present invention provides a road network operation state detection system for mixed traffic flow, comprising a perception module, a communication module and a cloud control platform module, wherein:
the perception module is used to collect, in real time, traffic information data of connected and non-connected vehicles within an observed road section;
the communication module is used for information interaction between connected vehicles and between connected vehicles and the roadside, and transmits the traffic information data to the cloud control platform module;
the cloud control platform module is used to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
In an embodiment of the present invention, the perception module includes a vehicle-mounted sensing unit and a roadside sensing unit, wherein:
the vehicle-mounted sensing unit is arranged on the connected vehicle and is used to collect the operation information of the connected vehicle and of adjacent non-connected vehicles; the operation information of the adjacent non-connected vehicles includes at least the orientation, distance and position information data, acceleration/deceleration information data and path data of the non-connected vehicles;
the roadside sensing unit is arranged on the observed road section and is used to collect traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
In an embodiment of the present invention, the communication module includes a vehicle-mounted communication unit and a roadside communication unit, wherein:
the vehicle-mounted communication unit is arranged on the connected vehicle and is used for information interaction with the vehicle-mounted communication units on other connected vehicles, the roadside communication unit and the cloud control platform module;
the roadside communication unit is arranged on the observed road section and is used for information interaction with the vehicle-mounted communication units on connected vehicles passing through the observed road section and with the cloud control platform module.
In an embodiment of the present invention, the vehicle-mounted communication unit is any one of WiFi, DSRC, LoRa, Bluetooth, LTE-V and 4G/5G/6G; the roadside communication unit is any one of WiFi, Bluetooth, LTE-V and 4G/5G/6G.
In an embodiment of the present invention, the cloud control platform module includes a cloud platform communication unit and a data processing unit, wherein:
the cloud platform communication unit can be connected to the vehicle-mounted communication unit and the roadside communication unit, and is used to receive different types of traffic information data measured by different sensors on connected vehicles and at the roadside;
the data processing unit is used to perform data feature extraction and fusion on the different types of traffic information data, and to obtain the evolution law of the road network operation situation according to the fusion result.
In an embodiment of the present invention, the data processing unit includes a point cloud data feature extraction sub-unit, an image data feature extraction sub-unit, a data fusion sub-unit, a traffic fundamental diagram construction sub-unit and a data analysis sub-unit, wherein:
the point cloud data feature extraction sub-unit is used to establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
the image data feature extraction sub-unit is used to extract image feature information from the image-type data in the traffic information data using a convolutional neural network;
the data fusion sub-unit is used to fuse the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data;
the traffic fundamental diagram construction sub-unit is used to construct a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data;
the data analysis sub-unit is used to judge vehicle operation conditions according to the traffic fundamental diagram and a traffic congestion index standard, identify the road network operation state, and obtain the evolution law of the road network operation situation.
Another aspect of the present invention provides a road network operation state detection method for mixed traffic flow, comprising:
S1: collecting, in real time, traffic information data of connected and non-connected vehicles within an observed road section;
S2: performing information interaction between connected vehicles and between connected vehicles and the roadside, and transmitting the traffic information data to a cloud control platform;
S3: using the cloud control platform to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
In an embodiment of the present invention, S1 includes:
S11: collecting the operation information of the connected vehicle, the operation information of adjacent non-connected vehicles, and the traffic information of the road section where the vehicle is located, the operation information of the adjacent non-connected vehicles including at least the orientation, distance and position information data, acceleration/deceleration information data and path data of the non-connected vehicles;
S12: collecting traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
In an embodiment of the present invention, S3 includes:
S31: receiving different types of traffic information data measured by different sensors on connected vehicles and at the roadside;
S32: performing data feature extraction and fusion on the different types of traffic information data, and obtaining the evolution law of the road network operation situation according to the fusion result.
In an embodiment of the present invention, S32 includes:
S321: establishing a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
S322: extracting image feature information from the image-type data in the traffic information data using a convolutional neural network;
S323: fusing the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data;
S324: constructing a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data;
S325: judging vehicle operation conditions according to the traffic fundamental diagram and the traffic congestion index standard, identifying the road network operation state, and obtaining the evolution law of the road network operation situation.
Compared with the prior art, the beneficial effects of the present invention are:
1. Compared with existing road traffic state detection applied to traffic environments composed entirely of connected vehicles, the mixed-traffic-flow-oriented road network operation state detection system of the present invention considers the different characteristics of mixed traffic flow randomly composed of connected and non-connected vehicles. When non-connected vehicles without communication capability are present, the perception and communication of connected vehicles and roadside equipment on the road can collect and transmit their position, proportion, speed changes, driving path and inter-vehicle spacing information, supplementing road state information and improving the accuracy of real-time collection of road network state information by combining connected-vehicle communication data.
2. For the different degrees of intelligence and connectivity present in a mixed traffic environment, the mixed-traffic-flow-oriented road network operation state detection system of the present invention identifies the intelligence level and operating state of mixed-flow roads through information fusion, feature extraction and analysis of characteristic data, and judges event information such as whether there is congestion or an obstacle ahead, thereby improving the openness, versatility and compatibility of the road network operation state detection system.
3. The cloud control platform module of the present invention can perform holographic traffic information fusion and overcome the limitations of a single information source. State information collected and exchanged from multiple directions and dimensions is fused to eliminate redundancy, make up for missing data, check anomalies and improve the accuracy of traffic state recognition, which supports real-time decision-making and control at the assisted-driving end of connected vehicles and the manual-driving end of non-connected vehicles for safe driving.
The present invention is described in further detail below with reference to the accompanying drawings and embodiments.
Fig. 1 is a block diagram of a road network operation state detection system for mixed traffic flow provided by an embodiment of the present invention;
Fig. 2 is a detailed structural diagram of a road network operation state detection system for mixed traffic flow provided by an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a data processing unit provided by an embodiment of the present invention;
Fig. 4 is a schematic flow chart of a road network operation state detection method for mixed traffic flow provided by an embodiment of the present invention.
In order to further explain the technical means adopted by the present invention to achieve its intended purpose and their effects, a road network operation state detection system and method for mixed traffic flow according to the present invention are described in detail below with reference to the accompanying drawings and specific embodiments.
The foregoing and other technical content, features and effects of the present invention will become apparent from the following detailed description of specific embodiments taken in conjunction with the accompanying drawings. The description of the specific embodiments provides a deeper and more concrete understanding of the technical means and effects adopted by the present invention to achieve its intended purpose; however, the accompanying drawings are provided for reference and explanation only and are not intended to limit the technical solution of the present invention.
It should be noted that relational terms such as "first" and "second" herein are used only to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include" or any other variant thereof are intended to cover a non-exclusive inclusion, so that an article or device including a series of elements includes not only those elements but also other elements not explicitly listed. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the article or device that includes the element.
Embodiment 1
Referring to Fig. 1, Fig. 1 is a block diagram of a road network operation state detection system for mixed traffic flow provided by an embodiment of the present invention. The road network operation state detection system includes a perception module 1, a communication module 2 and a cloud control platform module 3, wherein the perception module 1 is used to collect, in real time, traffic information data of connected and non-connected vehicles within an observed road section; the communication module 2 is used for information interaction between connected vehicles and between connected vehicles and the roadside, and transmits the traffic information data to the cloud control platform module 3; and the cloud control platform module 3 is used to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
Further, referring to Fig. 2, Fig. 2 is a detailed structural diagram of a road network operation state detection system for mixed traffic flow provided by an embodiment of the present invention. The perception module 1 of this embodiment includes a vehicle-mounted sensing unit 11 and a roadside sensing unit 12. The vehicle-mounted sensing unit 11 is arranged on a connected vehicle and is used to collect the operation information of the connected vehicle and of adjacent non-connected vehicles; the operation information of the connected vehicle includes its position, speed, acceleration/deceleration, vehicle type and so on, and the operation information of the adjacent non-connected vehicles includes at least their orientation, distance and position information data, acceleration/deceleration information data and path data. The roadside sensing unit 12 is arranged on the observed road section and is used to collect traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
The vehicle-mounted sensing unit 11 of this embodiment may include millimetre-wave radar, GPS, IMU inertial navigation equipment, a vehicle-mounted camera and so on; it can obtain, in real time, driving information such as the position, vehicle type, speed, acceleration and surrounding images of the current connected vehicle, and can also obtain information such as the vehicle type, orientation, distance and position information data, acceleration/deceleration and path data of adjacent non-connected vehicles.
The roadside sensing unit 12 may include Internet-of-Things sensors such as microwave, geomagnetic and vibration sensors, as well as roadside radar, cameras, video detectors and other sensors, and is used to collect, from the roadside, the speed, density, position and image information of vehicles within the observed road section. The video detector can trigger the camera when a vehicle enters its capture range, after which video processing derives traffic information such as the share of connected vehicles, signal status, traffic weather, road congestion and incidents. The roadside sensing unit 12 can also obtain information such as the vehicle type, orientation, distance and position information data, acceleration/deceleration and path data of non-connected vehicles within its detection range.
It should be noted that the mixed traffic flow of this embodiment contains both connected and non-connected vehicles; connected vehicles are randomly distributed, and non-connected vehicles cannot exchange data with roadside equipment or the cloud control platform module 3. In this embodiment, the vehicle-mounted sensing unit 11 arranged on a connected vehicle can obtain the orientation, distance and position information data, acceleration/deceleration information data, path data and so on of its adjacent non-connected vehicles, making perception more comprehensive and accurate with low latency.
Further, the communication module 2 includes a vehicle-mounted communication unit 21 and a roadside communication unit 22. The vehicle-mounted communication unit 21 is arranged on the connected vehicle and is used for information interaction with the vehicle-mounted communication units on other connected vehicles, the roadside communication unit 22 and the cloud control platform module 3; the roadside communication unit 22 is arranged on the observed road section and is used for information interaction with the vehicle-mounted communication units 21 on connected vehicles passing through the observed road section and with the cloud control platform module 3.
Both the vehicle-mounted communication unit 21 and the roadside communication unit 22 adopt low-latency, highly reliable communication methods for vehicle-to-vehicle and vehicle-to-road communication. The real-time communication capability of connected vehicles underpins traffic information interaction and can effectively improve driving stability and safety.
The vehicle-mounted communication unit 21 of this embodiment may be any one of WiFi, DSRC, LoRa, Bluetooth, LTE-V and 4G/5G/6G; the roadside communication unit 22 is any one of WiFi, Bluetooth, LTE-V and 4G/5G/6G.
In this embodiment, the vehicle-mounted communication unit 21 is directly connected to the in-vehicle CAN (Controller Area Network) bus and communicates between vehicles and between vehicles and the road through different communication protocols. The vehicle-mounted communication unit 21 uses DSRC technology to communicate with the roadside communication unit 22 via microwave devices, a Bluetooth sensor network connects vehicles on the road with one another, and cellular communication, for example via LTE-V/4G/5G/6G base stations, is used to interact with the cloud control platform module 3 and share the vehicle-side road traffic information collected by the vehicle-mounted sensing unit 11.
The roadside communication unit 22 is connected to and communicates with the vehicle-mounted communication unit 21, and can also transmit the position information of non-connected vehicles within its detection range to the cloud control platform module 3.
The addition of the vehicle-mounted communication unit 21 and the roadside communication unit 22 effectively provides real-time vehicle-to-vehicle and vehicle-to-road information services as well as vehicle monitoring and management. Although a non-connected vehicle has no communication function, information such as its position and speed changes can be collected by the sensors of adjacent connected vehicles and transmitted by the vehicle-mounted communication unit 21 to other connected vehicles, roadside equipment and the cloud control platform module 3, supplementing road state perception data and expanding the coverage of traffic information collection. In a mixed traffic scenario, intelligent connected vehicles at different penetration rates can communicate in real time and collect road condition information around the vehicle to assist operation at the assisted-driving end of connected vehicles and the manual-driving end of non-connected vehicles, adapt the car-following strategy of mixed-flow vehicles, improve traffic flow stability and reduce traffic accidents. Specifically, the driving end of a connected vehicle can adjust its driving strategy according to the acquired surrounding road condition and vehicle distribution information, while the driver of a non-connected vehicle can adjust the vehicle's operation according to the driving behaviour of other connected vehicles.
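As an illustration of the kind of payload the vehicle-mounted communication unit 21 might share with the roadside communication unit 22 and the cloud control platform module 3, the sketch below defines a hypothetical message structure; the field names are assumptions and are not taken from the patent or from any V2X standard.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class VehicleStateMessage:
    """Hypothetical per-vehicle payload shared with the roadside and cloud;
    a record may also describe an adjacent non-connected vehicle sensed by a
    connected vehicle."""
    vehicle_id: str
    timestamp: float          # seconds since epoch
    lon: float
    lat: float
    speed_mps: float
    accel_mps2: float
    heading_deg: float
    is_connected: bool        # False when describing a sensed non-connected vehicle

msg = VehicleStateMessage("cv-0017", 1724140800.0, 108.95, 34.27, 16.4, -0.8, 92.0, True)
print(json.dumps(asdict(msg)))
```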
The cloud control platform module 3 of this embodiment includes a cloud platform communication unit 31 and a data processing unit 32. The cloud platform communication unit 31 can be connected to the vehicle-mounted communication unit 21 and the roadside communication unit 22, and is used to receive different types of traffic information data measured by different sensors on connected vehicles and at the roadside; the data processing unit 32 is used to perform data feature extraction and fusion on the different types of traffic information data and to obtain the evolution law of the road network operation situation according to the fusion result.
Referring to Fig. 3, Fig. 3 is a schematic structural diagram of a data processing unit provided by an embodiment of the present invention. The data processing unit 32 includes a point cloud data feature extraction sub-unit 321, an image data feature extraction sub-unit 322, a data fusion sub-unit 323, a traffic fundamental diagram construction sub-unit 324 and a data analysis sub-unit 325. The point cloud data feature extraction sub-unit 321 is used to establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data; the image data feature extraction sub-unit 322 is used to extract image feature information from the image-type data in the traffic information data using a convolutional neural network; the data fusion sub-unit 323 is used to fuse the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data; the traffic fundamental diagram construction sub-unit 324 is used to construct a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data; and the data analysis sub-unit 325 is used to judge vehicle operation conditions according to the traffic fundamental diagram and the traffic congestion index standard, identify the road network operation state, and obtain the evolution law of the road network operation situation.
Specifically, taking the fusion analysis of vehicle position data as an example, the point cloud data feature extraction sub-unit 321 obtains the position data of vehicles on the observed road collected at different time points, constructs a three-dimensional coordinate system, and treats the position data at the different time points as point cloud data, represented in the three-dimensional coordinate system as a set of three-dimensional points, where each point is a vector of its coordinates and carries the position information of a vehicle.
The image data feature extraction sub-unit 322 obtains image data of vehicles on the observed road collected at different time points, i.e. images that reflect vehicle position information at the time points corresponding to the above point cloud data. The image data collected by the camera is passed through a convolutional neural network to extract image feature information; by establishing correspondences between the different coordinate systems, the point cloud coordinates corresponding to the image features are obtained from a mapping matrix, and these point cloud coordinates also contain the vehicle position information.
The data fusion sub-unit 323 is used to fuse the above two sets of point cloud coordinate data. The specific process is: a symmetric function is used to aggregate the information of each coordinate vector in the two sets of point cloud coordinate data to realise data feature extraction; data fusion then follows the entropy weight method. For the extracted feature vectors, the probability distribution of each feature value is computed by maximum likelihood, the probabilities attributable to the different data sources (i.e. the point cloud data or the image data) are computed with the Bayesian formula, the Shannon entropy is computed to obtain the mutual information between the different data sources, the weights corresponding to the different data sources are then computed from the mutual information, and the fused result is obtained from these weights.
By fusing data of different formats obtained in different ways, the accuracy is higher: redundancy is eliminated, missing data are made up for, abnormal data are checked, and the accuracy and stability of the system are improved.
It should be noted that the above process only illustrates, by way of example, the fusion of vehicle position information collected by different devices; for other information such as traffic volume, vehicle density and headway, the same process can also be used for data fusion to obtain more accurate data, and the details are not repeated here.
Next, the traffic fundamental diagram construction sub-unit 324 analyses the traffic flow characteristics from the various fused characteristic data of the mixed traffic flow (for example, traffic volume, speed, density, headway, etc.) and constructs the traffic fundamental diagram with a cellular automaton model. Subsequently, the data analysis sub-unit 325 judges the operating condition (smooth, mild congestion, moderate congestion or severe congestion) according to the traffic congestion index standard, identifies the road network operation state, and obtains the evolution law of the road network operation situation, such as location, traffic volume, degree of congestion, whether there is a sudden change in speed, and whether the flow is likely to become smooth or congested.
Through its computing and communication functions, the cloud control platform module 3 of this embodiment combines the perception module and the communication module with the cloud control platform module to build an integrated perception-computation-communication platform. It provides cloud-side holographic information fusion for the mixed traffic flow environment, eliminating redundant data, filling in lost data and checking abnormal data to improve data accuracy, and realises recognition of the traffic operation situation in mixed-flow environments with different levels of road intelligence and different penetration rates of intelligent connected vehicles. This supports real-time decision-making and control at the assisted-driving end for safe driving, and assists manual drivers of non-connected vehicles in judging road conditions, for example whether to accelerate or decelerate and whether the road ahead is congested.
Compared with existing road traffic state detection systems applied to traffic environments composed entirely of connected vehicles, the mixed-traffic-flow-oriented road network operation state detection system of this embodiment considers the different characteristics of mixed traffic flow randomly composed of connected and non-connected vehicles. When non-connected vehicles without communication capability are present, the perception and communication of connected vehicles and roadside equipment on the road can collect and transmit their position, proportion, speed changes, driving path and inter-vehicle spacing information, supplementing road state information and improving the accuracy of real-time collection of road network state information by combining connected-vehicle communication data. For the different degrees of intelligence and connectivity present in a mixed traffic environment, the system identifies the intelligence level and operating state of mixed-flow roads through information fusion, feature extraction and analysis of characteristic data, and judges event information such as whether there is congestion or an obstacle ahead, thereby improving the openness, versatility and compatibility of the road network operation state detection system.
Embodiment 2
On the basis of the above embodiment, this embodiment provides a road network operation state detection method for mixed traffic flow. The method includes:
S1: collecting, in real time, traffic information data of connected and non-connected vehicles within an observed road section;
S2: performing information interaction between connected vehicles and between connected vehicles and the roadside, and transmitting the traffic information data to a cloud control platform;
S3: using the cloud control platform to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
Further, S1 includes:
S11: collecting the operation information of the connected vehicle, the operation information of adjacent non-connected vehicles, and the traffic information of the road section where the vehicle is located, the operation information of the adjacent non-connected vehicles including at least the orientation, distance and position information data, acceleration/deceleration information data and path data of the non-connected vehicles;
S12: collecting traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
Further, S3 includes:
S31: receiving different types of traffic information data measured by different sensors on connected vehicles and at the roadside;
S32: performing data feature extraction and fusion on the different types of traffic information data, and obtaining the evolution law of the road network operation situation according to the fusion result.
Further, S32 includes:
S321: establishing a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data;
S322: extracting image feature information from the image-type data in the traffic information data using a convolutional neural network;
S323: fusing the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data;
S324: constructing a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data;
S325: judging vehicle operation conditions according to the traffic fundamental diagram and the traffic congestion index standard, identifying the road network operation state, and obtaining the evolution law of the road network operation situation.
Specifically, taking the fusion analysis of vehicle position data as an example, the position data of vehicles on the observed road collected at different time points is first obtained, a three-dimensional coordinate system is constructed, and the position data at the different time points is treated as point cloud data, represented in the three-dimensional coordinate system as a set of three-dimensional points, where each point is a vector of its coordinates and carries the position information of a vehicle.
Next, image data of vehicles on the observed road collected at different time points is obtained, i.e. images that reflect vehicle position information at the time points corresponding to the above point cloud data. The image data collected by the camera is passed through a convolutional neural network to extract image feature information; by establishing correspondences between the different coordinate systems, the point cloud coordinates corresponding to the image features are obtained from a mapping matrix, and these point cloud coordinates also contain the vehicle position information.
The above two sets of point cloud coordinate data are then fused. The specific process is: a symmetric function is used to aggregate the information of each coordinate vector in the two sets of point cloud coordinate data to realise data feature extraction; data fusion then follows the entropy weight method. For the extracted feature vectors, the probability distribution of each feature value is computed by maximum likelihood, the probabilities attributable to the different data sources (i.e. the point cloud data or the image data) are computed with the Bayesian formula, the Shannon entropy is computed to obtain the mutual information between the different data sources, the weights corresponding to the different data sources are then computed from the mutual information, and the fused result is obtained from these weights.
It should be noted that the above process only illustrates, by way of example, the fusion of vehicle position information collected by different devices; for other information such as traffic volume, vehicle density and headway, the same process can also be used for data fusion to obtain more accurate data, and the details are not repeated here.
Next, the traffic flow characteristics are analysed from the various fused characteristic data of the mixed traffic flow (for example, traffic volume, speed, density, headway, etc.) and the traffic fundamental diagram is constructed with a cellular automaton model. Subsequently, the operating condition (smooth, mild congestion, moderate congestion or severe congestion) is judged according to the traffic congestion index standard, the road network operation state is identified, and the evolution law of the road network operation situation is obtained, such as location, traffic volume, degree of congestion, whether there is a sudden change in speed, and whether the flow is likely to become smooth or congested.
In addition, the road network operation state detection method of this embodiment can perform holographic traffic information fusion and overcome the limitations of a single information source. State information collected and exchanged from multiple directions and dimensions is fused to eliminate redundancy, make up for missing data, check anomalies and improve the accuracy of traffic state recognition, which supports real-time decision-making and control at the assisted-driving end of connected vehicles and the manual-driving end of non-connected vehicles for safe driving.
The above is a further detailed description of the present invention in conjunction with specific preferred embodiments, and it should not be considered that the specific implementation of the present invention is limited to these descriptions. For a person of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may also be made without departing from the concept of the present invention, all of which shall be regarded as falling within the protection scope of the present invention.
Claims (10)
- A road network operation state detection system for mixed traffic flow, characterised in that it comprises a perception module (1), a communication module (2) and a cloud control platform module (3), wherein the perception module (1) is used to collect, in real time, traffic information data of connected and non-connected vehicles within an observed road section; the communication module (2) is used for information interaction between connected vehicles and between connected vehicles and the roadside, and transmits the traffic information data to the cloud control platform module (3); and the cloud control platform module (3) is used to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
- The road network operation state detection system for mixed traffic flow according to claim 1, characterised in that the perception module (1) comprises a vehicle-mounted sensing unit (11) and a roadside sensing unit (12), wherein the vehicle-mounted sensing unit (11) is arranged on the connected vehicle and is used to collect the operation information of the connected vehicle and of adjacent non-connected vehicles, the operation information of the adjacent non-connected vehicles including at least the orientation, distance and position information data, acceleration/deceleration information data and path data of the non-connected vehicles; and the roadside sensing unit (12) is arranged on the observed road section and is used to collect traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
- The road network operation state detection system for mixed traffic flow according to claim 1, characterised in that the communication module (2) comprises a vehicle-mounted communication unit (21) and a roadside communication unit (22), wherein the vehicle-mounted communication unit (21) is arranged on the connected vehicle and is used for information interaction with the vehicle-mounted communication units on other connected vehicles, the roadside communication unit (22) and the cloud control platform module (3); and the roadside communication unit (22) is arranged on the observed road section and is used for information interaction with the vehicle-mounted communication units (21) on connected vehicles passing through the observed road section and with the cloud control platform module (3).
- The road network operation state detection system for mixed traffic flow according to claim 3, characterised in that the vehicle-mounted communication unit (21) is any one of WiFi, DSRC, LoRa, Bluetooth, LTE-V and 4G/5G/6G; and the roadside communication unit (22) is any one of WiFi, Bluetooth, LTE-V and 4G/5G/6G.
- The road network operation state detection system for mixed traffic flow according to claim 1, characterised in that the cloud control platform module (3) comprises a cloud platform communication unit (31) and a data processing unit (32), wherein the cloud platform communication unit (31) can be connected to the vehicle-mounted communication unit (21) and the roadside communication unit (22) and is used to receive different types of traffic information data measured by different sensors on connected vehicles and at the roadside; and the data processing unit (32) is used to perform data feature extraction and fusion on the different types of traffic information data, and to obtain the evolution law of the road network operation situation according to the fusion result.
- The road network operation state detection system for mixed traffic flow according to claim 5, characterised in that the data processing unit (32) comprises a point cloud data feature extraction sub-unit (321), an image data feature extraction sub-unit (322), a data fusion sub-unit (323), a traffic fundamental diagram construction sub-unit (324) and a data analysis sub-unit (325), wherein the point cloud data feature extraction sub-unit (321) is used to establish a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data; the image data feature extraction sub-unit (322) is used to extract image feature information from the image-type data in the traffic information data using a convolutional neural network; the data fusion sub-unit (323) is used to fuse the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data; the traffic fundamental diagram construction sub-unit (324) is used to construct a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data; and the data analysis sub-unit (325) is used to judge vehicle operation conditions according to the traffic fundamental diagram and a traffic congestion index standard, identify the road network operation state, and obtain the evolution law of the road network operation situation.
- A road network operation state detection method for mixed traffic flow, characterised by comprising: S1: collecting, in real time, traffic information data of connected and non-connected vehicles within an observed road section; S2: performing information interaction between connected vehicles and between connected vehicles and the roadside, and transmitting the traffic information data to a cloud control platform; S3: using the cloud control platform to perform data fusion and analysis on the traffic information data within a predetermined time period, so as to judge vehicle operation conditions and obtain the evolution law of the road network operation situation.
- The road network operation state detection method for mixed traffic flow according to claim 7, characterised in that S1 comprises: S11: collecting the operation information of the connected vehicle, the operation information of adjacent non-connected vehicles, and the traffic information of the road section where the vehicle is located, the operation information of the adjacent non-connected vehicles including at least the orientation, distance and position information data, acceleration/deceleration information data and path data of the non-connected vehicles; S12: collecting traffic information on the observed road section, which includes at least road video data, road picture data and vehicle information within the observed road section.
- The road network operation state detection method for mixed traffic flow according to claim 8, characterised in that S3 comprises: S31: receiving different types of traffic information data measured by different sensors on connected vehicles and at the roadside; S32: performing data feature extraction and fusion on the different types of traffic information data, and obtaining the evolution law of the road network operation situation according to the fusion result.
- The road network operation state detection method for mixed traffic flow according to claim 9, characterised in that S32 comprises: S321: establishing a three-dimensional coordinate system for the scatter-type data in the traffic information data to obtain three-dimensional point cloud data; S322: extracting image feature information from the image-type data in the traffic information data using a convolutional neural network; S323: fusing the three-dimensional point cloud data with the corresponding image feature information to obtain a fusion result for specific traffic data; S324: constructing a traffic fundamental diagram from the fusion results of multiple sets of specific traffic data; S325: judging vehicle operation conditions according to the traffic fundamental diagram and the traffic congestion index standard, identifying the road network operation state, and obtaining the evolution law of the road network operation situation.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110960547.6 | 2021-08-20 | ||
CN202110960547.6A CN113870553B (zh) | 2021-08-20 | 2021-08-20 | 一种面向混合交通流的路网运行状态检测系统及方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023019761A1 true WO2023019761A1 (zh) | 2023-02-23 |
Family
ID=78987924
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/129854 WO2023019761A1 (zh) | 2021-08-20 | 2021-11-10 | 一种面向混合交通流的路网运行状态检测系统及方法 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113870553B (zh) |
WO (1) | WO2023019761A1 (zh) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116403437A (zh) * | 2023-03-16 | 2023-07-07 | 安徽海博智能科技有限责任公司 | 一种基于云雾融合的露天矿山车路协同系统 |
CN116564084A (zh) * | 2023-05-08 | 2023-08-08 | 苏州大学 | 一种基于纯路端感知的网联式辅助驾驶控制方法及系统 |
CN116958763A (zh) * | 2023-05-04 | 2023-10-27 | 浙江大学 | 特征-结果级融合的车路协同感知方法、介质及电子设备 |
CN117455121A (zh) * | 2023-12-19 | 2024-01-26 | 广东申创光电科技有限公司 | 一种智慧道路的信息管理方法及系统 |
CN118230554A (zh) * | 2024-05-23 | 2024-06-21 | 无锡学院 | 基于物联网与边缘计算的车载实时道路信息采集系统 |
CN118351695A (zh) * | 2024-05-13 | 2024-07-16 | 深圳技术大学 | 一种道路交通状态评估方法、装置、终端及存储介质 |
CN118627019A (zh) * | 2024-08-14 | 2024-09-10 | 深圳市长江连接器有限公司 | 一种基于人工智能的连接器状态检测方法及装置 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114758494B (zh) * | 2022-03-25 | 2023-05-30 | 西安电子科技大学广州研究院 | 一种基于通信感知多源数据融合的交通参数检测系统及方法 |
CN115188181A (zh) * | 2022-05-18 | 2022-10-14 | 合众新能源汽车有限公司 | 一种多融合道路车辆感知、导航方法及系统 |
CN114937081B (zh) | 2022-07-20 | 2022-11-18 | 之江实验室 | 基于独立非均匀增量采样的网联车位置估计方法及装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102819954A (zh) * | 2012-08-28 | 2012-12-12 | 南京大学 | 交通区域动态地图监控预测系统 |
CN109714730A (zh) * | 2019-02-01 | 2019-05-03 | 清华大学 | 用于车车及车路协同的云控平台系统及协同系统和方法 |
US20190156668A1 (en) * | 2016-08-11 | 2019-05-23 | Jiangsu University | Driving service active sensing system and method in internet of vehicles environment |
CN111862655A (zh) * | 2020-05-27 | 2020-10-30 | 南京美慧软件有限公司 | 一种智能高速公路网交通设施系统及控制方法 |
CN112866328A (zh) * | 2020-11-06 | 2021-05-28 | 深圳慧拓无限科技有限公司 | 一种面向智能网联汽车的车路协同系统及方法 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105702031B (zh) * | 2016-03-08 | 2018-02-23 | 北京航空航天大学 | 基于宏观基本图的路网关键路段识别方法 |
CN106971565B (zh) * | 2017-04-22 | 2019-08-23 | 高新兴科技集团股份有限公司 | 基于物联网的区域交通边界控制与诱导协同方法及系统 |
CN107292965B (zh) * | 2017-08-03 | 2020-10-13 | 北京航空航天大学青岛研究院 | 一种基于深度图像数据流的虚实遮挡处理方法 |
CN107219533B (zh) * | 2017-08-04 | 2019-02-05 | 清华大学 | 激光雷达点云与图像融合式探测系统 |
CN109920246B (zh) * | 2019-02-22 | 2022-02-11 | 重庆邮电大学 | 一种基于v2x通信与双目视觉的协同局部路径规划方法 |
CN110570675B (zh) * | 2019-10-17 | 2020-10-27 | 中国公路工程咨询集团有限公司 | 一种车路协同环境下高速公路施工区的路侧控制系统 |
CN111540237B (zh) * | 2020-05-19 | 2021-09-28 | 河北德冠隆电子科技有限公司 | 基于多数据融合的车辆安全行驶保障方案自动生成的方法 |
2021
- 2021-08-20 CN CN202110960547.6A patent/CN113870553B/zh active Active
- 2021-11-10 WO PCT/CN2021/129854 patent/WO2023019761A1/zh active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102819954A (zh) * | 2012-08-28 | 2012-12-12 | 南京大学 | 交通区域动态地图监控预测系统 |
US20190156668A1 (en) * | 2016-08-11 | 2019-05-23 | Jiangsu University | Driving service active sensing system and method in internet of vehicles environment |
CN109714730A (zh) * | 2019-02-01 | 2019-05-03 | 清华大学 | 用于车车及车路协同的云控平台系统及协同系统和方法 |
CN111862655A (zh) * | 2020-05-27 | 2020-10-30 | 南京美慧软件有限公司 | 一种智能高速公路网交通设施系统及控制方法 |
CN112866328A (zh) * | 2020-11-06 | 2021-05-28 | 深圳慧拓无限科技有限公司 | 一种面向智能网联汽车的车路协同系统及方法 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116403437A (zh) * | 2023-03-16 | 2023-07-07 | 安徽海博智能科技有限责任公司 | 一种基于云雾融合的露天矿山车路协同系统 |
CN116958763A (zh) * | 2023-05-04 | 2023-10-27 | 浙江大学 | 特征-结果级融合的车路协同感知方法、介质及电子设备 |
CN116564084A (zh) * | 2023-05-08 | 2023-08-08 | 苏州大学 | 一种基于纯路端感知的网联式辅助驾驶控制方法及系统 |
CN117455121A (zh) * | 2023-12-19 | 2024-01-26 | 广东申创光电科技有限公司 | 一种智慧道路的信息管理方法及系统 |
CN117455121B (zh) * | 2023-12-19 | 2024-04-02 | 广东申创光电科技有限公司 | 一种智慧道路的信息管理方法及系统 |
CN118351695A (zh) * | 2024-05-13 | 2024-07-16 | 深圳技术大学 | 一种道路交通状态评估方法、装置、终端及存储介质 |
CN118230554A (zh) * | 2024-05-23 | 2024-06-21 | 无锡学院 | 基于物联网与边缘计算的车载实时道路信息采集系统 |
CN118627019A (zh) * | 2024-08-14 | 2024-09-10 | 深圳市长江连接器有限公司 | 一种基于人工智能的连接器状态检测方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
CN113870553B (zh) | 2023-08-29 |
CN113870553A (zh) | 2021-12-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2023019761A1 (zh) | 一种面向混合交通流的路网运行状态检测系统及方法 | |
CN106128140B (zh) | 车联网环境下行车服务主动感知系统及方法 | |
CN100374332C (zh) | 一种车载嵌入式系统 | |
CN206772594U (zh) | 自动驾驶车辆避让动态障碍物能力的测试场 | |
WO2017071224A1 (zh) | 一种行车信息共享的方法、车载平台及智能交通系统 | |
CN111724616B (zh) | 基于人工智能的数据获取及共享的方法与装置 | |
CN103295424B (zh) | 基于视频识别和车载自组网的汽车主动安全系统 | |
CN202424782U (zh) | 车载终端装置 | |
KR20180034268A (ko) | V2v 센서 공유 방법에 기초한 동적 교통 안내 | |
CN108010383A (zh) | 基于行驶车辆的盲区检测方法、装置、终端及车辆 | |
WO2021155685A1 (zh) | 一种更新地图的方法、装置和设备 | |
CN111469838A (zh) | 一种基于车联网的协同acc/aeb决策管理系统及该车辆 | |
CN104781865A (zh) | 用于借助于至少一辆机动车提供行驶路段信息的方法 | |
CN110675628A (zh) | 一种路侧智能网联信息交互边缘装置 | |
CN103198690A (zh) | 交通信息传送方法、车载信息终端、路边单元和数据中心 | |
US11938965B2 (en) | Information service method for vehicle dispatch system, vehicle dispatch system, and information service device | |
CN113724531B (zh) | 一种车联网环境下路口人车路协作预警系统及方法 | |
WO2022156309A1 (zh) | 一种轨迹预测方法、装置及地图 | |
CN104680838A (zh) | 用于汽车的安全辅助方法和系统 | |
CN114882700A (zh) | 一种基于v2x车路协同的精准公交实现方式 | |
CN112017459A (zh) | 车辆、车机设备及其信号灯识别的驾驶辅助方法 | |
Anushya | Vehicle monitoring for traffic violation using V2I communication | |
CN206672365U (zh) | 一种应用于智慧城市的道路交通管理系统 | |
CN105654745A (zh) | 一种基于智能手机的交通流量实时监控方法 | |
CN116403437A (zh) | 一种基于云雾融合的露天矿山车路协同系统 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21953996; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21953996; Country of ref document: EP; Kind code of ref document: A1 |