CN115240405A - Traffic information management method, system, network equipment and storage medium - Google Patents

Traffic information management method, system, network equipment and storage medium Download PDF

Info

Publication number
CN115240405A
Authority
CN
China
Prior art keywords: monitored object, motion, motion parameter, monitoring, parameters
Prior art date
Legal status
Pending
Application number
CN202110449528.7A
Other languages
Chinese (zh)
Inventor
彭福超
Current Assignee
ZTE Corp
Original Assignee
ZTE Corp
Priority date
Filing date
Publication date
Application filed by ZTE Corp
Priority to CN202110449528.7A
Publication of CN115240405A

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G08G1/0137 Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the invention provide a traffic information management method, a system, a network device, and a storage medium. The network device determines actual motion parameters of a monitored object according to the data acquisition result of roadside acquisition equipment, predicts the motion of the monitored object according to extracted motion characteristic parameters to obtain predicted motion parameters, and then determines and outputs reliability data of the motion characteristic parameters of the monitored object according to the actual motion parameters and the predicted motion parameters. The reliability data reflects the accuracy of the predicted motion parameters of the monitored object, that is, the reliability of its motion characteristic parameters. Through this traffic information management scheme, the network device can evaluate the reliability of the motion characteristic parameters of the monitored object, so that a traffic information application party can make more reliable decisions based on the reliability data when applying the motion characteristic parameters of a monitored object on the road, which improves the traffic scheduling effect and enhances traffic safety.

Description

Traffic information management method, system, network equipment and storage medium
Technical Field
The embodiments of the present invention relate to the field of the Internet of Vehicles, and in particular to a traffic information management method, a traffic information management system, a network device, and a storage medium.
Background
The intelligent traffic envisioned by V2X (Vehicle to Everything) depicts a traffic scene of vehicle-road cooperation and human-vehicle cooperation. In such a scene, travel efficiency and travel safety are much higher than they are today, so that roads are no longer congested and the thousands of lives lost to traffic accidents every year can be saved. A V2X intelligent traffic scene collects road traffic information and provides it to the vehicles on the road. It can be understood that the accuracy and reliability of the road traffic information not only directly affect the effect of traffic scheduling, but also concern the safety of many traffic participants.
Disclosure of Invention
The traffic information management method, system, network device, and storage medium provided by the embodiments of the present invention at least solve the following technical problem: how to evaluate the reliability of the comprehensive analysis results derived from road traffic information.
An embodiment of the present invention provides a traffic information management method, which includes the following steps: acquiring actual motion parameters and predicted motion parameters of a monitored object on a road, wherein acquiring the actual motion parameters includes: acquiring a data acquisition result obtained by roadside acquisition equipment collecting data on the motion of the monitored object, and determining the actual motion parameters of the monitored object according to the data acquisition result; and obtaining the predicted motion parameters includes: extracting the motion characteristic parameters of the monitored object, and predicting the motion of the monitored object according to the extracted motion characteristic parameters to obtain its predicted motion parameters; and determining and outputting reliability data of the motion characteristic parameters according to the actual motion parameters and the predicted motion parameters of the monitored object, wherein the reliability data represents the accuracy of the predicted motion parameters obtained from the motion characteristic parameters.
The embodiment of the invention also provides network equipment, which comprises a processor, a memory and a communication bus; the communication bus is used for realizing connection communication between the processor and the memory; the processor is configured to execute one or more computer programs stored in the memory to implement the steps of the aforementioned traffic information management method.
An embodiment of the present invention further provides a traffic information management system, which includes roadside acquisition equipment and a Mobile Edge Computing (MEC) device, wherein the roadside acquisition equipment is in communication connection with the MEC device, and the MEC device is the aforementioned network device.
An embodiment of the present invention further provides a storage medium, where one or more computer programs are stored in the storage medium, and the one or more computer programs are executable by one or more processors to implement the steps of the foregoing traffic information management method.
According to the traffic information management method, system, network device, and storage medium provided by the embodiments of the present invention, on the one hand, the network device acquires the data acquisition result obtained by the roadside acquisition equipment for the motion of the monitored object on the road, and then determines the actual motion parameters of the monitored object according to the data acquisition result; on the other hand, the network device extracts the motion characteristic parameters of the monitored object and predicts the predicted motion parameters of the monitored object according to the extracted motion characteristic parameters. The network device then determines and outputs reliability data of the current motion characteristic parameters of the monitored object according to the actual motion parameters and the predicted motion parameters. The reliability data obtained in this way reflects the accuracy of the predicted motion parameters of the monitored object; since the predicted motion parameters are obtained by prediction from the motion characteristic parameters, the reliability data can reflect the reliability of the motion characteristic parameters of the monitored object. With the traffic information management scheme provided by the embodiments of the present invention, the network device can evaluate the reliability of the motion characteristic parameters of the monitored object, so that a traffic information application party can make more reliable decisions based on the reliability data when applying the motion characteristic parameters of a monitored object on the road, which improves the traffic scheduling effect and enhances traffic safety.
Additional features and corresponding advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
Fig. 1 is a flowchart of a traffic information management method according to the first embodiment of the present invention;
Fig. 2 is a flowchart of the network device determining an actual motion parameter of the monitored object according to the data acquisition result in the first embodiment of the present invention;
Fig. 3 is another flowchart of the network device determining an actual motion parameter of a monitored object according to a data acquisition result in the first embodiment of the present invention;
Fig. 4 is a schematic diagram of the positions of a first monitored object and a second monitored object in the same coordinate system according to the first embodiment of the present invention;
Fig. 5 is a flowchart of the network device fusing a first motion parameter and a second motion parameter by region according to the first embodiment of the present invention;
Fig. 6 is a schematic diagram of the division of the monitoring range into regions according to the first embodiment of the present invention;
Fig. 7 is a flowchart of the network device determining a predicted motion parameter of a monitored object according to the first embodiment of the present invention;
Fig. 8 is a flowchart of a traffic information management method according to the second embodiment of the present invention;
Fig. 9 is a schematic diagram of the hardware structure of a network device according to the third embodiment of the present invention;
Fig. 10 is a schematic diagram of the internal functional units of the network device of Fig. 9;
Fig. 11 is a schematic diagram of a traffic information management system according to the third embodiment of the present invention;
Fig. 12 is another schematic diagram of a traffic information management system according to the third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Embodiment one:
typical scenarios proposed by V2X smart traffic may include, but are not limited to, vehicle collision warning, vulnerable traffic participant warning, road spill warning, and the like.
Vehicle collision early warning: at an intersection, each vehicle has its own position, speed, and direction. If two vehicles, continuing from their current positions at their current speeds and directions, may collide, the OBU (On Board Unit) prompts the driver to avoid or decelerate after receiving the corresponding message.
Early warning of vulnerable traffic participants: while a vehicle is travelling, if a pedestrian or a non-motor vehicle is crossing the road, the OBU prompts the driver to avoid or decelerate after receiving the corresponding message.
Early warning of road spills: while a vehicle is travelling, if there is a spilled object on the road surface, the RSU (Road Side Unit) needs to send an early warning message to the corresponding vehicle in advance, and the OBU prompts the driver to avoid the object after receiving the message.
In the above application scenarios, the intelligent traffic management system needs to accurately analyze the real-time conditions on the road, obtain the motion characteristics of each traffic participant (i.e. the rules governing their motion), and make reliable predictions based on these motion characteristics to obtain the situation of the traffic participants at a certain moment in the future. It is therefore very important to accurately evaluate the credibility of the motion characteristics of the traffic participants. Based on this, the present embodiment provides a traffic information management method, which can be applied to a network device with MEC (Mobile Edge Computing) capability. Referring to the flowchart of the traffic information management method shown in Fig. 1, the method includes, but is not limited to, the following steps.
S102: Acquiring actual motion parameters and predicted motion parameters of a monitored object on the road.
The monitored objects include various objects that can affect traffic conditions on the road, such as moving objects like vehicles (motor vehicles and non-motor vehicles), pedestrians, and spilled objects; in other examples, the monitored objects further include fixed obstacles and the like.
The process of acquiring the actual motion parameters of the monitored object may refer to the process shown in fig. 2:
S202: Acquiring a data acquisition result of the roadside acquisition equipment regarding the motion of the monitored object on the road.
In this embodiment, devices for monitoring road traffic conditions are arranged on or beside the road; these devices are referred to as roadside acquisition equipment. The roadside acquisition equipment is configured to collect data on the motion of the monitored objects on the road, obtain data acquisition results, and transmit the data acquisition results to the network device. The roadside acquisition equipment and the network device are in communication connection; in some examples of this embodiment they communicate in a wired manner, and in other examples they communicate wirelessly.
The roadside acquisition equipment includes, but is not limited to, at least one of echo-type detection equipment and image acquisition equipment. The image acquisition equipment is used for image acquisition and includes cameras; the echo-type detection equipment detects an object by transmitting waves and receiving their echoes, where the transmitted waves include electromagnetic waves and sound waves. In some examples, the echo-type detection equipment may include at least one of a radar device, an infrared acquisition device, and other devices that perform detection based on electromagnetic waves; in other examples, the echo-type detection equipment may include an ultrasonic acquisition device or other devices that perform detection based on sound waves. For example, in some examples of this embodiment the roadside acquisition equipment includes only a radar device, and in other examples it includes only a camera. In some examples of this embodiment, the roadside acquisition equipment includes two or more devices at the same time; for example, both a radar device and a camera are installed at one intersection, while another intersection is provided with not only these two devices but also an infrared acquisition device.
Needless to say, the echo-type detection equipment and the image acquisition equipment may be larger devices capable of working independently, or they may be sensors with the corresponding functions. When echo-type detection equipment or image acquisition equipment exists in the form of a sensor, it can be integrated into other roadside devices, for example into a roadside display device.
In this embodiment, the data acquisition result of the echo-type detection equipment is referred to as an echo acquisition result, and the data acquisition result of the image acquisition equipment is referred to as an image acquisition result. Further, the echo acquisition results can be divided into radar data acquired by radar devices, ultrasonic acquisition results acquired by ultrasonic acquisition devices, and infrared acquisition results of infrared acquisition devices. It can be understood that if the roadside acquisition equipment includes two or more types of devices at the same time, the data acquisition results obtained by the network device also include two or more types at the same time. Taking the case in which the roadside acquisition equipment includes both echo-type detection equipment and image acquisition equipment, the data acquisition results obtained by the network device include both an echo acquisition result and an image acquisition result.
S204: Determining the actual motion parameters of the monitored object according to the data acquisition result.
In this embodiment, after the network device obtains the data acquisition result, it may process and analyze the result to obtain the actual motion parameters of at least one monitored object on the road. In some examples, the network device obtains the actual motion parameters of all the monitored objects currently on the road; of course, in other examples, because the data acquisition result of the roadside acquisition equipment is incomplete, the network device may fail to obtain the actual motion parameters of some monitored objects, so in these examples the network device does not obtain the actual motion parameters of all monitored objects. It should be understood that the network device must be capable of processing the acquisition results of the roadside acquisition equipment; for example, when the roadside acquisition equipment includes echo-type detection equipment, the network device should be able to process echo acquisition results, and when it includes image acquisition equipment such as a camera, the network device should be able to process image data.
The actual motion parameters refer to the current motion parameters of the monitored object obtained by analyzing the data acquisition result of the roadside acquisition equipment; the motion parameters in this embodiment include, but are not limited to, at least one of position, speed, direction, and acceleration. The predicted motion parameters refer to the motion parameters of the monitored object at the current moment predicted from its motion characteristic parameters. For example, if at time T0 the network device determines that the actual motion parameters of the monitored object will next be acquired at time T1, the network device may predict the motion parameters of the monitored object at time T1, i.e. the predicted motion parameters, according to the motion characteristic parameters of the monitored object at time T0. When time T1 arrives, the network device obtains the actual motion parameters of the monitored object according to the data acquisition result of the roadside acquisition equipment.
It should be understood that when there is more than one roadside acquisition device, some monitored objects may appear in the monitoring ranges of several devices at the same time and thus become monitored objects of each of those devices; for example, the movement of a pedestrian may be monitored by the radar device and the camera simultaneously. In this case, after the network device acquires the data acquisition results from the roadside acquisition equipment, different motion parameters may be obtained for the same monitored object. To illustrate how the network device obtains the actual motion parameters of a monitored object from different data acquisition results, the following description continues with the example in which the roadside acquisition equipment includes echo-type detection equipment and image acquisition equipment; please refer to the flowchart shown in Fig. 3, in which the network device determines the actual motion parameters of the monitored object according to the data acquisition results:
S302: Determining first monitoring data according to the echo acquisition result, and determining second monitoring data according to the image acquisition result.
The network device may acquire the echo acquisition result from the echo-type detection equipment and the image acquisition result from the image acquisition equipment. By processing the echo acquisition result, the network device obtains first monitoring data, where the first monitoring data includes the first motion parameters of each first monitored object, and the first motion parameters are the motion parameters of a first monitored object determined by the network device according to the echo acquisition result. Similarly, after processing the image acquisition result, the network device obtains second monitoring data, where the second monitoring data includes the second motion parameters of each second monitored object, and the second motion parameters are the motion parameters of a second monitored object determined by the network device according to the image acquisition result.
In some examples of this embodiment, the first monitoring data further includes a first identifier of each first monitored object, and the second monitoring data includes a second identifier of each second monitored object. The first identifier is identification information that the network device assigns to a first monitored object and is used to uniquely distinguish it among all first monitored objects. The second identifier corresponds to the first identifier: it is identification information assigned by the network device to a second monitored object and is used to uniquely distinguish it among all second monitored objects. Since the same monitored object is likely to be present in both the monitoring range of the echo-type detection equipment and the monitoring range of the image acquisition equipment, the same monitored object may have both a first identifier and a second identifier, and the two identifiers are different; for example, the first identifier of a pedestrian in the first monitoring data is "L57" while its second identifier in the second monitoring data is "C32".
It can be understood that the image acquisition result is image data, so the network device can identify the type of each second monitored object through image recognition, for example determining whether the second monitored object is a vehicle, a pedestrian, or a spilled object. Therefore, in some examples of this embodiment, the second monitoring data further includes type information of each second monitored object.
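Purely for illustration, and not as part of the claimed embodiment, the first and second monitoring data described above could be represented by simple records such as the Python sketch below; the field names, units, and types are assumptions made for this example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MonitoredObjectRecord:
    """One entry of the first or second monitoring data (illustrative fields only)."""
    obj_id: str                     # first identifier (e.g. "L57") or second identifier (e.g. "C32")
    x: float                        # position in the common coordinate system (m)
    y: float
    speed: float                    # m/s
    direction: float                # heading angle (rad)
    accel: float                    # m/s^2
    t: float                        # acquisition time (s)
    obj_type: Optional[str] = None  # type information, available only from image results

# A radar-derived (first) entry and a camera-derived (second) entry for the same pedestrian:
radar_entry = MonitoredObjectRecord("L57", 2.0, 1.0, 1.4, 1.57, 0.0, 10.0)
camera_entry = MonitoredObjectRecord("C32", 2.2, 1.1, 1.3, 1.55, 0.0, 10.0, obj_type="pedestrian")
```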
Since the subsequent process needs to fuse the first motion parameter and the second motion parameter of the same monitored object, the network device needs to ensure that the first motion parameter and the second motion parameter correspond to the same time. It can be understood that in some cases the echo acquisition result obtained by the network device may not correspond to the same time as the image acquisition result; for example, when the echo-type detection equipment and the image acquisition equipment report their data acquisition results at different frequencies, or at the same frequency but at different times, the time of the echo acquisition result obtained by the network device may not match the time of the image acquisition result. To ensure that the first motion parameters and the second motion parameters correspond to the same time, the network device may perform prediction processing on at least one of them when the echo acquisition result and the image acquisition result are not synchronized, so that the finally obtained first monitoring data and second monitoring data correspond to the same time, i.e. the first motion parameters and the second motion parameters correspond to the same time. For example, assume that the echo acquisition result reported to the network device by the echo-type detection equipment is data at time t1, reflecting the movement of the monitored objects on the road at time t1, and that the image acquisition result reported by the image acquisition equipment is data at time t2, reflecting, in images, the movement of the monitored objects on the road at time t2. In this case, if the network device directly processed the two data acquisition results, it would obtain the motion parameters of the first monitored objects at time t1 and the motion parameters of the second monitored objects at time t2, so that fusion of the motion parameters of the same monitored object could not be achieved, or only a meaningless result would be obtained even after forced fusion. To solve this problem, the network device may predict the motion parameters of the first monitored objects at time t2 from their motion parameters at time t1, so as to calculate the first motion parameters of the first monitored objects at time t2, and then fuse them with the second motion parameters of the second monitored objects at time t2. Alternatively, the motion parameters of the second monitored objects at time t1 may be calculated from their motion parameters at time t2, so as to obtain the second motion parameters of the second monitored objects at time t1, which are then fused with the first motion parameters of the first monitored objects at time t1. In some examples, the network device may also select a synchronization time other than t1 and t2 and then process the motion parameters of both the first monitored objects and the second monitored objects, so as to synchronize the first motion parameters and the second motion parameters to that synchronization time.
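As a concrete illustration of this time alignment, the sketch below propagates a single observation to a chosen synchronization time under a constant-acceleration, constant-heading assumption; the function name, the flat-tuple state representation, and the example values are assumptions for this sketch only.

```python
import math

def propagate_to(x, y, speed, heading, accel, t, t_sync):
    """Predict position and speed at t_sync from a state observed at time t,
    assuming acceleration and heading stay constant over the interval."""
    dt = t_sync - t
    new_speed = speed + accel * dt
    travelled = speed * dt + 0.5 * accel * dt * dt
    return (x + travelled * math.cos(heading),
            y + travelled * math.sin(heading),
            new_speed, heading, accel, t_sync)

# Align a radar-side (first) observation taken at t1 = 10.00 s to the
# image-side (second) observation time t2 = 10.10 s before fusion:
synced_state = propagate_to(x=5.0, y=1.5, speed=8.0, heading=0.0,
                            accel=0.5, t=10.00, t_sync=10.10)
```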
S304: Fusing the first motion parameter and the second motion parameter of the same monitored object to obtain the actual motion parameters of the monitored object.
After obtaining the first motion parameters and the second motion parameters for the same time, the network device may fuse the first motion parameter and the second motion parameter of the same monitored object. It can be understood that, because the first motion parameter and the second motion parameter correspond to the same time, even though there is some error in the network device's processing of the echo acquisition result and the image acquisition result, the positions in the first motion parameter and the second motion parameter of the same monitored object should be close. Therefore, in the fusion process, the network device may identify which first monitored object and which second monitored object are the same monitored object based on the positions of the first monitored objects and the second monitored objects, as shown in Fig. 4:
In Fig. 4, circles show the first position information in the first motion parameters, i.e. the positions of the first monitored objects, and black dots show the second position information in the second motion parameters, i.e. the positions of the second monitored objects. As can be seen from Fig. 4, the first monitored object A and the second monitored object x are the same monitored object. Although there is some distance between the first monitored object B and the second monitored object y, in the coordinate system they are already the closest pair of first and second monitored objects, so they are in fact the same monitored object; similarly, the first monitored object C and the second monitored object z belong to the same monitored object. Following this example, when identifying the same monitored object, the network device may place the first position information of each first monitored object and the second position information of each second monitored object in the same coordinate system, and then regard the pair of a first monitored object and a second monitored object whose first position information and second position information are closest to each other as the same monitored object.
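A minimal sketch of this position-based association is given below; the greedy nearest-pair strategy and the 3 m distance threshold are illustrative assumptions rather than a matching rule prescribed by the embodiment.

```python
import math

def match_objects(first_pos, second_pos, max_dist=3.0):
    """Greedily pair first and second monitored objects whose positions in the
    common coordinate system are closest, within max_dist metres of each other."""
    candidates = sorted(
        (math.dist(p1, p2), id1, id2)
        for id1, p1 in first_pos.items()
        for id2, p2 in second_pos.items())
    pairs, used1, used2 = [], set(), set()
    for d, id1, id2 in candidates:
        if d > max_dist:
            break
        if id1 not in used1 and id2 not in used2:
            pairs.append((id1, id2))
            used1.add(id1)
            used2.add(id2)
    return pairs

# The scenario of Fig. 4: pairs A-x, B-y and C-z are found; any unmatched
# second monitored object would be kept as an independent monitored object.
first_pos = {"A": (2.0, 1.0), "B": (10.0, 4.0), "C": (20.0, 2.0)}
second_pos = {"x": (2.2, 1.1), "y": (9.0, 4.5), "z": (20.3, 1.8)}
print(match_objects(first_pos, second_pos))
```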
It should be noted that when the echo-type detection equipment and the image acquisition equipment are a radar device and a camera respectively, because the monitoring range of the radar device is not as wide as that of the camera, under normal circumstances there may be some second monitored objects that have no corresponding first monitored object; the network device may directly treat these second monitored objects as independent monitored objects.
It can be understood that the first motion parameter and the second motion parameter of the same monitored object are fused mainly to obtain the actual motion parameters of each monitored object more accurately and comprehensively. Considering that echo-type detection equipment such as a radar device monitors the motion parameters of a monitored object with higher accuracy, while image acquisition equipment such as a camera has a wider monitoring range, in some examples of this embodiment, if a monitored object is a first monitored object, the network device may directly use its first motion parameter as its actual motion parameter, and if a monitored object is not a first monitored object but only a second monitored object, the network device may use its second motion parameter as its actual motion parameter. In some examples, the network device may divide the monitoring range of the roadside equipment (i.e. the union of the monitoring range of the echo-type detection equipment and the monitoring range of the image acquisition equipment) into at least two regions and provide different fusion strategies for different regions; for a monitored object, its first motion parameter and second motion parameter may then be fused according to the fusion strategy corresponding to the region to which it belongs. Please refer to the flowchart shown in Fig. 5:
S502: Dividing the monitoring range of the roadside acquisition equipment into at least two regions according to the distance from the roadside acquisition equipment.
In this embodiment, when the network device divides the monitoring range into regions, it may do so according to the distance between each position in the monitoring range and the roadside acquisition equipment. For example, as shown in Fig. 6, in one example the network device divides the monitoring range into three regions, a near-segment region 61, a middle-segment region 62, and a far-segment region 63, whose distances from the roadside acquisition equipment 60 increase in that order. In Fig. 6, the roadside acquisition equipment 60 monitors three lanes (lane a, lane b, and lane c) at the same time. Illustratively, the far-segment region 63 may be the region that can be monitored by the image acquisition equipment but not by the echo-type detection equipment, and the near-segment region 61 and the middle-segment region 62 may share the remaining monitoring range; in other examples, the near-segment region may be larger or smaller than the middle-segment region. In still other examples of this embodiment, the monitoring range may be divided into more regions, which are not listed here one by one.
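The sketch below classifies a monitored object into the near-segment, middle-segment, or far-segment region by its distance from the roadside acquisition equipment; the two threshold values are purely illustrative assumptions and are not specified by the embodiment.

```python
def region_of(distance_m, near_limit=60.0, middle_limit=150.0):
    """Map the distance from the roadside acquisition equipment to a region name;
    the thresholds are example values, not values fixed by the embodiment."""
    if distance_m <= near_limit:
        return "near"
    if distance_m <= middle_limit:
        return "middle"
    return "far"

print(region_of(40.0))   # "near"
print(region_of(120.0))  # "middle"
print(region_of(200.0))  # "far", e.g. visible to the camera but not the radar
```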
S504: For the monitored objects in each region, fusing the first motion parameters and the second motion parameters according to the fusion strategy corresponding to the region to which each monitored object belongs.
In this embodiment, the fusion strategies of different regions are not exactly the same, and may even be completely different. For example, in one example of this embodiment, when fusing the motion parameters of a monitored object in the near-segment region, its first motion parameter may be adopted directly as its actual motion parameter; when fusing the motion parameters of a monitored object in the middle-segment region, the mean of the first motion parameter and the second motion parameter may be calculated as its actual motion parameter; and for a monitored object in the far-segment region, the network device may directly take its second motion parameter as its actual motion parameter. It can be understood that in other examples all of the fusion strategies may perform motion parameter fusion in the following manner:
W=W1*q1+W2*q2;
where W denotes the actual motion parameter of the monitored object, W1 and W2 are respectively the first motion parameter and the second motion parameter of the monitored object, and q1 and q2 are respectively the weights set for the first motion parameter and the second motion parameter in the region to which the monitored object belongs. In one example, the closer a region is to the roadside acquisition equipment, the larger q1 and the smaller q2; for example, in the region closest to the roadside acquisition equipment q1 is 1 and q2 is 0, while in the region farthest from the roadside acquisition equipment q1 is 0 and q2 is 1.
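As one possible reading of this weighting scheme, the sketch below applies W = W1*q1 + W2*q2 with region-dependent weights; the 0.5/0.5 split for the middle-segment region and the handling of objects seen by only one device are assumptions made for illustration.

```python
# Example weights (q1 for the first/echo parameter, q2 for the second/image parameter).
# The near and far values follow the text above; the middle split is an assumption.
REGION_WEIGHTS = {
    "near":   (1.0, 0.0),
    "middle": (0.5, 0.5),
    "far":    (0.0, 1.0),
}

def fuse(w1, w2, region):
    """Fuse one first and one second motion parameter value: W = W1*q1 + W2*q2."""
    if w1 is None:   # object only observed by the image acquisition equipment
        return w2
    if w2 is None:   # object only observed by the echo-type detection equipment
        return w1
    q1, q2 = REGION_WEIGHTS[region]
    return w1 * q1 + w2 * q2

# Fusing the measured speed of a monitored object in the middle-segment region:
print(fuse(8.2, 7.8, "middle"))  # 8.0 m/s
```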
Since the second monitoring data includes the type information of each second monitored object, after the motion parameters of each monitored object are fused, the type of each monitored object can be determined according to the second monitoring data.
The process of obtaining the predicted motion parameter of the monitored object may refer to the process shown in fig. 7:
S702: Extracting the motion characteristic parameters of the monitored object.
The "motion characteristic parameter" is actually some parameters that represent the motion law of the monitored object, including but not limited to one or more of the position, speed, acceleration, direction, etc. of the monitored object.
S704: Predicting the predicted motion parameters of the monitored object according to the extracted motion characteristic parameters.
The predicted motion parameters and the actual motion parameters acquired by the network device should correspond to the same time. Assuming that the network device acquires the motion characteristic parameters of a certain monitored object A at time T0 and determines that the actual motion parameters to be acquired next are those of the monitored object after a duration dt, the network device may predict the motion parameters of the monitored object A at time T0 + dt according to the duration dt, thereby obtaining the predicted motion parameters of the monitored object A.
For example, assume that the motion characteristic parameters of the monitored object A at time T0 are a speed v0, an acceleration a0, and a position P0. The network device can then calculate the speed of the monitored object A at time T0 + dt as
v1 = v0 + a0*dt;
and the distance moved during the duration dt as
s1 = v0*dt + a0*dt^2/2;
so that the position P1 of the monitored object A at time T0 + dt is the position reached from P0 after moving the distance s1 along the direction of motion.
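Written out as code, the prediction above amounts to a constant-acceleration update of speed and position; the sketch below is illustrative only, and the function name and the 2-D position representation are assumptions.

```python
import math

def predict_state(p0, v0, a0, heading, dt):
    """Predicted speed and position after dt seconds, assuming the acceleration a0
    and the heading stay constant over the interval."""
    v1 = v0 + a0 * dt                  # v1 = v0 + a0*dt
    s1 = v0 * dt + 0.5 * a0 * dt * dt  # s1 = v0*dt + a0*dt^2/2
    p1 = (p0[0] + s1 * math.cos(heading),
          p0[1] + s1 * math.sin(heading))
    return v1, p1

# Monitored object A at time T0: P0 = (0, 0), v0 = 10 m/s, a0 = 0.5 m/s^2,
# heading along the positive x axis; predicted state dt = 1 s later:
v1, p1 = predict_state((0.0, 0.0), 10.0, 0.5, 0.0, 1.0)
print(v1, p1)  # 10.5 (10.25, 0.0)
```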
It can be understood that the predicted motion parameters need not include every item contained in the actual motion parameters; for example, the predicted motion parameters may not include acceleration, in which case the network device does not need to predict the acceleration of the monitored object.
Because the motion characteristic parameters reflect the motion law of the monitored object, and this motion law changes constantly, in this embodiment the network device, after determining the actual motion parameters of the monitored object from the data acquisition result, can update the motion characteristic parameters of the monitored object according to the actual motion parameters. After the motion characteristic parameters of the monitored object have been updated, if the network device needs to predict the motion parameters of the monitored object, it performs the prediction based on the current, i.e. updated, motion characteristic parameters.
In some examples of this embodiment, when updating the motion characteristic parameters of a monitored object, the network device may directly replace the values in the motion characteristic parameters with the corresponding values in the actual motion parameters, for example directly replacing the position, speed, acceleration, and direction in the motion characteristic parameters with the position, speed, acceleration, and direction in the actual motion parameters. In other examples, considering that data may be missing from the actual motion parameters and that the actual motion parameters are relatively easily affected by incidental factors, the network device may not replace the original motion characteristic parameters with the actual motion parameters directly, but instead revise the existing motion characteristic parameters by combining the actual motion parameters, the existing motion characteristic parameters, and the time interval between two successive updates of the motion characteristic parameters.
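The embodiment does not fix a particular revision rule; as one possibility, the sketch below blends the existing motion characteristic parameter with the newly measured actual value using an exponential weight that grows with the interval between updates. The smoothing form and the time constant are assumptions, not part of the described method.

```python
import math

def revise_feature(old_value, actual_value, dt, tau=2.0):
    """Blend an existing motion characteristic parameter with the new actual value.
    The longer the interval dt since the last update, the more weight the new
    measurement receives; tau (seconds) is an illustrative time constant."""
    if actual_value is None:   # data missing from the actual motion parameters
        return old_value
    alpha = 1.0 - math.exp(-dt / tau)
    return (1.0 - alpha) * old_value + alpha * actual_value

# Revising the speed feature 0.1 s after the previous update:
print(revise_feature(old_value=10.0, actual_value=10.6, dt=0.1))
```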
S104: Determining and outputting reliability data of the motion characteristic parameters of the monitored object according to the actual motion parameters and the predicted motion parameters of the monitored object.
After obtaining the actual motion parameters of the monitored object, the network device may determine the reliability data of the current motion characteristic parameters of the monitored object according to the actual motion parameters and the predicted motion parameters of the monitored object.
In some examples of this embodiment, the network device may directly calculate the difference between the actual motion parameters and the predicted motion parameters of the monitored object and use that difference as the reliability data of the motion characteristic parameters. For example, assuming that the position in the predicted motion parameters is S1 and the position in the actual motion parameters is S2, the network device may determine the reliability data for the position by calculating the distance between S1 and S2. Similarly, assuming that the speed in the predicted motion parameters is v1 and the speed in the actual motion parameters is v2, the network device may determine the reliability data for the speed by calculating the difference between v1 and v2; the calculations for the reliability data of other motion characteristic parameters are not detailed here.
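A minimal sketch of this difference-based calculation is shown below; reporting the position distance and the speed difference follows the text, while the dictionary layout and the example numbers are assumptions for this illustration.

```python
import math

def reliability_data(predicted, actual):
    """Reliability data as the deviation between predicted and actual motion
    parameters: the smaller the deviation, the more reliable the motion
    characteristic parameters from which the prediction was made."""
    return {
        "position_error_m": math.dist(predicted["pos"], actual["pos"]),
        "speed_error_mps": abs(predicted["speed"] - actual["speed"]),
    }

predicted = {"pos": (10.25, 0.0), "speed": 10.5}
actual = {"pos": (10.60, 0.1), "speed": 10.2}
print(reliability_data(predicted, actual))
# approximately {'position_error_m': 0.36, 'speed_error_mps': 0.30}
```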
It can be understood that the reliability data of the motion characteristic parameters is calculated by the network device for the traffic information application party to refer to when applying the motion characteristic parameters of each monitored object. Therefore, in some examples of this embodiment, after updating the motion characteristic parameters of a monitored object, the network device may provide the new motion characteristic parameters together with the reliability data to the traffic information application party. In this embodiment, the traffic information application party includes at least one of the roadside unit corresponding to the road and the V2X platform.
In the traffic information management method provided by this embodiment, the network device predicts the motion parameter of the monitored object at the predetermined time according to the motion characteristic parameter of the monitored object, and obtains the actual motion parameter of the monitored object at the predetermined time according to the data acquisition result of the roadside acquisition device, so that the reliability of the motion characteristic parameter of the monitored object is evaluated according to the difference between the actual motion parameter and the predicted motion parameter.
Embodiment two:
In order to make the advantages and details of the foregoing traffic information management method clearer to those skilled in the art, this embodiment provides a further explanation with reference to an example; please refer to the flowchart of the traffic information management method shown in Fig. 8:
S800: The network device predicts the predicted motion parameters of each monitored object at a predetermined moment according to the motion characteristic parameters of each monitored object.
For monitored objects that the network device has already identified, the network device will generally have obtained their motion characteristic parameters in the previous calculation cycle, so it can predict the predicted motion parameters of these monitored objects at the predetermined moment according to their motion characteristic parameters.
S802: The network device acquires a radar data acquisition result and an image acquisition result.
In this embodiment, the roadside acquisition equipment includes a radar device and a camera, which monitor the movement of the monitored objects in real time; it can be understood that in some other examples the roadside acquisition equipment may also be other types of devices. In some examples, the network device first obtains the predicted motion parameters of the monitored object and then obtains its actual motion parameters; in other examples of this embodiment, the network device may obtain the predicted motion parameters and the actual motion parameters at the same time, or obtain the actual motion parameters first and the predicted motion parameters afterwards. In one example, the network device predicts the motion parameters of the monitored objects while acquiring the data acquisition results of the roadside acquisition equipment.
S804: the network equipment respectively processes the radar data acquisition result and the image acquisition result to obtain first monitoring data and second monitoring data.
By processing the radar data acquisition result, the network device obtains first monitoring data, where the first monitoring data includes the first identifier and the first motion parameters (position, speed, direction, and acceleration) of each first monitored object. After processing the image acquisition result, the network device obtains second monitoring data, where the second monitoring data includes the second identifier and the second motion parameters (position, speed, direction, and acceleration) of each second monitored object, as well as the type information of each second monitored object.
S806: The network device obtains first motion parameters and second motion parameters that are synchronized in time according to the first monitoring data and the second monitoring data.
In some examples of this embodiment, the network device may use a time corresponding to the first monitoring data as a synchronization time, and process the motion parameter corresponding to the second monitoring data, so as to obtain a second motion parameter corresponding to the synchronization time.
S808: The network device marks each first monitored object and each second monitored object in the same coordinate system according to the positions in the first motion parameters and the second motion parameters.
In this embodiment, the position information in the first motion parameter is recorded as first position information, and the position information in the second motion parameter is recorded as second position information, and the network device may mark each first monitoring object in the coordinate system according to the first position information, and mark each second monitoring object according to the second position information.
S810: The network device regards a first monitored object and a second monitored object whose first position information and second position information are closest to each other as the same monitored object.
If a second monitored object has a first monitored object whose position matches it, the second monitored object and that first monitored object are the same monitored object; if a second monitored object has no position-matched first monitored object in the coordinate system, the network device may directly treat the second monitored object as an independent monitored object.
S812: the network equipment fuses the first motion parameter and the second motion parameter of the monitoring object according to a fusion strategy corresponding to the area where the monitoring object is located to obtain the actual motion parameter of the monitoring object, and determines the type information of each monitoring object according to the second monitoring data.
For a monitored object in the near-segment region, its first motion parameter is taken as its actual motion parameter; for a monitored object in the middle-segment region, the mean of its first motion parameter and second motion parameter is taken as its actual motion parameter; and for a monitored object in the far-segment region, its second motion parameter is taken as its actual motion parameter. It can be understood that, because the data acquisition results obtained by the network device from the roadside acquisition equipment (the radar device and the camera) are acquired in real time, the actual motion parameters obtained by fusing the first motion parameters and the second motion parameters are also real-time, leaving aside the time consumed by data processing.
S814: The network device judges whether predicted motion parameters exist for the monitored object.
If yes, executing S816, otherwise entering S818.
S816: The network device determines reliability data of the motion characteristic parameters of the monitored object according to the actual motion parameters and the predicted motion parameters of the monitored object.
For the calculation method of the degree of reliability of the motion characteristic parameter, reference may be made to the description in the foregoing embodiments, and details are not described here.
S818: The network device updates the motion characteristic parameters of the monitored object according to the actual motion parameters of the monitored object.
It can be understood that when a monitored object is identified by the network device for the first time, the network device has not previously acquired the motion characteristic parameters of that object, and naturally has no predicted motion parameters for it, so in this calculation cycle the network device cannot calculate reliability data for its motion characteristic parameters. After obtaining the actual motion parameters of the monitored object, the network device can directly update or create the motion characteristic parameters of the monitored object according to the actual motion parameters, so that the motion parameters of the monitored object can be predicted in the next calculation cycle.
It is understood that after the network device performs S818, the network device may continue to perform S800 again.
S820: The network device sends the motion characteristic parameters and the reliability data of the monitored object to the traffic information application party.
In this embodiment, the traffic information application party may include both the roadside unit and the V2X platform, and the roadside unit either sends the traffic information directly to the on-board unit or sends it to the on-board unit after processing the corresponding traffic information.
For example, if vehicle A is behind and vehicle B is in front, the two vehicles are in the same lane, and they would collide after 20 seconds if both kept driving at their current speed and direction, then the OBU of vehicle A, after receiving the traffic information issued by the RSU and predicting the collision through calculation, can immediately issue a collision warning, for example by reminding the driver of vehicle A through the on-board horn and screen to decelerate or take avoiding action. As another example, assuming that vehicles A and B are travelling and pedestrian C is crossing the road, when the distance between vehicle A or B and pedestrian C is less than a preset distance, the OBUs of vehicles A and B can generate a vulnerable-traffic-participant warning after receiving the traffic information issued by the RSU and prompt the drivers of vehicles A and B to watch for the pedestrian. As a further example, when vehicle A is behind, vehicle B is in front, and vehicle B drops an object D, the OBU of vehicle A generates a road spill warning after receiving the traffic information issued by the RSU and reminds the driver of vehicle A to watch for the object on the road surface.
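Purely as an illustration of how an application party might use such information, the sketch below estimates the time to collision for the two same-lane vehicles in the first example; the 1-D constant-speed model, the example numbers, and the 20 s threshold used for the check are assumptions for this sketch.

```python
def time_to_collision(gap_m, speed_rear, speed_front):
    """Seconds until the rear vehicle reaches the front vehicle, assuming both
    keep their current speed in the same lane; None if the gap is not closing."""
    closing_speed = speed_rear - speed_front
    if closing_speed <= 0:
        return None
    return gap_m / closing_speed

# Vehicle A (rear) at 25 m/s, vehicle B (front) at 20 m/s, 100 m apart:
ttc = time_to_collision(gap_m=100.0, speed_rear=25.0, speed_front=20.0)
if ttc is not None and ttc <= 20.0:
    print(f"possible collision in about {ttc:.0f} s - warn the driver of vehicle A")
```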
The traffic information management method provided by the embodiment of the invention can provide the traffic information with the reliability evaluation result to the traffic information application party, so that the traffic information application party can carry out traffic scheduling according to the reliability of the traffic information, and the safety and the reliability in a traffic scene are improved.
Embodiment three:
the present embodiments provide a storage medium including volatile or non-volatile, removable or non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, computer program modules or other data. Storage media includes, but is not limited to, RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), flash Memory or other Memory technology, CD-ROM (Compact disk Read-Only Memory), digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
The storage medium may store one or more computer programs that can be read, compiled, and executed by one or more processors, and in this embodiment, the storage medium may store a traffic information management program that can be executed by one or more processors to implement the flow of any one of the traffic information management methods described in the foregoing embodiments.
This embodiment also provides a computer program product comprising a computer-readable means on which a computer program as described above is stored. The computer-readable means in this embodiment may include the computer-readable storage medium described above. Exemplarily, this embodiment further provides a network device, as shown in Fig. 9: the network device 90 includes a processor 91, a memory 92, and a communication bus 93 for connecting the processor 91 and the memory 92, where the memory 92 may be the aforementioned storage medium storing the traffic information management program. The network device 90 in this embodiment may include, but is not limited to, an MEC device, a roadside device, and the like.
For a specific process of the network device 90 for implementing the traffic information management method, reference is made to the description of the foregoing embodiment, which is not described herein again.
Referring to the functional unit diagram in the network device shown in fig. 10, the network device 90 includes a radar data processing module 1001, an image data processing module 1002, a time synchronization module 1003, a section division module 1004, a position synchronization module 1005, a near segment fusion module 1006, a middle segment fusion module 1007, a far segment fusion module 1008, and a storage module 1009.
The radar data processing module 1001 is configured to process the radar data acquisition results from the radar device to obtain the position (LPi), speed (LVi), direction (LDi), acceleration (LAi), time (LTi), and ID (LIDi) of each first monitored object on the road, where the time (LTi) is the time at which the radar device acquired the information and the ID (LIDi) is the identification information allocated by the radar data processing module 1001 to the first monitored object, i.e. the first identifier in the foregoing embodiments.
The image data processing module 1002 is configured to process image data from the camera, and integrate a deep learning function and an image analysis function to obtain a position (CPi), a speed (CVi), a direction (CDi), an acceleration (CAi), a time (CTi), and an ID (CIDi) of a second monitoring object on the road, where the time (CTi) refers to a time when the camera acquires information, and the ID (CIDi) refers to identification information that the image data processing module 1002 allocates to the second monitoring object, that is, a second identification in the foregoing embodiment.
A storage module 1009, which stores the data processed by the radar data processing module 1001, the data processed by the image data processing module 1002, and other data that needs to be stored.
The time synchronization module 1003 is configured to synchronize the motion parameters processed by the radar data processing module 1001 and the image data processing module 1002 to the same time.
The section division module 1004 is configured to divide the monitoring range into three regions, namely a near-segment region, a middle-segment region, and a far-segment region, and to assign each monitored object to the corresponding region.
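A minimal sketch of the section division follows, assuming the three regions are defined by two distance thresholds from the roadside collection device; the threshold values are illustrative and not taken from the disclosure.

def assign_zone(position: tuple, device_position: tuple,
                near_limit: float = 50.0, middle_limit: float = 150.0) -> str:
    # Assign a monitored object to the near-, middle-, or far-segment region by
    # its distance from the roadside collection device; the default thresholds
    # are illustrative values only.
    dx = position[0] - device_position[0]
    dy = position[1] - device_position[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if distance <= near_limit:
        return "near"
    if distance <= middle_limit:
        return "middle"
    return "far"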
The position synchronization module 1005 is configured to unify the first monitored object and the second monitored object into the same coordinate system according to the position information of the first monitored object and the position information of the second monitored object.
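A minimal sketch of the position synchronization follows, assuming a known planar calibration (rotation angle theta and translation offset) between the camera frame and the common coordinate system, together with the nearest-position pairing recited in claim 8; the function names and calibration values are placeholders.

import math

def to_common_frame(pos: tuple, theta: float, offset: tuple) -> tuple:
    # Rotate a camera-frame position by theta (radians) and translate it by
    # offset to express it in the common (e.g. radar/road) coordinate system;
    # theta and offset come from an assumed, pre-known calibration.
    x, y = pos
    return (x * math.cos(theta) - y * math.sin(theta) + offset[0],
            x * math.sin(theta) + y * math.cos(theta) + offset[1])

def match_same_object(radar_objects: dict, camera_objects: dict) -> dict:
    # Pair each radar detection with the closest camera detection in the common
    # frame (cf. claim 8). Inputs map object ID -> (x, y); a real implementation
    # would also apply a distance gate and enforce one-to-one matching.
    pairs = {}
    for rid, rpos in radar_objects.items():
        cid = min(camera_objects,
                  key=lambda c: math.dist(rpos, camera_objects[c]),
                  default=None)
        if cid is not None:
            pairs[rid] = cid
    return pairs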
The near-segment fusion module 1006 is configured to obtain the actual motion parameters of a monitored object in the near-segment region based on the data processed by the radar data processing module 1001, and to calculate reliability data of the motion characteristic parameters of that monitored object by combining the actual motion parameters with the corresponding predicted motion parameters.
The middle-segment fusion module 1007 is configured to obtain the actual motion parameters of a monitored object in the middle-segment region based on the median of the data processed by the radar data processing module 1001 and the image data processing module 1002, and to calculate reliability data of the motion characteristic parameters of that monitored object by combining the actual motion parameters with the corresponding predicted motion parameters.
The far-segment fusion module 1008 is configured to obtain the actual motion parameters of a monitored object in the far-segment region based on the data processed by the image data processing module 1002, and to calculate reliability data of the motion characteristic parameters of that monitored object by combining the actual motion parameters with the corresponding predicted motion parameters.
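Taken together, one possible fusion rule uses the radar data alone in the near-segment region, a combination of the two sources in the middle-segment region (the paragraph above mentions the median while claim 7 recites the average; with only two sources the sketch simply uses their mean), and the camera data alone in the far-segment region. The sketch below operates on per-parameter dictionaries and is illustrative only.

def fuse_parameters(radar: dict, camera: dict, zone: str) -> dict:
    # Zone-dependent fusion of one monitored object's motion parameters; the
    # inputs map parameter names (e.g. "speed") to time-synchronized values.
    if zone == "near":    # near segment: use the radar measurement
        return dict(radar)
    if zone == "far":     # far segment: use the camera measurement
        return dict(camera)
    # middle segment: combine both sources (mean of the two values here)
    return {k: (radar[k] + camera[k]) / 2 for k in radar.keys() & camera.keys()}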
As shown in fig. 11, in some examples of the present embodiment, the traffic information management system 11 includes a roadside collection device 111 and an MEC device 112, where the roadside collection device 111 may include, but is not limited to, at least one of a radar device, a camera, an ultrasonic collection device, an infrared collection device, and the like. The MEC device 112 may be the aforementioned network device 90. The traffic information management scheme in the foregoing embodiments may be implemented by the MEC device 112 and the roadside collection device 111 cooperating with each other.
In some examples, as shown in fig. 12, the traffic information management system 11 may further include a V2X platform 113, a roadside unit 114, and an on-board unit 115. The radar device and the camera in the roadside collection device 111, the V2X platform 113, the roadside unit 114, and the MEC device 112 communicate with one another over a local network, and the roadside unit 114 is in communication connection with the on-board unit 115.
According to the traffic information management scheme provided by the present embodiments, the network device can evaluate the reliability of the motion characteristic parameters of a monitored object, so that a traffic information application party, when applying the motion characteristic parameters of the monitored object on a road, can make more reliable decisions based on the reliability data of those parameters, thereby improving the traffic scheduling effect and enhancing traffic safety.
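As recited in claim 9, the reliability data may be as simple as the difference between the actual and the predicted motion parameters; the sketch below computes a per-parameter absolute difference, a smaller value indicating that predictions derived from the motion characteristic parameters are more accurate. The dictionary-based output format is an assumption of the sketch.

def reliability_data(actual: dict, predicted: dict) -> dict:
    # Per-parameter absolute difference between the actual and the predicted
    # motion parameters; a smaller difference means the motion characteristic
    # parameters predict the object's motion more accurately.
    return {k: abs(actual[k] - predicted[k])
            for k in actual.keys() & predicted.keys()}

# Example: a predicted speed of 15.0 m/s against a measured speed of 14.6 m/s
# yields a reliability entry of 0.4 for "speed".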
It will be apparent to those skilled in the art that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software (which may be implemented in computer program code executable by a computing device), firmware, hardware, and suitable combinations thereof. In a hardware implementation, the division between functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the physical components may be implemented as software executed by a processor, such as a central processing unit, digital signal processor, or microprocessor, or as hardware, or as an integrated circuit, such as an application specific integrated circuit.
In addition, communication media typically embody computer-readable instructions, data structures, computer program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and include any information delivery media known to one of ordinary skill in the art. Thus, the present invention is not limited to any specific combination of hardware and software.
The foregoing is a more detailed description of embodiments of the present invention, and the present invention is not to be considered limited to such description. Those skilled in the art to which the invention pertains may make several simple deductions or substitutions without departing from the spirit of the invention, all of which shall be considered to fall within the protection scope of the invention.

Claims (12)

1. A traffic information management method, comprising:
acquiring actual motion parameters and predicted motion parameters of a monitored object on a road; acquiring the actual motion parameter comprises: acquiring a data acquisition result obtained by data acquisition of roadside acquisition equipment aiming at the motion condition of the monitored object, and determining the actual motion parameter of the monitored object according to the data acquisition result; obtaining the predicted motion parameter comprises: extracting the motion characteristic parameters of the monitored object, and predicting to obtain the predicted motion parameters of the monitored object according to the extracted motion characteristic parameters;
and determining and outputting reliability data of the motion characteristic parameters according to the actual motion parameters and the predicted motion parameters of the monitored object, wherein the reliability data represents the accuracy of the predicted motion parameters obtained by predicting according to the motion characteristic parameters.
2. The traffic information management method according to claim 1, wherein after determining the actual motion parameter of the monitored object according to the data acquisition result, further comprising:
and updating the motion characteristic parameters of the monitored object according to the actual motion parameters.
3. The traffic information management method according to claim 2, wherein outputting reliability data of the motion characteristic parameter of the monitored object includes:
and sending the motion characteristic parameters and the reliability data of the monitored object to a traffic information application party, wherein the traffic information application party comprises at least one of a V2X platform of the Internet of vehicles and a road side unit corresponding to the road.
4. The traffic information management method according to claim 1, wherein the roadside acquisition device includes both an echo-type detection device and an image acquisition device, the echo-type detection device being a device that transmits a wave and monitors an object based on the received echo, and the data acquisition result includes an echo acquisition result obtained by the echo-type detection device and an image acquisition result obtained by the image acquisition device; the determining the actual motion parameter of the monitored object according to the data acquisition result comprises:
determining first monitoring data according to the echo acquisition result, and determining second monitoring data according to the image acquisition result, wherein the first monitoring data comprise first motion parameters of all first monitoring objects, the second monitoring data comprise second motion parameters of all second monitoring objects, and the first motion parameters and the second motion parameters correspond to the same moment;
and fusing the first motion parameter and the second motion parameter of the same monitored object to obtain the actual motion parameter of the monitored object.
5. The traffic information management method according to claim 4, wherein the second monitoring data further includes type information of each of the second monitoring objects, and after the first motion parameter and the second motion parameter of the same monitoring object are fused, the method further includes:
and determining the type information of each monitored object according to the second monitoring data.
6. The traffic information management method according to claim 4, wherein the fusing the first motion parameter and the second motion parameter of the same monitored object to obtain the actual motion parameter of the monitored object comprises:
dividing the monitoring range of the roadside acquisition equipment into at least two regions according to the distance from the roadside acquisition equipment, wherein different regions have corresponding fusion strategies;
and for the monitoring object in each region, fusing the first motion parameter and the second motion parameter of the monitoring object according to the fusion strategy corresponding to the region to which the monitoring object belongs.
7. The traffic information management method according to claim 6, wherein the monitoring range is divided into a near zone, a middle zone, and a far zone in order of the degree of distance from the roadside collection device; for the monitoring object in each region, fusing the first motion parameter and the second motion parameter according to the fusion strategy corresponding to the region to which the monitoring object belongs includes:
regarding a monitored object in the near segment area, taking a first motion parameter of the monitored object as an actual motion parameter of the monitored object;
for the monitored object in the middle section area, taking the average value of the first motion parameter and the second motion parameter of the monitored object as the actual motion parameter of the monitored object;
and regarding the monitored object in the remote area, taking the second motion parameter of the monitored object as the actual motion parameter of the monitored object.
8. The traffic information management method according to claim 4, wherein the first motion parameter includes first position information of a first monitored object, the second motion parameter includes second position information of a second monitored object, and before the first motion parameter and the second motion parameter of the same monitored object are fused, the method further includes:
and taking a first monitored object and a second monitored object whose first position information and second position information are closest to each other as the same monitored object.
9. The traffic information management method according to any one of claims 1 to 8, wherein determining reliability data of the motion characteristic parameter from the actual motion parameter and the predicted motion parameter of the monitored object includes:
and calculating a difference value between the actual motion parameter and the predicted motion parameter, and taking the difference value as credibility data of the motion characteristic parameter.
10. A network device comprising a processor, a memory, and a communication bus;
the communication bus is used for realizing connection communication between the processor and the memory;
the processor is configured to execute one or more computer programs stored in the memory to implement the steps of the traffic information management method according to any one of claims 1 to 9.
11. A traffic information management system, comprising a road side collection device and a mobile edge computing MEC device, wherein the road side collection device is in communication connection with the MEC device, and the MEC device is the network device according to claim 10.
12. A storage medium having one or more computer programs stored therein, the one or more computer programs being executable by one or more processors to implement the steps of the traffic information management method according to any one of claims 1 to 9.
CN202110449528.7A 2021-04-25 2021-04-25 Traffic information management method, system, network equipment and storage medium Pending CN115240405A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110449528.7A CN115240405A (en) 2021-04-25 2021-04-25 Traffic information management method, system, network equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110449528.7A CN115240405A (en) 2021-04-25 2021-04-25 Traffic information management method, system, network equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115240405A true CN115240405A (en) 2022-10-25

Family

ID=83666931

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110449528.7A Pending CN115240405A (en) 2021-04-25 2021-04-25 Traffic information management method, system, network equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115240405A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115909728A (en) * 2022-11-02 2023-04-04 智道网联科技(北京)有限公司 Road side sensing method and device, electronic equipment and storage medium
CN116758109A (en) * 2023-06-20 2023-09-15 杭州光线数字科技有限公司 Action appearance state synchronicity monitoring system based on intelligent equipment
CN116758109B (en) * 2023-06-20 2023-11-14 杭州光线数字科技有限公司 Action appearance state synchronicity monitoring system based on intelligent equipment

Similar Documents

Publication Publication Date Title
US11107356B2 (en) Cellular network-based assisted driving method and traffic control unit
US9127956B2 (en) Technique for lane assignment in a vehicle
US9889858B2 (en) Confidence estimation for predictive driver assistance systems based on plausibility rules
CN112712717B (en) Information fusion method, device and equipment
CN104217590B (en) Method for making the electronic controller in main vehicle determine traffic density
US11836985B2 (en) Identifying suspicious entities using autonomous vehicles
Lytrivis et al. An advanced cooperative path prediction algorithm for safety applications in vehicular networks
CN110570674A (en) Vehicle-road cooperative data interaction method and system, electronic equipment and readable storage medium
KR20200019696A (en) Risk handling for vehicles with autonomous driving capabilities
EP3403219A1 (en) Driver behavior monitoring
JP6392735B2 (en) Information processing apparatus, information processing method, vehicle control apparatus, and vehicle control method
CN113853640B (en) electronic control device
CN111785019A (en) Vehicle traffic data generation method and system based on V2X and storage medium
WO2013170882A1 (en) Collaborative vehicle detection of objects with a predictive distribution
CN115240405A (en) Traffic information management method, system, network equipment and storage medium
CN114023077B (en) Traffic monitoring method and device
Adla et al. Automotive collision avoidance methodologies Sensor-based and ITS-based
US11823570B2 (en) Traffic management server, and method and computer program for traffic management using the same
CN117178309A (en) Method for creating a map with collision probability
CN115497323A (en) Vehicle cooperative lane changing method and device based on V2X
CN114170832A (en) Public transport vehicle monitoring method, device, server, system and storage medium
CN111801954A (en) Method for relaying event information in multi-layer V2X system
CN115830860B (en) Traffic accident prediction method and device
CN116798224B (en) Road condition reminding method, device, equipment and storage medium based on vehicle-mounted terminal
CN116959253A (en) Target early warning method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination