CN116304986A - Vehicle event fusion method, device, equipment and readable storage medium - Google Patents


Publication number
CN116304986A
CN116304986A
Authority
CN
China
Prior art keywords
vehicle event
matching degree
vehicle
event
value
Prior art date
Legal status
Pending
Application number
CN202310328662.0A
Other languages
Chinese (zh)
Inventor
王登坤
刘羿
涂康
陈旭
郝东阳
王鸿浩
饶兴
郭子敏
毛君宇
王鹏辉
Current Assignee
Fiberhome Telecommunication Technologies Co Ltd
Original Assignee
Fiberhome Telecommunication Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Fiberhome Telecommunication Technologies Co Ltd
Priority to CN202310328662.0A
Publication of CN116304986A

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention provides a vehicle event fusion method, device and equipment, and a readable storage medium. The method comprises the following steps: acquiring a first vehicle event from a first queue and a second vehicle event from a second queue; obtaining a distance difference value based on the occurrence positions of the first and second vehicle events, a time difference value based on their generation times, and a similarity value based on their event types; calculating a matching degree from the distance difference value, the time difference value and the similarity value; and, if the matching degree is greater than a preset threshold value, performing event fusion on the first vehicle event and the second vehicle event. By calculating the matching degree of two vehicle events, a matching degree greater than the preset threshold value indicates that the two records describe the same vehicle event; fusing them therefore makes the finally obtained vehicle event more complete and accurate.

Description

Vehicle event fusion method, device, equipment and readable storage medium
Technical Field
The present invention relates to the field of data fusion technologies, and in particular, to a vehicle event fusion method, device, equipment and readable storage medium.
Background
With the rapid development of the national economy, the expressway network has reached a considerable scale. Because of the high speeds and large vehicles on expressways, accidents, once they occur, tend to be serious, and incorrect driving behavior is often their main cause. Driving behavior therefore needs to be monitored and identified so that alarms can be raised in time.
At present, driving behavior is monitored and identified mainly from the driving data perceived by sensors, from which corresponding vehicle events are generated. Vehicle events generated by different types of sensors may contain different content, and may differ in accuracy even for the same content. For example, a vehicle event generated by a type A sensor contains content related to the appearance of the vehicle, such as the license plate and vehicle color, while a vehicle event generated by a type B sensor does not; the track determined by the type A sensor is more accurate than that determined by the type B sensor, but the occurrence position and generation time determined by the type B sensor are more accurate than those of the type A sensor.
Therefore, in order to improve the integrity and accuracy of the finally generated vehicle event, a solution for fusing the vehicle events generated by different sensor sources is needed.
Disclosure of Invention
In order to solve the technical problems, the invention provides a vehicle event fusion method, a device, equipment and a readable storage medium.
In a first aspect, the present invention provides a vehicle event fusion method, including:
acquiring a first vehicle event from a first queue and acquiring a second vehicle event from a second queue;
obtaining a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtaining a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtaining a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event;
calculating according to the distance difference value, the time difference value and the similarity value to obtain a matching degree;
and if the matching degree is greater than a preset threshold value, carrying out event fusion on the first vehicle event and the second vehicle event.
Optionally, the step of calculating the matching degree according to the distance difference value, the time difference value and the similarity value includes:
inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
r = α × (1 − t/T) × (1 − m/M)
where r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
and obtaining the matching degree based on the theoretical matching degree.
Optionally, the step of obtaining the matching degree based on the theoretical matching degree includes:
acquiring weather types corresponding to the generation moments of the first vehicle event and the second vehicle event;
and correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
Optionally, the step of correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree includes:
and multiplying the characteristic value corresponding to the weather type by the theoretical matching degree and taking the product as the matching degree, wherein the greater the negative influence of the weather type on driving, the lower the corresponding characteristic value.
Optionally, before the step of acquiring the first vehicle event from the first queue and the second vehicle event from the second queue, the method further includes:
generating first-class vehicle events according to first-class sensing data of a first sensor, and storing the first-class vehicle events into a first queue according to the time sequence of generation;
generating second-class vehicle events according to second-class sensing data of a second sensor, and storing the second-class vehicle events into a second queue according to the time sequence of generation;
wherein the sensing areas of the first sensor and the second sensor are coincident.
In a second aspect, the present invention also provides a vehicle event fusion apparatus, including:
the acquisition module is used for acquiring a first vehicle event from the first queue and acquiring a second vehicle event from the second queue;
the calculation module is used for obtaining a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtaining a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtaining a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event; calculating according to the distance difference value, the time difference value and the similarity value to obtain a matching degree;
and the fusion module is used for carrying out event fusion on the first vehicle event and the second vehicle event if the matching degree is larger than a preset threshold value.
Optionally, the computing module is configured to:
inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
r = α × (1 − t/T) × (1 − m/M)
where r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
and obtaining the matching degree based on the theoretical matching degree.
Optionally, the computing module is configured to:
acquiring weather types corresponding to the generation moments of the first vehicle event and the second vehicle event;
and correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
In a third aspect, the present invention also provides a vehicle event fusion device, the vehicle event fusion device comprising a processor, a memory, and a vehicle event fusion program stored on the memory and executable by the processor, wherein the vehicle event fusion program, when executed by the processor, implements the steps of the vehicle event fusion method as described above.
In a fourth aspect, the present invention also provides a readable storage medium having stored thereon a vehicle event fusion program, wherein the vehicle event fusion program, when executed by a processor, implements the steps of the vehicle event fusion method as described above.
In the invention, a first vehicle event is acquired from a first queue, and a second vehicle event is acquired from a second queue; a distance difference value is obtained based on the occurrence positions of the first and second vehicle events, a time difference value based on their generation times, and a similarity value based on their event types; a matching degree is calculated from the distance difference value, the time difference value and the similarity value; and if the matching degree is greater than a preset threshold value, event fusion is performed on the first vehicle event and the second vehicle event. By calculating the matching degree of two vehicle events, a matching degree greater than the preset threshold value indicates that the two records describe the same vehicle event; fusing them therefore makes the finally obtained vehicle event more complete and accurate.
Drawings
FIG. 1 is a flow chart of an embodiment of a vehicle event fusion method according to the present invention;
FIG. 2 is a flow chart of another embodiment of a vehicle event fusion method according to the present invention;
FIG. 3 is a schematic diagram of a refinement flow chart of step S30 in FIG. 1;
FIG. 4 is a schematic diagram of a refinement flow chart of step S302 in FIG. 3;
FIG. 5 is a functional block diagram of an embodiment of a vehicle event fusion device according to the present invention;
fig. 6 is a schematic hardware structure of a vehicle event fusion device according to an embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In a first aspect, an embodiment of the present invention provides a vehicle event fusion method.
In an embodiment, referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of a vehicle event fusion method according to the present invention. As shown in fig. 1, the vehicle event fusion method includes:
step S10, acquiring a first vehicle event from a first queue and acquiring a second vehicle event from a second queue;
In this embodiment, vehicle events generated from the sensing data of different types of sensors are stored in the first queue and the second queue respectively. For example, vehicle events generated from the sensing data of a grating sensor are stored in the first queue, and vehicle events generated from the sensing data of a camera are stored in the second queue. On this basis, a first vehicle event can be obtained from the first queue and a second vehicle event from the second queue.
Further, in an embodiment, referring to fig. 2, fig. 2 is a flow chart of another embodiment of the vehicle event fusion method according to the present invention. As shown in fig. 2, before step S10, the method further includes:
step S00, generating first-class vehicle events according to first-class sensing data of a first sensor, and storing the first-class vehicle events into a first queue according to the time sequence of generation; generating second-class vehicle events according to second-class sensing data of a second sensor, and storing the second-class vehicle events into a second queue according to the time sequence of generation; wherein the sensing areas of the first sensor and the second sensor are coincident.
In this embodiment, taking the grating sensor as an example of the first sensor, track data is determined from the grating data it collects, first-class vehicle events are generated from the track data, and the events are then stored in the first queue in the order of their generation. Specifically:
According to the instantaneous speed in the track data: if the speed exceeds the highest speed limit for n consecutive reporting periods (for example, one report every 200 ms), an overspeed event is generated; if the speed is below the lowest speed limit for n consecutive reporting periods, a slow event is generated. Here n is a preset value.
According to the instantaneous speed and position in the track data: if the speed is 0 km/h and the position does not move for m consecutive reporting periods, an illegal stop event is generated; if the stopping position is in the emergency lane, an emergency lane occupation event is generated. Here m is a preset value.
According to the vehicle position in the track data, if the vehicle changes across at least two lanes within a preset duration, a continuous lane change event is generated.
The following distance is calculated from the vehicle speed and position in the track data; if it is smaller than the shortest following distance corresponding to the current speed and visibility, a close-following event is generated.
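The speed-based rules above can be sketched as follows. The thresholds, the window length n, and the event names are illustrative assumptions, not values taken from the patent, and this sketch checks only speed (the illegal stop rule in the text additionally requires that the position does not move).

```python
def detect_speed_events(speeds_kmh, v_max=120, v_min=60, n=5):
    """Classify the last n reporting periods (e.g. one report per 200 ms)
    of instantaneous speed into at most one speed-related event."""
    events = []
    if len(speeds_kmh) < n:
        return events
    window = speeds_kmh[-n:]
    if all(v > v_max for v in window):
        events.append("overspeed")      # n consecutive periods above the limit
    elif all(v == 0 for v in window):
        events.append("illegal_stop")   # stopped for n consecutive periods
    elif all(v < v_min for v in window):
        events.append("slow")           # n consecutive periods below the minimum
    return events
```

A detected event would then be stamped with its occurrence position, generation time, and event type before being queued.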
Similarly, taking a license plate recognition camera or a multi-line lidar as an example of the second sensor, second-class vehicle events are generated from the image data or radar sensing data it collects and are then stored in the second queue in the order of their generation. Subsequently, when vehicle events are acquired from the first queue and the second queue, events of the same order can be taken from each queue.
Step S20, obtaining a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtaining a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtaining a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event;
In this embodiment, the distance difference value is the absolute value of the difference between the two occurrence positions, and the time difference value is the absolute value of the difference between the two generation times. The similarity values corresponding to pairs of event types are preset and range from 0 to 1; for example, the similarity value of a slow event and an overspeed event is 0, that of a slow event and a parking event is 0.5, and that of two events of the same type is 1. These values are only illustrative; the similarity values corresponding to event types can be set flexibly according to actual needs.
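A minimal sketch of such a preset similarity table; the event-type names and the 0 / 0.5 / 1 values simply mirror the illustrative numbers above and are not prescribed by the patent.

```python
# Preset pairwise similarity values; pairs are unordered, so use frozenset keys.
SIMILARITY = {
    frozenset({"slow", "overspeed"}): 0.0,
    frozenset({"slow", "illegal_stop"}): 0.5,
}

def similarity_value(type_a, type_b):
    """Look up the preset similarity value for two event types."""
    if type_a == type_b:
        return 1.0  # two events of the same type
    return SIMILARITY.get(frozenset({type_a, type_b}), 0.0)
```

Unlisted pairs default to 0 here, which is itself an assumption; a deployment could instead require every pair to be configured explicitly.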
Step S30, calculating to obtain a matching degree according to the distance difference value, the time difference value and the similarity value;
in this embodiment, the matching degree is obtained by combining a preset quantization method based on the distance difference, the time difference and the similarity of the first vehicle event and the second vehicle event. Wherein, the preset quantization method follows the following rules:
the larger the distance difference value, the larger the time difference value, and the smaller the similarity value, the smaller the matching degree.
Further, in an embodiment, referring to fig. 3, fig. 3 is a schematic diagram of a refinement process of step S30 in fig. 1. As shown in fig. 3, step S30 includes:
step S301, inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
r = α × (1 − t/T) × (1 − m/M)
where r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
in this embodiment, the distance difference, the time difference and the similarity are input into a matching degree calculation formula, so as to calculate the theoretical matching degree. And when the time difference value is larger than the first preset value or the distance difference value is larger than the second preset value, namely, the comparison of the occurrence position difference and the generation time difference is far compared with the first vehicle event or the second vehicle event, so that the probability of the same vehicle event is low, and the theoretical matching degree of the first vehicle event and the second vehicle event is directly determined to be 0. The first preset value and the second preset value are set according to actual needs.
And step S302, obtaining the matching degree based on the theoretical matching degree.
In this embodiment, based on the theoretical matching degree obtained in step S301, the theoretical matching degree may be used directly as the matching degree, or it may be further corrected to obtain the matching degree.
Further, in an embodiment, referring to fig. 4, fig. 4 is a schematic diagram of a refinement process of step S302 in fig. 3. As shown in fig. 4, step S302 includes:
step S3021, obtaining weather types corresponding to the generation moments of the first vehicle event and the second vehicle event;
and step S3022, correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
In this embodiment, considering that weather affects the accuracy of the sensing data collected by the sensor, the theoretical matching degree needs to be corrected according to the weather types corresponding to the generation moments of the first vehicle event and the second vehicle event, specifically, the theoretical matching degree is corrected according to the feature values corresponding to the weather types.
Further, in an embodiment, step S3022 includes:
and multiplying the characteristic value corresponding to the weather type by the theoretical matching degree and taking the product as the matching degree, wherein the greater the negative influence of the weather type on driving, the lower the corresponding characteristic value.
In this embodiment, a characteristic value is set for each weather type according to the magnitude of its negative impact on driving; for example, the characteristic value for a sunny day is 1, for a rainy day 0.6, for a snowy day 0.5, and for a foggy day 0.3. These characteristic values are only illustrative and can be set flexibly according to actual needs.
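The correction step can be sketched with the illustrative characteristic values from the text; defaulting unknown weather types to 1.0 is an assumption of this sketch.

```python
# Illustrative characteristic values from the example above: the worse
# the weather is for driving and perception, the lower the value.
WEATHER_FACTOR = {"sunny": 1.0, "rainy": 0.6, "snowy": 0.5, "foggy": 0.3}

def corrected_matching_degree(theoretical_r, weather):
    """Multiply the theoretical matching degree by the weather type's
    characteristic value to obtain the final matching degree."""
    return WEATHER_FACTOR.get(weather, 1.0) * theoretical_r
```
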
Step S40, if the matching degree is greater than the preset threshold, performing event fusion on the first vehicle event and the second vehicle event.
In this embodiment, the preset threshold is set according to actual needs, and if the matching degree is greater than the preset threshold, it is indicated that the first vehicle event and the second vehicle event are the same vehicle event, so that event fusion can be performed on the first vehicle event and the second vehicle event.
For example, the first vehicle event may contain vehicle driving track data that the second vehicle event lacks, and the second vehicle event may contain vehicle appearance data that the first vehicle event lacks, so the fused vehicle event contains more complete data than either event alone.
For another example, the first vehicle event may carry a more accurate vehicle position than the second, and the second a more accurate vehicle speed than the first; the fused vehicle event can then take its position from the first event and its speed from the second, making its data more accurate.
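A minimal fusion sketch along the lines of the two examples above; the field names and the choice of which event supplies which field are assumptions for illustration only.

```python
def fuse_events(first, second):
    """Union the fields of both events; the first event's fields win on
    conflict (e.g. its more accurate position), except for speed, which
    is taken from the second event per the example above."""
    fused = {**second, **first}           # union; first event takes precedence
    if "speed" in second:
        fused["speed"] = second["speed"]  # more accurate speed from the second
    return fused
```
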
In this embodiment, a first vehicle event is acquired from a first queue, and a second vehicle event is acquired from a second queue; a distance difference value is obtained based on the occurrence positions of the two events, a time difference value based on their generation times, and a similarity value based on their event types; a matching degree is calculated from these three values; and if the matching degree is greater than a preset threshold value, event fusion is performed on the first vehicle event and the second vehicle event. By calculating the matching degree of the two vehicle events, a matching degree greater than the preset threshold value indicates that they are in essence the same vehicle event; the two are therefore fused, making the finally obtained vehicle event more complete and accurate.
In a second aspect, the embodiment of the invention further provides a vehicle event fusion device.
In an embodiment, referring to fig. 5, fig. 5 is a schematic functional block diagram of a vehicle event fusion device according to an embodiment of the invention. As shown in fig. 5, the vehicle event fusion apparatus includes:
an obtaining module 10, configured to obtain a first vehicle event from a first queue and obtain a second vehicle event from a second queue;
a calculation module 20, configured to obtain a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtain a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtain a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event; calculating according to the distance difference value, the time difference value and the similarity value to obtain a matching degree;
and the fusion module 30 is configured to fuse the first vehicle event and the second vehicle event if the matching degree is greater than a preset threshold.
Further, in an embodiment, the computing module is configured to:
inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
r = α × (1 − t/T) × (1 − m/M)
where r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
and obtaining the matching degree based on the theoretical matching degree.
Further, in an embodiment, the computing module 20 is configured to:
acquiring weather types corresponding to the generation moments of the first vehicle event and the second vehicle event;
and correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
Further, in an embodiment, the computing module 20 is configured to:
and multiplying the characteristic value corresponding to the weather type by the theoretical matching degree and taking the product as the matching degree, wherein the greater the negative influence of the weather type on driving, the lower the corresponding characteristic value.
Further, in an embodiment, the vehicle event fusion apparatus further includes a generating module configured to:
generating first-class vehicle events according to first-class sensing data of a first sensor, and storing the first-class vehicle events into a first queue according to the time sequence of generation;
generating second-class vehicle events according to second-class sensing data of a second sensor, and storing the second-class vehicle events into a second queue according to the time sequence of generation;
wherein the sensing areas of the first sensor and the second sensor are coincident.
The function implementation of each module in the vehicle event fusion device corresponds to each step in the vehicle event fusion method embodiment, and the function and implementation process of each module are not described in detail herein.
In a third aspect, an embodiment of the present invention provides a vehicle event fusion apparatus, which may be a personal computer (personal computer, PC), a notebook computer, a server, or the like, having a data processing function.
Referring to fig. 6, fig. 6 is a schematic hardware structure of a vehicle event fusion apparatus according to an embodiment of the present invention. In an embodiment of the present invention, the vehicle event fusion device may include a processor 1001 (e.g., a Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 enables communication among these components; the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); the network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a Wireless-Fidelity, Wi-Fi, interface); the memory 1005 may be a high-speed random access memory (RAM) or a stable non-volatile memory, such as disk storage, and may alternatively be a storage device independent of the processor 1001. Those skilled in the art will appreciate that the hardware configuration shown in fig. 6 is not limiting of the invention and may include more or fewer components than shown, combine certain components, or arrange components differently.
With continued reference to fig. 6, the memory 1005, which is a type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a vehicle event fusion program. The processor 1001 may call the vehicle event fusion program stored in the memory 1005 and execute the vehicle event fusion method provided by the embodiment of the present invention.
In a fourth aspect, embodiments of the present invention also provide a readable storage medium.
The invention stores a vehicle event fusion program on a readable storage medium, wherein the vehicle event fusion program realizes the steps of the vehicle event fusion method when being executed by a processor.
The method implemented when the vehicle event fusion program is executed may refer to various embodiments of the vehicle event fusion method of the present invention, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or alternatively by hardware alone; in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present invention, or the part of it that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and comprising several instructions for causing a terminal device to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (10)

1. A vehicle event fusion method, characterized in that the vehicle event fusion method comprises:
acquiring a first vehicle event from a first queue and acquiring a second vehicle event from a second queue;
obtaining a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtaining a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtaining a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event;
calculating according to the distance difference value, the time difference value and the similarity value to obtain a matching degree;
and if the matching degree is greater than a preset threshold value, carrying out event fusion on the first vehicle event and the second vehicle event.
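The steps of claim 1 can be sketched in Python as follows. This is a minimal illustration only: the patent's actual matching-degree formula appears only as an image in claim 2, so the distance, time, and similarity computations, the preset bounds `T` and `M`, and the threshold below are all assumptions made for demonstration, not the claimed formula.

```python
from dataclasses import dataclass

@dataclass
class VehicleEvent:
    position: tuple   # (x, y) occurrence position in metres (assumed representation)
    timestamp: float  # generation time in seconds
    event_type: str   # e.g. "collision", "sudden_brake" (hypothetical labels)

def distance_diff(a, b):
    # Euclidean distance between the two occurrence positions.
    return ((a.position[0] - b.position[0]) ** 2 +
            (a.position[1] - b.position[1]) ** 2) ** 0.5

def time_diff(a, b):
    return abs(a.timestamp - b.timestamp)

def similarity(a, b):
    # Assumed similarity value: 1.0 for identical event types, 0.0 otherwise.
    return 1.0 if a.event_type == b.event_type else 0.0

def matching_degree(a, b, T=5.0, M=50.0):
    # Assumed combination: similarity scaled down as the time and distance
    # differences approach the presets; 0 once either preset is exceeded,
    # mirroring the zeroing condition stated in claim 2.
    t, m = time_diff(a, b), distance_diff(a, b)
    if t > T or m > M:
        return 0.0
    return similarity(a, b) * (1 - t / T) * (1 - m / M)

def fuse_if_matching(a, b, threshold=0.5):
    # Fuse the two events into one when the matching degree exceeds the threshold.
    if matching_degree(a, b) > threshold:
        return VehicleEvent(a.position, min(a.timestamp, b.timestamp), a.event_type)
    return None
```

With two collision reports 5 m and 1 s apart, the sketch yields a matching degree of 0.72 and fuses them; a report beyond the distance preset scores 0 and is kept separate.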
2. The vehicle event fusion method according to claim 1, wherein the step of calculating a matching degree based on the distance difference value, the time difference value, and the similarity value comprises:
inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
[Matching degree formula shown as image FDA0004154220210000011; not reproduced in this text]
wherein r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
and obtaining the matching degree based on the theoretical matching degree.
3. The vehicle event fusion method according to claim 2, wherein the step of obtaining the matching degree based on the theoretical matching degree includes:
acquiring the weather type corresponding to the generation times of the first vehicle event and the second vehicle event;
and correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
4. The vehicle event fusion method according to claim 3, wherein the step of correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree includes:
multiplying the characteristic value corresponding to the weather type by the theoretical matching degree, and taking the product as the matching degree, wherein the greater the negative influence of the weather type on driving, the lower the corresponding characteristic value.
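The weather correction of claims 3 and 4 can be sketched as a simple lookup and multiplication. The specific weather categories and characteristic values below are hypothetical; the patent only requires that worse driving weather map to a lower characteristic value.

```python
# Hypothetical characteristic values: the worse the weather is for driving,
# the lower the value, so adverse weather lowers the final matching degree.
WEATHER_FEATURE = {
    "sunny": 1.0,
    "cloudy": 0.95,
    "rain": 0.85,
    "snow": 0.75,
    "heavy_fog": 0.6,
}

def corrected_matching_degree(theoretical, weather_type):
    # Claim 4: matching degree = characteristic value x theoretical matching degree.
    return WEATHER_FEATURE[weather_type] * theoretical
```

Under this sketch a theoretical matching degree of 0.8 stays 0.8 in sunny weather but drops to 0.48 in heavy fog, reflecting the lower confidence that two sensor reports describe the same event when perception is degraded.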
5. The vehicle event fusion method of any of claims 1 to 4, further comprising, prior to the step of acquiring a first vehicle event from a first queue and acquiring a second vehicle event from a second queue:
generating first-class vehicle events according to first-class sensing data of a first sensor, and storing the first-class vehicle events into a first queue according to the time sequence of generation;
generating second-class vehicle events according to second-class sensing data of a second sensor, and storing the second-class vehicle events into a second queue according to the time sequence of generation;
wherein the sensing areas of the first sensor and the second sensor are coincident.
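Claim 5's time-ordered queues can be illustrated with a short Python sketch. The dict-based event shape with a `"timestamp"` key is an assumption for illustration; the claim only requires that each sensor's events be stored into its queue in order of generation time.

```python
from collections import deque

def build_event_queue(raw_events):
    # Store one sensor's generated events into a FIFO queue ordered by
    # generation time, as claim 5 describes for the first and second queues.
    q = deque()
    for ev in sorted(raw_events, key=lambda e: e["timestamp"]):
        q.append(ev)
    return q
```

Fusion then proceeds by popping the head of each queue (the oldest unmatched event from each sensor) and computing the matching degree between them.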
6. A vehicle event fusion device, characterized in that the vehicle event fusion device comprises:
the acquisition module is used for acquiring a first vehicle event from the first queue and acquiring a second vehicle event from the second queue;
the calculation module is used for obtaining a distance difference value based on the occurrence position of the first vehicle event and the occurrence position of the second vehicle event, obtaining a time difference value based on the generation time of the first vehicle event and the generation time of the second vehicle event, and obtaining a similarity value based on the event type of the first vehicle event and the event type of the second vehicle event; calculating according to the distance difference value, the time difference value and the similarity value to obtain a matching degree;
and the fusion module is used for carrying out event fusion on the first vehicle event and the second vehicle event if the matching degree is larger than a preset threshold value.
7. The vehicle event fusion device of claim 6, wherein the calculation module is configured to:
inputting the distance difference value, the time difference value and the similarity value into a matching degree calculation formula to obtain a theoretical matching degree, wherein the matching degree calculation formula is as follows:
[Matching degree formula shown as image FDA0004154220210000021; not reproduced in this text]
wherein r is the theoretical matching degree, α is the similarity value, t is the time difference value, T is a first preset value, m is the distance difference value, and M is a second preset value; when t is greater than T or m is greater than M, r takes the value 0;
and obtaining the matching degree based on the theoretical matching degree.
8. The vehicle event fusion device of claim 7, wherein the calculation module is further configured to:
acquiring the weather type corresponding to the generation times of the first vehicle event and the second vehicle event;
and correcting the theoretical matching degree based on the characteristic value corresponding to the weather type to obtain the matching degree.
9. A vehicle event fusion device comprising a processor, a memory, and a vehicle event fusion program stored on the memory and executable by the processor, wherein the vehicle event fusion program, when executed by the processor, implements the steps of the vehicle event fusion method of any of claims 1 to 5.
10. A readable storage medium, wherein a vehicle event fusion program is stored on the readable storage medium, wherein the vehicle event fusion program, when executed by a processor, implements the steps of the vehicle event fusion method according to any one of claims 1 to 5.
CN202310328662.0A 2023-03-27 2023-03-27 Vehicle event fusion method, device, equipment and readable storage medium Pending CN116304986A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310328662.0A CN116304986A (en) 2023-03-27 2023-03-27 Vehicle event fusion method, device, equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN116304986A true CN116304986A (en) 2023-06-23

Family

ID=86790353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310328662.0A Pending CN116304986A (en) 2023-03-27 2023-03-27 Vehicle event fusion method, device, equipment and readable storage medium



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116881877A (en) * 2023-07-11 2023-10-13 安徽泽悦信息科技有限公司 Data security protection method and system based on big data analysis technology
CN116881877B (en) * 2023-07-11 2024-03-22 山西星宇合创信息技术有限公司 Data security protection method and system based on big data analysis technology


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination