CN117795579A - Data fusion method, device, equipment, storage medium and vehicle - Google Patents

Data fusion method, device, equipment, storage medium and vehicle

Publication number: CN117795579A
Application number: CN202180018741.0A
Authority: CN (China)
Prior art keywords: vehicle, information, obstacle, road, list
Legal status: Pending (an assumption by Google, not a legal conclusion)
Other languages: Chinese (zh)
Inventor: 胡伟辰
Current and original assignee: Huawei Technologies Co Ltd (listed assignees may be inaccurate)
Application filed by Huawei Technologies Co Ltd

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Abstract

A data fusion method, apparatus, device, storage medium and vehicle. When the perception state of a vehicle is abnormal, fusion information contributed by other vehicles is used as the effective positioning information of the vehicle. This avoids mispositioning the vehicle, improves its positioning accuracy, allows the vehicle to keep running stably under perception anomalies such as positioning drift, sensor mounting-position offset, and false or missed obstacle detection, and thereby improves driving safety.

Description

Data fusion method, device, equipment, storage medium and vehicle

Technical Field
The application relates to the field of intelligent driving, in particular to a data fusion method, a device, equipment, a storage medium and a vehicle.
Background
Intelligent driving scenes are complex and varied. To improve the accuracy of vehicle perception data, it has become a trend to introduce wireless vehicle communication technologies such as the internet of vehicles (vehicle to everything, V2X). Typically, perception accuracy is improved, or beyond-line-of-sight perception is achieved, by directly fusing various data, including the vehicle's own perception data and data acquired through wireless communication. However, when positioning drifts during driving, when a sensor's mounting position shifts because of vibration or collision, or when the vehicle falsely detects or misses obstacles, the accuracy of the vehicle's perception data remains low, its positioning accuracy degrades, and a serious safety hazard exists.
Disclosure of Invention
To solve these technical problems, the embodiments of this application provide a data fusion method, a device, equipment, a storage medium and a vehicle, which can avoid vehicle positioning errors and improve vehicle positioning accuracy when the vehicle's perception is degraded.
The first aspect of the present application provides a data fusion method, including:
acquiring a fusion obstacle list and a vehicle-end state list of the road section where a first vehicle is located, wherein the vehicle-end state list includes a state value of the first vehicle, the state value is used to determine the perception state of the first vehicle, and when the perception state of the first vehicle is abnormal, the fusion obstacle list contains fusion information of the first vehicle;
when it is determined, according to the state value of the first vehicle, that the perception state of the first vehicle is abnormal, using the fusion information of the first vehicle as the effective positioning information of the first vehicle;
where the fusion information of the first vehicle is obtained by fusing the information of the first vehicle detected by other vehicles in the road section where the first vehicle is located.
In this way, the vehicle can promptly discover perception anomalies such as positioning drift through its state value and, when such an anomaly occurs, use the fusion information contributed by other vehicles in the road section as its effective positioning information. This improves positioning accuracy under conditions such as positioning drift and allows the vehicle to keep running stably under those conditions.
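The patent text is prose only; as an illustrative sketch, the selection between own vehicle information and fusion information might look like the following (the function name, arguments, and the threshold comparison direction are assumptions for illustration, not taken from the claims):

```python
def effective_positioning(state_value, state_threshold, own_info, fused_info):
    """Pick the effective positioning information for the first vehicle.

    The state value counts other vehicles whose detection of this vehicle
    disagrees with its self-report; exceeding the threshold is treated as
    an abnormal perception state.
    """
    if state_value > state_threshold:
        return fused_info  # abnormal: trust the fusion information
    return own_info        # normal: own data is more reliable and timely

print(effective_positioning(3, 2, "own", "fused"))  # prints "fused"
```

A usage note: with a state value at or below the threshold, the same call returns the own vehicle information, matching the "normal perception state" branch described above.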
As a possible implementation manner of the first aspect, the method further includes: when it is determined according to the state value of the first vehicle that the perception state of the first vehicle is normal, using the own vehicle information of the first vehicle as the effective positioning information of the first vehicle.
Thus, when the perception state is normal, the vehicle's own information, which is highly reliable and timely, serves as its effective positioning information, improving both positioning accuracy and data freshness.
As one possible implementation manner of the first aspect, the state value of the first vehicle indicates the number of first other vehicles in the road section where the first vehicle is located, a first other vehicle being one whose detected information of the first vehicle does not match the own vehicle information of the first vehicle. The method further includes: when the state value of the first vehicle is greater than a preset state threshold, determining that the perception state of the first vehicle is abnormal.
Thus, the state of the vehicle can be determined by the information difference between the vehicle and the other vehicle.
As one possible implementation manner of the first aspect, a first other vehicle is an other vehicle that does not meet a second preset condition, the second preset condition being: the distance between the position of the first vehicle detected by that other vehicle and the position in the own vehicle information of the first vehicle is less than or equal to a preset second distance threshold.
Thus, the state of the vehicle can be determined by the difference in positional information between the vehicle and the other vehicle.
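As a concrete illustration of the second preset condition and the resulting state value (a sketch only; the patent does not specify a distance metric or threshold values, so Euclidean distance and the numbers below are assumptions):

```python
import math

def meets_second_condition(detected_pos, reported_pos, second_distance_threshold):
    # Second preset condition: the position of the first vehicle detected by
    # another vehicle agrees with its self-reported position within the threshold.
    return math.dist(detected_pos, reported_pos) <= second_distance_threshold

def state_value(own_pos, detections_by_others, second_distance_threshold=2.0):
    # Count the "first other vehicles": detections failing the condition.
    return sum(1 for p in detections_by_others
               if not meets_second_condition(p, own_pos, second_distance_threshold))

# One detection disagrees badly, two agree closely.
print(state_value((0.0, 0.0), [(0.5, 0.0), (10.0, 0.0), (0.0, 0.3)]))  # prints 1
```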
As a possible implementation manner of the first aspect, the method further includes: acquiring a road-end obstacle list of a road section where a first vehicle is located, wherein the road-end obstacle list comprises own vehicle information of vehicles with normal perception states in the road section where the first vehicle is located; and obtaining effective obstacle information of the first vehicle according to the road-end obstacle list and the information of the obstacle detected by the first vehicle.
Therefore, the vehicle can conveniently combine the information of other vehicles with normal perception states to determine own barriers and the information thereof, and the accuracy and the integrity of the barrier information of the vehicle are improved.
As one possible implementation manner of the first aspect, obtaining the effective obstacle information of the first vehicle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle includes: determining the first obstacles of the first vehicle and the abnormal obstacle proportion of the first vehicle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle; when the abnormal obstacle proportion of the first vehicle is less than or equal to a preset first proportion threshold, determining the information of each first obstacle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle, and adding it to the effective obstacle information of the first vehicle. A first obstacle of the first vehicle is an abnormal obstacle of the first vehicle when its corresponding information in the road-end obstacle list does not match the information of that obstacle detected by the first vehicle.
Therefore, the information of other vehicles with normal perception states is combined conveniently, the self-perception condition of the vehicle is determined, and meanwhile, the accuracy and the integrity of the obstacle information of the vehicle are improved.
As one possible implementation manner of the first aspect, the abnormal obstacle is a first obstacle that does not meet a first preset condition, where the first preset condition includes: the distance between the position of the first obstacle obtained through the road-side obstacle list and the position of the first obstacle detected by the first vehicle is smaller than or equal to a preset first distance threshold.
Therefore, the abnormal obstacle of the vehicle can be accurately positioned through the difference of the obstacle position information between the vehicle and other vehicles with normal perception states.
As one possible implementation manner of the first aspect, obtaining the effective obstacle information of the first vehicle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle further includes: for obstacles in the road-end obstacle list other than the first obstacles, adding the information obtained from the road-end obstacle list to the effective obstacle information of the first vehicle.
Therefore, the information of the beyond-vision obstacle of the vehicle can be conveniently obtained by combining the information of other vehicles with normal perception states, and the accuracy and the integrity of the obstacle information of the vehicle are improved.
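Putting the preceding steps together, one hypothetical sketch of building the effective obstacle information follows. All names, the use of dicts keyed by obstacle id, the choice of the road-end position as the fused value, and the threshold defaults are assumptions for illustration:

```python
import math

def effective_obstacle_info(road_end_list, detected,
                            first_distance_threshold=1.5,
                            first_proportion_threshold=0.3):
    # "First obstacles": ids present both in the road-end list and in the
    # first vehicle's own detection.
    first = set(road_end_list) & set(detected)
    # Abnormal obstacles: the two positions disagree beyond the threshold.
    abnormal = {i for i in first
                if math.dist(road_end_list[i], detected[i]) > first_distance_threshold}
    proportion = len(abnormal) / len(first) if first else 0.0
    effective = {}
    if proportion <= first_proportion_threshold:
        for i in first - abnormal:
            effective[i] = road_end_list[i]  # one simple choice of fused value
    # Beyond-line-of-sight obstacles: present only in the road-end list.
    for i in set(road_end_list) - first:
        effective[i] = road_end_list[i]
    return effective, abnormal, proportion
```

For example, with a road-end list `{'a', 'b', 'c'}` and a detection of only `'a'` (matching) and `'b'` (far off), the proportion is 0.5, so only the beyond-line-of-sight obstacle `'c'` enters the effective information.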
As a possible implementation manner of the first aspect, the method further includes: an error sensor of the first vehicle is detected when the abnormal obstacle proportion of the first vehicle is greater than a first proportion threshold.
Therefore, the information of other vehicles with normal perception states is combined conveniently, when the abnormality of the perception state of the vehicle is confirmed, the error sensor of the vehicle is automatically and timely detected, and the safety risk caused by the deviation of the installation position of the sensor due to long-term running or collision of the vehicle is reduced.
As one possible implementation manner of the first aspect, detecting an error sensor of the first vehicle includes: determining the first obstacles of a first sensor, the abnormal obstacles of the first sensor, and the abnormal obstacle proportion of the first sensor according to the obstacle information perceived by the first sensor of the first vehicle and the road-end obstacle list; and determining that the first sensor is an error sensor when the abnormal obstacle proportion of the first sensor is greater than or equal to a preset second proportion threshold. A first obstacle of the first sensor is an abnormal obstacle of the first sensor when its corresponding information in the road-end obstacle list does not match the information perceived by the first sensor.
Thus, it is possible to confirm whether or not the sensor of the vehicle is faulty in combination with the information of the other vehicle whose perceived state is normal.
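A per-sensor version of the same comparison can be sketched as follows (again hypothetical: the patent does not fix the metric or the thresholds, and the empty-intersection fallback is an added assumption):

```python
import math

def first_sensor_is_error(sensor_obstacles, road_end_list,
                          first_distance_threshold=1.5,
                          second_proportion_threshold=0.5):
    # First obstacles of the sensor: ids it shares with the road-end list.
    first = set(sensor_obstacles) & set(road_end_list)
    if not first:
        return False  # nothing to compare against; assume no verdict
    # Abnormal obstacles of the sensor: positions disagreeing beyond threshold.
    abnormal = sum(1 for i in first
                   if math.dist(sensor_obstacles[i], road_end_list[i])
                   > first_distance_threshold)
    return abnormal / len(first) >= second_proportion_threshold
```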
As a possible implementation manner of the first aspect, the data fusion method further includes: and determining compensation parameters of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-end obstacle list, wherein the compensation parameters of the first sensor are used for compensating the information perceived by the first sensor.
Therefore, the parameter of the error sensor of the vehicle can be timely compensated by combining the information of other vehicles with normal perception states, and the safety risk caused by the deviation of the installation position of the sensor due to long-term running or collision of the vehicle is reduced.
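The patent leaves the form of the compensation parameter open; one simple hypothetical choice is a mean planar offset between road-end and sensor positions over shared obstacles, later added to each raw reading:

```python
def compensation_parameters(sensor_obstacles, road_end_list):
    # Mean (dx, dy) between road-end and sensor positions over shared
    # obstacles; a minimal sketch of a compensation parameter.
    shared = set(sensor_obstacles) & set(road_end_list)
    n = len(shared)
    dx = sum(road_end_list[i][0] - sensor_obstacles[i][0] for i in shared) / n
    dy = sum(road_end_list[i][1] - sensor_obstacles[i][1] for i in shared) / n
    return dx, dy

def compensate(reading, params):
    # Apply the offset to a raw sensor reading.
    return reading[0] + params[0], reading[1] + params[1]
```

If the sensor reports everything shifted by one meter along x, the computed offset cancels that shift exactly; a real mounting-position deviation would also involve rotation, which this sketch ignores.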
As a possible implementation manner of the first aspect, the data fusion method further includes: and transmitting vehicle-end information of the first vehicle to the road-end equipment, wherein the vehicle-end information comprises vehicle information of the first vehicle and information of an abnormal obstacle of the first vehicle.
This makes it convenient for the road-end device to determine the state of each vehicle and the obstacle information in the road section by combining the information reported by the vehicles.
As a possible implementation manner of the first aspect, the data fusion method further includes raising an alarm in one or more of the following cases: it is determined that the perception state of the first vehicle is abnormal; an error sensor of the first vehicle is detected; the detected error sensor of the first vehicle belongs to a predetermined set of necessary sensors.
Therefore, the user can be timely prompted, and meanwhile, the driving safety of the vehicle is improved.
The second aspect of the present application provides a data fusion method, including:
acquiring vehicle-end information of other vehicles in the road section where the first vehicle is located, wherein, when the perception state of the first vehicle is abnormal, the vehicle-end information of the other vehicles includes the information of the first vehicle detected by those other vehicles;
acquiring fusion information of a first vehicle according to information of the first vehicle detected by other vehicles;
and sending a fusion obstacle list to the first vehicle, wherein the fusion obstacle list comprises fusion information of the first vehicle.
Thus, by collecting information from multiple vehicles in the road section, fusion information can be obtained for any vehicle whose perception state is abnormal, so that such a vehicle can still be positioned in real time according to the fusion information and keep running stably.
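The claim does not specify a fusion rule; averaging the positions reported by the other vehicles is one minimal illustrative choice (function name and data shape are assumptions):

```python
def fuse_first_vehicle_info(detections_by_others):
    """Road-end fusion of the first vehicle's position.

    detections_by_others: (x, y) positions of the first vehicle as
    reported by the other vehicles in the road section.
    """
    n = len(detections_by_others)
    x = sum(p[0] for p in detections_by_others) / n
    y = sum(p[1] for p in detections_by_others) / n
    return x, y

print(fuse_first_vehicle_info([(0.0, 0.0), (1.0, 0.0)]))  # prints (0.5, 0.0)
```

A production system would more likely weight detections by sensor confidence or use a filter, but the averaging above captures the idea of combining several independent observations.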
As a possible implementation manner of the second aspect, the data fusion method further includes: acquiring own vehicle information of a first vehicle; updating a state value of the first vehicle according to the information of the first vehicle detected by other vehicles and the vehicle information of the first vehicle, wherein the state value is used for judging the perceived state of the first vehicle; an end of vehicle status list is sent to the first vehicle, the end of vehicle status list including status values of the first vehicle.
Therefore, the state of each vehicle can be determined by collecting the information of a plurality of vehicles in the road section, so that the vehicles can acquire the self-perception state in time, and the vehicle can avoid positioning errors under the condition of abnormal perception state.
As one possible implementation manner of the second aspect, updating the state value of the first vehicle according to the information of the first vehicle detected by the other vehicles and the own vehicle information of the first vehicle includes: determining the number of first other vehicles according to the own vehicle information of the first vehicle and the information of the first vehicle detected by the other vehicles, a first other vehicle being one whose detected information of the first vehicle does not match the own vehicle information of the first vehicle; and updating the state value of the first vehicle according to that number.
As one possible implementation manner of the second aspect, the first other vehicle is another vehicle that does not meet a second preset condition, where the second preset condition includes: the distance between the position of the first vehicle detected by the other vehicles and the position in the own vehicle information of the first vehicle is smaller than or equal to a preset second distance threshold value.
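On the road-end side, the same mismatch count can be maintained for every vehicle in the section at once; a hypothetical sketch (data shapes and threshold are assumptions):

```python
import math

def update_state_values(self_reports, detections, second_distance_threshold=2.0):
    """Road-end update of per-vehicle state values.

    self_reports: vehicle id -> self-reported (x, y)
    detections:   vehicle id -> list of (x, y) positions of that vehicle
                  as detected by other vehicles
    """
    state = {}
    for vid, own in self_reports.items():
        state[vid] = sum(1 for p in detections.get(vid, [])
                         if math.dist(p, own) > second_distance_threshold)
    return state
```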
As one possible implementation manner of the second aspect, the vehicle-end information of the other vehicle further includes own vehicle information of the other vehicle, and the vehicle-end state list further includes state values of the other vehicle; the data fusion method further comprises the following steps: updating a road-end obstacle list according to the own vehicle information and the state value of other vehicles and the own vehicle information and the state value of a first vehicle, wherein the road-end obstacle list comprises the own vehicle information of vehicles with normal perception states in a road section where the first vehicle is located; the list of road side obstacles is sent to the first vehicle and the other vehicles.
Therefore, the information of vehicles with normal perception states in the road section can be provided for each vehicle, so that each vehicle can timely determine the self perception state, the error condition of the sensor and the compensation parameter of the error sensor according to the information.
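Maintaining the road-end obstacle list then reduces to keeping only vehicles whose state value indicates a normal perception state; a minimal sketch under the same assumptions as above:

```python
def update_road_end_obstacle_list(self_reports, state_values, state_threshold=2):
    # Keep only vehicles whose perception state is normal, i.e. whose
    # state value does not exceed the preset threshold; their own vehicle
    # information forms the road-end obstacle list sent back to all vehicles.
    return {vid: info for vid, info in self_reports.items()
            if state_values.get(vid, 0) <= state_threshold}
```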
A third aspect of the present application provides a data fusion apparatus, including:
a first obtaining unit, configured to obtain a fusion obstacle list and a vehicle-end state list of the road section where a first vehicle is located, wherein the vehicle-end state list includes a state value of the first vehicle, the state value is used to determine the perception state of the first vehicle, and when the perception state of the first vehicle is abnormal, the fusion obstacle list contains fusion information of the first vehicle;
a first determining unit configured to use the fusion information of the first vehicle as effective positioning information of the first vehicle when determining that the first vehicle perceived state is abnormal according to the state value of the first vehicle;
the fusion information of the first vehicle is obtained by fusing information of the first vehicle detected by other vehicles in a road section where the first vehicle is located.
As a possible implementation manner of the third aspect, the first determining unit is further configured to use the vehicle information of the first vehicle as the effective positioning information of the first vehicle when it is determined that the perceived state of the first vehicle is normal according to the state value of the first vehicle.
As a possible implementation manner of the third aspect, the first obtaining unit is further configured to obtain a road-side obstacle list of a road section where the first vehicle is located, where the road-side obstacle list includes vehicle information of vehicles with normal perception states in the road section where the first vehicle is located; the data fusion device further includes: and the second determining unit is used for obtaining effective obstacle information of the first vehicle according to the road-end obstacle list and the information of the obstacle detected by the first vehicle.
As a possible implementation manner of the third aspect, the second determining unit is further configured to: for the obstacles other than the first obstacle in the road-side obstacle list, the information of the first obstacle obtained by the road-side obstacle list is added to the effective obstacle information of the first vehicle.
As a possible implementation manner of the third aspect, the data fusion device further includes: a sensor detection unit for detecting an error sensor of the first vehicle; the second determining unit is further configured to notify the sensor detecting unit to perform detection when the abnormal obstacle proportion of the first vehicle is greater than the first proportion threshold.
As a possible implementation manner of the third aspect, the data fusion device further includes: and the compensation parameter determining unit is used for determining the compensation parameter of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-side obstacle list when the sensor detecting unit detects that the first sensor of the first vehicle is an error sensor, wherein the compensation parameter of the first sensor is used for compensating the information perceived by the first sensor.
As a possible implementation manner of the third aspect, the data fusion device further includes: and the first transmitting unit is used for transmitting the vehicle-end information of the first vehicle to the road-end equipment, wherein the vehicle-end information comprises the vehicle-self information of the first vehicle and the information of the abnormal obstacle of the first vehicle.
As a possible implementation manner of the third aspect, the data fusion device further includes: an alarm unit, specifically configured to alarm in one or more of the following cases: determining that a perceived state of the first vehicle is abnormal; an error sensor that detects a first vehicle; the error sensor of the first vehicle belongs to a predetermined necessary sensor.
A fourth aspect of the present application provides a data fusion device, including:
a second acquisition unit, configured to acquire vehicle-end information of other vehicles in the road section where the first vehicle is located, wherein, when the perception state of the first vehicle is abnormal, the vehicle-end information of the other vehicles includes the information of the first vehicle detected by those other vehicles;
the fusion unit is used for obtaining fusion information of the first vehicle according to the information of the first vehicle detected by other vehicles;
and the second sending unit is used for sending a fusion obstacle list to the first vehicle, wherein the fusion obstacle list comprises fusion information of the first vehicle.
As a possible implementation manner of the fourth aspect, the second obtaining unit is further configured to obtain own vehicle information of the first vehicle; the data fusion device further includes: a state updating unit, configured to update a state value of a first vehicle according to information of the first vehicle detected by other vehicles and own vehicle information of the first vehicle, where the state value is used to determine a perceived state of the first vehicle; the second transmitting unit is further configured to transmit a vehicle-end state list to the first vehicle, where the vehicle-end state list includes a state value of the first vehicle.
As one possible implementation manner of the fourth aspect, the vehicle-end information of the other vehicle further includes own vehicle information of the other vehicle, and the vehicle-end state list further includes state values of the other vehicle; the data fusion device may further include: the road side obstacle updating unit is used for updating a road side obstacle list according to the own vehicle information and the state value of other vehicles and the own vehicle information and the state value of the first vehicle, wherein the road side obstacle list comprises the own vehicle information of vehicles with normal perception states in a road section where the first vehicle is located; the second transmitting unit may be further configured to transmit the road-side obstacle list to the first vehicle and the other vehicles.
A fifth aspect of the present application provides a computing device comprising at least one processor and at least one memory storing program instructions that, when executed by the at least one processor, cause the at least one processor to perform the data fusion method of the first aspect and/or the data fusion method of the second aspect.
A sixth aspect of the present application provides a computing device comprising a processor and interface circuitry, the processor accessing a memory through the interface circuitry, the memory storing program instructions that, when executed by the processor, cause the processor to perform the data fusion method of the first aspect and/or the data fusion method of the second aspect.
A seventh aspect of the present application provides a computer readable storage medium having stored thereon program instructions, characterized in that the program instructions, when executed by a computer, cause the computer to perform the data fusion method of the first aspect or the data fusion method of the second aspect.
An eighth aspect of the present application provides a computer program product comprising a computer program which, when executed by a processor, causes the processor to perform the data fusion method of the first aspect or the data fusion method of the second aspect.
A ninth aspect of the present application provides a vehicle comprising the computing device of the fifth aspect or the computing device of the sixth aspect.
In the embodiments of this application, when the perception state of the first vehicle is abnormal, the fusion information obtained by fusing the information of the first vehicle detected by other vehicles replaces the own vehicle information of the first vehicle as its effective positioning information. This improves the reliability and accuracy of subsequent processing, such as planning and decision-making, when the perception state is abnormal, allows the vehicle to keep running stably despite its own positioning error, avoids traffic accidents in scenes such as unmanned driving, and improves driving safety.
These and other aspects of the application will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
The various features of the present application and the connections between them are further described below with reference to the figures. The figures are exemplary; some features are not shown to scale, and some figures may omit features that are conventional in the art and not essential to the application, or may additionally show features that are not essential to it. The combinations of features shown in the figures are not meant to limit the application. Throughout the specification, the same reference numerals refer to the same elements. The specific drawings are as follows:
fig. 1 is a schematic diagram of an exemplary application scenario of an embodiment of the present application.
Fig. 2 is an exemplary structural diagram of a vehicle according to an embodiment of the present application.
Fig. 3 is a flow chart of a data fusion method according to an embodiment of the present application.
FIG. 4 is an exemplary diagram of a scenario of an alert according to an embodiment of the present application.
Fig. 5 is a flow chart of another data fusion method according to an embodiment of the present application.
Fig. 6 is a schematic flowchart of a specific implementation of a data fusion method at a road side device in an embodiment of the present application.
Fig. 7 is a schematic flowchart of a specific implementation of a data fusion method at a vehicle end device side in an embodiment of the present application.
Fig. 8 is a schematic diagram of an interaction flow for implementing data fusion between a road side and a vehicle side in an example of an embodiment of the application.
Fig. 9 is a schematic structural diagram of a data fusion device according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of another implementation of a data fusion device according to an embodiment of the present application.
Fig. 11 is a schematic diagram of the deployment of the data fusion device in the vehicle end equipment according to an embodiment of the present application.
Fig. 12 is a schematic structural diagram of a computing device provided in an embodiment of the present application.
Detailed Description
Concepts and terms involved in the embodiments of the present application will be briefly described first:
perception (acceptable): and (3) detecting the surrounding environment by a sensor or other sensing equipment to obtain the information of the objects in the surrounding environment.
Perception information: information of obstacles in the surroundings of the vehicle and information of the vehicle perceived by a sensor or a set of sensors.
An obstacle: includes objects on the road as well as traffic signs and facilities around the road. Objects on the road include pedestrians, vehicles, animals, etc.; traffic signs and facilities around the road include signal lights, road signs, utility poles, lane lines, guide lines, construction zone signs, traffic police gestures, speed limit signs, etc.
Information of obstacle: including one or more of the type, location (e.g., position in world coordinate system), speed, obstacle size, orientation, time stamp, etc.
Abnormal obstacle: in this embodiment of the present application, if the vehicle-end device side information of an obstacle (for example, the obstacle information detected by the vehicle or the obstacle information perceived by one or a set of sensors) does not match the road-end device side information (for example, the information in the road-end obstacle list), the obstacle may be regarded as an abnormal obstacle.
Information of abnormal obstacle: including one or more of roadside codes, types, locations (e.g., locations under world coordinates), speeds, sizes, orientations, time stamps, etc. of the abnormal obstacles.
Information detected by the vehicle: information obtained by fusing information sensed by each or each set of sensors of the vehicle.
The embodiments of this application can be applied to V2X scenes. For example, a V2X scene may specifically be any of the following: vehicle-to-vehicle communications (vehicle to vehicle, V2V), vehicle-to-pedestrian communications (vehicle to pedestrian, V2P), vehicle-to-network communications (vehicle to network, V2N), vehicle-to-infrastructure communications (vehicle to infrastructure, V2I), and so on. Here, V2X may be long term evolution (long term evolution, LTE) V2X, new radio (NR) V2X, or V2X in other communication systems that may emerge as the technology develops.
The embodiments of this application can be applied to a variety of scenes. They are particularly suitable for scenes where the vehicle's perception state is easily affected, for example where missed detections occur readily or where the positions of the vehicle's sensors shift easily. For instance, they suit scenes such as smart construction sites and smart mines, where heavy and prolonged driving vibration tends to shift sensor mounting positions, and dusty scenes where obstacles are easily missed. Of course, the embodiments can also be applied to other scenes, such as vehicles traveling on ordinary roads.
The "vehicle" in the embodiments of the present application may be of any applicable type. For example, the "vehicle" herein may be, but is not limited to, a private car, a commercial car, a bus, a passenger car, a high-speed rail, a subway, an unmanned car, an unmanned aerial vehicle, a logistics transport vehicle, an unmanned transport vehicle, a ship, an aircraft, etc., and the power type of the "vehicle" may be fuel-driven, purely electric, hydrogen-fuel cell-driven, hybrid, etc. Further, the "vehicle" herein may be a human-driven vehicle, an autonomous vehicle, an unmanned vehicle, or other type of vehicle.
Fig. 1 shows an exemplary architecture diagram of a system to which embodiments of the present application are applied. Referring to fig. 1, a system applicable to an embodiment of the present application may include: the vehicle-end device 110 and the road-end device 120, wherein the vehicle-end device 110 is installed on a vehicle, the road-end device 120 is arranged beside a road, and communication can be realized between different vehicle-end devices 110 and between the vehicle-end device 110 and the road-end device 120.
In the example of fig. 1, the vehicles 1, 2, 3, 4, 5 are respectively loaded with the vehicle-end devices 110, and after the vehicles 1, 2, 3, 4, 5 enter the road segment H1 governed by the road-end device 120, the vehicles 1, 2, 3, 4, 5 can communicate with the road-end device 120 through the vehicle-end devices 110. With appropriate vehicle spacing, the vehicle end devices 110 of the vehicles 1, 2, 3, 4, 5 can communicate with each other.
For example, the vehicle-end device 110 may be an in-vehicle module, an in-vehicle component, an in-vehicle chip, or an in-vehicle unit built into a vehicle as one or more components or units. Communication can be performed between different vehicle-end devices 110 and between the vehicle-end devices 110 and the road-end device 120 through V2X.
Illustratively, the vehicle has a sensor system, which is a system comprising at least one sensor or sensing device, by means of which information about obstacles in the surroundings of the vehicle can be obtained.
Illustratively, the sensors or sensing devices may include, but are not limited to, lidar, cameras, millimeter wave radar, global positioning system (global positioning system, GPS), inertial measurement units (Inertial Measuring Unit, IMU), and the like. The sensor or sensing device may communicate with the vehicle-side device 110 via ethernet, bluetooth (BT), wireless fidelity (wireless fidelity, wiFi) network, cellular network, controller area network (Controller Area Network, CAN) bus, local interconnect network (local interconnect network, LIN) bus, etc. FIG. 2 illustrates an exemplary configuration of a vehicle having a sensor system.
For simplicity and clarity of description, a vehicle is used below in place of the end-of-vehicle device 110 in a vehicle. That is, the actions, methods or processes performed by the vehicle and the methods or processes on the vehicle side in the embodiments of the present application refer to the actions, methods or processes performed by the vehicle end device 110 in the vehicle.
Fig. 3 shows a flowchart of a data fusion method 300 according to an embodiment of the present application. Referring to fig. 3, the data fusion method 300 at the vehicle end side may include the following steps:
step S310, a fusion obstacle list and a vehicle end state list of a road section where a first vehicle is located are obtained;
here, the vehicle-end state list includes a state value of the first vehicle for determining a perceived state of the first vehicle.
Here, when the first vehicle perceived state is abnormal, the fusion obstacle list contains fusion information of the first vehicle; when the first vehicle perception state is normal, the fusion obstacle list may not include fusion information of the first vehicle.
Here, the fusion information of the first vehicle is obtained by fusing the information of the first vehicle detected by other vehicles in the road section where the first vehicle is located.
In some embodiments, the fused information of the first vehicle may include one or more of fused position, fused orientation, speed, vehicle size, time stamp, and the like. In some implementations, an unscented (lossless) Kalman filter or another similar algorithm may be employed to fuse the positions and orientations of the first vehicle detected by the other vehicles, so as to obtain the fused position and fused orientation. Information such as the vehicle size and the time stamp in the fusion information may come from the own-vehicle information of the first vehicle.
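As an illustration of this fusion step: the text names a lossless (unscented) Kalman filter, and the minimal sketch below substitutes a simpler weight-averaged fusion of the per-vehicle reports; all function and parameter names are hypothetical, not from the patent.

```python
import math

def fuse_positions(observations):
    """Fuse (x, y, heading_rad, weight) estimates of the first vehicle,
    as reported by other vehicles, into one fused position and orientation.

    A weighted mean stands in for the unscented Kalman filter named in the
    text; headings are averaged on the unit circle to handle wrap-around.
    """
    wsum = sum(w for _, _, _, w in observations)
    fx = sum(x * w for x, _, _, w in observations) / wsum
    fy = sum(y * w for _, y, _, w in observations) / wsum
    # Average headings via sin/cos components to avoid the 359-deg/1-deg trap.
    s = sum(math.sin(h) * w for _, _, h, w in observations) / wsum
    c = sum(math.cos(h) * w for _, _, h, w in observations) / wsum
    return fx, fy, math.atan2(s, c)
```

The vehicle size and time stamp would then be copied from the first vehicle's own information rather than fused.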
In order to facilitate distinguishing different vehicles, the fusion information of the first vehicle can further comprise a road end code of the first vehicle, wherein the road end code is distributed to the first vehicle by the road end device after the first vehicle establishes communication with the road end device.
In particular, the road-end code may be used to distinguish different vehicles. After communication is established with a certain vehicle, the road-end device 120 (or the vehicle-end device 110 of the own vehicle) may assign a unique road-end code to that vehicle, so as to identify the vehicle through the road-end code and distinguish the relevant information of different vehicles.
In some embodiments, the roadside code may be empty or a default initial value (e.g., 0) when the vehicle first communicates with the roadside device 120. After communication is established, that is, after the road-end device 120 allocates the road-end code to the vehicle, the value of the road-end code is the value of the road-end code allocated by the road-end device 120 to the vehicle.
The road-end code may be custom-defined. For example, the road-end code may contain a unique identifier of the road-end device and a unique identifier of the vehicle, so that the road-end code can distinguish not only different vehicles but also different road-end devices. For example, the road-end code may include the number of the road-end device and the number assigned to the specific vehicle by the road-end device; the number of the road-end device may take a value of 1 to K (K is a preset value, an integer not less than 1), and the number of the vehicle may take a value of 1 to M (M is a preset value, an integer not less than 1). The road-end code may then be denoted as "AB", where A represents the number of the road-end device, taking a value of 1 to K, and B represents the number of the vehicle, taking a value of 1 to M.
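The "AB" composition above can be sketched as follows; zero-padded fixed field widths are an assumption added here so the two fields can be split again unambiguously (the patent leaves the exact format open):

```python
def encode_road_end_code(device_no, vehicle_no, k_digits=2, m_digits=4):
    """Compose a road-end code 'AB' from the road-end device number A and
    the per-vehicle number B, each zero-padded to a fixed width."""
    return f"{device_no:0{k_digits}d}{vehicle_no:0{m_digits}d}"

def decode_road_end_code(code, k_digits=2):
    """Split a road-end code back into (device number, vehicle number)."""
    return int(code[:k_digits]), int(code[k_digits:])
```

For example, device 3 and vehicle 27 would yield the code "030027", which decodes back to the same pair.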
It should be noted that, the specific content, coding rule, format, etc. of the road-side coding are not limited in the embodiments of the present application.
In step S320, when it is determined that the first vehicle perceives the abnormal state according to the state value of the first vehicle, the fusion information of the first vehicle is used as the effective positioning information of the first vehicle.
In the embodiment of the application, the effective positioning information can be used for subsequent processing such as planning decision-making.
In general, the own-vehicle information of a vehicle is used as the effective positioning information of the vehicle. If the sensing state of the vehicle is abnormal, the positioning of the vehicle is wrong and the own-vehicle information loses accuracy. If that own-vehicle information is still used for subsequent processing such as planning and decision-making, the subsequent processing will inevitably go wrong, the driving function of the vehicle will be affected, the vehicle may even be unable to travel stably, and if unmanned driving is involved, a traffic accident is likely to be caused. In the embodiment of the application, when the sensing state of the first vehicle is abnormal, the fusion information replaces the own-vehicle information of the first vehicle as the effective positioning information of the vehicle. Because the fusion information is obtained by fusing information detected by other vehicles, its reliability and accuracy are higher, so the reliability and accuracy of subsequent processing such as planning and decision-making can be improved, the vehicle can travel stably even when its own positioning is wrong, traffic accidents in scenes such as unmanned driving can be avoided, and the driving safety of the vehicle is improved.
In general, if the information detected by one vehicle deviates greatly from the information detected by a plurality of other vehicles, the reliability of the information detected by that vehicle is insufficient, and its perception is considered abnormal. In the embodiment of the present application, an abnormal perceived state may be caused by one or more of the following conditions: 1) the information detected by the vehicle deviates greatly from the information detected by other vehicles because the sensor position has shifted; 2) the information detected by the vehicle deviates greatly from the information detected by other vehicles because the positioning drifts while the vehicle is running; 3) the information detected by the vehicle deviates greatly from the information detected by other vehicles because of information errors (for example, intentionally sent error information or a tampered message transmission process); 4) the limited sensing range of the vehicle's sensors makes the information detected by the vehicle incomplete.
In this embodiment of the present application, if the information detected by one vehicle deviates greatly only from the information detected by a small number of other vehicles, rather than from that detected by many, the information detected by the vehicle is still relatively reliable, and the perceived state of the vehicle can be considered normal. For example, although the sensor position of a vehicle may have shifted, its perceived information can be corrected through compensation, so the vehicle may still be considered to be in a normal perceived state.
In some embodiments, the data fusion method 300 of the embodiments of the present application may further include: in step S330, when it is determined that the perceived state of the first vehicle is normal according to the state value of the first vehicle, the vehicle information of the first vehicle is used as the effective positioning information of the first vehicle.
Typically, a vehicle has three trusted domains, from high to low confidence: a first-level trusted domain, a second-level trusted domain, and a third-level trusted domain. The own vehicle is the first-level trusted domain, the road-end device is the second-level trusted domain, and other vehicles are the third-level trusted domain; that is, the information of the own vehicle is the most reliable, the information of the road-end device is next, and the reliability of information provided by other vehicles is weaker than that of the road-end device. Therefore, when the perceived state of the vehicle is normal, it is preferable to use the own-vehicle information of the vehicle as the effective positioning information for subsequent processing such as planning and decision-making, which helps improve the reliability, accuracy and execution efficiency of that processing.
In the embodiment of the application, the state value may be used to determine the perceived state of the vehicle. It is to be understood that the state value may be set to any information that can be used to determine the perceived state of the vehicle, not limited to the number of first other vehicles described below. The embodiments of the present application are not limited to specific definitions, implementation forms, and the like of state values.
In some embodiments, the state value of the first vehicle may be used to indicate the number of first other vehicles in the road section where the first vehicle is located, a first other vehicle being one whose detected information about the first vehicle does not match the own-vehicle information of the first vehicle. In other words, if the state value is greater than a preset state threshold, the information detected by the first vehicle deviates greatly from the information detected by a plurality of other vehicles, and the perceived state of the vehicle is abnormal. Thus, the perceived state of the first vehicle can be determined efficiently and accurately through the state value.
In some embodiments, for example, in step S320 or before step S320, the data fusion method 300 of the embodiments of the present application may further include: determining whether the perceived state of the first vehicle is abnormal according to the state value of the first vehicle and a preset state threshold. Specifically, when the state value of the first vehicle is greater than the preset state threshold, the perceived state of the first vehicle is determined to be abnormal; when the state value of the first vehicle is less than or equal to the preset state threshold, the perceived state of the first vehicle is determined to be normal.
In a specific application, a preset state threshold (i.e., the state threshold x below) may be preset according to a scene or other requirements. For example, the preset state threshold may be set to an integer greater than 1. For example, the preset state threshold may be set to 2, 3, or 4. In addition, the preset state threshold value can be dynamically adjusted according to the road condition of the road section. The specific value, adjustment mode, setting mode and the like of the preset state threshold are not limited.
Taking the vehicle 2 in the scenario shown in fig. 1 as an example, it is determined whether the state value of the vehicle 2 is greater than the preset state threshold x. If the state value of the vehicle 2 is greater than the state threshold x, the information of the vehicle 2 detected by more than x other vehicles (for example, the vehicle 1, the vehicle 3 and the vehicle 4) deviates greatly from the own-vehicle information of the vehicle 2 (for example, the second preset condition below is not met), which indicates that the perceived state of the vehicle 2 is abnormal, for example, because of drift, intentionally sent error information, or a tampered message transmission process. If the state value of the vehicle 2 is less than or equal to the preset state threshold x, no more than x other vehicles have detected information that deviates greatly from the own-vehicle information of the vehicle 2, and the perceived state of the vehicle 2 can be regarded as normal.
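The threshold comparison above amounts to a one-line decision; the sketch below is a hypothetical illustration (names are not from the patent):

```python
def perception_state(state_value, threshold_x):
    """Steps S320/S330: the perceived state is abnormal when more than
    threshold_x other vehicles report information about the first vehicle
    that deviates strongly from its own-vehicle information, and normal
    otherwise."""
    return "abnormal" if state_value > threshold_x else "normal"
```

When the result is "abnormal", the fusion information is taken as the effective positioning information; when "normal", the own-vehicle information is used.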
In some embodiments, the first other vehicle may be another vehicle that does not meet the second preset condition. In other words, when the other vehicle does not meet the second preset condition, the information of the first vehicle detected by the other vehicle does not match the own vehicle information of the first vehicle, and the other vehicle belongs to the first other vehicle. When the other vehicles meet the second preset condition, the information of the first vehicle detected by the other vehicles is matched with the own vehicle information of the first vehicle, and the other vehicles do not belong to the first other vehicles.
In some embodiments, the second preset condition may include one or both of: 1) the distance between the position of the first vehicle detected by the other vehicle and the position in the own-vehicle information of the first vehicle is less than or equal to a preset second distance threshold; 2) the deviation between the orientation of the first vehicle detected by the other vehicle and the orientation in the own-vehicle information of the first vehicle is less than or equal to a preset second orientation deviation threshold. In particular applications, the second distance threshold, the second orientation deviation threshold, and the like may take empirically tested values. In addition, the second preset condition can be flexibly set as required. The embodiment of the application does not limit the specific content or rules of the second preset condition, or the setting and specific values of its thresholds.
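A minimal sketch of checking this second preset condition follows; the threshold values are application-tuned and every name here is hypothetical:

```python
import math

def matches_own_info(det_pos, det_heading, own_pos, own_heading,
                     dist_threshold, heading_threshold):
    """Second preset condition: another vehicle's detection of the first
    vehicle matches the first vehicle's own information when both the
    position distance and the heading deviation stay within thresholds."""
    dist = math.hypot(det_pos[0] - own_pos[0], det_pos[1] - own_pos[1])
    ddiff = abs(det_heading - own_heading) % (2 * math.pi)
    ddiff = min(ddiff, 2 * math.pi - ddiff)   # wrap deviation into [0, pi]
    return dist <= dist_threshold and ddiff <= heading_threshold
```

A vehicle whose detection fails this check is counted as a "first other vehicle" and contributes to the state value of the first vehicle.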
In some embodiments, the data fusion method 300 provided in the embodiments of the present application may further include:
step S340, obtaining a road-end obstacle list of a road section where the first vehicle is located, wherein the road-end obstacle list comprises own vehicle information of vehicles with normal perception states in the road section where the first vehicle is located;
in step S350, effective obstacle information of the first vehicle is obtained according to the road-side obstacle list and information of the obstacle detected by the first vehicle, and the effective obstacle information can be used for performing subsequent processing such as planning decision.
In this embodiment, the effective obstacle information of the vehicle is obtained by combining the obstacle information of the vehicle end with the road-side obstacle list, so the accuracy and completeness of the obstacle data can be improved under various conditions, such as an abnormal perceived state of the first vehicle, a limited sensing range of the first vehicle, or normal perception of the first vehicle. This improves the reliability and accuracy of subsequent processing such as planning and decision-making, and in turn allows the vehicle to run stably and safely on the road.
When the vehicle's own positioning is wrong or the sensor mounting position has shifted, the abnormal obstacle proportion of the vehicle is generally high, that is, it may exceed a certain threshold. Thus, if the abnormal obstacle proportion of the first vehicle is higher than a first proportion threshold, it may be considered that the positioning of the first vehicle is wrong or its sensor mounting position has shifted, and its perception data is unavailable. If the abnormal obstacle proportion of the first vehicle is not higher than the first proportion threshold, the positioning function of the first vehicle is considered error-free, and its perception data is still available.
In some embodiments, step S350 may specifically include the following steps a1 and a2:
step a1, determining a first obstacle of a first vehicle and an abnormal obstacle proportion of the first vehicle according to information of a road-side obstacle list and the obstacle detected by the first vehicle;
In some embodiments, an algorithm such as Hungarian matching may be used to match the time-compensated road-side obstacle list against the information of the obstacles detected by the first vehicle, so as to determine the first obstacles. Here, a first obstacle of the first vehicle is an obstacle that is included in the road-side obstacle list and can also be detected by the first vehicle.
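The association step can be sketched as follows; a greedy nearest-neighbour pass stands in here for the Hungarian matching named in the text, and the data layout (tuples of id and 2-D position) is an assumption:

```python
import math

def match_obstacles(road_list, detected, max_dist):
    """Associate obstacles in the (time-compensated) road-side obstacle
    list with obstacles detected by the first vehicle. Each obstacle is
    (id, x, y); returns (road_id, detected_id) pairs, i.e. the 'first
    obstacles' of the first vehicle."""
    pairs = []
    used = set()
    for rid, rx, ry in road_list:
        best, best_d = None, max_dist
        for did, dx, dy in detected:
            if did in used:
                continue
            d = math.hypot(rx - dx, ry - dy)
            if d <= best_d:
                best, best_d = did, d
        if best is not None:
            pairs.append((rid, best))
            used.add(best)
    return pairs
```

A full Hungarian (optimal assignment) solver would replace the greedy inner loop when globally optimal pairing matters; the surrounding bookkeeping stays the same.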
In some embodiments, the first obstacle of the first vehicle may be considered an abnormal obstacle of the first vehicle when the corresponding information of the first obstacle in the roadside obstacle list does not match the information of the first obstacle detected by the first vehicle. When the corresponding information of the first obstacle in the road-side obstacle list matches the information of the first obstacle detected by the first vehicle, the first obstacle of the first vehicle does not belong to an abnormal obstacle of the first vehicle. Here, the abnormal obstacle of the first vehicle is another vehicle whose perceived state determined by the first vehicle is abnormal.
In some embodiments, the abnormal obstacle of the first vehicle may be a first obstacle of the first vehicle that does not satisfy the first preset condition. In other words, when the first obstacle does not satisfy the first preset condition, the corresponding information of the first obstacle in the road-side obstacle list does not match the information of the first obstacle detected by the first vehicle, which is an abnormal obstacle of the first vehicle. When the first obstacle meets a first preset condition, the corresponding information of the first obstacle in the road-end obstacle list is matched with the information of the first obstacle detected by the first vehicle, and the first obstacle does not belong to an abnormal obstacle of the first vehicle.
In some embodiments, the first preset condition may include, but is not limited to, one or both of: 1) The distance between the position of the first obstacle obtained through the road-end obstacle list and the position of the first obstacle detected by the first vehicle is smaller than or equal to a preset first distance threshold value; 2) The offset amount between the orientation of the first obstacle obtained through the road-side obstacle list and the orientation of the first obstacle detected by the first vehicle is smaller than or equal to a preset first orientation offset threshold. It should be noted that the first preset condition may be flexibly set as required. The embodiment of the present application is not limited to specific content, specific rules, and the like of the first preset condition.
Here, the position of the first obstacle obtained through the road-side obstacle list may be the position obtained after time compensation is performed on the position of the first obstacle in the road-side obstacle list, so that its time information (for example, a time stamp) is the same as that of the position of the first obstacle detected by the first vehicle.
Likewise, the orientation of the first obstacle obtained through the road-side obstacle list may be the orientation obtained after time compensation is performed on the orientation of the first obstacle in the road-side obstacle list, so that its time information (for example, a time stamp) is the same as that of the orientation of the first obstacle detected by the first vehicle.
Here, the abnormal obstacle ratio of the first vehicle may be a ratio of the number of abnormal obstacles of the first vehicle to the number of first obstacles of the first vehicle.
And a step a2 of determining information of the first obstacle according to the road-side obstacle list and the information of the obstacle detected by the first vehicle when the abnormal obstacle proportion of the first vehicle is smaller than or equal to a preset first proportion threshold value, and adding the information of the first obstacle to the effective obstacle information of the first vehicle.
Here, the first proportional threshold may be preconfigured, and the value thereof may be freely set according to the scene requirement. For example, the first proportional threshold may be a value between 10% and 20%, e.g., the first proportional threshold may be set to 10%, 12%, 15%, 17%, 20%, etc.
In some embodiments, in step a2, an abnormal obstacle proportion of the first vehicle that is less than or equal to the preset first proportion threshold indicates that the number of abnormal obstacles of the first vehicle is relatively small and the perceived state of the first vehicle is normal; in this case, both the "information of the first obstacle obtained through the road-side obstacle list" and the "information of the first obstacle detected by the first vehicle" are reliable. In step a2, determining the information of the first obstacle according to the road-side obstacle list and the information of the obstacle detected by the first vehicle may therefore include: selecting either of the two as the information of the first obstacle, or fusing the two to obtain the information of the first obstacle. Thus, the accuracy of the vehicle obstacle data can be further improved.
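Steps a1 and a2 reduce to a ratio test; the sketch below is a hypothetical illustration using a 15% threshold, which sits inside the 10%-20% range given in the text:

```python
def perception_data_usable(n_abnormal, n_first, ratio_threshold=0.15):
    """Steps a1/a2: the abnormal obstacle proportion is the number of
    abnormal obstacles over the number of first obstacles; the vehicle's
    perception data stays usable while that proportion does not exceed
    the first proportion threshold."""
    if n_first == 0:
        # Assumption: with no first obstacles there is nothing to compare
        # against, so treat the perception data as usable.
        return True
    return (n_abnormal / n_first) <= ratio_threshold
```

When the function returns True, the first-obstacle information is added to the effective obstacle information; when False, step S360 (error-sensor detection) applies instead.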
In some embodiments, step S350 may further include: step a3, for the obstacles other than the first obstacles in the road-side obstacle list, adding the information of those obstacles obtained through the road-side obstacle list to the effective obstacle information of the first vehicle. The obstacles in the road-side obstacle list other than the first obstacles cannot be detected by the first vehicle, yet they are located in the road section where the first vehicle is; in other words, they are beyond-line-of-sight obstacles of the first vehicle. Adding their information to the effective obstacle information of the first vehicle improves the completeness of the vehicle obstacle data and makes up for the limited sensing range of the vehicle.
Here, the "information of the first obstacle obtained through the road-side obstacle list" may be information obtained by performing time compensation on the information of the first obstacle in the road-side obstacle list, so that its time information (for example, a time stamp) is the same as that of the information of the first obstacle detected by the first vehicle.
In some embodiments, the data fusion method 300 provided in the embodiments of the present application may further include: step S360, detecting an erroneous sensor of the first vehicle when the abnormal obstacle proportion of the first vehicle is greater than the first proportion threshold. In general, a high abnormal obstacle proportion indicates that the information detected by the vehicle does not match the information detected by a plurality of other vehicles, that is, the vehicle may have a perception anomaly, and most such anomalies are cases of sensor position shift, sensor malfunction, and the like.
Here, the first sensor is any one or a group of sensors of the first vehicle.
In some embodiments, step S360 may specifically include: step b1, determining a first obstacle of a first sensor, an abnormal obstacle of the first sensor and an abnormal obstacle proportion of the first sensor according to obstacle information perceived by the first sensor loaded on the first vehicle and a road-side obstacle list; and b2, determining that the first sensor is in error when the abnormal obstacle proportion of the first sensor is greater than a preset second proportion threshold value. And b3, determining that the first sensor is normal when the abnormal obstacle proportion of the first sensor is smaller than or equal to a preset second proportion threshold value.
Here, the first obstacle of the first sensor is an obstacle that is included in the road-side obstacle list while being perceived by the first sensor. When the corresponding information of the first obstacle of the first sensor in the road-end obstacle list is not matched with the information of the first obstacle perceived by the first sensor, the first obstacle of the first sensor is an abnormal obstacle of the first sensor. When the corresponding information of the first obstacle of the first sensor in the road-end obstacle list is matched with the information of the first obstacle perceived by the first sensor, the first obstacle of the first sensor does not belong to the abnormal obstacle of the first sensor.
In some embodiments, if the first obstacle of the first sensor does not meet the first predetermined condition, the first obstacle of the first sensor may be regarded as an abnormal obstacle of the first sensor. If the first obstacle of the first sensor meets the first preset condition, the first obstacle of the first sensor is not an abnormal obstacle of the first sensor. In other words, when the first obstacle of the first sensor does not meet the first preset condition, the corresponding information of the first obstacle of the first sensor in the road-side obstacle list is not matched with the information of the first obstacle perceived by the first sensor. When the first obstacle of the first sensor meets a first preset condition, the corresponding information of the first obstacle of the first sensor in the road-end obstacle list is matched with the information of the first obstacle perceived by the first sensor.
In some embodiments, the obstacle information perceived by the first sensor includes content such as a position of the obstacle perceived by the first sensor, a speed of the obstacle perceived by the first sensor, a size of the obstacle, an orientation of the obstacle perceived by the first sensor, a time stamp, and the like. Here, the time stamp indicates the time at which the information of the obstacle is obtained. Here, the position and orientation of the obstacle perceived by the first sensor are derived from the data acquired by the first sensor. For example, when the first sensor is a camera, the position and orientation of the obstacle perceived by the first sensor may be obtained from the image acquired by the first sensor.
In some embodiments, the data fusion method 300 provided in the embodiments of the present application may further include: determining a compensation parameter of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-side obstacle list. The compensation parameter of the first sensor is used for compensating the information perceived by the first sensor, so that errors in the perceived information caused by factors such as a position shift of the first sensor can be compensated.
Taking a camera as an example, the extrinsic parameters of the camera are directly related to its mounting position, so if the camera is in error, the compensation extrinsic parameters of the camera need to be determined. The extrinsic parameters of the camera may include a rotation matrix and a translation matrix; that is, when the camera is in error, the compensation rotation matrix and the compensation translation matrix of the camera need to be determined first.
Assume that the set of first obstacle positions in the road-side obstacle list of the road-end device 120 is X = {x_1, x_2, x_3, ..., x_m}, where x_i (i = 1, 2, 3, ..., m) represents the position coordinates of the i-th first obstacle in the road-side obstacle list, and the set of first obstacle positions perceived by a certain sensor or group of sensors is P = {p_1, p_2, p_3, ..., p_m}, where p_i (i = 1, 2, 3, ..., m) represents the position coordinates of the i-th first obstacle perceived by that sensor or group of sensors; m is the number of first obstacles, an integer greater than or equal to 2.
When the position coordinate of the i-th first obstacle perceived by a certain sensor or group of sensors is p_i, the compensated position coordinate p'_i of the i-th first obstacle can be obtained by the following formula (1):

p'_i = R·p_i + t    (1)

where i = 1, 2, 3, ..., m; R represents the rotation matrix from P to X, i.e. the compensation rotation matrix of the camera, and t represents the translation matrix from P to X, i.e. the compensation translation matrix of the camera.
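Applying formula (1) in the common two-dimensional (ground-plane) case looks as follows; the reduction to a single rotation angle and the function names are illustrative assumptions:

```python
import math

def compensate_position(p, theta, t):
    """Apply formula (1), p' = R*p + t, in 2-D: R is the rotation by angle
    theta and t the translation, moving a sensor-perceived obstacle
    position toward the road-end reference frame."""
    x, y = p
    xr = math.cos(theta) * x - math.sin(theta) * y + t[0]
    yr = math.sin(theta) * x + math.cos(theta) * y + t[1]
    return xr, yr
```

For example, rotating the perceived point (1, 0) by 90 degrees with zero translation yields (0, 1).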
In some implementations, the compensation rotation matrix R and the compensation translation matrix t may be determined by singular value decomposition. Specifically, the compensation rotation matrix R and the compensation translation matrix t can be obtained by the following formulas (2) to (9):

R = U·V^T    (2)

t = u_x − R·u_p    (3)

u_x = (1/n) Σ_{j=1..n} x_j    (4)

u_p = (1/n) Σ_{j=1..n} p_j    (5)

X′ = {x_j − u_x} = {x′_j}    (6)

P′ = {p_j − u_p} = {p′_j}    (7)

W = Σ_{j=1..n} x′_j·p′_j^T    (8)

W = U·Σ·V^T    (9)

where ^T denotes a transpose; U and V are unitary matrices, U an m×m matrix and V an n×n matrix; Σ is an m×n matrix whose entries are all 0 except for the elements on its main diagonal, each of which is called a singular value; u_x is the mean of the n values x, u_p is the mean of the n values p, and n is the number of the values x and p.
According to the singular value decomposition (singular value decomposition, SVD) principle, multiplying the transpose of W by W (i.e. W^T·W) yields the eigenvectors that form the matrix V, and multiplying W by its transpose (i.e. W·W^T) yields the eigenvectors that form the matrix U; the singular values in Σ satisfy the following formula (10):

σ_j = √λ_j    (10)

In the formula (10), v_j is an eigenvector forming V, u_j is an eigenvector forming U, λ_j is the corresponding eigenvalue shared by W^T·W and W·W^T, and σ_j is the j-th singular value in Σ, j = 1, 2, 3, ...
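As a minimal numerical sketch of the above procedure (assuming NumPy is available; the point coordinates, rotation angle and translation below are made-up test values, not from the embodiment), the compensation rotation matrix R and compensation translation matrix t can be recovered from the two position sets by singular value decomposition:

```python
import numpy as np

def compensation_transform(X, P):
    """Estimate R and t such that x_i ≈ R p_i + t, via SVD (formulas (2)-(9))."""
    u_x = X.mean(axis=0)         # centroid of road-side positions
    u_p = P.mean(axis=0)         # centroid of sensor-perceived positions
    Xc = X - u_x                 # (6) centered set X'
    Pc = P - u_p                 # (7) centered set P'
    W = Xc.T @ Pc                # cross-covariance: sum of x'_j p'_j^T
    U, _, Vt = np.linalg.svd(W)  # (9) W = U Σ V^T
    R = U @ Vt                   # (2) compensation rotation matrix
    t = u_x - R @ u_p            # (3) compensation translation matrix
    return R, t

# Made-up example: sensor positions P differ from road-side positions X
# by a known rotation R_true and translation t_true.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([1.0, -2.0])
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 3.0]])
X = P @ R_true.T + t_true        # x_i = R_true p_i + t_true

R, t = compensation_transform(X, P)
P_comp = P @ R.T + t             # (1) compensated positions p'_i = R p_i + t
```

With noise-free inputs, P_comp coincides with X up to floating-point error; in practice both position sets are noisy and the SVD solution minimizes the squared error. A production implementation would additionally guard against the reflection case det(U·V^T) = −1.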
In some embodiments, the data fusion method 300 provided in the embodiments of the present application may further include: sending the vehicle-end information of the first vehicle, which includes the own-vehicle information of the first vehicle and the information of the abnormal obstacles of the first vehicle, to the road-end device, so that the road-end device 120 can collect the information of each vehicle in its jurisdiction road section to perform the related processing.
In some embodiments, the information of the abnormal obstacle of the first vehicle may include the road-end code of the abnormal obstacle (i.e., another vehicle whose perceived state is determined by the first vehicle to be abnormal), the position of the abnormal obstacle detected by the first vehicle, the speed of the abnormal obstacle detected by the first vehicle, the orientation of the abnormal obstacle detected by the first vehicle, the size of the abnormal obstacle, a time stamp, and the like. Here, the time stamp indicates the time at which the information of the abnormal obstacle was obtained. Here, the position and orientation of the abnormal obstacle detected by the first vehicle are obtained by fusing the positions and orientations of the abnormal obstacle perceived by each sensor or each group of sensors in the first vehicle. Here, the fusion may be implemented using an algorithm such as unscented (lossless) Kalman filtering.
In some embodiments, the own-vehicle information of the first vehicle may include: the road-end code of the first vehicle, the position of the first vehicle detected by the first vehicle itself, the speed of the first vehicle detected by the first vehicle itself, the orientation of the first vehicle detected by the first vehicle itself, the size of the first vehicle, a time stamp, and the like. Here, the time stamp indicates the time at which the own-vehicle information was obtained. Here, the position and orientation detected by the first vehicle itself are obtained by fusing, with an algorithm such as unscented (lossless) Kalman filtering, the position and orientation of the first vehicle perceived by each sensor or each group of sensors in the first vehicle.
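The vehicle-end message described above can be sketched as a simple data structure. The class and field names below are illustrative assumptions; the embodiment only lists which items the information may contain, not a concrete format:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ObjectInfo:
    """One object's entry (own vehicle or abnormal obstacle); names are assumed."""
    road_end_code: Optional[int]   # code assigned by the road-end device 120
    position: Tuple[float, float]  # fused position (e.g. via unscented Kalman filtering)
    speed: float
    orientation: float
    size: Tuple[float, float]
    timestamp: float               # time at which the information was obtained

@dataclass
class VehicleEndInfo:
    own_vehicle: ObjectInfo                      # own-vehicle information
    abnormal_obstacles: List[ObjectInfo] = field(default_factory=list)

# A first vehicle (road-end code 1) reporting one abnormal obstacle (code 2).
msg = VehicleEndInfo(
    own_vehicle=ObjectInfo(1, (10.0, 3.5), 12.0, 0.0, (4.6, 1.9), 1700000000.0),
    abnormal_obstacles=[ObjectInfo(2, (25.0, 3.6), 11.5, 0.1, (4.4, 1.8), 1700000000.0)],
)
```

If the vehicle detects no abnormal obstacle, `abnormal_obstacles` simply stays empty, matching the case where the vehicle-end information contains only the own-vehicle information.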
Taking the scenario of fig. 1 as an example, assuming that the perceived state of the vehicle 2 is abnormal, the vehicle 1 may determine that the vehicle 2 is one of its own abnormal obstacles, and the information of the abnormal obstacles of the vehicle 1 may include the information of the vehicle 2 detected by the vehicle 1. Further, assuming that the perceived states of both the vehicle 2 and the vehicle 3 are abnormal, the vehicle 1 may determine that both the vehicle 2 and the vehicle 3 are its own abnormal obstacles, and the information of the abnormal obstacles determined by the vehicle 1 may include the information of the vehicle 2 detected by the vehicle 1 and the information of the vehicle 3 detected by the vehicle 1.
In some embodiments, the data fusion method 300 provided in the embodiments of the present application may further include: issuing an alarm in one or more of the following cases:
1) Determining that a perceived state of the first vehicle is abnormal;
here, the perceived state abnormality of the first vehicle includes, but is not limited to: 1) Judging that the sensing state of the first vehicle is abnormal according to the state value of the first vehicle; 2) The abnormal obstacle proportion of the first vehicle is greater than a first proportion threshold.
2) An error sensor of the first vehicle is detected;
Here, regardless of whether the error sensor belongs to the predetermined necessary sensors, an alarm can be given to prompt the user to check the condition of the relevant sensor in time and repair it in time.
3) When the error sensor belongs to a predetermined necessary sensor.
Here, if the error sensor belongs to a sensor in the pre-configured necessary sensor list, an alarm can prompt the user to stop and check in time and have a driver take over, so as to ensure driving safety and avoid accidents caused by the sensor error.
In some embodiments, the manner of alerting may include, but is not limited to, the following: 1. text and icon prompts on the instrument; 2. an alarm sound prompt; 3. a voice broadcast prompt; 4. a steering wheel vibration reminder; 5. an air-conditioner blowing prompt; 6. an ambience-light change prompt; 7. raising the music volume as a reminder; 8. automatically navigating on the map to a nearby rest area as a prompt; 9. switching from the manual driving mode to the auxiliary driving mode; 10. the automatic driving system actively taking over the driving operation. It will be appreciated that the instrument text and icon prompts may include text and/or icon prompts presented by a vehicle-mounted display system such as a center control screen, dashboard or head-up display (HUD), or by a mobile terminal (e.g., cell phone, tablet, bracelet) connected to the vehicle.
FIG. 4 illustrates a schematic diagram of an alarm for an error sensor in an exemplary scenario in an embodiment of the present application. In the example of fig. 4, when a sensor is in error, an alarm is given through a text prompt on the instrument.
Fig. 5 shows a flowchart of a data fusion method 500 according to an embodiment of the present application. Referring to fig. 5, a method 500 for fusing data at a road end may include the following steps:
step S510, obtaining vehicle end information of other vehicles in a road section where a first vehicle is located;
here, when the first vehicle perceived state is abnormal, the vehicle-end information of the other vehicle includes information of the first vehicle detected by the other vehicle. When the first vehicle perceived state is normal, the vehicle-end information of the other vehicle may not include the information of the first vehicle detected by the other vehicle.
Step S520, obtaining fusion information of the first vehicle according to the information of the first vehicle detected by other vehicles;
in step S530, a fusion obstacle list is sent to the first vehicle, where the fusion obstacle list includes fusion information of the first vehicle.
In some embodiments, the end-of-vehicle information may include information of an own vehicle detected by the vehicle through its sensor system and an end-of-vehicle obstacle list including information of obstacles detected by the vehicle through its sensor system.
The vehicle-end obstacle list may contain information of obstacles detected by the vehicle through its sensor system, and the information of the obstacles in the vehicle-end obstacle list may include, but is not limited to, type, position, size, speed, orientation, time stamp, etc. of the obstacles. Here, the information of the obstacle in the vehicle-end obstacle list is obtained by fusing the obstacle information of each sensing node in the sensor system, and each sensing node corresponds to one or a group of sensors.
Since vehicles on the same road section are obstacles to each other, the obstacles in their surrounding environments largely coincide; therefore, a great amount of redundant information exists in the obstacle information of the vehicles. To avoid the computing-resource consumption and time delay caused by this redundancy, the vehicle-end information of a vehicle may include information of only some obstacles; for example, it may include only the abnormal obstacles of the vehicle. That is, in some embodiments, the vehicle-end information may include the own-vehicle information of the vehicle and the information of its abnormal obstacles. If the vehicle does not detect an abnormal obstacle, the vehicle-end information may include only the own-vehicle information of the vehicle. In this way, the redundancy of the vehicle-end information can be greatly reduced, the operation amount and computational complexity can be lowered, and the processing efficiency of the road-end device 120 can be effectively improved, so as to meet the real-time data requirements of scenarios such as unmanned driving.
In some embodiments, the fused obstacle list may contain fused information of vehicles whose perceived states are abnormal. In some examples, to facilitate the distinguishing and querying of the information, the fused obstacle list may further include a road-end code of the vehicle with abnormal perceived state, where the road-end code may be recorded in the fused information of the corresponding vehicle. Taking fig. 1 as an example, assuming that the perception states of the vehicles 2 and 3 are abnormal and the perception states of the vehicles 1, 4 and 5 are normal, the fusion obstacle list may include fusion information and road end codes of the vehicles 2 and fusion information and road end codes of the vehicles 3.
In some embodiments, the vehicle-end status list may include the status values of the vehicles in the road section where the first vehicle is located. In some examples, for ease of distinguishing and querying, the vehicle-end status list may also include the road-end codes of the respective vehicles, recorded in correspondence with the status values of the vehicles. Taking fig. 1 as an example, the vehicle-end status list may include the status values and road-end codes of the vehicle 1, the vehicle 2, the vehicle 3, the vehicle 4 and the vehicle 5 in the road section H1.
In some embodiments, step S510 may further include: acquiring own vehicle information of a first vehicle; step S520 may further include: updating the state value of the first vehicle according to the information of the first vehicle detected by other vehicles and the own vehicle information of the first vehicle; step S530 may further include: an end of vehicle status list is sent to the first vehicle, the end of vehicle status list including status values of the first vehicle.
In some embodiments, taking as an example that the state value is used to indicate the number of first other vehicles in the road section where the vehicle is located, updating the state value of the first vehicle according to the information of the first vehicle detected by other vehicles and the own-vehicle information of the first vehicle may include: determining, for each other vehicle, whether the information of the first vehicle detected by that vehicle matches the own-vehicle information of the first vehicle, and adding 1 to the state value of the first vehicle for each other vehicle whose detected information does not match.
Here, determining whether the information of the first vehicle detected by each other vehicle matches the own vehicle information of the first vehicle may include: and judging whether the information of the first vehicle detected by each other vehicle and the own vehicle information of the first vehicle meet a second preset condition or not. If the second preset condition is met, the two are matched, and if the second preset condition is not met, the two are not matched. For details regarding the second preset condition, reference may be made to the foregoing related description, which is not repeated here.
Taking the vehicle 1 in the scenario shown in fig. 1 as an example, the road-side device 120 determines whether the own-vehicle information of the vehicle 1 and the abnormal-obstacle information carrying the same road-end code in the vehicle-end information of each other vehicle (e.g., the vehicle 2 and the vehicle 3) satisfy the second preset condition. For each other vehicle that does not satisfy the second preset condition, the information of the vehicle 1 detected by that vehicle deviates too much from the own-vehicle information of the vehicle 1, and the state value of the vehicle 1 is increased by 1; if all other vehicles satisfy the second preset condition, or no abnormal-obstacle information with the same road-end code as the vehicle 1 exists, the deviation between the information of the vehicle 1 detected by the other vehicles and the own-vehicle information of the vehicle 1 is within a reasonable range, and the state value of the vehicle 1 can remain unchanged.
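The state-value update described above can be sketched as follows. This is a minimal illustration that assumes the second preset condition is a simple distance check against the second distance threshold mentioned later in the embodiment; the function name and parameters are illustrative:

```python
from math import hypot

def update_state_value(state_value, own_pos, detections,
                       second_distance_threshold, max_other_vehicles):
    """Add 1 to a vehicle's state value for every other vehicle whose detected
    position of this vehicle deviates beyond the second distance threshold.
    `detections` holds the positions of this vehicle reported by other vehicles."""
    for detected_pos in detections:
        dx = detected_pos[0] - own_pos[0]
        dy = detected_pos[1] - own_pos[1]
        if hypot(dx, dy) > second_distance_threshold:
            state_value += 1
    # The state value is capped by the number of other vehicles in the road section.
    return min(state_value, max_other_vehicles)

# Vehicle-1-style example: state value 3, four mismatching reports, 4 other vehicles.
new_state = update_state_value(3, (0.0, 0.0), [(6.0, 0.0)] * 4, 2.0, 4)
```

In this example the state value would first rise to 7 and then be capped at 4, mirroring the vehicle 1 walkthrough later in this section.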
In some embodiments, the vehicle-end information of the other vehicles further includes own vehicle information of the other vehicles, the vehicle-end status list further includes status values of the other vehicles, and the data fusion method 500 of the road end may further include:
step S540, updating the road-side obstacle list according to the own vehicle information and the state value of other vehicles and the own vehicle information and the state value of the first vehicle;
in some embodiments, the road-side obstacle list may include own vehicle information of vehicles with normal perception states in a road section where the first vehicle is located. In addition, the road-side obstacle list may further include information of other vehicles, for example, own vehicle information of the vehicle that communicates for the first time.
To facilitate distinguishing information of different vehicles, in some examples, the road-side obstacle list may include road-side codes of vehicles with normal perception states, own vehicle information of vehicles that communicate for the first time, and road-side codes allocated to the vehicles, and the road-side codes may be recorded in the own vehicle information of the corresponding vehicles. Taking fig. 1 as an example, assuming that the perceived states of the vehicles 2 and 3 are abnormal, the vehicle 1 communicates for the first time, the perceived states of the vehicles 4 and 5 are normal, the road-side obstacle list of the road section H1 may include the own vehicle information and road-side code of the vehicle 1, the own vehicle information and road-side code of the vehicle 4, and the own vehicle information and road-side code of the vehicle 5.
In some embodiments, for a vehicle that is not communicating for the first time, the state value of the vehicle may be queried from the vehicle-end state list according to the road-end code of the vehicle. If the perceived state of the vehicle is determined to be normal according to the state value, the own-vehicle information provided by the vehicle this time may be updated into the road-end obstacle list; if the perceived state of the vehicle is determined to be abnormal according to the state value, the corresponding information of the vehicle is removed from the road-end obstacle list.
In some embodiments, for a first-time communication vehicle, after a road-end code is allocated to the vehicle, the road-end code of the vehicle may be recorded in own vehicle information of the vehicle, and the own vehicle information of the vehicle may be added to the road-end obstacle list.
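The handling of first-communication and non-first-communication vehicles described in the last two paragraphs can be sketched as follows. The dict-based list format, key names and the `next_code` parameter are assumptions for illustration only:

```python
def update_road_end_obstacle_list(obstacle_list, vehicle_info, state_value,
                                  state_threshold, next_code):
    """obstacle_list maps road-end code -> own-vehicle info.
    vehicle_info carries a 'road_end_code' entry (None on first communication)."""
    code = vehicle_info.get("road_end_code")
    if code is None:                      # first communication: assign a road-end code
        code = next_code
        vehicle_info["road_end_code"] = code
        obstacle_list[code] = vehicle_info
    elif state_value <= state_threshold:  # perceived state normal: refresh the entry
        obstacle_list[code] = vehicle_info
    else:                                 # perceived state abnormal: remove the entry
        obstacle_list.pop(code, None)
    return code

# First communication: the vehicle has no road-end code yet, so one is assigned.
road_list = {}
code = update_road_end_obstacle_list(
    road_list, {"road_end_code": None, "position": (0.0, 0.0)}, 0, 2, 7)
```

A later report from the same vehicle with a state value above the threshold would remove its entry, as step S606 of the flow below describes.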
Step S550, a road-side obstacle list is transmitted to the first vehicle and the other vehicles.
According to the data fusion method 500, the fusion information of a vehicle with an abnormal perceived state is obtained by collecting the information detected by a plurality of vehicles on the same road section, so that, when its own perceived state is abnormal, the vehicle can use the fusion information instead of its own-vehicle information to execute subsequent planning, decision-making and other processes, ensuring that vehicles with abnormal perceived states can still run stably and safely.
In a specific application, the data fusion method 500 of the embodiment of the present application may be executed by the road side device 120, or may be executed by the vehicle side device 110 of the vehicle.
The following describes specific implementations of embodiments of the present application with reference to examples.
Fig. 6 shows a schematic flow chart of an exemplary implementation of the data fusion method according to the embodiment of the present application. Referring to fig. 6, a specific implementation flow of the data fusion method performed by the road-end device 120 may include the following steps:
step S601, the road side device 120 receives the vehicle side information of each vehicle in the self-administered road section;
the vehicle-end information includes own vehicle information of the vehicle and information of an abnormal obstacle of the vehicle. If the vehicle does not have an abnormal obstacle, the vehicle-end information may include only the own vehicle information of the vehicle. The own vehicle information includes road end codes of vehicles, and the information of the abnormal obstacle includes road end codes of the abnormal obstacle.
Step S602, it is judged whether there is a vehicle communicating for the first time; steps S603-S604 are continued for vehicles communicating for the first time, and step S605 is continued for vehicles not communicating for the first time;
If the road-end code in the vehicle-end information of the vehicle is an initial value or empty, the communication is the first communication; if the road-end code in the vehicle-end information is not the initial value and is valid (for example, satisfies a preset coding rule), the communication is a non-first communication.
In step S603, the road-end device 120 allocates a road-end code to the vehicle that communicates for the first time, and sends the road-end code to the corresponding vehicle.
In step S604, the road-side device 120 records the road-side code of the vehicle that is first communicated into the own vehicle information of the vehicle, and adds the own vehicle information of the vehicle to the road-side obstacle list, and returns to step S609.
Step S605, for the non-first communication vehicles, the road side device 120 determines whether the perceived status of the vehicle is abnormal according to the vehicle side status list, and for the non-first communication vehicles with abnormal perceived status, continues to execute step S606, and for the non-first communication vehicles with normal perceived status, continues to execute step S607;
step S606, for vehicles with abnormal perception states, removing own vehicle information of the vehicles from a road-end obstacle list, obtaining fusion information of the vehicles, adding the fusion information into the fusion obstacle list, and jumping to step S609;
First, obstacle information containing the road-end code of a vehicle with an abnormal perceived state is extracted from the information of the abnormal obstacles of the other vehicles; this obstacle information is the information of the vehicle with the abnormal perceived state as detected by the other vehicles. Then, an algorithm such as unscented (lossless) Kalman filtering is used to fuse the information of the vehicle with the abnormal perceived state detected by the other vehicles, so that the fusion information of the vehicle with the abnormal perceived state can be obtained.
Step S607, for vehicles with normal perception state, updating the information of the vehicle in the road-end obstacle list to the own vehicle information provided by the vehicle at the time;
here, the updating process may include: searching road side obstacle list containing road side coded information of vehicle, replacing said information with vehicle information provided by said vehicle.
In step S608, the status value of each vehicle in the vehicle-end status list is updated.
Here, the roadside apparatus 120 updates the state value of the vehicle based on the own vehicle information of each vehicle and information of an abnormal obstacle of the other vehicle. Specific details are set forth in the foregoing description and are not repeated here.
Step S609, the latest road-side obstacle list, the fusion obstacle list and the vehicle-side state list are sent to each vehicle in the self-administered road section, and the step S601 is returned.
Taking the scenario shown in fig. 1 as an example (the vehicles 4 and 5 are omitted in the figure), the road-end device 120 has allocated the road-end code 1, the road-end code 2, the road-end code 3, the road-end code 4 and the road-end code 5 to the vehicles 1, 2, 3, 4 and 5, respectively, where the road-end code 1 is used to uniquely identify the vehicle 1, the road-end code 2 is used to uniquely identify the vehicle 2, the road-end code 3 is used to uniquely identify the vehicle 3, the road-end code 4 is used to uniquely identify the vehicle 4, and the road-end code 5 is used to uniquely identify the vehicle 5. Assume that the state value of the vehicle 1 is 3, the state values of the vehicles 2 to 5 (i.e., the vehicle 2, the vehicle 3, the vehicle 4 and the vehicle 5) are 0, the perceived state of the vehicle 1 is abnormal, and the perceived states of the vehicles 2 to 5 are normal.
The abnormal obstacles of the vehicle 1 include the vehicles 2 to 5, and the abnormal obstacles of each of the vehicles 2 to 5 include the vehicle 1. Since the abnormal obstacle proportion of the vehicle 1 exceeds its first proportion threshold, the vehicle-end information of the vehicle 1 does not contain the information of its abnormal obstacles and contains only its own-vehicle information; that is, the vehicle-end information transmitted from the vehicle 1 to the road-end device 120 includes: the own-vehicle information of the vehicle 1 (including the road-end code 1). Since the perceived state of the vehicle 2 is normal, its abnormal obstacle proportion does not exceed its first proportion threshold; therefore, the vehicle-end information of the vehicle 2 includes the information of its abnormal obstacle and its own-vehicle information, that is, the vehicle-end information transmitted from the vehicle 2 to the road-end device 120 includes: the own-vehicle information of the vehicle 2 (including the road-end code 2) and the information of the vehicle 1 detected by the vehicle 2 (including the road-end code 1).
As with the vehicle 2, the vehicle-side information transmitted from the vehicle 3 to the road-side device 120 includes: the own vehicle information (including the road end code 3) of the vehicle 3 and the information (including the road end code 1) of the vehicle 1 detected by the vehicle 3, and the vehicle end information transmitted from the vehicle 4 to the road end device 120 includes: the own vehicle information (including the road end code 4) of the vehicle 4 and the information (including the road end code 1) of the vehicle 1 detected by the vehicle 4, and the vehicle end information transmitted from the vehicle 5 to the road end device 120 includes: the own vehicle information (including the road-end code 5) of the vehicle 5 and the information (including the road-end code 1) of the vehicle 1 detected by the vehicle 5.
The state value of the vehicle 1 is greater than a preset state threshold value x (assuming x=2), the road side device 120 may confirm that the perceived state of the vehicle 1 is abnormal, may obtain fusion information of the vehicle 1 by fusing the vehicle 1 information detected by the vehicle 2 and the vehicle 1 information detected by the vehicle 3, add the fusion information of the vehicle 1 to the fusion obstacle list, and delete own vehicle information of the vehicle 1 in the road side obstacle list.
The state values of the vehicles 2 to 5 are all smaller than the state threshold value x (assuming x=2), and the road-side device 120 can confirm that the perception states of the vehicles 2 to 5 are normal, and update the own vehicle information of the vehicles 2 to 5 into the road-side obstacle list.
The process of the road side device 120 updating the state values of the vehicle 1, the vehicle 2, and the vehicle 3 in the vehicle side state list is as follows:
The distance between the position in the information of the vehicle 1 detected by the vehicle 3 and the position in the own-vehicle information of the vehicle 1 exceeds the second distance threshold, and the distance between the position in the information of the vehicle 1 detected by the vehicle 2 and the position in the own-vehicle information of the vehicle 1 also exceeds the second distance threshold (the same holds for the vehicles 4 and 5), so the state value of the vehicle 1 is increased by 4, changing from 3 to 7. However, because the road section H1 has 5 vehicles in total, the abnormal obstacles of the vehicle 1 can contain at most 4 other vehicles; therefore, the state value of the vehicle 1 is finally updated to 4.
For the vehicles 2 to 5, since no abnormal-obstacle information containing their road-end codes exists, that is, these vehicles do not belong to the abnormal obstacles of any other vehicle, their state values remain 0.
It can be seen that only the state value of the vehicle 1 exceeds the state threshold x, while the state values of the vehicles 2 to 5 do not exceed the state threshold x; it can thus be confirmed that the perceived state of the vehicle 1 is abnormal and the perceived states of the vehicles 2 to 5 are normal.
Finally, the roadside apparatus 120 transmits the updated fusion obstacle list, roadside obstacle list, and vehicle-end state list to the vehicles 1 to 5. The road side obstacle list comprises own vehicle information of vehicles 2-5, and the vehicle side state list comprises state values of the vehicles 1-5 and road side codes thereof.
Fig. 7 shows a schematic flow chart of an exemplary implementation of the data fusion method according to the embodiment of the present application. Referring to fig. 7, a specific implementation flow of the data fusion method performed by the vehicle-end device 110 may include the following steps:
in step S701, the vehicle receives a road-side obstacle list, a fusion obstacle list, and a vehicle-side status list from the road-side device, and the vehicle acquires the vehicle information and the vehicle-side obstacle list through the own sensor system.
The vehicle-end obstacle list includes information of obstacles detected by the vehicle.
Step S702, the vehicle determines whether the self-perceived state is normal, and continues to step S703 when the perceived state is abnormal, and continues to step S704 when the perceived state is normal.
Specifically, the vehicle queries its own state value from the vehicle-end state list according to its own road-end code and determines whether the state value exceeds the state threshold x. If the state value exceeds the threshold x, the vehicle determines that its perceived state is abnormal and continues to step S703; if the state value is less than or equal to the state threshold x, it determines that the perceived state is normal and continues to step S704.
Step S703, positioning drift has occurred: an alarm is issued, the own-vehicle information is replaced with the fusion information corresponding to the vehicle's road-end code in the fusion obstacle list, the fusion information is provided to the regulation node as effective positioning information, and step S705 is continued.
In this step, time compensation may be performed on the fusion information of the vehicles in the fusion obstacle list, if necessary.
Step S704, the vehicle information is used as effective positioning information and provided to the regulation node, and step S705 is continued.
Step S705, performing time compensation on the information in the road side obstacle list to obtain a road side obstacle list consistent with the time stamp of the vehicle side obstacle list;
Because of the short time delay in data transmission, the timestamps of the information in the road-end obstacle list are earlier than the timestamp of the vehicle-end obstacle list; through time compensation, a road-end obstacle list whose timestamp is consistent with the information detected by the vehicle can be obtained. The time compensation may include compensation of information such as position and orientation. For example, the compensation of the position may be: predicting, from the speed, position and timestamp in the own-vehicle information of a vehicle in the road-end obstacle list, the position of that vehicle at the moment indicated by the timestamp of the vehicle-end obstacle list.
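The position compensation described above amounts to a linear extrapolation over the timestamp gap. The following sketch assumes an entry format with illustrative keys (position, speed, orientation, timestamp); the embodiment does not prescribe a concrete format:

```python
from math import cos, sin

def time_compensate(entry, target_timestamp):
    """Extrapolate a road-end obstacle-list entry to the vehicle-end timestamp:
    predicted position = position + velocity * elapsed time."""
    dt = target_timestamp - entry["timestamp"]   # road-end stamp is earlier, so dt >= 0
    x, y = entry["position"]
    vx = entry["speed"] * cos(entry["orientation"])
    vy = entry["speed"] * sin(entry["orientation"])
    compensated = dict(entry)
    compensated["position"] = (x + vx * dt, y + vy * dt)
    compensated["timestamp"] = target_timestamp
    return compensated

# A vehicle moving at 10 m/s along the x-axis, compensated over a 0.1 s delay.
entry = {"position": (0.0, 0.0), "speed": 10.0, "orientation": 0.0, "timestamp": 100.0}
comp = time_compensate(entry, 100.1)
```

Orientation compensation could be handled analogously from a yaw rate, but that quantity is not listed among the transmitted fields, so it is omitted here.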
Step S706, the Hungarian matching algorithm is used to match the time-compensated road-end obstacle list with the vehicle-end obstacle list, so as to distinguish the first obstacles from the second obstacles.
The first obstacle, which may be referred to as a matching obstacle, is an obstacle included in both the road-side obstacle list and the vehicle-side obstacle list. The second obstacle, which may be referred to as a non-matching obstacle, is an obstacle that is included in the road-side obstacle list and is not included in the vehicle-side obstacle list.
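The separation into first (matched) and second (non-matched) obstacles can be illustrated as follows. The embodiment uses the Hungarian algorithm; this sketch substitutes a brute-force minimum-cost assignment (adequate only for small lists), and the distance gate value is a made-up parameter:

```python
from itertools import permutations
from math import hypot

def match_obstacles(road_list, vehicle_list, gate=2.0):
    """Return (matched (road, vehicle) index pairs, indices of second obstacles).
    Brute-force stand-in for the Hungarian algorithm of step S706; assumes
    len(vehicle_list) <= len(road_list)."""
    n_r = len(road_list)
    best, best_cost = None, float("inf")
    # Try every injective assignment of vehicle-end obstacles to road-end obstacles.
    for perm in permutations(range(n_r), len(vehicle_list)):
        cost = sum(hypot(road_list[r][0] - vehicle_list[v][0],
                         road_list[r][1] - vehicle_list[v][1])
                   for v, r in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    matched = [(r, v) for v, r in enumerate(best)
               if hypot(road_list[r][0] - vehicle_list[v][0],
                        road_list[r][1] - vehicle_list[v][1]) <= gate]
    matched_road = {r for r, _ in matched}
    second = [r for r in range(n_r) if r not in matched_road]
    return matched, second

# Two vehicle-end detections matched against three road-end entries:
# the third road-end entry has no counterpart, so it is a second obstacle.
matched, second = match_obstacles([(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)],
                                  [(0.1, 0.0), (10.2, 0.0)])
```

A production implementation would use a polynomial-time Hungarian solver (e.g. `scipy.optimize.linear_sum_assignment`) rather than enumerating permutations.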
In step S707, the information of the second obstacle in the road-end obstacle list after the time compensation is added to the fusion output obstacle queue of the vehicle.
Here, the fusion output obstacle queue includes effective obstacle information of the vehicle.
Here, the second obstacle is likely to be a beyond-line-of-sight obstacle of the vehicle (i.e., an obstacle beyond the sensing range of the vehicle's sensor system); by adding the information of the second obstacles to the fusion output obstacle queue of the vehicle, the integrity of the obstacle information of the vehicle can be improved, overcoming the limitation of the vehicle's sensing range.
Step S708, finding out an abnormal obstacle in the first obstacle, and recording a road-end code of the abnormal obstacle;
for each first obstacle, judging whether the first obstacle meets a first preset condition according to the information of the first obstacle in the road-end obstacle list after time compensation and the information of the first obstacle in the vehicle-end obstacle list, if so, the first obstacle does not belong to an abnormal obstacle, and if not, the first obstacle is an abnormal obstacle. For the first preset condition, reference may be made to the foregoing related description, and details are not repeated here.
Step S709, for the first obstacle not belonging to the abnormal obstacle, updating the fusion output obstacle queue of the vehicle by using the own vehicle information of the first obstacle in the road-side obstacle list after the time compensation.
Specifically, for the first obstacle not belonging to the abnormal obstacle, the vehicle information of the first obstacle in the road-end obstacle list after time compensation can be directly added into a fusion output obstacle queue of the vehicle; or, the information of the first obstacle in the vehicle-end obstacle list can be added to a fusion output obstacle queue of the vehicle; or, the vehicle information of the first obstacle in the road-end obstacle list after time compensation and the information of the first obstacle in the vehicle-end obstacle list can be fused to obtain the fusion information of the first obstacle, and the fusion information of the first obstacle is added into a fusion output obstacle queue of the vehicle.
Step S710, it is determined whether the abnormal obstacle proportion of the vehicle is less than or equal to a preset first proportion threshold. If the abnormal obstacle proportion of the vehicle is less than or equal to the first proportion threshold, the vehicle perception data are available, and step S711 is continued; if the abnormal obstacle proportion of the vehicle is greater than the first proportion threshold, the vehicle perception data are not available, and the error-sensor detection and parameter compensation process is started, that is, steps S712-S719 are continued.
Here, the abnormal obstacle proportion of the vehicle is equal to a ratio of the number of abnormal obstacles of the vehicle to the first number of obstacles of the vehicle.
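The decision of step S710 reduces to a single ratio test. The 0.3 threshold below is an illustrative placeholder for the preset first proportion threshold, which the embodiment leaves unspecified:

```python
def perception_data_available(num_abnormal, num_first, first_ratio_threshold=0.3):
    """Step S710: perception data is considered available when the ratio of
    abnormal obstacles to first obstacles does not exceed the first
    proportion threshold."""
    if num_first == 0:
        return True  # no first obstacles to compare against
    return (num_abnormal / num_first) <= first_ratio_threshold
```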
Step S711, the merged output obstacle queue of the vehicle is sent to the regulation node, the vehicle end information of the vehicle is sent to the road end device 120, and the process returns to step S701.
Here, the vehicle-end information may include the own vehicle information of the vehicle and the information of the abnormal obstacles of the vehicle. If no abnormal obstacle exists, the vehicle-end information may include only the own vehicle information of the vehicle.
Step S712, the perceived obstacle list of each sensing node is matched against the time-compensated road-end obstacle list to determine the first obstacles of that sensing node, the abnormal obstacles among the first obstacles of each sensing node are found, and the abnormal obstacle proportion of each sensing node is calculated.
Here, each sensing node corresponds to one or a group of sensors. The perceived obstacle list of the perceived node may be obtained from data acquired by the respective sensor, and the perceived obstacle list may include information of the obstacle perceived by the respective sensor, which may include, but is not limited to, a type, a position, a size, a speed, an orientation, a time stamp, and the like of the obstacle. In a specific application, the sensing ranges and sensing algorithms of different sensors are different, so that the obstacles in the sensing obstacle list of different sensing nodes are not identical.
Here, the abnormal obstacle ratio of the sensing node is equal to a ratio of the number of abnormal obstacles of the sensing node to the number of first obstacles of the sensing node.
Here, the matching algorithm is the same as that in the previous step S706, and the determination of the abnormal obstacle is similar to that in the previous step S706, and will not be described again.
Step S713, a sensor corresponding to a sensing node whose abnormal obstacle proportion is greater than or equal to a preset second proportion threshold is an error sensor; the error sensor is recorded in an error sensor list, and steps S714 and S715 are continued.
Here, the sensor corresponding to the sensing node whose abnormal obstacle proportion is smaller than the second proportion threshold is a normal sensor, and for the normal sensor, it is not necessary to perform the subsequent processing.
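Steps S712-S713 can be sketched as the following partition of sensing nodes by their abnormal obstacle proportions. The per-node ratios are assumed to have been computed already by matching against the time-compensated road-end list, and the 0.5 threshold is illustrative:

```python
def find_error_sensors(node_abnormal_ratios, second_ratio_threshold=0.5):
    """Steps S712-S713: sensors of sensing nodes whose abnormal obstacle
    proportion reaches the second proportion threshold go into the error
    sensor list; the remaining sensors are treated as normal and need no
    further processing."""
    error_list, normal_list = [], []
    for sensor, ratio in node_abnormal_ratios.items():
        if ratio >= second_ratio_threshold:
            error_list.append(sensor)
        else:
            normal_list.append(sensor)
    return error_list, normal_list
```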
Step S714, an alarm is issued to prompt the user with the location of the error sensor, and step S715 may be continued.
For example, the position of the error sensor may be displayed to the user through a display device of the vehicle, so that the user can intuitively and accurately confirm the error sensor.
Step S715, judging whether a necessary sensor exists in the error sensor list.
Specifically, the error sensor list may be matched against a necessary sensor list to determine whether, and which of, the error sensors are necessary sensors. Here, the necessary sensor list may be preconfigured, or may be generated by querying the priorities of the sensors.
Here, a necessary sensor is a sensor necessary for vehicle positioning, or a sensor whose priority is higher than a predetermined value; an unnecessary sensor is a sensor not necessary for vehicle positioning, or a sensor whose priority is lower than the predetermined value.
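The matching of step S715 amounts to a set intersection between the two lists. The sensor names below are hypothetical:

```python
def check_necessary_sensors(error_sensor_list, necessary_sensor_list):
    """Step S715: return the error sensors that are also necessary sensors.
    A non-empty result triggers the driver-takeover alarm of step S716;
    an empty result allows compensation per step S717."""
    necessary = set(necessary_sensor_list)
    return [s for s in error_sensor_list if s in necessary]
```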
Step S716, if the error sensor list includes a necessary sensor, that is, a necessary sensor is in error, an alarm is issued, for example prompting the driver to take over, and the process ends.
Step S717, if the error sensor list does not include any necessary sensor, that is, none of the error sensors is a necessary sensor, the information perceived by each error sensor may be compensated; this information includes the own vehicle information and the obstacle information detected by the error sensor.
Here, the compensation parameter of the error sensor may be first determined, and then the sensing information of the error sensor may be compensated using the compensation parameter of the error sensor. Specific details are set forth in the foregoing description and are not repeated here.
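As one plausible sketch of this compensation, not a formula prescribed by the embodiment, the compensation parameter could be the mean position offset between the error sensor's reports and the road-end reports of the same obstacles (for example, caused by a shifted sensor mounting position), which is then applied to subsequent perceived positions:

```python
def estimate_compensation(sensor_reports, road_end_reports):
    """Estimate a compensation parameter for an error sensor as the mean
    (dx, dy) offset between its reported positions and the road-end
    positions of the same obstacles (paired by index here)."""
    n = len(sensor_reports)
    dx = sum(r[0] - s[0] for s, r in zip(sensor_reports, road_end_reports)) / n
    dy = sum(r[1] - s[1] for s, r in zip(sensor_reports, road_end_reports)) / n
    return (dx, dy)

def compensate(position, offset):
    """Apply the compensation parameter to one perceived position."""
    return (position[0] + offset[0], position[1] + offset[1])
```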
Step S718, checking whether the compensated information of the error sensor is normal over a plurality of consecutive checks or within a preset time period. If so, the compensated error sensor is available, and the process returns directly to step S701; if not, the compensated error sensor is still unavailable, and step S719 is continued.
Specifically, if, when the number of checks reaches its upper limit or the preset time period elapses, the abnormal obstacle proportion determined from the compensated perception information of the error sensor is still greater than or equal to the second proportion threshold, the compensated information fails the checks, and the compensated error sensor remains unavailable. If, before the upper limit of checks or the end of the preset time period, the abnormal obstacle proportion determined from the compensated perception information falls below the second proportion threshold, the compensated information passes the checks, and the compensated error sensor is available.
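The verification loop of step S718 can be sketched as follows, assuming a sequence of abnormal obstacle proportions recomputed after compensation; the check-count limit and threshold values are illustrative:

```python
def verify_compensated_sensor(ratio_samples, second_ratio_threshold=0.5,
                              max_checks=5):
    """Step S718, sketched: the compensated sensor passes verification only
    if its recomputed abnormal obstacle proportion stays below the second
    proportion threshold across the consecutive checks; a failing sample
    means the sensor stays unavailable (leading to step S719)."""
    for ratio in ratio_samples[:max_checks]:
        if ratio >= second_ratio_threshold:
            return False  # still abnormal after compensation
    return True
```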
Step S719, reception of data from the corresponding error sensor is stopped, and the process returns to step S701.
Taking the scenario of fig. 1 as an example, vehicles 1, 2 and 3 are obstacles to one another, all traveling in the road segment managed by the same road-end device 120. Fig. 8 shows a schematic flow chart of a specific implementation for the exemplary scenario of fig. 1; vehicle 4 and vehicle 5 are omitted in fig. 8. In fig. 8, vehicles 2 and 3 are not communicating with the road-end device 120 for the first time, while vehicle 1 is: the road-end device 120 has already allocated road-end code 2 and road-end code 3 to vehicles 2 and 3, respectively, where road-end code 2 uniquely identifies vehicle 2 and road-end code 3 uniquely identifies vehicle 3. After establishing communication with vehicle 1, the road-end device 120 assigns vehicle 1 a road-end code 1, which uniquely identifies vehicle 1. Time t1 is earlier than time t2, and time t3 is later than time t2. It should be noted that fig. 8 is only an example and is not intended to limit the specific implementation of the embodiments of the present application.
Fig. 9 shows a schematic structural diagram of a data fusion apparatus according to an embodiment of the present application. Referring to fig. 9, the data fusion apparatus 900 may include:
a first obtaining unit 910, configured to obtain a fused obstacle list and a vehicle-end state list of a road section where a first vehicle is located, where the vehicle-end state list includes a state value of the first vehicle, the state value of the first vehicle is used to determine a perceived state of the first vehicle, and when the perceived state of the first vehicle is abnormal, the fused obstacle list includes fused information of the first vehicle;
a first determining unit 920, configured to use the fusion information of the first vehicle as the effective positioning information of the first vehicle when it is determined, according to the state value of the first vehicle, that the perceived state of the first vehicle is abnormal;
the fusion information of the first vehicle is obtained by fusing information of the first vehicle detected by other vehicles in a road section where the first vehicle is located.
In some embodiments, the first determining unit 920 may be further configured to use the vehicle information of the first vehicle as the effective positioning information of the first vehicle when it is determined that the perceived state of the first vehicle is normal according to the state value of the first vehicle.
In some embodiments, the state value of the first vehicle is used to indicate the number of first other vehicles in the road section where the first vehicle is located. Here, a first other vehicle is an other vehicle whose detected information of the first vehicle does not match the own vehicle information of the first vehicle, that is, an other vehicle that does not meet a second preset condition, the second preset condition including: the distance between the position of the first vehicle detected by the other vehicle and the position in the own vehicle information of the first vehicle is less than or equal to a preset second distance threshold.
In some embodiments, the first determining unit 920 may be further configured to determine the perceived state of the first vehicle according to the state value of the first vehicle and a preset state threshold. Specifically, the first determining unit 920 determines that the perceived state of the first vehicle is abnormal when the state value of the first vehicle is greater than the preset state threshold, and determines that the perceived state of the first vehicle is normal when the state value of the first vehicle is less than or equal to the preset state threshold.
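This determination is a single threshold comparison on the count of non-matching first other vehicles. The threshold value of 2 below is an illustrative placeholder for the preset state threshold:

```python
def perceived_state_is_abnormal(state_value, state_threshold=2):
    """The first determining unit's rule: the state value counts the first
    other vehicles whose detection of the first vehicle does not match its
    own vehicle information; the perceived state is abnormal when this
    count exceeds the preset state threshold."""
    return state_value > state_threshold
```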
In some embodiments, the first obtaining unit 910 may be further configured to obtain a road-side obstacle list of a road section where the first vehicle is located, where the road-side obstacle list includes vehicle information of vehicles with normal perception states in the road section where the first vehicle is located; the data fusion apparatus 900 may further include: a second determining unit 930 for obtaining effective obstacle information of the first vehicle according to the road-side obstacle list and information of the obstacle detected by the first vehicle.
In some embodiments, the second determining unit 930 may be specifically configured to: determine the first obstacles of the first vehicle and the abnormal obstacle proportion of the first vehicle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle; when the abnormal obstacle proportion of the first vehicle is less than or equal to a preset first proportion threshold, determine the information of each first obstacle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle, and add the information of the first obstacle to the effective obstacle information of the first vehicle. When the corresponding information of a first obstacle of the first vehicle in the road-end obstacle list does not match the information of that first obstacle detected by the first vehicle, the first obstacle is an abnormal obstacle of the first vehicle.
In some embodiments, the abnormal obstacle is a first obstacle that does not satisfy a first preset condition, the first preset condition including: the distance between the position of the first obstacle obtained through the road-side obstacle list and the position of the first obstacle detected by the first vehicle is smaller than or equal to a preset first distance threshold.
In some embodiments, the second determining unit 930 may further be configured to: for an obstacle in the road-end obstacle list other than the first obstacles, add the information of that obstacle obtained through the road-end obstacle list to the effective obstacle information of the first vehicle.
In some embodiments, the data fusion apparatus 900 may further include: a sensor detection unit 940 for detecting an error sensor of the first vehicle; the second determining unit 930 may be further configured to notify the sensor detecting unit to perform detection when the abnormal obstacle proportion of the first vehicle is greater than the first proportion threshold.
In some embodiments, the sensor detection unit 940 may be specifically configured to: determining a first obstacle of a first sensor, an abnormal obstacle of the first sensor and an abnormal obstacle proportion of the first sensor according to the obstacle information perceived by the first sensor of the first vehicle and the road-end obstacle list; determining that the first sensor is in error when the abnormal obstacle proportion of the first sensor is greater than or equal to a preset second proportion threshold value; when the corresponding information of the first obstacle of the first sensor in the road-end obstacle list is not matched with the information of the first obstacle perceived by the first sensor, the first obstacle of the first sensor is an abnormal obstacle of the first sensor.
In some embodiments, the data fusion apparatus 900 may further include: and a compensation parameter determining unit 950, configured to determine a compensation parameter of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-side obstacle list when the sensor detecting unit 940 detects that the first sensor of the first vehicle is an error sensor, where the compensation parameter of the first sensor is used for compensating the information perceived by the first sensor.
In some embodiments, the data fusion apparatus 900 may further include: the first transmitting unit 960 is configured to transmit, to the road-side device, vehicle-side information of the first vehicle, the vehicle-side information including own vehicle information of the first vehicle and information of an abnormal obstacle of the first vehicle.
In some embodiments, the data fusion apparatus 900 may further include: an alarm unit 970, specifically configured to alarm in one or more of the following cases: determining that a perceived state of the first vehicle is abnormal; an error sensor that detects a first vehicle; the error sensor of the first vehicle belongs to a predetermined necessary sensor.
Fig. 10 is a schematic structural diagram of another data fusion apparatus according to an embodiment of the present application. Referring to fig. 10, the data fusion apparatus 1000 may include:
a second obtaining unit 1010, configured to obtain vehicle-end information of other vehicles in the road section where the first vehicle is located, where, when the perceived state of the first vehicle is abnormal, the vehicle-end information of the other vehicles includes information of the first vehicle detected by the other vehicles;
a fusion unit 1020, configured to obtain fusion information of the first vehicle according to information of the first vehicle detected by the other vehicles;
the second transmitting unit 1030 is configured to transmit, to the first vehicle, a fused obstacle list including fused information of the first vehicle.
In some embodiments, the second obtaining unit 1010 may be further configured to obtain the vehicle information of the first vehicle; the data fusion device 1000 may further include: a state updating unit 1040 for updating a state value of the first vehicle according to the information of the first vehicle detected by the other vehicles and the own vehicle information of the first vehicle, the state value being used for determining a perceived state of the first vehicle; the second transmitting unit 1030 is further configured to transmit a vehicle-end state list to the first vehicle, where the vehicle-end state list includes a state value of the first vehicle.
In some embodiments, the vehicle-end information of the other vehicle further includes own vehicle information of the other vehicle, and the vehicle-end state list further includes a state value of the other vehicle; the data fusion device 1000 may further include: a road-side obstacle updating unit 1050, configured to update a road-side obstacle list according to the own vehicle information and the state value of the other vehicles, the own vehicle information and the state value of the first vehicle, where the road-side obstacle list includes the own vehicle information of the vehicles with normal perception states in the road section where the first vehicle is located; the second transmitting unit 1030 may also be used to transmit the road-side obstacle list to the first vehicle and other vehicles.
In some embodiments, the data fusion apparatus 900 may be disposed in the vehicle end device 110 or implemented by the vehicle end device 110, and the data fusion apparatus 1000 may be disposed in the vehicle end device 110 or the road end device 120, or may be implemented by the vehicle end device 110 or the road end device 120. In a specific application, the data fusion device 900 and/or the data fusion device 1000 may be implemented by software, hardware, or a combination of both.
Fig. 11 shows an exemplary functional framework diagram of the vehicle-end device 110. Referring to fig. 11, in addition to the foregoing data fusion apparatus 900, the vehicle-end device 110 may include a regulation node 1110, F sensing nodes 1120 (F being an integer greater than or equal to 1), and a fusion module 1130. Each sensing node 1120 corresponds to one sensor or one group of sensors and is configured to obtain perception information from the data collected by that sensor or group of sensors; the perception information may include own vehicle information perceived by the sensors and information of obstacles perceived by the sensors. The fusion module 1130 is configured to obtain the perception data of the vehicle by fusing the perception information obtained by the sensing nodes 1120; the perception data of the vehicle may include the own vehicle information detected by the vehicle and the information of the obstacles detected by the vehicle. In some examples, the fusion module 1130 may implement its fusion function using an algorithm such as Kalman filtering. The regulation node 1110 is configured to perform processing such as path planning according to the effective obstacle information and/or effective positioning information of the vehicle provided by the data fusion apparatus 900, and to provide the processing result, for example as a control signal, to the controller of the vehicle, so that the controller controls the vehicle to travel along the planned path in response to the control signal.
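To make the Kalman-filtering remark concrete, a minimal one-dimensional Kalman-style update is sketched below, showing how a current state estimate might be combined with one sensing node's measurement. The real fusion module 1130 operates on multi-dimensional states from F nodes; this scalar form is an assumption for illustration:

```python
def kalman_fuse(estimate, est_var, measurement, meas_var):
    """Minimal 1D Kalman-style update: weight the measurement by the
    relative variances, returning the fused value and its reduced
    variance. Lower-variance (more trusted) inputs dominate the result."""
    gain = est_var / (est_var + meas_var)
    fused = estimate + gain * (measurement - estimate)
    fused_var = (1.0 - gain) * est_var
    return fused, fused_var
```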
In some embodiments, the F sensing nodes 1120 may include, but are not limited to, a laser radar (lidar) sensing node, a radar sensing node, a camera sensing node, and the like.
In some embodiments, the vehicle-end device 110 in fig. 11 may further include: a transmission management module 1140 and a receiving management module 1150, where the transmission management module 1140 is used to manage and send outgoing information, and the receiving management module 1150 is used to receive and manage information from external devices or modules. In some embodiments, the first obtaining unit 910 in the data fusion apparatus 900 may be implemented by the receiving management module 1150, and the first transmitting unit 960 in the data fusion apparatus 900 may be implemented by the transmission management module 1140.
Fig. 12 is a schematic diagram of a computing device 1200 provided by an embodiment of the present application. The computing device 1200 includes: a processor 1210, and a memory 1220.
Wherein the processor 1210 may be coupled to the memory 1220. Memory 1220 may be used to store the program codes and data. Accordingly, the memory 1220 may be a storage unit inside the processor 1210, an external storage unit independent of the processor 1210, or a component including a storage unit inside the processor 1210 and an external storage unit independent of the processor 1210.
Optionally, computing device 1200 may also include a communication interface 1230. It should be appreciated that the communication interface 1230 in the computing device 1200 shown in fig. 12 may be used to communicate with other devices.
Optionally, the computing device 1200 may also include a bus. The memory 1220 and the communication interface 1230 may be connected to the processor 1210 via the bus. For ease of illustration, the bus is shown as only one line in fig. 12, but this does not mean that there is only one bus or only one type of bus.
It should be appreciated that in embodiments of the present application, the processor 1210 may employ a central processing unit (central processing unit, CPU). The processor may also be other general purpose processors, digital signal processors (digital signal processor, DSP), application specific integrated circuits (application specific integrated circuit, ASIC), field programmable gate arrays (field programmable gate Array, FPGA), CPLDs or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like. Alternatively, the processor 1210 may employ one or more integrated circuits for executing associated programs to carry out the techniques provided in embodiments of the present application.
Memory 1220 may include read-only memory and random access memory, and provides instructions and data to the processor 1210. A portion of the memory 1220 may also include non-volatile random access memory. For example, the memory 1220 may also store information of the device type.
When the computing device 1200 is running, the processor 1210 executes computer-executable instructions in the memory 1220 to perform the operational steps of either or both of the data fusion methods described above.
It should be understood that the computing device 1200 according to the embodiments of the present application may correspond to a respective subject performing the methods according to the embodiments of the present application, and that the above and other operations and/or functions of the respective modules in the computing device 1200 are respectively for implementing the respective flows of the methods of the embodiments, and are not described herein for brevity.
The present application also provides another computing device comprising a processor and interface circuitry, the processor accessing a memory through the interface circuitry, the memory storing program instructions that, when executed by the processor, cause the processor to perform either or both of the data fusion methods described above.
Embodiments of the present application also provide a vehicle including any of the computing devices described above.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform any one or both of the data fusion methods described above. Here, the computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory, a read-only memory, an erasable programmable read-only memory, an optical fiber, a portable compact disc read-only memory, an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, causes the processor to perform any one or both of the data fusion methods described above. Here, the programming language of the computer program product may be one or more, which may include, but is not limited to, an object-oriented programming language such as Java, C++, or the like, a conventional procedural programming language such as the "C" language, or the like.
It should be noted that the above are only preferred embodiments of the present application and the technical principles applied. Those skilled in the art will appreciate that the present application is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the present application. Thus, although the present application has been described in detail through the foregoing embodiments, the present application is not limited to them and may include many other equivalent embodiments without departing from the concept of the present application, all of which fall within the scope of the present application.

Claims (30)

  1. A method of data fusion, comprising:
    acquiring a fusion obstacle list and a vehicle end state list of a road section where a first vehicle is located, wherein the vehicle end state list comprises a state value of the first vehicle, the state value of the first vehicle is used for judging the perception state of the first vehicle, and when the perception state of the first vehicle is abnormal, the fusion obstacle list contains fusion information of the first vehicle;
    when the first vehicle perception state is judged to be abnormal according to the state value of the first vehicle, the fusion information of the first vehicle is used as effective positioning information of the first vehicle;
    The fusion information of the first vehicle is obtained by fusing information of the first vehicle detected by other vehicles in a road section where the first vehicle is located.
  2. The method according to claim 1, wherein the method further comprises:
    and when the first vehicle perception state is judged to be normal according to the state value of the first vehicle, using the vehicle information of the first vehicle as the effective positioning information of the first vehicle.
  3. The method of claim 1, wherein,
    the state value of the first vehicle is used for indicating the number of first other vehicles in a road section where the first vehicle is located, and the information of the first vehicle detected by the first other vehicles is not matched with the own vehicle information of the first vehicle;
    the method further comprises the steps of: and when the state value of the first vehicle is larger than a preset state threshold value, judging that the sensing state of the first vehicle is abnormal.
  4. A method according to claim 3, wherein the first other vehicle is an other vehicle that does not meet a second preset condition comprising: the distance between the position of the first vehicle detected by the other vehicles and the position in the own vehicle information of the first vehicle is smaller than or equal to a preset second distance threshold value.
  5. The method according to any one of claims 1-4, further comprising:
    acquiring a road-end obstacle list of a road section where the first vehicle is located, wherein the road-end obstacle list comprises own vehicle information of vehicles with normal perception states in the road section where the first vehicle is located;
    and obtaining effective obstacle information of the first vehicle according to the road-end obstacle list and the information of the obstacle detected by the first vehicle.
  6. The method of claim 5, wherein the obtaining effective obstacle information for the first vehicle based on the list of roadside obstacle and information for the obstacle detected by the first vehicle comprises:
    determining a first obstacle of the first vehicle and an abnormal obstacle proportion of the first vehicle according to the road-end obstacle list and the information of the obstacles detected by the first vehicle;
    determining information of the first obstacle according to the road-side obstacle list and information of the obstacle detected by the first vehicle when the abnormal obstacle proportion of the first vehicle is smaller than or equal to a preset first proportion threshold value, and adding the information of the first obstacle to effective obstacle information of the first vehicle;
    And when the corresponding information of the first obstacle of the first vehicle in the road-end obstacle list is not matched with the information of the first obstacle detected by the first vehicle, the first obstacle of the first vehicle is an abnormal obstacle of the first vehicle.
  7. The method of claim 6, wherein the abnormal obstacle is the first obstacle that does not meet a first preset condition, the first preset condition comprising: and the distance between the position of the first obstacle obtained through the road-end obstacle list and the position of the first obstacle detected by the first vehicle is smaller than or equal to a preset first distance threshold value.
  8. The method of claim 5, wherein the obtaining effective obstacle information for the first vehicle based on the list of roadside obstacle and information for the obstacle detected by the first vehicle further comprises:
    for the obstacles other than the first obstacle in the road-side obstacle list, adding information of the first obstacle obtained through the road-side obstacle list to effective obstacle information of the first vehicle.
  9. The method according to any one of claims 6-8, further comprising: detecting an error sensor of the first vehicle when the abnormal obstacle proportion of the first vehicle is greater than the first proportion threshold.
  10. The method of claim 9, wherein the detecting the error sensor of the first vehicle comprises:
    determining a first obstacle of the first sensor, an abnormal obstacle of the first sensor and an abnormal obstacle proportion of the first sensor according to the obstacle information perceived by the first sensor of the first vehicle and the road-side obstacle list;
    determining that the first sensor is in error when the abnormal obstacle proportion of the first sensor is greater than or equal to a preset second proportion threshold;
    when the corresponding information of the first obstacle of the first sensor in the road-end obstacle list is not matched with the information of the first obstacle perceived by the first sensor, the first obstacle of the first sensor is an abnormal obstacle of the first sensor.
  11. The method as recited in claim 10, further comprising: and determining compensation parameters of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-end obstacle list, wherein the compensation parameters of the first sensor are used for compensating the information perceived by the first sensor.
  12. The method according to any one of claims 6-11, further comprising:
    and transmitting the vehicle-end information of the first vehicle to road-end equipment, wherein the vehicle-end information comprises the vehicle-self information of the first vehicle and the information of the abnormal obstacle of the first vehicle.
  13. The method according to any one of claims 1-12, further comprising:
    alarming in one or more of the following cases:
    the perceived state of the first vehicle is determined to be abnormal;
    an error sensor of the first vehicle is detected;
    the error sensor of the first vehicle belongs to a predetermined necessary sensor.
  14. A method of data fusion, comprising:
    acquiring vehicle-end information of other vehicles in a road section where a first vehicle is located, wherein, when the perceived state of the first vehicle is abnormal, the vehicle-end information of the other vehicles comprises information of the first vehicle detected by the other vehicles;
    acquiring fusion information of the first vehicle according to the information of the first vehicle detected by the other vehicles;
    and sending a fusion obstacle list to the first vehicle, wherein the fusion obstacle list comprises fusion information of the first vehicle.
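Claim 14 does not fix a fusion algorithm. As a minimal sketch, the road-end device could fuse the reporting vehicles' observations of the first vehicle by unweighted averaging; all field names are invented, and a real system might instead weight each report by the reporter's own state value:

```python
def fuse_first_vehicle_info(detections_by_others):
    """Fuse several vehicles' observations of the first vehicle into a
    single record for the fusion obstacle list (unweighted mean)."""
    n = len(detections_by_others)
    if n == 0:
        return None  # no other vehicle observed the first vehicle
    return {
        "x": sum(d["x"] for d in detections_by_others) / n,
        "y": sum(d["y"] for d in detections_by_others) / n,
        "speed": sum(d["speed"] for d in detections_by_others) / n,
    }
```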
  15. The method as recited in claim 14, further comprising:
    acquiring own vehicle information of the first vehicle;
    updating a state value of the first vehicle according to the information of the first vehicle detected by the other vehicles and the own vehicle information of the first vehicle, wherein the state value is used for judging the perceived state of the first vehicle;
    and sending a vehicle-end state list to the first vehicle, wherein the vehicle-end state list comprises the state value of the first vehicle.
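Claim 15 does not define how the state value is computed. One plausible scheme, shown purely as a sketch with assumed names and tolerances, counts consecutive disagreements between the first vehicle's self-reported position and the position fused from the other vehicles' detections of it:

```python
def update_state_value(state, self_report, fused_by_others, tol=1.0):
    """Update the perception-state value of the first vehicle.

    state: current state value (count of consecutive disagreements).
    tol: assumed position tolerance in metres.
    """
    dx = self_report["x"] - fused_by_others["x"]
    dy = self_report["y"] - fused_by_others["y"]
    if (dx * dx + dy * dy) ** 0.5 > tol:
        return state + 1  # self-perception disagrees with the others
    return 0              # agreement resets the counter
```

The vehicle receiving the vehicle-end state list would then compare its state value against a threshold to decide whether its perceived state is abnormal.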
  16. The method of claim 15, wherein
    the vehicle-end information of the other vehicles further comprises the own vehicle information of the other vehicles, and the vehicle-end state list further comprises state values of the other vehicles;
    the method further comprises the steps of:
    updating a road-side obstacle list according to the own vehicle information and state values of the other vehicles and the own vehicle information and state value of the first vehicle, wherein the road-side obstacle list comprises the own vehicle information of vehicles with a normal perceived state in the road section where the first vehicle is located;
    and sending the road-side obstacle list to the first vehicle and the other vehicles.
  17. A data fusion device, comprising:
    a first obtaining unit, configured to obtain a fusion obstacle list and a vehicle-end state list of a road section where a first vehicle is located, wherein the vehicle-end state list includes a state value of the first vehicle, the state value of the first vehicle is used to determine a perceived state of the first vehicle, and when the perceived state of the first vehicle is abnormal, the fusion obstacle list includes fusion information of the first vehicle;
    a first determining unit, configured to use the fusion information of the first vehicle as the effective positioning information of the first vehicle when it is determined, according to the state value of the first vehicle, that the perceived state of the first vehicle is abnormal;
    the fusion information of the first vehicle is obtained by fusing information of the first vehicle detected by other vehicles in a road section where the first vehicle is located.
  18. The apparatus according to claim 17, wherein the first determining unit is further configured to use the own vehicle information of the first vehicle as the effective positioning information of the first vehicle when it is determined, based on the state value of the first vehicle, that the perceived state of the first vehicle is normal.
  19. The device according to claim 17 or 18, wherein,
    the first obtaining unit is further configured to obtain a road-side obstacle list of a road section where the first vehicle is located, where the road-side obstacle list includes vehicle information of vehicles with normal perception states in the road section where the first vehicle is located;
    the apparatus further comprises: and a second determining unit, configured to obtain effective obstacle information of the first vehicle according to the road-side obstacle list and information of the obstacle detected by the first vehicle.
  20. The apparatus of claim 19, wherein the second determining unit is further configured to: for an obstacle in the road-side obstacle list other than the first obstacle, add information of the obstacle, obtained through the road-side obstacle list, to the effective obstacle information of the first vehicle.
  21. The device according to claim 19 or 20, wherein,
    the apparatus further comprises: a sensor detection unit configured to detect an error sensor of the first vehicle;
    the second determining unit is further configured to notify the sensor detection unit to perform the detection when the abnormal obstacle proportion of the first vehicle is greater than the first proportion threshold.
  22. The apparatus of claim 21, wherein the apparatus further comprises:
    a compensation parameter determining unit, configured to determine the compensation parameter of the first sensor according to the information of the first obstacle perceived by the first sensor and the road-side obstacle list when the sensor detection unit detects that the first sensor of the first vehicle is an error sensor, wherein the compensation parameter of the first sensor is used for compensating the information perceived by the first sensor.
  23. The apparatus according to any one of claims 19-22, wherein the apparatus further comprises:
    a first sending unit, configured to send, to a road-side device, the vehicle-end information of the first vehicle, wherein the vehicle-end information includes the own vehicle information of the first vehicle and the information of the abnormal obstacle of the first vehicle.
  24. The apparatus according to any one of claims 17-23, wherein the apparatus further comprises:
    an alarm unit, specifically configured to alarm in one or more of the following cases:
    the perceived state of the first vehicle is determined to be abnormal;
    an error sensor of the first vehicle is detected;
    the error sensor of the first vehicle belongs to a predetermined necessary sensor.
  25. A data fusion device, comprising:
    the second acquisition unit is used for acquiring vehicle-end information of other vehicles in a road section where the first vehicle is located, wherein, when the perceived state of the first vehicle is abnormal, the vehicle-end information of the other vehicles comprises information of the first vehicle detected by the other vehicles;
    the fusion unit is used for obtaining fusion information of the first vehicle according to the information of the first vehicle detected by the other vehicles;
    and the second sending unit is used for sending a fusion obstacle list to the first vehicle, wherein the fusion obstacle list comprises the fusion information of the first vehicle.
  26. The apparatus of claim 25, wherein
    the second acquisition unit is further used for acquiring the own vehicle information of the first vehicle;
    the apparatus further comprises: a state updating unit, configured to update a state value of the first vehicle according to the information of the first vehicle detected by the other vehicles and the own vehicle information of the first vehicle, wherein the state value is used to determine the perceived state of the first vehicle;
    the second sending unit is further configured to send a vehicle-end state list to the first vehicle, wherein the vehicle-end state list includes the state value of the first vehicle.
  27. A computing device comprising at least one processor and at least one memory storing program instructions that, when executed by the at least one processor, cause the at least one processor to perform the method of any one of claims 1-13 and/or the method of any one of claims 14-16.
  28. A computing device comprising a processor and interface circuitry, the processor accessing a memory through the interface circuitry, the memory storing program instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1-13 and/or the method of any of claims 14-16.
  29. A computer readable storage medium having stored thereon program instructions, which when executed by a computer cause the computer to perform the method of any of claims 1-13 or the method of any of claims 14-16.
  30. A vehicle comprising the computing device of claim 27 or the computing device of claim 28.
CN202180018741.0A 2021-12-02 2021-12-02 Data fusion method, device, equipment, storage medium and vehicle Pending CN117795579A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/135177 WO2023097626A1 (en) 2021-12-02 2021-12-02 Data fusion method and apparatus, and device, storage medium and vehicle

Publications (1)

Publication Number Publication Date
CN117795579A 2024-03-29

Family

ID=86611313

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180018741.0A Pending CN117795579A (en) 2021-12-02 2021-12-02 Data fusion method, device, equipment, storage medium and vehicle

Country Status (2)

Country Link
CN (1) CN117795579A (en)
WO (1) WO2023097626A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015202837A1 (en) * 2014-02-20 2015-08-20 Ford Global Technologies, Llc Error handling in an autonomous vehicle
US9953535B1 (en) * 2016-06-27 2018-04-24 Amazon Technologies, Inc. Annotated virtual track to inform autonomous vehicle control
US11260876B2 (en) * 2018-11-05 2022-03-01 Robert Bosch Gmbh Inter-vehicle sensor validation using senso-fusion network
JP7382791B2 (en) * 2019-10-30 2023-11-17 株式会社日立製作所 Abnormality determination device, vehicle support system
CN112085960A (en) * 2020-09-21 2020-12-15 北京百度网讯科技有限公司 Vehicle-road cooperative information processing method, device and equipment and automatic driving vehicle

Also Published As

Publication number Publication date
WO2023097626A1 (en) 2023-06-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination