CN113012429B - Vehicle road multi-sensor data fusion method and system - Google Patents

Vehicle road multi-sensor data fusion method and system

Info

Publication number
CN113012429B
Authority
CN
China
Prior art keywords
data
collected data
error map
collected
forming
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110201054.4A
Other languages
Chinese (zh)
Other versions
CN113012429A (en)
Inventor
常雪阳
宣智渊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yunkong Zhihang Shanghai Automotive Technology Co ltd
Original Assignee
Yunkong Zhihang Shanghai Automotive Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yunkong Zhihang Shanghai Automotive Technology Co ltd filed Critical Yunkong Zhihang Shanghai Automotive Technology Co ltd
Priority to CN202110201054.4A priority Critical patent/CN113012429B/en
Publication of CN113012429A publication Critical patent/CN113012429A/en
Application granted granted Critical
Publication of CN113012429B publication Critical patent/CN113012429B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0116 Measuring and analyzing of parameters relative to traffic conditions based on the source of data from roadside infrastructure, e.g. beacons
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems
    • G08G1/167 Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/70 Services for machine-to-machine communication [M2M] or machine type communication [MTC]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to the technical field of intelligent driving, and in particular to a vehicle-road multi-sensor data fusion method and system. The method comprises the following steps: acquiring first collected data, second collected data and third collected data, and synchronizing the first collected data, the second collected data and the third collected data; performing target matching according to the first collected data, the second collected data and the third collected data respectively to form matching results, and forming a first error map and/or a second error map according to the matching results; and forming a fused data output according to the matching results and/or the first error map and the second error map.

Description

Vehicle road multi-sensor data fusion method and system
Technical Field
The invention relates to the technical field of intelligent driving, and in particular to a vehicle-road multi-sensor data fusion method and system.
Background
To further promote the development of intelligent connected vehicles and strengthen innovation in vehicle networking and intelligent technologies, the vehicle-road-cloud integrated fusion control system uses a cloud control platform to uniformly collect and process dynamic vehicle-road traffic data. It offers ubiquitous vehicle-road-cloud interconnection, digital mapping of all traffic elements, unified application orchestration, efficient computing and scheduling, and highly reliable system operation. A roadside sensing system combined with cloud technology can efficiently serve intelligent transportation systems and automated driving.
As automated driving technology matures, the single-vehicle technical route still suffers in many scenarios from inherent limitations that are difficult to overcome, such as perception blind areas and short sight distance, which affect vehicle safety to a certain extent. Roadside sensing equipment can therefore assist single-vehicle automated driving through cooperative perception of traffic participants such as pedestrians and vehicles. An automated vehicle can perceive targets in its nearby environment with on-board sensors, but its perception range is limited and it has blind areas. A roadside sensor can stably perceive a fixed area with fewer blind spots than on-board perception, but its accuracy degrades at longer distances. Wide-range, high-precision perception has therefore not yet been achieved. An effective solution is to fuse, over the network and in real time, the perception data of multiple automated vehicles with the perception data of roadside sensors, which enlarges the perception range, reduces perception blind areas and improves perception accuracy.
Chinese patent publication No. CN111770451A proposes improving perception performance by matching and weighted fusion of vehicle-side and roadside perception data. The key to improving fusion accuracy is determining the errors of the vehicle-side and roadside perception data. These errors vary with the sensor, the environment and the target; determining them by theoretical analysis or dedicated experiments is difficult to apply to all situations, and existing methods estimate the errors with insufficient accuracy across different scenarios and operating conditions.
Disclosure of Invention
In view of the above disadvantages, the present invention provides a vehicle-road multi-sensor data fusion method and system, specifically:
In one aspect, the invention provides a vehicle-road multi-sensor data fusion method, which comprises: acquiring first collected data (roadside perception data), second collected data (vehicle-end state data) and third collected data (vehicle-end perception data), and synchronizing the first collected data, the second collected data and the third collected data;
performing target matching according to the first collected data, the second collected data and the third collected data respectively to form matching results, and forming a first error map and/or a second error map according to the matching results;
and forming a fused data output according to the matching results and/or the first error map and the second error map.
Preferably, in the above vehicle-road multi-sensor data fusion method, performing target matching according to the first collected data, the second collected data and the third collected data to form matching results, and forming a first error map and/or a second error map according to the matching results specifically includes:
in a state where the first collected data match the second collected data, calculating first difference data between the first collected data and the second collected data, and forming a first parameter according to the first difference data and the second collected data;
and forming or updating a first error map matched with the first parameter according to the first parameter.
Preferably, in the above vehicle-road multi-sensor data fusion method, performing target matching according to the first collected data, the second collected data and the third collected data to form matching results, and forming a first error map and/or a second error map according to the matching results specifically includes:
in a state where the third collected data match the second collected data, calculating second difference data between the third collected data and the second collected data;
forming a second parameter according to the second difference data and the second collected data;
and forming or updating a second error map matched with the second parameter according to the second parameter.
Preferably, in the above vehicle-road multi-sensor data fusion method, forming a fused data output according to the matching results and/or the first error map and the second error map specifically includes:
in a state where the first collected data or the third collected data match the second collected data, forming the fused data output according to the second collected data.
Preferably, in the above vehicle-road multi-sensor data fusion method, performing target matching according to the first collected data, the second collected data and the third collected data to form matching results, and forming a first error map and/or a second error map according to the matching results specifically includes:
in a state where the first collected data match the third collected data, calculating third difference data between the first collected data and the third collected data;
obtaining a current parameter estimate according to the third collected data, and, in a state where the parameter estimate matches the first error map, obtaining a second parameter matched with the parameter estimate;
forming a third parameter according to the second parameter and the third difference data;
and forming and updating the second error map according to the third parameter.
Preferably, in the above vehicle-road multi-sensor data fusion method, forming a fused data output according to the matching results and/or the first error map and the second error map specifically includes:
forming the fused data output according to the first error map and the second error map.
In still another aspect, the present invention further provides a vehicle-road multi-sensor data fusion system, which comprises:
a roadside sensing unit, which acquires the first collected data;
a vehicle-end sensing unit, which acquires the second collected data (vehicle-end state data);
a vehicle-end sensing unit, which acquires the third collected data (vehicle-end perception data);
a synchronous processing unit, which performs synchronization processing on the first collected data, the second collected data and the third collected data;
a map forming unit, which performs target matching according to the first collected data, the second collected data and the third collected data respectively to form matching results, and forms a first error map and/or a second error map according to the matching results;
and a fusion unit, which forms a fused data output according to the matching results and/or the first error map and the second error map.
In still another aspect, the present invention further provides a computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the vehicle-road multi-sensor data fusion method described above.
In another aspect, the present invention further provides an electronic device, which comprises a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the vehicle-road multi-sensor data fusion method described above when executing the computer program.
Compared with the prior art, the invention has the following advantages:
With the vehicle-road multi-sensor data fusion method, the errors required for fusion are constructed without prior knowledge, which makes the technique more convenient and faster to apply and deploy. The method also adapts well to various conditions and yields more accurate error estimates, and can therefore support higher-precision perception fusion.
Drawings
Fig. 1 is a schematic flow chart of a vehicle road multi-sensor data fusion method according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a vehicle-road multi-sensor data fusion method according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a vehicle road multi-sensor data fusion method according to an embodiment of the present invention;
fig. 4 is a schematic flow chart of a vehicle-road multi-sensor data fusion method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Example one
As shown in fig. 1, in one aspect, the present invention provides a vehicle-road multi-sensor data fusion method, including:
Step S110: acquiring first collected data, second collected data and third collected data, and synchronizing the first collected data, the second collected data and the third collected data. The first collected data are roadside perception data collected by roadside sensors and include at least the position, speed and heading of moving vehicles; the second collected data are vehicle-end state data formed from high-precision positioning data and vehicle motion state data obtained from connected vehicles; and the third collected data are vehicle-end perception data, collected by on-board sensors, describing the states of objects in the environment around the vehicle.
The first collected data, the second collected data and the third collected data are synchronized in time and space so that they refer to the same moment and the same spatial reference.
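For illustration only (the patent does not prescribe a concrete synchronization algorithm), the following minimal sketch aligns samples from two sources to a common timestamp and maps roadside detections into a common frame. It assumes timestamped 2-D positions and a known roadside-to-global transform; the names interpolate_track and roadside_to_global and all numbers are illustrative.

```python
import numpy as np

def interpolate_track(timestamps, positions, t_query):
    """Linearly interpolate a target's 2-D position to a common query time."""
    x = np.interp(t_query, timestamps, positions[:, 0])
    y = np.interp(t_query, timestamps, positions[:, 1])
    return np.array([x, y])

def roadside_to_global(p_roadside, R, t):
    """Map a roadside-frame position into the global frame (R: 2x2 rotation, t: translation)."""
    return R @ p_roadside + t

# Example: align one roadside detection and one vehicle-end state sample to t_sync.
t_sync = 1700000000.10
roadside_ts = np.array([1700000000.00, 1700000000.20])
roadside_xy = np.array([[12.1, 3.4], [14.0, 3.5]])      # in the roadside sensor frame
vehicle_ts = np.array([1700000000.05, 1700000000.15])
vehicle_xy = np.array([[112.3, 55.1], [113.2, 55.2]])    # already in the global frame

R = np.eye(2)                      # assumed calibration (illustrative)
t = np.array([100.0, 50.0])

p_roadside_sync = roadside_to_global(interpolate_track(roadside_ts, roadside_xy, t_sync), R, t)
p_vehicle_sync = interpolate_track(vehicle_ts, vehicle_xy, t_sync)
print(p_roadside_sync, p_vehicle_sync)  # both now refer to the same time and the same frame
```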
Step S120: performing target matching according to the first collected data, the second collected data and the third collected data respectively to form matching results, and forming a first error map and/or a second error map according to the matching results. Specifically: target matching is performed on the first, second and third collected data to form matching results; a matching result may be that the first collected data match the second collected data, that the third collected data match the second collected data, or that the first collected data match the third collected data.
When the first collected data match the second collected data, the specific steps are as follows:
As shown in fig. 2, step S12011: in a state where the first collected data match the second collected data, first difference data between the first collected data and the second collected data are calculated. Because the second collected data are vehicle-end state data with relatively high accuracy, they are used as reference data: the first difference data between the first collected data and the second collected data are calculated and used as an error estimate of the first collected data.
Step S12012: forming a first parameter according to the first difference data and the second collected data. Specifically, an operating-condition parameter may be determined from the error estimates of multiple first collected data and the corresponding second collected data.
Step S12013: forming or updating a first error map matched with the first parameter according to the first parameter. The errors of the first collected data under different operating-condition parameters are accumulated statistically, and a first error map is then formed from these errors, or an existing first error map is updated with them. The first error map may be a roadside perception error map, which describes, in the coordinate system of the roadside sensor, the perception error for objects at different positions and in different motion states. Statistical parameters of the error, such as the mean and variance, are obtained by calculation.
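As one possible realization of such an error map (a sketch, not the data structure prescribed by the patent), the error samples can be binned by operating-condition parameters such as target range and speed, keeping the mean and variance per bin. The bin keys and bin sizes below are assumptions.

```python
from collections import defaultdict
import numpy as np

class ErrorMap:
    """Error statistics (mean, variance), keyed by an operating-condition bin."""
    def __init__(self, range_bin=10.0, speed_bin=5.0):
        self.range_bin = range_bin      # metres per distance bin (assumed)
        self.speed_bin = speed_bin      # m/s per speed bin (assumed)
        self.samples = defaultdict(list)

    def condition_key(self, target_range, target_speed):
        return (int(target_range // self.range_bin), int(target_speed // self.speed_bin))

    def add_error(self, target_range, target_speed, error):
        self.samples[self.condition_key(target_range, target_speed)].append(error)

    def stats(self, target_range, target_speed):
        errs = self.samples.get(self.condition_key(target_range, target_speed))
        if not errs:
            return None                  # no statistics for this condition yet
        errs = np.asarray(errs)
        return errs.mean(), errs.var()

# First difference data (roadside perception minus vehicle-end state) feed the roadside error map.
roadside_map = ErrorMap()
roadside_map.add_error(target_range=45.0, target_speed=12.0, error=0.8)
roadside_map.add_error(target_range=47.0, target_speed=13.0, error=1.1)
print(roadside_map.stats(46.0, 12.5))   # (mean, variance) for that condition bin
```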
As shown in fig. 2, when the third collected data match the second collected data, the specific steps are as follows:
Step S12021: in a state where the third collected data match the second collected data, calculating second difference data between the third collected data and the second collected data. With the second collected data as reference data, the second difference data between the third collected data and the second collected data are calculated and used as an error estimate of the third collected data. For example, vehicle A perceives vehicle B and thereby forms third collected data; vehicle B has its own vehicle-end state data, which are the second collected data; subtracting vehicle B's vehicle-end state data (second collected data) from vehicle A's perception data of vehicle B (third collected data) yields the second difference data.
Step S12022: forming a second parameter according to the second difference data and the second collected data. The second difference data serve as error estimates of the third collected data; operating-condition parameters are formed from the error estimates of multiple third collected data and the second collected data, and the errors of the third collected data under different operating-condition parameters are accumulated statistically to form the second parameter.
Step S12023: forming or updating a second error map matched with the second parameter according to the second parameter.
The second parameters corresponding to the third collected data under different operating-condition parameters are accumulated statistically, and a second error map is then formed from the second parameters under different operating conditions, or an existing second error map is updated with them. The second error map may be a vehicle-end perception error map, which describes, in the vehicle coordinate system, the perception error of the on-board sensor for objects at different positions and in different relative motion states. Statistical parameters of the error, such as the mean and variance, are obtained by calculation.
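Because the maps are continually updated as new matched samples arrive, a running estimate of the mean and variance per condition bin is convenient. The Welford-style update below is one standard way to do this; it is a sketch, not the patent's specified procedure.

```python
class RunningErrorStats:
    """Incrementally updated mean and variance of the perception error for one condition bin."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # sum of squared deviations from the current mean

    def update(self, error):
        self.n += 1
        delta = error - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (error - self.mean)

    def variance(self):
        return self.m2 / self.n if self.n > 0 else 0.0

# Updating one cell of the vehicle-end (second) error map with new second difference data.
cell = RunningErrorStats()
for err in (0.9, 1.2, 0.7, 1.0):
    cell.update(err)
print(cell.mean, cell.variance())
```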
As shown in fig. 3, when the third collected data match the first collected data, the specific steps are as follows:
Step S12031: in a state where the first collected data match the third collected data, calculating third difference data between the first collected data and the third collected data. The third difference data may serve as an estimate of the error of the first collected data relative to the third collected data, that is, a relative vehicle-end perception error estimate of the on-board sensor under the vehicle-end perception condition corresponding to the perceived target.
Step S12032: obtaining a current parameter estimate according to the third collected data, and, in a state where the parameter estimate matches the first error map, obtaining the second parameter matched with the parameter estimate, that is, the value (second parameter) stored in the roadside perception error map for the roadside perception condition corresponding to this vehicle-end perception condition, if such a value exists.
Step S12033: forming a third parameter according to the second parameter and the third difference data. Illustratively, the third parameter is formed from that value and multiple relative vehicle-end perception error estimates under the corresponding vehicle-end perception condition.
Step S12034: forming and updating the second error map according to the third parameter. The vehicle-end perception errors are accumulated statistically, and the vehicle-end perception error map, which describes the vehicle-end perception errors under different vehicle-end perception conditions, is updated.
It should be noted that both the first collected data and the third collected data contain errors, and the second error map is formed or updated from these error-containing data. The reason is as follows: for a given operating condition, assume that the vehicle-end and roadside perception errors both follow normal distributions. The difference between the roadside perception data and the vehicle-end perception data is then the difference between the two errors and also follows a normal distribution. The normal-distribution parameters of the roadside perception error are obtained from the established first error map, the normal-distribution parameters of the difference are obtained by continuously accumulating statistics of the difference between the two data, and from these the normal-distribution parameters of the vehicle-end perception error under that operating condition can be obtained.
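Concretely, if the roadside error e_r follows N(mu_r, sigma_r^2), the vehicle-end error e_v follows N(mu_v, sigma_v^2), and the two are treated as independent (an assumption made here for the sketch), then the observed difference d = e_r - e_v follows N(mu_r - mu_v, sigma_r^2 + sigma_v^2), so the vehicle-end parameters can be recovered from the difference statistics and the first error map:

```python
def vehicle_error_params(mu_r, var_r, mu_d, var_d):
    """Derive the vehicle-end error distribution N(mu_v, var_v) from the roadside error
    distribution N(mu_r, var_r) (first error map) and the statistics N(mu_d, var_d) of the
    roadside-minus-vehicle-end difference, assuming the two errors are independent."""
    mu_v = mu_r - mu_d
    var_v = max(var_d - var_r, 0.0)   # clamp: sampling noise can make the estimate negative
    return mu_v, var_v

# Example with illustrative numbers (not taken from the patent).
print(vehicle_error_params(mu_r=0.2, var_r=0.5, mu_d=-0.3, var_d=1.3))  # (0.5, 0.8)
```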
As shown in fig. 4, step S130: forming a fused data output according to the matching results and/or the first error map and the second error map.
Specifically: Step S1301: in a state where the first collected data or the third collected data match the second collected data, forming the fused data output according to the second collected data; or
Step S1302: forming the fused data output according to the first error map and the second error map. Illustratively, the perception error maps record the mean and variance of the perception error under different operating conditions, and these can be used for fusion. One specific fusion method is direct weighted summation, in which the weights sum to 1 and a larger error receives a smaller weight. For example, if the roadside perception error is 3 and the vehicle-end perception error is 5, the fused data are (5/8) x roadside perception data + (3/8) x vehicle-end perception data.
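A minimal sketch of that weighting rule follows, with weights proportional to the inverse of each source's error as looked up from its error map for the current operating condition; the numbers reproduce the 5/8 and 3/8 example above, while the function name and input values are illustrative.

```python
def fuse_inverse_error(roadside_value, vehicle_value, roadside_error, vehicle_error):
    """Weighted sum with weights proportional to the inverse of each perception error."""
    w_roadside = (1.0 / roadside_error) / (1.0 / roadside_error + 1.0 / vehicle_error)
    w_vehicle = 1.0 - w_roadside
    return w_roadside * roadside_value + w_vehicle * vehicle_value

# Roadside error 3, vehicle-end error 5 -> weights 5/8 and 3/8, as in the example above.
print(fuse_inverse_error(roadside_value=10.0, vehicle_value=12.0,
                         roadside_error=3.0, vehicle_error=5.0))   # 10.75
```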
With the vehicle-road multi-sensor data fusion method, the errors required for fusion are built without prior knowledge, which makes the technique more convenient and faster to apply and deploy. The method also adapts well to various scenarios and operating conditions, yields more accurate perception error estimates, and can therefore support higher-precision perception fusion.
Example two
In still another aspect, the present invention further provides a vehicle-road multi-sensor data fusion system, which comprises:
a roadside sensing unit, which acquires the first collected data;
a vehicle-end sensing unit, which acquires the second collected data (vehicle-end state data);
a vehicle-end sensing unit, which acquires the third collected data (vehicle-end perception data);
a synchronous processing unit, which performs synchronization processing on the first collected data, the second collected data and the third collected data;
a map forming unit, which performs target matching according to the first collected data, the second collected data and the third collected data respectively to form matching results, and forms a first error map and/or a second error map according to the matching results;
and a fusion unit, which forms a fused data output according to the matching results and/or the first error map and the second error map.
The working principle of the vehicle-road multi-sensor data fusion system is the same as that of the vehicle-road multi-sensor data fusion method provided in the first embodiment, and details are not described here.
Example three
In another aspect, the present invention further provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the program implements the vehicle-road multi-sensor data fusion method described above, and in particular its target-matching step, specifically: in a state where first collected data and second collected data have been acquired, forming first prediction data according to the first collected data and second prediction data according to the second collected data;
calculating the similarity of each target according to the first prediction data and the second prediction data, and forming a matching set according to the similarities;
and obtaining the group of matching data with the maximum similarity, and obtaining the matched target object according to the matching data.
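The patent does not fix a particular assignment algorithm; the sketch below greedily pairs targets by maximum similarity, using an inverse-distance similarity between predicted positions, which is one simple way to realize the matching step. The similarity function, the threshold and the sample coordinates are assumptions.

```python
import numpy as np

def similarity(p_a, p_b):
    """Inverse-distance similarity between two predicted 2-D positions (assumed metric)."""
    return 1.0 / (1.0 + np.linalg.norm(np.asarray(p_a) - np.asarray(p_b)))

def greedy_match(first_predictions, second_predictions, min_similarity=0.2):
    """Pair targets across the two prediction sets, taking the highest similarity first."""
    candidates = [(similarity(p, q), i, j)
                  for i, p in enumerate(first_predictions)
                  for j, q in enumerate(second_predictions)]
    matches, used_i, used_j = [], set(), set()
    for sim, i, j in sorted(candidates, reverse=True):
        if sim < min_similarity:
            break
        if i not in used_i and j not in used_j:
            matches.append((i, j, sim))
            used_i.add(i)
            used_j.add(j)
    return matches

first_pred = [(10.0, 2.0), (25.0, 4.0)]      # predicted from the first collected data
second_pred = [(24.5, 4.2), (10.3, 1.9)]     # predicted from the second collected data
print(greedy_match(first_pred, second_pred))  # [(0, 1, ...), (1, 0, ...)]
```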
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROMs, floppy disks, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory, magnetic media (e.g., hard disks), or optical storage; registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or in a different, second computer system connected to the first through a network (such as the Internet). The second computer system may provide the program instructions to the computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations, for example in different computer systems connected via a network. The storage medium may store program instructions (e.g., embodied as a computer program) that are executable by one or more processors.
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and these instructions are not limited to the operations described above; they may also perform related operations in the vehicle-road multi-sensor data fusion method provided in any embodiment of the present application.
Example four
The embodiment of the present application provides an electronic device, into which the vehicle-road multi-sensor data fusion apparatus provided by the embodiments of the present application can be integrated. Fig. 5 is a schematic structural diagram of an electronic device according to the fourth embodiment of the present application. As shown in fig. 5, this embodiment provides an electronic device 400, which includes: one or more processors 420; and a storage device 410 for storing one or more programs that, when executed by the one or more processors 420, cause the one or more processors 420 to implement:
in a state where first collected data and second collected data have been acquired, forming first prediction data according to the first collected data and second prediction data according to the second collected data;
calculating the similarity of each target according to the first prediction data and the second prediction data, and forming a matching set according to the similarities;
and obtaining the group of matching data with the maximum similarity, and obtaining the matched target object according to the matching data.
As shown in fig. 5, the electronic device 400 includes a processor 420, a storage device 410, an input device 430, and an output device 440; the number of the processors 420 in the electronic device may be one or more, and one processor 420 is taken as an example in fig. 5; the processor 420, the storage device 410, the input device 430, and the output device 440 in the electronic apparatus may be connected by a bus or other means, and are exemplified by a bus 450 in fig. 5.
The storage device 410, as a computer-readable storage medium, is used to store software programs, computer-executable programs and module units, such as the program instructions corresponding to the vehicle-road multi-sensor data fusion method in the embodiments of the present application.
The storage device 410 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the storage 410 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, storage 410 may further include memory located remotely from processor 420, which may be connected via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input means 430 may be used to receive input numerals, character information, or voice information, and to generate key signal inputs related to user settings and function control of the electronic device. The output device 440 may include a display screen, speakers, etc.
It should be noted that the foregoing is only a description of preferred embodiments of the invention and of the technical principles employed. Those skilled in the art will appreciate that the present invention is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements and substitutions can be made without departing from the scope of the invention. Therefore, although the present invention has been described in some detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from its spirit, and its scope is determined by the scope of the appended claims.

Claims (6)

1. A vehicle road multi-sensor data fusion method is characterized by comprising the following steps:
acquiring each first acquisition data, each second acquisition data and each third acquisition data, and synchronously processing the first acquisition data, the second acquisition data and the third acquisition data;
respectively performing target matching according to the first collected data, the second collected data and the third collected data to form matching results, and forming a first error map and/or a second error map according to the matching results; specifically, in a state where the first collected data matches the third collected data, third difference data between the first collected data and the third collected data is calculated; acquiring a current parameter estimation value according to the third acquired data, acquiring a second parameter matched with the parameter estimation value under the condition that the parameter estimation value is matched with the first error map, and forming a third parameter according to the second parameter and third difference data; forming and updating the second error map based on the third parameters,
forming a fused data output according to the second collected data or forming the fused data output according to the first error map and the second error map in a state where the first collected data or the third collected data matches the second collected data,
the first collected data are roadside sensing data, the second collected data are vehicle-end state data, and the third collected data are vehicle-end sensing data.
2. The vehicle road multi-sensor data fusion method according to claim 1, wherein the respectively performing target matching according to the first collected data, the second collected data and the third collected data to form a matching result, and the forming of the first error map and/or the second error map according to the matching result specifically comprises:
calculating first difference data between the first collected data and the second collected data in a state where the first collected data matches the second collected data,
forming a first parameter according to the first difference data and the second collected data;
and forming or updating a first error map matched with the first parameters according to the first parameters.
3. The vehicle road multi-sensor data fusion method according to claim 1, wherein the respectively performing target matching according to the first collected data, the second collected data and the third collected data to form a matching result, and the forming of the first error map and/or the second error map according to the matching result specifically comprises:
calculating second difference data between the third acquired data and the second acquired data in a state that the third acquired data matches the second acquired data;
forming a second parameter according to the second difference data and second acquired data;
and forming or updating a second error map matched with the second parameters according to the second parameters.
4. A vehicle road multi-sensor data fusion system, characterized by comprising a roadside sensing unit, a vehicle-end sensing unit, a synchronous processing unit, a map forming unit and a fusion unit, wherein the roadside sensing unit is used for acquiring first collected data;
the vehicle-end sensing unit acquires second acquired data;
the vehicle-end sensing unit acquires third acquired data;
the synchronous processing unit is used for carrying out synchronous processing according to the first acquired data, the second acquired data and the third acquired data;
the map forming unit is used for respectively carrying out target matching according to the first collected data, the second collected data and the third collected data to form a matching result, and forming a first error map and/or a second error map according to the matching result; calculating third difference data between the first collected data and the third collected data in a state that the first collected data matches the third collected data;
acquiring a current parameter estimation value according to the third acquired data, acquiring a second parameter matched with the parameter estimation value under the condition that the parameter estimation value is matched with the first error map, and forming a third parameter according to the second parameter and third difference data; forming and updating the second error map according to the third parameters;
a fusion unit configured to form a fusion data output according to the second acquired data or form the fusion data output according to the first error map and the second error map in a state where the first acquired data or the third acquired data matches the second acquired data,
the first collected data are roadside sensing data, the second collected data are vehicle-end state data, and the third collected data are vehicle-end sensing data.
5. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the vehicle road multi-sensor data fusion method according to any one of claims 1-3.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the vehicle road multi-sensor data fusion method according to any one of claims 1 to 3 when executing the computer program.
CN202110201054.4A 2021-02-23 2021-02-23 Vehicle road multi-sensor data fusion method and system Active CN113012429B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110201054.4A CN113012429B (en) 2021-02-23 2021-02-23 Vehicle road multi-sensor data fusion method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110201054.4A CN113012429B (en) 2021-02-23 2021-02-23 Vehicle road multi-sensor data fusion method and system

Publications (2)

Publication Number Publication Date
CN113012429A CN113012429A (en) 2021-06-22
CN113012429B true CN113012429B (en) 2022-07-15

Family

ID=76407378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110201054.4A Active CN113012429B (en) 2021-02-23 2021-02-23 Vehicle road multi-sensor data fusion method and system

Country Status (1)

Country Link
CN (1) CN113012429B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116552559A (en) 2022-01-29 2023-08-08 通用汽车环球科技运作有限责任公司 System and method for detecting abnormal behavior based on fused data in automatic driving system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2921878A1 (en) * 2014-03-17 2015-09-23 BAE Systems PLC Producing data describing target states
CN109211247A (en) * 2017-06-30 2019-01-15 张晓璇 A kind of space-time partition model and its application method
CN109405824A (en) * 2018-09-05 2019-03-01 武汉契友科技股份有限公司 A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile
WO2020135810A1 (en) * 2018-12-29 2020-07-02 华为技术有限公司 Multi-sensor data fusion method and device
CN111783502A (en) * 2019-04-03 2020-10-16 长沙智能驾驶研究院有限公司 Visual information fusion processing method and device based on vehicle-road cooperation and storage medium
CN112085960A (en) * 2020-09-21 2020-12-15 北京百度网讯科技有限公司 Vehicle-road cooperative information processing method, device and equipment and automatic driving vehicle

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108762245B (en) * 2018-03-20 2022-03-25 华为技术有限公司 Data fusion method and related equipment
CN111652914B (en) * 2019-02-15 2022-06-24 魔门塔(苏州)科技有限公司 Multi-sensor target fusion and tracking method and system
CN111770451B (en) * 2020-05-26 2022-02-18 同济大学 Road vehicle positioning and sensing method and device based on vehicle-road cooperation

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2921878A1 (en) * 2014-03-17 2015-09-23 BAE Systems PLC Producing data describing target states
CN109211247A (en) * 2017-06-30 2019-01-15 张晓璇 A kind of space-time partition model and its application method
CN109405824A (en) * 2018-09-05 2019-03-01 武汉契友科技股份有限公司 A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile
WO2020135810A1 (en) * 2018-12-29 2020-07-02 华为技术有限公司 Multi-sensor data fusion method and device
CN111783502A (en) * 2019-04-03 2020-10-16 长沙智能驾驶研究院有限公司 Visual information fusion processing method and device based on vehicle-road cooperation and storage medium
CN112085960A (en) * 2020-09-21 2020-12-15 北京百度网讯科技有限公司 Vehicle-road cooperative information processing method, device and equipment and automatic driving vehicle

Also Published As

Publication number Publication date
CN113012429A (en) 2021-06-22

Similar Documents

Publication Publication Date Title
CN111770451B (en) Road vehicle positioning and sensing method and device based on vehicle-road cooperation
GB2547999A (en) Tracking objects within a dynamic environment for improved localization
CN109612474B (en) Map road matching method, map road matching device, map road matching server and storage medium
CN109435940B (en) Method, device and system for identifying highway lane
CN114111775B (en) Multi-sensor fusion positioning method and device, storage medium and electronic equipment
CN112086010A (en) Map generation method, map generation device, map generation equipment and storage medium
CN113537362A (en) Perception fusion method, device, equipment and medium based on vehicle-road cooperation
AU2018278948A1 (en) A system to optimize scats adaptive signal system using trajectory data
CN113012429B (en) Vehicle road multi-sensor data fusion method and system
US20200035097A1 (en) Parking lot information management system, parking lot guidance system, parking lot information management program, and parking lot guidance program
CN111947669A (en) Method for using feature-based positioning maps for vehicles
CN111739293A (en) Data fusion method and device
CN111291775A (en) Vehicle positioning method, device and system
CN114264301A (en) Vehicle-mounted multi-sensor fusion positioning method and device, chip and terminal
CN111785000B (en) Vehicle state data uploading method and device, electronic equipment and storage medium
CN112640490B (en) Positioning method, device and system
US20160267792A1 (en) Method and device for providing an event message indicative of an imminent event for a vehicle
CN114264310A (en) Positioning and navigation method, device, electronic equipment and computer storage medium
CN112767545A (en) Point cloud map construction method, device, equipment and computer storage medium
CN115984417B (en) Semantic mapping method, semantic mapping device and storage medium
CN109710594B (en) Map data validity judging method and device and readable storage medium
KR102519496B1 (en) Method and apparatus for calibrating forward axis of vehicle accelerometer
CN114048626A (en) Traffic flow simulation scene construction method and system
CN114323693A (en) Test method, device, equipment and storage medium for vehicle road cloud perception system
CN113581193A (en) Driving scene simulation optimization method and system, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201800 j879, room 2201, 888 Moyu South Road, Jiading District, Shanghai

Applicant after: Yunkong Zhihang (Shanghai) Automotive Technology Co.,Ltd.

Address before: 201800 j879, room 2201, 888 Moyu South Road, Jiading District, Shanghai

Applicant before: Enlightenment cloud control (Shanghai) Automotive Technology Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant