CN111753901B - Data fusion method, device, system and computer equipment - Google Patents

Data fusion method, device, system and computer equipment

Info

Publication number
CN111753901B
Authority
CN
China
Prior art keywords
data
target level
level data
sensor
sensor target
Prior art date
Legal status
Active
Application number
CN202010584334.3A
Other languages
Chinese (zh)
Other versions
CN111753901A (en)
Inventor
张庆
李军
褚文博
温悦
赵盼
Current Assignee
Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Original Assignee
Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Priority date
Filing date
Publication date
Application filed by Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd filed Critical Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Priority to CN202010584334.3A
Publication of CN111753901A
Application granted
Publication of CN111753901B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a data fusion method, device, system and computer equipment. The data fusion method comprises the following steps: respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data; processing the first sensor target level data and the second sensor target level data to obtain primary fusion data, wherein the first sensor target level data and the second sensor target level data are of the same data type; and processing the primary fusion data and the third sensor target level data to obtain secondary fusion data. By implementing the method, the computing time in the data fusion process is reduced, the data fusion efficiency is improved, and the real-time performance is further improved.

Description

Data fusion method, device, system and computer equipment
Technical Field
The invention relates to the technical field of intelligent automobiles, and in particular to a data fusion method, device, system and computer equipment.
Background
With the continuous development of automobile intelligence and networking technologies, intelligent automobiles have entered people's lives and play a great role in relieving traffic pressure, guaranteeing passenger safety and improving traffic efficiency. Accurate and comprehensive perception of the surrounding environment is the most critical technology in an intelligent automobile system and a precondition for realizing intelligent driving, so how to perceive the surrounding environment accurately and comprehensively is very important.
In order to obtain information about the surrounding environment accurately and comprehensively, many existing intelligent automobiles are equipped with various environment-sensing sensors (laser radar, camera, millimeter wave radar, ultrasonic radar, etc.). For intelligent automobiles with multiple sensors, how to use the data detected by the multiple sensors correctly and effectively has become a difficult problem in the intelligent automobile perception system. Existing multi-sensor data fusion methods mostly concentrate on fusing data from several sensors of the same type, or fuse data from only two sensors. Such fusion methods do not bring out the unique advantages of each type of sensor and simply filter the data detected by all sensors, which reduces the real-time performance of the system, cannot guarantee the accuracy of multi-sensor data fusion, and cannot fully exploit the respective advantages of each sensor.
Disclosure of Invention
Therefore, the technical problem to be solved by the invention is to overcome the defects of the prior art, namely that the advantages of each sensor are not fully used and the data fusion accuracy is low, and to provide a data fusion method, device, system and computer equipment.
According to a first aspect, an embodiment of the present invention provides a data fusion method, including: respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data; processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data; and processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
With reference to the first aspect, in a first implementation manner of the first aspect, when acquiring the first sensor target level data, the second sensor target level data, and the third sensor target level data, respectively, the method further includes: and time stamping the acquired first sensor target level data, second sensor target level data and third sensor target level data.
With reference to the first aspect, in a second implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data are target position data and the first sensor and the second sensor are of different types, before fusing the first sensor target level data and the second sensor target level data, the method further includes: and carrying out coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after spatial synchronization.
With reference to the second implementation manner of the first aspect, in a third implementation manner of the first aspect, processing the first sensor target level data and the second sensor target level data to obtain primary fusion data includes: calculating the time difference between the first sensor target level data of the latest frame and the second sensor target level data of the latest frame; judging whether the time difference is larger than a preset threshold value or not; when the time difference is smaller than or equal to the preset threshold value, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pairs successfully subjected to association matching to obtain the primary fusion data; and when the time difference is larger than the preset threshold value, selecting the data with the latest time from the first sensor target level data and the second sensor target level data, and taking the selected data with the latest time as the primary fusion data.
With reference to the first aspect, in a fourth implementation manner of the first aspect, after performing association matching on the first sensor target level data and the second sensor target level data, the method further includes: and filtering the first sensor target level data or the second sensor target level data which are failed to be associated, so as to obtain the primary fusion data.
With reference to the third implementation manner of the first aspect, in a fifth implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data comprise one or more feature quantities, performing association matching on the first sensor target level data and the second sensor target level data includes: performing time compensation on the first sensor target level data and the second sensor target level data; determining the same feature quantities in the time-compensated first sensor target level data and second sensor target level data; establishing an association matrix between the first sensor target level data and the second sensor target level data according to the same feature quantities based on similarity calculation; and determining a matching relation between the first sensor target level data and the second sensor target level data based on the association matrix, and forming an association data pair from the first sensor target level data and the second sensor target level data belonging to the same target.
With reference to the third implementation manner of the first aspect, in a sixth implementation manner of the first aspect, updating the association data pair that is successfully associated with the matching includes: and carrying out data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and carrying out data updating according to a data combination substitution mode.
With reference to the fifth implementation manner of the first aspect, in a seventh implementation manner of the first aspect, the performing time compensation on the first sensor target level data and the second sensor target level data includes: acquiring a target timestamp difference value, and determining the first sensor target level data and the second sensor target level data corresponding to the timestamp difference value; acquiring position information and motion information contained in first sensor target level data or second sensor target level data corresponding to the smaller timestamp in the target timestamp difference value; predicting predicted position information after the time stamp difference value according to the position information, the motion information and the time stamp difference value; and updating the time stamp of the first sensor or the second sensor to the larger time stamp in the target time stamp difference value by taking the predicted position information as a current time measurement value.
With reference to the first aspect, in an eighth implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data are target position data and the third sensor target level data are target image data, processing the primary fusion data and the third sensor target level data includes: calculating the time difference between the primary fusion data of the latest frame and the target level data of the third sensor of the latest frame; judging whether the time difference is larger than a preset threshold value or not; when the time difference is smaller than or equal to the preset threshold value, performing association matching on the primary fusion data and the third sensor target level data, and updating the data pair with successful association matching to obtain the secondary fusion data; and when the time difference is larger than the preset threshold value, selecting the data with the latest time from the primary fusion data and the target data of the third sensor, and taking the selected data with the latest time as the secondary fusion data.
With reference to the eighth implementation manner of the first aspect, in a ninth implementation manner of the first aspect, after performing association matching on the primary fusion data and the third sensor target level data, the method further includes: and filtering primary fusion data or third sensor target level data which are failed to be associated to obtain secondary fusion data.
With reference to the eighth implementation manner of the first aspect, in a tenth implementation manner of the first aspect, performing association matching on the primary fusion data and the third sensor target level data includes: performing time compensation on the primary fusion data and the third sensor target level data in such a way that the primary fusion data are compensated toward the third sensor target level data; converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data; and determining a matching relation between the primary fusion data and the third sensor target level data based on global optimal matching, to obtain matched data pairs in which the primary fusion data and the third sensor target level data correspond to the same target.
With reference to the tenth implementation manner of the first aspect, in an eleventh implementation manner of the first aspect, updating the association data pair that is successfully associated with the matching includes: and correcting and eliminating the primary fusion data by adopting the acquired target level data of the third sensor according to the matched data pair, and updating data according to the corrected and eliminated primary fusion data.
With reference to the first aspect, in a twelfth implementation manner of the first aspect, after processing the primary fusion data and the third sensor target level data to obtain secondary fusion data, the method further includes: and carrying out multi-target tracking on the secondary fusion data, and determining a target running track containing the multi-target.
With reference to the first aspect, in a thirteenth implementation manner of the first aspect, the first sensor is a millimeter wave radar sensor; the second sensor is a laser radar sensor; the third sensor is a visual sensor.
According to a second aspect, an embodiment of the present invention provides a data fusion apparatus, including: the acquisition module is used for respectively acquiring the first sensor target level data, the second sensor target level data and the third sensor target level data; the first fusion module is used for processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data; and the second fusion module is used for processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
According to a third aspect, an embodiment of the present invention provides a computer device, comprising: a memory and a processor that are communicatively connected to each other, the memory storing computer instructions, and the processor executing the computer instructions to perform the data fusion method according to the first aspect or any implementation manner of the first aspect.
According to a fourth aspect, an embodiment of the present invention provides a computer readable storage medium storing computer instructions for causing a computer to perform the data fusion method according to the first aspect or any implementation manner of the first aspect.
The technical scheme of the invention has the following advantages:
1. According to the data fusion method provided by the invention, the target level data of the first sensor, the second sensor and the third sensor are respectively acquired; the first sensor target level data and the second sensor target level data are fused to obtain primary fusion data; and the primary fusion data and the third sensor target level data are fused to obtain secondary fusion data. The characteristics of the first, second and third sensors are fully utilized in the fusion of target level data, so that each sensor can exert its own advantages. Fusing the third sensor target level data with the primary fusion data allows the primary fusion data to be corrected, which ensures the accuracy of the fusion result. Because all of the fused data are sensor target level data, fusing at the target level reduces the computing time of the data fusion process, improves data fusion efficiency, and further improves real-time performance.
2. According to the data fusion method provided by the invention, time compensation is used to time-synchronize unsynchronized sensor target level data, which avoids errors caused by unsynchronized sensor data, improves the accuracy of the association matching of sensor data, and thus guarantees the accuracy of the sensor data fusion.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of a data fusion method in an embodiment of the invention;
FIG. 2 is a flow chart of a data fusion method in an embodiment of the invention;
FIG. 3 is a schematic block diagram of a data fusion device in an embodiment of the invention;
FIG. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings, which show some, but not all, embodiments of the invention. All other embodiments obtained by a person skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
In the description of the present invention, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings, are merely for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted", "interconnected" and "connected" are to be construed broadly; for example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediate medium, or internal between two components; and it may be wireless or wired. The specific meaning of the above terms in the present invention will be understood by those of ordinary skill in the art on a case-by-case basis.
In addition, the technical features of the different embodiments of the present invention described below may be combined with each other as long as they do not collide with each other.
Example 1
This embodiment provides a data fusion method applied to an intelligent automobile to realize environment perception for the intelligent automobile. As shown in FIG. 1, the method comprises the following steps:
s11, acquiring first sensor target level data, second sensor target level data and third sensor target level data respectively.
Illustratively, the acquired target level data may be determined according to characteristics of the sensor, the first sensor target level data may include data information such as an object position, a speed, a type, and a first sensor identification, the second sensor target level data may include data information such as an object position, a speed, a type, a heading angle, and a second sensor identification, and the third sensor target level data may include data information such as an object pixel coordinate, a type, and a third sensor identification. Target level data of the first sensor, the second sensor and the third sensor can be acquired through the sensors correspondingly.
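To make these data structures concrete, the following is a minimal sketch of what such target level data could look like in Python; the field names, types and sensor identifiers are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class RadarTarget:                    # first sensor (millimeter wave radar) target level data
    timestamp: float                  # time identifier added when the frame is received
    position: Tuple[float, float]     # object position (x, y) in the sensor frame, metres
    velocity: Tuple[float, float]     # object speed (vx, vy), m/s
    obj_type: str                     # object kind
    sensor_id: str = "radar_front"    # first sensor identification

@dataclass
class LidarTarget:                    # second sensor (laser radar) target level data
    timestamp: float
    position: Tuple[float, float]
    velocity: Tuple[float, float]
    obj_type: str
    heading: float                    # heading angle, rad
    sensor_id: str = "lidar_top"

@dataclass
class CameraTarget:                   # third sensor (vision) target level data
    timestamp: float
    bbox: Tuple[int, int, int, int]   # object pixel coordinates (u, v, width, height)
    obj_type: str
    sensor_id: str = "camera_front"
```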
S12, processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data.
For example, the first sensor target level data and the second sensor target level data may be of the same data type, e.g., the target level data of the first sensor and of the second sensor may each contain data such as object position, speed and type. The first sensor target level data are fused with the second sensor target level data to obtain the primary fusion data.
As an alternative embodiment of the present application, the first sensor may be a millimeter wave radar sensor, the second sensor may be a lidar sensor, and the third sensor may be a vision sensor, such as a camera. The first sensor, the second sensor and the third sensor can be adjusted according to the need by a person skilled in the art, for example, the first sensor is a laser radar sensor, the second sensor is a millimeter wave radar sensor, and the third sensor is a vision sensor. The present application is not limited in the types of the first sensor, the second sensor, and the third sensor. The embodiments of the application are all described by taking a first sensor as a millimeter wave radar sensor, a second sensor as a laser radar sensor and a third sensor as a camera as an example.
The unstable, high-noise target level data output by the millimeter wave radar sensor are filtered and tracked by a filtering method, a time identifier is added to each frame of millimeter wave radar target level data, and the data are packaged frame by frame. The point cloud data output by the laser radar sensor are clustered to separate object point clouds from the environment; the position information and three-dimensional contour information of each object are obtained by boundary fitting; noise in the perception data output by the laser radar sensor is eliminated by a filtering and tracking method to obtain stable object information; and finally a time identifier is added to each frame of data according to the GPS differential sensor and the data are packaged.
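As a rough illustration of the lidar preprocessing just described (clustering the point cloud, separating object points and fitting a contour), the sketch below clusters one frame of ground-projected points and fits an axis-aligned box per cluster; the use of DBSCAN and the parameter values are assumptions chosen only for illustration.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def lidar_targets_from_points(points_xy: np.ndarray, timestamp: float):
    """points_xy: (N, 2) ground-plane projection of one lidar frame."""
    labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(points_xy)
    targets = []
    for label in set(labels) - {-1}:              # label -1 marks noise points
        cluster = points_xy[labels == label]
        lo, hi = cluster.min(axis=0), cluster.max(axis=0)
        targets.append({
            "timestamp": timestamp,               # time identifier for this frame
            "position": tuple((lo + hi) / 2.0),   # box centre as object position
            "size": tuple(hi - lo),               # fitted contour (bounding box) size
        })
    return targets
```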
S13, processing the primary fusion data and the target level data of the third sensor to obtain secondary fusion data.
The primary fusion data are obtained by fusing the target level data of the first sensor and the target level data of the second sensor. From the front image information acquired by the third sensor, obstacles, lane lines, traffic signals and traffic signs in the image are identified with a deep learning algorithm, a UTC time identifier is added, and the data information of the same type within one period is packaged. The third sensor target level data are then fused with the primary fusion data to correct the primary fusion data, so that secondary fusion data fused from the primary fusion data and the third sensor target level data are obtained and the accuracy of the secondary fusion data is guaranteed.
According to the data fusion method provided by this embodiment, the target level data of the first sensor, the second sensor and the third sensor are respectively acquired; the first sensor target level data and the second sensor target level data are fused to obtain primary fusion data; and the primary fusion data and the third sensor target level data are fused to obtain secondary fusion data. The characteristics of the first, second and third sensors are fully utilized in the fusion of target level data, so that each sensor can exert its own advantages. Fusing the third sensor target level data with the primary fusion data allows the primary fusion data to be corrected, which ensures the accuracy of the fusion result. Fusing at the target level reduces the computing time of the data fusion process, improves data fusion efficiency, and further improves real-time performance.
As an optional embodiment of the present application, when acquiring the first sensor target level data, the second sensor target level data, and the third sensor target level data, respectively, further comprises: the acquired first sensor target level data, second sensor target level data, and third sensor target level data are time stamped.
For example, since each sensor perceives an object with a certain delay, there is a certain time difference between the acquired first sensor target level data, second sensor target level data and third sensor target level data; if the raw first, second and third sensor target level data were not time stamped, this time difference would remain between the perception results of the sensors. By stamping each frame of data of each sensor with a time stamp under a uniform time standard, time synchronization between the different sensors is achieved.
It should be noted that the time standard may be a system time standard or a UTC time standard. When the data of all sensors are received on one controller, either of the two unified-time-standard methods above may be used; when the sensors are received on different controllers, an external UTC time standard may be used, with GPS time information provided by a GPS differential sensor so that a UTC time identifier is stamped on each sensor's data. The present application is not limited in this respect. In the embodiments of the application, the GPS differential sensor is used to stamp UTC timestamps for the sensors and, at the same time, to obtain the positioning information of the intelligent automobile.
According to the data fusion method, time compensation is used for time synchronization of the unsynchronized sensor target level data, errors caused by the unsynchronized acquired sensor data are avoided, and the accuracy of correlation matching of the sensor data is improved, so that the accuracy of data fusion of the sensors is guaranteed.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data, and the first sensor is different from the second sensor in type, before fusing the first sensor target level data and the second sensor target level data, the method further includes: and performing coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after spatial synchronization.
For example, the latest frame of target level data acquired from each sensor is stored in the same data format; a unified structure body can be constructed to store the latest frame of data of each sensor, and in addition to the perceived-object data of each sensor, the structure contains the sensor number of the sensor to which the data belong. Based on the translation matrix and rotation matrix of each sensor coordinate system relative to the vehicle body coordinate system, the received target level data of the different sensors are transformed by coordinate conversion into the same vehicle body coordinate system, yielding spatially synchronized sensor data. The spatially synchronized first sensor target level data and second sensor target level data are obtained by coordinate conversion using the translation and rotation matrices of the first and second sensor coordinate systems relative to the vehicle body coordinate system.
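A minimal sketch of this spatial synchronization step, assuming each sensor's extrinsic calibration is given as a rotation matrix R and translation vector t relative to the vehicle body frame; the calibration numbers below are placeholders.

```python
import numpy as np

def to_body_frame(p_sensor: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform a point from a sensor coordinate system into the vehicle body frame."""
    return R @ p_sensor + t

# Example: a front radar mounted 3.6 m ahead of the body-frame origin, no rotation
# (placeholder calibration values).
R_radar = np.eye(3)
t_radar = np.array([3.6, 0.0, 0.0])
p_body = to_body_frame(np.array([10.0, -1.2, 0.0]), R_radar, t_radar)
```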
As an alternative embodiment of the present application, as shown in fig. 2, the step S12 includes:
s121, calculating the time difference between the first sensor target level data of the latest frame and the second sensor target level data of the latest frame.
For example, the time difference between the two sets of target level data may be calculated by determining the latest frame of first sensor target level data and the latest frame of second sensor target level data from the time stamps stamped in the first sensor target level data and the second sensor target level data, respectively.
S122, judging whether the time difference is larger than a preset threshold value.
Illustratively, the preset threshold is a critical time difference value that determines a first sensor target level data and a second sensor target level data fusion mode. The person skilled in the art can determine the preset threshold according to actual needs, for example, the preset threshold is set to be 1s, which is not limited in the present application. If the time difference is less than or equal to the preset threshold, step S123 is executed, and if the time difference is greater than the preset threshold, step S124 is executed.
And S123, when the time difference is smaller than or equal to a threshold value, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pair with successful association matching to obtain primary fusion data.
For example, when the time difference Δt is less than or equal to the preset threshold T_th, the time difference between the two sets of target level data (the first sensor target level data and the second sensor target level data) can be considered to be within a controllable range, and the object motion state within this time difference can be considered unchanged. The first sensor target level data and the second sensor target level data are then associated and matched, the first and second sensor target level data that are successfully matched form data pairs, and the data pairs are packaged to obtain the primary fusion data.
And S124, when the time difference is larger than the threshold value, selecting the data with the latest time from the first sensor target level data and the second sensor target level data, and taking the selected data with the latest time as primary fusion data.
For example, when the time difference Δt is greater than the preset threshold T_th, the time interval is considered too large; the data set with the older time stamp no longer has use value and is removed, and the data set with the newer time stamp is retained. A dimension-expansion operation is performed on the retained data set to keep the format of the primary fusion data uniform. For example, when the latest data come from the millimeter wave radar, the millimeter wave radar data need to be dimension-expanded to ensure the uniformity of the primary fusion data format: the features that the millimeter wave radar cannot obtain are filled with default values, and the data are then packaged to obtain the primary fusion data.
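The following sketch outlines the decision logic of steps S121 to S124 under the assumption that each frame is a dict carrying a timestamp and a target list; the threshold value is illustrative, and the simple list concatenation stands in for the association matching and updating detailed below.

```python
T_TH = 0.1   # preset threshold in seconds (illustrative value)

def primary_fuse(radar_frame: dict, lidar_frame: dict, t_th: float = T_TH) -> dict:
    """radar_frame / lidar_frame: {"timestamp": float, "targets": [...]}"""
    dt = abs(radar_frame["timestamp"] - lidar_frame["timestamp"])
    if dt <= t_th:
        # S123: time difference acceptable, so associate and match the two target
        # lists (represented here by a plain concatenation) and package the result.
        return {"timestamp": max(radar_frame["timestamp"], lidar_frame["timestamp"]),
                "targets": radar_frame["targets"] + lidar_frame["targets"],
                "associated": True}
    # S124: time difference too large, keep only the newer frame; radar-only frames
    # would additionally be dimension-expanded with default values (omitted here).
    newest = max(radar_frame, lidar_frame, key=lambda f: f["timestamp"])
    return {"timestamp": newest["timestamp"], "targets": list(newest["targets"]),
            "associated": False}
```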
As an alternative embodiment of the present application, when the first sensor target level data and the second sensor target level data contain one or more feature amounts, performing association matching on the first sensor target level data and the second sensor target level data includes:
first, the first sensor target level data and the second sensor target level data are time-compensated.
The principle of time compensation is, for example, to assume that the object does not change its motion state within the short period of time that needs to be compensated. Because the time difference between the first sensor target level data and the second sensor target level data is small, the motion state of the object can be considered unchanged during this short time, so the first sensor target level data and the second sensor target level data can be time-compensated. The specific steps may include:
step 1, acquiring a target timestamp difference value, and determining first sensor target level data and second sensor target level data corresponding to the target timestamp difference value. The target timestamp difference may be equal to a time difference Δt between the last frame of first sensor target level data and the last frame of second sensor target level data, and the target timestamp difference is determined to correspond to two sets of target level data, namely the last frame of first sensor target level data and the last frame of second sensor target level data.
And step 2, acquiring position information and motion information contained in the first sensor target level data or the second sensor target level data corresponding to the smaller timestamp in the target timestamp difference value. Comparing the timestamps corresponding to the two groups of data, selecting the position information and the motion information contained in the target-level data corresponding to the smaller timestamp, and if the timestamp of the first sensor target-level data is smaller, acquiring the position information and the motion information contained in the first sensor target-level data; and if the timestamp of the second sensor target level data is smaller, acquiring the position information and the operation information contained in the second sensor target level data.
Step 3, predicting the predicted position information after the target timestamp difference value has elapsed, according to the position information, the motion information and the target timestamp difference value. Since the time difference Δt between the timestamps is smaller than the preset threshold T_th, the object motion state within the time difference can be considered unchanged, and the object position information after the time Δt is predicted according to the acquired current position information and motion information of the object, so as to obtain the predicted position information.
Step 4, updating the timestamp of the first sensor or the second sensor to the larger timestamp in the target timestamp difference value, with the predicted position information taken as the measurement value at the current time. The predicted position information is used as the measurement value of the current moment, and the timestamp information of the sensor is updated to the larger timestamp of the two groups of data corresponding to the target timestamp difference value, so that synchronization in space and time is realized. The timestamp of each sensor's target level data after time compensation is updated accordingly. The time compensation may update only the position information and speed information of the object; optionally, the acceleration, which is assumed not to change, can also be included in the prediction and compensation.
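A small sketch of this time-compensation procedure under the constant-motion assumption above; target records are plain dicts with numpy position and velocity vectors, which is an assumed data layout.

```python
import numpy as np

def time_compensate(older_targets, t_older: float, t_newer: float):
    """Propagate the older target list forward by the target timestamp difference."""
    dt = t_newer - t_older                      # target timestamp difference (step 1)
    compensated = []
    for trg in older_targets:
        pred = dict(trg)
        # step 3: predicted position after dt, motion state assumed unchanged
        pred["position"] = np.asarray(trg["position"]) + np.asarray(trg["velocity"]) * dt
        pred["timestamp"] = t_newer             # step 4: relabel with the larger timestamp
        compensated.append(pred)
    return compensated
```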
Next, the same feature quantity in the time-compensated first sensor target level data and the second sensor target level data is determined.
Illustratively, the same feature quantities, such as position, velocity, and acceleration, that can be detected by both the millimeter wave radar sensor target level data and the lidar sensor target level data are determined from a plurality of feature quantities contained in the target level data detected by the millimeter wave radar sensor and the lidar sensor.
Next, an association matrix between the first sensor target level data and the second sensor target level data is established from the same feature quantity based on the similarity calculation.
Illustratively, the element in the ith row and jth column of the association matrix represents the similarity between the ith target detected by the second sensor (the lidar sensor) and the jth target detected by the first sensor (the millimeter wave radar sensor). The similarity can be calculated from a weighted value of the Mahalanobis distance or Euclidean distance of the position and velocity between two objects. The association matrix can be established based on the determined same feature quantities and serves as the criterion for the association matching of the first sensor target level data and the second sensor target level data.
Finally, a matching relation between the first sensor target level data and the second sensor target level data is determined based on the association matrix, and the first sensor target level data and the second sensor target level data belonging to the same target form an association data pair.
For example, by using a global optimal matching method, a corresponding matching relationship between the first sensor target level data and the second sensor target level data can be determined according to the constructed correlation matrix, wherein the matching relationship is used for indicating which feature measurements between the two sets of data belong to the same target. And carrying out matching search on target level data detected by the laser radar sensor and the millimeter wave radar sensor according to the incidence matrix, judging which characteristic quantities belong to the same target, and correlating the target level data of the laser radar sensor and the target level data of the millimeter wave radar sensor belonging to the same target to generate a correlation data pair.
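The sketch below illustrates the association matching just described: a cost matrix is built from a weighted Euclidean distance over the shared feature quantities (position and velocity), and the globally optimal assignment is obtained with the Hungarian algorithm via SciPy; the weighting and the gate value are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(lidar_targets, radar_targets, gate: float = 2.5):
    """Each target carries "position" and "velocity" as length-2 numpy arrays."""
    cost = np.zeros((len(lidar_targets), len(radar_targets)))
    for i, a in enumerate(lidar_targets):
        for j, b in enumerate(radar_targets):
            # weighted distance over the shared feature quantities
            cost[i, j] = (np.linalg.norm(a["position"] - b["position"])
                          + 0.5 * np.linalg.norm(a["velocity"] - b["velocity"]))
    rows, cols = linear_sum_assignment(cost)          # global optimal matching
    pairs = [(i, j) for i, j in zip(rows, cols) if cost[i, j] < gate]
    matched_l = {i for i, _ in pairs}
    matched_r = {j for _, j in pairs}
    unmatched_lidar = [i for i in range(len(lidar_targets)) if i not in matched_l]
    unmatched_radar = [j for j in range(len(radar_targets)) if j not in matched_r]
    return pairs, unmatched_lidar, unmatched_radar
```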
As an optional implementation manner of the application, updating the association data pair with successful association matching comprises the following steps: the associated data pairs are data combined based on the characteristics of the first sensor and the second sensor, and the data is updated according to a data combination substitution mode.
Illustratively, based on the respective detection advantages of the first sensor and the second sensor when detecting object features within a certain range, the associated data pairs may be data combined and data updated according to alternative ways of data combination based on the advantages of the first sensor and the second sensor in detecting object features and detection ranges. For example, for the associated data pair with successful association matching, the speed information of the millimeter wave radar sensor can be used for correcting the speed information of the laser radar sensor, and data information which cannot be obtained by the laser radar is added to the data acquired by the laser radar sensor, including but not limited to the data information such as signal-to-noise ratio of the object, radar scattering cross section (radar cross section, RCS) and the like, and the current data is replaced by the combined data, namely, the data is updated by adopting a data combination replacing mode.
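A brief sketch of the data-combination update for a successfully associated pair, along the lines described above; the field names (snr, rcs, sources) are illustrative assumptions.

```python
def combine_pair(lidar_trg: dict, radar_trg: dict) -> dict:
    fused = dict(lidar_trg)                        # start from the lidar record
    fused["velocity"] = radar_trg["velocity"]      # correct the speed with the radar value
    fused["snr"] = radar_trg.get("snr")            # information the lidar cannot provide
    fused["rcs"] = radar_trg.get("rcs")
    fused["sources"] = [lidar_trg.get("sensor_id"), radar_trg.get("sensor_id")]
    return fused                                   # the combined data replace the current data
```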
As an optional embodiment of the present application, after performing association matching on the first sensor target level data and the second sensor target level data, the method further includes: and filtering the first sensor target level data or the second sensor target level data which are failed to be associated to obtain primary fusion data.
For example, filtering is performed on first sensor target level data or second sensor target level data with failed correlation matching, for example, noise filtering may be performed on millimeter wave radar sensor target level data with failed correlation matching according to a range, a motion characteristic and a signal to noise ratio threshold of an object detected by the millimeter wave radar sensor, and filtering may be performed on laser radar sensor target level data without correlation matching according to information such as stability, three-dimensional size and the like of the object detected by the laser radar sensor, so as to obtain primary fusion data.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data and the third sensor target level data are target image data, the step S13 includes:
first, a time difference between the latest frame of primary fusion data and the latest frame of third sensor target level data is calculated. The calculation manner of the time difference is referred to step S121 in the above embodiment, and will not be described herein.
And secondly, judging whether the time difference is larger than a preset threshold value. The details are referred to the related description of step S122 in the above embodiment, and will not be repeated here.
And secondly, when the time difference is smaller than or equal to a preset threshold value, performing association matching on the primary fusion data and the target level data of the third sensor, and updating the data pair successfully subjected to association matching to obtain secondary fusion data. When the time difference is less than or equal to the preset threshold, the method for acquiring the secondary fusion data is the same as the method for acquiring the primary fusion data, and details are described in the related description of step S123 in the above embodiment, which is not repeated here.
And when the time difference is larger than a preset threshold value, selecting the data with the latest time from the primary fusion data and the target data of the third sensor, and taking the selected data with the latest time as the secondary fusion data. When the time difference is greater than the preset threshold, the method for acquiring the secondary fusion data is the same as the method for acquiring the primary fusion data, and details are described in the related description of step S124 in the above embodiment, which is not repeated here.
As an alternative embodiment, performing the associative matching on the primary fusion data and the third sensor target level data includes:
first, the primary fusion data and the third sensor target level data are time-compensated in such a way that the primary fusion data is compensated to the third sensor target level data.
For example, based on the fact that the third sensor is a visual sensor, the output target level data of the third sensor is target image data, and taking the third sensor as a camera sensor as an example, since there is a large error in time compensation by using pixel information of the target image output by the camera sensor, the time compensation can be performed by adopting a mode of compensating the primary fusion data to the target level data of the third sensor. The detailed principle of the time compensation is referred to the related description of the above embodiment, and will not be repeated here.
And secondly, converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data.
The time-compensated primary fusion data are projected onto the image pixel coordinate system through the extrinsic matrix and intrinsic matrix of the vision sensor, yielding primary fusion data with pixel information. The primary fusion data are in the vehicle body coordinate system; using the conversion relation between the vehicle body coordinate system and the image pixel coordinate system, the primary fusion data are projected into the image pixel coordinate system to obtain, for each object in the vehicle body coordinate system, its position on the image and the pixel size of its circumscribed (bounding) rectangle, i.e. the rectangular-box information of the primary-fusion obstacles in the image pixel coordinate system. The association matrix is then established using the pixel information and type information of the objects. For details of the association matrix, see the related description of the above embodiment; its similarity may be determined from the intersection over union (IoU) between the primary fusion data and the objects perceived by the vision sensor in the image pixel coordinate system.
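A minimal sketch of this projection and of the IoU similarity, assuming a pinhole camera model with intrinsic matrix K and extrinsic rotation R and translation t; the calibration itself is not specified by the patent, so any values passed in would be placeholders.

```python
import numpy as np

def project_to_pixels(p_body: np.ndarray, K: np.ndarray, R: np.ndarray, t: np.ndarray):
    """Project a body-frame point to image pixel coordinates (pinhole model)."""
    p_cam = R @ p_body + t              # vehicle body frame -> camera frame (extrinsics)
    u, v, w = K @ p_cam                 # perspective projection (intrinsics)
    return np.array([u / w, v / w])

def iou(box_a, box_b):
    """Intersection over union of two pixel boxes (u_min, v_min, u_max, v_max)."""
    ix = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    iy = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    inter = ix * iy
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0
```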
And determining a matching relation between the primary fusion data and the target level data of the third sensor based on global optimal matching to obtain a matching data pair corresponding to the primary fusion data and the target level data of the third sensor of the same target.
Illustratively, the correlation matching of the two sets of data is completed based on the pixel information of the primary fusion data and the third sensor target level data, and corresponding matched data pairs are obtained. The principle of matching the primary fusion data with the target level data of the third sensor is described in the above embodiment, and will not be described herein.
As an optional implementation manner of the application, updating the association data pair with successful association matching comprises the following steps: and correcting and eliminating the primary fusion data by adopting the acquired target level data of the third sensor according to the matched data pair, and updating the data according to the corrected and eliminated primary fusion data.
Illustratively, for a matched data pair for which the association matching was successful, the primary fusion data are augmented with the third sensor data. According to the matched data pair, the target type in the primary fusion data is corrected using the target type in the target level data perceived by the third sensor, targets with lower confidence in the primary fusion data are removed using the data information perceived by the third sensor, and the data are updated through this correction and elimination of the primary fusion data.
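As a sketch of this correct-and-eliminate update, the snippet below overwrites the type of each matched primary-fusion target with the camera-perceived type and drops low-confidence targets; the confidence field and threshold are assumptions.

```python
CONF_TH = 0.3   # illustrative confidence threshold

def refine_with_camera(fusion_targets, matches, camera_targets, conf_th=CONF_TH):
    """matches: list of (fusion_index, camera_index) pairs from the association step."""
    for fi, ci in matches:
        # correct the target type with the type perceived by the vision sensor
        fusion_targets[fi]["obj_type"] = camera_targets[ci]["obj_type"]
    # eliminate low-confidence targets from the primary fusion data
    return [t for t in fusion_targets if t.get("confidence", 1.0) >= conf_th]
```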
As an optional embodiment of the present application, after performing association matching on the primary fusion data and the third sensor target level data, the method further includes: and filtering primary fusion data or third sensor target level data which are failed to be associated to obtain secondary fusion data.
For example, for a matched data pair which is not successfully matched in an associated manner, the primary fusion data and the target level data of the third sensor are filtered according to the effective detection range and the object detection characteristic of the third sensor, and fusion of the primary fusion data and the target level data of the third sensor is completed, so that secondary fusion data is obtained.
As an optional embodiment of the present application, after processing the primary fusion data and the third sensor target level data to obtain the secondary fusion data, the method further includes: and carrying out multi-target tracking on the secondary fusion data, and determining a target running track containing multiple targets.
Illustratively, multi-target tracking is performed on the secondary fusion data based on interacting multiple model (IMM) Kalman filtering and an association matching algorithm; the association matching algorithm may be the Hungarian matching algorithm, and the present application is not limited in this respect. The track matched to the secondary fusion data is updated with the state update equation of the interacting multiple model Kalman filter, multi-target tracking is completed, the target running tracks of the multiple targets are determined, and the final fusion result containing the target running tracks is obtained.
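For illustration, the sketch below shows one predict/update cycle of a single constant-velocity Kalman filter applied to a fused position measurement; the patent describes interacting multiple model (IMM) Kalman filtering, which this simplified single-model sketch does not implement, and the noise values are assumptions.

```python
import numpy as np

def kf_step(x, P, z, dt, q=0.1, r=0.5):
    """One predict/update cycle; state x = [px, py, vx, vy], measurement z = [px, py]."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)     # constant-velocity motion model
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)     # position-only measurement model
    Q, R = q * np.eye(4), r * np.eye(2)
    x, P = F @ x, F @ P @ F.T + Q                 # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)                       # update with the fused measurement
    P = (np.eye(4) - K @ H) @ P
    return x, P
```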
Example 2
This embodiment provides a data fusion device applied to an intelligent automobile to realize environment perception for the intelligent automobile. As shown in FIG. 3, the device comprises:
the acquisition module 21 is configured to acquire first sensor target level data, second sensor target level data, and third sensor target level data, respectively. The details are described in the above method embodiment in relation to step S11, and are not repeated here.
A first fusion module 22, configured to process the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data. The details of the method embodiment are described in the related description of step S12, and are not repeated here.
And the second fusion module 23 is configured to process the primary fusion data and the third sensor target level data to obtain secondary fusion data. The details of the method embodiment are described in the related description of step S13, and are not repeated here.
According to the data fusion device provided by this embodiment, the acquisition module respectively acquires the target level data of the first sensor, the second sensor and the third sensor; the first fusion module fuses the first sensor target level data and the second sensor target level data to obtain primary fusion data; and the second fusion module fuses the primary fusion data and the third sensor target level data to obtain secondary fusion data. The device fully utilizes the characteristics of the first, second and third sensors in the fusion of target level data, so that each sensor can exert its own advantages. Fusing the third sensor target level data with the primary fusion data allows the primary fusion data to be corrected, which ensures the accuracy of the fusion result. Because the fused data are all sensor target level data, fusing at the target level reduces the computing time of the data fusion process, improves data fusion efficiency, and improves real-time performance.
As an alternative embodiment of the present application, the apparatus further comprises:
and the time synchronization module is used for time stamping the acquired first sensor target level data, second sensor target level data and third sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data, and the first sensor is different from the second sensor in type, before fusing the first sensor target level data and the second sensor target level data, the method further includes:
and the coordinate conversion module is used for carrying out coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after spatial synchronization. Details of the method embodiments are described in the related description, and are not repeated here.
As an alternative embodiment of the present application, the first fusion module 22 includes:
and the first calculating sub-module is used for calculating the time difference between the latest frame of first sensor target level data and the latest frame of second sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
And the first judging sub-module is used for judging whether the time difference is larger than a preset threshold value. Details of the method embodiments are described in the related description, and are not repeated here.
And the first fusion sub-module is used for carrying out association matching on the first sensor target level data and the second sensor target level data when the time difference is smaller than or equal to a preset threshold value, and updating the data pair successfully subjected to association matching to obtain primary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
And the second fusion sub-module is used for selecting the data with the latest time from the first sensor target level data and the second sensor target level data when the time difference is larger than a preset threshold value, and taking the selected data with the latest time as primary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
As an optional embodiment of the present application, after performing association matching on the first sensor target level data and the second sensor target level data, the method further includes:
and the first filtering sub-module is used for filtering the first sensor target level data or the second sensor target level data which are failed to be associated to obtain primary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data include one or more feature amounts, the first fusion sub-module further includes:
and the first time compensation sub-module is used for performing time compensation on the first sensor target level data and the second sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
And the determining submodule is used for determining the same characteristic quantity in the first sensor target level data and the second sensor target level data after time compensation. Details of the method embodiments are described in the related description, and are not repeated here.
The first establishing sub-module is used for establishing an association matrix between the first sensor target level data and the second sensor target level data according to the same feature quantities based on similarity calculation. Details of the method embodiments are described in the related description, and are not repeated here.
The first association sub-module is used for determining a matching relation between the first sensor target level data and the second sensor target level data based on the association matrix, and forming association data pairs by the first sensor target level data and the second sensor target level data belonging to the same target. Details of the method embodiments are described in the related description, and are not repeated here.
As an optional implementation manner of the present application, updating the data pairs successfully subjected to association matching includes:
The combination sub-module is used for carrying out data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and carrying out data updating according to a data combination and substitution mode. Details of the method embodiments are described in the related description, and are not repeated here.
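Assuming, as in one embodiment described later, that the first sensor is a millimetre-wave radar and the second a laser radar, one plausible combination-and-substitution policy is sketched below; which field is taken from which sensor is an assumption of the example only:

```python
def combine_pair(radar_obj: dict, lidar_obj: dict) -> dict:
    """Merge one associated data pair by substitution: each sensor keeps the
    fields it is assumed to measure better (illustrative split only)."""
    fused = dict(lidar_obj)                     # position / extent from the lidar
    fused["velocity"] = radar_obj["velocity"]   # radial velocity from the radar
    fused["sources"] = ["radar", "lidar"]
    return fused
```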
As an optional embodiment of the present application, the first time compensation sub-module includes:
The first acquisition sub-module is used for acquiring a target timestamp difference value and determining the first sensor target level data and the second sensor target level data corresponding to the target timestamp difference value. Details of the method embodiments are described in the related description, and are not repeated here.
The second acquisition sub-module is used for acquiring the position information and the motion information contained in the first sensor target level data or the second sensor target level data corresponding to the smaller timestamp in the target timestamp difference value. Details of the method embodiments are described in the related description, and are not repeated here.
The prediction sub-module is used for predicting, according to the position information, the motion information and the target timestamp difference value, the predicted position information after the target timestamp difference value has elapsed. Details of the method embodiments are described in the related description, and are not repeated here.
The updating sub-module is used for taking the predicted position information as the measurement value at the current time and updating the timestamp of the first sensor or the second sensor to the larger timestamp in the target timestamp difference value. Details of the method embodiments are described in the related description, and are not repeated here.
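The prediction step amounts to extrapolating the older measurement over the timestamp gap; a minimal constant-velocity sketch is given below, where the motion model itself is an assumption of this example:

```python
import numpy as np

def time_compensate(position: np.ndarray, velocity: np.ndarray,
                    t_old: float, t_new: float) -> np.ndarray:
    """Predict where the older target would be at the newer timestamp,
    assuming approximately constant velocity over the timestamp gap;
    the prediction is then used as the measurement value at t_new."""
    return position + velocity * (t_new - t_old)

# e.g. a target seen 40 ms before the other sensor's latest frame
predicted = time_compensate(np.array([12.0, 3.5]), np.array([-8.0, 0.2]),
                            t_old=100.000, t_new=100.040)
```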
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data and the third sensor target level data are target image data, the second fusion module includes:
The second calculating sub-module is used for calculating the time difference between the latest frame of primary fusion data and the latest frame of third sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
The second judging sub-module is used for judging whether the time difference is larger than a preset threshold value. Details of the method embodiments are described in the related description, and are not repeated here.
The third fusion sub-module is used for carrying out association matching on the primary fusion data and the third sensor target level data when the time difference is smaller than or equal to the preset threshold value, and updating the data pairs successfully subjected to association matching to obtain the secondary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
The fourth fusion sub-module is used for selecting the data with the latest time from the primary fusion data and the third sensor target level data when the time difference is larger than the preset threshold value, and taking the selected data as the secondary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
As an optional embodiment of the present application, after the association matching of the primary fusion data and the third sensor target level data, the device further includes:
The second filtering sub-module is used for filtering out the primary fusion data or third sensor target level data for which the association failed, so as to obtain the secondary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
As an alternative embodiment of the present application, the second fusion module 23 includes:
The second time compensation sub-module is used for performing time compensation on the primary fusion data and the third sensor target level data in such a manner that the primary fusion data compensates the third sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
The second establishing sub-module is used for converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data. Details of the method embodiments are described in the related description, and are not repeated here.
The second association sub-module is used for determining a matching relation between the primary fusion data and the third sensor target level data based on global optimal matching, so as to obtain matched data pairs in which the primary fusion data and the third sensor target level data correspond to the same target. Details of the method embodiments are described in the related description, and are not repeated here.
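A sketch of the projection and globally optimal matching steps; the 4x4 extrinsic and 3x3 intrinsic camera matrices, the pixel-distance cost and the 50-pixel gate are placeholders introduced for illustration only:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def project_to_image(points_xyz: np.ndarray, extrinsic: np.ndarray,
                     intrinsic: np.ndarray) -> np.ndarray:
    """Project fused 3D target positions (N, 3) into the image plane.

    extrinsic is an assumed 4x4 vehicle-to-camera transform, intrinsic an
    assumed 3x3 camera matrix; points behind the camera are not handled."""
    pts_h = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    cam = (extrinsic @ pts_h.T)[:3]          # camera-frame coordinates
    uv = intrinsic @ cam
    return (uv[:2] / uv[2]).T                # pixel coordinates, shape (N, 2)

def global_match(cost: np.ndarray, max_cost: float = 50.0):
    """Globally optimal one-to-one assignment (Hungarian algorithm),
    keeping only pairs whose pixel-distance cost stays under the gate."""
    rows, cols = linear_sum_assignment(cost)
    return [(int(r), int(c)) for r, c in zip(rows, cols)
            if cost[r, c] <= max_cost]
```

The Hungarian algorithm (here via scipy's linear_sum_assignment) is one common way to realise global optimal matching; any equivalent assignment method could be used instead.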
As an optional implementation manner of the present application, updating the data pairs successfully subjected to association matching includes:
The correction sub-module is used for correcting and eliminating the primary fusion data by using the acquired third sensor target level data according to the matched data pairs, and carrying out data updating according to the corrected and eliminated primary fusion data. Details of the method embodiments are described in the related description, and are not repeated here.
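The embodiment does not spell out the correction-and-elimination policy; one plausible reading is sketched below, where matched camera detections overwrite the class attribute of the fused target. Both the field chosen and the handling of unmatched targets are assumptions of this example:

```python
def correct_with_camera(primary: list, camera: list, matches: list) -> list:
    """Correct fused targets using their matched camera detections.

    primary / camera are lists of dicts; matches holds (primary_idx,
    camera_idx) pairs. Overwriting the class label is an assumed policy;
    targets without a camera match are passed through unchanged here,
    since association failures are handled by the filtering sub-module."""
    corrected = []
    matched_primary = {p for p, _ in matches}
    for p_idx, c_idx in matches:
        obj = dict(primary[p_idx])
        obj["class"] = camera[c_idx].get("class", obj.get("class"))
        corrected.append(obj)
    corrected.extend(primary[i] for i in range(len(primary))
                     if i not in matched_primary)
    return corrected
```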
As an optional embodiment of the present application, after the second fusion module 23, the device further includes:
The tracking module is used for carrying out multi-target tracking on the secondary fusion data and determining target running tracks containing multiple targets. Details of the method embodiments are described in the related description, and are not repeated here.
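The tracking algorithm itself is not fixed by the embodiment; a gated nearest-neighbour tracker is sketched below as one simple possibility, with the 2 m gate an assumed parameter:

```python
import numpy as np

class SimpleTracker:
    """Minimal multi-target tracker over secondary fusion data: each track
    keeps a trajectory and is extended with the nearest fused detection
    inside a gate distance; leftover detections start new tracks."""

    def __init__(self, gate: float = 2.0):
        self.gate = gate
        self.tracks = {}        # track id -> list of (t, x, y)
        self._next_id = 0

    def update(self, t: float, detections: np.ndarray) -> dict:
        """detections: array of shape (N, 2) with fused x, y positions."""
        unmatched = list(range(len(detections)))
        for traj in self.tracks.values():
            if not unmatched:
                break
            last = np.asarray(traj[-1][1:])
            dists = [np.linalg.norm(detections[i] - last) for i in unmatched]
            k = int(np.argmin(dists))
            if dists[k] <= self.gate:
                traj.append((t, *detections[unmatched.pop(k)]))
        for i in unmatched:     # unmatched detections become new tracks
            self.tracks[self._next_id] = [(t, *detections[i])]
            self._next_id += 1
        return self.tracks
```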
Example 3
The embodiment of the present application further provides a computer device. As shown in fig. 4, the computer device includes a processor 31 and a memory 32, and the processor 31 and the memory 32 may be connected by a bus or by other means; in fig. 4, connection by the bus 30 is taken as an example.
The processor 31 may be a central processing unit (Central Processing Unit, CPU). The processor 31 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), a graphics processing unit (Graphics Processing Unit, GPU), an embedded neural-network processing unit (Neural-network Processing Unit, NPU) or other dedicated deep learning coprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination of the above.
The memory 32, as a non-transitory computer readable storage medium, is used for storing non-transitory software programs, non-transitory computer executable programs, and modules, such as the program instructions/modules corresponding to the data fusion method in the embodiment of the invention (e.g., the acquisition module 21, the first fusion module 22 and the second fusion module 23 shown in fig. 3). The processor 31 executes the various functional applications and data processing of the processor, i.e., implements the data fusion method in the above-described method embodiments, by running the non-transitory software programs, instructions and modules stored in the memory 32.
The memory 32 may include a storage program area and a storage data area, where the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created by the processor 31, and the like. In addition, the memory 32 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid-state storage device. In some embodiments, the memory 32 may optionally include memory located remotely from the processor 31, and the remote memory may be connected to the processor 31 via a network. Examples of such networks include, but are not limited to, the internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The one or more modules are stored in the memory 32 and when executed by the processor 31 perform the data fusion method of the embodiments shown in fig. 1-2.
The first sensor target level data and the second sensor target level data are fused to obtain the primary fusion data, and the primary fusion data and the third sensor target level data are then fused to obtain the secondary fusion data. The characteristics of the first sensor, the second sensor and the third sensor are fully utilized in the process of fusing the target level data, so that each sensor can play to its strengths; the primary fusion data can be corrected through the fusion of the third sensor target level data with the primary fusion data, which ensures the accuracy of the fused data. Moreover, by adopting target level data for fusion, the computing time of the data fusion process is reduced, the data fusion efficiency is improved, and the real-time performance is further improved.
The details of the computer device may be understood with reference to the corresponding descriptions and effects of the foregoing embodiments, and are not repeated herein.
The embodiment of the invention also provides a non-transitory computer storage medium storing computer-executable instructions that can execute the data fusion method in any of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a flash memory (Flash Memory), a hard disk drive (Hard Disk Drive, HDD), a solid-state drive (Solid-State Drive, SSD), or the like; the storage medium may also include a combination of the above kinds of memories. Obviously, the above embodiments are merely examples given for clarity of illustration and are not intended to limit the implementations. Other variations or modifications in different forms may be made by those skilled in the art on the basis of the above description; it is neither necessary nor possible to exhaustively list all implementations here, and obvious variations or modifications derived therefrom still fall within the protection scope of the invention.

Claims (15)

1. A method of data fusion, comprising:
respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data;
processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data;
processing the primary fusion data and the third sensor target level data to obtain secondary fusion data;
processing the first sensor target level data and the second sensor target level data to obtain primary fusion data, including:
calculating the time difference between the first sensor target level data of the latest frame and the second sensor target level data of the latest frame;
judging whether the time difference is larger than a preset threshold value or not;
when the time difference is smaller than or equal to the preset threshold value, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pairs successfully subjected to association matching to obtain the primary fusion data;
When the time difference is larger than the preset threshold value, selecting the data with the latest time from the first sensor target level data and the second sensor target level data, and taking the selected data with the latest time as the primary fusion data;
wherein updating the data pairs successfully subjected to association matching comprises the following steps:
carrying out data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and carrying out data updating according to a data combination and substitution mode.
2. The data fusion method of claim 1, wherein when acquiring the first sensor target level data, the second sensor target level data, and the third sensor target level data, respectively, further comprising:
and time stamping the acquired first sensor target level data, second sensor target level data and third sensor target level data.
3. The data fusion method of claim 1, wherein when the first sensor target level data and the second sensor target level data are target position data and the first sensor and the second sensor are of different types, prior to fusing the first sensor target level data and the second sensor target level data, further comprising:
And carrying out coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after spatial synchronization.
4. The data fusion method of claim 1, further comprising, after performing the association matching on the first sensor target level data and the second sensor target level data:
and filtering the first sensor target level data or the second sensor target level data which are failed to be associated, so as to obtain the primary fusion data.
5. The data fusion method of claim 1, wherein, when the first sensor target level data and the second sensor target level data contain one or more feature quantities, performing association matching on the first sensor target level data and the second sensor target level data comprises:
performing time compensation on the first sensor target level data and the second sensor target level data;
determining the same characteristic quantity in the first sensor target level data and the second sensor target level data after time compensation;
establishing an association matrix between the first sensor target level data and the second sensor target level data according to the same feature quantity based on similarity calculation;
and determining a matching relation between the first sensor target level data and the second sensor target level data based on the association matrix, and forming an associated data pair from the first sensor target level data and the second sensor target level data belonging to the same target.
6. The method of data fusion of claim 5, wherein the time compensating the first sensor target level data and the second sensor target level data comprises:
acquiring a target timestamp difference value, and determining the first sensor target level data and the second sensor target level data corresponding to the target timestamp difference value;
acquiring position information and motion information contained in first sensor target level data or second sensor target level data corresponding to the smaller timestamp in the target timestamp difference value;
predicting predicted position information after passing through the target timestamp difference value according to the position information, the motion information and the target timestamp difference value;
and updating the time stamp of the first sensor or the second sensor to the larger time stamp in the target time stamp difference value by taking the predicted position information as a current time measurement value.
7. The data fusion method of claim 1, wherein processing the primary fusion data with the third sensor target level data when the first sensor target level data and the second sensor target level data are target location data and the third sensor target level data are target image data comprises:
calculating the time difference between the primary fusion data of the latest frame and the target level data of the third sensor of the latest frame;
judging whether the time difference is larger than a preset threshold value or not;
when the time difference is smaller than or equal to the preset threshold value, performing association matching on the primary fusion data and the third sensor target level data, and updating the data pair with successful association matching to obtain the secondary fusion data;
and when the time difference is larger than the preset threshold value, selecting the data with the latest time from the primary fusion data and the third sensor target level data, and taking the selected data with the latest time as the secondary fusion data.
8. The data fusion method of claim 7, further comprising, after performing the association matching on the primary fusion data and the third sensor target level data:
And filtering primary fusion data or third sensor target level data which are failed to be associated to obtain secondary fusion data.
9. The data fusion method of claim 7, wherein performing association matching on the primary fusion data and the third sensor target level data comprises:
performing time compensation on the primary fusion data and the third sensor target level data in a mode that the primary fusion data compensates the third sensor target level data;
converting the primary fusion data subjected to time compensation into a coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data;
and determining a matching relation between the primary fusion data and the third sensor target level data based on global optimal matching, and obtaining a matching data pair corresponding to the primary fusion data and the third sensor target level data of the same target.
10. The method of data fusion according to claim 9, wherein updating the pair of associated data for which the association match is successful comprises:
correcting and eliminating the primary fusion data by using the acquired third sensor target level data according to the matched data pairs, and carrying out data updating according to the corrected and eliminated primary fusion data.
11. The data fusion method of claim 1, further comprising, after processing the primary fusion data with the third sensor target level data to obtain secondary fusion data:
and carrying out multi-target tracking on the secondary fusion data, and determining target running tracks containing the multiple targets.
12. The data fusion method of claim 1, wherein:
the first sensor is a millimeter wave radar sensor;
the second sensor is a laser radar sensor;
the third sensor is a visual sensor.
13. A data fusion device, comprising:
the acquisition module is used for respectively acquiring the first sensor target level data, the second sensor target level data and the third sensor target level data;
the first fusion module is used for processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is the same data type as the second sensor target level data;
The second fusion module is used for processing the primary fusion data and the third sensor target level data to obtain secondary fusion data;
processing the first sensor target level data and the second sensor target level data to obtain primary fusion data, including:
calculating the time difference between the first sensor target level data of the latest frame and the second sensor target level data of the latest frame;
judging whether the time difference is larger than a preset threshold value or not;
when the time difference is smaller than or equal to the preset threshold value, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pairs successfully subjected to association matching to obtain the primary fusion data;
when the time difference is larger than the preset threshold value, selecting the data with the latest time from the first sensor target level data and the second sensor target level data, and taking the selected data with the latest time as the primary fusion data;
wherein updating the data pairs successfully subjected to association matching comprises the following steps:
carrying out data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and carrying out data updating according to a data combination and substitution mode.
14. A computer device, comprising: a memory and a processor, the memory and the processor being communicatively coupled to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the data fusion method of any of claims 1-12.
15. A computer readable storage medium storing computer instructions for causing the computer to perform the data fusion method of any one of claims 1-12.