CN111753901A - Data fusion method, device and system and computer equipment


Info

Publication number
CN111753901A
CN111753901A (application CN202010584334.3A)
Authority
CN
China
Prior art keywords
data
target level
level data
sensor
sensor target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010584334.3A
Other languages
Chinese (zh)
Other versions
CN111753901B (en)
Inventor
张庆
李军
褚文博
温悦
赵盼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Original Assignee
Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guoqi Beijing Intelligent Network Association Automotive Research Institute Co ltd
Priority to CN202010584334.3A
Publication of CN111753901A
Application granted
Publication of CN111753901B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/251 Fusion techniques of input or preprocessed data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a data fusion method, a device, a system and computer equipment, wherein the data fusion method comprises the following steps: respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data; processing the first sensor target level data and the second sensor target level data to obtain primary fusion data, wherein the first sensor target level data and the second sensor target level data have the same data type; and processing the primary fusion data and the target level data of the third sensor to obtain secondary fusion data. By implementing the method, the computing time in the data fusion process is reduced, the data fusion efficiency is improved, and the real-time performance is further improved.

Description

Data fusion method, device and system and computer equipment
Technical Field
The invention relates to the technical field of intelligent automobiles, in particular to a data fusion method, a data fusion device, a data fusion system and computer equipment.
Background
With the continuous development of vehicle intelligence and connectivity technologies, intelligent automobiles have entered people's lives and play a significant role in relieving traffic pressure, protecting passenger safety, improving traffic efficiency and the like. Accurate and comprehensive perception of the surrounding environment is the most critical technology in an intelligent automobile system and is the premise of realizing intelligent driving, so how to perceive the surrounding environment accurately and comprehensively is of great importance.
In order to obtain information about the surrounding environment accurately and comprehensively, most existing intelligent automobiles are equipped with multiple environment-sensing sensors (laser radar, camera, millimeter wave radar, ultrasonic radar, etc.). For an intelligent automobile carrying multiple sensors, how to use the data detected by these sensors accurately and effectively has become a difficult problem in the perception system. Most existing multi-sensor data fusion methods focus on fusing data from several sensors of the same type, or only fuse data from two sensors. Such methods do not exploit the unique advantages of each type of sensor but simply apply filtering to the data detected by all the sensors, which reduces the real-time performance of the system, cannot guarantee the accuracy of the multi-sensor data fusion, and cannot fully bring out the respective advantages of each sensor.
Disclosure of Invention
Therefore, the technical problem to be solved by the present invention is to overcome the defects of insufficient utilization of the advantages of the sensor and low data fusion accuracy in the prior art, so as to provide a data fusion method, device, system and computer equipment.
According to a first aspect, an embodiment of the present invention provides a data fusion method, including: respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data; processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data; and processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
With reference to the first aspect, in a first implementation manner of the first aspect, when the first sensor target level data, the second sensor target level data, and the third sensor target level data are acquired, the method further includes: time stamping the acquired first sensor target level data, the acquired second sensor target level data and the acquired third sensor target level data.
With reference to the first aspect, in a second implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data are target position data and the types of the first sensor and the second sensor are different, before fusing the first sensor target level data and the second sensor target level data, the method further includes: and performing coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after space synchronization.
With reference to the second implementation manner of the first aspect, in a third implementation manner of the first aspect, the processing the first sensor target level data and the second sensor target level data to obtain primary fusion data includes: calculating a time difference between a latest frame of the first sensor target level data and a latest frame of the second sensor target level data; judging whether the time difference is larger than a preset threshold value or not; when the time difference is smaller than or equal to the preset threshold, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pair with successful association matching to obtain the primary fusion data; and when the time difference is larger than the preset threshold value, selecting latest time data from the first sensor target level data and the second sensor target level data, and taking the selected latest time data as the primary fusion data.
With reference to the first aspect, in a fourth implementation manner of the first aspect, after performing correlation matching on the first sensor target level data and the second sensor target level data, the method further includes: filtering the first sensor target level data or the second sensor target level data for which association fails, so as to obtain the primary fusion data.
With reference to the third implementation manner of the first aspect, in a fifth implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data contain one or more feature quantities, the associative matching of the first sensor target level data and the second sensor target level data comprises: time compensating the first sensor target level data and the second sensor target level data; determining the same characteristic quantity in the time-compensated first sensor target level data and the second sensor target level data; establishing a correlation matrix between the first sensor target level data and the second sensor target level data according to the same characteristic quantity based on similarity calculation; and determining a matching relation between the first sensor target level data and the second sensor target level data based on the correlation matrix, and forming association data pairs from the first sensor target level data and the second sensor target level data which belong to the same target.
With reference to the third implementation manner of the first aspect, in a sixth implementation manner of the first aspect, the updating the association data pair with which the association matching is successful includes: and performing data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and performing data updating according to a data combination substitution mode.
With reference to the fifth implementation manner of the first aspect, in a seventh implementation manner of the first aspect, the time compensating the first sensor target level data and the second sensor target level data includes: acquiring a target timestamp difference value, and determining the first sensor target level data and the second sensor target level data corresponding to the timestamp difference value; acquiring position information and motion information contained in first sensor target-level data or second sensor target-level data corresponding to the smaller timestamp in the target timestamp difference; predicting predicted position information after the timestamp difference value is passed according to the position information, the motion information and the timestamp difference value; and updating the timestamp of the first sensor or the second sensor to be the larger timestamp in the difference value of the target timestamps by taking the predicted position information as the measured value at the current moment.
With reference to the first aspect, in an eighth implementation manner of the first aspect, when the first sensor target level data and the second sensor target level data are target position data, and the third sensor target level data is target image data, the processing the primary fusion data and the third sensor target level data includes: calculating the time difference between the latest frame of the primary fusion data and the latest frame of the third sensor target level data; judging whether the time difference is larger than a preset threshold value or not; when the time difference is smaller than or equal to the preset threshold value, performing association matching on the primary fusion data and the third sensor target level data, and updating the data pair with successful association matching to obtain the secondary fusion data; and when the time difference is larger than the preset threshold value, selecting latest time data from the primary fusion data and the third sensor target level data, and taking the latest time data as the secondary fusion data.
With reference to the eighth implementation manner of the first aspect, in a ninth implementation manner of the first aspect, after performing correlation matching on the primary fusion data and the third sensor target-level data, the method further includes: filtering the primary fusion data or the third sensor target level data for which association fails, so as to obtain the secondary fusion data.
With reference to the eighth implementation manner of the first aspect, in a tenth implementation manner of the first aspect, the performing associative matching on the primary fusion data and the third sensor target-level data includes: time compensating the primary fusion data and the third sensor target level data in a manner that the primary fusion data are compensated to the third sensor target level data; converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data; and determining a matching relation between the primary fusion data and the third sensor target-level data based on global optimal matching to obtain matching data pairs in which the primary fusion data and the third sensor target-level data correspond to the same target.
With reference to the tenth implementation manner of the first aspect, in an eleventh implementation manner of the first aspect, the updating the association data pair with which the association matching is successful includes: and correcting and eliminating the primary fusion data by adopting the acquired target-level data of the third sensor according to the matching data pair, and updating data according to the corrected and eliminated primary fusion data.
With reference to the first aspect, in a twelfth implementation manner of the first aspect, after the processing the primary fusion data and the third sensor target-level data to obtain secondary fusion data, the method further includes: and carrying out multi-target tracking on the secondary fusion data, and determining a target driving track containing the multiple targets.
With reference to the first aspect, in a thirteenth implementation of the first aspect, the first sensor is a millimeter wave radar sensor; the second sensor is a laser radar sensor; the third sensor is a vision sensor.
According to a second aspect, an embodiment of the present invention provides a data fusion apparatus, including: the acquisition module is used for respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data; the first fusion module is used for processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data; and the second fusion module is used for processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
According to a third aspect, an embodiment of the present invention provides a computer apparatus, including: a memory and a processor, the memory and the processor being communicatively connected to each other, the memory storing computer instructions, and the processor executing the computer instructions to perform the data fusion method according to the first aspect or any embodiment of the first aspect.
According to a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where computer instructions are stored, and the computer instructions are configured to cause the computer to execute the data fusion method according to the first aspect or any implementation manner of the first aspect.
The technical scheme of the invention has the following advantages:
1. In the data fusion method provided by the invention, the target level data of the first sensor, the second sensor and the third sensor are acquired respectively, the first sensor target level data and the second sensor target level data are fused to obtain primary fusion data, and the primary fusion data and the third sensor target level data are fused to obtain secondary fusion data. The respective characteristics of the first, second and third sensors are fully utilized in the target level data fusion process, so that each sensor can exert its own advantages. Fusing the third sensor target level data with the primary fusion data allows the primary fusion data to be corrected, ensuring the accuracy of the fused data. Because the data being fused are sensor target level data, fusing at the target level reduces the calculation time of the data fusion process, improves the data fusion efficiency, and in turn improves the real-time performance.
2. According to the data fusion method provided by the invention, the asynchronous target-level data of each sensor is subjected to time synchronization through time compensation, so that errors caused by the asynchronous acquired data of each sensor are avoided, the accuracy of correlation matching of the sensor data is further improved, and the accuracy of data fusion of each sensor is further ensured.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
FIG. 1 is a flow chart of a data fusion method in an embodiment of the present invention;
FIG. 2 is a flow chart of a data fusion method according to an embodiment of the present invention;
FIG. 3 is a schematic block diagram of a data fusion apparatus according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a computer device in an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; the two elements may be directly connected or indirectly connected through an intermediate medium, or may be communicated with each other inside the two elements, or may be wirelessly connected or wired connected. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1
The embodiment provides a data fusion method, which is applied to an intelligent automobile to realize environment perception of the intelligent automobile, and as shown in fig. 1, the method includes the following steps:
s11, acquiring first sensor target level data, second sensor target level data and third sensor target level data, respectively.
For example, the obtained target-level data may be determined according to characteristics of the sensors, the first sensor target-level data may include data information such as object position, speed, category, and first sensor identification, the second sensor target-level data may include data information such as object position, speed, category, heading angle, and second sensor identification, and the third sensor target-level data may include data information such as object pixel coordinates, category, and third sensor identification. The target level data of the first sensor, the second sensor and the third sensor can be acquired correspondingly through the sensors.
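The patent does not fix a concrete data layout for these records; the following is only a minimal sketch, in Python, of how such per-object target-level data might be represented, with every field name chosen here for illustration:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class TargetLevelRecord:
    """One detected object from a single sensor frame (illustrative field names)."""
    sensor_id: str                                        # which sensor produced the record
    timestamp: float                                      # UTC time stamp in seconds
    position: Optional[Tuple[float, float]] = None        # (x, y) in metres; radar / lidar
    velocity: Optional[Tuple[float, float]] = None        # (vx, vy) in m/s; radar / lidar
    category: Optional[str] = None                        # object class, e.g. "car"
    heading: Optional[float] = None                       # heading angle in rad; lidar only here
    pixel_box: Optional[Tuple[int, int, int, int]] = None # (x1, y1, x2, y2) image box; vision only
```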
S12, processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data.
For example, the first sensor target level data may be of the same data type as the second sensor target level data; for instance, the target level data of the first sensor and of the second sensor may each include object position, velocity, category and similar data. The first sensor target level data and the second sensor target level data are fused to obtain the primary fusion data.
As an alternative embodiment of the present application, the first sensor may be a millimeter wave radar sensor, the second sensor may be a laser radar sensor, and the third sensor may be a vision sensor, such as a camera. Those skilled in the art can adjust the first sensor, the second sensor and the third sensor as needed, for example, the first sensor is a laser radar sensor, the second sensor is a millimeter wave radar sensor, and the third sensor is a vision sensor. The present application does not limit the types of the first sensor, the second sensor, and the third sensor. In the embodiment of the present application, a description is given by taking a first sensor as a millimeter wave radar sensor, a second sensor as a laser radar sensor, and a third sensor as a camera as an example.
The unstable, high-noise target-level data output by the millimeter wave radar sensor are filtered and tracked by a filtering method, a time identifier is added to each frame of millimeter wave radar target-level data, and the data are packaged frame by frame. The point cloud data output by the laser radar sensor are clustered so that the object point clouds are separated from the environment, the position information and three-dimensional profile information of each object are obtained by boundary fitting, and noise in the laser radar perception data is eliminated by a filtering and tracking method to obtain more stable object information; finally, a time identifier provided by the GPS differential sensor is added to each frame of data and the frame is packaged.
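The patent does not name a specific clustering or boundary-fitting algorithm; the sketch below uses DBSCAN and an axis-aligned bounding box purely as illustrative stand-ins for those steps, with parameter values chosen arbitrarily:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_lidar_points(points_xyz, eps=0.5, min_samples=5):
    """Separate object point clouds from the background and fit rough boundaries.

    DBSCAN and the axis-aligned box stand in for the clustering and
    boundary-fitting steps; eps/min_samples are arbitrary example values.
    """
    points_xyz = np.asarray(points_xyz, dtype=float)        # shape (N, 3)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
    objects = []
    for label in set(labels) - {-1}:                        # label -1 marks noise points
        cluster = points_xyz[labels == label]
        center = cluster.mean(axis=0)                       # rough object position
        size = cluster.max(axis=0) - cluster.min(axis=0)    # rough 3-D profile (extent)
        objects.append({"position": center, "size": size})
    return objects
```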
And S13, processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
Illustratively, after the primary fusion data are obtained by fusing the first sensor target level data and the second sensor target level data, obstacles, lane lines, traffic lights and traffic signs in the images are identified by a deep learning algorithm from the forward image information acquired by the third sensor, a UTC time identifier is added on a per-frame basis, and data of the same type within a period are packaged. The third sensor target level data and the primary fusion data are then fused to correct the primary fusion data, so that the secondary fusion data obtained by fusing the primary fusion data and the third sensor target level data are obtained and their accuracy is guaranteed.
In the data fusion method provided by this embodiment, the target level data of the first sensor, the second sensor and the third sensor are acquired, the first sensor target level data and the second sensor target level data are fused to obtain primary fusion data, and the primary fusion data and the third sensor target level data are fused to obtain secondary fusion data. The respective characteristics of the first, second and third sensors are fully utilized during target level data fusion, so that each sensor can exert its own advantages; the primary fusion data can be corrected by fusing the third sensor target level data with the primary fusion data, which ensures the accuracy of the fused data; and because the data being fused are sensor target level data, fusing at the target level reduces the calculation time of the data fusion process, improves the data fusion efficiency, and in turn improves the real-time performance.
As an optional embodiment of the present application, when the first sensor target level data, the second sensor target level data, and the third sensor target level data are acquired, the method further includes: and time stamping the acquired first sensor target level data, second sensor target level data and third sensor target level data.
For example, since the sensors may sense an object with a certain time offset, the acquired first sensor target level data, second sensor target level data and third sensor target level data may also differ in time. Timestamps are therefore added to the raw first sensor target level data, second sensor target level data and third sensor target level data as they are acquired; otherwise a certain time difference would remain among the sensing results of the sensors. Time synchronization between the different sensors is achieved by adding a time mark to each frame of each sensor's data under a unified time standard.
It should be noted that the time standard may be a system time standard or a UTC time standard. When the data of all sensors are received on one controller, either of the two ways of unifying the time standard may be used; when the sensors are received on different controllers, an external UTC time standard may be used, with GPS time information provided by a GPS differential sensor and a UTC time identifier added to each sensor, which is not limited in this application. In the embodiment of the application, the GPS differential sensor is used to add UTC timestamps to the sensors and, at the same time, to obtain the positioning information of the intelligent automobile.
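As a minimal illustration of attaching a unified time base on reception, assuming a helper that returns UTC time from the GPS differential sensor (the system clock is used below only as a stand-in for that unified clock):

```python
import time

def stamp_frame(frame_objects, utc_now=None):
    """Attach one common UTC timestamp to every record of a newly received frame.

    In the embodiment the time would come from the GPS differential sensor;
    time.time() is a placeholder for that unified clock.
    """
    t = utc_now if utc_now is not None else time.time()
    for obj in frame_objects:
        obj.timestamp = t
    return frame_objects
```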
According to the data fusion method provided by the embodiment, the asynchronous target-level data of each sensor is subjected to time synchronization through time compensation, so that errors caused by the asynchronous acquired data of each sensor are avoided, the accuracy of correlation matching of the sensor data is improved, and the accuracy of data fusion of each sensor is guaranteed.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data and the types of the first sensor and the second sensor are different, before fusing the first sensor target level data and the second sensor target level data, the method further includes: and performing coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after space synchronization.
Illustratively, for each piece of acquired sensor target-level data, the latest frame is stored in the same data format; the latest frame of each sensor can be stored in a unified structure which, in addition to the object data sensed by each sensor, also records the number of the sensor to which the data belong. Based on the translation matrix and rotation matrix of each sensor coordinate system relative to the vehicle body coordinate system, the received target-level data of the different sensors are converted into the same vehicle body coordinate system to obtain spatially synchronized sensor data. Specifically, coordinate conversion is performed using the translation and rotation matrices of the first sensor coordinate system and the second sensor coordinate system relative to the vehicle body coordinate system, so as to obtain the spatially synchronized first sensor target level data and second sensor target level data.
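A minimal sketch of this spatial synchronization step, assuming the extrinsic rotation matrix and translation vector of a sensor relative to the vehicle body frame are known from calibration:

```python
import numpy as np

def to_body_frame(points, rotation, translation):
    """Transform sensor-frame points into the vehicle body coordinate system.

    rotation (3x3) and translation (3,) are the extrinsic calibration of the
    sensor relative to the body frame; the values are installation specific.
    """
    points = np.asarray(points, dtype=float)     # shape (N, 3)
    return points @ np.asarray(rotation, dtype=float).T + np.asarray(translation, dtype=float)
```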
As an alternative embodiment of the present application, as shown in fig. 2, the step S12 includes:
and S121, calculating the time difference between the latest frame of first sensor target level data and the latest frame of second sensor target level data.
For example, a latest frame of first sensor target level data and a latest frame of second sensor target level data may be determined from timestamps added to the first sensor target level data and the second sensor target level data, respectively, and a time difference between the two sets of target level data may be calculated from a timestamp corresponding to the latest frame of first sensor target level data and a timestamp corresponding to the latest frame of second sensor target level data.
And S122, judging whether the time difference is larger than a preset threshold value or not.
Illustratively, the preset threshold is a critical time difference value that determines a fusion pattern of the first sensor target level data and the second sensor target level data. The preset threshold may be determined by a person skilled in the art according to actual needs, for example, the preset threshold is set to 1s, which is not limited in the present application. If the time difference is smaller than or equal to the predetermined threshold, step S123 is executed, and if the time difference is greater than the predetermined threshold, step S124 is executed.
And S123, when the time difference is smaller than or equal to the threshold value, performing correlation matching on the first sensor target level data and the second sensor target level data, and updating the data pair with successful correlation matching to obtain primary fusion data.
Illustratively, when the time difference ΔT is less than or equal to the preset threshold T_th, the time difference between the two groups of target level data (the first sensor target level data and the second sensor target level data) can be considered to be within a controllable range, and the motion state of an object can be considered to remain unchanged within this time difference. The first sensor target level data and the second sensor target level data are therefore subjected to association matching, the first sensor target level data and second sensor target level data that are successfully matched form data pairs, and the data pairs are packaged to obtain the primary fusion data.
And S124, when the time difference is larger than the threshold value, selecting the latest time data from the first sensor target level data and the second sensor target level data, and taking the latest time data as primary fusion data.
Illustratively, when the time difference ΔT is greater than the preset threshold T_th, the group of data with the older timestamp is considered to have no use value because the time interval is too large; the older data are discarded, the group of data with the newer timestamp is kept, and a dimension expansion operation is performed on the newer data to ensure that the primary fusion data have a uniform format. For example, when the latest time data are data of the millimeter wave radar, in order to ensure the uniformity of the primary fusion data format, the millimeter wave radar data need to be dimension-expanded: features that the millimeter wave radar cannot obtain are filled with default values, and the data are then packaged to obtain the primary fusion data.
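The threshold-based decision of steps S121 to S124 can be summarized as follows; the 100 ms threshold and the attribute names are assumptions for illustration only:

```python
from types import SimpleNamespace

def choose_fusion_mode(frame_a, frame_b, threshold_s=0.1):
    """Pick the primary-fusion path from the latest-frame time difference.

    threshold_s (100 ms here) only illustrates the preset threshold T_th.
    """
    dt = abs(frame_a.timestamp - frame_b.timestamp)
    if dt <= threshold_s:
        # small gap: assume the motion state is unchanged and run association matching
        return "associate", (frame_a, frame_b)
    # large gap: keep only the newer frame; missing features are padded with defaults later
    newest = frame_a if frame_a.timestamp > frame_b.timestamp else frame_b
    return "keep_newest", (newest,)

# example: the radar frame is 30 ms older than the lidar frame -> association path
radar = SimpleNamespace(timestamp=100.00)
lidar = SimpleNamespace(timestamp=100.03)
mode, frames = choose_fusion_mode(radar, lidar)   # mode == "associate"
```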
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data contain one or more feature quantities, the performing the associative matching of the first sensor target level data and the second sensor target level data includes:
first, the first sensor target level data and the second sensor target level data are time compensated.
For example, the principle of time compensation is to assume that the motion state of the object does not change within the short time period required for compensation. Because the time difference between the first sensor target level data and the second sensor target level data is small, the motion state of the object can be considered to remain unchanged within this extremely short time, so the first sensor target level data and the second sensor target level data can be time compensated accordingly; the specific steps may include the following (a numerical sketch of the prediction is given after step 4 below):
step 1, obtaining a target timestamp difference value, and determining first sensor target level data and second sensor target level data corresponding to the target timestamp difference value. The target timestamp difference may be equal to a time difference Δ t between a latest frame of first sensor target level data and a latest frame of second sensor target level data, and the target timestamp difference is determined to correspond to two sets of target level data, i.e., the latest frame of first sensor target level data and the latest frame of second sensor target level data.
And 2, acquiring the position information and motion information contained in the first sensor target level data or the second sensor target level data corresponding to the smaller timestamp in the target timestamp difference. The timestamps corresponding to the two groups of data are compared and the position information and motion information contained in the target level data with the smaller timestamp are selected: if the timestamp of the first sensor target level data is smaller, the position information and motion information contained in the first sensor target level data are acquired; if the timestamp of the second sensor target level data is smaller, the position information and motion information contained in the second sensor target level data are acquired.
And 3, predicting the predicted position information after the target timestamp difference has elapsed according to the position information, the motion information and the target timestamp difference. Since the timestamp difference Δt is less than the preset threshold T_th, the motion state of the object is considered to remain unchanged within this time difference, and the object position after Δt is predicted from the acquired current position information and motion information of the object, thereby obtaining the predicted position information.
And 4, taking the predicted position information as the measured value at the current moment and updating the timestamp of the first sensor or the second sensor to the larger timestamp in the target timestamp difference. With the predicted position information used as the measurement at the current moment, the timestamp information of the sensor is updated to the larger of the two timestamps corresponding to the target timestamp difference, thereby achieving synchronization in space and time. It should be noted that the timestamp of each sensor's target level data is updated correspondingly after time compensation; the time compensation may update only the position information and speed information of the object, and optionally the acceleration, assumed unchanged, may also be prediction-compensated.
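A numerical sketch of the constant-motion prediction described in steps 1 to 4, under the assumption that position and velocity are two-dimensional:

```python
def time_compensate(position, velocity, dt):
    """Predict where the older measurement would be after dt seconds,
    assuming the motion state stays constant over this short interval."""
    px, py = position
    vx, vy = velocity
    return (px + vx * dt, py + vy * dt)

# the older record is propagated forward by the timestamp difference and then
# stamped with the larger of the two timestamps
predicted = time_compensate(position=(12.0, 3.5), velocity=(8.0, 0.2), dt=0.04)
# predicted == (12.32, 3.508)
```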
Second, the same characteristic quantities in the time-compensated first sensor target level data and the second sensor target level data are determined.
Illustratively, the same characteristic quantities, such as position, speed, and acceleration, that can be detected by both the millimeter wave radar sensor target level data and the laser radar sensor target level data are determined from a plurality of characteristic quantities included in the target level data detected by the millimeter wave radar sensor and the laser radar sensor.
Third, a correlation matrix between the first sensor target-level data and the second sensor target-level data is established from the same characteristic quantities based on similarity calculation.
Exemplarily, the element in the ith row and jth column of the correlation matrix represents the similarity between the ith target detected by the second sensor (laser radar sensor) and the jth target detected by the first sensor (millimeter wave radar sensor). The similarity can be calculated by weighting the position and speed differences between two objects using the Mahalanobis distance or the Euclidean distance. The correlation matrix can thus be established from the determined same characteristic quantities and serves as the basis for the association matching of the first sensor target level data and the second sensor target level data.
Fourth, a matching relation between the first sensor target level data and the second sensor target level data is determined based on the correlation matrix, and the first sensor target level data and second sensor target level data that belong to the same target form association data pairs.
Illustratively, a global optimal matching method is used to determine the corresponding matching relation between the first sensor target level data and the second sensor target level data from the constructed correlation matrix, where the matching relation indicates which measurements in the two sets of data belong to the same target. The target level data detected by the laser radar sensor and the millimeter wave radar sensor are matched and searched according to the correlation matrix to judge which feature quantities belong to the same target, and the laser radar sensor target level data and millimeter wave radar sensor target level data that belong to the same target are associated to generate an association data pair.
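A minimal sketch of building the correlation matrix from shared feature quantities and solving the global optimal matching; the Euclidean weighting, the weights and the gating value are illustrative assumptions (the patent also allows Mahalanobis distance), and the Hungarian-style solver from SciPy is used as one possible global optimal matcher:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(lidar_objs, radar_objs, w_pos=1.0, w_vel=0.5, gate=4.0):
    """Build the correlation matrix from shared feature quantities and solve
    the global optimal matching; pairs costlier than `gate` are rejected."""
    cost = np.zeros((len(lidar_objs), len(radar_objs)))
    for i, lo in enumerate(lidar_objs):
        for j, ro in enumerate(radar_objs):
            d_pos = np.linalg.norm(np.subtract(lo["position"], ro["position"]))
            d_vel = np.linalg.norm(np.subtract(lo["velocity"], ro["velocity"]))
            cost[i, j] = w_pos * d_pos + w_vel * d_vel      # weighted distance as (dis)similarity
    rows, cols = linear_sum_assignment(cost)                # global optimal assignment
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= gate]
```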
As an optional embodiment of the present application, updating the association data pair with successful association matching includes: and performing data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and performing data updating according to a data combination substitution mode.
For example, the associated data pairs may be combined based on the respective advantages of the first sensor and the second sensor in detecting object features and on their detection ranges, and the data updated in a combine-and-replace manner. For example, for an associated data pair that is successfully matched, the speed information of the millimeter wave radar sensor may be selected to correct the speed information of the laser radar sensor, data information that the laser radar cannot obtain, including but not limited to the signal-to-noise ratio of the object and the radar cross section (RCS), is added to the data obtained by the laser radar sensor, and the current data are replaced by the combined data; that is, the data are updated by the data combination replacement method.
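A sketch of the combine-and-replace update for one associated pair, assuming dictionary records with illustrative field names; which fields come from which sensor follows the example in the paragraph above:

```python
def combine_pair(lidar_obj, radar_obj):
    """Merge one associated pair in a combine-and-replace fashion.

    The lidar record is kept as the base, the radar velocity overrides the
    lidar velocity, and radar-only attributes (RCS, signal-to-noise ratio)
    are carried over; the field names are illustrative."""
    fused = dict(lidar_obj)                            # lidar data as the base
    fused["velocity"] = radar_obj["velocity"]          # radar speed corrects lidar speed
    fused["rcs"] = radar_obj.get("rcs")                # radar-only attributes
    fused["snr"] = radar_obj.get("snr")
    return fused                                       # replaces both source records
```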
As an optional embodiment of the present application, after performing correlation matching on the first sensor target level data and the second sensor target level data, the method further includes: filtering the first sensor target level data or the second sensor target level data for which association fails, so as to obtain primary fusion data.
For example, the first sensor target level data or second sensor target level data for which association matching fails are filtered: the millimeter wave radar sensor target level data that fail the association matching may be noise-filtered according to the detection range, motion characteristics and a signal-to-noise-ratio threshold of the objects detected by the millimeter wave radar sensor, and the unmatched laser radar sensor target level data may be filtered according to information such as the stability and three-dimensional size of the objects detected by the laser radar sensor, so as to obtain the primary fusion data.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data, and the third sensor target level data is target image data, the step S13 includes:
first, the time difference between the latest frame of primary fusion data and the latest frame of third sensor target level data is calculated. The calculation method of the time difference is referred to step S121 in the above embodiments, and is not described herein again.
Secondly, whether the time difference is larger than a preset threshold value is judged. For details, refer to the related description of step S122 in the above embodiment, and are not repeated herein.
Thirdly, when the time difference is smaller than or equal to a preset threshold value, performing association matching on the primary fusion data and the target-level data of the third sensor, and updating the data pair with successful association matching to obtain secondary fusion data. When the time difference is less than or equal to the preset threshold, the method for obtaining the secondary fusion data is the same as the method for obtaining the primary fusion data, and for details, reference is made to the related description of step S123 in the above embodiment, and details are not repeated here.
Fourthly, when the time difference is larger than a preset threshold value, selecting the latest time data from the primary fusion data and the target level data of the third sensor, and taking the latest time data as secondary fusion data. When the time difference is greater than the preset threshold, the method for obtaining the secondary fusion data is the same as the method for obtaining the primary fusion data, and for details, reference is made to the related description of step S124 in the above embodiment, and details are not repeated here.
As an optional embodiment, the performing the associative matching on the primary fused data and the third sensor target level data includes:
first, the primary fused data and the third sensor target level data are time compensated in such a way that the primary fused data is compensated to the third sensor target level data.
For example, when the third sensor is a vision sensor, such as a camera, and the target level data output by the vision sensor are target image data, time compensation based on the pixel information of the target image output by the camera sensor would introduce a large error; the time compensation is therefore performed in the manner that the primary fusion data are compensated to the third sensor target level data. The detailed principle of the time compensation is described in the related description of the above embodiments and is not repeated here.
And secondly, converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an association matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data.
Illustratively, the time-compensated primary fusion data are projected into the image pixel coordinate system through the extrinsic parameter matrix and intrinsic parameter matrix of the vision sensor, yielding primary fusion data with pixel information. The primary fusion data are essentially expressed in the vehicle body coordinate system; by applying the conversion between the vehicle body coordinate system and the image pixel coordinate system, the primary fusion data are projected into the image pixel coordinate system, which essentially gives the position of each object on the image and the pixel size of its circumscribed rectangle, i.e. the rectangular frame information of each primary-fusion obstacle in the image pixel coordinate system. The association matrix is then established using the pixel information and category information of the objects. As described in the above embodiments, the similarity entries of the association matrix may be determined from the degree of overlap (intersection over union, IoU) between the primary fusion data and the objects perceived by the vision sensor in the image pixel coordinate system. A sketch of this projection and of the IoU similarity is given after the third step below.
And thirdly, determining the matching relation between the primary fusion data and the third sensor target level data based on the global optimal matching to obtain a matching data pair corresponding to the primary fusion data and the third sensor target level data of the same target.
Illustratively, the correlation matching of the two sets of data is completed based on the pixel information of the primary fusion data and the third sensor target level data, resulting in corresponding matched data pairs. The principle of matching the primary fusion data with the target level data of the third sensor is described in the above embodiments, and is not described herein again.
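A minimal sketch of projecting body-frame primary-fusion targets into the image pixel coordinate system and scoring the overlap with vision-detected boxes; the camera intrinsic matrix K and the extrinsics (R, t) are calibration dependent and assumed known:

```python
import numpy as np

def project_to_pixels(point_body, K, R, t):
    """Project a body-frame 3-D point into image pixel coordinates using the
    camera extrinsics (R, t) and intrinsic matrix K."""
    p_cam = np.asarray(R) @ np.asarray(point_body) + np.asarray(t)   # body -> camera frame
    uz, vz, z = np.asarray(K) @ p_cam                                # homogeneous pixel coords
    return np.array([uz / z, vz / z])

def iou(box_a, box_b):
    """Intersection over union of two pixel boxes (x1, y1, x2, y2); used as the
    similarity entry of the association matrix."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```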
As an optional embodiment of the present application, updating the association data pair with successful association matching includes: and correcting and eliminating the primary fusion data by adopting the acquired target-level data of the third sensor according to the matching data pair, and updating the data according to the corrected and eliminated primary fusion data.
Illustratively, for matching data pairs whose association matching is successful, the primary fusion data are augmented with the third sensor data. The target type in the primary fusion data is corrected according to the target type in the target-level data sensed by the third sensor, targets of low reliability in the primary fusion data are removed based on the data information sensed by the third sensor, and the data are updated using the corrected and pruned primary fusion data.
As an optional embodiment of the present application, after performing correlation matching on the primary fusion data and the third sensor target level data, the method further includes: filtering the primary fusion data or the third sensor target level data for which association fails, so as to obtain secondary fusion data.
Illustratively, for data that are not successfully associated, the primary fusion data and the third sensor target-level data are filtered according to the effective detection range and the object detection characteristics of the third sensor, thereby completing the fusion of the primary fusion data and the third sensor target-level data and obtaining the secondary fusion data.
As an optional embodiment of the present application, after the processing the primary fusion data and the third sensor target-level data to obtain the secondary fusion data, the method further includes: and performing multi-target tracking on the secondary fusion data, and determining a target driving track containing multiple targets.
Exemplarily, multi-target tracking is performed on the secondary fusion data based on interacting multiple model (IMM) Kalman filtering and an association matching algorithm; the association matching algorithm may be the Hungarian algorithm, which is not limited in this application. The track matched to the secondary fusion data is updated using the state update equation of the interacting multiple model Kalman filter, multi-target tracking is thereby completed, the target driving tracks of the multiple targets are determined, and a final fusion result containing the target driving tracks is obtained.
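The embodiment uses interacting multiple model Kalman filtering together with an association algorithm such as the Hungarian method; the sketch below shows only a single-model constant-velocity Kalman filter for one track, as a simplified stand-in for the predict/update cycle that maintains each target's driving track:

```python
import numpy as np

class ConstantVelocityKF:
    """Single-model constant-velocity Kalman filter for one fused track
    (state [px, py, vx, vy]); a simplified stand-in for the IMM filter."""

    def __init__(self, x0, p0=1.0, q=0.1, r=0.5):
        self.x = np.asarray(x0, dtype=float)          # initial state
        self.P = np.eye(4) * p0                       # state covariance
        self.q, self.r = q, r                         # process / measurement noise levels

    def predict(self, dt):
        F = np.eye(4)
        F[0, 2] = F[1, 3] = dt                        # position advances by velocity * dt
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + np.eye(4) * self.q
        return self.x[:2]                             # predicted position

    def update(self, z):
        H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0 # only position is measured
        S = H @ self.P @ H.T + np.eye(2) * self.r
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, dtype=float) - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P
        return self.x                                 # corrected track state
```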
Example 2
This embodiment provides a data fusion device, which is applied to an intelligent automobile to realize environment perception of the intelligent automobile. As shown in fig. 3, the device includes:
the acquiring module 21 is configured to acquire first sensor target level data, second sensor target level data, and third sensor target level data, respectively. For details, refer to the description related to step S11 in the above method embodiment, and are not described herein again.
The first fusion module 22 is configured to process the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data. For details, refer to the description related to step S12 in the above method embodiment, and are not described herein again.
And the second fusion module 23 is configured to process the primary fusion data and the third sensor target-level data to obtain secondary fusion data. For details, refer to the description related to step S13 in the above method embodiment, and are not described herein again.
In the data fusion device provided by this embodiment, the acquisition module is used to acquire target level data of the first sensor, the second sensor and the third sensor, the first fusion module is used to fuse the target level data of the first sensor with the target level data of the second sensor to obtain primary fusion data, and the second fusion module is used to fuse the primary fusion data with the target level data of the third sensor to obtain secondary fusion data. In the process of target level data fusion, the device makes full use of the respective characteristics of the first sensor, the second sensor and the third sensor, so that each sensor can exert its own advantages. Fusing the third sensor target level data with the primary fusion data allows the primary fusion data to be corrected, ensuring the accuracy of the fused data. Because the data being fused are sensor target level data, fusing at the target level reduces the calculation time of the data fusion process, improves the data fusion efficiency, and in turn improves the real-time performance.
As an optional embodiment of the present application, the apparatus further comprises:
and the time synchronization module is used for adding timestamps to the acquired first sensor target level data, second sensor target level data and third sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data and the types of the first sensor and the second sensor are different, before fusing the first sensor target level data and the second sensor target level data, the method further includes:
and the coordinate conversion module is used for performing coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after space synchronization. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an alternative embodiment of the present application, the first fusion module 22 includes:
and the first calculating submodule is used for calculating the time difference between the latest frame of first sensor target level data and the latest frame of second sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
And the first judgment submodule is used for judging whether the time difference is greater than a preset threshold value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
And the first fusion submodule is used for performing association matching on the first sensor target level data and the second sensor target level data when the time difference is less than or equal to a preset threshold value, and updating the data pair with successful association matching to obtain primary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
And the second fusion submodule is used for selecting the latest time data from the first sensor target level data and the second sensor target level data when the time difference is larger than a preset threshold value, and taking the selected latest time data as primary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an optional embodiment of the present application, after performing correlation matching on the first sensor target level data and the second sensor target level data, the method further includes:
and the first filtering submodule is used for filtering the first sensor target level data or the second sensor target level data which are failed to be associated to obtain primary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data include one or more feature quantities, the first fusion submodule further includes:
The first time compensation submodule is used for performing time compensation on the first sensor target level data and the second sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The determining submodule is used for determining the same characteristic quantity in the time-compensated first sensor target level data and the time-compensated second sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The first establishing submodule is used for establishing a correlation matrix between the first sensor target level data and the second sensor target level data according to the same characteristic quantity based on similarity calculation. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The first association submodule is used for determining the matching relation between the first sensor target level data and the second sensor target level data based on the correlation matrix, and forming associated data pairs from the first sensor target level data and the second sensor target level data which belong to the same target. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
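A minimal sketch of one possible form of this association step, assuming the shared feature quantities are target positions, Euclidean distance as the similarity measure, and a simple greedy assignment; the gate value and names are illustrative, not taken from the embodiments:

import numpy as np

def associate(first_targets, second_targets, gate=2.0):
    """Build an association (cost) matrix from shared features and pair targets greedily.

    first_targets / second_targets: (N, d) and (M, d) arrays of the feature
    quantities both sensors report (e.g. x, y position).
    gate: maximum feature distance allowed for a valid pair.
    Returns a list of (i, j) index pairs judged to belong to the same target.
    """
    # Distance matrix over the shared feature quantities; smaller means more similar.
    diff = first_targets[:, None, :] - second_targets[None, :, :]
    cost = np.linalg.norm(diff, axis=-1)

    pairs, used_j = [], set()
    for i in np.argsort(cost.min(axis=1)):        # handle the most confident rows first
        j = int(np.argmin(cost[i]))
        if cost[i, j] <= gate and j not in used_j:
            pairs.append((int(i), j))
            used_j.add(j)
    return pairs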
As an optional embodiment of the present application, for updating the data pair with successful association matching, the apparatus further includes:
The combination submodule is used for performing data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and performing data updating according to a data combination substitution mode. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
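The following hedged example illustrates one plausible combine-and-substitute rule: keeping the lidar position and extent while substituting the radar velocity. Which field comes from which sensor is only an assumption about typical sensor strengths, not a statement of the embodiment, and the field names are hypothetical:

def merge_pair(radar_target, lidar_target):
    """Merge one associated radar/lidar pair by field-wise substitution.

    Assumes (illustratively) that lidar gives the more accurate position and
    extent, while millimeter-wave radar gives the more accurate velocity.
    """
    merged = dict(lidar_target)                      # start from the lidar measurement
    merged["velocity"] = radar_target["velocity"]    # substitute the radar velocity
    merged["sources"] = ["radar", "lidar"]
    return merged

fused = merge_pair(
    {"x": 10.1, "y": 1.6, "velocity": 5.2},
    {"x": 9.9, "y": 1.5, "length": 4.3, "velocity": 4.0},
)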
As an optional implementation manner of the present application, the first time compensation submodule includes:
The first acquisition submodule is used for acquiring a target timestamp difference value, and determining the first sensor target level data and the second sensor target level data corresponding to the target timestamp difference value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The second acquisition submodule is used for acquiring the position information and the motion information contained in the first sensor target level data or the second sensor target level data corresponding to the smaller (earlier) timestamp of the target timestamp difference value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The prediction submodule is used for predicting the predicted position information after the target timestamp difference value has elapsed, according to the position information, the motion information and the target timestamp difference value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The updating submodule is used for taking the predicted position information as the measured value at the current moment, and updating the timestamp of the first sensor or the second sensor to the larger (later) timestamp of the target timestamp difference value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
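A minimal constant-velocity sketch of this compensation step; the 2D position/velocity layout is a simplifying assumption and the names are hypothetical:

def time_compensate(position, velocity, dt):
    """Predict where a target will be after dt seconds, assuming constant velocity.

    position / velocity: (x, y) tuples from the earlier of the two frames.
    dt: the target timestamp difference (later timestamp minus earlier timestamp).
    Returns the predicted position, used as the measurement at the later timestamp.
    """
    return (position[0] + velocity[0] * dt, position[1] + velocity[1] * dt)

# The earlier frame (e.g. radar) is pushed forward to the lidar timestamp.
predicted = time_compensate(position=(10.0, 1.5), velocity=(5.0, 0.0), dt=0.04)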
As an optional embodiment of the present application, when the first sensor target level data and the second sensor target level data are target position data, and the third sensor target level data is target image data, the second fusion module includes:
The second calculating submodule is used for calculating the time difference between the latest frame of primary fusion data and the latest frame of third sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The second judgment submodule is used for judging whether the time difference is greater than a preset threshold value. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The third fusion submodule is used for performing association matching on the primary fusion data and the third sensor target level data when the time difference is less than or equal to the preset threshold value, and updating the data pair with successful association matching to obtain secondary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The fourth fusion submodule is used for selecting the latest time data from the primary fusion data and the third sensor target level data when the time difference is greater than the preset threshold value, and taking the selected latest time data as the secondary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an optional embodiment of the present application, after the association matching of the primary fusion data and the third sensor target level data, the apparatus further includes:
The second filtering submodule is used for filtering the primary fusion data or the third sensor target level data for which association failed, to obtain the secondary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
As an optional embodiment of the present application, the second fusion module 23 includes:
The second time compensation submodule is used for performing time compensation on the primary fusion data and the third sensor target level data, in a manner that the primary fusion data is compensated to the third sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The second establishing submodule is used for converting the time-compensated primary fusion data into the coordinate system corresponding to the third sensor target level data, and establishing an incidence matrix between the primary fusion data and the third sensor target level data according to the coordinate system information corresponding to the third sensor target level data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
The second association submodule is used for determining the matching relation between the primary fusion data and the third sensor target level data based on global optimal matching, to obtain matching data pairs of the primary fusion data and the third sensor target level data corresponding to the same target. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
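For illustration, a sketch of this step under the assumptions of a pinhole camera model and the Hungarian algorithm (scipy.optimize.linear_sum_assignment) as the global optimal matching; the intrinsics, extrinsics and pixel gate are placeholders, not values from the embodiments:

import numpy as np
from scipy.optimize import linear_sum_assignment

def match_to_image(fused_xyz, image_boxes, K, R, t, max_pixel_dist=50.0):
    """Project fused 3D targets into the image and match them to camera detections.

    fused_xyz: (N, 3) target positions in the vehicle frame (assumed in front of the camera).
    image_boxes: (M, 2) centres of image detections in pixels.
    K: (3, 3) camera intrinsics; R, t: rotation and translation from vehicle to camera frame.
    """
    cam = fused_xyz @ R.T + t                      # vehicle frame -> camera frame
    pix = cam @ K.T
    pix = pix[:, :2] / pix[:, 2:3]                 # perspective division -> pixel coordinates

    cost = np.linalg.norm(pix[:, None, :] - image_boxes[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)       # globally optimal assignment
    return [(int(r), int(c)) for r, c in zip(rows, cols) if cost[r, c] <= max_pixel_dist]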
As an optional embodiment of the present application, for updating the data pair with successful association matching, the apparatus further includes:
The correction submodule is used for correcting and eliminating the primary fusion data according to the matching data pair by using the acquired third sensor target level data, and performing data updating according to the corrected and eliminated primary fusion data. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
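A hedged sketch of one possible correction-and-elimination rule, in which matched targets take the camera's class label and unmatched low-confidence targets are discarded; the field names and the confidence rule are assumptions for illustration only:

def correct_and_prune(primary_targets, image_targets, matches, min_confidence=0.3):
    """Correct matched primary-fusion targets with camera attributes and drop weak unmatched ones.

    primary_targets: list of dicts from the radar/lidar fusion stage.
    image_targets: list of dicts from the vision sensor (e.g. with a 'label').
    matches: list of (primary_index, image_index) pairs from the association step.
    """
    matched = {i for i, _ in matches}
    for i, j in matches:
        primary_targets[i]["label"] = image_targets[j]["label"]   # correction from the camera
    # Eliminate unmatched targets whose own confidence is low (likely clutter).
    return [t for k, t in enumerate(primary_targets)
            if k in matched or t.get("confidence", 1.0) >= min_confidence]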
As an optional embodiment of the present application, after the second fusion module 23, the apparatus further includes:
The tracking module is used for performing multi-target tracking on the secondary fusion data and determining a target driving track containing the multiple targets. For details, reference is made to the description of the above method embodiments, which are not repeated herein.
Example 3
An embodiment of the present invention further provides a computer device, as shown in fig. 4, the device includes a processor 31 and a memory 32, where the processor 31 and the memory 32 may be connected by a bus or in another manner, and fig. 4 takes the connection by the bus 30 as an example.
The processor 31 may be a Central Processing Unit (CPU). The processor 31 may also be another general-purpose processor, a Digital Signal Processor (DSP), a Graphics Processing Unit (GPU), an embedded Neural Network Processor (NPU) or other dedicated deep learning coprocessor, an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or any combination thereof.
The memory 32, as a non-transitory computer readable storage medium, may be used for storing non-transitory software programs, non-transitory computer executable programs, and modules, such as the program instructions/modules corresponding to the data fusion method in the embodiment of the present invention (e.g., the obtaining module 21, the first fusion module 22, and the second fusion module 23 shown in fig. 3). The processor 31 executes various functional applications and data processing by running the non-transitory software programs, instructions and modules stored in the memory 32, that is, implements the data fusion method in the above method embodiments.
The memory 32 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created by the processor 31, and the like. Further, the memory 32 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid state storage device. In some embodiments, the memory 32 may optionally include memory located remotely from the processor 31, and these remote memories may be connected to the processor 31 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 32 and, when executed by the processor 31, perform the data fusion method in the embodiment shown in fig. 1-2.
The method respectively acquires the first sensor target level data, the second sensor target level data and the third sensor target level data, fuses the first sensor target level data with the second sensor target level data to obtain primary fusion data, and then fuses the primary fusion data with the third sensor target level data to obtain secondary fusion data. In the target level data fusion process, the characteristics of the first, second and third sensors are fully utilized so that each sensor can exert its own advantages; fusing the third sensor target level data with the primary fusion data corrects the primary fusion data and thereby ensures the accuracy of the fused data. Because fusion is performed on sensor target level data, the computing time in the data fusion process is reduced and the data fusion efficiency is improved, which in turn improves real-time performance.
The details of the computer device can be understood with reference to the corresponding descriptions and effects in the embodiments shown in fig. 1 to fig. 2, and are not described herein again.
An embodiment of the present invention further provides a non-transitory computer storage medium, where the computer storage medium stores computer executable instructions that can execute the data fusion method in any of the above method embodiments. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory, a Hard Disk Drive (HDD), a Solid State Drive (SSD), or the like; the storage medium may also include a combination of the above kinds of memories.

It should be understood that the above embodiments are given only for clarity of illustration and are not intended to limit the implementations; they are neither required nor exhaustive of all embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description, and obvious variations or modifications derived therefrom remain within the scope of the invention.

Claims (17)

1. A method of data fusion, comprising:
respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data;
processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data;
and processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
2. The data fusion method of claim 1, further comprising, when respectively acquiring the first sensor target level data, the second sensor target level data and the third sensor target level data:
time stamping the acquired first sensor target level data, the acquired second sensor target level data and the acquired third sensor target level data.
3. The data fusion method of claim 1, wherein when the first sensor target level data and the second sensor target level data are target location data and the first sensor and the second sensor are of different types, prior to fusing the first sensor target level data and the second sensor target level data, further comprising:
and performing coordinate conversion on the first sensor target level data and the second sensor target level data to obtain the first sensor target level data and the second sensor target level data after space synchronization.
4. The data fusion method of claim 3, wherein processing the first sensor target level data and the second sensor target level data to obtain primary fusion data comprises:
calculating a time difference between a latest frame of the first sensor target level data and a latest frame of the second sensor target level data;
judging whether the time difference is larger than a preset threshold value or not;
when the time difference is smaller than or equal to the preset threshold, performing association matching on the first sensor target level data and the second sensor target level data, and updating the data pair with successful association matching to obtain the primary fusion data;
and when the time difference is larger than the preset threshold value, selecting latest time data from the first sensor target level data and the second sensor target level data, and taking the selected latest time data as the primary fusion data.
5. The data fusion method of claim 1, further comprising, after the associative matching of the first sensor target level data and the second sensor target level data:
and filtering the first sensor target level data or the second sensor target level data for which association failed, to obtain the primary fusion data.
6. The data fusion method of claim 4, wherein, when the first sensor target level data and the second sensor target level data contain one or more feature quantities, the associative matching of the first sensor target level data and the second sensor target level data comprises:
time compensating the first sensor target level data and the second sensor target level data;
determining the same characteristic quantity in the time-compensated first sensor target level data and the second sensor target level data;
establishing a correlation matrix between the first sensor target level data and the second sensor target level data according to the same characteristic quantity based on similarity calculation;
and determining a matching relation between the first sensor target level data and the second sensor target level data based on the incidence matrix, and forming incidence data pairs by the first sensor target level data and the second sensor target level data which belong to the same target.
7. The data fusion method of claim 4, wherein updating the data pair with successful association matching comprises:
and performing data combination on the associated data pairs based on the characteristics of the first sensor and the second sensor, and performing data updating according to a data combination substitution mode.
8. The data fusion method of claim 6, wherein the time compensating the first sensor target level data and the second sensor target level data comprises:
acquiring a target timestamp difference value, and determining the first sensor target level data and the second sensor target level data corresponding to the target timestamp difference value;
acquiring position information and motion information contained in first sensor target-level data or second sensor target-level data corresponding to the smaller timestamp in the target timestamp difference;
predicting predicted position information after the target timestamp difference value is passed according to the position information, the motion information and the target timestamp difference value;
and updating the timestamp of the first sensor or the second sensor to be the larger timestamp in the difference value of the target timestamps by taking the predicted position information as the measured value at the current moment.
9. The data fusion method of claim 1, wherein processing the primary fusion data and the third sensor target level data when the first sensor target level data and the second sensor target level data are target location data and the third sensor target level data are target image data comprises:
calculating the time difference between the latest frame of the primary fusion data and the latest frame of the third sensor target level data;
judging whether the time difference is larger than a preset threshold value or not;
when the time difference is smaller than or equal to the preset threshold value, performing association matching on the primary fusion data and the third sensor target level data, and updating the data pair with successful association matching to obtain the secondary fusion data;
and when the time difference is larger than the preset threshold value, selecting latest time data from the primary fusion data and the third sensor target level data, and taking the latest time data as the secondary fusion data.
10. The data fusion method of claim 9, further comprising, after the associative matching of the primary fused data and the third sensor target level data:
and filtering the primary fusion data or the third sensor target level data for which association failed, to obtain the secondary fusion data.
11. The data fusion method of claim 9, wherein the associative matching of the primary fusion data and the third sensor target level data comprises:
time compensating the primary fused data and the third sensor target level data in a manner that the primary fused data compensates to the third sensor target level data;
converting the time-compensated primary fusion data into a coordinate system corresponding to the target level data of the third sensor, and establishing an incidence matrix between the primary fusion data and the target level of the third sensor according to coordinate system information corresponding to the target level data of the third sensor;
and determining a matching relation between the primary fusion data and the third sensor target-level data based on global optimal matching to obtain a matching data pair corresponding to the primary fusion data and the third sensor target-level data of the same target.
12. The data fusion method of claim 11, wherein updating the matching data pair with successful association matching comprises:
and correcting and eliminating the primary fusion data by adopting the acquired target-level data of the third sensor according to the matching data pair, and updating data according to the corrected and eliminated primary fusion data.
13. The data fusion method of claim 1, further comprising, after processing the primary fusion data with the third sensor target level data to obtain secondary fusion data:
and carrying out multi-target tracking on the secondary fusion data, and determining a target driving track containing the multiple targets.
14. The data fusion method of claim 1, wherein:
the first sensor is a millimeter wave radar sensor;
the second sensor is a laser radar sensor;
the third sensor is a vision sensor.
15. A data fusion apparatus, comprising:
the acquisition module is used for respectively acquiring first sensor target level data, second sensor target level data and third sensor target level data;
the first fusion module is used for processing the first sensor target level data and the second sensor target level data to obtain primary fusion data; wherein the first sensor target level data is of the same data type as the second sensor target level data;
and the second fusion module is used for processing the primary fusion data and the third sensor target level data to obtain secondary fusion data.
16. A computer device, comprising: a memory and a processor, the memory and the processor being communicatively connected to each other, the memory having stored therein computer instructions, the processor executing the computer instructions to perform the data fusion method of any one of claims 1-14.
17. A computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the data fusion method of any one of claims 1-14.
CN202010584334.3A 2020-06-23 2020-06-23 Data fusion method, device, system and computer equipment Active CN111753901B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010584334.3A CN111753901B (en) 2020-06-23 2020-06-23 Data fusion method, device, system and computer equipment

Publications (2)

Publication Number Publication Date
CN111753901A true CN111753901A (en) 2020-10-09
CN111753901B CN111753901B (en) 2023-08-15

Family

ID=72676671

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010584334.3A Active CN111753901B (en) 2020-06-23 2020-06-23 Data fusion method, device, system and computer equipment

Country Status (1)

Country Link
CN (1) CN111753901B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465065A (en) * 2020-12-11 2021-03-09 中国第一汽车股份有限公司 Sensor data association method, device, equipment and storage medium
CN116166939A (en) * 2023-02-09 2023-05-26 浙江九州云信息科技有限公司 Data preprocessing method and system based on vehicle-road cooperation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108037515A (en) * 2017-12-27 2018-05-15 清华大学苏州汽车研究院(吴江) A kind of laser radar and ultrasonic radar information fusion system and method
CN109615870A (en) * 2018-12-29 2019-04-12 南京慧尔视智能科技有限公司 A kind of traffic detection system based on millimetre-wave radar and video
CN109871385A (en) * 2019-02-28 2019-06-11 北京百度网讯科技有限公司 Method and apparatus for handling data
CN110414396A (en) * 2019-07-19 2019-11-05 中国人民解放军海军工程大学 A kind of unmanned boat perception blending algorithm based on deep learning
CN110866544A (en) * 2019-10-28 2020-03-06 杭州飞步科技有限公司 Sensor data fusion method and device and storage medium
CN111090095A (en) * 2019-12-24 2020-05-01 联创汽车电子有限公司 Information fusion environment perception system and perception method thereof
CN111222568A (en) * 2020-01-03 2020-06-02 北京汽车集团有限公司 Vehicle networking data fusion method and device

Also Published As

Publication number Publication date
CN111753901B (en) 2023-08-15

Similar Documents

Publication Publication Date Title
WO2022022694A1 (en) Method and system for sensing automated driving environment
CN109920246B (en) Collaborative local path planning method based on V2X communication and binocular vision
CN109870689B (en) Lane-level positioning method and system based on matching of millimeter wave radar and high-precision vector map
US20220371602A1 (en) Vehicle positioning method, apparatus, and controller, intelligent vehicle, and system
CN111563450B (en) Data processing method, device, equipment and storage medium
US11144770B2 (en) Method and device for positioning vehicle, device, and computer readable storage medium
WO2019208101A1 (en) Position estimating device
CN111080784B (en) Ground three-dimensional reconstruction method and device based on ground image texture
CN112562093B (en) Object detection method, electronic medium, and computer storage medium
CN115797454B (en) Multi-camera fusion sensing method and device under bird's eye view angle
CN114898314B (en) Method, device, equipment and storage medium for detecting target of driving scene
US11908206B2 (en) Compensation for vertical road curvature in road geometry estimation
CN111753901B (en) Data fusion method, device, system and computer equipment
CN114829971A (en) Laser radar calibration method and device and storage medium
CN114758504A (en) Online vehicle overspeed early warning method and system based on filtering correction
CN111612818A (en) Novel binocular vision multi-target tracking method and system
CN114694111A (en) Vehicle positioning
CN112241167A (en) Information processing method and device in automatic driving and storage medium
CN116990776A (en) Laser radar point cloud compensation method and device, electronic equipment and storage medium
CN114926799A (en) Lane line detection method, device, equipment and readable storage medium
CN116148820A (en) Laser radar calibration method, computer equipment, readable storage medium and motor vehicle
CN114755663A (en) External reference calibration method and device for vehicle sensor and computer readable storage medium
CN114758200A (en) Multi-sensing data fusion method, multi-source fusion perception system and computer equipment
CN113312403A (en) Map acquisition method and device, electronic equipment and storage medium
CN113390422B (en) Automobile positioning method and device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant