CN111191734B - Sensor data fusion method, device, equipment and storage medium - Google Patents

Sensor data fusion method, device, equipment and storage medium Download PDF

Info

Publication number
CN111191734B
CN111191734B (application CN202010006425.9A)
Authority
CN
China
Prior art keywords
sensor
type
time
moment
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010006425.9A
Other languages
Chinese (zh)
Other versions
CN111191734A (en)
Inventor
孙靓
陈新
李彪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Automotive Group Co Ltd
Beijing Automotive Research Institute Co Ltd
Original Assignee
Beijing Automotive Group Co Ltd
Beijing Automotive Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Automotive Group Co Ltd, Beijing Automotive Research Institute Co Ltd filed Critical Beijing Automotive Group Co Ltd
Priority to CN202010006425.9A priority Critical patent/CN111191734B/en
Publication of CN111191734A publication Critical patent/CN111191734A/en
Application granted granted Critical
Publication of CN111191734B publication Critical patent/CN111191734B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 Combination of radar systems with lidar systems
    • G01S13/867 Combination of radar systems with cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a sensor data fusion method, device, equipment and storage medium. The sensor data fusion method comprises the following steps: mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time; obtaining a second-time state estimate corresponding to each type of sensor according to the first-time state estimate corresponding to each type of sensor and the second-time observation data of each type of sensor; and merging the second-time state estimates corresponding to each type of sensor to obtain a second-time total state estimate corresponding to all the sensors. According to the application, the observation data of multiple types of sensors can be fused to obtain more accurate data, and based on the fused data an automatic driving control device can make more reasonable control decisions or track the target vehicle more accurately.

Description

Sensor data fusion method, device, equipment and storage medium
Technical Field
The application relates to the technical field of automatic driving, in particular to a sensor data fusion method, device, equipment and storage medium.
Background
Currently, automatic driving technology allows a computer to operate a vehicle automatically and safely without human driving control, thereby realizing automatic driving of the vehicle.
In general, in order to realize automatic driving of a vehicle, multiple types of sensors are required to acquire driving data of the vehicle in real time, so that control decisions for the vehicle can be generated based on the driving data. In this process, the characteristics of each type of sensor may differ, and the sensed vehicle data therefore also differ. How to fuse the sensed data of each type of sensor according to the characteristics of each type of sensor, so as to make more reasonable control decisions, is thus one of the problems to be solved in current automatic driving technology.
Disclosure of Invention
The embodiment of the application aims to disclose a sensor data fusion method, device, equipment and storage medium, which are used for fusing the observation data of multiple types of sensors to obtain more accurate data, so that an automatic driving control device can make more reasonable control decisions or more accurately track a target vehicle based on the fused data.
The first aspect of the application discloses a sensor data fusion method, which comprises the following steps:
mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time;
obtaining a second-time state estimate corresponding to each type of sensor according to the first-time state estimate corresponding to each type of sensor and the second-time observation data of each type of sensor;
and merging the second-time state estimates corresponding to each type of sensor to obtain a second-time total state estimate corresponding to all the sensors.
According to the application, the observation data of a plurality of sensors for the same target observation object are fused, so that measurement errors caused by sensor characteristics and external factors can be counteracted; accurate data of the target observation object is then obtained through the complementarity among the characteristics of the plurality of sensors, so that a control decision can be made based on the accurate data of the target observation object.
In the first aspect of the present application, as an optional manner, before merging the second-time state estimates corresponding to each type of sensor, the method further includes:
acquiring the first-time system state control quantities corresponding to all the sensors;
and obtaining the second-time system state control quantities corresponding to all the sensors according to the first-time system state control quantities.
In this optional manner, the system state control quantity can reflect the association between the target observation object and objects other than the target observation object. Obtaining the second-time system state control quantity from the first-time system state control quantity therefore makes it easier to subsequently calculate, from the second-time system state control quantity, the degree to which other objects influence the target observation object, which further improves the accuracy of the second-time total state estimate. Meanwhile, obtaining the second-time system state control quantity from the first-time system state control quantity improves the robustness of the sensor data fusion method.
In the first aspect of the present application, as an optional manner, the step of merging the second time state estimates corresponding to each type of sensor to obtain the total second time state estimate corresponding to all the sensors includes the following substeps:
and obtaining second moment total state estimation corresponding to all the sensors according to the second moment state estimation corresponding to each type of sensor and the second moment system state control quantity corresponding to all the sensors.
In this alternative, the second time system state control quantity can further improve the accuracy of the output second time total state estimation.
In the first aspect of the present application, as an optional manner, the step of mixing the first time observation data of at least two types of sensors for the target observation object and obtaining the first time state estimate corresponding to each type of sensor includes the sub-steps of:
calculating to obtain a first time system state control quantity corresponding to each type of sensor according to a transition probability matrix corresponding to each type of sensor and a system initial state control quantity in a Markov process;
and obtaining first time state estimation corresponding to each type of sensor according to the first time system state control quantity corresponding to each type of sensor and the corresponding first time observation data of each type of sensor.
In this optional implementation, the transition probability matrix corresponding to each type of sensor in the Markov process can take the relationship between the sensor types into account as an influencing factor, thereby improving the accuracy of the first-time state estimate.
In the first aspect of the present application, as an optional manner, the step of merging the second time state estimates corresponding to each type of sensor to obtain the total second time state estimate corresponding to all the sensors includes the following substeps:
acquiring a preset weight value corresponding to each type of sensor;
and merging the second moment state estimation corresponding to each type of sensor according to the preset weight value corresponding to each type of sensor so as to obtain the second moment total state estimation corresponding to all the sensors.
According to the method, the influence weights of different types of sensors in specific scenes can be simulated by using the preset weight values, and further the total state estimation at the second moment generated based on the preset weight values can be suitable for different scenes, so that the robustness of the sensor data fusion method and the accuracy of the total state estimation at the second moment are further improved.
The second aspect of the application discloses another sensor data fusion method, which comprises the following steps:
mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time total state estimate corresponding to all the sensors, wherein the first time is the time immediately preceding the second time;
and obtaining a second-time total state estimate corresponding to all the sensors according to the first-time total state estimate corresponding to all the sensors.
According to the sensor data fusion method disclosed in the second aspect of the application, multiple types of sensor data can be fused based on a centralized data fusion structure, in which the sensor data are concentrated on a single server node and fused there; high-quality computing resources can therefore be concentrated on that server node, which reduces equipment deployment cost.
A third aspect of the present application discloses a sensor data fusion device, the device comprising:
The first mixing module is used for mixing first-time observation data of at least two types of sensors for a target observation object and obtaining a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time;
the first calculation module is used for obtaining a second-time state estimate corresponding to each type of sensor according to the first-time state estimate corresponding to each type of sensor and the second-time observation data of each type of sensor;
and the merging module is used for merging the second-time state estimates corresponding to each type of sensor to obtain a second-time total state estimate corresponding to all the sensors.
By executing the above sensor data fusion method, the sensor data fusion device fuses the observation data of a plurality of sensors for the same target observation object, so that measurement errors caused by sensor characteristics and external factors can be counteracted; accurate data of the target observation object is then obtained through the complementarity among the characteristics of the plurality of sensors, so that a control decision can be made based on the accurate data of the target observation object.
In a fourth aspect, the present application discloses a sensor data fusion device, which includes:
The second mixing module is used for mixing first-time observation data of at least two types of sensors for a target observation object and obtaining a first-time total state estimate corresponding to all the sensors, wherein the first time is the time immediately preceding the second time;
and the second calculation module is used for obtaining a second-time total state estimate corresponding to all the sensors according to the first-time total state estimate corresponding to all the sensors.
According to the sensor data fusion device disclosed in the fourth aspect of the application, by executing the sensor data fusion method disclosed in the second aspect of the application, multiple types of sensor data can be fused based on a centralized data fusion structure, in which the sensor data are concentrated on a single server node and fused there; high-quality computing resources can therefore be concentrated on that server node, which reduces equipment deployment cost.
A fifth aspect of the application discloses a sensor data fusion device comprising:
A processor; and
A memory configured to store machine readable instructions that, when executed by the processor, cause the processor to perform the sensor data fusion method of the first aspect or the second aspect of the application.
A sixth aspect of the present application discloses a computer storage medium storing a computer program which, when executed by a processor, performs the sensor data fusion method of the first aspect or the second aspect of the present application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and should not be considered as limiting the scope, and other related drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of a sensing area formed by a plurality of sensors;
FIG. 2 is a diagram showing the result of fusion of radar data and camera data;
FIG. 3 is a schematic diagram of a sensor data fusion framework;
FIG. 4 is a schematic flow chart of a sensor data fusion method disclosed in the first embodiment of the application;
FIG. 5 is a schematic flow chart of substeps of step 101;
FIG. 6 is a schematic flow chart of substeps of step 105;
fig. 7 is a schematic flow chart of a sensor data fusion method disclosed in the second embodiment of the present application;
FIG. 8 is a schematic structural diagram of a sensor data fusion device according to a third embodiment of the present application;
FIG. 9 is a schematic structural diagram of a sensor data fusion device according to a fourth embodiment of the present application;
Fig. 10 is a schematic structural diagram of a sensor data fusion device according to a fifth embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
In the existing autopilot field, in order to obtain real-time data of an autopilot vehicle, a plurality of sensors are generally used to capture the real-time data of the autopilot vehicle. Alternatively, the sensors may include one or more of lidar, millimeter wave radar, cameras, positioning sensors.
In some scenes, millimeter-wave radars are installed at the four corners of the autonomous vehicle to cover the corner viewing angles of the vehicle; a millimeter-wave radar and a four-line lidar (IBE 04) can also be installed at the front guard plate of the autonomous vehicle, multi-line lidars are installed at the front-left and front-right sides of the vehicle, a camera is installed at the front windshield, inertial navigation and V2X devices are placed in the trunk, and the antennas of the inertial navigation and V2X devices are installed near the sunroof of the vehicle. In this way, the sensors form a sensing area as shown in FIG. 1. Specifically, the sensors may be mounted in accordance with the sensor mounting table shown in Table 1.
Sensors installed in this way can acquire more observation data, and the control decision system then generates decision information, such as automatic tracking information, based on the observation data. However, the observation data of each sensor differs from the true data of the target, and the degree of the difference is related to the sensor's own characteristics. For example, radar observations of the target's longitudinal distance are relatively accurate, while its observations in the lateral direction carry a relatively large error; the camera is exactly the opposite.
Therefore, in order to generate high-precision observation data, the observation data of the various sensors need to be fused. For example, a fusion result can be obtained from the probability distribution function of the radar observation data and the probability distribution function of the camera observation data, as shown in FIG. 2; observation data falling into the fusion result area are compatible with the characteristics of both the radar and the camera and therefore have high accuracy.
It should be noted that, as will be appreciated by those of ordinary skill in the art, none of the sensors on an autonomous vehicle performs its intended function perfectly (i.e., the observed data deviate from the true data); each sensor only provides an estimate of the observed target, consisting of an estimated value and a standard deviation. Thus, the present application treats the observation data of a sensor as an estimate of the observation target. For example, the radar's observation of the longitudinal distance of the autonomous vehicle corresponds to the estimate "15 m longitudinally, +/-0.25 m", where "15 m" is the estimated value and "+/-0.25 m" is the standard deviation.
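To make the idea of combining two such estimates concrete, the following Python sketch fuses a radar estimate and a camera estimate of the same quantity by inverse-variance weighting of their probability distributions, in the spirit of the fusion region shown in FIG. 2. The function name and all numeric values are illustrative assumptions, not data from the application.

```python
import numpy as np

def fuse_gaussian(mean_a, std_a, mean_b, std_b):
    """Fuse two independent Gaussian estimates of the same quantity.

    Each sensor reports an estimated value and a standard deviation,
    e.g. the radar's "15 m longitudinally, +/-0.25 m". The fused estimate
    weights each sensor by the inverse of its variance, so the more
    certain sensor dominates.
    """
    w_a = 1.0 / std_a ** 2
    w_b = 1.0 / std_b ** 2
    mean = (w_a * mean_a + w_b * mean_b) / (w_a + w_b)
    std = np.sqrt(1.0 / (w_a + w_b))
    return mean, std

# Illustrative values: radar is precise longitudinally, camera laterally.
lon, lon_std = fuse_gaussian(15.0, 0.25, 14.6, 0.80)   # longitudinal distance
lat, lat_std = fuse_gaussian(1.9, 0.60, 2.1, 0.15)     # lateral offset
print(f"longitudinal: {lon:.2f} m +/- {lon_std:.2f} m")
print(f"lateral:      {lat:.2f} m +/- {lat_std:.2f} m")
```

The fused value always lies between the two sensor estimates and has a smaller standard deviation than either, which is what the overlap region in FIG. 2 illustrates.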
The specific manner in which the sensor data is fused will be described in detail below in conjunction with the foregoing.
Example 1
Referring to FIG. 4, FIG. 4 is a schematic flow chart of a sensor data fusion method according to the first embodiment of the application; the method includes the following steps:
101. Mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time.
102. Obtaining a second-time state estimate corresponding to each type of sensor according to the first-time state estimate corresponding to each type of sensor and the second-time observation data of each type of sensor.
105. Merging the second-time state estimates corresponding to each type of sensor to obtain a second-time total state estimate corresponding to all the sensors.
For example, referring to FIG. 3, x^(1)(k-1|k-1) represents the observation data of the radar at time k-1 and x^(2)(k-1|k-1) represents the observation data of the camera at time k-1; mode i and mode j are obtained after mixing, where mode i represents the state estimate of the radar at time k-1 and mode j represents the state estimate of the camera at time k-1. Further, the state estimate of the radar at time k and the state estimate of the camera at time k are obtained from the observation data of the radar and the camera at time k, and the total state estimate at time k corresponding to all the sensors is then obtained by merging.
Specifically, the total state estimate at time k corresponding to all the sensors can be calculated according to the following formula:

x(k|k) = Σ_{j=1}^{r} x_j(k|k) · μ_j(k)

where x(k|k) represents the total state estimate at time k corresponding to all the sensors, r represents the number of sensor types, j denotes one sensor type, x_j(k|k) represents the state estimate of the j-type sensor at time k, and μ_j(k) represents the second-time system state control quantity corresponding to the j-type sensor (the second-time system state control quantity is described below).

Specifically, μ_j(k) can be calculated by the following formula:

μ_j(k) = P[M_j(k) | Z^k] = (1/c) · L_j(k) · Σ_{i=1}^{r} p_{ij} · μ_i(k-1), where c normalizes the μ_j(k) so that they sum to 1;

here Z^k represents the set of observation data of all the sensors up to time k, M_j(k) is an intermediate quantity, namely the vector representing mode j at time k, and L_j(k) is the likelihood function of the j-th of the r Kalman filters.

Specifically, L_j(k) can be calculated by the following formula:

L_j(k) = P[z(k) | M_j(k), x_{0j}(k-1|k-1), P_{0j}(k-1|k-1)];

where z(k) represents the observation data of the j-type sensor at time k, x_{0j}(k-1|k-1) represents the mixed state estimate corresponding to the j-type sensor from the initial time 0 up to time k-1, and P_{0j}(k-1|k-1) represents the corresponding mixed state-estimate covariance matrix over all the sensors from the initial time 0 up to time k-1.

More specifically, P_{0j}(k-1|k-1) can be calculated by the following formula:

P_{0j}(k-1|k-1) = Σ_{i=1}^{r} μ_{i|j}(k-1|k-1) · { P^{(i)}(k-1|k-1) + [x^{(i)}(k-1|k-1) - x_{0j}(k-1|k-1)] [x^{(i)}(k-1|k-1) - x_{0j}(k-1|k-1)]^T }

where μ_{i|j}(k-1|k-1) represents the state control quantity at time k-1 corresponding to the j-type sensor, x^{(i)}(k-1|k-1) represents the observation data (state estimate) of the i-th sensor and P^{(i)}(k-1|k-1) its covariance, and x_{0j}(k-1|k-1) represents the mixed state estimate of the j-type sensor at time k-1.

More specifically, x_{0j}(k-1|k-1) is calculated by the following formula:

x_{0j}(k-1|k-1) = Σ_{i=1}^{r} x^{(i)}(k-1|k-1) · μ_{i|j}(k-1|k-1), with μ_{i|j}(k-1|k-1) = (1/c_j) · p_{ij} · μ_i(k-1)

where μ_{i|j}(k-1|k-1) represents the system state control quantity at the first time (time k-1) corresponding to the j-type sensor, p_{ij} is the transition probability matrix corresponding to each type of sensor in the Markov process, c_j is a normalization constant, k denotes time k, and μ_i(k-1) represents the initial system state control quantity at time k-1.

In some alternative embodiments, the covariance of the total state estimate at time k corresponding to all the sensors can be calculated by the following formula:

P(k|k) = Σ_{j=1}^{r} μ_j(k) · { P_j(k|k) + [x_j(k|k) - x(k|k)] [x_j(k|k) - x(k|k)]^T }
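Read together, these formulas follow the mixing, filtering, probability-update and combination cycle of an interacting multiple model (IMM)-style estimator, with one mode per sensor type. The Python sketch below implements one such cycle for a scalar state under simplifying assumptions: linear models shared across modes, one scalar observation per sensor type, and illustrative numbers; the function name and every value are assumptions for illustration, not taken from the application.

```python
import numpy as np

def gauss_pdf(x, mean, var):
    """Scalar Gaussian density, used as the filter likelihood L_j(k)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def imm_fusion_step(x_prev, P_prev, mu_prev, p_trans, z, F, Q, H, R):
    """One fusion cycle over r sensor-matched Kalman filters.

    x_prev, P_prev : shape (r,), previous state estimates and variances
    mu_prev        : shape (r,), previous system state control quantities mu_i(k-1)
    p_trans        : shape (r, r), Markov transition probability matrix p_ij
    z              : shape (r,), observation of each sensor type at time k
    F, Q, H, R     : scalar process/measurement model parameters (shared here)
    """
    r = len(x_prev)

    # 1) Mixing: mu_{i|j}(k-1|k-1), mixed estimates x_0j and variances P_0j.
    c_j = p_trans.T @ mu_prev                        # normalization constants c_j
    mu_mix = (p_trans * mu_prev[:, None]) / c_j      # mu_mix[i, j] = p_ij * mu_i / c_j
    x0 = mu_mix.T @ x_prev                           # x_0j(k-1|k-1)
    P0 = np.array([np.sum(mu_mix[:, j] * (P_prev + (x_prev - x0[j]) ** 2))
                   for j in range(r)])               # P_0j(k-1|k-1)

    # 2) Sensor-matched Kalman filters: predict and update with z_j(k).
    x_pred, P_pred = F * x0, F * P0 * F + Q
    S = H * P_pred * H + R                           # innovation variance
    K = P_pred * H / S                               # Kalman gain
    x_upd = x_pred + K * (z - H * x_pred)            # x_j(k|k)
    P_upd = (1.0 - K * H) * P_pred                   # P_j(k|k)

    # 3) Mode probability update: likelihoods L_j(k) and mu_j(k).
    L = gauss_pdf(z, H * x_pred, S)
    mu = L * c_j
    mu = mu / mu.sum()

    # 4) Combination: total state estimate and its covariance.
    x_tot = np.sum(mu * x_upd)
    P_tot = np.sum(mu * (P_upd + (x_upd - x_tot) ** 2))
    return x_tot, P_tot, x_upd, P_upd, mu

# Illustrative call with two sensor types (e.g. radar and camera).
p_trans = np.array([[0.9, 0.1],
                    [0.2, 0.8]])
x_tot, P_tot, x_upd, P_upd, mu = imm_fusion_step(
    x_prev=np.array([15.0, 14.6]), P_prev=np.array([0.06, 0.60]),
    mu_prev=np.array([0.5, 0.5]), p_trans=p_trans,
    z=np.array([14.9, 14.5]), F=1.0, Q=0.01, H=1.0, R=0.1)
print("total state estimate at time k:", x_tot, "variance:", P_tot)
```

In this sketch the normalized mode probabilities mu play the role of the second-time system state control quantities μ_j(k), and x_tot and P_tot correspond to x(k|k) and P(k|k) in the formulas above.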
According to the application, the observation data of a plurality of sensors for the same target observation object are fused, so that measurement errors caused by sensor characteristics and external factors can be counteracted; accurate data of the target observation object is then obtained through the complementarity among the characteristics of the plurality of sensors, so that a control decision can be made based on the accurate data of the target observation object.
Referring to fig. 4, as shown in fig. 4, before combining the second time state estimates corresponding to each type of sensor in step 105, the method further includes the steps of:
103. Acquiring the first-time system state control quantities corresponding to all the sensors.
104. Obtaining the second-time system state control quantities corresponding to all the sensors according to the first-time system state control quantities.
In this optional embodiment, the system state control quantity can reflect the association between the target observation object and objects other than the target observation object. Obtaining the second-time system state control quantity from the first-time system state control quantity therefore makes it easier to subsequently calculate, from the second-time system state control quantity, the degree to which other objects influence the target observation object, which further improves the accuracy of the second-time total state estimate. Meanwhile, obtaining the second-time system state control quantity from the first-time system state control quantity improves the robustness of the sensor data fusion method.
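As a minimal sketch of steps 103 and 104, one plausible reading (an assumption, since the propagation formula is not spelled out here) is that the second-time system state control quantities are obtained by propagating the first-time ones through the Markov transition probability matrix; the matrix and values below are illustrative.

```python
import numpy as np

# mu_k1[i]: system state control quantity of sensor type i at the first time (k-1).
mu_k1 = np.array([0.7, 0.3])

# p_trans[i, j]: Markov transition probability from sensor type i to sensor type j.
p_trans = np.array([[0.9, 0.1],
                    [0.2, 0.8]])

# Predicted system state control quantities at the second time (k), obtained by
# propagating through the transition matrix and keeping them normalized.
mu_k_pred = p_trans.T @ mu_k1
mu_k_pred /= mu_k_pred.sum()
print(mu_k_pred)   # e.g. [0.69, 0.31]
```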
In some optional embodiments, step 105, the specific manner of combining the second time state estimates corresponding to each type of sensor to obtain the total second time state estimate corresponding to all the sensors is as follows:
and obtaining second moment total state estimation corresponding to all the sensors according to the second moment state estimation corresponding to each type of sensor and the second moment system state control quantity corresponding to all the sensors.
In this optional implementation manner, the second time system state control quantity can further improve the accuracy of the output second time total state estimation.
Referring to fig. 5, fig. 5 is a schematic flow chart of a sub-step of step 101. In some alternative embodiments, as shown in fig. 5, step 101, mixing first time observation data of at least two types of sensors for a target observation object and obtaining a first time state estimate corresponding to each type of sensor includes the following substeps:
1011. Calculating to obtain a first time system state control quantity corresponding to each type of sensor according to a transition probability matrix corresponding to each type of sensor and a system initial state control quantity in a Markov process;
1012. And obtaining first time state estimation corresponding to each type of sensor according to the first time system state control quantity corresponding to each type of sensor and the corresponding first time observation data of each type of sensor.
In this optional embodiment, the transition probability matrix corresponding to each type of sensor in the markov process can use the relationship between each type of sensor as an influencing factor, so as to improve the accuracy of the state estimation at the first moment.
Referring to fig. 6, fig. 6 is a flowchart illustrating another sub-step of step 105. In some alternative embodiments, as shown in fig. 6, step 105, combining the second time state estimates corresponding to each type of sensor to obtain a total second time state estimate corresponding to all sensors includes the following sub-steps:
1051. acquiring a preset weight value corresponding to each type of sensor;
1052. and merging the second moment state estimation corresponding to each type of sensor according to the preset weight value corresponding to each type of sensor so as to obtain the second moment total state estimation corresponding to all the sensors.
According to the method, the influence weights of different types of sensors in specific scenes can be simulated by using the preset weight values, and further the total state estimation at the second moment generated based on the preset weight values can be suitable for different scenes, so that the robustness of the sensor data fusion method and the accuracy of the total state estimation at the second moment are further improved.
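A minimal sketch of steps 1051 and 1052, assuming the preset weight values are simply normalized and applied as merging coefficients; the weights and estimates below are illustrative assumptions (e.g. a scene in which the radar is trusted more):

```python
import numpy as np

# Second-time state estimates x_j(k|k) for each sensor type, e.g. [radar, camera].
x_k = np.array([14.95, 14.70])

# Preset weight value for each sensor type in the current scene.
preset_w = np.array([0.8, 0.2])
preset_w = preset_w / preset_w.sum()   # normalize in case the presets do not sum to 1

# Second-time total state estimate corresponding to all the sensors.
x_total = np.sum(preset_w * x_k)
print(x_total)   # 14.90
```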
It should be noted that the sensor fusion and decision calculations can be performed by an XCZU EG-type or XCZU EV-type chip; such a chip can enhance the real-time performance of sensor data fusion and reduce the CPU occupancy of the main control chip during the sensor data fusion processing, thereby improving the real-time performance and stability of the whole system and further improving the performance and safety of the whole automatic driving system.
Example two
Referring to fig. 7, fig. 7 is a flow chart of a sensor data fusion method according to an embodiment of the application. As shown in fig. 7, the method includes the steps of:
201. Mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time total state estimate corresponding to all the sensors, wherein the first time is the time immediately preceding the second time.
202. Obtaining a second-time total state estimate corresponding to all the sensors according to the first-time total state estimate corresponding to all the sensors.
According to the sensor data fusion method disclosed in the second aspect of the application, multiple types of sensor data can be fused based on a centralized data fusion structure, in which the sensor data are concentrated on a single server node and fused there; high-quality computing resources can therefore be concentrated on that server node, which reduces equipment deployment cost.
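A minimal sketch of this centralized structure, assuming a single fusion node that keeps one total state estimate, predicts it from the first time to the second time, and then folds in every sensor's observation with sequential scalar Kalman updates; the class name, model values and noise figures are illustrative assumptions rather than the application's exact formulas.

```python
import numpy as np

class CentralFusionNode:
    """Single node that holds the total state estimate for all sensors."""

    def __init__(self, x0, P0, q=0.01):
        self.x, self.P, self.q = x0, P0, q   # total estimate, variance, process noise

    def step(self, observations):
        """Fuse one time step of observations {sensor: (z, r)} into the total estimate."""
        # Predict the second-time total state from the first-time total state.
        self.P += self.q
        # Sequentially update with each sensor's observation (scalar Kalman updates).
        for z, r in observations.values():
            k_gain = self.P / (self.P + r)
            self.x += k_gain * (z - self.x)
            self.P *= (1.0 - k_gain)
        return self.x, self.P

node = CentralFusionNode(x0=15.0, P0=1.0)
print(node.step({"radar": (14.9, 0.0625), "camera": (14.5, 0.64)}))
```

Because all observations arrive at one node, only that node needs the computing resources for the fusion calculations, which is the deployment advantage described above.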
Example III
Referring to fig. 8, fig. 8 is a schematic structural diagram of a sensor data fusion device according to an embodiment of the application. As shown in fig. 8, the apparatus includes:
The first mixing module 301 is configured to mix first-time observation data of at least two types of sensors for a target observation object and obtain a first-time state estimate corresponding to each type of sensor, where the first time is the time immediately preceding the second time;
a first calculation module 302 is configured to obtain a second-time state estimate corresponding to each type of sensor according to the first-time state estimate corresponding to each type of sensor and the second-time observation data of each type of sensor;
and a merging module 303 is configured to merge the second-time state estimates corresponding to each type of sensor to obtain a second-time total state estimate corresponding to all the sensors.
For example, referring to FIG. 3, x^(1)(k-1|k-1) represents the observation data of the radar at time k-1 and x^(2)(k-1|k-1) represents the observation data of the camera at time k-1; mode i and mode j are obtained after mixing, where mode i represents the state estimate of the radar at time k-1 and mode j represents the state estimate of the camera at time k-1. Further, the state estimate of the radar at time k and the state estimate of the camera at time k are obtained from the observation data of the radar and the camera at time k, and the total state estimate at time k corresponding to all the sensors is then obtained by merging.
Specifically, the total state estimate at time k corresponding to all the sensors can be calculated according to the following formula:

x(k|k) = Σ_{j=1}^{r} x_j(k|k) · μ_j(k)

where x(k|k) represents the total state estimate at time k corresponding to all the sensors, r represents the number of sensor types, j denotes one sensor type, x_j(k|k) represents the state estimate of the j-type sensor at time k, and μ_j(k) represents the second-time system state control quantity corresponding to the j-type sensor (the second-time system state control quantity is described below).

Specifically, μ_j(k) can be calculated by the following formula:

μ_j(k) = P[M_j(k) | Z^k] = (1/c) · L_j(k) · Σ_{i=1}^{r} p_{ij} · μ_i(k-1), where c normalizes the μ_j(k) so that they sum to 1;

here Z^k represents the set of observation data of all the sensors up to time k, M_j(k) is an intermediate quantity, namely the vector representing mode j at time k, and L_j(k) is the likelihood function of the j-th of the r Kalman filters.

Specifically, L_j(k) can be calculated by the following formula:

L_j(k) = P[z(k) | M_j(k), x_{0j}(k-1|k-1), P_{0j}(k-1|k-1)];

where z(k) represents the observation data of the j-type sensor at time k, x_{0j}(k-1|k-1) represents the mixed state estimate corresponding to the j-type sensor from the initial time 0 up to time k-1, and P_{0j}(k-1|k-1) represents the corresponding mixed state-estimate covariance matrix over all the sensors from the initial time 0 up to time k-1.

More specifically, P_{0j}(k-1|k-1) can be calculated by the following formula:

P_{0j}(k-1|k-1) = Σ_{i=1}^{r} μ_{i|j}(k-1|k-1) · { P^{(i)}(k-1|k-1) + [x^{(i)}(k-1|k-1) - x_{0j}(k-1|k-1)] [x^{(i)}(k-1|k-1) - x_{0j}(k-1|k-1)]^T }

where μ_{i|j}(k-1|k-1) represents the state control quantity at time k-1 corresponding to the j-type sensor, x^{(i)}(k-1|k-1) represents the observation data (state estimate) of the i-th sensor and P^{(i)}(k-1|k-1) its covariance, and x_{0j}(k-1|k-1) represents the mixed state estimate of the j-type sensor at time k-1.

More specifically, x_{0j}(k-1|k-1) is calculated by the following formula:

x_{0j}(k-1|k-1) = Σ_{i=1}^{r} x^{(i)}(k-1|k-1) · μ_{i|j}(k-1|k-1), with μ_{i|j}(k-1|k-1) = (1/c_j) · p_{ij} · μ_i(k-1)

where μ_{i|j}(k-1|k-1) represents the system state control quantity at the first time (time k-1) corresponding to the j-type sensor, p_{ij} is the transition probability matrix corresponding to each type of sensor in the Markov process, c_j is a normalization constant, k denotes time k, and μ_i(k-1) represents the initial system state control quantity at time k-1.

In some alternative embodiments, the covariance of the total state estimate at time k corresponding to all the sensors can be calculated by the following formula:

P(k|k) = Σ_{j=1}^{r} μ_j(k) · { P_j(k|k) + [x_j(k|k) - x(k|k)] [x_j(k|k) - x(k|k)]^T }
By executing the above sensor data fusion method, the sensor data fusion device fuses the observation data of a plurality of sensors for the same target observation object, so that measurement errors caused by sensor characteristics and external factors can be counteracted; accurate data of the target observation object is then obtained through the complementarity among the characteristics of the plurality of sensors, so that a control decision can be made based on the accurate data of the target observation object.
In some alternative embodiments, the sensor data fusion device further comprises:
An obtaining module 304, configured to obtain the first-time system state control quantities corresponding to all the sensors;
and a processing module 305, configured to obtain the second-time system state control quantities corresponding to all the sensors according to the first-time system state control quantities.
In this alternative manner, the system state control amount can reflect the association relationship between the other objects than the target observation object and the target observation object, so that the present alternative embodiment can facilitate the subsequent calculation of the influence degree of the other objects on the target observation object according to the second time system state control amount by using the first time system state control amount to obtain the second time system state control amount, so as to further improve the accuracy of the total state estimation at the second time. Meanwhile, the robustness of the sensor data fusion method can be improved by obtaining the system state control quantity at the second moment through the system state control quantity at the first moment.
In some optional embodiments, the merging module 303 performs merging of the second time state estimates corresponding to each type of sensor, to obtain the total second time state estimate corresponding to all sensors in the following specific manner:
and obtaining second moment total state estimation corresponding to all the sensors according to the second moment state estimation corresponding to each type of sensor and the second moment system state control quantity corresponding to all the sensors.
In this alternative, the second time system state control quantity can further improve the accuracy of the output second time total state estimation.
In some alternative embodiments, the first mixing module 301 includes a first calculation sub-module 3011 and a second calculation sub-module 3012, wherein:
The first calculation submodule 3011 is used for calculating and obtaining a first time system state control quantity corresponding to each type of sensor according to a transition probability matrix corresponding to each type of sensor and a system initial state control quantity in a Markov process;
The second calculation submodule 3012 is configured to obtain a first time state estimate corresponding to each type of sensor according to the first time system state control amount corresponding to each type of sensor and the corresponding first time observation data of each type of sensor.
In this optional implementation, the transition probability matrix corresponding to each type of sensor in the Markov process can take the relationship between the sensor types into account as an influencing factor, thereby improving the accuracy of the first-time state estimate.
In some alternative embodiments, the merging module 303 includes an acquisition sub-module 3031, a third calculation sub-module 3032, wherein:
An acquisition submodule 3031, configured to acquire preset weight values corresponding to each type of sensor;
The third calculation sub-module 3032 merges the second time state estimation corresponding to each type of sensor according to the preset weight value corresponding to each type of sensor so as to obtain the second time total state estimation corresponding to all the sensors.
According to the method, the influence weights of different types of sensors in specific scenes can be simulated by using the preset weight values, and further the total state estimation at the second moment generated based on the preset weight values can be suitable for different scenes, so that the robustness of the sensor data fusion method and the accuracy of the total state estimation at the second moment are further improved.
Example IV
Referring to fig. 9, fig. 9 is a schematic structural diagram of a sensor data fusion device according to an embodiment of the application. As shown in fig. 9, the apparatus includes:
A second mixing module 401, configured to mix first-time observation data of at least two types of sensors for a target observation object and obtain a first-time total state estimate corresponding to all the sensors, where the first time is the time immediately preceding the second time;
and a second calculation module 402, configured to obtain a second-time total state estimate corresponding to all the sensors according to the first-time total state estimate corresponding to all the sensors.
By executing the sensor data fusion method of the second aspect of the application, the sensor data fusion device provided in this embodiment can fuse multiple types of sensor data based on a centralized data fusion structure, in which the sensor data are concentrated on a single server node and fused there; high-quality computing resources can therefore be concentrated on that server node, which reduces equipment deployment cost.
Example five
Referring to fig. 10, fig. 10 is a schematic structural diagram of a sensor data fusion device according to an embodiment of the present application. As shown in fig. 10, the apparatus includes:
A processor 502; and
A memory 501 configured to store machine readable instructions that when executed by the processor 502 cause the processor 502 to perform the steps in the sensor data fusion method according to any one of embodiments one to two of the present application.
By executing the above sensor data fusion method, the sensor data fusion device fuses the observation data of a plurality of sensors for the same target observation object, so that measurement errors caused by sensor characteristics and external factors can be counteracted; accurate data of the target observation object is then obtained through the complementarity among the sensor characteristics, and control decisions are made based on the accurate data of the target observation object.
Example six
An embodiment of the present application discloses a computer-readable storage medium storing a computer program that is executed by a processor to perform the steps in the sensor data fusion method according to any one of the first to second embodiments of the present application.
The computer readable storage medium of the embodiment of the application can fuse the observation data of a plurality of sensors aiming at the same target observation object by executing the sensor data fusion method, so that measurement errors caused by sensor characteristics and external factors can be counteracted, and further, the accurate data of the target observation object can be obtained through complementation among the sensor characteristics, so that a control decision is made based on the accurate data of the target observation object.
In the several embodiments disclosed herein, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative, for example, of the flowcharts and block diagrams in the figures that illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and variations will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (5)

1. A method of sensor data fusion, the method comprising:
mixing first-time observation data of at least two types of sensors for a target observation object to obtain a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time;
obtaining second time state estimation corresponding to each type of sensor according to the first time state estimation corresponding to each type of sensor and second time observation data of each type of sensor;
combining the second moment state estimation corresponding to each type of sensor to obtain second moment total state estimation corresponding to all the sensors;
And, before merging the second time state estimates for each type of sensor, the method further comprises:
acquiring the first-time system state control quantities corresponding to all the sensors;
obtaining the second-time system state control quantities corresponding to all the sensors according to the first-time system state control quantities;
And merging the second time state estimates corresponding to each type of sensor to obtain second time total state estimates corresponding to all the sensors, wherein the method comprises the following steps:
Obtaining second moment total state estimation corresponding to all the sensors according to the second moment state estimation corresponding to each type of the sensor and the second moment system state control quantity corresponding to all the sensors;
and mixing first time observation data of at least two types of sensors aiming at a target observation object to obtain first time state estimation corresponding to each type of sensor, wherein the first time state estimation comprises the following steps:
Calculating to obtain a first moment system state control quantity corresponding to each type of sensor according to a transition probability matrix corresponding to each type of sensor and a system initial state control quantity in a Markov process;
and obtaining the first time state estimation corresponding to each type of sensor according to the first time system state control quantity corresponding to each type of sensor and the first time observation data corresponding to each type of sensor.
2. The method of claim 1, wherein combining the second time state estimates for each type of sensor to obtain a total second time state estimate for all of the sensors comprises:
acquiring a preset weight value corresponding to each type of sensor;
And merging the second moment state estimation corresponding to each type of sensor according to the preset weight value corresponding to each type of sensor so as to obtain second moment total state estimation corresponding to all the sensors.
3. A sensor data fusion device, the device comprising:
the first mixing module is used for mixing first-time observation data of at least two types of sensors for a target observation object and obtaining a first-time state estimate corresponding to each type of sensor, wherein the first time is the time immediately preceding the second time;
The first calculation module is used for obtaining second time state estimation corresponding to each type of sensor according to the first time state estimation corresponding to each type of sensor and second time observation data of each type of sensor;
The merging module is used for merging the second moment state estimation corresponding to each type of sensor to obtain second moment total state estimation corresponding to all the sensors;
And, the apparatus is further for:
acquiring the first-time system state control quantities corresponding to all the sensors;
obtaining the second-time system state control quantities corresponding to all the sensors according to the first-time system state control quantities;
And merging the second time state estimates corresponding to each type of sensor to obtain second time total state estimates corresponding to all the sensors, wherein the method comprises the following steps:
Obtaining second moment total state estimation corresponding to all the sensors according to the second moment state estimation corresponding to each type of the sensor and the second moment system state control quantity corresponding to all the sensors;
and mixing first time observation data of at least two types of sensors aiming at a target observation object to obtain first time state estimation corresponding to each type of sensor, wherein the first time state estimation comprises the following steps:
Calculating to obtain a first moment system state control quantity corresponding to each type of sensor according to a transition probability matrix corresponding to each type of sensor and a system initial state control quantity in a Markov process;
and obtaining the first time state estimation corresponding to each type of sensor according to the first time system state control quantity corresponding to each type of sensor and the first time observation data corresponding to each type of sensor.
4.A sensor data fusion device, the device comprising:
A processor; and
A memory configured to store machine-readable instructions that, when executed by the processor, cause the processor to perform the sensor data fusion method of any of claims 1-2.
5. A computer storage medium, characterized in that the computer storage medium stores a computer program which, when executed by a processor, performs the sensor data fusion method according to any one of claims 1-2.
CN202010006425.9A 2020-01-03 2020-01-03 Sensor data fusion method, device, equipment and storage medium Active CN111191734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010006425.9A CN111191734B (en) 2020-01-03 2020-01-03 Sensor data fusion method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010006425.9A CN111191734B (en) 2020-01-03 2020-01-03 Sensor data fusion method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111191734A CN111191734A (en) 2020-05-22
CN111191734B true CN111191734B (en) 2024-05-28

Family

ID=70709837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010006425.9A Active CN111191734B (en) 2020-01-03 2020-01-03 Sensor data fusion method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111191734B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113933858A (en) * 2021-09-28 2022-01-14 中国科学院深圳先进技术研究院 Abnormal detection method and device of positioning sensor and terminal equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108573270A (en) * 2017-12-15 2018-09-25 蔚来汽车有限公司 Multisensor Target Information is set to merge method and device, computer equipment and the recording medium synchronous with multisensor sensing
CN108573271A (en) * 2017-12-15 2018-09-25 蔚来汽车有限公司 Optimization method and device, computer equipment and the recording medium of Multisensor Target Information fusion
CN109800819A (en) * 2019-01-28 2019-05-24 哈尔滨工业大学 Deviation compensation method and device when period arbitrary multisensor sky

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7460951B2 (en) * 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion


Also Published As

Publication number Publication date
CN111191734A (en) 2020-05-22

Similar Documents

Publication Publication Date Title
CN111258313B (en) Multi-sensor fusion SLAM system and robot
EP3875985B1 (en) Method, apparatus, computing device and computer-readable storage medium for positioning
US11132810B2 (en) Three-dimensional measurement apparatus
US20160161260A1 (en) Method for processing feature measurements in vision-aided inertial navigation
CN112034431B (en) External parameter calibration method and device for radar and RTK
CN114323033A (en) Positioning method and device based on lane lines and feature points and automatic driving vehicle
CN112313536B (en) Object state acquisition method, movable platform and storage medium
CN111191734B (en) Sensor data fusion method, device, equipment and storage medium
CN117218350A (en) SLAM implementation method and system based on solid-state radar
CN115290071A (en) Relative positioning fusion method, device, equipment and storage medium
CN112835370B (en) Positioning method and device for vehicle, computer readable storage medium and processor
Yang et al. Simultaneous estimation of ego-motion and vehicle distance by using a monocular camera
CN116958452A (en) Three-dimensional reconstruction method and system
US20150073707A1 (en) Systems and methods for comparing range data with evidence grids
US20220091252A1 (en) Motion state determining method and apparatus
CN112284402B (en) Vehicle positioning method and device
CN114861725A (en) Post-processing method, device, equipment and medium for perception and tracking of target
CN111829552B (en) Error correction method and device for visual inertial system
CN112799079B (en) Data association method and device
Hassani et al. Analytical and empirical navigation safety evaluation of a tightly integrated LiDAR/IMU using return-light intensity
Richter et al. Advanced occupancy grid techniques for lidar based object detection and tracking
CN115128655B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN117611762B (en) Multi-level map construction method, system and electronic equipment
US20230135965A1 (en) Virtual beams for identification of edge and planar points in lidar point cloud obtained with vehicle lidar system
CN117471483A (en) Multi-sensor fusion vehicle distance calculation method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant