CN110866544A - Sensor data fusion method and device and storage medium
- Publication number: CN110866544A (application CN201911032163.7A)
- Authority: CN (China)
- Legal status: Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/251—Fusion techniques of input or preprocessed data
Abstract
An embodiment of the application provides a method, a device and a storage medium for fusing sensor data. The method includes: acquiring first tracking data collected by a laser radar at a first moment and second tracking data collected by a millimeter wave radar; fusing the first tracking data and the second tracking data to obtain a preliminary fusion result; acquiring third tracking data collected by a camera at the first moment, the third tracking data including information of at least one obstacle; and fusing the information of obstacles in the third tracking data whose size meets a preset condition with the preliminary fusion result to obtain a final fusion result. In the application, the tracking data collected by the laser radar, the millimeter wave radar and the camera are fused with a specific fusion strategy, which improves the accuracy and stability of the fusion result.
Description
Technical Field
The present application relates to the field of unmanned driving technologies, and in particular, to a method and an apparatus for fusing sensor data, and a storage medium.
Background
Generally, an unmanned vehicle senses obstacles through a laser radar sensor, a millimeter wave radar sensor and a camera: the laser radar sensor accurately detects near obstacles, the millimeter wave radar sensor detects distant obstacles, and the camera is sensitive to small obstacles such as pedestrians, bicycles and roadblocks.
In practical applications, to improve the accuracy with which an unmanned vehicle identifies obstacles, multi-sensor fusion is generally adopted to fuse the detection data of the sensors, for example by directly de-duplicating or merging the detection data of each sensor and using the result as the final fusion result.
However, such fusion methods do not fully consider the detection characteristics of each sensor, which introduces errors into the fusion result and affects the path planning of the unmanned vehicle.
Disclosure of Invention
The application provides a method, a device and a storage medium for fusing sensor data, so as to improve the accuracy and stability of the fused sensor data.
In a first aspect, an embodiment of the present application provides a method for fusing sensor data, including:
acquiring first tracking data acquired by a laser radar and second tracking data acquired by a millimeter wave radar at a first moment, wherein the first tracking data comprises information of at least one obstacle, the second tracking data comprises information of at least one obstacle, and the information of each obstacle comprises identification, coordinates, speed and size of the obstacle;
fusing the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result comprises a plurality of preliminary clusters, information of obstacles separately collected by the laser radar and information of obstacles separately collected by the millimeter wave radar, and each preliminary cluster comprises information of one obstacle collected by the laser radar and information of at least one obstacle collected by the millimeter wave radar corresponding to the obstacle;
acquiring third tracking data acquired by a camera at the first moment, wherein the third tracking data comprises information of at least one obstacle;
and fusing the information of the obstacles with the size meeting the preset condition in the third tracking data with the preliminary fusion result to obtain a final fusion result.
In one possible implementation, the acquiring second tracking data collected by the millimeter wave radar at the first time includes:
acquiring a second moment closest to the first moment in moments corresponding to the tracking data acquired by the millimeter wave radar;
and acquiring the second tracking data according to the tracking data acquired by the millimeter wave radar at the second moment.
In one possible implementation, the acquiring third tracking data acquired by the camera at the first time includes:
acquiring a third moment closest to the first moment in moments corresponding to the tracking data acquired by the camera;
and acquiring the third tracking data according to the tracking data acquired by the camera at the third moment.
In one possible implementation, the method further comprises:
matching the final fusion result with a tracked result according to obstacle information to obtain an obstacle matching result, wherein the obstacle matching result comprises information of obstacles in the final fusion result and information of obstacles in the tracked result corresponding to the obstacles;
and filtering the obstacle matching result by adopting a Kalman filter to obtain a final tracking result of the matched obstacle.
In one possible implementation, the method further comprises:
and predicting the information of the unmatched obstacles in the tracked result by adopting the Kalman filter to obtain the final tracking result of the unmatched obstacles in the tracked result.
In one possible implementation, the method further comprises:
and adding a new Kalman filter to store the information of an unmatched obstacle newly appearing in the final fusion result, to obtain a final tracking result of the newly appearing unmatched obstacle in the final fusion result.
In one possible implementation, the method further comprises:
and planning a path according to the final tracking result of the matched obstacle.
In a second aspect, an embodiment of the present application provides a device for fusing sensor data, including:
the system comprises an acquisition module, a storage module and a processing module, wherein the acquisition module is used for acquiring first tracking data acquired by a laser radar and second tracking data acquired by a millimeter wave radar at a first moment, the first tracking data comprises information of at least one obstacle, the second tracking data comprises information of at least one obstacle, and the information of each obstacle comprises identification, coordinates, speed and size of the obstacle;
the fusion module is used for fusing the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result comprises a plurality of preliminary clusters, information of obstacles separately collected by the laser radar and information of obstacles separately collected by the millimeter wave radar, and each preliminary cluster comprises the information of one obstacle collected by the laser radar and the information of at least one obstacle collected by the millimeter wave radar corresponding to the obstacle;
the acquisition module is further configured to acquire third tracking data acquired by the camera at the first time, where the third tracking data includes information of at least one obstacle;
the fusion module is further configured to fuse information of obstacles, of which the size meets a preset condition, in the third tracking data with the preliminary fusion result to obtain a final fusion result.
In a third aspect, an embodiment of the present application provides a device for fusing sensor data, including at least one processor and a memory, wherein:
the memory stores computer-executable instructions; and
the at least one processor executes the computer-executable instructions stored by the memory, causing the at least one processor to perform the method for fusing sensor data described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, in which computer-executable instructions are stored, and when a processor executes the computer-executable instructions, the method for fusing sensor data according to the first aspect is implemented.
The method, device and storage medium for fusing sensor data provided by this embodiment include: acquiring first tracking data collected by a laser radar at a first moment and second tracking data collected by a millimeter wave radar, wherein the first tracking data includes information of at least one obstacle, the second tracking data includes information of at least one obstacle, and the information of each obstacle includes the identification, coordinates, speed and size of the obstacle; fusing the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result includes a plurality of preliminary clusters, information of obstacles separately collected by the laser radar, and information of obstacles separately collected by the millimeter wave radar, and each preliminary cluster includes the information of one obstacle collected by the laser radar and the information of at least one corresponding obstacle collected by the millimeter wave radar; acquiring third tracking data collected by a camera at the first moment, the third tracking data including information of at least one obstacle; and fusing the information of obstacles in the third tracking data whose size meets a preset condition with the preliminary fusion result to obtain a final fusion result. In this application, by combining the detection characteristics of the laser radar, the millimeter wave radar and the camera, a specific fusion strategy is adopted to fuse the tracking data they collect, which improves the accuracy and stability of the fusion result.
Drawings
Fig. 1 is a first flowchart of a method for fusing sensor data according to an embodiment of the present application;
Fig. 2 is a second flowchart of a method for fusing sensor data according to an embodiment of the present application;
Fig. 3 is a third flowchart of a method for fusing sensor data according to an embodiment of the present application;
Fig. 4 is a first schematic structural diagram of a sensor data fusion device according to an embodiment of the present application;
Fig. 5 is a second schematic structural diagram of a sensor data fusion device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In an unmanned-driving scene, a laser radar, a millimeter wave radar and a camera are generally used to sense obstacles. The laser radar can accurately detect nearby obstacles, but its detection range is limited; compared with the laser radar, the millimeter wave radar can detect obstacles at a distance, but its accuracy is limited, and its detection of the size of static obstacles is particularly poor; the camera is sensitive to small obstacles such as pedestrians, bicycles and roadblocks, but detects large vehicles poorly.
Aiming at the problems in the prior art, namely that the detection characteristics of each sensor are not fully considered and that directly de-duplicating or merging the sensor data yields an unstable final fusion result that affects subsequent path planning, an embodiment of the application provides a sensor data fusion method.
The technical solution of the present application will be described in detail by specific examples. It should be noted that the following specific embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Fig. 1 is a first flowchart of a method for fusing sensor data according to an embodiment of the present disclosure, where an execution subject of the present embodiment may be a device for fusing sensor data, and the device may be integrated in an unmanned vehicle. As shown in fig. 1, a method for fusing sensor data provided in an embodiment of the present application may include the following steps:
S101, first tracking data collected by the laser radar at a first moment and second tracking data collected by the millimeter wave radar are obtained.
The unmanned vehicle is provided with a laser radar and a millimeter wave radar, each of which collects tracking data in real time at its own frequency. Because the detection frequencies of the laser radar and the millimeter wave radar usually differ, tracking data collected by both at exactly the same moment cannot be obtained directly. Therefore, when the first tracking data collected by the laser radar at the first moment is obtained, the second tracking data of the millimeter wave radar at the first moment is derived with the first moment as the reference. The first tracking data includes information of at least one obstacle, the second tracking data includes information of at least one obstacle, and the information of each obstacle includes the identification, coordinates, speed and size of the obstacle, where the size of the obstacle includes its length and width.
In a possible implementation, the acquiring second tracking data acquired by the millimeter wave radar at the first time specifically includes:
and S1011, acquiring a second moment closest to the first moment in the moment corresponding to the tracking data acquired by the millimeter wave radar.
And S1012, acquiring second tracking data according to the tracking data acquired by the millimeter wave radar at the second moment.
The millimeter wave radar collects tracking data at a certain frequency, and each collection corresponds to a moment. Among the moments corresponding to the tracking data collected by the millimeter wave radar, the second moment closest to the first moment is selected, because the information of obstacles collected by the millimeter wave radar at the moment closest to the first moment is most likely to describe the same obstacles as those collected by the laser radar at the first moment.
After the second moment closest to the first moment is obtained, the tracking data collected by the millimeter wave radar at the second moment is acquired, and the second tracking data is derived from it. The identification, speed and size of each obstacle in the second tracking data are the same as in the tracking data collected by the millimeter wave radar at the second moment, and the coordinates of the obstacle in the second tracking data are extrapolated under a constant-velocity assumption:

$$(x_1', y_1') = \big(x_1 + u_1 (t - t_{r1}),\; y_1 + v_1 (t - t_{r1})\big)$$

where $t$ is the first moment, $t_{r1}$ is the second moment, $(x_1, y_1)$ are the coordinates of the obstacle collected by the millimeter wave radar at the second moment, $(x_1', y_1')$ are the coordinates of the obstacle extrapolated to the first moment, and $(u_1, v_1)$ is the speed of the obstacle collected by the millimeter wave radar at the second moment.
It should be noted that, when the above formula is used for calculation, it may be determined that the first tracking data and the second tracking data correspond to the same obstacle according to the identifier of the obstacle.
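As a concrete illustration of this alignment step, the following Python sketch extrapolates an obstacle observed at one sensor's timestamp to the lidar's timestamp under the constant-velocity assumption of the formula above; the Obstacle fields and function names are illustrative, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    obstacle_id: int
    x: float   # coordinates in the vehicle frame
    y: float
    vx: float  # velocity components
    vy: float
    length: float
    width: float

def extrapolate_to(obstacle: Obstacle, t_source: float, t_target: float) -> Obstacle:
    """Project an obstacle observed at t_source to t_target under a
    constant-velocity assumption; identity, velocity and size are carried
    over unchanged, as the patent describes."""
    dt = t_target - t_source
    return Obstacle(
        obstacle_id=obstacle.obstacle_id,
        x=obstacle.x + obstacle.vx * dt,
        y=obstacle.y + obstacle.vy * dt,
        vx=obstacle.vx,
        vy=obstacle.vy,
        length=obstacle.length,
        width=obstacle.width,
    )
```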
S102, the first tracking data and the second tracking data are fused to obtain a preliminary fusion result.
The preliminary fusion result includes a plurality of preliminary clusters, information of obstacles separately collected by the laser radar, and information of obstacles separately collected by the millimeter wave radar; each preliminary cluster includes the information of one obstacle collected by the laser radar and the information of at least one corresponding obstacle collected by the millimeter wave radar.
For each preliminary cluster, the information of an obstacle collected by the laser radar and the information of obstacles collected by the millimeter wave radar are in a one-to-many or one-to-one relationship. Specifically: if the differences between the coordinates, speed and size of an obstacle collected by the laser radar and those of at least one obstacle collected by the millimeter wave radar meet a preset condition, the information of that laser radar obstacle is associated with the information of the at least one millimeter wave radar obstacle. If the information of several obstacles collected by the laser radar becomes associated with the information of a single obstacle collected by the millimeter wave radar, which is not allowed (many-to-one), the Euclidean distance between the information of each of those laser radar obstacles and the information of the millimeter wave radar obstacle is calculated from the coordinates, speed and size, and the millimeter wave radar obstacle is associated only with the laser radar obstacle at the minimum Euclidean distance. The preliminary clusters are then formed from the associated obstacle information.
The preset condition may be that the difference between the abscissas is smaller than a first preset value, the difference between the ordinates is smaller than a second preset value, the difference between the speeds is smaller than a third preset value, the difference between the lengths is smaller than a fourth preset value, and the difference between the widths is smaller than a fifth preset value. The specific values of the first to fifth preset values are selected according to the actual situation, which is not limited in this embodiment.
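The association just described lends itself to a short sketch. The following Python fragment, reusing the Obstacle dataclass from the sketch above, is one possible reading of the gating and tie-breaking; the threshold values are illustrative placeholders, not the patent's first to fifth preset values.

```python
import math

def within_gate(lidar, radar, thresholds=(2.0, 2.0, 1.0, 1.5, 1.0)):
    """Preset condition: per-field absolute differences (x, y, speed, length,
    width) must each stay below a preset value (placeholder thresholds)."""
    dx, dy, dv, dl, dw = thresholds
    speed_l = math.hypot(lidar.vx, lidar.vy)
    speed_r = math.hypot(radar.vx, radar.vy)
    return (abs(lidar.x - radar.x) < dx
            and abs(lidar.y - radar.y) < dy
            and abs(speed_l - speed_r) < dv
            and abs(lidar.length - radar.length) < dl
            and abs(lidar.width - radar.width) < dw)

def feature_distance(a, b):
    """Euclidean distance over coordinates, velocity and size, used to break
    many-to-one ties in favour of the closest lidar obstacle."""
    diffs = (a.x - b.x, a.y - b.y, a.vx - b.vx, a.vy - b.vy,
             a.length - b.length, a.width - b.width)
    return math.sqrt(sum(d * d for d in diffs))

def associate(lidar_obs, radar_obs):
    """Each radar obstacle keeps only its nearest gated lidar obstacle, so the
    resulting mapping is one lidar obstacle to possibly many radar obstacles."""
    clusters = {}
    for r in radar_obs:
        gated = [l for l in lidar_obs if within_gate(l, r)]
        if gated:
            best = min(gated, key=lambda l: feature_distance(l, r))
            clusters.setdefault(best.obstacle_id, []).append(r)
    return clusters  # lidar obstacle id -> associated radar obstacles
```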
The information of obstacles separately collected by the laser radar is the obstacle information not associated with any information of obstacles collected by the millimeter wave radar, i.e., the laser radar obstacle information that does not form a preliminary cluster; the information of obstacles separately collected by the millimeter wave radar is the obstacle information not associated with any information of obstacles collected by the laser radar, i.e., the millimeter wave radar obstacle information that does not form a preliminary cluster.
Then, each preliminary cluster, the information of obstacles separately collected by the laser radar, and the information of obstacles separately collected by the millimeter wave radar are merged to obtain the preliminary fusion result. An entry of the preliminary fusion result specifically includes main information (main identifier, main coordinates, main speed, main size) and alternative information (alternative identifier, alternative coordinates, alternative speed, alternative size). The merging process is as follows:
in each preliminary cluster, the information of the obstacle collected by the laser radar is taken as main information of the preliminary fusion result, and the information of the at least one corresponding obstacle collected by the millimeter wave radar is taken as alternative information of the preliminary fusion result; the information of obstacles separately collected by the laser radar is taken as main information of the preliminary fusion result; and the information of obstacles separately collected by the millimeter wave radar is taken as main information of the preliminary fusion result.
It should be noted that when the information of an obstacle separately collected by the millimeter wave radar is used as main information, extra care is needed: the millimeter wave radar usually cannot measure the size of a static obstacle accurately and may even produce false detections. Therefore, when the obstacle is static, i.e., the speed in the information of the obstacle separately collected by the millimeter wave radar is 0, the information of the obstacle is re-validated as follows:
the grid produced by the laser radar point-cloud grid module is acquired; an obstacle box of the static obstacle is determined from the size of the static obstacle collected by the millimeter wave radar; and the cells of the grid in which obstacles were detected are matched against the obstacle box. If the overlapping area of the two is larger than a preset area, the static obstacle is determined to exist; if the overlapping area is smaller than the preset area, the millimeter wave radar detection is deemed false, the static obstacle does not exist, and its information is removed. For example: if the box of the static obstacle covers a first cell, a second cell and a third cell of the grid, obstacles are detected in the first and third cells, and no obstacle is detected in the second cell, the static obstacle is considered to cover two cells, and the overlapping area is the area of those two cells. A real obstacle must be covered by enough occupied point-cloud cells inside its obstacle box; the preset area may be determined according to the actual situation, which is not limited in this embodiment.
The laser radar point-cloud grid module is a software module of the sensor data fusion device that processes the raw data collected by the laser radar. It returns a point-cloud grid at the same frequency as the laser radar, i.e., a point-cloud grid for the first moment, and the grid indicates for each cell whether an obstacle was detected.
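The grid-validation step above can be sketched as follows; the cell size, the occupancy representation and the area-ratio threshold are assumptions for illustration (the patent only requires the overlapping area to exceed a preset area).

```python
def static_obstacle_is_real(obstacle, occupied, cell_size=0.2, min_ratio=0.5):
    """Validate a zero-speed millimeter wave detection against the lidar
    point-cloud grid: count occupied cells inside the obstacle's box and
    require the covered area to exceed a preset fraction of the box area.
    `occupied` is a set of (i, j) grid indices that contain lidar points."""
    x0, x1 = obstacle.x - obstacle.length / 2, obstacle.x + obstacle.length / 2
    y0, y1 = obstacle.y - obstacle.width / 2, obstacle.y + obstacle.width / 2
    i0, i1 = int(x0 // cell_size), int(x1 // cell_size)
    j0, j1 = int(y0 // cell_size), int(y1 // cell_size)
    hits = sum((i, j) in occupied
               for i in range(i0, i1 + 1)
               for j in range(j0, j1 + 1))
    covered_area = hits * cell_size * cell_size
    box_area = obstacle.length * obstacle.width
    return covered_area > min_ratio * box_area
```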
S103, third tracking data collected by the camera at the first moment is acquired.
A camera is also installed on the unmanned vehicle and collects tracking data in real time at a certain frequency. Because the detection frequencies of the laser radar, the millimeter wave radar and the camera usually differ, tracking data collected by all three at exactly the same moment cannot be obtained directly. To obtain tracking data of the laser radar, the millimeter wave radar and the camera for the same moment, when the first tracking data collected by the laser radar at the first moment is obtained, the third tracking data of the camera at the first moment is derived with the first moment as the reference. The third tracking data includes information of at least one obstacle, and the information of each obstacle includes its identification, coordinates, speed and size.
In one possible implementation, step S103 specifically includes:
and S1031, acquiring a third time closest to the first time from the time corresponding to the tracking data acquired by the camera.
And S1032, acquiring third tracking data according to the tracking data acquired by the camera at the third moment.
Among the moments corresponding to the tracking data collected by the camera, the third moment closest to the first moment is selected, because the closer in time two collections are, the more likely they describe the same obstacles. The third tracking data is then derived from the tracking data collected by the camera at the third moment: the identification, speed and size of each obstacle in the third tracking data are the same as in the data collected at the third moment, and the coordinates of the obstacle in the third tracking data are extrapolated under a constant-velocity assumption:

$$(x_2', y_2') = \big(x_2 + u_2 (t - t_{r2}),\; y_2 + v_2 (t - t_{r2})\big)$$

where $t$ is the first moment, $t_{r2}$ is the third moment, $(x_2, y_2)$ are the coordinates of the obstacle collected by the camera at the third moment, $(x_2', y_2')$ are the coordinates of the obstacle extrapolated to the first moment, and $(u_2, v_2)$ is the speed of the obstacle collected by the camera at the third moment.
When the above formula is used for calculation, it may be determined that the first tracking data and the third tracking data correspond to the same obstacle according to the identifier of the obstacle.
S104, the information of obstacles in the third tracking data whose size meets a preset condition is fused with the preliminary fusion result to obtain a final fusion result.
Since the camera only detects small obstacles such as pedestrians, bicycles and roadblocks stably, only the information of obstacles in the third tracking data whose size meets a preset condition is fused with the preliminary fusion result. The preset condition may be that the length is smaller than a sixth preset value and the width is smaller than a seventh preset value; the specific values of the sixth and seventh preset values are selected according to the actual situation, which is not limited in this embodiment.
The final fusion result includes a plurality of clusters, information of separate obstacles in the preliminary fusion result, and information of obstacles separately collected by the camera; each cluster includes the information of one obstacle in the preliminary fusion result and the information of at least one corresponding obstacle collected by the camera.
For each cluster, the information of an obstacle in the preliminary fusion result and the information of obstacles collected by the camera are in a one-to-many or one-to-one relationship. Specifically: if the differences between the coordinates, speed and size of an obstacle in the preliminary fusion result and those of at least one obstacle collected by the camera meet a preset condition, the information of that obstacle in the preliminary fusion result is associated with the information of the at least one camera obstacle. If the information of several obstacles in the preliminary fusion result becomes associated with the information of a single obstacle collected by the camera, the Euclidean distance between the information of each of those obstacles and the information of the camera obstacle is calculated from the coordinates, speed and size, and the camera obstacle is associated only with the obstacle in the preliminary fusion result at the minimum Euclidean distance. The clusters are then formed from the associated obstacle information. The preset condition may be the same as that in S102 and is determined according to the actual situation, which is not repeated here.
The information of separate obstacles in the preliminary fusion result is the information not associated with any camera obstacle, and the information of obstacles separately collected by the camera is the information not associated with any obstacle in the preliminary fusion result, i.e., the obstacle information that does not form a cluster.
Then, each cluster, the information of obstacles separately collected by the camera, and the information of separate obstacles in the preliminary fusion result are merged to obtain the final fusion result. An entry of the final fusion result specifically includes main information (main identifier, main coordinates, main speed, main size), primary-candidate information (primary-candidate identifier, primary-candidate coordinates, primary-candidate speed, primary-candidate size) and secondary-candidate information (secondary-candidate identifier, secondary-candidate coordinates, secondary-candidate speed, secondary-candidate size). The merging process is as follows:
in each cluster, the main information of the preliminary fusion result is taken as main information of the final fusion result; the alternative information of the preliminary fusion result is taken as primary-candidate information of the final fusion result; and the information of the camera obstacles corresponding to the preliminary fusion result in the cluster is taken as secondary-candidate information of the final fusion result. The main information of a separate entry in the preliminary fusion result is taken as main information of the final fusion result; the alternative information of a separate entry in the preliminary fusion result is taken as primary-candidate information of the final fusion result; and the information of obstacles separately collected by the camera is taken as main information of the final fusion result.
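A minimal sketch of the resulting record structure, assuming the Obstacle dataclass from the earlier sketch; the field names are illustrative.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FusedObstacle:
    """One entry of the final fusion result: main information is always
    present; primary-candidate info holds the radar detections of the
    preliminary cluster, if any; secondary-candidate info holds the camera
    detections attached in step S104, if any."""
    main: Obstacle
    primary_candidates: List[Obstacle] = field(default_factory=list)
    secondary_candidates: List[Obstacle] = field(default_factory=list)
```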
The method for fusing sensor data provided by this embodiment includes: acquiring first tracking data collected by a laser radar at a first moment and second tracking data collected by a millimeter wave radar, wherein the first tracking data includes information of at least one obstacle, the second tracking data includes information of at least one obstacle, and the information of each obstacle includes the identification, coordinates, speed and size of the obstacle; fusing the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result includes a plurality of preliminary clusters, information of obstacles separately collected by the laser radar, and information of obstacles separately collected by the millimeter wave radar, and each preliminary cluster includes the information of one obstacle collected by the laser radar and the information of at least one corresponding obstacle collected by the millimeter wave radar; acquiring third tracking data collected by a camera at the first moment, the third tracking data including information of at least one obstacle; and fusing the information of obstacles in the third tracking data whose size meets a preset condition with the preliminary fusion result to obtain a final fusion result. In this application, by combining the detection characteristics of the laser radar, the millimeter wave radar and the camera, a specific fusion strategy is adopted to fuse the tracking data they collect, which improves the accuracy and stability of the fusion result.
On the basis of the above embodiment, after the final fusion result of the laser radar, the millimeter wave radar and the camera is obtained, the final fusion result and the tracked result may be matched, and filtering processing may be performed to obtain a more accurate tracking result. Fig. 2 is a second flowchart of a method for fusing sensor data according to an embodiment of the present application, and as shown in fig. 2, the method further includes the following steps:
S201, the final fusion result is matched with the tracked result according to obstacle information to obtain an obstacle matching result.
The obstacle matching result includes information of an obstacle in the final fusion result and information of an obstacle in the tracked result corresponding to the obstacle.
The final fusion result and the tracked result are combined to obtain a final tracking result. The purpose is to avoid missed or false detections and further improve the accuracy and stability of the tracking result. For example: if the laser radar, the millimeter wave radar and the camera collect no tracking data at the current moment, directly reporting to the path planning module of the unmanned vehicle that no obstacles exist would be unreasonable.
The tracked result includes information of a plurality of tracked obstacles, specifically tracking information (tracking identifier, tracking coordinates, tracking speed, tracking size), main information (main identifier, main coordinates, main speed, main size), primary-candidate information (primary-candidate identifier, primary-candidate coordinates, primary-candidate speed, primary-candidate size) and secondary-candidate information (secondary-candidate identifier, secondary-candidate coordinates, secondary-candidate speed, secondary-candidate size).
The tracking information is input into the path planning module for path planning at the current moment; the main information, primary-candidate information and secondary-candidate information serve as the actual tracked result and are matched against the final fusion result at the next moment (the first moment) to obtain the final tracking result of each obstacle at the first moment. When the unmanned vehicle starts, the final fusion result of each obstacle is first obtained through steps S101-S104; a Kalman filter is then added to track the obstacle, its main information, primary-candidate information and secondary-candidate information are assigned from the main information, primary-candidate information and secondary-candidate information of the final fusion result of the obstacle, and its tracking information is assigned from the main information of the final fusion result of the obstacle.
The specific matching process is as follows:
the first step, prioritizing the original sensor matching, is that when there is the same trace of the same sensor in the final fused result and the tracked result, both are most likely to be the same obstacle. Wherein the information for each obstacle further includes an identification of a sensor tracking the obstacle. And performing association of the final fusion result and the tracked result according to the pairing priority, wherein the pairing priority is as follows: the main information is larger than the alternative information and larger than the secondary alternative information, and meanwhile, the main information sequentially comprises the following steps from high to low: the primary information is larger than the secondary information, for example: the main information and the main selection information between the obstacle in the final fusion result and the obstacle in the tracked result are the same, and then the two obstacles are preferably considered to be related by the main information, for example, the score of the dominance is 3; if the main information of the two obstacles is different and the main selection information and the secondary selection information are the same, the two obstacles are preferably considered to be related to the main selection information, for example, the score of the dominance is 2.
On the basis of the first step, the associated obstacles are matched with a certain probability; to confirm a match, the second step further checks whether the speeds of the associated obstacles are consistent, and if so, the associated obstacles are considered matched.
In the second step, to prevent mismatches, each association from the first step is traversed from high to low priority score, and the speeds are checked: if the change in speed or direction between the obstacle in the final fusion result and the obstacle in the tracked result does not exceed a certain threshold, the speeds are considered consistent and the associated obstacles are taken as matched. A sketch of these two steps follows below.
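The following Python fragment is one possible reading of these two steps, reusing the obstacle fields from the earlier sketches; the priority scores follow the example values in the text, while the speed and heading thresholds are placeholders.

```python
import math

def speed_consistent(a, b, max_dv=3.0, max_dtheta=math.radians(30)):
    """Second-step gate: the speed magnitude and heading of the two obstacles
    must not differ by more than the thresholds (placeholder values)."""
    va, vb = math.hypot(a.vx, a.vy), math.hypot(b.vx, b.vy)
    if abs(va - vb) > max_dv:
        return False
    if va > 1e-3 and vb > 1e-3:  # heading is undefined at near-zero speed
        dtheta = abs(math.atan2(a.vy, a.vx) - math.atan2(b.vy, b.vx))
        dtheta = min(dtheta, 2 * math.pi - dtheta)
        if dtheta > max_dtheta:
            return False
    return True

def original_sensor_matches(associations):
    """First two steps combined: `associations` is a list of
    (fused, tracked, priority_score) triples built from identical sensor
    tracks; traverse from highest score down and keep speed-consistent,
    not-yet-used pairs."""
    matched, used_f, used_t = [], set(), set()
    for fused, tracked, _score in sorted(associations, key=lambda a: -a[2]):
        if id(fused) in used_f or id(tracked) in used_t:
            continue
        if speed_consistent(fused, tracked):
            matched.append((fused, tracked))
            used_f.add(id(fused))
            used_t.add(id(tracked))
    return matched
```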
In the third step, matching across sensors (non-original-sensor matching) is considered: the information of obstacles remaining in the final fusion result after the second step is compared with the information of obstacles remaining in the tracked result, and obstacles with similar motion states (coordinates and speed) are associated.
Specifically, with λ the proportion of speed relative to position, a matching cost u between an obstacle obj in the final fusion result and an obstacle track in the tracked result is defined as:

$$u = \lVert \vec{w}_{\mathrm{obj}} - \vec{w}_{\mathrm{track}} \rVert + \lambda \, \lVert \vec{v}_{\mathrm{obj}} - \vec{v}_{\mathrm{track}} \rVert$$

where $\vec{w}_{\mathrm{obj}}$ and $\vec{w}_{\mathrm{track}}$ are the coordinate position vectors of obj and track, respectively, and $\vec{v}_{\mathrm{obj}}$ and $\vec{v}_{\mathrm{track}}$ are their velocity vectors. The more similar the two motion states, the smaller u is.
When u does not exceed a certain threshold, the two are considered possibly associated. It should be noted that, first, positions must be compared at the same moment, so the position of the obstacle track and the position of the obstacle obj are both taken at the first moment; second, the two associated parties must not already satisfy the first-step relationship (consistent sensor identifications), because this step is precisely meant to match obstacles that are not tracked by the same sensor. After association, the obstacles in the final fusion result and the obstacles in the tracked result form a weighted bipartite graph, and a maximum-weight matching is solved for it with the Kuhn-Munkres algorithm (KM algorithm); the resulting matching is the result of non-original-sensor matching.
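As a sketch of this third step: scipy's Hungarian solver (linear_sum_assignment) is used below in place of the KM algorithm named in the text; since u is a cost, minimizing total u over the bipartite graph is equivalent to KM maximum-weight matching with negated weights. The gate value and λ are placeholders.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_non_original(fused, tracked, lam=0.5, max_u=5.0):
    """Build the cost u for every remaining fused/tracked pair, solve the
    assignment, and keep only pairs whose u is within the gate."""
    if not fused or not tracked:
        return []
    cost = np.empty((len(fused), len(tracked)))
    for i, obj in enumerate(fused):
        for j, trk in enumerate(tracked):
            pos = np.hypot(obj.x - trk.x, obj.y - trk.y)      # |w_obj - w_track|
            vel = np.hypot(obj.vx - trk.vx, obj.vy - trk.vy)  # |v_obj - v_track|
            cost[i, j] = pos + lam * vel
    rows, cols = linear_sum_assignment(cost)
    return [(fused[i], tracked[j]) for i, j in zip(rows, cols)
            if cost[i, j] <= max_u]
```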
S202, filtering the obstacle matching result by adopting a Kalman filter to obtain a final tracking result of the matched obstacle.
The final tracking result specifically includes main information (main identifier, main coordinates, main speed, main size), primary-candidate information (primary-candidate identifier, primary-candidate coordinates, primary-candidate speed, primary-candidate size), secondary-candidate information (secondary-candidate identifier, secondary-candidate coordinates, secondary-candidate speed, secondary-candidate size) and tracking information (tracking identifier, tracking coordinates, tracking speed, tracking length and width).
The information of an obstacle in the final fusion result (hereinafter, the fusion obstacle) is used to filter the information of the corresponding obstacle in the tracked result (hereinafter, the tracking obstacle). The specific filtering process is as follows:
the state x of the Kalman filter is defined as the tracking coordinates $(w_{x1}, w_{y1})$, tracking velocity $(v_{x1}, v_{y1})$, acceleration $(a_{x1}, a_{y1})$ (the acceleration does not belong to the information of the tracking obstacle and is kept inside the Kalman filter), the length $l_1$ of the obstacle and the width $w_1$ of the obstacle, recorded as:

$$x = (w_{x1}, w_{y1}, v_{x1}, v_{y1}, a_{x1}, a_{y1}, l_1, w_1)^T$$
From this definition of the state, the state transition matrix $F_\delta$ for a time difference $\delta$ follows the standard constant-acceleration kinematics:

$$F_\delta = \begin{pmatrix} 1 & 0 & \delta & 0 & \tfrac{\delta^2}{2} & 0 & 0 & 0 \\ 0 & 1 & 0 & \delta & 0 & \tfrac{\delta^2}{2} & 0 & 0 \\ 0 & 0 & 1 & 0 & \delta & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & \delta & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{pmatrix}$$
Let the state covariance matrix be P, and assume that the process noise added during the state transition is a normal noise $n_1$ with mean 0 and covariance Q:

$$n_1 \sim N(0, Q)$$
The measurement state z of the Kalman filter consists of the main coordinates $(w_{x2}, w_{y2})$, main speed $(v_{x2}, v_{y2})$, main length $l_2$ and main width $w_2$ of the fusion obstacle, recorded as:

$$z = (w_{x2}, w_{y2}, v_{x2}, v_{y2}, l_2, w_2)^T$$
the measurement state transition matrix H is
The measurement noise is assumed to be a normal noise $n_2$ with mean 0 and covariance R:

$$n_2 \sim N(0, R)$$
if the fusion obstacle at the first time (referred to as time t) is matched with the tracking obstacle at time t- δ, that is, the obstacle matching result, a standard kalman filter is performed, and the state x 'of the fusion obstacle and the state covariance matrix P' are as follows:
x′=Fδx
P′=FδPFδ T+Q
The measurement residual y and residual covariance matrix S are:

$$y = z - Hx'$$
$$S = HP'H^T + R$$
At this time, the optimal Kalman gain K is:

$$K = P'H^T S^{-1}$$
So the state $x''_1$ and state covariance matrix $P''_1$ of the obstacle in the final tracking result are:

$$x''_1 = x' + Ky$$
$$P''_1 = (I - KH)P'$$
In this way, the main information, primary-candidate information and secondary-candidate information of the matched obstacle's final tracking result are assigned from the main information, primary-candidate information and secondary-candidate information of the fusion obstacle; the tracking identifier is the main identifier of the fusion obstacle; and the tracking coordinates, tracking speed and tracking length and width are assigned from the corresponding components of the state $x''_1$.
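The predict/update cycle above maps directly onto numpy; the following sketch follows the equations just given, with Q and R left as caller-supplied tuning matrices (their values are not specified in the patent).

```python
import numpy as np

def make_F(delta: float) -> np.ndarray:
    """Constant-acceleration transition for x = (wx, wy, vx, vy, ax, ay, l, w)^T."""
    F = np.eye(8)
    F[0, 2] = F[1, 3] = delta            # position <- velocity
    F[2, 4] = F[3, 5] = delta            # velocity <- acceleration
    F[0, 4] = F[1, 5] = 0.5 * delta**2   # position <- acceleration
    return F

# H selects (wx, wy, vx, vy, l, w) out of the 8-dimensional state.
H = np.zeros((6, 8))
for row, col in enumerate((0, 1, 2, 3, 6, 7)):
    H[row, col] = 1.0

def kalman_step(x, P, z, delta, Q, R):
    """One predict/update cycle following the equations above."""
    F = make_F(delta)
    x_pred = F @ x                        # x' = F_delta x
    P_pred = F @ P @ F.T + Q              # P' = F_delta P F_delta^T + Q
    y = z - H @ x_pred                    # measurement residual
    S = H @ P_pred @ H.T + R              # residual covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # optimal Kalman gain
    x_new = x_pred + K @ y                # x''_1 = x' + K y
    P_new = (np.eye(8) - K @ H) @ P_pred  # P''_1 = (I - K H) P'
    return x_new, P_new
```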
S203, a path is planned according to the final tracking result of the matched obstacles.
The unmanned vehicle is also provided with a path planning module, and the tracking information in the final tracking result of the matched obstacles is input into it for path planning: the input of the path planning module is the tracking information of the obstacles, and the output may be position coordinates for the unmanned vehicle, so that the vehicle drives around the obstacles.
It should be noted that the main information, primary-candidate information and secondary-candidate information in the final tracking result of the matched obstacles may be used as the tracked result at the next moment.
The method for fusing sensor data provided by this embodiment matches the final fusion result with the tracked result according to obstacle information to obtain an obstacle matching result, which includes the information of an obstacle in the final fusion result and the information of the corresponding obstacle in the tracked result; filters the obstacle matching result with a Kalman filter to obtain the final tracking result of the matched obstacle; and plans a path according to the final tracking result of the matched obstacle. Obtaining the final tracking result in this way and then planning the path according to it improves the accuracy of path planning.
On the basis of the foregoing embodiment of fig. 2, fig. 3 is a flowchart three of a method for fusing sensor data provided in the embodiment of the present application, and as shown in fig. 3, the method for fusing sensor data provided in the embodiment further includes the following steps:
S301, the Kalman filter is used to predict the information of unmatched obstacles in the tracked result, obtaining the final tracking result of the unmatched obstacles in the tracked result.
An unmatched obstacle in the tracked result is an obstacle for which no matching fusion obstacle exists in the final fusion result, i.e., a tracking obstacle at time $t-\delta$ with no matched fusion obstacle at time t, indicating that the obstacle was missed at time t (the first moment). The state $x''_2$ and state covariance matrix $P''_2$ of the unmatched obstacle in the final tracking result are obtained by prediction only:

$$x''_2 = F_\delta x$$
$$P''_2 = F_\delta P F_\delta^T + Q$$

In this way, the main information, primary-candidate information and secondary-candidate information of the unmatched tracked obstacle's final tracking result are assigned from the main information, primary-candidate information and secondary-candidate information of the tracking obstacle; the tracking identifier is the main identifier of the tracking obstacle; and the tracking coordinates, tracking speed and tracking length and width are assigned from the corresponding components of the state $x''_2$.
In one possible implementation, if an obstacle remains unmatched in the tracked result for a preset duration, its information is removed, yielding a new tracked result.
The information of an unmatched obstacle in the tracked result is predicted forward as in step S301; if the obstacle remains unmatched throughout the preset duration, the obstacle is deemed not to exist, prediction stops, its information is removed, and a new tracked result is obtained. That is, if a tracked obstacle is not matched by any fusion obstacle over several consecutive matching rounds, the obstacle does not exist. The preset duration may be the duration of several matching rounds and is selected according to the actual situation, which is not limited in this embodiment.
S302, a new Kalman filter is added to store the information of an unmatched obstacle newly appearing in the final fusion result, obtaining the final tracking result of the newly appearing unmatched obstacle in the final fusion result.
An unmatched obstacle in the final fusion result is an obstacle for which no matching tracking obstacle exists in the tracked result, i.e., a fusion obstacle at time t with no matched tracking obstacle at time $t-\delta$, indicating that the obstacle was not yet detected at time $t-\delta$. A Kalman filter is therefore added to track the obstacle.
In this way, the main information, primary-candidate information and secondary-candidate information of the newly appearing unmatched obstacle's final tracking result are assigned from the main information, primary-candidate information and secondary-candidate information of the fusion obstacle; the tracking identifier is the main identifier of the fusion obstacle; and the tracking coordinates, tracking speed and tracking length and width are assigned from the main information of the fusion obstacle.
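Steps S301 and S302 amount to track lifecycle management. The sketch below, reusing make_F from the previous fragment, is one hedged reading; track_id, the miss counter and the new_track_from constructor are illustrative names not from the patent, and the miss limit stands in for the preset duration.

```python
def manage_tracks(tracks, matches, new_detections, delta, Q, max_misses=5):
    """Propagate unmatched tracks by prediction only (S301), drop tracks that
    stay unmatched past the miss limit, and open a fresh filter for each
    newly appearing fused obstacle (S302)."""
    matched_ids = {trk.track_id for _, trk in matches}
    survivors = []
    for trk in tracks:
        if trk.track_id in matched_ids:
            trk.misses = 0
            survivors.append(trk)
            continue
        F = make_F(delta)                 # prediction-only step: x''_2 = F_delta x
        trk.x = F @ trk.x
        trk.P = F @ trk.P @ F.T + Q
        trk.misses += 1
        if trk.misses <= max_misses:      # preset duration before removal
            survivors.append(trk)
    # S302: one new Kalman filter per newly appearing, unmatched fused obstacle
    survivors.extend(new_track_from(det) for det in new_detections)  # hypothetical constructor
    return survivors
```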
S303, a path is planned according to the final tracking result of the matched obstacles, the final tracking result of the unmatched obstacles in the tracked result, and the final tracking result of the newly appearing unmatched obstacles in the final fusion result.
The tracking information in the final tracking result of the matched obstacles, in the final tracking result of the unmatched obstacles in the tracked result, and in the final tracking result of the newly appearing unmatched obstacles in the final fusion result can all be input into the path planning module for path planning.
The main information, primary-candidate information and secondary-candidate information in the final tracking result of the matched obstacles, of the unmatched obstacles in the tracked result, and of the newly appearing unmatched obstacles in the final fusion result can be used as the tracked result at the next moment, after which the above technical solution continues to be executed.
The method for fusing sensor data provided by this embodiment predicts the information of unmatched obstacles in the tracked result with the Kalman filter to obtain their final tracking result, and adds a new Kalman filter for unmatched obstacles newly appearing in the final fusion result to obtain their final tracking result, which improves the accuracy of path planning.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Fig. 4 is a schematic structural diagram of a sensor data fusion device provided in an embodiment of the present application, in this embodiment, the sensor data fusion device may be integrated in an unmanned vehicle, and as shown in fig. 4, the device may include:
an acquisition module 41 and a fusion module 42.
An obtaining module 41, configured to obtain first tracking data collected by a laser radar and second tracking data collected by a millimeter wave radar at a first time, where the first tracking data includes information of at least one obstacle, the second tracking data includes information of at least one obstacle, and the information of each obstacle includes an identifier, a coordinate, a speed, and a size of the obstacle;
a fusion module 42, configured to fuse the first tracking data and the second tracking data to obtain a preliminary fusion result, where the preliminary fusion result includes a plurality of preliminary clusters, information of obstacles separately acquired by the laser radar, and information of obstacles separately acquired by the millimeter wave radar, and each preliminary cluster includes information of an obstacle collected by the laser radar and information of at least one obstacle collected by the millimeter wave radar corresponding to the obstacle;
the acquiring module 41 is further configured to acquire third tracking data acquired by the camera at the first time, where the third tracking data includes information of at least one obstacle;
the fusion module 42 is further configured to fuse information of the obstacle in the third tracking data, the size of which meets a preset condition, with the preliminary fusion result, so as to obtain a final fusion result.
In a possible implementation, the obtaining module 41 is specifically configured to:
acquiring a second moment closest to the first moment in moments corresponding to the tracking data acquired by the millimeter wave radar;
and acquiring the second tracking data according to the tracking data acquired by the millimeter wave radar at the second moment.
In a possible implementation, the obtaining module 41 is specifically configured to:
acquiring a third moment closest to the first moment in moments corresponding to the tracking data acquired by the camera;
and acquiring the third tracking data according to the tracking data acquired by the camera at the third moment.
In one possible implementation, the method further comprises:
a matching module 43, configured to match the final fusion result and the tracked result according to obstacle information, and obtain an obstacle matching result, where the obstacle matching result includes information of an obstacle in the final fusion result and information of an obstacle in the tracked result corresponding to the obstacle;
and a filtering module 44, configured to filter the obstacle matching result by using a kalman filter, to obtain a final tracking result of the matched obstacle.
In one possible implementation, the filtering module 44 is further configured to:
and predicting the information of the unmatched obstacles in the tracked result by adopting the Kalman filter to obtain the final tracking result of the unmatched obstacles in the tracked result.
In one possible implementation, the filtering module 44 is further configured to:
add a new Kalman filter to store the information of an unmatched obstacle newly appearing in the final fusion result, to obtain a final tracking result of the newly appearing unmatched obstacle in the final fusion result.
In one possible implementation, the method further comprises:
and the processing module 45 is configured to perform path planning according to the final tracking result of the matched obstacle.
The apparatus provided in the embodiment of the present application may be used to execute the method in the embodiments shown in fig. 1 to fig. 3, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 5 is a schematic structural diagram of a sensor data fusion device according to an embodiment of the present application. As shown in fig. 5, the sensor data fusion device 50 may include:
a processor 51 and a memory 52. Wherein the processor 51 and the memory 52 may be connected by a bus 53.
a memory 52 for storing computer instructions; and
a processor 51 for executing the computer instructions stored by the memory 52, which cause the processor 51 to perform the method shown in fig. 1 to 3.
For a specific implementation process of the processor 51, reference may be made to the above method embodiments, which have similar implementation principles and technical effects, and details of this embodiment are not described herein again.
The embodiment of the present application provides a computer-readable storage medium, in which computer-executable instructions are stored, and when a processor executes the computer-executable instructions, the method according to the embodiment shown in fig. 1 to 3 is implemented.
It is to be understood that the various numerical references referred to in the embodiments of the present application are merely for descriptive convenience and are not intended to limit the scope of the embodiments of the present application.
It should be understood that, in the embodiment of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiment of the present application.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced, and such modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present application.
Claims (10)
1. A method for fusing sensor data, applied to an apparatus for fusing sensor data, the method comprising:
acquiring first tracking data acquired by a laser radar at a first moment and second tracking data acquired by a millimeter wave radar at the first moment, wherein the first tracking data comprises information of at least one obstacle, the second tracking data comprises information of at least one obstacle, and the information of each obstacle comprises an identifier, coordinates, a speed and a size of the obstacle;
fusing the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result comprises a plurality of preliminary clusters, information of obstacles detected only by the laser radar, and information of obstacles detected only by the millimeter wave radar, and each preliminary cluster comprises the information of one obstacle detected by the laser radar and the information of at least one corresponding obstacle detected by the millimeter wave radar;
acquiring third tracking data acquired by a camera at the first moment, wherein the third tracking data comprises information of at least one obstacle;
and fusing the information of each obstacle in the third tracking data whose size meets a preset condition with the preliminary fusion result to obtain a final fusion result.
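For illustration only, a rough sketch of the two-stage strategy recited in this claim; the association radius, the size threshold standing in for the "preset condition", and the flat `Obstacle` record are all assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    ident: int
    x: float          # coordinates (m)
    y: float
    speed: float      # m/s
    size: float       # e.g. bounding-box length (m)

ASSOC_RADIUS = 1.5    # assumed lidar/radar association distance (m)
MIN_CAM_SIZE = 0.5    # assumed "preset condition" on camera obstacle size (m)

def fuse(lidar, radar, camera):
    dist = lambda a, b: ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

    # Stage 1: each lidar obstacle with nearby radar obstacles anchors a
    # preliminary cluster; everything else stays as a sensor-only detection.
    clusters, lidar_only = [], []
    for l in lidar:
        near = [r for r in radar if dist(l, r) < ASSOC_RADIUS]
        if near:
            clusters.append((l, near))   # one lidar obstacle + its radar matches
        else:
            lidar_only.append(l)         # seen by the lidar alone
    clustered = {r.ident for _, rs in clusters for r in rs}
    radar_only = [r for r in radar if r.ident not in clustered]

    # Stage 2: admit only camera obstacles whose size meets the preset condition.
    admitted_camera = [c for c in camera if c.size >= MIN_CAM_SIZE]
    return {"clusters": clusters, "lidar_only": lidar_only,
            "radar_only": radar_only, "camera": admitted_camera}
```

The sketch uses a plain distance gate for clustering; the claim itself leaves the exact association criterion and size condition open.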
2. The method of claim 1, wherein the obtaining second tracking data collected by the millimeter wave radar at the first time comprises:
acquiring, from moments corresponding to the tracking data acquired by the millimeter wave radar, a second moment closest to the first moment;
and acquiring the second tracking data according to the tracking data acquired by the millimeter wave radar at the second moment.
3. The method of claim 1, wherein said obtaining third tracking data acquired by the camera at the first time comprises:
acquiring, from moments corresponding to the tracking data acquired by the camera, a third moment closest to the first moment;
and acquiring the third tracking data according to the tracking data acquired by the camera at the third moment.
4. The method of claim 1, further comprising:
matching the final fusion result against a tracked result according to obstacle information to obtain an obstacle matching result, wherein the obstacle matching result comprises the information of each matched obstacle in the final fusion result and the information of the corresponding obstacle in the tracked result;
and filtering the obstacle matching result with a Kalman filter to obtain a final tracking result of each matched obstacle.
5. The method of claim 4, further comprising:
predicting, by using the Kalman filter, the information of the unmatched obstacles in the tracked result to obtain a final tracking result of the unmatched obstacles in the tracked result.
6. The method of claim 5, further comprising:
adding a new Kalman filter to store the information of a newly appearing unmatched obstacle in the final fusion result, to obtain a final tracking result of the newly appearing unmatched obstacle in the final fusion result.
7. The method of claim 4, further comprising:
and planning a path according to the final tracking result of the matched obstacle.
8. An apparatus for fusing sensor data, comprising:
an acquisition module, configured to acquire first tracking data acquired by a laser radar at a first moment and second tracking data acquired by a millimeter wave radar at the first moment, wherein the first tracking data comprises information of at least one obstacle, the second tracking data comprises information of at least one obstacle, and the information of each obstacle comprises an identifier, coordinates, a speed and a size of the obstacle;
and a fusion module, configured to fuse the first tracking data and the second tracking data to obtain a preliminary fusion result, wherein the preliminary fusion result comprises a plurality of preliminary clusters, information of obstacles detected only by the laser radar, and information of obstacles detected only by the millimeter wave radar, and each preliminary cluster comprises the information of one obstacle detected by the laser radar and the information of at least one corresponding obstacle detected by the millimeter wave radar;
wherein the acquisition module is further configured to acquire third tracking data acquired by a camera at the first moment, the third tracking data comprising information of at least one obstacle;
and the fusion module is further configured to fuse the information of each obstacle in the third tracking data whose size meets a preset condition with the preliminary fusion result to obtain a final fusion result.
9. An apparatus for fusing sensor data, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the method for fusing sensor data according to any one of claims 1 to 7.
10. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the method of fusing sensor data according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911032163.7A CN110866544B (en) | 2019-10-28 | 2019-10-28 | Sensor data fusion method and device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110866544A true CN110866544A (en) | 2020-03-06 |
CN110866544B CN110866544B (en) | 2022-04-15 |
Family ID=69654659
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911032163.7A Active CN110866544B (en) | 2019-10-28 | 2019-10-28 | Sensor data fusion method and device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110866544B (en) |
Worldwide applications: 2019-10-28 — CN application CN201911032163.7A, granted as CN110866544B (Active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109212532A (en) * | 2017-07-04 | 2019-01-15 | 百度在线网络技术(北京)有限公司 | Method and apparatus for detecting barrier |
CN109747643A (en) * | 2017-11-07 | 2019-05-14 | 郑州宇通客车股份有限公司 | A kind of information fusion method of intelligent vehicle sensory perceptual system |
CN109212521A (en) * | 2018-09-26 | 2019-01-15 | 同济大学 | A kind of method for tracking target merged based on forward sight camera with millimetre-wave radar |
CN109298415A (en) * | 2018-11-20 | 2019-02-01 | 中车株洲电力机车有限公司 | A kind of track and road barricade object detecting method |
CN109752719A (en) * | 2019-01-27 | 2019-05-14 | 南昌航空大学 | A kind of intelligent automobile environment perception method based on multisensor |
CN109596078A (en) * | 2019-01-28 | 2019-04-09 | 吉林大学 | Multi-information fusion spectrum of road surface roughness real-time testing system and test method |
Non-Patent Citations (2)
Title |
---|
XIAO-PENG GUO,ET AL.: "Pedestrian Detection Based on Fusion of Millimeter Wave Radar and Vision", 《AIPR 2018》 * |
LIU WEI: "Research on Obstacle Detection in Front of Intelligent Vehicles Based on Lidar and Machine Vision", China Excellent Master's and Doctoral Dissertations Full-text Database (Master), Engineering Science and Technology II *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111667512A (en) * | 2020-05-28 | 2020-09-15 | 浙江树人学院(浙江树人大学) | Multi-target vehicle track prediction method based on improved Kalman filtering |
CN111667512B (en) * | 2020-05-28 | 2024-04-09 | 浙江树人学院(浙江树人大学) | Multi-target vehicle track prediction method based on improved Kalman filtering |
CN111753901A (en) * | 2020-06-23 | 2020-10-09 | 国汽(北京)智能网联汽车研究院有限公司 | Data fusion method, device and system and computer equipment |
CN111753901B (en) * | 2020-06-23 | 2023-08-15 | 国汽(北京)智能网联汽车研究院有限公司 | Data fusion method, device, system and computer equipment |
WO2022237210A1 (en) * | 2021-05-12 | 2022-11-17 | 上海仙途智能科技有限公司 | Obstacle information generation |
CN114415489A (en) * | 2021-12-02 | 2022-04-29 | 北京罗克维尔斯科技有限公司 | Vehicle-mounted sensor time synchronization method, device, equipment and medium |
CN114415489B (en) * | 2021-12-02 | 2023-09-22 | 北京罗克维尔斯科技有限公司 | Time synchronization method, device, equipment and medium for vehicle-mounted sensor |
Also Published As
Publication number | Publication date |
---|---|
CN110866544B (en) | 2022-04-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110866544B (en) | Sensor data fusion method and device and storage medium | |
CN109829351B (en) | Method and device for detecting lane information and computer readable storage medium | |
CN110587597B (en) | SLAM closed loop detection method and detection system based on laser radar | |
US11480967B2 (en) | Pass route planning method and apparatus, device and readable storage medium | |
EP3674161A1 (en) | A failure detection device for an external sensor and a failure detection method for an external sensor | |
US11738747B2 (en) | Server device and vehicle | |
KR102569900B1 (en) | Apparatus and method for performing omnidirectional sensor-fusion and vehicle including the same | |
CN111937036A (en) | Method, apparatus, and computer-readable storage medium having instructions for processing sensor data | |
US12077161B2 (en) | Method and processing unit for determining information with respect to an object in an environment of a vehicle | |
CN113093178A (en) | Obstacle target detection method and device, domain controller and vehicle | |
CN111753623B (en) | Method, device, equipment and storage medium for detecting moving object | |
CN113759906B (en) | Vehicle alignment method and device, computer equipment and storage medium | |
CN113269811A (en) | Data fusion method and device and electronic equipment | |
CN111796286A (en) | Brake grade evaluation method and device, vehicle and storage medium | |
CN111222441A (en) | Point cloud target detection and blind area target detection method and system based on vehicle-road cooperation | |
CN114460598A (en) | Target identification method, device, equipment and storage medium | |
CN111784730A (en) | Object tracking method and device, electronic equipment and storage medium | |
CN114219829A (en) | Vehicle tracking method, computer equipment and storage device | |
CN114894193A (en) | Path planning method and device for unmanned vehicle, electronic equipment and medium | |
CN115908498B (en) | Multi-target tracking method and device based on category optimal matching | |
JP2022537557A (en) | Method and apparatus for determining drivable area information | |
CN115014366A (en) | Target fusion method and device, vehicle and storage medium | |
CN115457506A (en) | Target detection method, device and storage medium | |
CN110969058B (en) | Fusion method and device for environment targets | |
KR102172849B1 (en) | Detecting system for approaching vehicle in video and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||