CN110879598A - Information fusion method and device of multiple sensors for vehicle - Google Patents


Publication number
CN110879598A
CN110879598A (application CN201911268821.2A)
Authority
CN
China
Prior art keywords
estimated
obstacle
information
sensor
measurement result
Prior art date
Legal status
Pending
Application number
CN201911268821.2A
Other languages
Chinese (zh)
Inventor
张巍
余贵珍
李华志
黄立明
冯冲
Current Assignee
Beijing Tage Chi Technology Co Ltd
Beijing Tage Idriver Technology Co Ltd
Original Assignee
Beijing Tage Chi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Tage Chi Technology Co Ltd
Priority to CN201911268821.2A
Publication of CN110879598A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0257 Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

The invention provides an information fusion method and device for multiple vehicle sensors, wherein the method comprises the following steps: when multiple sensors among the plurality of sensors are updated between time t_N and time t_{N+1}, acquiring the (M+1)-th sensor's (M+1)-th estimated measurement result and (M+1)-th estimated track information about an obstacle at time t_{N+1}, wherein the (M+1)-th estimated measurement result is obtained from the (M+1)-th sensor's (M+1)-th actual measurement information about the obstacle at its update time, the (M+1)-th estimated track information is obtained from the M-th estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers; performing association matching on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information; and, when the association matching is a valid match, performing optimal fusion estimation based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information to obtain the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1} for the plurality of sensors.

Description

Information fusion method and device of multiple sensors for vehicle
Technical Field
The invention relates to the technical field of unmanned vehicles, in particular to a method and a device for information fusion of multiple sensors for a vehicle.
Background
With the rapid development of intelligent technology, unmanned driving technology is gradually maturing. To ensure the driving safety of an unmanned vehicle, a perception system is built from multiple sensors. By perceiving the surrounding environment, the perception system provides reliable road driving information for decision, planning, and control. Each sensor obtains obstacle information from its own perspective.
However, much of the information received by these sensors is redundant, and different sensors have different reliability for different kinds of obstacle information.
Disclosure of Invention
In view of the above problems, the invention provides a method and a device for fusing information from multiple vehicle sensors.
In a first aspect, an information fusion method for multiple sensors for a vehicle is provided, which includes: when multiple sensors among the plurality of sensors are updated between time t_N and time t_{N+1}, acquiring the (M+1)-th sensor's (M+1)-th estimated measurement result and (M+1)-th estimated track information about the obstacle at time t_{N+1}, wherein the (M+1)-th estimated measurement result is obtained from the (M+1)-th sensor's (M+1)-th actual measurement information about the obstacle at its update time, the (M+1)-th estimated track information is obtained from the M-th estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers; performing association matching on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information; and, when the association matching is a valid match, performing optimal fusion estimation based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information to obtain the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1} for the plurality of sensors.
In some embodiments of the present invention, the method further comprises: when the 1st sensor among the plurality of sensors is updated for the first time between time t_N and time t_{N+1}, acquiring the 1st sensor's 1st estimated measurement result and 1st estimated track information about the obstacle at time t_{N+1}, wherein the 1st estimated measurement result is obtained from the 1st sensor's 1st actual measurement information about the obstacle at the update time, and the 1st estimated track information is obtained from the second optimal fusion estimation result of the target state of the obstacle at time t_N; performing association matching on the 1st estimated measurement result and the 1st estimated track information; and, when the association matching is a valid match, performing optimal fusion estimation based on the 1st estimated measurement result and the 1st estimated track information to obtain the 2nd estimated track information of the target state of the obstacle at time t_{N+1} for the 2nd sensor.
In some embodiments of the present invention, when the association matching is a valid match, performing optimal fusion estimation based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information to obtain the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1} for the plurality of sensors comprises: when the association matching is a valid match, performing optimal fusion estimation by using a Kalman filter algorithm based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information.
In some embodiments of the present invention, the optimal fusion estimation performed by using the Kalman filter algorithm based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information is calculated by the following formulas:

x̂_{M+2}(t_{N+1}) = x̂_{M+1}(t_{N+1}) + K_{M+1}(t_{N+1}) (ẑ_{M+1}(t_{N+1}) - H_{M+1} x̂_{M+1}(t_{N+1})),

wherein

K_{M+1}(t_{N+1}) = P_{M+1}(t_{N+1}) H_{M+1}^T (H_{M+1} P_{M+1}(t_{N+1}) H_{M+1}^T + R)^-1,

P_{M+2}(t_{N+1}) = (I - K_{M+1}(t_{N+1}) H_{M+1}) P_{M+1}(t_{N+1}),

x̂_{M+1}(t_{N+1}) is the (M+1)-th estimated track information about the obstacle at time t_{N+1}, ẑ_{M+1}(t_{N+1}) is the (M+1)-th estimated measurement result about the obstacle at time t_{N+1}, K_{M+1}(t_{N+1}) is the Kalman filter gain for the (M+1)-th sensor about the obstacle at time t_{N+1}, H_{M+1} is the conversion matrix of the (M+1)-th sensor, H_{M+1}^T is the transpose of the conversion matrix, P_{M+1}(t_{N+1}) is the covariance corresponding to the (M+1)-th estimated measurement result about the obstacle at time t_{N+1}, R is the covariance of the measurement noise, I is the identity matrix, the (M+1)-th sensor is the last sensor to update between time t_N and time t_{N+1}, and x̂_{M+2}(t_{N+1}) is the first optimal fusion estimation result for the obstacle at time t_{N+1}.
In some embodiments of the present invention, acquiring, when multiple sensors among the plurality of sensors are updated between time t_N and time t_{N+1}, the (M+1)-th sensor's (M+1)-th estimated measurement result and (M+1)-th estimated track information about the obstacle at time t_{N+1} comprises: acquiring the (M+1)-th actual measurement information of the (M+1)-th sensor about the obstacle at its update time; performing spatial alignment on the (M+1)-th actual measurement information based on the (M+1)-th actual measurement information and the position information of the vehicle to obtain the spatially aligned (M+1)-th estimated measurement result about the obstacle; and synchronizing the spatially aligned (M+1)-th estimated measurement result about the obstacle to time t_{N+1} to obtain the (M+1)-th sensor's (M+1)-th estimated measurement result about the obstacle at time t_{N+1}.
In some embodiments of the present invention, after performing the association matching on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information, the method further includes: calculating the optimal matching result of the (M+1)-th estimated measurement result and the (M+1)-th estimated track information by using the KM (Kuhn-Munkres) algorithm or the Hungarian algorithm.
In some embodiments of the present invention, the method further comprises: when the association matching is an invalid match, adding the corresponding first obstacle information in the sensor list to an initial track information list, and confirming the first obstacle information and adding it to a target track information list when the (M+1)-th estimated measurement result and the (M+1)-th estimated track information are continuously and successfully matched within a preset period; and adding second obstacle information in the target track information list to a to-be-deleted track information list, and deleting the second obstacle information when the (M+1)-th estimated measurement result and the (M+1)-th estimated track information cannot be successfully matched within a preset period.
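The initial / target / to-be-deleted list logic above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class and parameter names (TrackManager, confirm_hits, delete_misses) are invented, and the patent does not specify the counting rules for the "preset period".

```python
# Hypothetical sketch of the track lifecycle: unmatched detections start in an
# initial list, are promoted to the target list after several consecutive
# successful matches, and confirmed tracks that keep failing to match are
# removed (the "to-be-deleted" step is collapsed into a deletion here).

class TrackManager:
    def __init__(self, confirm_hits=3, delete_misses=3):
        self.confirm_hits = confirm_hits
        self.delete_misses = delete_misses
        self.initial = {}    # track id -> consecutive successful matches
        self.confirmed = {}  # track id -> consecutive failed matches

    def update(self, track_id, matched):
        if track_id in self.confirmed:
            # Confirmed (target list): count consecutive misses, delete on limit.
            self.confirmed[track_id] = 0 if matched else self.confirmed[track_id] + 1
            if self.confirmed[track_id] >= self.delete_misses:
                del self.confirmed[track_id]
        else:
            # Initial list: count consecutive hits, promote on limit.
            self.initial[track_id] = self.initial.get(track_id, 0) + 1 if matched else 0
            if self.initial[track_id] >= self.confirm_hits:
                del self.initial[track_id]
                self.confirmed[track_id] = 0

mgr = TrackManager()
for _ in range(3):
    mgr.update("obstacle-7", matched=True)
print(sorted(mgr.confirmed))  # obstacle-7 is now a confirmed target track
```

Used this way, a spurious detection from a single noisy frame never reaches the target list, which matches the filtering intent described above.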
In a second aspect, an apparatus for information fusion of multiple sensors for a vehicle is provided, comprising: an acquisition module configured to acquire, when multiple sensors among the plurality of sensors are updated between time t_N and time t_{N+1}, the (M+1)-th sensor's (M+1)-th estimated measurement result and (M+1)-th estimated track information about the obstacle at time t_{N+1}, wherein the (M+1)-th estimated measurement result is obtained from the (M+1)-th sensor's (M+1)-th actual measurement information about the obstacle at its update time, the (M+1)-th estimated track information is obtained from the M-th estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers; a judging module configured to perform association matching on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information; and a fusion module configured to, when the association matching is a valid match, perform optimal fusion estimation based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information to obtain the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1} for the plurality of sensors.
In some embodiments of the invention, the acquisition module is further configured to acquire, when the 1st sensor among the plurality of sensors is updated for the first time between time t_N and time t_{N+1}, the 1st sensor's 1st estimated measurement result and 1st estimated track information about the obstacle at time t_{N+1}, wherein the 1st estimated measurement result is obtained from the 1st sensor's 1st actual measurement information at the update time and the 1st estimated track information is obtained from the second optimal fusion estimation result of the target state of the obstacle at time t_N; the judging module is further configured to perform association matching on the 1st estimated measurement result and the 1st estimated track information; and the fusion module is further configured to, when the association matching is a valid match, perform optimal fusion estimation based on the 1st estimated measurement result and the 1st estimated track information to obtain the 2nd estimated track information of the target state of the obstacle at time t_{N+1} for the 2nd sensor.
In some embodiments of the present invention, the fusion module is specifically configured to perform optimal fusion estimation by using a kalman filter algorithm based on the M +1 th predicted measurement result and the M +1 th predicted track information when the association matching is a valid matching.
In some embodiments of the present invention, the acquisition module is specifically configured to: acquire the (M+1)-th actual measurement information of the (M+1)-th sensor about the obstacle at its update time; perform spatial alignment on the (M+1)-th actual measurement information based on the (M+1)-th actual measurement information and the position information of the vehicle to obtain the spatially aligned (M+1)-th estimated measurement result about the obstacle; and synchronize the spatially aligned (M+1)-th estimated measurement result about the obstacle to time t_{N+1} to obtain the (M+1)-th sensor's (M+1)-th estimated measurement result about the obstacle at time t_{N+1}.
In some embodiments of the present invention, the apparatus further comprises: and the calculation module is used for calculating the optimal matching result of the M +1 th estimated measurement result and the M +1 th estimated track information by using a KM algorithm or a Hungarian algorithm.
In some embodiments of the present invention, the apparatus further comprises a processing module configured to: when the association matching is an invalid match, add the corresponding first obstacle information in the sensor list to an initial track information list, and confirm the first obstacle information and add it to a target track information list when the (M+1)-th estimated measurement result and the (M+1)-th estimated track information are continuously and successfully matched within a preset period; and add second obstacle information in the target track information list to a to-be-deleted track information list, and delete the second obstacle information when the (M+1)-th estimated measurement result and the (M+1)-th estimated track information cannot be successfully matched within a preset period.
In a third aspect, a computer-readable storage medium is provided, wherein the storage medium stores a computer program for executing the information fusion method of multiple sensors for a vehicle described above.
In a fourth aspect, an electronic device is also provided, comprising: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to execute the information fusion method of multiple sensors for a vehicle described above.
Based on the embodiments of the invention, a complete multi-sensor information fusion solution is provided. By matching the obstacle information acquired by a sensor against the track information of the obstacle, obstacle information that may have been misidentified by a single sensor can be filtered out, so that the screened valid obstacle information is fused to obtain more accurate obstacle information. At the same time, information association can be performed between different sensors, so that sensors of various types can be fused better and a more accurate and reliable obstacle fusion result can be extracted. In addition, this specific multi-sensor information fusion method reduces the system's requirements on communication bandwidth and hardware as well as the development difficulty, increases calculation speed, and improves system reliability.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort, wherein:
fig. 1 is a schematic flow chart of an information fusion method of a vehicular multi-sensor according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of an information fusion method of a multi-sensor for a vehicle according to another embodiment of the present invention.
Fig. 3 is a schematic configuration diagram of an information fusion device for a vehicular multi-sensor according to an embodiment of the present invention.
FIG. 4 is a block diagram of a computer device of an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, in an open-pit mining area, road conditions are poor and much dust is raised, and during automatic driving each single sensor is limited by its own characteristics, so accurate perception of the surrounding environment is difficult to achieve. For example: lidar is affected by raised dust and may falsely identify obstacles; there are no street lamps at night in a mining area, so a camera cannot be used at night; and millimeter-wave radar has low resolution for identifying obstacles and cannot obtain accurate obstacle information. Therefore, to achieve more accurate detection of surrounding obstacles in the mine environment, sensor fusion techniques must be incorporated. The embodiments of the present invention can be used for unmanned multi-sensor information fusion scenarios in open-pit mining areas.
Fig. 1 is a schematic flow chart of an information fusion method of a vehicular multi-sensor according to an embodiment of the present invention.
110, when multiple sensors among the plurality of sensors are updated between time t_N and time t_{N+1}, acquiring the (M+1)-th sensor's (M+1)-th estimated measurement result and (M+1)-th estimated track information about the obstacle at time t_{N+1}, wherein the (M+1)-th estimated measurement result is obtained from the (M+1)-th sensor's (M+1)-th actual measurement information about the obstacle at its update time, the (M+1)-th estimated track information is obtained from the M-th estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers;
120, performing association matching on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information;
130, when the association matching is a valid match, performing optimal fusion estimation based on the (M+1)-th estimated measurement result and the (M+1)-th estimated track information to obtain the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1} for the plurality of sensors.
Based on the embodiment of the invention, a complete multi-sensor information fusion solution is provided. By matching the obstacle information acquired by the sensor with the track information of the obstacle, the obstacle information which is possibly mistakenly identified by a single sensor can be filtered, so that the screened effective obstacle information is fused to obtain more accurate obstacle information. Meanwhile, information correlation can be carried out between different sensors, so that the sensors in various forms can be better fused, and more accurate and reliable obstacle fusion results can be extracted from the fusion results.
Specifically, time t_N and time t_{N+1} are times at which the information of the multiple sensors is fused, with t_N earlier than t_{N+1}. The plurality of sensors may include cameras, lidar, millimeter-wave radar, and sources of other-vehicle information based on V2V communication, among others, and they may be distributed at different locations on the vehicle. Each sensor can process the raw obstacle data it acquires to obtain obstacle information and then send its result to the fusion center for information fusion, where the obstacle information may include the position, speed, length, width, and/or height of the obstacle. Compared with a centralized fusion technique, this fusion method has low requirements on communication bandwidth, high calculation speed, and good reliability. The obstacles may be pedestrians, vehicles, fixed obstacles, and the like, and there may be one or more of them; for example, the number of obstacles may be i, where i is a positive integer, and each sensor may detect the relevant information of each obstacle. The actual measurement information may be the sensing values of the sensor and may include six values: the longitudinal position of the obstacle relative to the host vehicle, the longitudinal velocity, the longitudinal acceleration, the lateral position of the obstacle relative to the host vehicle, the lateral velocity, and the lateral acceleration; alternatively, the sensor may calculate these six values from its sensing values. When the obstacle is a vehicle, the actual measurement information may be acquired through V2V communication.
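The six-element measurement described above can be sketched as a state vector propagated under a constant-acceleration motion model. This is an illustrative assumption, not the patent's specified model; all function and variable names are invented.

```python
# Hypothetical sketch: the six-element obstacle state
# [x_lon, v_lon, a_lon, x_lat, v_lat, a_lat] described above,
# propagated one step with a constant-acceleration model.

def predict_state(state, dt):
    """One-step prediction of [x_lon, v_lon, a_lon, x_lat, v_lat, a_lat]."""
    x_lon, v_lon, a_lon, x_lat, v_lat, a_lat = state
    return [
        x_lon + v_lon * dt + 0.5 * a_lon * dt ** 2,  # longitudinal position
        v_lon + a_lon * dt,                          # longitudinal velocity
        a_lon,                                       # longitudinal acceleration
        x_lat + v_lat * dt + 0.5 * a_lat * dt ** 2,  # lateral position
        v_lat + a_lat * dt,                          # lateral velocity
        a_lat,                                       # lateral acceleration
    ]

state = [10.0, 2.0, 0.5, -1.0, 0.0, 0.0]  # obstacle 10 m ahead, pulling away
print(predict_state(state, 0.1))
```

This is the same prediction that a state transition matrix A would perform as a matrix-vector product; it is written out element by element here to keep the example dependency-free.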
Fig. 2 is a schematic flow chart of an information fusion method of a multi-sensor for a vehicle according to another embodiment of the present invention.
210, when the 1st sensor among the plurality of sensors is updated for the first time between time t_N and time t_{N+1}, acquiring the 1st sensor's 1st estimated measurement result and 1st estimated track information about the obstacle at time t_{N+1}, wherein the 1st estimated measurement result is obtained from the 1st sensor's 1st actual measurement information about the obstacle at the update time, and the 1st estimated track information is obtained from the second optimal fusion estimation result of the target state of the obstacle at time t_N;
220, performing association matching on the 1st estimated measurement result and the 1st estimated track information;
230, when the association matching is a valid match, performing optimal fusion estimation based on the 1st estimated measurement result and the 1st estimated track information to obtain the 2nd estimated track information of the target state of the obstacle at time t_{N+1} for the 2nd sensor.
Specifically, in 210, assume that at a certain time t_{N+Δ1} between time t_N and time t_{N+1} the data of the 1st sensor is updated for the first time. Because the data acquired from the 1st sensor is based on the 1st sensor's centroid coordinate system, the 1st actual measurement information Z_1i(t_{N+Δ1}) must be spatially aligned into the vehicle centroid coordinate system, where i is a positive integer. In this way, the position of the i-th obstacle can be synchronized to the vehicle and a more accurate position of the i-th obstacle can be obtained. The spatial alignment may include rotation and translation of the coordinate system; for example, the spatially aligned 1st actual measurement result of the i-th obstacle, Z'_1i(t_{N+Δ1}), may be obtained by the following formula (1). When the 1st actual measurement information is obtained by means of V2V communication, spatial alignment may be performed according to the 1st actual measurement information of the obstacle and the motion information of the own vehicle, where the motion information may include longitude and latitude, speed, and the like.

Formula (1): Z'_1i(t_{N+Δ1}) = R_1 Z_1i(t_{N+Δ1}) + T_1

In formula (1), R_1 is the rotation matrix of the 1st sensor centroid coordinate system relative to the vehicle centroid coordinate system, and T_1 is the translation matrix of the 1st sensor centroid coordinate system relative to the vehicle centroid coordinate system.
Since the acquired spatially aligned 1st actual measurement result of the 1st sensor, Z'_1i(t_{N+Δ1}), is not the 1st actual measurement information at time t_{N+1} but at time t_{N+Δ1}, the spatially aligned 1st actual measurement result about the i-th obstacle may be synchronized to time t_{N+1}. First, the time difference Δt_{N+1|N+Δ1} between the fusion time t_{N+1} and the 1st actual measurement information update time of the 1st sensor can be calculated by the following formula (2). Based on this time difference, the spatially aligned 1st actual measurement result Z'_1i(t_{N+Δ1}) is time-aligned; for example, the 1st estimated measurement result Ẑ_1i(t_{N+1}) at time t_{N+1} can be obtained by the following formula (3).

Formula (2): Δt_{N+1|N+Δ1} = t_{N+1} - t_{N+Δ1}

Formula (3): Ẑ_1i(t_{N+1}) = B Z'_1i(t_{N+Δ1})

In formula (3), B is the 1st measurement prediction transition matrix of the 1st sensor, constructed from the time difference Δt_{N+1|N+Δ1}.
The track information at time t_N can be predicted one step ahead to obtain the estimated track information at time t_{N+1}. For example, the 1st estimated track information about the j-th obstacle, X̂_1j(t_{N+1}), can be calculated using the following formula (4).

Formula (4): X̂_1j(t_{N+1}) = A X̂_j(t_N)

In formula (4), j is a positive integer, A is the system state transition matrix, and X̂_j(t_N) is the second optimal fusion estimation result of the target state of the j-th obstacle at time t_N.
In 220, since the 1 st estimated measurement result is obtained based on the 1 st sensor itself, the 1 st estimated flight path information is based on tNThe 1 st estimated measurement result of the i-th obstacle of the 1 st sensor can be obtained by fusing the estimation results of the plurality of sensors at the time of fusion
Figure BDA0002313610040000095
And 1 st estimated track information of j barrier
Figure BDA0002313610040000096
And performing correlation matching and screening effective barrier information. In one embodiment of the present invention, the matching result c can be obtained by using the calculation of the following table 1 and the formula (5)ij(tN+1). It is understood that the 1 st estimated measurement result may include the coordinate position of the i-th obstacle in the vehicle coordinate system, and the width w of the i-th obstacle in the vehicle coordinate systemZi(tN+1) And the height h of the i-th obstacle in the vehicle coordinate systemZi(tN+1) Wherein, the i-th disorderThe coordinate position of the object may include an abscissa x of the i-th obstacle in the vehicle coordinate systemzi(tN+1) And the ordinate y of the i-th obstacle in the vehicle coordinate systemzi(tN+1). The 1 st estimated track information may include a coordinate position of a jth obstacle in a vehicle coordinate system, and a width w of the jth obstacle in the vehicle coordinate systemXj(tN+1) And the height h of the jth obstacle in the vehicle coordinate systemXj(tN+1) Wherein the coordinate position of the jth obstacle may include an abscissa x of the jth obstacle in the vehicle coordinate systemXj(tN+1) And the ordinate y of the jth obstacle in the vehicle coordinate systemXj(tN+1)。
TABLE 1

Association feature      Weight
Coordinate position      0.8
Width                    0.1
Height                   0.1
Formula (5): c_ij(t_{N+1}) = 0.8 · dis_ij(t_{N+1}) + 0.1 · |w_Zi(t_{N+1}) − w_Xj(t_{N+1})| + 0.1 · |h_Zi(t_{N+1}) − h_Xj(t_{N+1})|

In formula (5), dis_ij(t_{N+1}) is the center distance between the obstacle corresponding to the 1st estimated measurement result and the obstacle corresponding to the 1st estimated track information, i.e. dis_ij(t_{N+1}) = √((x_Zi(t_{N+1}) − x_Xj(t_{N+1}))² + (y_Zi(t_{N+1}) − y_Xj(t_{N+1}))²).
based on the above matching result cij(tN+1) It is also possible to establish a correlation matching matrix C (t) of m × q as followsN+1) Wherein m is the total number of the obstacles detected by the 1 st sensor, q is the total number of tracks, m is greater than or equal to i, and q is greater than or equal to j.
Figure BDA0002313610040000103
In addition, the correlation matching matrix C (t) may be based on the above m × qN+1) And solving by using a KM algorithm or a Hungarian algorithm to obtain an optimal matching result of the ith barrier detected by the 1 st sensor and the jth barrier in the estimated track information.
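The association cost of Table 1 and the optimal assignment over C(t_{N+1}) might look as follows in outline. Only the weights (0.8/0.1/0.1) come from Table 1; the exact way they combine the features is an assumption, and the small exhaustive search stands in for the KM/Hungarian algorithm (which a real system would use, e.g. `scipy.optimize.linear_sum_assignment`):

```python
import itertools
import math

W_POS, W_WIDTH, W_HEIGHT = 0.8, 0.1, 0.1  # weights from Table 1

def match_cost(det, trk):
    # det/trk: (x, y, w, h) in the vehicle coordinate system.
    dis = math.hypot(det[0] - trk[0], det[1] - trk[1])
    return (W_POS * dis
            + W_WIDTH * abs(det[2] - trk[2])
            + W_HEIGHT * abs(det[3] - trk[3]))

def best_assignment(dets, trks):
    """Build the m x q cost matrix C and exhaustively search the
    minimum-cost one-to-one assignment. For brevity this sketch
    assumes m <= q (no more detections than tracks)."""
    C = [[match_cost(d, t) for t in trks] for d in dets]
    m, q = len(dets), len(trks)
    best_cost, best_pairs = float("inf"), []
    for cols in itertools.permutations(range(q), m):
        cost = sum(C[i][j] for i, j in zip(range(m), cols))
        if cost < best_cost:
            best_cost, best_pairs = cost, list(zip(range(m), cols))
    return best_pairs, C
```

Smaller cost means a closer detection-track pair, which is consistent with the "smaller than the threshold is a valid match" convention used at 230 below.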
At 230, a corresponding association matching threshold c_threshold may be set based on, for example, the accuracy, error, and stability of the 1st sensor and/or the central processor. A matching result smaller than the threshold c_threshold may be considered a valid match, and a matching result greater than the threshold c_threshold may be considered an invalid match. Of course, the embodiments of the present invention are not limited to the above definition of valid and invalid matching.
When the association matching is a valid match, the 2nd estimated track information of the target state of the obstacle at time t_{N+1} for the 2nd sensor can be obtained by using a Kalman filtering algorithm based on the 1st estimated measurement result and the 1st estimated track information. For example, the optimal fusion estimation can be performed using the following formula (6).

Formula (6): X̂_j(t_{N+1}) = X̂_j(t_{N+1} | t_N) + K(t_{N+1}) · (Ẑ_j(t_{N+1}) − H · X̂_j(t_{N+1} | t_N))

In formula (6), K(t_{N+1}) = P_j(t_{N+1} | t_N) H^T (H P_j(t_{N+1} | t_N) H^T + R)^{-1} and P_j(t_{N+1} | t_N) = A P_j(t_N) A^T + Q, where X̂_j(t_{N+1}) is the 2nd estimated track information of the 2nd sensor with respect to the jth obstacle at time t_{N+1}, K(t_{N+1}) is the Kalman filter gain calculated at time t_{N+1}, Ẑ_j(t_{N+1}) is the estimated measurement result of the 1st sensor with respect to the jth obstacle at time t_{N+1}, H is the conversion matrix of the 1st sensor, H^T is the transpose of the conversion matrix of the 1st sensor, P_j(t_{N+1} | t_N) is the 1st estimated covariance with respect to the jth obstacle based on P_j(t_N), P_j(t_N) is the optimal estimate of the covariance of the track information of the 1st sensor with respect to the jth obstacle at time t_N, R is the covariance of the measurement noise, A is the system state transition matrix, A^T is the transpose of the system state transition matrix, and Q is the covariance of the system process noise.
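A minimal sketch of the Kalman measurement update underlying formula (6), assuming a position-only measurement of a [x, y, vx, vy] state (both layouts are illustrative assumptions, not specified in this disclosure):

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """One measurement-update step in the shape of formula (6):
    K = P_pred H^T (H P_pred H^T + R)^{-1}
    x = x_pred + K (z - H x_pred),  P = (I - K H) P_pred."""
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain K(t_{N+1})
    x = x_pred + K @ (z - H @ x_pred)      # fused state estimate
    P = (np.eye(P_pred.shape[0]) - K @ H) @ P_pred
    return x, P, K
```

With equal prior and measurement covariances, the fused position lands halfway between the predicted and measured positions, and the position covariance is halved, which is the qualitative behavior the fusion step relies on.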
According to an embodiment of the invention, when a sensor among the plurality of sensors is updated between time t_N and time t_{N+1}, acquiring the M+1th estimated measurement result of the M+1th sensor among the plurality of sensors about the obstacle at time t_{N+1} includes: acquiring the M+1th actual measurement information of the M+1th sensor about the obstacle at the update time; spatially aligning the M+1th actual measurement information based on the M+1th actual measurement information and the position information of the vehicle to obtain the spatially aligned M+1th actual measurement result about the obstacle; and synchronizing the spatially aligned M+1th actual measurement result about the obstacle to time t_{N+1} to obtain the M+1th estimated measurement result of the M+1th sensor about the obstacle at time t_{N+1}.
In particular, the position information of the vehicle may include the coordinate system of the center of mass of the vehicle. Assume that the data of the M+1th sensor is updated at a certain time t_{N+Δ1} between time t_N and time t_{N+1}. Since the data acquired by the M+1th sensor is based on the M+1th sensor's own coordinate system, the M+1th actual measurement information can be spatially aligned to the vehicle centroid coordinate system; that is, the position of the obstacle is synchronized to the vehicle, and a more accurate position of the obstacle is obtained. Because the acquired spatially aligned M+1th actual measurement result is the M+1th actual measurement information at time t_{N+Δ1} rather than at time t_{N+1}, the spatially aligned M+1th actual measurement result about the obstacle may then be time-synchronized to t_{N+1} to obtain the M+1th estimated measurement result Ẑ(t_{N+1}) at time t_{N+1}. The specific calculation method is similar to that of the 1st sensor; reference may be made to the above formulas (1) to (3), and details are not described herein again.
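The spatial alignment and time synchronization described above might be sketched as follows. The sensor mounting offset and yaw, and the constant-velocity extrapolation used for time synchronization, are assumptions for illustration only:

```python
import math

def spatially_align(obstacle_xy, sensor_offset, sensor_yaw):
    """Transform a detection from the sensor's own coordinate system
    into the vehicle centroid coordinate system. The mounting offset
    (x, y) and yaw angle are assumed calibration parameters."""
    x, y = obstacle_xy
    c, s = math.cos(sensor_yaw), math.sin(sensor_yaw)
    return (c * x - s * y + sensor_offset[0],
            s * x + c * y + sensor_offset[1])

def time_synchronize(xy, velocity_xy, t_meas, t_target):
    """Propagate the aligned position from the sensor update time
    t_{N+Delta1} to t_{N+1} under a constant-velocity assumption."""
    dt = t_target - t_meas
    return (xy[0] + velocity_xy[0] * dt, xy[1] + velocity_xy[1] * dt)
```

A detection 5 m ahead of a forward-mounted sensor 2 m in front of the vehicle centroid thus becomes 7 m ahead in vehicle coordinates, and is then shifted by the obstacle's estimated relative velocity over the remaining time to t_{N+1}.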
According to an embodiment of the present invention, the M+1th estimated track information may be calculated by using the following formula (7).

Formula (7): X̂_{M+2}(t_{N+1}) = X̂_{M+1}(t_{N+1}) + K_{M+1}(t_{N+1}) · (Ẑ_{M+1}(t_{N+1}) − H_{M+1} · X̂_{M+1}(t_{N+1}))

In formula (7), K_{M+1}(t_{N+1}) = P_{M+1}(t_{N+1}) H_{M+1}^T (H_{M+1} P_{M+1}(t_{N+1}) H_{M+1}^T + R)^{-1}, P_{M+2}(t_{N+1}) = (I − K_{M+1}(t_{N+1}) H_{M+1}) P_{M+1}(t_{N+1}), and, for the first update, P_2(t_{N+1}) = (I − K(t_{N+1}) H) P(t_{N+1} | t_N), where X̂_{M+1}(t_{N+1}) is the M+1th estimated track information of the M+1th sensor about the obstacle at time t_{N+1}, Ẑ_{M+1}(t_{N+1}) is the M+1th estimated measurement result of the M+1th sensor about the obstacle at time t_{N+1}, K_{M+1}(t_{N+1}) is the Kalman filter gain calculation of the M+1th sensor about the obstacle at time t_{N+1}, H_{M+1} is the conversion matrix of the M+1th sensor, H_{M+1}^T is the transpose of the conversion matrix of the M+1th sensor, P(t_{N+1} | t_N) is the 1st estimated covariance about the obstacle based on P(t_N), R is the covariance of the measurement noise, P_2(t_{N+1}) is the covariance corresponding to the 2nd estimated measurement result of the 2nd sensor about the obstacle at time t_{N+1}, P_{M+1}(t_{N+1}) is the covariance corresponding to the M+1th estimated measurement result of the M+1th sensor about the obstacle at time t_{N+1}, P_{M+2}(t_{N+1}) is the covariance corresponding to the M+2th estimated measurement result of the M+2th sensor about the obstacle at time t_{N+1}, and I is the identity matrix.
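The sequential structure of formula (7) — each sensor updated between t_N and t_{N+1} refines the track left by the previous one, and the last update yields the first optimal fusion estimation result — can be sketched as a loop of Kalman updates (a hedged illustration, not the exact implementation):

```python
import numpy as np

def sequential_fuse(x, P, updates):
    """Apply formula (7) sensor by sensor: starting from the track
    (x, P) produced by the previous sensor, each (z, H, R) tuple
    refines it. After the last sensor between t_N and t_{N+1}, the
    result is the first optimal fusion estimation result."""
    for z, H, R in updates:
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)      # K_{M+1}(t_{N+1})
        x = x + K @ (z - H @ x)             # fused state estimate
        P = (np.eye(len(x)) - K @ H) @ P    # P_{M+2}(t_{N+1})
    return x, P
```

Each pass shrinks the covariance, so later-updating sensors refine rather than overwrite the track maintained by earlier ones.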
In the embodiment of the present invention, the association matching between the M+1th estimated measurement result and the M+1th estimated track information is similar to that between the 1st estimated measurement result and the 1st estimated track information; reference may be made to the association matching method described above, and details are not repeated here. It can be understood that the association matching of the M+1th estimated measurement result with the M+1th estimated track information and the calculation of the M+1th estimated track information complement each other; in other words, for a successfully matched obstacle, the estimated track information used by the next updated sensor can be calculated.
According to the embodiment of the invention, when the obstacle information acquired by the sensor in the multi-sensor is updated, the obstacle information detected by the sensor is associated with the track information of the obstacle through the association matching. In other words, the embodiment of the invention uses sensor information of various different forms to maintain the track information of the detected obstacle, so that the related data of the obstacle acquired by multiple sensors is more accurate, and the information of the multiple sensors can be better fused. Meanwhile, through the correlation matching of the pre-estimated measurement result and the pre-estimated track information, the obstacle information which is possibly mistakenly identified by a single sensor can be filtered, and further the screened effective obstacle information is fused to obtain more accurate obstacle information.
According to the embodiment of the invention, after the correlation matching is performed on the M +1 th estimated measurement result and the M +1 th estimated track information, the method further comprises the following steps: and calculating the optimal matching result of the M +1 th estimated measurement result and the M +1 th estimated flight path information by using a KM algorithm or a Hungarian algorithm.
According to the embodiment of the invention, when the association matching is a valid match, performing optimal fusion estimation based on the M+1th estimated measurement result and the M+1th estimated track information to obtain the first optimal fusion estimation result of the plurality of sensors for the target state of the obstacle at time t_{N+1} includes: when the association matching is a valid match, performing optimal fusion estimation by using a Kalman filtering algorithm based on the M+1th estimated measurement result and the M+1th estimated track information. In the embodiment of the present invention, the calculation can be performed by using the above formula (7). When the M+1th sensor is the last sensor to update between time t_N and time t_{N+1}, X̂_{M+2}(t_{N+1}) is the first optimal fusion estimation result of the target state of the obstacle at time t_{N+1}.
According to the embodiment of the invention, when the association matching is an invalid match, the method further includes: adding the corresponding first obstacle information in the sensor list into an initial track information list, and confirming the first obstacle information and adding it into a target track information list when the M+1th estimated measurement result and the M+1th estimated track information are continuously and successfully matched within a preset period; and adding second obstacle information in the target track information list into a track information list to be deleted, and deleting the second obstacle information when the M+1th estimated measurement result and the M+1th estimated track information cannot be successfully matched within a preset period.

Specifically, when the above association match is an invalid match, the first obstacle information corresponding to the M+1th estimated measurement result of the M+1th sensor may be added to the initial track information list. If the M+1th estimated measurement result of the first obstacle and the M+1th estimated track information can be continuously and successfully matched in the following preset period, a new obstacle can be considered to have appeared for the vehicle, and the obstacle information can be confirmed and added into the target track information list. Conversely, second obstacle information corresponding to the M+1th estimated track information in the target track information list can be added into the track information list to be deleted; if the M+1th estimated measurement result of the obstacle and the M+1th estimated track information are not successfully matched in the following preset period, the obstacle information can be deleted from the target track information list, on the understanding that the corresponding obstacle has left the detection range of the vehicle. Preferably, the preset period may be 10 update periods. In addition, the case in which the obstacle information detected by the 1st sensor is not matched is similar to that of the M+1th sensor, and details are not described herein again.
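The track lifecycle described above (initial list, confirmation into the target list, deletion) might be sketched as follows; the counting scheme is an illustrative assumption, with the 10-period figure taken from the preferred value above:

```python
class TrackManager:
    """Track lifecycle sketch: unmatched detections start in an
    initial list, are confirmed into the target track list after
    `confirm_n` consecutive successful matches, and confirmed tracks
    that go unmatched for `delete_n` consecutive periods are deleted
    (assumed to have left the vehicle's detection range)."""

    def __init__(self, confirm_n=10, delete_n=10):
        self.confirm_n, self.delete_n = confirm_n, delete_n
        self.initial = {}    # track_id -> consecutive hit count
        self.confirmed = {}  # track_id -> consecutive miss count

    def update(self, track_id, matched):
        if track_id in self.confirmed:
            self.confirmed[track_id] = 0 if matched else self.confirmed[track_id] + 1
            if self.confirmed[track_id] >= self.delete_n:
                del self.confirmed[track_id]      # obstacle left detection range
        else:
            self.initial[track_id] = self.initial.get(track_id, 0) + 1 if matched else 0
            if self.initial[track_id] >= self.confirm_n:
                del self.initial[track_id]
                self.confirmed[track_id] = 0      # new obstacle confirmed
```

This keeps a single momentary false detection out of the target track list while still retiring tracks whose obstacles are no longer observed.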
Fig. 3 is a schematic configuration diagram of an apparatus 300 for information fusion of multiple sensors for a vehicle according to an embodiment of the present invention.
The apparatus 300 for information fusion of multiple sensors for a vehicle includes an acquisition module 310, a determination module 320, and a fusion module 330.
An obtaining module 310, configured to, when a sensor among the plurality of sensors is updated between time t_N and time t_{N+1}, obtain the M+1th estimated measurement result and the M+1th estimated track information of the M+1th sensor among the plurality of sensors about the obstacle at time t_{N+1}, wherein the M+1th estimated measurement result is obtained from the M+1th actual measurement information of the M+1th sensor about the obstacle at the update time, the M+1th estimated track information is obtained from the Mth estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers; a judging module 320, configured to perform association matching on the M+1th estimated measurement result and the M+1th estimated track information; and a fusion module 330, configured to, when the association matching is a valid match, perform optimal fusion estimation based on the M+1th estimated measurement result and the M+1th estimated track information to obtain the first optimal fusion estimation result of the plurality of sensors for the target state of the obstacle at time t_{N+1}.
Based on the embodiment of the invention, the apparatus for information fusion of multiple sensors for a vehicle performs association matching between the obstacle information acquired by the sensors and the track information of the obstacles, thereby filtering out obstacle information that may be mistakenly identified by a single sensor, and then fuses the screened effective obstacle information to obtain more accurate obstacle information. Meanwhile, information association can be performed between different sensors, so that sensors of various forms can be better fused and more accurate and reliable obstacle fusion results can be obtained.
According to an embodiment of the present invention, the obtaining module 310 is further configured to, when the 1st sensor among the plurality of sensors is updated for the first time between time t_N and time t_{N+1}, obtain the 1st estimated measurement result and the 1st estimated track information of the 1st sensor about the obstacle at time t_{N+1}, wherein the 1st estimated measurement result is obtained from the 1st actual measurement information of the 1st sensor at the update time, and the 1st estimated track information is obtained from the second optimal fusion estimation result of the target state of the obstacle at time t_N; the judging module 320 is further configured to perform association matching on the 1st estimated measurement result and the 1st estimated track information; and the fusion module 330 is further configured to, when the association matching is a valid match, perform optimal fusion estimation based on the 1st estimated measurement result and the 1st estimated track information to obtain the 2nd estimated track information of the target state of the obstacle at time t_{N+1} for the 2nd sensor.
According to the embodiment of the present invention, the fusion module 330 is specifically configured to perform the optimal fusion estimation by using a kalman filter algorithm based on the M +1 th predicted measurement result and the M +1 th predicted track information when the association matching is the valid matching.
According to an embodiment of the present invention, the obtaining module 310 is specifically configured to: acquire the M+1th actual measurement information of the M+1th sensor about the obstacle at the update time; spatially align the M+1th actual measurement information based on the M+1th actual measurement information and the position information of the vehicle to obtain the spatially aligned M+1th actual measurement result about the obstacle; and synchronize the spatially aligned M+1th actual measurement result about the obstacle to time t_{N+1} to obtain the M+1th estimated measurement result of the M+1th sensor about the obstacle at time t_{N+1}.
According to the embodiment of the present invention, the apparatus 300 for information fusion of multiple sensors for vehicles further includes a calculating module 340, configured to calculate a best matching result between the M +1 th estimated measurement result and the M +1 th estimated track information by using a KM algorithm or a hungarian algorithm.
According to an embodiment of the present invention, the apparatus 300 for information fusion of multiple sensors for a vehicle further comprises a processing module 350 for: when the association matching is invalid matching, adding corresponding first obstacle information in the sensor list into an initial track information list, and confirming and adding the first obstacle information into a target track information list when the M +1 estimated measurement result and the M +1 estimated track information are continuously and successfully matched in a preset period; and adding second obstacle information in the target track information list into the track information list to be deleted, and deleting the second obstacle information when the M +1 th estimated measurement result and the M +1 th estimated track information cannot be successfully matched in a preset period.
The operation and function of the above modules in the above information fusion device for multiple sensors for a vehicle can refer to the detailed description of the method embodiment part of fig. 1 and 2, and are not repeated herein in order to avoid repetition.
FIG. 4 is a block diagram of a computer device 400 according to an embodiment of the present invention.
Referring to fig. 4, apparatus 400 includes a processor 410 and memory resources represented by memory 420. The processor 410 further includes one or more processors, and the memory 420 is used to store instructions, such as application programs, that are executable by the processor 410. The application program stored in memory 420 may include one or more modules, each of which corresponds to a set of instructions. Further, the processor 410 is configured to execute instructions to perform the above-described information fusion method of the vehicular multi-sensor.
The apparatus 400 may also include a power component configured to perform power management of the apparatus 400, a wired or wireless network interface configured to connect the apparatus 400 to a network, and an input/output (I/O) interface. The apparatus 400 may operate based on an operating system stored in the memory 420, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
Embodiments of the present invention further provide a non-transitory computer-readable storage medium. When instructions in the storage medium are executed by a processor of the apparatus 400, the apparatus 400 is enabled to perform an information fusion method of multiple sensors for a vehicle, including: when a sensor among the plurality of sensors is updated between time t_N and time t_{N+1}, acquiring the M+1th estimated measurement result and the M+1th estimated track information of the M+1th sensor among the plurality of sensors about the obstacle at time t_{N+1}, wherein the M+1th estimated measurement result is obtained from the M+1th actual measurement information of the M+1th sensor about the obstacle at the update time, the M+1th estimated track information is obtained from the Mth estimated track information about the obstacle at time t_{N+1}, and N and M are positive integers; performing association matching on the M+1th estimated measurement result and the M+1th estimated track information; and when the association matching is a valid match, performing optimal fusion estimation based on the M+1th estimated measurement result and the M+1th estimated track information to obtain the first optimal fusion estimation result of the plurality of sensors for the target state of the obstacle at time t_{N+1}.
Those of ordinary skill in the art will appreciate that the steps of mining unmanned vehicle control in the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the method and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described embodiment of the control device for the mining unmanned vehicle is only illustrative, for example, the division of the units is only one logical function division, and other division manners may be available in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, or the part thereof that essentially contributes to the prior art, can be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (10)

1. A method for fusing information of multiple sensors for a vehicle, comprising:
when a plurality of sensors are updated between time t_N and time t_{N+1}, acquiring an M+1th estimated measurement result and M+1th estimated track information of an M+1th sensor among the plurality of sensors about an obstacle at the time t_{N+1}, wherein the M+1th estimated measurement result is obtained from M+1th actual measurement information of the M+1th sensor about the obstacle at an update time, the M+1th estimated track information is obtained from Mth estimated track information about the obstacle at the time t_{N+1}, and N and M are positive integers;
performing association matching on the M+1th estimated measurement result and the M+1th estimated track information; and
when the association matching is a valid match, performing optimal fusion estimation based on the M+1th estimated measurement result and the M+1th estimated track information to obtain a first optimal fusion estimation result of the plurality of sensors for a target state of the obstacle at the time t_{N+1}.
2. The method of claim 1, further comprising:
when the 1st sensor among the plurality of sensors is updated for the first time between the time t_N and the time t_{N+1}, acquiring a 1st estimated measurement result and 1st estimated track information of the 1st sensor about the obstacle at the time t_{N+1}, wherein the 1st estimated measurement result is obtained from 1st actual measurement information of the 1st sensor about the obstacle at the update time, and the 1st estimated track information is obtained from a second optimal fusion estimation result of the target state of the obstacle at the time t_N;
performing association matching on the 1st estimated measurement result and the 1st estimated track information; and
when the association matching is a valid match, performing optimal fusion estimation based on the 1st estimated measurement result and the 1st estimated track information to obtain 2nd estimated track information of the target state of the obstacle at the time t_{N+1} for the 2nd sensor.
3. The method according to claim 1, wherein, when the association matching is a valid match, performing optimal fusion estimation based on the M+1th estimated measurement result and the M+1th estimated track information to obtain the first optimal fusion estimation result of the plurality of sensors for the target state of the obstacle at the time t_{N+1} comprises:
when the association matching is a valid match, performing optimal fusion estimation by using a Kalman filtering algorithm based on the M+1th estimated measurement result and the M+1th estimated track information.
4. The method according to claim 3, wherein the optimal fusion estimation using the Kalman filtering algorithm based on the M+1th estimated measurement result and the M+1th estimated track information is calculated by the following formula:

X̂_{M+2}(t_{N+1}) = X̂_{M+1}(t_{N+1}) + K_{M+1}(t_{N+1}) · (Ẑ_{M+1}(t_{N+1}) − H_{M+1} · X̂_{M+1}(t_{N+1}))

wherein K_{M+1}(t_{N+1}) = P_{M+1}(t_{N+1}) H_{M+1}^T (H_{M+1} P_{M+1}(t_{N+1}) H_{M+1}^T + R)^{-1}, P_{M+2}(t_{N+1}) = (I − K_{M+1}(t_{N+1}) H_{M+1}) P_{M+1}(t_{N+1}), X̂_{M+1}(t_{N+1}) is the M+1th estimated track information of the M+1th sensor about the obstacle at the time t_{N+1}, Ẑ_{M+1}(t_{N+1}) is the M+1th estimated measurement result of the M+1th sensor about the obstacle at the time t_{N+1}, K_{M+1}(t_{N+1}) is the Kalman filter gain calculation of the M+1th sensor about the obstacle at the time t_{N+1}, H_{M+1} is the conversion matrix of the M+1th sensor, H_{M+1}^T is the transposed matrix of the conversion matrix, P_{M+1}(t_{N+1}) is the covariance corresponding to the M+1th estimated measurement result of the M+1th sensor about the obstacle at the time t_{N+1}, R is the covariance of the measurement noise, I is the identity matrix, and, when the M+1th sensor is the last sensor to update between the time t_N and the time t_{N+1}, X̂_{M+2}(t_{N+1}) is the first optimal fusion estimation result of the obstacle at the time t_{N+1}.
5. The method according to any one of claims 1 to 4, wherein the acquiring, when a plurality of sensors are updated between the time t_N and the time t_{N+1}, the M+1th estimated measurement result and the M+1th estimated track information of the M+1th sensor among the plurality of sensors about the obstacle at the time t_{N+1} comprises:
acquiring the M+1th actual measurement information of the M+1th sensor about the obstacle at the update time;
spatially aligning the M+1th actual measurement information based on the M+1th actual measurement information and the position information of the vehicle to obtain the spatially aligned M+1th actual measurement result about the obstacle; and
synchronizing the spatially aligned M+1th actual measurement result about the obstacle to the time t_{N+1} to obtain the M+1th estimated measurement result of the M+1th sensor about the obstacle at the time t_{N+1}.
6. The method according to any one of claims 1 to 4,
after the correlation matching is performed on the M +1 th estimated measurement result and the M +1 th estimated track information, the method further includes:
and calculating the optimal matching result of the M +1 th estimated measurement result and the M +1 th estimated flight path information by using a KM algorithm or a Hungarian algorithm.
7. The method of any of claims 1 to 4, further comprising:
when the association matching is invalid matching, adding corresponding first obstacle information in a sensor list into an initial track information list, and confirming and adding the first obstacle information into a target track information list when the M +1 th estimated measurement result and the M +1 th estimated track information are continuously and successfully matched in a preset period;
and adding second obstacle information in the target track information list into a track information list to be deleted, and deleting the second obstacle information when the M +1 th estimated measurement result and the M +1 th estimated track information cannot be successfully matched in a preset period.
8. An apparatus for information fusion of multiple sensors for a vehicle, comprising:
an acquisition module, configured to, when a plurality of sensors are updated between time t_N and time t_{N+1}, acquire an M+1th estimated measurement result and M+1th estimated track information of an M+1th sensor among the plurality of sensors about an obstacle at the time t_{N+1}, wherein the M+1th estimated measurement result is obtained from M+1th actual measurement information of the M+1th sensor about the obstacle at an update time, the M+1th estimated track information is obtained from Mth estimated track information of the Mth sensor about the obstacle at the time t_{N+1}, and N and M are positive integers;
the judging module is used for carrying out association matching on the M +1 th estimated measurement result and the M +1 th estimated track information;
a fusion module for performing optimal fusion estimation based on the M +1 th pre-estimated measurement result and the M +1 th pre-estimated track information when the correlation matching is effective matching, so as to obtain the t-th pre-estimated track information of the sensorsN+1A first optimal fused estimate of a target state for the obstacle at a time.
9. A computer-readable storage medium storing a computer program for executing the information fusion method of the vehicular multi-sensor according to any one of claims 1 to 7.
10. An electronic device, the electronic device comprising:
a processor;
a memory for storing the processor-executable instructions;
the processor is configured to execute the information fusion method of the vehicular multi-sensor according to any one of claims 1 to 7.
CN201911268821.2A 2019-12-11 2019-12-11 Information fusion method and device of multiple sensors for vehicle Pending CN110879598A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911268821.2A CN110879598A (en) 2019-12-11 2019-12-11 Information fusion method and device of multiple sensors for vehicle


Publications (1)

Publication Number Publication Date
CN110879598A true CN110879598A (en) 2020-03-13

Family

ID=69731063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911268821.2A Pending CN110879598A (en) 2019-12-11 2019-12-11 Information fusion method and device of multiple sensors for vehicle

Country Status (1)

Country Link
CN (1) CN110879598A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101655561A (en) * 2009-09-14 2010-02-24 Nanjing LES Information Technology Co., Ltd. Federated Kalman filtering-based method for fusing multilateration data and radar data
CN103065323A (en) * 2013-01-14 2013-04-24 Beijing Institute of Technology Segmented spatial alignment method based on homography transformation matrix
US20130246020A1 (en) * 2012-03-15 2013-09-19 GM Global Technology Operations LLC Bayesian network to track objects using scan points using multiple LiDAR sensors
CN106101590A (en) * 2016-06-23 2016-11-09 Shanghai Radio Equipment Research Institute Radar video composite data detection and processing system and method
CN107798870A (en) * 2017-10-25 2018-03-13 Tsinghua University Track management method and system for multi-vehicle target tracking, and vehicle
CN108573270A (en) * 2017-12-15 2018-09-25 NIO Co., Ltd. Method and device for synchronizing multi-sensor target information fusion with multi-sensor sensing, computer device and recording medium
CN108776333A (en) * 2018-06-19 2018-11-09 Uisee (Shanghai) Automotive Technologies Co., Ltd. Two-stage cascade data fusion method and system, vehicle-mounted device and storage medium
CN108803622A (en) * 2018-07-27 2018-11-13 Geely Automobile Research Institute (Ningbo) Co., Ltd. Method and apparatus for processing target detection data
CN110533695A (en) * 2019-09-04 2019-12-03 Shenzhen Weiteshi Technology Co., Ltd. Trajectory prediction device and method based on DS evidence theory


Non-Patent Citations (1)

Title
ZHEHUA HU et al.: "Vision-GPS Fusion for High Precision Vehicle Localization in Urban Environment", CICTP 2017 *

Cited By (8)

Publication number Priority date Publication date Assignee Title
CN111551973A (en) * 2020-04-16 2020-08-18 北京踏歌智行科技有限公司 Fault detection and correction method for unmanned inertial navigation system of strip mine
CN111551973B (en) * 2020-04-16 2022-04-05 北京踏歌智行科技有限公司 Fault detection and correction method for unmanned inertial navigation system of strip mine
CN111551938A (en) * 2020-04-26 2020-08-18 北京踏歌智行科技有限公司 Unmanned technology perception fusion method based on mining area environment
CN111551938B (en) * 2020-04-26 2022-08-30 北京踏歌智行科技有限公司 Unmanned technology perception fusion method based on mining area environment
CN112033429A (en) * 2020-09-14 2020-12-04 吉林大学 Target-level multi-sensor fusion method for intelligent automobile
CN112033429B (en) * 2020-09-14 2022-07-19 吉林大学 Target-level multi-sensor fusion method for intelligent automobile
CN112861971A (en) * 2021-02-07 2021-05-28 启迪云控(上海)汽车科技有限公司 Cross-point road side perception target tracking method and system
CN113281760A (en) * 2021-05-21 2021-08-20 阿波罗智能技术(北京)有限公司 Obstacle detection method, obstacle detection device, electronic apparatus, vehicle, and storage medium

Similar Documents

Publication Publication Date Title
CN110879598A (en) Information fusion method and device of multiple sensors for vehicle
CN106257242B (en) Unit and method for adjusting road boundaries
US10260892B2 (en) Data structure of environment map, environment map preparing system and method, and environment map updating system and method
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
EP3283843B1 (en) Generating 3-dimensional maps of a scene using passive and active measurements
EP4318397A2 (en) Method of computer vision based localisation and navigation and system for performing the same
CN107406073B (en) Method and device for monitoring a target trajectory to be covered by a vehicle in terms of collision-free behavior
CN110264495B (en) Target tracking method and device
CN111932901B (en) Road vehicle tracking detection apparatus, method and storage medium
EP2713308A2 (en) Method and system for using fingerprints to track moving objects in video
KR20150058679A (en) Apparatus and method for localization of autonomous vehicle in a complex
CN112835030A (en) Data fusion method and device for obstacle target and intelligent automobile
KR20160048530A (en) Method and apparatus for generating pathe of autonomous vehicle
CN114248778B (en) Positioning method and positioning device of mobile equipment
JP2023525927A (en) Vehicle localization system and method
CN112381853A (en) Apparatus and method for person detection, tracking and identification using wireless signals and images
KR102106029B1 (en) Method and system for improving signage detection performance
CN109901194A (en) Onboard system, method, equipment and the storage medium of anticollision
CN110781730B (en) Intelligent driving sensing method and sensing device
CN112816974A (en) Method for integrating measurement data based on graphs
CN116429112A (en) Multi-robot co-location method and device, equipment and storage medium
CN103679746A (en) Object tracking method based on multi-information fusion
JP2019158762A (en) Device, method, and program for detecting abnormalities
KR102252823B1 (en) Apparatus and method for tracking targets and releasing warheads
JP7020028B2 (en) State quantity integrated device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200313