CN109343051A - A multi-sensor information fusion method for advanced driver assistance - Google Patents

A multi-sensor information fusion method for advanced driver assistance

Info

Publication number
CN109343051A
CN109343051A (application CN201811363092.4A)
Authority
CN
China
Prior art keywords
target entity
sensor
matrix
measurement
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811363092.4A
Other languages
Chinese (zh)
Inventor
周秀田
洪燕
马德仁
杨松铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ZOTYE NEW ENERGY AUTOMOBILE Co Ltd
Original Assignee
ZOTYE NEW ENERGY AUTOMOBILE Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ZOTYE NEW ENERGY AUTOMOBILE Co Ltd filed Critical ZOTYE NEW ENERGY AUTOMOBILE Co Ltd
Priority to CN201811363092.4A priority Critical patent/CN109343051A/en
Publication of CN109343051A publication Critical patent/CN109343051A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86: Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G01S13/93: Radar or analogous systems specially adapted for specific applications for anti-collision purposes

Abstract

The invention discloses a multi-sensor information fusion method for an advanced driver assistance system (ADAS), comprising a data calculation and decision module, a data fusion calculation module, and a prediction calculation module. The prediction calculation module predicts each target entity's state at the current moment from its state after the previous moment's data fusion, associates the multiple target entities observed by the multiple sensors with the multiple predicted target entities, and judges whether each observed target entity is a real entity or a false alarm; false alarms are rejected. Multiple sensor selection switches are respectively arranged between each sensor and its signal processing module: whenever a sensor outputs a measurement result, the corresponding sensor selection switch closes and one data fusion process is carried out. Only one sensor is selected per data fusion process; if several sensors output measurement results simultaneously, multiple data fusions are performed in sequence, cycling through all of them. The false alarm rate and missed detections are reduced, and measurement accuracy is improved.

Description

A multi-sensor information fusion method for advanced driver assistance
Technical field
The present invention relates to advanced driver assistance systems (ADAS), and more particularly to a multi-sensor information fusion method for use in an advanced driver assistance system (ADAS).
Background technique
An advanced driver assistance system (ADAS) fuses data from multiple heterogeneous sensors (information fusion) to provide the driver with stable, comfortable, and reliable assistance functions, such as Lane Keeping Assist (LKA), Forward Collision Warning (FCW), Pedestrian Collision Warning (PCW), Headway Monitoring and Warning (HMW), Autonomous Emergency Braking (AEB), adaptive cruise control, and automatic parking (APS). The purpose of such multi-source data fusion is to use the redundancy of the data to provide a basis for its reliable analysis, thereby improving accuracy, reducing the false alarm rate and the missed detection rate, and realizing the self-test and self-learning goals of driver assistance, ultimately achieving intelligent, safe driving.
Data fusion combines the inputs of different sensors to perceive the surrounding environment more accurately than any single sensor can. Different sensors suit different operating conditions and perception targets. For example, a camera working in the visible spectrum can fail to recognize targets in heavy fog, rain, glaring sunlight, or insufficient illumination, while radar lacks the high resolution of current image sensors. Millimeter-wave radar mainly detects medium- and long-range obstacles ahead (0.5 m to 150 m), such as vehicles, pedestrians, and roadblocks; ultrasonic radar mainly detects short-range obstacles around the vehicle body (0.2 m to 5 m), such as curbs, stationary vehicles ahead and behind, and passing pedestrians during parking. The two act synergistically and compensate for each other's deficiencies: by fusing measured obstacle angle, distance, speed, and similar data, the system characterizes the environment around the vehicle body and the reachable region, enabling better and safer decisions than a stand-alone system. Data fusion makes full use of multi-sensor data resources across time and space: using computer technology, the observation data of multiple sensors are acquired in time sequence and then analyzed, integrated, and exploited under appropriate criteria to obtain a consistent interpretation and description of the measured objects. The corresponding decisions and estimates are then made, so that the system obtains more information than any of its component parts.
The basic framework of existing data fusion adopts a sensor-parallel form: the measurement results of all sensors are signal-preprocessed, target association is performed, and the associated targets are then fused (see Fig. 2). In Fig. 2, each sensor is, for example, a millimeter-wave radar, an image sensor, or a lidar; its main function is to identify and measure the surrounding target entities, which include the stationary and moving vehicles, pedestrians, and so on around the vehicle. Each signal processing module 50 in Fig. 2 filters the target-entity measurement information of its sensor 20 and performs coordinate conversion and time synchronization. The correlation calculation and decision module 10 in Fig. 2 judges whether the target entities measured by the different sensors stem from the same target source; common algorithms for this include nearest-neighbor association and joint probabilistic data association (JPDA), and false alarms are rejected at this decision.
The data fusion calculation module 40 in Fig. 2 fuses the measurement parameters of the data-associated targets measured by the different sensors; common data fusion calculations include Bayesian statistics, neural network techniques, and Kalman filtering.
The existing fusion framework has two disadvantages. First, the measurement results of the different sensors must be synchronized so that association and data fusion are performed at the same instant. Second, only the current sensor measurement results are fused and historical measurements are discarded, so measurement accuracy is low and the false alarm rate is high, and target entities are easily missed (missed detection). A high false alarm rate and missed detections are very unfavorable for an ADAS, so both must be reduced in the data fusion process.
Summary of the invention
The present invention addresses the shortcomings of the multi-sensor information fusion methods currently used in advanced driver assistance systems, namely that the sensor measurements must be synchronized to a common instant before association and data fusion, and that the false alarm rate and missed detections are relatively high, by providing a multi-sensor information fusion method for advanced driver assistance that reduces false alarms and missed detections and improves measurement accuracy.
The specific technical solution adopted by the present invention to solve the above technical problem is: a multi-sensor information fusion method for an advanced driver assistance system, comprising a data calculation and decision module, multiple sensors, multiple signal processing modules, and a data fusion calculation module, characterized by further comprising a prediction calculation module and multiple sensor selection switches. The prediction calculation module predicts each target entity's state at the current moment from its state after the previous moment's data fusion; it associates the multiple target entities observed by the sensors with the multiple predicted target entities and judges whether each observed target entity is a real entity or a false alarm, rejecting false alarms. The sensor selection switches are respectively arranged between each sensor and its corresponding signal processing module: if any of the sensors outputs a measurement result, that sensor is selected, the corresponding sensor selection switch closes, and one data fusion process is carried out. Only one sensor is selected per data fusion process; if several sensors output measurement results simultaneously, multiple data fusions are performed in sequence, cycling until all sensor measurement results have been fused. This reduces the false alarm rate and missed detections and improves measurement accuracy. The present invention provides a multi-sensor information fusion method for advanced driver assistance and automated driving that reduces the radar false alarm rate and missed detections and improves target measurement accuracy.
Preferably, before fusion the coordinate system used for data fusion is first determined. A vehicle-body coordinate system is used: all sensors are mounted on the body of the ego vehicle, at multiple different positions, and the coordinate origin is fixed at the center of the foremost point of the vehicle, with the positive X axis pointing forward and the positive Y axis pointing to the left. The target-entity state equation and observation equation are established in this coordinate system.
The state equation describes the motion state of a target and provides the basis for data fusion.
Preferably, the association judgment yields the following cases: first, a target entity obtained by one sensor measurement is associated with a predicted target entity, i.e. the two target entities stem from the same target source; second, no associated target entity can be found among the predictions for a target entity obtained by sensor measurement; third, a predicted target entity has no associated target entity among the sensor measurements. The second and third cases may correspond to a new target entity or to a false alarm. The second task of the prediction calculation module is therefore to judge whether each such target entity is a real entity or a false alarm, and to reject it if it is a false alarm.
Association is a critical step of data fusion: a target measured by a sensor is compared with the previously measured targets to decide whether they are the same target.
Preferably, the target-entity state equation is a discrete state equation:

    X_k^i = F_{k-1} X_{k-1}^i + W_{k-1}    (1.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;
X_k^i = [x_k^i  y_k^i  vx_k^i  vy_k^i  ax_k^i  ay_k^i]^T is the state variable matrix describing target entity i    (1.2)
x_k^i is the x-direction coordinate of target entity i at moment k;
y_k^i is the y-direction coordinate of target entity i at moment k;
vx_k^i is the x-direction speed of target entity i at moment k;
vy_k^i is the y-direction speed of target entity i at moment k;
ax_k^i is the x-direction acceleration of target entity i at moment k;
ay_k^i is the y-direction acceleration of target entity i at moment k;
T denotes matrix transposition;
F_{k-1} is the target-entity motion state matrix; for the constant-acceleration model used in this case it is

    F_{k-1} = [ 1  0  Δt 0  Δt²/2 0
                0  1  0  Δt 0     Δt²/2
                0  0  1  0  Δt    0
                0  0  0  1  0     Δt
                0  0  0  0  1     0
                0  0  0  0  0     1 ]    (1.3)

where Δt is the time interval, i.e. the interval between moments k-1 and k;
W_{k-1} is Gaussian white noise with normal distribution, satisfying
mean E[W_k] = 0 and covariance Cov(W_l, W_n) = Q_k δ_ln,
where Q_k is the noise variance matrix (1.7), built from the time interval Δt and the noise variances:
σ_vx, the x-direction speed noise variance;
σ_vy, the y-direction speed noise variance;
σ_ax, the x-direction acceleration noise variance;
σ_ay, the y-direction acceleration noise variance.
In this case the acceleration is taken as constant, its variation being noise with variances σ_ax and σ_ay;
δ_ln is the Kronecker function, equal to 1 when l = n and 0 otherwise, with subscripts l and n denoting moments l and n.
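The constant-acceleration model of equations (1.1)-(1.3) can be sketched as follows. The state ordering [x, y, vx, vy, ax, ay] follows the state variable matrix (1.2); the function name and the numeric values are illustrative assumptions, not from the patent:

```python
import numpy as np

def motion_matrix(dt: float) -> np.ndarray:
    """F_{k-1} of eq. (1.3): constant-acceleration transition for the
    state [x, y, vx, vy, ax, ay]^T over a time step dt."""
    h = 0.5 * dt * dt
    return np.array([
        [1, 0, dt, 0,  h, 0],
        [0, 1, 0, dt,  0, h],
        [0, 0, 1,  0, dt, 0],
        [0, 0, 0,  1,  0, dt],
        [0, 0, 0,  0,  1, 0],
        [0, 0, 0,  0,  0, 1],
    ], dtype=float)

# Propagate one target entity one step: the noise-free part of (1.1).
x_prev = np.array([10.0, 2.0, 5.0, 0.0, 1.0, 0.0])  # 10 m ahead, 5 m/s, 1 m/s^2
x_pred = motion_matrix(0.1) @ x_prev
print(x_pred[:2])  # predicted position after 0.1 s
```

The position picks up both the velocity and the half-acceleration term, and the velocity picks up the acceleration, which is exactly the kinematics the matrix encodes.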
The measurement parameters of a target entity described in this case are the parameters obtained when a sensor identifies surrounding objects; different sensors yield different measurement parameters.
For example, the image sensor expresses its target-entity measurement parameters as the matrix

    Z_k^c = [x_k^c  y_k^c]^T    (2.1)

where:
superscript c is the number of the target entity measured by the image sensor, c = 1, 2, ...;
subscript k is the current moment;
T denotes matrix transposition;
x_k^c is the x-direction coordinate position of image-sensor-measured target entity c;
y_k^c is the y-direction coordinate position of image-sensor-measured target entity c.
The target-entity measurement parameter matrix of the millimeter-wave radar is expressed as

    Z_k^r = [x_k^r  y_k^r  vx_k^r]^T    (2.2)

where:
superscript r is the number of the target entity measured by the millimeter-wave radar, r = 1, 2, ...;
subscript k is the current moment;
x_k^r is the x-direction coordinate position of the radar-measured target entity;
y_k^r is the y-direction coordinate position of the radar-measured target entity;
vx_k^r is the x-direction speed of the radar-measured target entity.
In this case the relation between the target-entity measurement parameters and the target-entity state variables is established as

    Z_k^{j,i} = H_k^j X_k^i + V_k^j    (2.3)

where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment;
Z_k^{j,i} is the measurement parameter matrix of sensor j for target entity i at moment k; for the image sensor it is Z_k^c of (2.1), and for the millimeter-wave radar it is Z_k^r of (2.2);
X_k^i is the state variable matrix of target entity i, as in formula (1.2) above;
V_k^j is the Gaussian white noise of sensor j, normally distributed, with
mean E[V_k^j] = 0    (2.4)
and covariance Cov(V_l, V_n) = R_j δ_ln    (2.5)
R_j is the measurement noise matrix of sensor j; different sensors have different measurement noise matrices;
δ_ln is the Kronecker function, with l and n denoting moments l and n;
H_k^j is the measurement matrix of sensor j at moment k; it differs for different sensors.
The measurement matrix is the matrix that establishes the relationship between the sensor measurements and the state variables.
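As a sketch of how the measurement matrix H_k^j of (2.3) relates each sensor's measurement to the state variables, assuming the state ordering of (1.2); the constant matrices and the names below are illustrative, not taken from the patent:

```python
import numpy as np

# The camera measures [x, y] (eq. 2.1): H picks the first two state components.
H_CAMERA = np.array([
    [1, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
], dtype=float)

# The millimeter-wave radar measures [x, y, vx] (eq. 2.2).
H_RADAR = np.array([
    [1, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0],
], dtype=float)

x = np.array([10.0, 2.0, 5.0, 0.0, 1.0, 0.0])  # state of one target entity
print(H_CAMERA @ x)  # the ideal (noise-free) camera measurement [x, y]
print(H_RADAR @ x)   # the ideal radar measurement [x, y, vx]
```

Because each sensor has its own H, the same 6-element state can be compared against measurement vectors of different sizes, which is what lets one filter serve several sensor types.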
Preferably, the prediction calculation of the prediction calculation module includes predicting the parameters of the existing target entities, comprising state variable prediction and covariance prediction. The state prediction equation is

    X̂_{k|k-1}^i = F_{k-1} X̂_{k-1}^i    (3.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;
X̂_{k|k-1}^i is the moment-k state variable of target entity i predicted from its moment k-1 state variable; the state variable matrix is as in formula (1.2) above;
F_{k-1} is the target-entity motion state matrix, defined as in formula (1.3) above;
X̂_{k-1}^i is the state variable matrix of target entity i at moment k-1, as in formula (1.2) above.

The prediction equation of the covariance is

    P_{k|k-1}^i = F_{k-1} P_{k-1}^i F_{k-1}^T + Q_k    (4.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;
T denotes matrix transposition;
P_{k-1}^i is the covariance matrix of the state variable parameters of target entity i at moment k-1, a 6×6 matrix;
P_{k|k-1}^i is the moment-k state variable covariance matrix estimated from the moment k-1 covariance matrix of target entity i;
Q_k is the noise variance matrix, identical to formula (1.7) above.
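The two prediction equations (3.1) and (4.1) amount to a standard Kalman predict step. A minimal sketch, using an illustrative 2-state [x, vx] model rather than the patent's 6-state one, with made-up noise values:

```python
import numpy as np

def predict(x, P, F, Q):
    """Prediction step of eqs. (3.1) and (4.1):
    x_{k|k-1} = F x_{k-1},  P_{k|k-1} = F P_{k-1} F^T + Q."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

F = np.array([[1.0, 0.1],   # x += vx * dt with dt = 0.1 s
              [0.0, 1.0]])
Q = np.diag([0.0, 0.01])    # illustrative process noise
x, P = np.array([10.0, 5.0]), np.eye(2)
x_pred, P_pred = predict(x, P, F, Q)
print(x_pred)        # predicted [x, vx]
print(P_pred[0, 0])  # position uncertainty grows between measurements
```

Note that the covariance only grows in this step; it is the fusion step (6.1)-(6.3) below that shrinks it again when a measurement arrives.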
Preferably, the multiple sensors, sensor 1, sensor 2, ..., sensor m, each comprise one or more of a millimeter-wave radar, an image sensor, a lidar, and an ultrasonic radar; they identify target entities and simultaneously obtain their measurement parameters. The target-entity measurement values of a sensor include one or more of: the position coordinates of the target entity, its speed, its distance to the sensor, its type, and recognition confidence information. One measurement by each sensor can identify multiple target entities and yield the measurement values of each.
Preferably, the signal processing of the multiple signal processing modules includes filtering the sensor measurements, coordinate conversion, and time synchronization. After signal processing, each target entity yields the measurement matrix Z_k^{j,i}, the measurement parameters of sensor j for target entity i at moment k, where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment.
Preferably, the data calculation and decision process of the data calculation and decision module is as follows. A sensor can identify multiple target entities at the current moment and provides measurement parameters, i.e. a measurement parameter matrix, for each. From the measurements of the multiple sensors at the preceding moments, multiple target entities are predicted to exist at the current moment. The data calculation establishes the association between the multiple target entities measured by a sensor at the current moment and the multiple predicted target entities, and judges whether they share a target source. The association method used is as follows.

Let the target entities measured by a sensor be C_i and the predicted target entities be A_j.

1) Establish the incidence matrix D_ij between the measured target entities C_i and the predicted target entities A_j, where:
subscript i numbers the target entities measured by the sensor, i = 1, 2, 3, ..., m; m is the number of measured target entities;
subscript j numbers the predicted target entities, j = 1, 2, 3, ..., n; n is the number of predicted target entities;
D_ij is the association measure of each pair (C_i, A_j) in the incidence matrix, a measure of their degree of closeness or similarity, called the degree of association. D_ij is computed from a combination of several features describing the target entities, including the distance, the speed difference, and the parameters that the particular class of sensor can measure, such as the target class (large vehicle, medium vehicle, small vehicle, person, or rickshaw).

The target-entity association decision principles of this case are:
1) if D_ij > D_max, (C_i, A_j) are not associated, where D_max is the degree-of-association threshold;
2) a target entity measured by a sensor can be associated with only one predicted target entity;
3) for each predicted target entity A_j, the measured target entity with the minimum degree of association in the corresponding column is the associated one.
By these principles and the degree-of-association values D_ij, the associated target entity pairs (C_i, A_j) can be found.

The association calculation finally yields the following results:
1) a target entity measured by the sensor has a corresponding predicted target entity, i.e. they belong to the same target source; its measurement parameters are then the new measurement of that predicted target entity;
2) a target entity measured by the sensor is associated with no predicted target entity; it may be a new target or a false alarm, to be determined by the target-entity decision;
3) a predicted target entity has no associated target entity among the sensor measurements; the sensor may have missed it, or it may have been produced by noise, interference, or the like, which is determined by the target-entity decision.
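A minimal sketch of the association step above, using plain Euclidean distance as the degree of association D_ij and a greedy per-prediction minimum. The patent combines several features and does not fix a particular algorithm, so the threshold value, the distance-only degree, and all names here are illustrative assumptions:

```python
import math

D_MAX = 3.0  # degree-of-association threshold D_max (illustrative value)

def degree(c, a):
    """D_ij for one pair: here simply the Euclidean distance between a
    measured position c and a predicted position a."""
    return math.dist(c, a)

def associate(measured, predicted):
    """Greedy association following the three rules above: pairs with
    D_ij > D_max are rejected, each measured target is used at most once,
    and each predicted target takes the measured target of minimum D."""
    pairs, used = {}, set()
    for j, a in enumerate(predicted):
        best = min(((degree(c, a), i) for i, c in enumerate(measured)
                    if i not in used), default=None)
        if best and best[0] <= D_MAX:
            pairs[j] = best[1]
            used.add(best[1])
    return pairs

measured = [(10.2, 2.1), (30.0, -1.0), (80.0, 0.0)]   # C_i from one sensor
predicted = [(10.0, 2.0), (29.5, -0.8)]               # A_j from prediction
print(associate(measured, predicted))  # {0: 0, 1: 1}; (80, 0) stays unmatched
```

The unmatched measurement at (80, 0) is exactly case 2) above: a possible new target or false alarm, left to the target-entity decision.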
It is the confidence level by calculating its target entity that this case, which provides target entity decision-making technique, and decision is false-alarm or reality Body;
Objective degrees of confidence calculates as follows:
1) identify and measure all the sensors of target entity;
2) it identifies and measures the confidence level that the sensor of target entity provides;
3) apart from the time at current time when target entity is identified by sensor;
Wherein: subscript j be sensor number i=1,2 ...;
Subscript i be target entity number j=1,2 ..., m;
K is current time;
K-n be k before n at the time of n=0,1,2 ...;
Weight for sensor j at the k moment is determined according to sensor characteristics and time factor, general same sensing Device current time weight is big, and observation time is grown in the past, and weight is small;Different sensors determine weight according to characteristic;
It is sensor j in the confidence level of k moment target entity i, is provided by sensor, do not observe that target entity is 0;
Work as Bj> ByShi Zuowei physical object exists;
Work as Bj< BnShi Zuowei false-alarm eliminates target entity;
In By< Bj< BnWhen, it can not judge, retain target component, subsequent observation is to be confirmed;
Wherein: ByFor the threshold values for determining target entity;
BnFor the threshold values for determining target false-alarm;
Due to being the decision of multisensor multiple measurement results progress, false alarm rate is substantially reduced.
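The confidence decision above can be sketched as a weighted sum over sensors and moments compared against the two thresholds. The threshold values, weights, and the exact summation layout are illustrative assumptions:

```python
B_Y = 0.8   # threshold B_y for confirming a target entity (illustrative)
B_N = 0.3   # threshold B_n for declaring a false alarm (illustrative)

def target_confidence(history, weights):
    """B^i = sum over moments n and sensors j of w_{k-n}^j * b_{k-n}^{j,i}.
    history[n][j] is sensor j's confidence in the target at moment k-n
    (0 when not observed); weights[n][j] decays for older moments."""
    return sum(w * b for ws, bs in zip(weights, history)
                     for w, b in zip(ws, bs))

def decide(B):
    """Three-way decision: real entity, false alarm, or retain and wait."""
    if B > B_Y:
        return "entity"
    if B < B_N:
        return "false alarm"
    return "undecided"

# Two sensors over three moments (most recent first); sensor 2 missed the
# target at the middle moment, so its confidence there is 0.
history = [(0.9, 0.8), (0.7, 0.0), (0.6, 0.5)]
weights = [(0.5, 0.5), (0.25, 0.25), (0.1, 0.1)]
B = target_confidence(history, weights)
print(round(B, 3), decide(B))  # 1.135 entity
```

A single noisy detection contributes little weight on its own, which is how the multi-moment, multi-sensor decision suppresses false alarms.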
Preferably, the data fusion calculation used by the data fusion calculation module is as follows. According to the association calculation and decision, three classes of results are obtained and handled separately.

1) A target entity measured by the sensor has a corresponding predicted target entity, i.e. they belong to the same target source; its measurement parameters are then the new measurement of that predicted target entity.

The data fusion calculation proceeds through the following three equations.
The first step is the gain calculation:

    K_k^{j,i} = P_{k|k-1}^i (H_k^j)^T [H_k^j P_{k|k-1}^i (H_k^j)^T + R_j]^(-1)    (6.1)

The second step is the optimal estimation of the state variable parameters:

    X̂_k^i = X̂_{k|k-1}^i + K_k^{j,i} (Z_k^{j,i} - H_k^j X̂_{k|k-1}^i)    (6.2)

The third step is the covariance estimation:

    P_k^i = (I - K_k^{j,i} H_k^j) P_{k|k-1}^i    (6.3)

The symbols in formulas (6.1), (6.2), and (6.3) are interpreted as follows:
subscript k denotes the current moment, and k-1 the previous moment;
superscript i is the associated target entity number, i = 1, 2, ...;
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
T denotes matrix transposition;
K_k^{j,i} is the data fusion gain matrix of sensor j measuring target entity i against the predicted target entity;
P_{k|k-1}^i is the moment-k state variable covariance matrix estimated from the moment k-1 covariance matrix of target entity i, as in formula (4.1) above;
H_k^j is the target-entity measurement matrix of sensor j at moment k, as in formula (2.3) above;
(H_k^j)^T is its transposed matrix;
R_j is the measurement noise matrix of sensor j, as in formula (2.3) above;
X̂_k^i is the optimal moment-k target-entity state variable parameter matrix obtained by data fusion, as in formula (1.2) above;
X̂_{k|k-1}^i is the moment-k state variable parameter matrix of target entity i predicted from its moment k-1 state variable parameters, as in formula (1.2) above;
Z_k^{j,i} is the measurement parameter matrix of sensor j for target entity i;
P_k^i is the covariance matrix of the moment-k state variable parameters of target entity i, as in formula (4.1) above.

2) A target entity measured by the sensor is associated with no predicted target entity; it may be a new target or a false alarm. It is first handled as a target entity, and only after repeated measurements is it decided whether it is a real entity. An initial target-entity state variable parameter matrix and an initial covariance matrix are needed; they are determined from the measured target entity and the sensor characteristics.

3) A predicted target entity has no associated target entity among the sensor measurements. The sensor may have missed it (a missed detection), it may have left the sensor measurement range, or it may be a false alarm; this is determined by the target-entity decision. After the decision, false alarms and targets no longer within the observation range are removed, and a missed target entity is calculated as follows: its state variable parameter matrix and covariance matrix are taken as the predicted state variable parameter matrix and covariance matrix. In later observations, if it is repeatedly not observed, it is removed as a false alarm; if a sensor does observe the target entity, data fusion is carried out.
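Equations (6.1)-(6.3) are the standard Kalman filter update. A minimal sketch with an illustrative one-dimensional state, so the gain and the shrinking covariance are easy to follow:

```python
import numpy as np

def fuse(x_pred, P_pred, z, H, R):
    """One fusion step, eqs. (6.1)-(6.3):
    K = P H^T (H P H^T + R)^-1
    x = x_pred + K (z - H x_pred)
    P = (I - K H) P_pred"""
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # eq. (6.1)
    x = x_pred + K @ (z - H @ x_pred)        # eq. (6.2)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred  # eq. (6.3)
    return x, P

# 1-D case [position]: prediction 10.0 with variance 1, measurement 10.4
# with variance 1 -> the fused estimate lands halfway and variance halves.
x_pred, P_pred = np.array([10.0]), np.eye(1)
z, H, R = np.array([10.4]), np.eye(1), np.eye(1)
x, P = fuse(x_pred, P_pred, z, H, R)
print(x[0], P[0, 0])  # fused position near 10.2, variance reduced to 0.5
```

With equal prediction and measurement variances the gain is 0.5, so the estimate splits the difference; a noisier sensor (larger R) would pull the estimate less.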
Preferably, the sensor selection switches select the respective sensors and comprise switch K1, switch K2, ..., switch Km. If any of the sensors outputs a measurement result, that sensor is selected, the corresponding sensor selection switch closes, and one data fusion process is carried out. Only one sensor is selected per data fusion; if several sensors output measurement results simultaneously, multiple fusions are performed in sequence, and each fusion result enters the prediction calculation of the next fusion. Cycling in this way, all sensor measurement results are fused to obtain the optimal estimate.
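The one-sensor-at-a-time cycle described above, where each fusion result feeds the next fusion, can be sketched end to end. Everything here is illustrative: a static 1-D state (identity transition, no process noise) stands in for the patent's full model, which would interleave the prediction step (3.1)/(4.1) between fusions:

```python
import numpy as np

def fuse_sequentially(x, P, measurements, R_by_sensor):
    """Each sensor's measurement triggers one fusion process; the result
    becomes the prior of the next one, mirroring the switch arrangement.
    Static 1-D state keeps the cycle itself visible."""
    H = np.eye(1)
    for sensor_j, z in measurements:            # switches close one at a time
        R = R_by_sensor[sensor_j]
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # eq. (6.1)
        x = x + K @ (z - H @ x)                 # eq. (6.2)
        P = (np.eye(1) - K @ H) @ P             # eq. (6.3)
    return x, P

# Radar and camera report the same target almost simultaneously; instead of
# synchronizing them, the two measurements are fused in sequence.
x, P = np.array([10.0]), np.eye(1)
meas = [("radar", np.array([10.4])), ("camera", np.array([10.1]))]
x, P = fuse_sequentially(x, P, meas, {"radar": np.eye(1), "camera": np.eye(1)})
print(round(x[0], 3))  # 10.167: every measurement contributes, none discarded
```

Note how the second fusion receives a smaller covariance than the first, so later measurements are weighted against an increasingly confident estimate; this is the mechanism that lets historical results keep influencing the fused state.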
The beneficial effects of the present invention are: the false alarm rate and missed detections are reduced, and measurement accuracy is improved. All sensor measurement results can be fused, and the decision process relies not only on the current measurement results of all sensors but also on the accumulated historical results, which improves precision and reduces the false alarm rate and the missed detection rate.
Description of the drawings:
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a schematic diagram of the fusion method of the multi-sensor information fusion method for an advanced driver assistance system according to the present invention.
Fig. 2 is a schematic diagram of a prior-art fusion method for multi-sensor information fusion in an advanced driver assistance system.
Specific embodiment
In the embodiment shown in Fig. 1, a multi-sensor information fusion method for an advanced driver assistance system comprises a data calculation and decision module 10, multiple sensors 20, multiple signal processing modules 50, and a data fusion calculation module 40, and further comprises a prediction calculation module 30 and multiple sensor selection switches 60. The prediction calculation module predicts each target entity's state at the current moment from its state after the previous moment's data fusion; it associates the multiple target entities observed by the sensors with the multiple predicted target entities and judges whether each observed target entity is a real entity or a false alarm, rejecting false alarms. The sensor selection switches are respectively arranged between each sensor and its corresponding signal processing module; if any of the sensors outputs a measurement result, that sensor is selected, the corresponding sensor selection switch closes, and one data fusion process is carried out. Only one sensor is selected per data fusion process; if several sensors output measurement results simultaneously, multiple data fusions are performed in sequence, cycling until all sensor measurements are fused. The multiple sensors 20 comprise sensor 1#, sensor 2#, ..., sensor m#; the multiple signal processing modules 50 comprise signal processing module 1#, signal processing module 2#, ..., signal processing module m#.
Before fusion, the coordinate system used for data fusion is first determined. A vehicle-body coordinate system is used: all sensors are mounted on the body of the ego vehicle, at multiple different positions, the coordinate origin is fixed at the center of the foremost point of the vehicle, the positive X axis points forward, and the positive Y axis points to the left. The target-entity state equation and observation equation are established in this coordinate system.
The association judgment yields the following cases: first, a target entity obtained by one sensor measurement is associated with a predicted target entity, i.e. the two target entities stem from the same target source; second, no associated target entity can be found among the predictions for a target entity obtained by sensor measurement; third, a predicted target entity has no associated target entity among the sensor measurements. The second and third cases may correspond to a new target entity or to a false alarm. The second task of the prediction calculation module is therefore to judge whether each such target entity is a real entity or a false alarm, and to reject it if it is a false alarm.
The target entity state equation is a discrete state equation:

    X_k^i = F_{k-1} X_{k-1}^i + W_{k-1}    (1.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;

    X_k^i = [x_k^i  y_k^i  vx_k^i  vy_k^i  ax_k^i  ay_k^i]^T    (1.2)

is the state variable matrix describing target entity i, in which:
x_k^i is the x-direction coordinate of target entity i at moment k;
y_k^i is the y-direction coordinate of target entity i at moment k;
vx_k^i is the x-direction velocity of target entity i at moment k;
vy_k^i is the y-direction velocity of target entity i at moment k;
ax_k^i is the x-direction acceleration of target entity i at moment k;
ay_k^i is the y-direction acceleration of target entity i at moment k;
T denotes matrix transposition;
F_{k-1} is the target entity motion state matrix; the state matrix of this case is:

    F_{k-1} = [ 1  0  Δt  0   Δt²/2  0     ;
                0  1  0   Δt  0      Δt²/2 ;
                0  0  1   0   Δt     0     ;
                0  0  0   1   0      Δt    ;
                0  0  0   0   1      0     ;
                0  0  0   0   0      1     ]    (1.3)

where Δt is the time interval, i.e. the interval between moments k-1 and k;
W_{k-1} is white Gaussian noise obeying a normal distribution and satisfying:
mean: E[W_k] = 0    (1.4)
covariance: Cov(W_l, W_n) = Q_k δ_ln    (1.5)
Q_k is the noise variance matrix (formula (1.7)), whose entries are determined by the time interval Δt and the following noise variances:
σvx is the x-direction velocity noise variance;
σvy is the y-direction velocity noise variance;
σax is the x-direction acceleration noise variance;
σay is the y-direction acceleration noise variance;
this case treats the acceleration as constant and its variation as noise, with variances σax and σay;
δ_ln is the Kronecker function: δ_ln = 1 if l = n, and 0 otherwise;
subscripts l and n denote moments l and n.
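The discrete state equation and constant-acceleration state matrix above can be sketched as follows (a hedged illustration assuming the state ordering [x, y, vx, vy, ax, ay]^T given in the text; the numeric values are made up for the demonstration):

```python
def make_F(dt):
    """Constant-acceleration transition matrix for state [x, y, vx, vy, ax, ay]."""
    return [
        [1, 0, dt, 0, 0.5 * dt * dt, 0],
        [0, 1, 0, dt, 0, 0.5 * dt * dt],
        [0, 0, 1, 0, dt, 0],
        [0, 0, 0, 1, 0, dt],
        [0, 0, 0, 0, 1, 0],
        [0, 0, 0, 0, 0, 1],
    ]

def mat_vec(F, x):
    """Matrix-vector product: one step of X_k = F_{k-1} X_{k-1} (noise omitted)."""
    return [sum(f * xi for f, xi in zip(row, x)) for row in F]

# Target at x = 10 m moving at vx = 2 m/s with ax = 1 m/s^2, over dt = 1 s:
x_next = mat_vec(make_F(1.0), [10.0, 0.0, 2.0, 0.0, 1.0, 0.0])
# x advances by vx*dt + 0.5*ax*dt^2 = 2.5 m, and vx by ax*dt = 1 m/s.
```

The process noise W of formula (1.1) is omitted here; it enters the filter only through the covariance propagation, not through the deterministic state step.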
The measurement parameters of the target entity described in this case are the measurement parameters obtained when a sensor identifies surrounding objects; different sensors yield different measurement parameters.
For example, the target entity measurement parameter matrix of an image sensor is expressed as

    Z_k^c = [x_k^c  y_k^c]^T    (2.1)

where:
superscript c is the number of the target entity measured by the image sensor, c = 1, 2, ...;
subscript k is the current moment;
T denotes matrix transposition;
x_k^c is the x-direction coordinate position of target entity c measured by the image sensor;
y_k^c is the y-direction coordinate position of target entity c measured by the image sensor.
The target entity measurement parameter matrix of the millimeter-wave radar is expressed as

    Z_k^r = [x_k^r  y_k^r  vx_k^r]^T    (2.2)

where:
superscript r is the number of the target entity measured by the millimeter-wave radar, r = 1, 2, ...;
subscript k is the current moment;
x_k^r is the x-direction coordinate position of the target entity measured by the millimeter-wave radar;
y_k^r is the y-direction coordinate position of the target entity measured by the millimeter-wave radar;
vx_k^r is the x-direction velocity of the target entity measured by the millimeter-wave radar.
The relation equation between the target entity measurement parameters and the target entity state variables established in this case is:

    Z_k^{ij} = H_k^j X_k^i + V_k^j    (2.3)

where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment;
Z_k^{ij} is the measurement parameter matrix of sensor j for target entity i at moment k; for the image sensor it takes the form of (2.1), and for the millimeter-wave radar the form of (2.2);
X_k^i is the state variable matrix of target entity i of formula (1.2) above;
V_k^j is the white Gaussian noise of sensor j, also obeying a normal distribution:
mean: E[V_k^j] = 0    (2.4)
covariance: Cov(V_l, V_n) = R_j δ_ln    (2.5)
R_j is the measurement noise matrix of sensor j; different sensors have different measurement noise matrices;
δ_ln is the Kronecker function: δ_ln = 1 if l = n, and 0 otherwise; l and n denote moments l and n;
H_k^j is the measurement matrix of sensor j at moment k; H_k^j differs for different sensors.
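The observation equation uses a different measurement matrix per sensor. Assuming the state ordering [x, y, vx, vy, ax, ay]^T above, the camera matrix selects [x, y] and the millimeter-wave radar matrix selects [x, y, vx]; a minimal sketch:

```python
H_CAMERA = [
    [1, 0, 0, 0, 0, 0],  # x position
    [0, 1, 0, 0, 0, 0],  # y position
]
H_RADAR = [
    [1, 0, 0, 0, 0, 0],  # x position
    [0, 1, 0, 0, 0, 0],  # y position
    [0, 0, 1, 0, 0, 0],  # x velocity
]

def project(H, x):
    """Z = H · X: project a state vector into a sensor's measurement space."""
    return [sum(h * xi for h, xi in zip(row, x)) for row in H]

state = [12.5, -1.0, 3.0, 0.0, 1.0, 0.0]
z_cam = project(H_CAMERA, state)    # [12.5, -1.0]
z_radar = project(H_RADAR, state)   # [12.5, -1.0, 3.0]
```

In the fusion update, the same predicted state is projected through whichever H matches the sensor currently selected, which is how one filter can consume measurements of different dimensions.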
The prediction calculation of the budget computing module includes prediction of the parameters of existing target entities; target entity parameter prediction comprises state variable prediction and covariance prediction. The state prediction equation is:

    X̂_{k|k-1}^i = F_{k-1} X̂_{k-1}^i    (3.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;
X̂_{k|k-1}^i is the state variable of target entity i at moment k predicted from its state variable at moment k-1; the state variable matrix is identical to formula (1.2) above;
F_{k-1} is the target entity motion state matrix, defined identically to formula (1.3) above;
X̂_{k-1}^i is the state variable matrix of target entity i at moment k-1, the same as formula (1.2) above.
The prediction equation of the covariance is:

    P_{k|k-1}^i = F_{k-1} P_{k-1}^i F_{k-1}^T + Q_k    (4.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment;
subscript k-1 is the previous moment;
T denotes matrix transposition;
P_{k-1}^i is the covariance matrix of the state variable parameters of target entity i at moment k-1, a 6×6 matrix;
P_{k|k-1}^i is the covariance matrix of the state variable at moment k estimated from the covariance matrix of the state variable of target entity i at moment k-1;
Q_k is the noise variance matrix, identical to formula (1.7) above.
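The state and covariance prediction equations above form the familiar Kalman prediction step. A minimal pure-Python sketch, using a 2-state [position, velocity] model for brevity (the 6-state case is identical in form; all numeric values are illustrative):

```python
def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def predict(x, P, F, Q):
    """State prediction X̂ = F·X and covariance prediction P = F·P·Fᵀ + Q."""
    x_pred = [sum(f * xi for f, xi in zip(row, x)) for row in F]
    P_pred = mat_add(mat_mul(mat_mul(F, P), transpose(F)), Q)
    return x_pred, P_pred

F = [[1.0, 1.0], [0.0, 1.0]]          # dt = 1 s
Q = [[0.01, 0.0], [0.0, 0.01]]
x1, P1 = predict([10.0, 2.0], [[1.0, 0.0], [0.0, 1.0]], F, Q)
# x1 = [12.0, 2.0]; the covariance grows, reflecting prediction uncertainty.
```

Note how the predicted covariance is larger than the prior one: the process noise Q of formula (4.1) accounts for the target maneuvering between measurements.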
The multiple sensors, numbered sensor 1, sensor 2, ..., sensor m, each comprise one or more of a millimeter-wave radar, an image sensor, a laser radar and an ultrasonic radar, and are used to identify target entities and obtain target entity measurement parameters. The target entity measured values of a sensor include one or more of: position coordinates of the target entity, velocity, distance from the target entity to the sensor, type, and recognition confidence information. One measurement by each sensor can identify multiple target entities and the measured values of those multiple target entities.
The signal processing performed by the multiple signal processing modules includes filtering of the sensor measured values, coordinate conversion and time synchronization. After signal processing, the measurement matrix Z_k^{ij} of each target entity is obtained, denoting the measurement parameters of sensor j for target entity i at moment k;
where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment;
Z_k^{ij} denotes the measurement parameter matrix of sensor j for target entity i at moment k.
The data calculation and decision process of the data calculation and decision module is as follows: at the current moment one sensor can identify multiple target entities and give measurement parameters, i.e. a measurement parameter matrix, for each of them; from repeated sensor measurements at multiple preceding moments, multiple target entities are predicted for the current moment. The data calculation establishes the association relations between the multiple target entities measured by the sensor at the current moment and the multiple predicted target entities, and judges whether each pair shares the same target source. The association method used is as follows.
Suppose a target entity measured by the sensor is C_i and a predicted target entity is A_j.
1) Establish the association matrix D_ij between the measured target entities C_i and the predicted target entities A_j:
subscript i is the number of the target entity measured by the sensor, i = 1, 2, 3, ..., m; m is the number of target entities measured by the sensor;
subscript j is the number of the predicted target entity, j = 1, 2, 3, ..., n; n is the number of predicted target entities;
D_ij is the association measure of each pair (C_i, A_j) in the association matrix, a measure of the closeness or similarity of C_i and A_j, called the degree of association.
D_ij is computed synthetically from multiple features describing the target entities; these features include distance, velocity difference, and parameters that individual sensors can measure, such as target type (large vehicle, medium vehicle, small car, pedestrian, human-powered vehicle).
The target entity association decision principles of this case are as follows:
1) If D_ij > D_max, (C_i, A_j) are unrelated; D_max is the degree-of-association threshold;
2) a target entity measured by the sensor can be associated with only one predicted target entity;
3) in the association matrix, each predicted target entity A_j is related to the measured target entity whose degree of association is the minimum in the corresponding column.
Through the above principles and the degree-of-association values D_ij, the associated target entity pairs (C_i, A_j) can be found.
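The association principles above can be sketched as a greedy nearest-neighbor match on the association matrix (a simplified illustration; the patent's actual D_ij combines distance, velocity difference and target type, and the values below are made up):

```python
def associate(D, d_max):
    """Greedy nearest-neighbor association on an m x n association matrix D.
    Pairs are visited from the smallest association measure upward; a pair
    (i, j) is accepted only if D[i][j] <= d_max and neither the measured
    target i nor the predicted target j is already matched."""
    pairs, used_i, used_j = [], set(), set()
    candidates = sorted((D[i][j], i, j)
                        for i in range(len(D)) for j in range(len(D[0])))
    for d, i, j in candidates:
        if d <= d_max and i not in used_i and j not in used_j:
            pairs.append((i, j))
            used_i.add(i)
            used_j.add(j)
    return pairs

D = [
    [0.2, 5.0],   # measured target 0 is close to predicted target 0
    [4.0, 0.3],   # measured target 1 is close to predicted target 1
    [9.0, 9.0],   # measured target 2 matches nothing: new target or false alarm
]
matches = associate(D, d_max=1.0)   # [(0, 0), (1, 1)]
```

Measured targets left unmatched fall into result case 2) below (new target or false alarm), and predicted targets left unmatched fall into case 3) (missed detection or noise).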
The above association calculation finally yields the following results:
1) a target entity measured by the sensor has a predicted target entity corresponding to it, i.e. they belong to the same target source; the measurement parameters of that target entity are then the new measured values of the predicted target entity;
2) a target entity measured by the sensor is not associated with any predicted target entity; it may be a new target, or it may be a false alarm, to be determined by the target entity decision;
3) a predicted target entity has no associated target entity among the sensor measurements; the sensor may have missed it, or it may have been generated by noise, interference, etc., to be determined by the target entity decision.
The target entity decision method given in this case is to calculate the confidence of a target entity and decide whether it is a false alarm or a real entity.
The target confidence is calculated from:
1) all the sensors that have identified and measured the target entity;
2) the confidence given by each sensor that identified and measured the target entity;
3) the time elapsed, relative to the current moment, since the target entity was identified by a sensor.
The confidence of target entity i is the weighted sum of the sensor confidences over the recent moments:

    B^i = Σ_{n} Σ_{j} w_{k-n}^j · b_{k-n}^{ij}    (5.1)

where:
superscript j is the sensor number, j = 1, 2, ..., m;
superscript i is the target entity number, i = 1, 2, ...;
k is the current moment;
k-n are the n moments before k, n = 0, 1, 2, ...;
w_{k-n}^j is the weight of sensor j at moment k-n, determined by the sensor characteristics and the time factor; in general, for the same sensor the weight at the current moment is large, and the weight of observations further in the past is small; different sensors determine their weights according to their characteristics;
b_{k-n}^{ij} is the confidence of sensor j for target entity i at moment k-n, given by the sensor; it is 0 when the sensor did not observe the target entity.
When B^i > B_y, the target is taken to exist as a real entity;
when B^i < B_n, it is taken to be a false alarm and the target entity is eliminated;
when B_n < B^i < B_y, no judgment can be made; the target parameters are retained and confirmation awaits subsequent observations;
where: B_y is the threshold for deciding a target entity;
B_n is the threshold for deciding a target false alarm.
Since the decision is made from multiple measurement results of multiple sensors, the false alarm rate is greatly reduced.
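The confidence decision can be sketched as follows (the weights, thresholds and the additive combination rule are illustrative assumptions consistent with the text: a target's confidence accumulates sensor-reported confidences over recent moments, weighted so that older observations count less):

```python
def target_confidence(history, weights):
    """history[n][j]: confidence reported by sensor j at moment k-n
    (0 when that sensor did not observe the target);
    weights[n][j]: weight of sensor j at moment k-n (older -> smaller)."""
    return sum(w * b
               for w_row, b_row in zip(weights, history)
               for w, b in zip(w_row, b_row))

def decide(B, B_y, B_n):
    """Entity if B > B_y, false alarm if B < B_n, otherwise keep and re-observe."""
    if B > B_y:
        return "entity"
    if B < B_n:
        return "false_alarm"
    return "pending"

# Two sensors over three moments (newest first); sensor 2 missed the latest frame.
history = [[0.9, 0.0], [0.8, 0.7], [0.7, 0.6]]
weights = [[0.5, 0.5], [0.3, 0.3], [0.2, 0.2]]
B = target_confidence(history, weights)
verdict = decide(B, B_y=0.8, B_n=0.3)
```

Because one missed frame only removes a single weighted term, an established target survives a transient dropout, while a noise-induced detection seen once never accumulates enough confidence to cross B_y.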
The data fusion calculation used by the data fusion calculation module is as follows.
According to the association calculation and decision, three classes of result are obtained and handled separately.
1) A target entity measured by the sensor has a predicted target entity corresponding to it, i.e. they belong to the same target source; the measurement parameters of that target entity are then the new measured values of the predicted target entity.
The data fusion calculation method consists of the following three equations.
The first step is the gain calculation:

    K_k^{ij} = P_{k|k-1}^i (H_k^j)^T [ H_k^j P_{k|k-1}^i (H_k^j)^T + R_j ]^{-1}    (6.1)

The second step is the optimal estimation of the state variable parameters:

    X̂_k^i = X̂_{k|k-1}^i + K_k^{ij} ( Z_k^{ij} - H_k^j X̂_{k|k-1}^i )    (6.2)

The third step is the covariance estimation:

    P_k^i = ( I - K_k^{ij} H_k^j ) P_{k|k-1}^i    (6.3)

The symbols of formulas (6.1), (6.2) and (6.3) are explained as follows:
subscript k denotes the current moment, and k-1 the previous moment;
superscript i is the associated target entity number, i = 1, 2, ...;
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
T denotes matrix transposition;
K_k^{ij} is the data fusion gain matrix of sensor j measuring target entity i against the predicted target entity;
P_{k|k-1}^i is the covariance matrix of the state variable parameters at moment k estimated from the covariance matrix of the state variable parameters of target entity i at moment k-1, as in formula (4.1) above;
H_k^j is the target entity measurement matrix of sensor j at moment k, as in formula (2.3) above;
(H_k^j)^T is the transposed matrix of H_k^j;
R_j is the measurement noise matrix of sensor j, as in formula (2.5) above;
X̂_k^i is the optimal state variable parameter matrix of target entity i at moment k obtained by data fusion, as in formula (1.2) above;
X̂_{k|k-1}^i is the state variable parameter matrix of target entity i at moment k predicted from its state variable parameters at moment k-1, the same as formula (1.2) above;
Z_k^{ij} is the measurement parameter matrix of sensor j for target entity i;
P_k^i is the covariance matrix of the state variable parameters of target entity i at moment k, as in formula (4.1) above.
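Formulas (6.1)–(6.3) are the standard Kalman update. A minimal scalar sketch (one state, one measurement, so the matrix inverse of (6.1) reduces to a division; the numbers are illustrative):

```python
def kalman_update(x_pred, P_pred, z, H, R):
    """Scalar Kalman update:
    gain        (6.1): K = P·H / (H·P·H + R)
    state       (6.2): x = x_pred + K·(z - H·x_pred)
    covariance  (6.3): P = (1 - K·H)·P_pred"""
    K = P_pred * H / (H * P_pred * H + R)
    x = x_pred + K * (z - H * x_pred)
    P = (1.0 - K * H) * P_pred
    return x, P, K

# Prediction says 12.0 with variance 4.0; the sensor reports 10.0 with variance 4.0.
x, P, K = kalman_update(x_pred=12.0, P_pred=4.0, z=10.0, H=1.0, R=4.0)
# Equal variances -> gain 0.5, fused state halfway at 11.0, variance halved to 2.0.
```

The gain weighs the prediction against the measurement by their variances: a noisier sensor (larger R) pulls the fused state less, and the posterior covariance is always no larger than the predicted one.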
2) A target entity measured by the sensor is not associated with any predicted target entity; it may be a new target, or it may be a false alarm. It is first handled as a target entity, and only after repeated measurements is it decided whether it is a real entity.
An initial matrix of target entity state variable parameters and an initial covariance matrix are needed; the initial matrices are determined according to the target entity measured by the sensor and the sensor characteristics.
3) A predicted target entity has no associated target entity among the sensor measurements. The sensor may have failed to identify it, which is called a missed detection; the target may also have left the sensor measurement range, or it may be a false alarm, to be determined by the target entity decision.
After the decision, false alarms and target entities outside the observation range are removed; for a missed detection, i.e. an omitted target entity, the following calculation is performed:
the state variable parameter matrix and covariance matrix of the omitted target entity are taken as the predicted state variable parameter matrix and covariance matrix. In subsequent observations, if the target is repeatedly not observed, it is removed as a false alarm; if a sensor does observe the target entity, the data fusion calculation is carried out.
The sensor selection switches select the respective sensors; the sensor selection switches include switch K1, switch K2, ..., switch Km. If any sensor outputs a measurement result, that sensor is selected, the corresponding sensor selection switch is closed, and one data fusion process is carried out. Only one sensor is selected per data fusion; if multiple sensors have measurement results at the same time, multiple fusions are performed in sequence. Each fusion result serves as the prediction entering the next fusion; cycling in this way, all sensor measurement results can be data-fused to obtain the optimal estimate.
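The sequential fusion described above, where each fusion result becomes the prediction entering the next fusion, can be sketched with the scalar update (sensor order and values are illustrative):

```python
def fuse_sequentially(x, P, measurements):
    """Fuse a list of (z, R) scalar measurements one at a time; each fusion
    result serves as the prediction for the next fusion, as in the text."""
    for z, R in measurements:
        K = P / (P + R)              # gain (6.1) with H = 1
        x = x + K * (z - x)          # state update (6.2)
        P = (1.0 - K) * P            # covariance update (6.3)
    return x, P

# Radar then camera report the same target position with different noise levels.
x, P = fuse_sequentially(x=12.0, P=4.0, measurements=[(10.0, 4.0), (11.0, 2.0)])
```

Each additional sensor shrinks the covariance further, which is the benefit of fusing all sensors in turn rather than keeping only one measurement per cycle.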
Measurement accuracy is improved because the decision process uses not only all current sensor measurement results but also multiple historical measurement results as the basis for decision. Combining tracking with multi-sensor fusion greatly reduces false alarms while also reducing missed detections.
The above content and structure describe the basic principles, main features and advantages of the method of the present invention, as those skilled in the art will recognize. The above embodiments and the description merely illustrate the principles of the invention; without departing from its spirit and scope, various changes and improvements may be made to the invention, and such changes and improvements fall within the scope of the invention claimed. The scope of the present invention is defined by the appended claims and their equivalents.

Claims (10)

1. A multi-sensor information fusion method for an advanced driver assistance system, comprising a data calculation and decision module, multiple sensors, multiple signal processing modules and a data fusion calculation module, characterized by further comprising a budget computing module and multiple sensor selection switches; the budget computing module predicts the current state of each target entity from its state after the data fusion of the previous moment, and performs association judgment between the multiple target entities observed by the multiple sensors and the multiple target entities obtained by prediction, judging whether each observed target entity is a real entity or a false alarm and rejecting it if it is a false alarm; the multiple sensor selection switches are respectively arranged between each sensor and its corresponding signal processing module; if any one of the multiple sensors outputs a measurement result, that sensor is selected, its corresponding sensor selection switch is closed, and one data fusion process is carried out; only one sensor is selected per data fusion process, and if multiple sensors output measurement results simultaneously, multiple data fusions are performed in sequence; cycling in this way, all sensor measurement results are fused.
2. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: before fusion, the coordinate system used for data fusion is first determined; the coordinate system is the vehicle body coordinate system, all the sensors being installed on the body of the ego vehicle at multiple different positions of the body; the coordinate origin is fixed at the center of the foremost point of the vehicle, with X positive toward the front of the vehicle and Y positive toward the left; the target entity state equation and observation equation are established in this coordinate system.
3. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: the association judgment results include the following cases: first, a target entity obtained by one sensor measurement is related to a predicted target entity, i.e. the two target entities come from the same target source; second, a target entity obtained by sensor measurement has no related target entity among the prediction results; third, a predicted target entity has no related target entity among the sensor measurements; the second and third cases may be new target entities or false alarms; the budget computing module therefore judges whether each observed target entity is a real entity or a false alarm, and rejects it if it is a false alarm.
4. The multi-sensor information fusion method for an advanced driver assistance system according to claim 2, characterized in that: the target entity state equation is a discrete state equation:

    X_k^i = F_{k-1} X_{k-1}^i + W_{k-1}    (1.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;

    X_k^i = [x_k^i  y_k^i  vx_k^i  vy_k^i  ax_k^i  ay_k^i]^T    (1.2)

is the state variable matrix describing target entity i, in which:
x_k^i is the x-direction coordinate of target entity i at moment k;
y_k^i is the y-direction coordinate of target entity i at moment k;
vx_k^i is the x-direction velocity of target entity i at moment k;
vy_k^i is the y-direction velocity of target entity i at moment k;
ax_k^i is the x-direction acceleration of target entity i at moment k;
ay_k^i is the y-direction acceleration of target entity i at moment k;
T denotes matrix transposition;
F_{k-1} is the target entity motion state matrix; the state matrix established in this case is:

    F_{k-1} = [ 1  0  Δt  0   Δt²/2  0     ;
                0  1  0   Δt  0      Δt²/2 ;
                0  0  1   0   Δt     0     ;
                0  0  0   1   0      Δt    ;
                0  0  0   0   1      0     ;
                0  0  0   0   0      1     ]    (1.3)

where Δt is the time interval, i.e. the interval between moments k-1 and k;
W_{k-1} is white Gaussian noise obeying a normal distribution and satisfying:
mean: E[W_k] = 0    (1.4)
covariance: Cov(W_l, W_n) = Q_k δ_ln    (1.5)
Q_k is the noise variance matrix (formula (1.7)), whose entries are determined by the time interval Δt and the following noise variances:
σvx is the x-direction velocity noise variance;
σvy is the y-direction velocity noise variance;
σax is the x-direction acceleration noise variance;
σay is the y-direction acceleration noise variance;
this case treats the acceleration as constant and its variation as noise, with variances σax and σay;
δ_ln is the Kronecker function: δ_ln = 1 if l = n, and 0 otherwise; subscripts l and n denote moments l and n.
The measurement parameters of the target entity described in this case are the measurement parameters obtained when a sensor identifies surrounding objects; different sensors yield different measurement parameters.
For example, the target entity measurement parameter matrix of an image sensor is expressed as

    Z_k^c = [x_k^c  y_k^c]^T    (2.1)

where:
superscript c is the number of the target entity measured by the image sensor, c = 1, 2, ...;
subscript k is the current moment;
T denotes matrix transposition;
x_k^c is the x-direction coordinate position of target entity c measured by the image sensor;
y_k^c is the y-direction coordinate position of target entity c measured by the image sensor.
The target entity measurement parameter matrix of the millimeter-wave radar is expressed as

    Z_k^r = [x_k^r  y_k^r  vx_k^r]^T    (2.2)

where:
superscript r is the number of the target entity measured by the millimeter-wave radar, r = 1, 2, ...;
subscript k is the current moment;
x_k^r is the x-direction coordinate position of the target entity measured by the millimeter-wave radar;
y_k^r is the y-direction coordinate position of the target entity measured by the millimeter-wave radar;
vx_k^r is the x-direction velocity of the target entity measured by the millimeter-wave radar.
The relation equation between the target entity measurement parameters and the target entity state variables established in this case is:

    Z_k^{ij} = H_k^j X_k^i + V_k^j    (2.3)

where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment;
Z_k^{ij} is the measurement parameter matrix of sensor j for target entity i at moment k; for the image sensor it takes the form of (2.1), and for the millimeter-wave radar the form of (2.2);
X_k^i is the state variable matrix of target entity i of formula (1.2) above;
V_k^j is the white Gaussian noise of sensor j, also obeying a normal distribution:
mean: E[V_k^j] = 0    (2.4)
covariance: Cov(V_l, V_n) = R_j δ_ln    (2.5)
R_j is the measurement noise matrix of sensor j; different sensors have different measurement noise matrices;
δ_ln is the Kronecker function: δ_ln = 1 if l = n, and 0 otherwise; l and n denote moments l and n;
H_k^j is the measurement matrix of sensor j at moment k; H_k^j differs for different sensors.
5. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: the prediction calculation of the budget computing module includes prediction of the parameters of existing target entities; target entity parameter prediction comprises state variable prediction and covariance prediction; the state prediction equation is:

    X̂_{k|k-1}^i = F_{k-1} X̂_{k-1}^i    (3.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment, and k-1 is the previous moment;
X̂_{k|k-1}^i is the state variable of target entity i at moment k predicted from its state variable at moment k-1; the state variable matrix is identical to formula (1.2) of claim 4;
F_{k-1} is the target entity motion state matrix, defined identically to formula (1.3) of claim 4;
X̂_{k-1}^i is the state variable matrix of target entity i at moment k-1, the same as formula (1.2) of claim 4.
The prediction equation of the covariance is:

    P_{k|k-1}^i = F_{k-1} P_{k-1}^i F_{k-1}^T + Q_k    (4.1)

where:
superscript i is the target entity number, i = 1, 2, ...;
subscript k is the current moment;
subscript k-1 is the previous moment;
T denotes matrix transposition;
P_{k-1}^i is the covariance matrix of the state variable parameters of target entity i at moment k-1, a 6×6 matrix;
P_{k|k-1}^i is the covariance matrix of the state variable at moment k estimated from the covariance matrix of the state variable of target entity i at moment k-1;
Q_k is the noise variance matrix, identical to formula (1.7) of claim 4.
6. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: the multiple sensors, numbered sensor 1, sensor 2, ..., sensor m, each comprise one or more of a millimeter-wave radar, an image sensor, a laser radar and an ultrasonic radar, and are used to identify target entities and obtain target entity measurement parameters; the target entity measured values of a sensor include one or more of: position coordinates of the target entity, velocity, distance from the target entity to the sensor, type, and recognition confidence information; one measurement by each sensor can identify multiple target entities and the measured values of those multiple target entities.
7. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: the signal processing performed by the multiple signal processing modules includes filtering of the sensor measured values, coordinate conversion and time synchronization; after signal processing, the measurement matrix Z_k^{ij} of each target entity is obtained, denoting the measurement parameters of sensor j for target entity i at moment k;
where:
superscript j is the sensor number, j = 1, 2, ..., m; m is the number of sensors;
superscript i is the target entity number, i = 1, 2, 3, ...;
subscript k is the current moment;
Z_k^{ij} denotes the measurement parameter matrix of sensor j for target entity i at moment k.
8. The multi-sensor information fusion method for an advanced driver assistance system according to claim 1, characterized in that: the data calculation and decision process of the data calculation and decision module is as follows: at the current moment one sensor can identify multiple target entities and give measurement parameters, i.e. a measurement parameter matrix, for each of them; from repeated sensor measurements at multiple preceding moments, multiple target entities are predicted for the current moment; the data calculation establishes the association relations between the multiple target entities measured by the sensor at the current moment and the multiple predicted target entities, and judges whether each pair shares the same target source; the association method used is as follows: suppose a target entity measured by the sensor is C_i and a predicted target entity is A_j;
1) establish the association matrix D_ij between the measured target entities C_i and the predicted target entities A_j:
subscript i is the number of the target entity measured by the sensor, i = 1, 2, 3, ..., m; m is the number of target entities measured by the sensor;
subscript j is the number of the predicted target entity, j = 1, 2, 3, ..., n; n is the number of predicted target entities;
D_ij is the association measure of each pair (C_i, A_j) in the association matrix, a measure of the closeness or similarity of C_i and A_j, called the degree of association;
D_ij is computed synthetically from multiple features describing the target entities; these features include distance, velocity difference, and parameters that individual sensors can measure, such as target type (large vehicle, medium vehicle, small car, pedestrian, human-powered vehicle);
the target entity association decision principles of this case are as follows:
1) if D_ij > D_max, (C_i, A_j) are unrelated; D_max is the degree-of-association threshold;
2) a target entity measured by the sensor can be associated with only one predicted target entity;
3) in the association matrix, each predicted target entity A_j is related to the measured target entity whose degree of association is the minimum in the corresponding column;
through the above principles and the degree-of-association values D_ij, the associated target entity pairs (C_i, A_j) can be found;
the above association calculation finally yields the following results:
1) a target entity measured by the sensor has a predicted target entity corresponding to it, i.e. they belong to the same target source; the measurement parameters of that target entity are then the new measured values of the predicted target entity;
2) a target entity measured by the sensor is not associated with any predicted target entity; it may be a new target, or it may be a false alarm, to be determined by the target entity decision;
3) a predicted target entity has no associated target entity among the sensor measurements; the sensor may have missed it, or it may have been generated by noise, interference, etc., to be determined by the target entity decision;
the target entity decision method given in this case is to calculate the confidence of a target entity and decide whether it is a false alarm or a real entity;
the target confidence is calculated from:
1) all the sensors that have identified and measured the target entity;
2) the confidence given by each sensor that identified and measured the target entity;
3) the time elapsed, relative to the current moment, since the target entity was identified by a sensor;
the confidence of target entity i is the weighted sum of the sensor confidences over the recent moments:

    B^i = Σ_{n} Σ_{j} w_{k-n}^j · b_{k-n}^{ij}    (5.1)

where:
superscript j is the sensor number, j = 1, 2, ..., m;
superscript i is the target entity number, i = 1, 2, ...;
k is the current moment;
k-n are the n moments before k, n = 0, 1, 2, ...;
w_{k-n}^j is the weight of sensor j at moment k-n, determined by the sensor characteristics and the time factor; in general, for the same sensor the weight at the current moment is large, and the weight of observations further in the past is small; different sensors determine their weights according to their characteristics;
b_{k-n}^{ij} is the confidence of sensor j for target entity i at moment k-n, given by the sensor; it is 0 when the sensor did not observe the target entity;
when B^i > B_y, the target is taken to exist as a real entity;
when B^i < B_n, it is taken to be a false alarm and the target entity is eliminated;
when B_n < B^i < B_y, no judgment can be made; the target parameters are retained and confirmation awaits subsequent observations;
where: B_y is the threshold for deciding a target entity;
B_n is the threshold for deciding a target false alarm;
since the decision is made from multiple measurement results of multiple sensors, the false alarm rate is greatly reduced.
9. the multi-Sensor Information Fusion Approach described in accordance with the claim 1 for advanced DAS (Driver Assistant System), feature exist In: data fusion used by the data fusion computing module calculates as follows;
According to association calculating and decision, obtains three classes result and handle respectively;
1) target entity of sensor measurement has a prediction target entity to be corresponding to it, that is, belongs to same target source, then the mesh Mark entity measuring parameter is the measured value for predicting that target entity is new;
Data fusion calculation method passes through following three equation:
The first step is gain calculating:
Second step is state variable optimal estimation of parameters
Third step is covariance estimation
The symbols in formulas (6.1), (6.2) and (6.3) are interpreted as follows:
Subscript k denotes the current moment, and k-1 the moment before the current one;
Subscript i = 1, 2, … is the number of the associated target entity;
Superscript j = 1, 2, …, m is the sensor number; m is the number of sensors;
T denotes the matrix transpose;
K_i^j(k) is the data fusion gain matrix between target entity i measured by sensor j and the predicted target entity;
P_i(k|k-1) is the covariance matrix of the k-moment state variable parameters estimated from the covariance matrix of the state variable parameters of target entity i at moment k-1, as in formula (4.1) of claim 5;
H_j(k) is the target entity measurement matrix of sensor j at moment k, as in formula (2.3) of claim 4;
H_j(k)^T is the transposed matrix of H_j(k);
R_j is the measurement noise matrix of sensor j, as in formula (2.3) of claim 4;
X_i(k) is the optimal state variable parameter matrix of the target entity at moment k obtained by data fusion, as in formula (1.2) of claim 4;
X_i(k|k-1) is the state variable parameter matrix of target entity i at moment k, predicted from its state variable parameters at moment k-1; the state variable parameter matrix is as in formula (1.2) of claim 4;
Z_i^j(k) is the measurement parameter matrix of sensor j for target entity i;
P_i(k) is the covariance matrix of the state variable parameters of target entity i at moment k, as in formula (4.1) of claim 5;
2) A target entity measured by a sensor is not associated with any predicted target entity; it may then be a new target, or it may be a false alarm. It is first handled as a new target entity; only repeated measurements can decide whether it is a real entity;
An initial state variable parameter matrix and an initial covariance matrix are required for the target entity; the initial matrices are determined from the sensor's measurement of the target entity and the sensor characteristics;
3) A predicted target entity has no associated sensor measurement; the sensor may have failed to identify the target, called a miss; the target may also have left the sensor's measurement range, or it may have been a false alarm, which is determined by the target entity decision;
After the decision, false alarms and target entities outside the observation range are removed; for a miss, i.e. an omitted target entity, the following calculation is performed;
The omitted target entity's state variable parameter matrix and covariance matrix are taken as the predicted state variable parameter matrix and covariance matrix; in subsequent observations, if the target is repeatedly not observed, it is removed as a false alarm; if a sensor does observe the target entity, the data fusion calculation is carried out.
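The three-step update of formulas (6.1)-(6.3) is the standard Kalman filter measurement update. A minimal sketch in Python/NumPy, assuming for illustration a constant-velocity state [x, y, vx, vy] and a position-only sensor (the matrix shapes and noise values are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def fuse_measurement(x_pred, P_pred, z, H, R):
    """Kalman measurement update: gain (6.1), state (6.2), covariance (6.3)."""
    # (6.1) gain: K = P H^T (H P H^T + R)^-1
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    # (6.2) optimal state estimate: x = x_pred + K (z - H x_pred)
    x = x_pred + K @ (z - H @ x_pred)
    # (6.3) covariance estimate: P = (I - K H) P_pred
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# Illustrative values: state [x, y, vx, vy], sensor measures position only.
x_pred = np.array([10.0, 5.0, 1.0, 0.0])    # predicted state at moment k
P_pred = np.diag([4.0, 4.0, 1.0, 1.0])      # predicted covariance
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])        # measurement matrix H_j(k)
R = np.diag([1.0, 1.0])                     # measurement noise R_j
z = np.array([11.0, 4.6])                   # sensor measurement Z_i^j(k)

x, P = fuse_measurement(x_pred, P_pred, z, H, R)
```

The fused estimate is pulled toward the measurement in proportion to the gain, and the covariance shrinks, reflecting the reduced uncertainty after fusion.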
10. The multi-sensor information fusion method for advanced driver assistance according to claim 1, characterized in that: the sensor selection switch refers to the selection of the respective sensor; the sensor selection switches comprise switch K1, switch K2, …, switch Km; if any sensor outputs a measurement result, that sensor is selected, the corresponding sensor selection switch is closed, and one data fusion process is carried out; only one sensor is selected per data fusion, and if several sensors have measurement results at the same time, fusion is performed repeatedly in sequence; each fusion result serves as the prediction entering the next fusion, so that by cycling in this way the measurement results of all sensors are fused to obtain the optimal estimate.
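The switch logic of claim 10 amounts to sequential (cascaded) fusion: sensors that reported in the current cycle are fused one at a time, and the output of each fusion becomes the prediction entering the next. A hypothetical sketch of that loop (function names, shapes and noise values are illustrative assumptions):

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """Standard Kalman measurement update (gain, state, covariance)."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

def sequential_fusion(x_pred, P_pred, measurements):
    """Fuse all sensors whose 'selection switch' is closed, one at a time.

    `measurements` is a list of (z, H, R) tuples, one per reporting sensor.
    Each fusion result becomes the prediction for the next fusion, so the
    loop realizes the repeated, sequenced fusion described in claim 10.
    """
    x, P = x_pred, P_pred
    for z, H, R in measurements:
        x, P = kalman_update(x, P, z, H, R)
    return x, P

# Two sensors measure the same 2-D position with different noise levels.
H = np.eye(2)
x_pred = np.array([0.0, 0.0])
P_pred = np.eye(2) * 9.0
meas = [(np.array([1.0, 1.0]), H, np.eye(2) * 1.0),   # accurate sensor
        (np.array([2.0, 2.0]), H, np.eye(2) * 4.0)]   # noisier sensor
x, P = sequential_fusion(x_pred, P_pred, meas)
```

Because each update shrinks the covariance, the noisier second sensor moves the estimate less than the accurate first one, which is the behavior the cyclic switch scheme relies on.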
CN201811363092.4A 2018-11-15 2018-11-15 A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary Pending CN109343051A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811363092.4A CN109343051A (en) 2018-11-15 2018-11-15 A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811363092.4A CN109343051A (en) 2018-11-15 2018-11-15 A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary

Publications (1)

Publication Number Publication Date
CN109343051A true CN109343051A (en) 2019-02-15

Family

ID=65315631

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811363092.4A Pending CN109343051A (en) 2018-11-15 2018-11-15 A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary

Country Status (1)

Country Link
CN (1) CN109343051A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163270A (en) * 2019-05-10 2019-08-23 北京易控智驾科技有限公司 Multi-Sensor Information Fusion Approach and system
CN111391823A (en) * 2019-12-27 2020-07-10 湖北亿咖通科技有限公司 Multilayer map making method for automatic parking scene
CN112712549A (en) * 2020-12-31 2021-04-27 上海商汤临港智能科技有限公司 Data processing method, data processing device, electronic equipment and storage medium
CN112943450A (en) * 2019-12-10 2021-06-11 中国航发商用航空发动机有限责任公司 Engine monitoring device
CN113761705A (en) * 2021-07-19 2021-12-07 合肥工业大学 Multi-sensor fusion method and system based on multi-dimensional attribute correlation analysis
WO2023071992A1 (en) * 2021-10-26 2023-05-04 北京万集科技股份有限公司 Method and apparatus for multi-sensor signal fusion, electronic device and storage medium

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103731848A (en) * 2012-10-12 2014-04-16 北京源微达科技有限公司 Data fusion method for multi-sensor distributed system
CN103729357A (en) * 2012-10-12 2014-04-16 北京源微达科技有限公司 Distributed sensor system data association method
CN104502907A (en) * 2014-12-15 2015-04-08 西安电子工程研究所 Stable ground moving/static target tracking method for airborne radar
CN105372659A (en) * 2015-11-20 2016-03-02 上海无线电设备研究所 Road traffic monitoring multi-target detection tracking method and tracking system
CN105844217A (en) * 2016-03-11 2016-08-10 南京航空航天大学 Multi-target tracking method based on measure-driven target birth intensity PHD (MDTBI-PHD)
CN106093951A (en) * 2016-06-06 2016-11-09 清华大学 Object tracking methods based on array of ultrasonic sensors
CN106101590A (en) * 2016-06-23 2016-11-09 上海无线电设备研究所 The detection of radar video complex data and processing system and detection and processing method
CN106408940A (en) * 2016-11-02 2017-02-15 南京慧尔视智能科技有限公司 Microwave and video data fusion-based traffic detection method and device
US20180096256A1 (en) * 2015-02-26 2018-04-05 Stmicroelectronics International N.V. Method and apparatus for determining probabilistic context awareness of a mobile device user using a single sensor and/or multi-sensor data fusion
CN107942293A (en) * 2017-10-30 2018-04-20 中国民用航空总局第二研究所 The Target dots processing method and system of airport surface detection radar
CN107980138A (en) * 2016-12-28 2018-05-01 深圳前海达闼云端智能科技有限公司 A kind of false-alarm obstacle detection method and device
CN108573271A (en) * 2017-12-15 2018-09-25 蔚来汽车有限公司 Optimization method and device, computer equipment and the recording medium of Multisensor Target Information fusion
CN108733055A (en) * 2018-05-18 2018-11-02 郑州万达科技发展有限公司 A kind of method and AGV navigation positional devices of Fusion



Similar Documents

Publication Publication Date Title
CN109343051A (en) A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary
CN102881022B (en) Concealed-target tracking method based on on-line learning
CN104573646B Front-of-vehicle pedestrian detection method and system based on laser radar and binocular camera
CN112149550B (en) Automatic driving vehicle 3D target detection method based on multi-sensor fusion
CN110264495B (en) Target tracking method and device
CN106428000A (en) Vehicle speed control device and method
CN114299417A (en) Multi-target tracking method based on radar-vision fusion
CN104378582A (en) Intelligent video analysis system and method based on PTZ video camera cruising
CN103426179B (en) A kind of method for tracking target based on mean shift multiple features fusion and device
CN1940591A (en) System and method of target tracking using sensor fusion
CN103824070A (en) Rapid pedestrian detection method based on computer vision
CN108645375B (en) Rapid vehicle distance measurement optimization method for vehicle-mounted binocular system
US20230228593A1 (en) Method, device and system for perceiving multi-site roadbed network and terminal
CN102254394A (en) Antitheft monitoring method for poles and towers in power transmission line based on video difference analysis
CN104517125A (en) Real-time image tracking method and system for high-speed article
JP2012123642A (en) Image identifying device and program
Kühnl et al. Visual ego-vehicle lane assignment using spatial ray features
CN113792598B (en) Vehicle-mounted camera-based vehicle collision prediction system and method
CN115249066A (en) Quantile neural network
KR102494953B1 (en) On-device real-time traffic signal control system based on deep learning
KR20160081190A (en) Method and recording medium for pedestrian recognition using camera
Mannion Vulnerable road user detection: state-of-the-art and open challenges
CN111738323B (en) Hybrid enhanced intelligent track prediction method and device based on gray Markov model
CN110781730B (en) Intelligent driving sensing method and sensing device
BOURJA et al. Real time vehicle detection, tracking, and inter-vehicle distance estimation based on stereovision and deep learning using YOLOv3

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190215