CN114139651A - Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment


Info

Publication number: CN114139651A (application CN202111520563.XA)
Authority: CN (China)
Prior art keywords: sensor, target, current sensor, confidence coefficient
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 覃梓雨, 张昭, 刘杰
Current and original assignee: Dongfeng Nissan Passenger Vehicle Co
Application CN202111520563.XA filed by Dongfeng Nissan Passenger Vehicle Co; publication of CN114139651A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/25: Fusion techniques
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/18: Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Mathematical Physics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Operations Research (AREA)
  • Probability & Statistics with Applications (AREA)
  • Artificial Intelligence (AREA)
  • Algebra (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Software Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

When a single sensor of a sensor sensing system detects a target at time T, the correction coefficient of the current sensor can be updated according to the sensor target confidence of the current sensor at time T and the fusion target confidence at time T-1. The next time the sensor detects the same target, the correction coefficient of the current sensor is used to correct the sensor target confidence before the fusion target confidence is calculated. This improves the accuracy of each sensor's target confidence and, in turn, the accuracy of the targets output by the fusion system.

Description

Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of target tracking technologies, and in particular, to a method for obtaining confidence of a multi-sensor fusion target tracking system, a storage medium, and an electronic device.
Background
Vehicle-mounted obstacle detection uses a sensing system composed of multiple sensors mounted on the vehicle body. The target output by the sensing system is obtained by fusing the detection results of these sensors, so the stability of the multi-sensor fusion target tracking system is easily affected by the sensing performance of each single sensor. A single sensor is susceptible to factors such as illumination, weather, wear, occlusion, external interference and target form, which can cause temporary or permanent degradation of its sensing performance, and the accuracy of the target results identified by a sensor with degraded sensing performance is low.
In current multi-sensor fusion target tracking systems, the confidence of a single sensor is generally set to a fixed factory value and kept unchanged during target tracking. Recognition results from sensors whose sensing performance has degraded therefore cannot be distinguished, which affects the accuracy of the target result output by the multi-sensor fusion target tracking system.
Disclosure of Invention
The present application provides a confidence coefficient acquisition method for a multi-sensor fusion target tracking system, a storage medium and an electronic device, which can dynamically evaluate the recognition performance of a single sensor, adjust the single-sensor target confidence coefficient, and improve target tracking accuracy.
The technical solution of the present application provides a confidence coefficient acquisition method for a multi-sensor fusion target tracking system, comprising the following steps:
in response to the sensor sensing system detecting a target at time T, acquiring a correction coefficient of the current sensor and a sensor target confidence coefficient of the current sensor at time T;
based on a target tracking algorithm, obtaining a fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor;
and updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1.
Further, when time T is the initial time:
in response to the sensor sensing system detecting a target at time T, acquiring a correction coefficient of the current sensor and a sensor target confidence coefficient of the current sensor at time T, wherein the correction coefficient of the current sensor is a preset initial value;
and based on a target tracking algorithm, obtaining a fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor.
Further, updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1 specifically comprises:
if the product of the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor is smaller than the fusion target confidence coefficient at time T-1, adjusting the correction coefficient of the current sensor upward;
and if the product of the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor is greater than the fusion target confidence coefficient at time T-1, adjusting the correction coefficient of the current sensor downward.
Further, updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1 specifically comprises:
constructing an error function from the product of the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor, and the fusion target confidence coefficient at time T-1;
constructing an objective function with respect to the correction coefficient of the current sensor according to the error function;
and updating the correction coefficient of the current sensor according to the objective function.
Further, updating the correction coefficient of the current sensor according to the objective function specifically comprises:
acquiring the correction coefficient θ(k) of the current sensor, where k is an integer greater than or equal to 0, and θ(0) is a preset initial value when k = 0;
determining a correction coefficient change gradient Δθ = ∇θ Jfunc(θ) from the objective function, where ∇θ Jfunc(θ) is the partial derivative of the objective function Jfunc(θ) with respect to θ, and Δθ = 0 when k = 0;
and updating the correction coefficient of the current sensor as θ(k+1) = θ(k) - μ·Δθ according to the correction coefficient change gradient, where μ is a proportionality coefficient.
Further, obtaining the fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T, and the correction coefficient of the current sensor specifically comprises:
determining a sensor false alarm rate and a matching probability according to the product of the target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor;
determining a fusion target confidence coefficient change value according to the sensor false alarm rate and the matching probability;
and determining the fusion target confidence coefficient at time T according to the fusion target confidence coefficient change value.
Further, the target tracking algorithm is a Kalman filter tracking algorithm, a particle filter tracking algorithm, or a multi-hypothesis tracking algorithm.
The technical solution of the present application further provides a storage medium storing computer instructions which, when executed by a computer, perform the confidence coefficient acquisition method of the multi-sensor fusion target tracking system described above.
The technical solution of the present application also provides an electronic device, comprising at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the confidence coefficient acquisition method of the multi-sensor fusion target tracking system as described above.
When a single sensor of the sensor sensing system detects a target at time T, the correction coefficient of the current sensor can be updated according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1. The next time the sensor detects the same target, the correction coefficient of the current sensor is used to correct the sensor target confidence coefficient before the fusion target confidence coefficient is calculated. This improves the accuracy of each sensor's target confidence coefficient and, in turn, the precision of the targets output by the fusion system.
Drawings
The disclosure of the present application will be more readily understood with reference to the drawings. It should be understood that these drawings are for illustrative purposes only and are not intended to limit the scope of the present application. In the figures:
FIG. 1 is a flow chart of a confidence level obtaining method for a multi-sensor fusion target tracking system in an embodiment of the present application;
FIG. 2 is a flow chart of a confidence level acquisition method for a multi-sensor fusion target tracking system in accordance with a preferred embodiment of the present application;
fig. 3 is a schematic diagram of a hardware structure of an electronic device in an embodiment of the present application.
Detailed Description
Embodiments of the present application are further described below with reference to the accompanying drawings.
It is readily understood that, according to the technical solutions of the present application, those skilled in the art may substitute various structures and implementations without changing the spirit of the present application. Therefore, the following detailed description and the accompanying drawings merely illustrate the technical solutions of the present application and should not be construed as limiting them in their entirety.
Orientation terms such as up, down, left, right, front, back, top and bottom used in this specification are defined relative to the configuration shown in the drawings; they are relative terms and may change according to the position and use state of the device, so they should not be construed as limiting. Furthermore, the terms "first", "second" and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Throughout the description of the present application, unless otherwise expressly specified or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly: a connection may, for example, be fixed, removable or integral; mechanical or electrical; direct, or indirect through an intermediate medium, or an internal communication between two components. The specific meanings of these terms in the present application can be understood by those of ordinary skill in the art as appropriate.
The confidence coefficient acquisition method of the multi-sensor fusion target tracking system in the embodiment of the present application, as shown in FIG. 1, comprises:
S101: in response to the sensor sensing system detecting a target at time T, acquiring a correction coefficient of the current sensor and a sensor target confidence coefficient of the current sensor at time T;
S102: based on a target tracking algorithm, obtaining a fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor;
S103: updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1.
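Steps S101 to S103 can be sketched in code. The following is a minimal illustrative loop and not the patent's implementation: the class and function names, the averaging stand-in for the tracking-algorithm fusion in S102, and the gradient-style update rule in S103 are all assumptions.

```python
class SensorState:
    """Holds a sensor's dynamically adjusted correction coefficient."""
    def __init__(self, initial_coeff=1.0):
        self.coeff = initial_coeff  # preset initial value (used at the initial time)

def fuse(prev_fused, sensor_conf, coeff):
    """S102: correct the sensor confidence with the coefficient, then update the
    fused confidence.  The averaging is a stand-in for the real tracking update."""
    corrected = coeff * sensor_conf
    return corrected if prev_fused is None else 0.5 * (prev_fused + corrected)

def update_coeff(state, sensor_conf, prev_fused, mu=0.1):
    """S103: move coeff so that coeff * sensor_conf approaches the fused
    confidence of the previous time step."""
    error = state.coeff * sensor_conf - prev_fused
    state.coeff -= mu * sensor_conf * error

lidar = SensorState(initial_coeff=1.0)
fused = None
for sensor_conf in [0.9, 0.5, 0.7]:      # detections at times T0, T1, T2
    prev = fused
    fused = fuse(prev, sensor_conf, lidar.coeff)   # S101 + S102
    if prev is not None:                 # no coefficient update at the initial time
        update_coeff(lidar, sensor_conf, prev)     # S103
```

Because the T1 detection (0.5) falls below the previous fused confidence, the coefficient is adjusted up, matching the up/down rule stated in the claims.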
The sensor sensing system is composed of multiple sensors mounted on the vehicle body, including a laser radar, a millimeter wave radar, a camera and the like. During target detection, the same target can be detected by different sensors at different times; the multi-sensor fusion target tracking system fuses the detection results of the same target from the different sensors and finally outputs the target result.
As an example, for the same target, the laser radar detects first target information at a first time and the camera detects second target information at a second time; the multi-sensor fusion target tracking system fuses the first and second target information to obtain fusion target information. If the millimeter wave radar then detects third target information at a third time, the system fuses the third target information with the previous fusion target information and updates it. Detected target information is continuously fused in this way and the fusion target information continuously updated.
The sensor target confidence coefficient is the confidence of the target information detected by the current sensor, and the fusion target confidence coefficient is the confidence of the fused target information. The fusion target confidence coefficient is influenced by the sensor confidence coefficient at each time: each time target fusion is performed, the fusion target confidence coefficient is updated according to the sensor target confidence coefficient of the current sensor, and the multi-sensor fusion target tracking system outputs a target result once the fusion target confidence coefficient score is greater than or equal to a preset score.
In the prior art, the sensor target confidence coefficient of each sensor is set to a fixed value. For example, when the laser radar detects the first target information, the sensor target confidence coefficient of the laser radar is used to update the fusion target confidence coefficient; even if the sensing performance of the laser radar has degraded, the fusion target confidence coefficient is still updated with the original, higher sensor target confidence coefficient, which ultimately affects the accuracy of the output target result.
In the present application, the sensor target confidence coefficient of each sensor is corrected by a dynamically adjusted correction coefficient. When a sensor detects target information at time T, the correction coefficient of the current sensor is determined according to the fusion target confidence coefficient at time T-1 and the sensor target confidence coefficient of the current sensor at time T, and this correction coefficient is used to correct the sensor target confidence coefficient when the sensor detects the same target again.
Because the fusion target information at time T-1 has already fused the target information of multiple sensors before time T-1, the fusion target confidence coefficient at time T-1 is closer to the true confidence. Determining the correction coefficient of the current sensor from the fusion target confidence coefficient at time T-1 and using it to correct the sensor target confidence coefficient can therefore improve the accuracy of the sensor target confidence coefficient.
Specifically, in step S101, when any sensor of the sensor sensing system detects a target at time T, the correction coefficient of the current sensor (updated when the current sensor last detected the target) and the sensor target confidence coefficient of the current sensor at time T are acquired.
In step S102, the correction coefficient of the current sensor is multiplied by the sensor target confidence coefficient of the current sensor at time T to correct it; a target tracking algorithm then obtains the fusion target confidence coefficient at time T from the corrected sensor target confidence coefficient and the target information detected at time T, thereby updating the fusion target confidence coefficient.
The target tracking algorithm may be a Kalman filter tracking algorithm, a particle filter tracking algorithm, or a multi-hypothesis tracking algorithm.
Step S103 updates the correction coefficient of the current sensor: the fusion target confidence coefficient at time T-1 is compared with the corrected sensor target confidence coefficient of the current sensor at time T, and the correction coefficient of the current sensor is corrected accordingly. Specifically:
if the product of the sensor target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor is smaller than the fusion target confidence coefficient at the moment T-1, the correction coefficient of the current sensor is adjusted up;
and if the product of the sensor target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor is greater than the fusion target confidence coefficient at the moment T-1, adjusting the correction coefficient of the current sensor downwards.
And taking the confidence coefficient of the fusion target at the moment T-1 as a reference, comparing the product of the confidence coefficient of the sensor target of the current sensor at the moment T and the correction coefficient of the current sensor with the confidence coefficient of the fusion target at the moment T-1, reducing the difference between the product of the confidence coefficient of the sensor target of the current sensor at the moment T and the correction coefficient of the current sensor at the moment T and the confidence coefficient of the fusion target at the moment T-1 by adjusting the magnitude of the correction coefficient of the current sensor, and adjusting the product of the confidence coefficient of the sensor target of the current sensor at the moment T and the correction coefficient of the current sensor towards the confidence coefficient of the fusion target at the moment T-1.
Step S103, dynamically correcting the correction coefficient of the current sensor by comparing the product of the sensor target confidence coefficient of the current sensor at the time T and the correction coefficient of the current sensor with the fusion target confidence coefficient at the time T-1, wherein the corrected correction coefficient of the current sensor is used for correcting the sensor target confidence coefficient when the current sensor detects the same target next time, so that the reliability of the sensor target confidence coefficient of the current sensor is improved.
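A minimal sketch of this fixed-step up/down rule follows. The function name and the constant step size are illustrative assumptions (the patent's other embodiment uses a gradient-based step instead of a constant one):

```python
def adjust_correction_coeff(coeff, sensor_conf, fused_conf_prev, step=0.01):
    """Nudge the correction coefficient so that coeff * sensor_conf
    moves toward the fusion target confidence of time T-1."""
    product = coeff * sensor_conf
    if product < fused_conf_prev:
        return coeff + step   # sensor reads low relative to fusion: adjust up
    if product > fused_conf_prev:
        return coeff - step   # sensor reads high relative to fusion: adjust down
    return coeff              # already consistent: leave unchanged
```

For example, `adjust_correction_coeff(1.0, 0.5, 0.8)` raises the coefficient because the corrected confidence 0.5 is below the previous fused confidence 0.8.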
Further, when time T is the initial time:
in response to the sensor sensing system detecting a target at time T, acquiring a correction coefficient of the current sensor and a sensor target confidence coefficient of the current sensor at time T, wherein the correction coefficient of the current sensor is a preset initial value;
and based on a target tracking algorithm, obtaining a fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor.
When time T is the initial time at which the target is detected, the correction coefficient of the current sensor is a preset initial value, and the fusion target confidence coefficient at the initial time is determined directly from the target detected at the initial time, the preset initial value of the correction coefficient of the current sensor, and the sensor target confidence coefficient at the initial time.
Because no fusion target confidence coefficient from a previous time exists at the initial time, the correction coefficient of the current sensor is not corrected at this time and remains at its preset initial value.
The correction coefficient of the current sensor is then updated the next time the current sensor detects the same target.
It should be noted that each sensor in the sensor sensing system has its own correction coefficient, and when a sensor detects the same target for the first time, the correction coefficient used is that sensor's preset initial value. Different preset initial values may be set for different sensors.
As an example:
At time t0 (the initial time), the laser radar detects first target information of the current target; the sensor target confidence b0 of the laser radar at time t0 and the correction coefficient α0 of the laser radar are acquired (α0 is a preset initial value);
based on the target tracking algorithm, the fusion target confidence a0 at time t0 is determined from the first target information, the correction coefficient α0 of the laser radar and the sensor target confidence b0 of the laser radar;
At time t1, the camera detects second target information of the current target; the sensor target confidence c0 of the camera at time t1 and the correction coefficient β0 of the camera are acquired (β0 is a preset initial value);
based on the target tracking algorithm, the fusion target confidence a1 at time t1 is determined from the second target information, the correction coefficient β0 of the camera and the sensor target confidence c0 of the camera;
the correction coefficient of the camera is updated to β1 according to the sensor target confidence c0 of the camera at time t1 and the fusion target confidence a0 at time t0;
At time t2, the camera detects third target information of the current target; the sensor target confidence c1 of the camera at time t2 and the correction coefficient β1 of the camera are acquired;
based on the target tracking algorithm, the fusion target confidence a2 at time t2 is determined from the third target information, the correction coefficient β1 of the camera and the sensor target confidence c1 of the camera;
the correction coefficient of the camera is updated to β2 according to the sensor target confidence c1 of the camera at time t2 and the fusion target confidence a1 at time t1;
At time t3, the laser radar detects fourth target information of the current target; the sensor target confidence b1 of the laser radar at time t3 and the correction coefficient α0 of the laser radar are acquired;
based on the target tracking algorithm, the fusion target confidence a3 at time t3 is determined from the fourth target information, the correction coefficient α0 of the laser radar and the sensor target confidence b1 of the laser radar;
the correction coefficient of the laser radar is updated to α1 according to the sensor target confidence b1 of the laser radar at time t3 and the fusion target confidence a2 at time t2; and so on.
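The interleaved timeline above can be simulated with one correction coefficient per sensor. This sketch uses assumed confidence values, an averaging stand-in for fusion, and a proportional update rule (all assumptions, not the patent's formulas); it only illustrates that each sensor's coefficient is updated only when that sensor detects the target.

```python
coeffs = {"lidar": 1.0, "camera": 1.0}   # preset initial values (alpha0, beta0)
fused = None                             # fusion target confidence
mu = 0.1                                 # assumed proportionality coefficient

# (sensor, sensor target confidence) at times t0..t3, as in the example
detections = [("lidar", 0.9), ("camera", 0.6), ("camera", 0.65), ("lidar", 0.8)]
for sensor, conf in detections:
    corrected = coeffs[sensor] * conf    # apply this sensor's own coefficient
    prev = fused
    fused = corrected if prev is None else 0.5 * (prev + corrected)
    if prev is not None:                 # no coefficient update at the initial time t0
        # move this sensor's coefficient toward the previous fused confidence
        coeffs[sensor] -= mu * conf * (coeffs[sensor] * conf - prev)
```

After the loop, the camera's coefficient has been updated twice (β1 then β2) and the laser radar's once (α1); at t3 the laser radar still used its initial α0, exactly as in the example.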
In the embodiment of the present application, each sensor has a dynamic correction coefficient. After a sensor detects a target, its correction coefficient is corrected according to the fusion target confidence coefficient of the previous time; the next time the sensor detects the same target, this correction coefficient is used to correct the current sensor target confidence coefficient, and the fusion target confidence coefficient is determined from the corrected sensor target confidence coefficient. This eliminates the interference of detection results from degraded sensors on the finally output detection target and improves target recognition and target tracking accuracy.
In one embodiment, updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at time T and the fusion target confidence coefficient at time T-1 specifically comprises:
constructing an error function from the product of the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor, and the fusion target confidence coefficient at time T-1;
constructing an objective function with respect to the correction coefficient of the current sensor according to the error function;
and updating the correction coefficient of the current sensor according to the objective function.
In this embodiment, an error function is first constructed from the product of the sensor target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor, together with the fusion target confidence coefficient at time T-1; an objective function with respect to the correction coefficient of the current sensor is then constructed from the error function. Based on the objective function, the correction coefficient that minimizes the error between the corrected sensor target confidence coefficient and the fusion target confidence coefficient at time T-1 can be calculated.
In particular, determining the correction coefficient from the objective function comprises:
acquiring the current correction coefficient θ(k), where k is an integer greater than or equal to 0, and θ(0) is an initialized value when k = 0;
determining a correction coefficient change gradient Δθ = ∇θ Jfunc(θ) from the objective function, where ∇θ Jfunc(θ) is the partial derivative of the objective function Jfunc(θ) with respect to θ, and Δθ = 0 when k = 0;
and correcting the correction coefficient as θ(k+1) = θ(k) - μ·Δθ according to the correction coefficient change gradient, where μ is a proportionality coefficient.
Adopting the idea of gradient descent, the partial derivative of the objective function with respect to the correction coefficient θ determines the change gradient Δθ; this gradient is multiplied by the proportionality coefficient μ to give the change applied to the correction coefficient, and μ is set according to the specific requirements of the sensor sensing system.
By correcting the sensor target confidence coefficient with this gradient-descent update, a more appropriate correction coefficient can be obtained from the error between the fusion target confidence coefficient at time T-1 and the sensor target confidence coefficient at time T, improving the accuracy of the sensor target confidence coefficient of the current sensor.
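As a concrete illustration, assume the error function is e(θ) = θ·s - a and the objective is Jfunc(θ) = ½·e(θ)², where s is the sensor target confidence coefficient at time T and a is the fusion target confidence coefficient at time T-1 (the patent does not fix these exact functional forms). Then ∇θ Jfunc(θ) = s·(θ·s - a), and repeated updates drive θ·s toward a:

```python
def gradient_update(theta, s, a, mu=0.05):
    """One step of theta(k+1) = theta(k) - mu * grad, under the assumed
    objective Jfunc(theta) = 0.5 * (theta * s - a) ** 2."""
    grad = s * (theta * s - a)   # partial derivative of Jfunc w.r.t. theta
    return theta - mu * grad

theta = 1.0                      # preset initial value theta(0)
for _ in range(1000):            # repeated detections of the same target
    theta = gradient_update(theta, s=0.5, a=0.4)
# theta * s converges toward a, i.e. theta approaches a / s = 0.8
```

The sign behavior matches the earlier up/down rule: when θ·s < a the gradient is negative and θ increases; when θ·s > a, θ decreases.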
In one embodiment, obtaining the fusion target confidence coefficient at time T according to the target information detected at time T, the sensor target confidence coefficient of the current sensor at time T, and the correction coefficient of the current sensor specifically comprises:
determining a sensor false alarm rate and a matching probability according to the product of the target confidence coefficient of the current sensor at time T and the correction coefficient of the current sensor;
determining a fusion target confidence coefficient change value according to the sensor false alarm rate and the matching probability;
and determining the fusion target confidence coefficient at time T according to the fusion target confidence coefficient change value.
In the multi-sensor fusion target tracking system, whether a target exists is determined according to the score of the fusion target confidence coefficient. Each time the sensor sensing system detects the target, the fusion target confidence coefficient change value is added on top of the fusion target confidence coefficient of the previous time, until the score of the fusion target confidence coefficient reaches a set value.
Specifically, a sensor false alarm rate P(DK|H0) and a matching probability P(DK|H1) can be determined from the product of the sensor target confidence of the current sensor at time T and the correction coefficient of the current sensor;
calculating the ratio of the matching probability to the sensor false alarm rate yields the sensor likelihood ratio
LR = P(DK|H1) / P(DK|H0);
taking the logarithm of the sensor likelihood ratio LR yields the fusion target confidence change value
ΔLLR = ln(P(DK|H1)_k / P(DK|H0)_k);
and the fusion target confidence at time T is then obtained as
LLR(k) = LLR(k-1) + ΔLLR.
According to this embodiment of the application, the fusion target confidence at time T is determined from the product of the sensor target confidence of the current sensor at time T and the correction coefficient of the current sensor. The higher the sensor target confidence, the larger the fusion target confidence change value, and the faster a target is confirmed from the fusion target confidence. As a result, the detection results of a better-performing sensor carry more weight in the fusion target confidence, and those of a worse-performing sensor carry less, which keeps the whole perception fusion system stable and reliable.
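The update equations above can be sketched minimally as follows (the function name is assumed, and P(DK|H1) and P(DK|H0) are passed in directly, since the text does not specify exactly how they are derived from the corrected confidence):

```python
import math

def fused_confidence_update(llr_prev, p_match, p_false_alarm):
    """LLR(k) = LLR(k-1) + ln(P(DK|H1) / P(DK|H0))."""
    lr = p_match / p_false_alarm      # sensor likelihood ratio LR
    delta_llr = math.log(lr)          # fusion target confidence change value
    return llr_prev + delta_llr       # fused confidence at time T
```

Each detection with p_match greater than p_false_alarm raises the fused score, so a sensor with a higher corrected confidence confirms a target in fewer updates.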
FIG. 2 is a flow chart of a confidence level obtaining method for a multi-sensor fusion target tracking system in a preferred embodiment of the present application, which specifically includes
Step S201: responding to a target detected by a sensor sensing system at an initial moment, and acquiring a correction coefficient of a current sensor and a sensor target confidence coefficient of the current sensor at the initial moment, wherein the correction coefficient of the current sensor is a preset initial value;
step S202: determining a false alarm rate and a matching probability of the sensor according to the product of the sensor target confidence coefficient of the current sensor at the initial moment and the correction coefficient of the current sensor;
step S203: determining a fusion target confidence coefficient change value according to the sensor false alarm rate and the matching probability;
step S204: determining the confidence coefficient of the fusion target at the initial moment according to the confidence coefficient change value of the fusion target;
step S205: responding to a target detected by a sensor sensing system at the moment T, and acquiring a correction coefficient of a current sensor and a sensor target confidence coefficient of the current sensor at the moment T;
step S206: determining the false alarm rate and the matching probability of the sensor according to the product of the target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor;
step S207: determining a fusion target confidence coefficient change value according to the sensor false alarm rate and the matching probability;
step S208: determining the confidence coefficient of the fusion target at the time T according to the confidence coefficient change value of the fusion target;
step S209: constructing an error function according to the product of the sensor target confidence coefficient of the current sensor at the time T and the correction coefficient of the current sensor and the fusion target confidence coefficient at the time T-1;
step S210: constructing an objective function relative to a correction coefficient of the current sensor according to the error function;
step S211: acquiring a correction coefficient of a current sensor;
step S212: determining a change gradient of a correction coefficient according to the target function;
step S213: and updating the correction coefficient of the current sensor according to the change gradient of the correction coefficient, and returning to the step S205.
The technical solution of the present application further provides a storage medium storing computer instructions which, when executed by a computer, perform the confidence acquisition method of the multi-sensor fusion target tracking system in any of the foregoing embodiments.
Fig. 3 shows an electronic device of the present application, comprising:
at least one processor 301; and
a memory 302 communicatively coupled to the at least one processor 301; wherein
the memory 302 stores instructions executable by the at least one processor 301 to enable the at least one processor 301 to perform all the steps of the multi-sensor fusion target tracking system confidence acquisition method in any of the above method embodiments.
The electronic device is preferably a vehicle-mounted electronic control unit (ECU), and more specifically a microcontroller unit (MCU) within the vehicle-mounted ECU.
In fig. 3, one processor 301 is taken as an example:
the electronic device may further include: an input device 303 and an output device 304.
The processor 301, the memory 302, the input device 303, and the output device 304 may be connected by a bus or in another manner; connection by a bus is taken as the example in fig. 3.
The memory 302 is a non-volatile computer-readable storage medium, and can be used for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the confidence obtaining method of the multi-sensor fusion target tracking system in the embodiment of the present application, for example, the method flow shown in fig. 1 or 2. The processor 301 executes various functional applications and data processing by running nonvolatile software programs, instructions and modules stored in the memory 302, so as to implement the multi-sensor fusion target tracking system confidence obtaining method in the above embodiments.
The memory 302 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created from use of a multi-sensor fusion target tracking system confidence acquisition method, and the like. Further, the memory 302 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory 302 optionally includes memory remotely located from the processor 301, and such remote memory may be connected over a network to a device that performs the multi-sensor fusion target tracking system confidence acquisition method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 303 may receive input of a user click and generate signal inputs related to user settings and functional controls of the multi-sensor fusion target tracking system confidence acquisition method. The output means 304 may comprise a display device such as a display screen.
The one or more modules are stored in the memory 302 and, when executed by the one or more processors 301, perform the multi-sensor fusion target tracking system confidence acquisition method of any of the method embodiments described above.
What has been described above is merely the principles and preferred embodiments of the present application. It should be noted that, for those skilled in the art, embodiments obtained by appropriately combining the technical solutions disclosed in the different embodiments also fall within the technical scope of the present application, and several other modifications made on the basis of the principle of the present application should likewise be regarded as falling within its protective scope.

Claims (9)

1. A confidence coefficient acquisition method for a multi-sensor fusion target tracking system is characterized by comprising the following steps
Responding to a target detected by a sensor sensing system at the moment T, and acquiring a correction coefficient of a current sensor and a sensor target confidence coefficient of the current sensor at the moment T;
based on a target tracking algorithm, obtaining a fusion target confidence coefficient at the T moment according to target information detected at the T moment, a sensor target confidence coefficient of a current sensor at the T moment and a correction coefficient of the current sensor;
and updating the correction coefficient of the current sensor according to the sensor target confidence coefficient of the current sensor at the time T and the fusion target confidence coefficient at the time T-1.
2. The confidence level obtaining method for a multi-sensor fusion target tracking system according to claim 1, wherein when the time T is an initial time, then:
responding to a target detected by a sensor sensing system at the moment T, and acquiring a correction coefficient of a current sensor and a sensor target confidence coefficient of the current sensor at the moment T, wherein the correction coefficient of the current sensor is a preset initial value;
and based on a target tracking algorithm, obtaining a fusion target confidence coefficient at the T moment according to the target information detected at the T moment, the sensor target confidence coefficient of the current sensor at the T moment and the correction coefficient of the current sensor.
3. The confidence level obtaining method of the multi-sensor fusion target tracking system according to claim 1, wherein the updating of the correction coefficient of the current sensor according to the sensor target confidence level of the current sensor at time T and the fusion target confidence level at time T-1 specifically includes
If the product of the sensor target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor is smaller than the fusion target confidence coefficient at the moment T-1, the correction coefficient of the current sensor is adjusted up;
and if the product of the sensor target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor is greater than the fusion target confidence coefficient at the moment T-1, adjusting the correction coefficient of the current sensor downwards.
4. The confidence level obtaining method of the multi-sensor fusion target tracking system according to claim 1, wherein the updating of the correction coefficient of the current sensor according to the sensor target confidence level of the current sensor at time T and the fusion target confidence level at time T-1 specifically includes
Constructing an error function according to the product of the sensor target confidence coefficient of the current sensor at the time T and the correction coefficient of the current sensor and the fusion target confidence coefficient at the time T-1;
constructing an objective function relative to a correction coefficient of the current sensor according to the error function;
and updating the correction coefficient of the current sensor according to the objective function.
5. The confidence level obtaining method of the multi-sensor fusion target tracking system according to claim 4, wherein the updating of the correction coefficient of the current sensor according to the objective function specifically includes
Acquiring a correction coefficient θ(k) of the current sensor, wherein k is an integer greater than or equal to 0, and when k is 0, θ(0) is a preset initial value;
determining a correction coefficient change gradient Δθ = ∇θJfunc(θ) according to the objective function, wherein ∇θJfunc(θ) is the partial derivative of the objective function Jfunc(θ) with respect to θ, and Δθ is 0 when k is 0;
and updating the correction coefficient of the current sensor according to the correction coefficient change gradient as θ(k+1) = θ(k) - μ·Δθ, wherein μ is a proportionality coefficient.
6. The confidence obtaining method of the multi-sensor fusion target tracking system according to claim 1, wherein the obtaining of the fusion target confidence at time T according to the target information detected at time T, the sensor target confidence of the current sensor at time T, and the correction coefficient of the current sensor specifically includes
Determining the false alarm rate and the matching probability of the sensor according to the product of the target confidence coefficient of the current sensor at the moment T and the correction coefficient of the current sensor;
determining a fusion target confidence coefficient change value according to the sensor false alarm rate and the matching probability;
and determining the confidence coefficient of the fusion target at the time T according to the confidence coefficient change value of the fusion target.
7. The confidence level obtaining method of the multi-sensor fusion target tracking system according to claim 1, wherein the target tracking algorithm is a kalman filter tracking algorithm, a particle filter tracking algorithm, or a multi-hypothesis tracking algorithm.
8. A storage medium storing computer instructions for performing the multi-sensor fusion target tracking system confidence acquisition method of any one of claims 1-7 when executed by a computer.
9. An electronic device comprising at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the multi-sensor fusion target tracking system confidence acquisition method of any of claims 1-7.
CN202111520563.XA 2021-12-13 2021-12-13 Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment Pending CN114139651A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111520563.XA CN114139651A (en) 2021-12-13 2021-12-13 Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111520563.XA CN114139651A (en) 2021-12-13 2021-12-13 Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN114139651A true CN114139651A (en) 2022-03-04

Family

ID=80381994

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111520563.XA Pending CN114139651A (en) 2021-12-13 2021-12-13 Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN114139651A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113533962A (en) * 2021-07-29 2021-10-22 上海交通大学 Induction motor health diagnosis system based on decision fusion of multiple physical signal sensors
CN113533962B (en) * 2021-07-29 2022-08-12 上海交通大学 Induction motor health diagnosis system based on decision fusion of multiple physical signal sensors

Similar Documents

Publication Publication Date Title
KR20190042247A (en) Device and method to estimate position
JP5419784B2 (en) Prediction device, prediction system, computer program, and prediction method
US9794543B2 (en) Information processing apparatus, image capturing apparatus, control system applicable to moveable apparatus, information processing method, and storage medium of program of method
CN108225339A (en) For estimating the device and method of vehicle location
CN109684944B (en) Obstacle detection method, obstacle detection device, computer device, and storage medium
CN110865398A (en) Processing method and processing device for positioning data, terminal equipment and storage medium
CN114139651A (en) Confidence coefficient acquisition method of multi-sensor fusion target tracking system, storage medium and electronic equipment
CN111383246B (en) Scroll detection method, device and equipment
CN110637209B (en) Method, apparatus and computer readable storage medium having instructions for estimating a pose of a motor vehicle
CN116645396A (en) Track determination method, track determination device, computer-readable storage medium and electronic device
CN113916565B (en) Steering wheel zero deflection angle estimation method and device, vehicle and storage medium
Jurado et al. Towards an online sensor model validation and estimation framework
CN116168090B (en) Equipment parameter calibration method and device
CN110832274A (en) Ground slope calculation method, device, equipment and storage medium
WO2019007536A1 (en) A method for identifying a cause of blockage in a sequence of images, a computer program for performing said method, a computer-readable recording medium containing such computer program, a driving assistance system capable of executing said method
CN111639662A (en) Remote sensing image bidirectional matching method and device, electronic equipment and storage medium
Mobus et al. Multi-target multi-object radar tracking
CN113959433B (en) Combined navigation method and device
CN115761693A (en) Method for detecting vehicle location mark points and tracking and positioning vehicles based on panoramic image
KR102075831B1 (en) Method and Apparatus for Object Matching between V2V and Radar Sensor
CN112800864A (en) Target tracking method and device, electronic equipment and storage medium
CN114279458A (en) Lane uncertainty modeling and tracking in a vehicle
CN113325415A (en) Fusion method and system for vehicle radar data and camera data
CN107548033B (en) Positioning device and method and electronic equipment
KR101619672B1 (en) Method for Tracking Pedestrians based on Binarization and Apparatus thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination