CN117970318A - Target fusion method, electronic device and storage medium

Publication number: CN117970318A
Application number: CN202410390054.7A
Authority: CN (China)
Legal status: Pending
Inventors: 徐显杰, 张扬, 林进贵
Applicant/Assignee: Suoto Hangzhou Automotive Intelligent Equipment Co Ltd; Tianjin Soterea Automotive Technology Co Ltd
Classification: Radar Systems Or Details Thereof
Abstract

The invention provides a target fusion method, an electronic device and a storage medium, belonging to the field of intelligent driving. The method comprises the following steps: acquiring information of each camera target detected by a camera device and information of each radar target detected by a radar; calculating a comprehensive loss between each camera target and each radar target according to the information of the camera targets and the radar targets, and determining an initial loss matrix between the camera device and the radar from the comprehensive losses; determining, for each camera target, the number of radar targets that satisfy a preset condition, and correcting the initial loss matrix according to that number to obtain a corrected loss matrix; and performing target fusion according to the corrected loss matrix. The invention alleviates the problem that target fusion fails, or targets are output too late, because the camera device and the radar measure speed differently, which would otherwise affect the driving of the vehicle.

Description

Target fusion method, electronic device and storage medium
Technical Field
The invention relates to the technical field of intelligent driving, and in particular to a target fusion method, an electronic device and a storage medium.
Background
When a vehicle tracks targets, it typically detects them with two types of sensors, a camera device and a radar, and fuses the two sensors' detection results. During tracking, however, cut-in targets frequently appear. Because a cut-in target is only partially visible and the camera device's speed measurement is unstable, the speed the camera device reports for a target often differs too much from the speed the radar reports for the same target. Fusion then fails, or the fused target is output too late, affecting the driving of the vehicle.
Disclosure of Invention
Embodiments of the invention provide a target fusion method, an electronic device and a storage medium to solve the problem that target fusion fails, or targets are output too late, when the speed of a target detected by the camera device differs too much from the speed of the same target detected by the radar, thereby affecting the driving of the vehicle.
In a first aspect, an embodiment of the present invention provides a target fusion method, including:
acquiring information of each camera target detected by a camera device and information of each radar target detected by a radar;
calculating a comprehensive loss between each camera target and each radar target according to the information of the camera targets and the radar targets, and determining an initial loss matrix between the camera device and the radar from the comprehensive losses;
determining, for each camera target, the number of radar targets that satisfy a preset condition, and correcting the initial loss matrix according to that number to obtain a corrected loss matrix;
and performing target fusion according to the corrected loss matrix.
In one possible implementation, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
calculating the comprehensive loss between each camera target and each radar target according to the information of the camera targets and the radar targets, and determining the initial loss matrix between the camera device and the radar from the comprehensive losses, comprises:
for each camera target and each radar target, calculating a first difference between the speed of the camera target and the speed of the radar target and a second difference between the distance of the camera target and the distance of the radar target, calculating the comprehensive loss between the camera target and the radar target from the first difference and the second difference, and, when the comprehensive loss is less than or equal to a comprehensive loss threshold, setting the loss value corresponding to the camera target and the radar target in the initial loss matrix to that comprehensive loss.
In one possible implementation, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
the preset condition is that the comprehensive loss between the camera target and the radar target is greater than the comprehensive loss threshold, the distance loss between the camera target and the radar target is less than or equal to a first distance threshold, and the speed loss between the camera target and the radar target is greater than a first speed threshold.
In one possible implementation, correcting the initial loss matrix according to the number of radar targets satisfying the preset condition for each camera target, to obtain the corrected loss matrix, comprises:
for each camera target, if exactly one radar target corresponding to the camera target satisfies the preset condition, correcting the loss value of that camera target and that radar target in the initial loss matrix to the comprehensive loss between them, thereby obtaining the corrected loss matrix.
In one possible implementation, after the corrected loss matrix is obtained, the target fusion method further comprises:
determining the distance loss and the speed loss between a first radar target and each other radar target, and determining the split radar targets corresponding to a first camera target from these distance and speed losses; the first camera target is a camera target whose type is a preset large vehicle, and the first radar target is the radar target associated with the first camera target;
merging the split radar targets corresponding to the first camera target, and updating the corrected loss matrix to obtain an updated loss matrix;
correspondingly, performing target fusion according to the corrected loss matrix comprises:
performing target fusion according to the updated loss matrix.
In one possible implementation, determining the split radar targets corresponding to the first camera target from the distance loss and the speed loss between the first radar target and each other radar target comprises:
taking each radar target whose distance loss is less than or equal to a second distance threshold and whose speed loss is less than or equal to a second speed threshold as a candidate split radar target of the first camera target;
taking the first radar target itself as a candidate split radar target of the first camera target;
and, for each candidate split radar target of the first camera target, if the camera target associated with the candidate is unique, determining the candidate as a split radar target corresponding to the first camera target.
In one possible implementation, a camera target and a radar target are determined to be associated when their loss value in the corrected loss matrix is less than or equal to a preset loss threshold; or
when the comprehensive loss between the camera target and the radar target is less than the comprehensive loss threshold; or
when the distance loss between the camera target and the radar target is less than the first distance threshold and the speed loss between them is less than the first speed threshold.
In one possible implementation, merging the split radar targets corresponding to the first camera target comprises:
taking, among the split radar targets corresponding to the first camera target, the split radar target with the smallest distance as the merged radar target.
In a second aspect, an embodiment of the present invention provides a target fusion device, comprising:
an acquisition module, for acquiring information of each camera target detected by the camera device and information of each radar target detected by the radar;
an initial loss matrix determining module, for calculating the comprehensive loss between each camera target and each radar target according to the information of the camera targets and the radar targets, and determining the initial loss matrix between the camera device and the radar from the comprehensive losses;
a loss matrix correction module, for determining the number of radar targets satisfying the preset condition for each camera target, and correcting the initial loss matrix according to that number to obtain the corrected loss matrix;
and a target fusion module, for performing target fusion according to the corrected loss matrix.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor and a memory, where the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program stored in the memory, to perform the target fusion method according to the first aspect or any possible implementation manner of the first aspect.
In a fourth aspect, embodiments of the present invention provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the target fusion method as described above in the first aspect or any one of the possible implementations of the first aspect.
Embodiments of the invention provide a target fusion method, an electronic device and a storage medium. The method calculates the comprehensive loss between each camera target and each radar target from the information of the camera targets and the radar targets, and determines an initial loss matrix between the camera device and the radar from the comprehensive losses; it determines, for each camera target, the number of radar targets that satisfy a preset condition, and corrects the initial loss matrix according to that number to obtain a corrected loss matrix; and it performs target fusion according to the corrected loss matrix. By correcting the initial loss matrix according to the number of radar targets satisfying the preset condition for each camera target, the method alleviates the problem that target fusion fails, or targets are output too late, because the camera device and the radar measure speed differently, which would otherwise affect the driving of the vehicle.
Drawings
In order to describe the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of a target fusion method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a target fusion device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth such as the particular system architecture, techniques, etc., in order to provide a thorough understanding of the embodiments of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the following description will be made by way of specific embodiments with reference to the accompanying drawings.
Referring to FIG. 1, a flowchart of an implementation of a target fusion method according to an embodiment of the present invention is shown. The execution subject of the target fusion method may be an electronic device, for example an electronic device in a vehicle.
Referring to fig. 1, the above target fusion method includes:
In S101, information of each camera target detected by the camera device and information of each radar target detected by the radar are acquired.
The camera device may be a forward-looking camera device of the vehicle and the radar a forward-looking radar of the vehicle, both used to detect targets ahead of the vehicle for tracking. The camera device may be a camera; the radar may be a millimeter-wave radar or the like.
The camera device and the radar face the same direction of the vehicle. In some possible implementations, both may instead be mounted at the rear of the vehicle to detect targets behind it, on the left side to detect targets on the left, on the right side to detect targets on the right, and so on.
This embodiment calls each target detected by the camera device a camera target, and each target detected by the radar a radar target. Using the related art, the camera device and the radar can each detect the speed of every target, which may be a speed relative to the ego vehicle, and the distance of every target, which may be a distance relative to the ego vehicle.
In some possible implementations, the camera device and the radar may be replaced by other sensors capable of target detection; this is not specifically limited here.
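To make the later steps concrete, the sketches below use a minimal Python representation of the per-target information described above; the class and field names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CameraTarget:
    target_id: int
    speed: float      # relative to the ego vehicle, m/s
    distance: float   # relative to the ego vehicle, m
    obj_type: str     # e.g. "car", "truck", "bus"; used later for split-target handling

@dataclass
class RadarTarget:
    target_id: int
    speed: float      # relative to the ego vehicle, m/s
    distance: float   # relative to the ego vehicle, m
```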
In S102, a comprehensive loss between each camera target and each radar target is calculated from the information of the camera targets and the radar targets, and an initial loss matrix between the camera device and the radar is determined from the comprehensive losses.
The comprehensive loss between a camera target and a radar target reflects the overall difference between their information, in particular the combined difference in speed and in distance.
This embodiment may first generate and initialize a loss matrix in which every element is set to a first preset loss value. The first preset loss value is a large value, far larger than the overall difference between what the camera device and the radar detect for the same target under normal conditions; it may be 1000, 2000, and so on. The camera targets and radar targets in the loss matrix may be ordered arbitrarily, by distance, by speed, by order of detection, or the like.
After initialization, the loss matrix can be updated with the calculated comprehensive loss between each camera target and each radar target, yielding the initial loss matrix between the camera device and the radar.
A comprehensive loss can be calculated for every camera target and every radar target. For example, with M camera targets and N radar targets, M×N comprehensive losses are calculated. The radar target list may be traversed with the camera device as the main sensor, i.e. with camera targets as the main targets; alternatively, the camera target list may be traversed with the radar as the main sensor, i.e. with radar targets as the main targets.
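A minimal sketch of S102, assuming the CameraTarget/RadarTarget classes above and a comprehensive_loss function and distance-dependent threshold as detailed in the implementation further below, might look like:

```python
import numpy as np

INIT_LOSS = 1000.0  # first preset loss value (one of the example values above)

def build_initial_loss_matrix(cam_targets, radar_targets,
                              comprehensive_loss, loss_threshold):
    """Camera device as main sensor: for every camera target, traverse the
    radar target list and keep only losses at or below the threshold."""
    loss = np.full((len(cam_targets), len(radar_targets)), INIT_LOSS)
    for i, cam in enumerate(cam_targets):
        for j, rad in enumerate(radar_targets):
            l = comprehensive_loss(cam, rad)
            if l <= loss_threshold(rad.distance):
                loss[i, j] = l
    return loss
```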
In S103, the number of radar targets satisfying the preset condition is determined for each camera target, and the initial loss matrix is corrected according to that number to obtain a corrected loss matrix.
A radar target satisfying the preset condition for a camera target is one whose distance loss to the camera target satisfies the requirement (it is small enough) while its speed loss does not (it is too large).
By counting, for each camera target, the radar targets that satisfy the preset condition, this embodiment finds the radar targets that could not be fused with the camera target as the same target only because the speed detected by the camera device differs too much from the speed detected by the radar. Correcting the initial loss matrix according to this count yields the corrected loss matrix.
In S104, target fusion is performed according to the corrected loss matrix.
In this embodiment, the related art may be used to perform target fusion according to the corrected loss matrix, fusing the information belonging to the same target. For example, nearest neighbor, global nearest neighbor and similar methods may be used.
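The patent leaves the assignment step to the related art. One concrete realization of global nearest neighbor is the Hungarian algorithm; a sketch using SciPy, in which pairs still at the initial loss value are discarded rather than fused, might look like:

```python
from scipy.optimize import linear_sum_assignment

def fuse_targets(loss_matrix, invalid_loss=1000.0):
    """Global nearest neighbor: choose the pairing that minimizes the total
    loss, then drop pairs whose loss is still the initial (invalid) value."""
    rows, cols = linear_sum_assignment(loss_matrix)
    return [(i, j) for i, j in zip(rows, cols)
            if loss_matrix[i, j] < invalid_loss]
```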
According to this embodiment, the comprehensive loss between each camera target and each radar target is calculated from the information of the camera targets and the radar targets, and the initial loss matrix between the camera device and the radar is determined from the comprehensive losses; the number of radar targets satisfying the preset condition is determined for each camera target, and the initial loss matrix is corrected accordingly to obtain the corrected loss matrix; target fusion is then performed according to the corrected loss matrix. By correcting the initial loss matrix according to this count, the method alleviates the problem that target fusion fails or targets are output too late because the camera device and the radar measure speed differently, which would otherwise affect the driving of the vehicle, and thereby improves the timeliness of target recognition.
The target fusion method of this embodiment can be applied in the perception module of an automated driving system, for target tracking, track tracking and the like. Track tracking generally comprises: synchronizing the sensing data of the various sensors (camera, millimeter-wave radar and so on) in time; mapping all sensor data into the same space using a bird's-eye-view coordinate system and fusing the data; and filtering and predicting the current position of each target from its historical motion track, combining the predicted state with the historical track to update the target.
The target fusion method of this embodiment can also be applied to target detection outside automated driving, for example to give the user prompts about relevant targets.
In some embodiments, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
S102 may then comprise:
for each camera target and each radar target, calculating a first difference between the speed of the camera target and the speed of the radar target and a second difference between the distance of the camera target and the distance of the radar target, calculating the comprehensive loss between the camera target and the radar target from the first difference and the second difference, and, when the comprehensive loss is less than or equal to the comprehensive loss threshold, setting the loss value corresponding to the camera target and the radar target in the initial loss matrix to that comprehensive loss.
In this embodiment, the information of a camera target comprises its speed and its distance, and the information of a radar target comprises its speed and its distance; all speeds and distances are relative to the ego vehicle.
For each camera target and each radar target, the corresponding comprehensive loss is calculated and compared with the comprehensive loss threshold. If it is less than or equal to the threshold, the value at the corresponding position of the initial loss matrix is updated to the comprehensive loss; otherwise the value keeps its initial value (the first preset loss value).
Taking one camera target and one radar target as an example, their comprehensive loss can be expressed as a Euclidean distance: take the sum of the square of the first difference (camera speed minus radar speed) and the square of the second difference (camera distance minus radar distance), then take the arithmetic square root of that sum. The comprehensive loss may instead be expressed as an average or weighted average of the absolute value of the first difference and the absolute value of the second difference.
If the comprehensive loss between the i-th camera target and the j-th radar target is less than or equal to the comprehensive loss threshold, the corresponding loss value in the initial loss matrix (for example the element in row i, column j, where rows represent camera targets and columns represent radar targets) is updated to that comprehensive loss; otherwise it is left unchanged.
The comprehensive loss threshold can be determined from the statistical distribution of data, according to the characteristics of the camera device and the radar. Different distance segments may use different thresholds; that is, the threshold need not be fixed and may grow with the distance of the target. Its determination mainly considers the radar's ranging characteristics: at long range the radar's ranging accuracy is low and its error large, while at short range the accuracy is high and the error small. The threshold is therefore tied to the distance of the radar target in positive correlation: large radar distance, large threshold; small radar distance, small threshold. For example, the threshold may be distributed linearly over the distance segment to which the radar target's distance belongs.
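A sketch of the Euclidean form of the comprehensive loss and of a distance-dependent threshold follows; the endpoint values of the threshold are illustrative assumptions, since the patent only requires positive correlation with the radar target's distance:

```python
import math

def comprehensive_loss(cam, rad):
    """Euclidean combination of the first (speed) and second (distance) differences."""
    return math.hypot(cam.speed - rad.speed, cam.distance - rad.distance)

def loss_threshold(radar_distance):
    """Grows with radar distance, since far targets carry a larger ranging error.
    The endpoint numbers are illustrative, not from the patent."""
    near, far = 3.0, 8.0                                  # thresholds at 0 m and 150 m
    frac = min(max(radar_distance / 150.0, 0.0), 1.0)     # clamp to [0, 1]
    return near + (far - near) * frac
```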
In some embodiments, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
the preset condition is that the comprehensive loss between the camera target and the radar target is greater than the comprehensive loss threshold, the distance loss between the camera target and the radar target is less than or equal to the first distance threshold, and the speed loss between the camera target and the radar target is greater than the first speed threshold.
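The preset condition can be written as a predicate, reusing the comprehensive_loss and loss_threshold sketches above; d1 and v1 stand for the first distance and first speed thresholds:

```python
def meets_preset_condition(cam, rad, d1, v1):
    """True when the pair fails the overall test but only because of speed:
    the distance still matches (<= d1) while the speed disagrees (> v1)."""
    return (comprehensive_loss(cam, rad) > loss_threshold(rad.distance)
            and abs(cam.distance - rad.distance) <= d1
            and abs(cam.speed - rad.speed) > v1)
```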
In some embodiments, in S103, correcting the initial loss matrix according to the number of radar targets satisfying the preset condition for each camera target, to obtain the corrected loss matrix, comprises:
for each camera target, if exactly one radar target corresponding to the camera target satisfies the preset condition, correcting the loss value of that camera target and that radar target in the initial loss matrix to the comprehensive loss between them, thereby obtaining the corrected loss matrix.
Here the distance loss between a camera target and a radar target is the absolute value of the difference of their distances, and the speed loss is the absolute value of the difference of their speeds.
In the initial loss matrix, the positions corresponding to camera-radar pairs whose comprehensive loss is at most the comprehensive loss threshold, i.e. pairs whose overall difference is small, have already been updated to the comprehensive loss.
For pairs whose comprehensive loss exceeds the threshold, consider the case in which the incompleteness of cut-in targets and the instability of the camera device's speed measurement make the camera-detected speed differ too far from the radar-detected speed of the same target: count, for each camera target, the radar targets whose distance loss is at most the first distance threshold but whose speed loss exceeds the first speed threshold even though the comprehensive loss exceeds its threshold. If this count is 1, exactly one radar target around the camera target matches in distance while disagreeing in speed. Given that the camera device and the radar can indeed report very different speeds for the same target, the condition can then be relaxed: the loss value of that camera target and that radar target is corrected to their comprehensive loss, which mitigates fusion failure and late target output caused by the excessive speed difference. If the count is not 1, the initial value is kept unchanged.
For the same target, the speed detected by the camera device and the speed detected by the radar should be equal or close, and likewise the two detected distances; hence the first distance threshold and the first speed threshold are both small values, obtainable from actual requirements or from calibration. As mentioned above, the first distance threshold is tied to the radar target's distance by the radar's ranging characteristics, in positive correlation: the greater the distance of the radar target, the greater the first distance threshold. Its setting can follow that of the comprehensive loss threshold and is not repeated here.
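A sketch of this correction step, assuming the predicate above; the pair is accepted only when it is unambiguous, i.e. exactly one radar target satisfies the preset condition for the camera target:

```python
def correct_loss_matrix(loss, cam_targets, radar_targets, d1, v1):
    """Relax the speed requirement for unambiguous camera-radar pairs."""
    for i, cam in enumerate(cam_targets):
        hits = [j for j, rad in enumerate(radar_targets)
                if meets_preset_condition(cam, rad, d1, v1)]
        if len(hits) == 1:  # exactly one candidate: accept it despite the speed gap
            j = hits[0]
            loss[i, j] = comprehensive_loss(cam, radar_targets[j])
    return loss
```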
During fusion, when the target is a large vehicle such as a truck or a bus, the millimeter-wave radar clusters poorly and a single object can yield several clustered reflection points. Any one of these split radar targets may then be fused with the camera target, so the measured values at consecutive instants can jump noticeably, the tracked target track shows sawtooth artifacts, smoothness and robustness suffer, and the following stability of ACC (adaptive cruise control) is affected. To solve this problem, the embodiments of the application further provide the following target fusion method.
In some embodiments, after S103, the target fusion method may further comprise:
determining the distance loss and the speed loss between the first radar target and each other radar target, and determining the split radar targets corresponding to the first camera target from these distance and speed losses; the first camera target is a camera target whose type is a preset large vehicle, and the first radar target is the radar target associated with the first camera target;
merging the split radar targets corresponding to the first camera target, and updating the corrected loss matrix to obtain an updated loss matrix;
correspondingly, S104 may comprise:
performing target fusion according to the updated loss matrix.
In this embodiment, the camera device may recognize the type of a target using the related art; the type may be car, truck, bus, van, motorcycle, bicycle, electric bicycle, and so on. The preset large vehicle is a vehicle for which split radar targets may occur, and may include large vehicles such as trucks and buses.
In this embodiment, a camera target whose type is the preset large vehicle is called a first camera target, and the radar target associated with the first camera target is called a first radar target.
For each first camera target, the distance loss and the speed loss between its first radar target and every other radar target (that is, all radar targets in the radar target list except the first radar target) are obtained; the split radar targets corresponding to the first camera target are determined from these losses; the split radar targets are merged; and the corrected loss matrix is updated.
After this process has been executed for every first camera target, the resulting loss matrix is the updated loss matrix. Finally, target fusion is performed according to the updated loss matrix.
The distance loss between any two radar targets is the absolute value of the difference of their distances, and the speed loss between any two radar targets is the absolute value of the difference of their speeds.
A split radar target corresponding to the first camera target is a radar target detected by the radar that belongs to the same physical object as the first camera target; the number of split radar targets corresponding to a first camera target is greater than 1.
By finding and merging the split radar targets of each first camera target whose type is the preset large vehicle, updating the corrected loss matrix, and finally fusing according to the updated matrix, the camera target is fused with the real radar target, track sawtooth is avoided, and the smoothness and robustness of the target track are improved.
In some embodiments, determining the split radar targets corresponding to the first camera target from the distance loss and the speed loss between the first radar target and each other radar target comprises:
taking each radar target whose distance loss is less than or equal to the second distance threshold and whose speed loss is less than or equal to the second speed threshold as a candidate split radar target of the first camera target;
taking the first radar target itself as a candidate split radar target of the first camera target;
and, for each candidate split radar target of the first camera target, if the camera target associated with the candidate is unique, determining the candidate as a split radar target corresponding to the first camera target.
The second distance threshold may be greater than the first distance threshold, while the second speed threshold may be equal to, close to, or less than the first speed threshold. In theory, two split radar targets of the same object have the same speed but different distances; the second speed threshold should therefore be as small as possible, for example 1, while the second distance threshold is larger than the first. The second distance threshold can be set similarly to the first distance threshold, in positive correlation with the radar target's distance or its distance segment, while remaining greater than the first distance threshold.
That is, the distance threshold is relaxed and the radar target list is traversed: each radar target whose distance loss is at most the second distance threshold and whose speed loss is at most the second speed threshold becomes a candidate split radar target of the first camera target, and the first radar target associated with the first camera target is also taken as a candidate. The IDs of the candidates are recorded for subsequent use.
For each candidate split radar target of the first camera target, the camera target list is traversed to determine the camera targets associated with the candidate. If the associated camera target is unique, i.e. there is exactly one, the candidate is determined to be a split radar target corresponding to the first camera target; otherwise it is determined not to be.
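A sketch of candidate collection and the uniqueness check; associated_cams is a hypothetical helper mapping each radar-target index to the set of camera-target indices associated with it under any of the three association rules given below, and d2/v2 stand for the second thresholds:

```python
def find_split_targets(first_radar_idx, radar_targets, associated_cams, d2, v2):
    """Candidates: the first radar target itself plus every radar target within
    d2 in distance and v2 in speed of it; keep only candidates that are
    associated with exactly one camera target."""
    base = radar_targets[first_radar_idx]
    candidates = {first_radar_idx}
    for j, rad in enumerate(radar_targets):
        if (j != first_radar_idx
                and abs(base.distance - rad.distance) <= d2
                and abs(base.speed - rad.speed) <= v2):
            candidates.add(j)
    return [j for j in sorted(candidates)
            if len(associated_cams.get(j, ())) == 1]
```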
In some embodiments, a camera target and a radar target are determined to be associated when their loss value in the corrected loss matrix is less than or equal to the preset loss threshold; or
when the comprehensive loss between the camera target and the radar target is less than the comprehensive loss threshold; or
when the distance loss between the camera target and the radar target is less than the first distance threshold and the speed loss between them is less than the first speed threshold.
In this embodiment, any one of these three schemes may be used to judge whether a given camera target and a given radar target are associated.
The preset loss threshold may be greater than or equal to the comprehensive loss threshold and less than the first preset loss value; its specific value can be set according to actual requirements.
In some embodiments, merging the split radar targets corresponding to the first camera target comprises:
taking, among the split radar targets corresponding to the first camera target, the split radar target with the smallest distance as the merged radar target.
In this embodiment, the radar target with the smallest distance among the split radar targets of the first camera target serves as the radar target after merging. For example, if the first camera target has two split radar targets, one at a distance of 50 m and the other at 48 m, the one at 48 m becomes the merged radar target.
Correspondingly, updating the corrected loss matrix may mean: among the split radar targets of the first camera target, keeping the column of the merged radar target and deleting the columns of the radar targets not chosen as the merged target; or keeping the column of the merged radar target unchanged and resetting the columns of the other split radar targets to the first preset loss value, i.e. the initial value; or keeping the loss value of the merged radar target and the first camera target unchanged and resetting the loss values of the other split radar targets and the first camera target to the first preset loss value; and so on.
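A sketch of the merge together with the second matrix-update variant described above (resetting the non-merged columns to the first preset loss value):

```python
def merge_split_targets(loss, split_indices, radar_targets, init_loss=1000.0):
    """Keep the split radar target with the smallest distance as the merged
    target; the other split targets can then no longer win any pairing."""
    keep = min(split_indices, key=lambda j: radar_targets[j].distance)
    for j in split_indices:
        if j != keep:
            loss[:, j] = init_loss  # reset the whole column to the initial value
    return loss, keep
```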
By handling split radar targets in this way, the embodiments of the application merge them effectively, which improves the robustness of target fusion and the smoothness of the tracked track.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
The following are device embodiments of the invention; for details not described therein, reference may be made to the corresponding method embodiments above.
FIG. 2 is a schematic structural diagram of a target fusion device according to an embodiment of the present invention. For convenience of explanation, only the portions related to the embodiment are shown; they are described in detail below:
As shown in fig. 2, the target fusion device 30 may include: the system comprises an acquisition module 31, an initial loss matrix determination module 32, a loss matrix correction module 33 and a target fusion module 34.
an acquisition module 31, for acquiring information of each camera target detected by the camera device and information of each radar target detected by the radar;
an initial loss matrix determining module 32, for calculating the comprehensive loss between each camera target and each radar target according to the information of the camera targets and the radar targets, and determining the initial loss matrix between the camera device and the radar from the comprehensive losses;
a loss matrix correction module 33, for determining the number of radar targets satisfying the preset condition for each camera target, and correcting the initial loss matrix according to that number to obtain the corrected loss matrix;
and a target fusion module 34, for performing target fusion according to the corrected loss matrix.
In one possible implementation, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
the initial loss matrix determining module 32 is specifically configured to:
for each camera target and each radar target, calculate a first difference between the speed of the camera target and the speed of the radar target and a second difference between the distance of the camera target and the distance of the radar target, calculate the comprehensive loss between the camera target and the radar target from the first difference and the second difference, and, when the comprehensive loss is less than or equal to the comprehensive loss threshold, set the loss value corresponding to the camera target and the radar target in the initial loss matrix to that comprehensive loss.
In one possible implementation, the information of a camera target comprises the speed and the distance of the camera target, and the information of a radar target comprises the speed and the distance of the radar target;
the preset condition is that the comprehensive loss between the camera target and the radar target is greater than the comprehensive loss threshold, the distance loss between the camera target and the radar target is less than or equal to the first distance threshold, and the speed loss between the camera target and the radar target is greater than the first speed threshold.
In one possible implementation, in the loss matrix correction module 33, correcting the initial loss matrix according to the number of radar targets satisfying the preset condition for each camera target, to obtain the corrected loss matrix, comprises:
for each camera target, if exactly one radar target corresponding to the camera target satisfies the preset condition, correcting the loss value of that camera target and that radar target in the initial loss matrix to the comprehensive loss between them, thereby obtaining the corrected loss matrix.
In one possible implementation, the target fusion device 30 may further comprise a split target merging module.
The split target merging module is configured to:
after the corrected loss matrix is obtained, determine the distance loss and the speed loss between the first radar target and each other radar target, and determine the split radar targets corresponding to the first camera target from these distance and speed losses; the first camera target is a camera target whose type is the preset large vehicle, and the first radar target is the radar target associated with the first camera target;
merge the split radar targets corresponding to the first camera target, and update the corrected loss matrix to obtain the updated loss matrix;
correspondingly, the target fusion module 34 is specifically configured to:
perform target fusion according to the updated loss matrix.
In one possible implementation, in the split target merging module, determining the split radar targets corresponding to the first camera target from the distance loss and the speed loss between the first radar target and each other radar target comprises:
taking each radar target whose distance loss is less than or equal to the second distance threshold and whose speed loss is less than or equal to the second speed threshold as a candidate split radar target of the first camera target;
taking the first radar target itself as a candidate split radar target of the first camera target;
and, for each candidate split radar target of the first camera target, if the camera target associated with the candidate is unique, determining the candidate as a split radar target corresponding to the first camera target.
In one possible implementation, a camera target and a radar target are determined to be associated when their loss value in the corrected loss matrix is less than or equal to the preset loss threshold; or
when the comprehensive loss between the camera target and the radar target is less than the comprehensive loss threshold; or
when the distance loss between the camera target and the radar target is less than the first distance threshold and the speed loss between them is less than the first speed threshold.
In one possible implementation, in the split target merging module, merging the split radar targets corresponding to the first camera target comprises:
taking, among the split radar targets corresponding to the first camera target, the split radar target with the smallest distance as the merged radar target.
FIG. 3 is a schematic diagram of an electronic device according to an embodiment of the present invention. As shown in FIG. 3, the electronic device 4 of this embodiment comprises a processor 40 and a memory 41. The memory 41 stores a computer program 42; the processor 40 calls and runs the computer program 42 stored in the memory 41 to execute the steps in the target fusion method embodiments above, such as S101 to S104 shown in FIG. 1, or to implement the functions of the modules/units in the device embodiments above, such as the functions of modules/units 31 to 34 shown in FIG. 2.
Illustratively, the computer program 42 may be partitioned into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to carry out the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specified functions; these segments describe the execution of the computer program 42 in the electronic device 4. For example, the computer program 42 may be split into the modules/units 31 to 34 shown in FIG. 2.
The electronic device 4 may be a computing device such as a computer or a server, or a device such as an ECU (electronic control unit) on a vehicle. The electronic device 4 may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will appreciate that FIG. 3 is merely an example of the electronic device 4 and does not limit it; the device may include more or fewer components than shown, combine certain components, or use different components. For example, the electronic device may also include input-output devices, network access devices, buses, and so on.
The processor 40 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the electronic device 4, such as a hard disk or memory of the electronic device 4. The memory 41 may also be an external storage device of the electronic device 4, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card provided on the electronic device 4. Further, the memory 41 may include both an internal storage unit and an external storage device of the electronic device 4. The memory 41 is used to store the computer program and the other programs and data required by the electronic device, and may also temporarily store data that has been or is to be output.
Corresponding to the electronic equipment, the embodiment of the application also provides a vehicle which comprises the electronic equipment.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, the specific names of the functional units and modules are only for distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts not detailed or described in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other manners. For example, the apparatus/electronic device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the flow of the methods of the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored on a computer-readable storage medium and, when executed by a processor, implements the steps of each of the above target fusion method embodiments. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and are intended to be included in the scope of the present invention.

Claims (10)

1. A target fusion method, comprising:
acquiring information of each camera target detected by a camera device and information of each radar target detected by a radar;
calculating a comprehensive loss between each camera target and each radar target according to the information of each camera target and the information of each radar target, and determining an initial loss matrix between the camera device and the radar according to the comprehensive losses;
determining, for each camera target, the number of radar targets that meet a preset condition, and correcting the initial loss matrix according to the number of radar targets meeting the preset condition for each camera target to obtain a corrected loss matrix; and
performing target fusion according to the corrected loss matrix.
2. The target fusion method according to claim 1, wherein the information of a camera target includes a speed and a distance of the camera target, and the information of a radar target includes a speed and a distance of the radar target; and
wherein calculating the comprehensive loss between each camera target and each radar target according to the information of each camera target and the information of each radar target, and determining the initial loss matrix between the camera device and the radar according to the comprehensive losses, comprises:
for each camera target and each radar target, calculating a first difference between the speed of the camera target and the speed of the radar target and a second difference between the distance of the camera target and the distance of the radar target, calculating the comprehensive loss between the camera target and the radar target according to the first difference and the second difference, and, when the comprehensive loss is less than or equal to a comprehensive loss threshold, setting the loss value corresponding to the camera target and the radar target in the initial loss matrix to that comprehensive loss.
3. The target fusion method according to claim 1, wherein the information of a camera target includes a speed and a distance of the camera target, and the information of a radar target includes a speed and a distance of the radar target; and
wherein the preset condition is that the comprehensive loss between the camera target and the radar target is greater than a comprehensive loss threshold, the distance loss between the camera target and the radar target is less than or equal to a first distance threshold, and the speed loss between the camera target and the radar target is greater than a first speed threshold.
4. The target fusion method according to claim 1, wherein correcting the initial loss matrix according to the number of radar targets meeting the preset condition for each camera target to obtain the corrected loss matrix comprises:
for each camera target, if exactly one radar target corresponding to the camera target meets the preset condition, correcting the loss value of that camera target and that radar target in the initial loss matrix to the comprehensive loss between them, thereby obtaining the corrected loss matrix.
5. The target fusion method according to any one of claims 1 to 4, further comprising, after the corrected loss matrix is obtained:
determining a distance loss and a speed loss between a first radar target and each of the other radar targets, and determining split radar targets corresponding to a first camera target according to the distance losses and speed losses between the first radar target and the other radar targets, wherein the first camera target is a camera target whose type is a preset large-sized vehicle, and the first radar target is the radar target associated with the first camera target; and
merging the split radar targets corresponding to the first camera target, and updating the corrected loss matrix to obtain an updated loss matrix;
wherein performing target fusion according to the corrected loss matrix correspondingly comprises:
performing target fusion according to the updated loss matrix.
6. The target fusion method according to claim 5, wherein determining the split radar targets corresponding to the first camera target according to the distance loss and the speed loss between the first radar target and each of the other radar targets comprises:
taking each radar target whose distance loss is less than or equal to a second distance threshold and whose speed loss is less than or equal to a second speed threshold as a candidate split radar target of the first camera target;
taking the first radar target as a candidate split radar target of the first camera target; and
for each candidate split radar target of the first camera target, if the camera target associated with the candidate split radar target is unique, determining that the candidate split radar target is a split radar target corresponding to the first camera target.
7. The target fusion method according to claim 5, wherein, when the loss value of a camera target and a radar target in the corrected loss matrix is less than or equal to a preset loss threshold, the camera target and the radar target are determined to be associated; or
when the comprehensive loss between the camera target and the radar target is less than the comprehensive loss threshold, the camera target and the radar target are determined to be associated; or
when the distance loss between the camera target and the radar target is less than a first distance threshold and the speed loss between the camera target and the radar target is less than a first speed threshold, the camera target and the radar target are determined to be associated.
8. The target fusion method according to claim 5, wherein merging the split radar targets corresponding to the first camera target comprises:
taking, among the split radar targets corresponding to the first camera target, the split radar target with the smallest distance as the merged radar target.
9. An electronic device, comprising a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to invoke and run the computer program stored in the memory to perform the target fusion method of any one of claims 1 to 8.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the object fusion method according to any one of claims 1 to 8.
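
The following is a minimal Python sketch of the flow claimed in claims 1 to 4, included so the loss-matrix construction and correction can be traced concretely. Everything numeric is an assumption: the thresholds, the equal weighting of the speed and distance losses, and the use of Hungarian matching for the final assignment are illustrative stand-ins, since the claims fix none of them.

```python
# Hedged sketch of claims 1-4. Assumed, not from the patent: all numeric
# thresholds, the equal weighting of the two loss terms, and the use of
# Hungarian matching for the final fusion step.
import numpy as np
from scipy.optimize import linear_sum_assignment

INF = 1e9                 # stand-in loss meaning "no association allowed"
LOSS_THRESHOLD = 1.0      # comprehensive-loss threshold (assumed value)
DIST_THRESHOLD_1 = 5.0    # first distance threshold, metres (assumed)
SPEED_THRESHOLD_1 = 2.0   # first speed threshold, m/s (assumed)
W_SPEED = W_DIST = 0.5    # assumed weights for the two loss terms

def losses(cam, rad):
    """Speed loss (first difference), distance loss (second difference), and
    their weighted sum, the comprehensive loss, for one camera/radar pair."""
    v_loss = abs(cam["speed"] - rad["speed"])
    d_loss = abs(cam["dist"] - rad["dist"])
    return v_loss, d_loss, W_SPEED * v_loss + W_DIST * d_loss

def build_corrected_matrix(cam_targets, radar_targets):
    """Initial loss matrix (claim 2) followed by the claim 4 correction."""
    loss = np.full((len(cam_targets), len(radar_targets)), INF)
    preset = [[] for _ in cam_targets]  # radar targets meeting claim 3's condition
    for i, cam in enumerate(cam_targets):
        for j, rad in enumerate(radar_targets):
            v_loss, d_loss, total = losses(cam, rad)
            if total <= LOSS_THRESHOLD:
                loss[i, j] = total      # claim 2: keep losses under the threshold
            elif d_loss <= DIST_THRESHOLD_1 and v_loss > SPEED_THRESHOLD_1:
                preset[i].append((j, total))  # claim 3: preset condition holds
    # Claim 4: when exactly one radar target meets the preset condition for a
    # camera target, write its comprehensive loss back into the matrix; this
    # rescues cut-in targets whose camera speed estimate is unstable.
    for i, candidates in enumerate(preset):
        if len(candidates) == 1:
            j, total = candidates[0]
            loss[i, j] = total
    return loss

def fuse(loss):
    """Minimum-cost assignment over the corrected matrix; pairs whose loss is
    still INF are treated as unmatched."""
    rows, cols = linear_sum_assignment(loss)
    return [(i, j) for i, j in zip(rows, cols) if loss[i, j] < INF]
```

A cut-in scenario shows the point of the correction: the speed difference alone would reject the pair, but the distance agreement recovers it.

```python
cams = [{"speed": 10.0, "dist": 30.0}]
rads = [{"speed": 18.0, "dist": 31.0},  # cut-in: close in distance, speed off
        {"speed": 10.5, "dist": 80.0}]
print(fuse(build_corrected_matrix(cams, rads)))  # -> [(0, 0)]
```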
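
Claims 6 and 8 admit a similarly compact sketch for the large-vehicle split-target case. The second thresholds and the reading of "unique" as "associated with at most one camera target" are assumptions, and association itself (claim 7) is taken as already decided via the hypothetical cams_of mapping.

```python
# Hedged sketch of claims 6 and 8 for a large-vehicle ("first") camera target.
DIST_THRESHOLD_2 = 3.0    # second distance threshold, metres (assumed)
SPEED_THRESHOLD_2 = 1.0   # second speed threshold, m/s (assumed)

def merge_splits(first_idx, radar_targets, cams_of):
    """Collect split radar targets around the first radar target and pick the
    merged representative. cams_of[j] lists the camera targets associated with
    radar target j (association per claim 7, assumed already decided)."""
    base = radar_targets[first_idx]
    candidates = [first_idx]  # the first radar target is itself a candidate
    for j, rad in enumerate(radar_targets):
        if (j != first_idx
                and abs(rad["dist"] - base["dist"]) <= DIST_THRESHOLD_2
                and abs(rad["speed"] - base["speed"]) <= SPEED_THRESHOLD_2):
            candidates.append(j)
    # Claim 6: keep candidates whose associated camera target is unique
    # (interpreted here as "not associated with more than one camera target").
    splits = [j for j in candidates if len(cams_of.get(j, [])) <= 1]
    # Claim 8: the merged radar target is the split target with the smallest distance.
    merged = min(splits, key=lambda j: radar_targets[j]["dist"])
    return splits, merged
```

Choosing the nearest split target as the merged representative is a defensible reading in the driving-safety context: for a large vehicle ahead, the closest radar return bounds the collision distance.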
CN202410390054.7A 2024-04-02 2024-04-02 Target fusion method, electronic device and storage medium Pending CN117970318A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410390054.7A CN117970318A (en) 2024-04-02 2024-04-02 Target fusion method, electronic device and storage medium


Publications (1)

Publication Number Publication Date
CN117970318A true CN117970318A (en) 2024-05-03

Family

ID=90866114

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410390054.7A Pending CN117970318A (en) 2024-04-02 2024-04-02 Target fusion method, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN117970318A (en)


Legal Events

Date Code Title Description
PB01 Publication