CN113269260A - Multi-sensor target fusion and tracking method and system for intelligent driving vehicle - Google Patents

Multi-sensor target fusion and tracking method and system for intelligent driving vehicle

Info

Publication number
CN113269260A
Authority
CN
China
Prior art keywords
target
list
sensor
source
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110602102.0A
Other languages
Chinese (zh)
Other versions
CN113269260B (en)
Inventor
方阳丽
刘会凯
沈忱
黄值仪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lantu Automobile Technology Co Ltd
Original Assignee
Dongfeng Motor Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongfeng Motor Group Co Ltd filed Critical Dongfeng Motor Group Co Ltd
Priority to CN202110602102.0A priority Critical patent/CN113269260B/en
Publication of CN113269260A publication Critical patent/CN113269260A/en
Application granted granted Critical
Publication of CN113269260B publication Critical patent/CN113269260B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G01S13/726 Multiple target tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a multi-sensor target fusion and tracking method and system for an intelligent driving vehicle, relating to the technical field of multi-sensor data fusion. The method comprises the following steps: each time a frame of sensor data is received, adding a timestamp to the sensor data and generating a first list, wherein the first list comprises each target in the sensor data and its attribute information; predicting and updating the existing target list according to the timestamp corresponding to the first list to obtain a prediction list; performing association matching between the prediction list and the first list and generating a new target list; and outputting, from the new target list, the targets corresponding to a preset output strategy together with their attribute information. The method and system not only solve the time synchronization problem in the multi-sensor fusion process, but also retain all valid targets and obtain a credible target detection result.

Description

Multi-sensor target fusion and tracking method and system for intelligent driving vehicle
Technical Field
The application relates to the technical field of multi-sensor data fusion, in particular to a multi-sensor target fusion and tracking method and system for an intelligent driving vehicle.
Background
At present, with the development of intelligent driving technology, detection by a single sensor no longer meets a vehicle's perception requirements, and vehicles carry sensors of more types and in greater numbers for environmental perception, such as millimeter-wave radar, cameras, lidar and ultrasonic radar. Different kinds of sensors output different target attributes and have different advantages and disadvantages. For example, a camera can output the type and size of a target but adapts poorly to changing illumination; millimeter-wave radar is little affected by environmental factors and measures speed with high precision, but has low resolution and poor detection performance on static targets; lidar offers high ranging precision and high resolution, but its environmental adaptability is mediocre and it is expensive. Therefore, a multi-sensor fusion algorithm that complements the limitations of single sensors can effectively improve the detection precision and reliability of the perception system, yet multi-sensor target fusion algorithms have no unified standard.
In the related art, patent CN111783905A provides a target fusion method, apparatus, storage medium and electronic device. The target fusion method includes: acquiring a first target detected by a first sensor; matching the first target against the targets in a fusion target set, and if the first target matches a first fusion target successfully, updating the attributes of the first fusion target with the first target; and if the first target matches no target in the fusion target set, matching it against the targets in a heterogeneous target set, and if it matches a first heterogeneous target successfully, fusing the first target with the first heterogeneous target to generate a second fusion target, which is added to the fusion target set.
However, with that approach, each frame of data undergoes at least three rounds of fusion, which makes the fusion process complex and the execution time long; the previously fused target set may even fail to be the best match, so a credible target detection result cannot be obtained.
Disclosure of Invention
Aiming at the defects in the prior art, the application aims to provide a multi-sensor target fusion and tracking method and system for an intelligent driving vehicle, so as to solve the problems in the related art that the fusion process is complex, the execution time is long, and a credible target detection result may not be obtained.
The application provides a multi-sensor target fusion and tracking method for an intelligent driving vehicle, which comprises the following steps:
each time a frame of sensor data is received, adding a timestamp to the sensor data and generating a first list, wherein the first list comprises each target in the sensor data and its attribute information;
predicting and updating the existing target list according to the timestamp corresponding to the first list to obtain a prediction list;
performing association matching between the prediction list and the first list, and generating a new target list;
and outputting, from the new target list, the targets corresponding to a preset output strategy together with their attribute information.
In some embodiments, when a target in the prediction list does not exist in the first list, one frame loss is counted for the target, and the attribute information of the target includes the target's number of consecutive frame losses;
the generating of the new target list specifically includes:
filtering and updating the targets successfully matched between the prediction list and the first list, to obtain and retain each such target and its new attribute information;
retaining the targets that exist in the first list but not in the prediction list, together with their attribute information;
incrementing by one the number of consecutive frame losses of each target that exists in the prediction list but not in the first list; if the incremented number of consecutive frame losses has not reached a first count threshold, retaining the target, and otherwise discarding it;
and generating a new target list based on the retained targets and their attribute information.
In some embodiments, when a target in the prediction list exists in the first list, one match is counted for the target, and the attribute information of the target further includes the number of consecutive matches;
and a target whose number of consecutive matches is not greater than a second count threshold, or whose number of consecutive frame losses is greater than 0, is treated as a potential target.
In some embodiments, the attribute information of the target further includes a target source, and the target source is either a single-sensor source or a fusion source;
the target source stored in the prediction list is taken as the original source, and the target source stored in the first list is taken as the new source;
for a target that exists in the first list but not in the prediction list, its target source is a single-sensor source;
for a target that exists in the prediction list but not in the first list, its target source is the original source;
for a target successfully matched between the prediction list and the first list, when its original source is a single-sensor source: if the original source and the new source are the same sensor, the target remains a single-sensor source; if the original source and the new source are different sensors, the target source becomes a fusion source;
and when the original source of the target is a fusion source: if the new sources within a preset time all come from the same sensor, the target source is changed to a single-sensor source; otherwise the fusion source is retained.
In some embodiments, the association matching of the prediction list with the first list specifically includes:
calculating the Euclidean distance between each target in the prediction list and each target in the first list in turn to form a Euclidean distance matrix, and setting a distance threshold;
computing the globally optimal matching of the Euclidean distance matrix with the Hungarian algorithm, and deleting from it the matches whose distance is greater than the distance threshold, to obtain matching pairs;
and taking the targets of the matching pairs as the successfully matched targets.
In some embodiments, the attribute information of the target further includes a target ID;
when a target is discarded, its target ID is released, and the released ID becomes an assignable ID the next time the first list is generated.
In some embodiments, before receiving a frame of sensor data, the method further comprises:
the coordinate systems of the plurality of sensors are all converted to the same coordinate system.
In some embodiments, after a frame of sensor data is received, any further frame of sensor data collected by any sensor is discarded until the new target list has been generated.
In some embodiments, the first list generated from the first received frame of sensor data is used as the initial target list.
The second aspect of the present application provides a system for implementing the above method for fusing and tracking multiple sensor targets of an intelligent driving vehicle, comprising:
a clock processing module for receiving the sensor data and time stamping the frame of sensor data;
a first generating module, configured to generate a first list according to the sensor data after being timestamped, where the first list includes each target and attribute information thereof;
the prediction module is used for predicting and updating the existing target list according to the timestamp of the first list to obtain a prediction list;
a second generation module, configured to associate and match the prediction list with the first list, and generate a new target list;
and the output module is used for outputting the corresponding target and the attribute information thereof from the new target list based on a preset output strategy.
The beneficial effects brought by the technical solution provided in this application include:
According to the multi-sensor target fusion and tracking method and system for an intelligent driving vehicle, each time a frame of sensor data is received, the sensor data is timestamped and a first list is generated from it, the first list comprising each target in the sensor data and its attribute information; after the first list is generated, the existing target list is predicted and updated based on the timestamp corresponding to the first list to obtain a prediction list; the prediction list is then associated and matched with the first list to generate a new target list; and finally, the corresponding targets and their attribute information are output from the new target list based on a preset output strategy. This not only solves the time synchronization problem in the multi-sensor fusion process, but also retains all valid targets and yields a credible target detection result.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a first flowchart of a multi-sensor target fusion and tracking method in an embodiment of the present application;
FIG. 2 is a flowchart of step S3 in FIG. 1;
FIG. 3 is a matching diagram of step S3 in the embodiment of the present application;
FIG. 4 is an asynchronous schematic diagram of multiple sensors in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The embodiments of the application provide a multi-sensor target fusion and tracking method and system for an intelligent driving vehicle, which can solve the problems in the related art that the fusion process is complex and the execution time is long, and that a credible target detection result may not be obtained.
As shown in fig. 1, the method for fusing and tracking the multi-sensor target of the intelligent driving vehicle in the embodiment of the present application includes the steps of:
s1, when receiving a frame of sensor data, adding a timestamp to the sensor data, and generating a first list, namely, when receiving the sensor data, triggering one-time fusion. The first list includes each object in the sensor data and attribute information thereof. In this embodiment, the first list is also accompanied by the timestamp.
S2: the existing target list is predicted and updated according to the timestamp corresponding to the first list, to obtain a prediction list. In this embodiment, the existing target list includes a plurality of targets and the attribute information of each target.
From the timestamp attached to the existing target list and the timestamp attached to the first list, the time difference between the two lists is obtained, and each target in the existing target list is then predicted and updated according to this time difference, so that the two lists subsequently subjected to association matching are compared at the same moment. The timestamp attached to the existing target list is the timestamp added when sensor data was last received.
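For illustration only, the sketch below propagates every target in the existing list to the timestamp of the first list. It assumes a constant-velocity state [x, y, vx, vy] and a dict keyed by target ID; the patent leaves the motion model open (a Kalman filter with a vehicle dynamics equation is suggested later), so these choices are assumptions rather than the patent's wording.

```python
import numpy as np

def predict_to_timestamp(target_list, list_timestamp, frame_timestamp):
    """Propagate the existing target list to the first list's time.

    target_list: {target_id: state}, state = [x, y, vx, vy].
    Returns the prediction list at frame_timestamp, so that
    association matching compares both lists at the same moment.
    """
    dt = frame_timestamp - list_timestamp   # time difference between the lists
    F = np.array([[1.0, 0.0, dt,  0.0],     # constant-velocity transition matrix
                  [0.0, 1.0, 0.0, dt ],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    return {tid: F @ np.asarray(state, dtype=float)
            for tid, state in target_list.items()}
```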
S3: the prediction list is associated and matched with the first list, and a new target list is generated, which completes the fusion processing of this frame of sensor data.
S4: the targets corresponding to the preset output strategy and their attribute information are output from the new target list.
In the method of this embodiment, each time a frame of sensor data is received, the sensor data is timestamped and a first list is generated from it, the first list including each target in the sensor data and its attribute information; after the first list is generated, the existing target list is predicted and updated based on the timestamp corresponding to the first list to obtain a prediction list; the prediction list is then associated and matched with the first list to generate a new target list; and finally the corresponding targets and their attribute information are output from the new target list based on a preset output strategy. This solves the time synchronization problem in the multi-sensor fusion process, retains all valid targets, and yields a credible target detection result.
In this embodiment, when a target in the prediction list does not exist in the first list, one frame loss is counted for the target; accordingly, the attribute information of the target includes the target's number of consecutive frame losses.
Specifically, after the prediction list is associated and matched with the first list, the targets of the two lists fall into three categories. First: targets in the prediction list that are successfully matched with the first list. Second: targets in the first list that match no target in the prediction list; such a target is a newly appeared target whose current state needs to be kept. Third: targets in the prediction list that do not exist in the first list, i.e. targets in the prediction list that match no target in the first list; these need further judgment.
Further, in step S3, generating a new target list specifically includes the following steps:
First, the targets successfully matched between the prediction list and the first list are filtered and updated, to obtain and retain each such target and its new attribute information.
This filtering and updating can adopt a Kalman filtering algorithm: a vehicle dynamics equation is constructed, and basic attributes of the target such as lateral/longitudinal distance, speed and acceleration are updated (a minimal sketch of this update follows these steps).
Second, the targets that exist in the first list but not in the prediction list are retained together with their attribute information.
Then, the number of consecutive frame losses of each target that exists in the prediction list but not in the first list is incremented by one, and it is judged whether the incremented count reaches a first count threshold; if not, the target is retained, and if the count has reached the first count threshold, the target is discarded.
Finally, a new target list is generated based on the retained targets and their attribute information. The targets in the new target list are all the currently valid targets.
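The Kalman filtering mentioned above could be sketched as follows for a matched target; the patent only names Kalman filtering with a vehicle dynamics model, so the concrete matrices here are illustrative assumptions.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One Kalman measurement update for a matched target.

    x, P: predicted state and covariance (from the prediction step);
    z:    the matched measurement from the first list;
    H, R: measurement matrix and measurement noise covariance.
    """
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ y                      # updated state (new basic attributes)
    P_new = (np.eye(P.shape[0]) - K @ H) @ P
    return x_new, P_new
```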
Building on the above embodiment, in this embodiment, when a target in the prediction list exists in the first list, one match is counted for the target; accordingly, the attribute information of the target further includes the number of consecutive matches.
Specifically, when a new target list is generated, each target's number of consecutive matches, Num-match, and number of consecutive frame losses, Num-lost, need to be updated according to the association matching results.
In the initial target list, a target's Num-match is 0 and its Num-lost is 0. Each time the target is successfully matched, 1 is added to Num-match; when the first list contains no target matching it, 1 is added to Num-lost and Num-match is reset to its initial value 0; when the target is matched again, 1 is added to Num-match and Num-lost is reset to its initial value 0.
After a target's Num-match reaches the second count threshold, it is held at that threshold if matching continues. When a target's Num-lost reaches the first count threshold, the target is discarded.
In this embodiment, a target whose number of consecutive matches is not greater than the second count threshold, or whose number of consecutive frame losses is greater than 0, is treated as a potential target. A potential target is one that may subsequently be discarded.
In this embodiment, both the first count threshold and the second count threshold can be adjusted reasonably according to actual requirements. Optionally, the first count threshold ranges from 3 to 5 and the second count threshold ranges from 5 to 10.
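A minimal sketch of this bookkeeping is given below. The Num-match/Num-lost names follow the description; the attribute names, the concrete threshold values, and the capping of Num-match at the second count threshold (per the paragraph above) are otherwise assumptions.

```python
FIRST_COUNT_THRESHOLD = 4    # consecutive frame losses before discard (3-5 suggested)
SECOND_COUNT_THRESHOLD = 8   # consecutive matches of a stable target (5-10 suggested)

def update_counters(target, matched):
    """Update Num-match / Num-lost after one association pass.

    `target` is assumed to be any object with num_match and num_lost
    integer attributes. Returns False when the target should be discarded."""
    if matched:
        target.num_lost = 0
        # Once Num-match reaches the threshold it is held there.
        target.num_match = min(target.num_match + 1, SECOND_COUNT_THRESHOLD)
    else:
        target.num_match = 0
        target.num_lost += 1
        if target.num_lost >= FIRST_COUNT_THRESHOLD:
            return False   # first count threshold reached: discard
    return True

def is_potential(target):
    """Potential target: not yet stably matched, or recently lost frames.
    'Not greater than' is read here as 'not yet at' the threshold,
    since Num-match is capped at the second count threshold."""
    return target.num_match < SECOND_COUNT_THRESHOLD or target.num_lost > 0
```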
Furthermore, the attribute information of the target also includes a target source, i.e. a sensor type; the target source is either a single-sensor source or a fusion source, so that the origin of a target can be known intuitively and a screening strategy can be chosen reasonably in different decision plans. A target detected by two or more sensors is a fusion-source target.
The target source stored in the prediction list is taken as the original source, and the target source stored in the first list is taken as the new source, where the new source is the sensor type corresponding to the received sensor data.
For a target that exists in the first list but not in the prediction list, its target source is a single-sensor source.
For a target that exists in the prediction list but not in the first list, its target source is the original source.
For a target successfully matched between the prediction list and the first list, when its original source is a single-sensor source: if the original source and the new source are the same sensor, the target remains a single-sensor source; if the original source and the new source are different sensors, the target source is updated to a fusion source.
When the original source of the target is a fusion source, if the new sources within a preset time all come from the same sensor, the target source is changed to a single-sensor source; otherwise the fusion source is retained. This avoids the situation in which a target that became a fusion-source target long ago keeps being detected and matched by one and the same sensor, yet its target source can never be updated.
In this embodiment, the preset time can be set according to the periods of the different sensors.
Adding the target source to a target's attribute information, to indicate whether the target was acquired by a single sensor or is a fused target, facilitates selective output when targets are output later.
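The source-update rules above amount to a small state machine; one possible sketch follows. The field names (source, last_sensor, same_sensor_since) and the use of a sensor-name string versus the literal "fusion" are illustrative assumptions.

```python
def update_source(target, new_sensor, now, preset_time):
    """Update a matched target's source label.

    target.source is a sensor name (single-sensor source) or "fusion";
    preset_time is chosen from the sensors' periods.
    """
    if target.source != "fusion":
        if target.source != new_sensor:
            target.source = "fusion"           # two different sensors: fusion source
            target.same_sensor_since = None
        # same sensor again: the target stays a single-sensor source
    else:
        if target.last_sensor == new_sensor:
            if target.same_sensor_since is None:
                target.same_sensor_since = now
            elif now - target.same_sensor_since >= preset_time:
                # Only one sensor has seen it for the preset time:
                # demote the fusion source to a single-sensor source.
                target.source = new_sensor
        else:
            target.same_sensor_since = None    # still genuinely fused
    target.last_sensor = new_sensor
```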
On the basis of the second embodiment, in this embodiment, the association matching of the prediction list with the first list in step S3 specifically includes the following steps:
First, the Euclidean distance between each target in the prediction list and each target in the first list is calculated in turn to form a Euclidean distance matrix, and a distance threshold is set.
Then, the globally optimal matching of the Euclidean distance matrix is computed with the Hungarian algorithm, and the matches whose distance is greater than the distance threshold are deleted from it to obtain the matching pairs.
Finally, the targets of the matching pairs are taken as the successfully matched targets.
Specifically, the above association matching process finds the pairing relationship between the targets of two different lists. Assuming the existing target list contains m targets, the prediction list A obtained after prediction and updating also contains m targets; with n targets in the first list B, prediction list A can be taken as the matrix rows and first list B as the matrix columns.
Then the Euclidean distance between each row target and each column target is computed in a loop from parameters such as the lateral distance, longitudinal distance and speed between the targets, forming a Euclidean distance matrix D, and a distance threshold Dm is set.
The two target lists are associated through the Hungarian algorithm. When the globally optimal matching of the Euclidean distance matrix is computed, the Hungarian algorithm fully matches the smaller of the two target lists; the unsatisfactory matching pairs must then be cancelled with the distance threshold Dm, i.e. the matches greater than the distance threshold are deleted. The remaining matching pairs are the matched targets of the two lists and can be treated as the same target; the association result is recorded so that the targets' attribute information can be updated later.
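This association step can be sketched with SciPy's implementation of the Hungarian algorithm. The feature vector used for the Euclidean distance (lateral distance, longitudinal distance, speed) follows the description, while the feature scaling and data layout are assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(pred_features, meas_features, dist_threshold):
    """Match prediction list A (rows) against first list B (columns).

    pred_features: (m, k) array, one feature row per predicted target
                   (e.g. lateral distance, longitudinal distance, speed).
    meas_features: (n, k) array for the first list.
    Returns the accepted (row, col) index pairs.
    """
    A = np.asarray(pred_features, dtype=float)
    B = np.asarray(meas_features, dtype=float)
    # Euclidean distance matrix D, with D[i, j] = ||A[i] - B[j]||.
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    # Hungarian algorithm: globally optimal assignment that fully
    # matches the smaller of the two lists.
    rows, cols = linear_sum_assignment(D)
    # Cancel unsatisfactory pairs with the distance threshold Dm.
    return [(i, j) for i, j in zip(rows, cols) if D[i, j] <= dist_threshold]
```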
In other embodiments, the list a may be predicted as a matrix column and the first list B may be predicted as a matrix row.
As shown in fig. 2, in this embodiment, step S3 specifically includes:
S31: associating and matching the prediction list with the first list;
S32: adding one to the number of consecutive frame losses of each target that exists in the prediction list but not in the first list;
S33: judging whether the incremented number of consecutive frame losses reaches the first count threshold; if so, going to S34, otherwise going to S35;
S34: discarding the target, and going to S36;
S35: retaining the target;
S36: filtering and updating the successfully matched targets, to obtain and retain each such target and its new attribute information;
S37: retaining the targets that exist in the first list but not in the prediction list, together with their attribute information;
S38: generating a new target list based on the retained targets and their attribute information.
Preferably, the attribute information of the target further includes a target ID, and the target ID of every target is different. When managing targets, all information of a target can be located from its target ID alone. As long as a target continues to exist, its target ID remains unchanged, and in decision planning the track data can be regarded as remaining valid while the target ID is unchanged; when a new target ID appears, track management is performed anew for the new target, ensuring the efficiency of track prediction and tracking.
A target whose association matching succeeded continues to use its target ID from the existing target list; a target that exists in the prediction list but not in the first list keeps its target ID; and a new target that exists in the first list but not in the prediction list must be assigned a new unique target ID according to the target IDs of the existing target list.
When a target is discarded, its target ID is released, and the released ID becomes an assignable ID the next time the first list is generated.
As shown in fig. 3, the targets in prediction list A are the valid targets, the targets in first list B are the current measurement targets, and numbers are used as target IDs.
If target 1, target 2 and target 3 of prediction list A are successfully matched with target 2, target 4 and target 3 of first list B, the IDs of target 1, target 2 and target 3 of prediction list A continue to be used. Target 4 of prediction list A is not successfully matched, and its ID is not changed.
If target 1 and target 5 of first list B are newly appeared targets, then according to the target-ID assignment rule of the existing target list, ID 5 is assigned to target 1 of first list B and ID 6 is assigned to target 5 of first list B.
At this time, in the newly generated target list: Num-match of target 1, target 2 and target 3 is increased by 1 and their Num-lost is set to 0; Num-lost of target 4 is increased by 1 and its Num-match is set to 0; and Num-match and Num-lost of target 5 and target 6 are both set to 0.
When a target is discarded and its target ID is released, the ID only becomes assignable when the first list is next generated; a new target appearing before then cannot use it, which avoids the phenomenon that the discarded target appears to still persist when the target list is later output.
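One way to realize this ID discipline is sketched below; the one-cycle delay before a released ID becomes assignable again follows the reading given above, and the class layout is an assumption.

```python
class IdPool:
    """Minimal sketch of the target-ID bookkeeping: matched targets
    keep their IDs, a discarded target's ID is released, and a
    released ID only becomes assignable again when the next first
    list is generated, so a stale ID cannot reappear immediately."""

    def __init__(self):
        self._next_id = 1
        self._released = []     # released this cycle, not yet assignable
        self._assignable = []   # released in an earlier cycle

    def release(self, target_id):
        self._released.append(target_id)

    def begin_new_first_list(self):
        # Released IDs become assignable one cycle later.
        self._assignable.extend(self._released)
        self._released = []

    def assign(self):
        if self._assignable:
            return self._assignable.pop(0)
        self._next_id += 1
        return self._next_id - 1
```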
Further, before a frame of sensor data is received, the method also comprises the following step:
converting the coordinate systems of the plurality of sensors into the same coordinate system.
In this embodiment, temporal and spatial synchronization of the multiple sensors is a precondition for target fusion. Time synchronization requires the real time of the targets output by different sensors to be consistent, ensuring that the measured targets in the target association and updating processes are computed at the same moment. In practice, however, after different sensors are powered on, synchronization of their first frames of data is hard to guarantee by hard synchronization; even if power-on synchronization can be achieved with GPS-synchronized PPS (Pulse Per Second) time signals, different types of sensor software consume different amounts of time and run at different frame rates. As shown in fig. 4, taking sensors A, B and N mounted on a vehicle as an example, the first frames sent by the sensors are not synchronized in time and their frame rates differ, so output of targets at the same moment cannot be guaranteed.
Spatial synchronization means that when each sensor is installed, its position and angle must meet the basic installation requirements; the sensor's installation error is compensated through software calibration, and its coordinate system is converted into the same vehicle body coordinate system through coordinate transformation, so that the relevant vector parameters of targets output by sensors at different positions and of different types are all relative to the same coordinate system, ensuring the accuracy of fusion. The coordinate transformation in this embodiment is a general coordinate transformation process and is not described here.
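Although the patent treats this as a general process, a simplified planar sketch is given below for illustration: detections from one sensor's frame are mapped into the vehicle body frame using the sensor's calibrated mounting pose. Restricting to 2D is an assumption made for brevity.

```python
import numpy as np

def sensor_to_body(points, mount_yaw, mount_xy):
    """Transform 2D sensor detections into the vehicle body frame.

    points:    (n, 2) array of [x, y] in the sensor frame.
    mount_yaw: sensor mounting angle (rad) from software calibration.
    mount_xy:  sensor position in the body frame.
    """
    c, s = np.cos(mount_yaw), np.sin(mount_yaw)
    R = np.array([[c, -s],
                  [s,  c]])                 # rotation: sensor -> body
    return np.asarray(points) @ R.T + np.asarray(mount_xy)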
Based on the above embodiments, in this embodiment, after a frame of sensor data is received, sensor data collected by any sensor is discarded until the new target list has been generated.
Specifically, while a frame of sensor data is being fused, new data acquired by a sensor does not trigger fusion; only after the current frame of sensor data has been fused is the fusion process triggered by the next newly received frame of sensor data. This avoids the program hanging because the next fusion is started while the previous fusion is still in progress.
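If the fusion routine runs on its own thread, this single-flight rule can be sketched with a non-blocking lock, as below; the threading choice is an assumption, since the patent only requires that frames arriving mid-fusion be discarded rather than queued.

```python
import threading

class FusionGate:
    """Drop frames that arrive while a fusion pass is running."""

    def __init__(self, fuse):
        self._fuse = fuse                  # callable running one fusion pass
        self._busy = threading.Lock()

    def on_frame(self, frame):
        if not self._busy.acquire(blocking=False):
            return False                   # fusion in progress: discard frame
        try:
            self._fuse(frame)              # triggers exactly one fusion
            return True
        finally:
            self._busy.release()
```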
In this embodiment, the first list generated from the first received frame of sensor data is used as the initial target list. That is, the target list is generated directly when the first frame of sensor data is received.
In this embodiment, the targets detected by different sensors carry different attributes but all need to be fused, and the algorithm logic may be implemented with different code languages and data structures. Therefore, in the data parsing process, on the one hand the sensor data must be processed into physical values; on the other hand, the data structures in which the parsed multi-target data are stored must be fixed and unified, with every target adopting the same fixed data structure, which facilitates subsequent fusion processing. The target data structure comprises the target and its attribute information; the attribute information includes basic attributes and additional attributes, where the basic attributes include the target ID, target position, speed, acceleration and the like, and the additional attributes include the target source, the number of consecutive frame losses, the number of consecutive matches and the like.
In addition, the sensor signals need to be preprocessed to ensure that each sensor outputs stably tracked target-level data. Signal preprocessing handles the target data detected by each sensor separately and includes signal anti-shake, signal filtering, signal monitoring and the like, ensuring smooth and reasonable signals. Optionally, when the intelligent driving system is equipped with a high-precision map and a high-precision positioning system, i.e. when high-precision positioning information is available, obstacles falsely detected by the sensors, such as fences, manhole covers and traffic barriers, can be removed according to the positioning information, and the positions of the ego vehicle and the obstacles in an absolute coordinate system can be provided during association matching, reducing the error of the vehicle body coordinate system.
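Returning to the unified data structure described above, one illustrative record might look as follows; the field names and units are assumptions rather than the patent's wording.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """Unified, fixed data structure for one parsed target."""
    # Basic attributes
    target_id: int
    x: float = 0.0             # longitudinal distance (m)
    y: float = 0.0             # lateral distance (m)
    vx: float = 0.0            # speed components (m/s)
    vy: float = 0.0
    ax: float = 0.0            # acceleration components (m/s^2)
    ay: float = 0.0
    obj_type: str = "unknown"  # e.g. "vehicle", "pedestrian"
    # Additional attributes
    source: str = ""           # sensor name or "fusion"
    num_lost: int = 0          # consecutive frame losses
    num_match: int = 0         # consecutive matches
```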
Optionally, the preset output strategy is a strategy set according to the function to be implemented. In this embodiment, the preset output strategy includes:
A. selecting stable targets for output according to the numbers of consecutive matches and consecutive frame losses;
B. selecting vehicles, pedestrians or the like for output according to the target type;
C. selecting targets in the ego lane or adjacent lanes according to the lateral position, and selecting the required targets according to the longitudinal distance for output.
In this embodiment, based on the preset output strategy, the matching targets can be screened from the new target list for output, as sketched below.
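A minimal screening sketch follows; the strategy keys mirror policies A-C above, while all key names and the Target fields (from the data-structure sketch earlier) are assumptions.

```python
def select_outputs(targets, strategy):
    """Screen the new target list with a preset output strategy."""
    out = list(targets)
    if strategy.get("stable_only"):                      # policy A
        out = [t for t in out
               if t.num_match >= strategy.get("min_matches", 1)
               and t.num_lost == 0]
    if "types" in strategy:                              # policy B
        out = [t for t in out if t.obj_type in strategy["types"]]
    if "max_lateral" in strategy:                        # policy C: lane gate
        out = [t for t in out if abs(t.y) <= strategy["max_lateral"]]
    if "max_longitudinal" in strategy:                   # policy C: range gate
        out = [t for t in out if 0.0 <= t.x <= strategy["max_longitudinal"]]
    return out
```

For instance, a call such as select_outputs(new_list, {"stable_only": True, "min_matches": 5, "types": {"vehicle"}}) would output only stable vehicle targets; outputting targets for a different requirement only needs a different strategy dict.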
This embodiment mainly solves the problem of multi-sensor target fusion. The fusion process mainly comprises sensor signal preprocessing, data parsing, association matching after target prediction and updating, and target output. First, target data from sensors such as radar or cameras are obtained, and the detected target data are preprocessed and parsed; then association matching and filtering updates are performed on all existing targets to obtain a fused target list, and the required targets are screened out according to the targets' attribute information and other conditions; finally, the screened targets are output to the modules or systems that need them.
The system for realizing the multi-sensor target fusion and tracking method of the intelligent driving vehicle comprises a clock processing module, a first generating module, a predicting module, a second generating module and an output module.
The clock processing module is used for receiving the sensor data and stamping a time stamp on the received sensor data.
Optionally, the clock processing module is further configured to start timing from power-on, so as to provide the system time used to timestamp the sensor data.
The first generating module is configured to generate a first list according to the time-stamped sensor data, where the first list includes each target and attribute information thereof.
The prediction module is used for predicting and updating the existing target list according to the timestamp of the first list to obtain a prediction list.
The second generation module is used for associating and matching the prediction list with the first list and generating a new target list.
Optionally, the second generating module is further configured to send a completion signal to the clock processing module after generating a new target list, and the clock processing module may receive new sensor data after receiving the completion signal.
The output module is used for outputting the corresponding targets and their attribute information from the new target list based on a preset output strategy. Targets meeting different requirements can be output simply by modifying the output strategy, which gives the system good portability.
The system of this embodiment fuses the measured targets of each new frame of sensor data with the full set of targets in the existing target list, which saves storage space, improves execution efficiency and increases scalability, allowing the fusion to be extended to two or more sensors.
The multi-sensor target fusion and tracking system of this embodiment is suitable for the above methods: it can retain all valid targets during target fusion, performs target management for both single-sensor targets and fused targets to ensure target integrity, and for different output strategies can output targets meeting different requirements, obtaining a credible target detection result.
The present invention is not limited to the above-described embodiments, and it will be apparent to those skilled in the art that various modifications and improvements can be made without departing from the principle of the present invention, and such modifications and improvements are also considered to be within the scope of the present invention.

Claims (10)

1. A multi-sensor target fusion and tracking method for an intelligent driving vehicle, characterized by comprising the following steps:
each time a frame of sensor data is received, adding a timestamp to the sensor data and generating a first list, wherein the first list comprises each target in the sensor data and its attribute information;
predicting and updating the existing target list according to the timestamp corresponding to the first list to obtain a prediction list;
performing association matching between the prediction list and the first list, and generating a new target list;
and outputting, from the new target list, the targets corresponding to a preset output strategy together with their attribute information.
2. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 1, wherein when a target in the prediction list does not exist in the first list, one frame loss is counted for the target, and the attribute information of the target includes the target's number of consecutive frame losses;
the generating of the new target list specifically includes:
filtering and updating the targets successfully matched between the prediction list and the first list, to obtain and retain each such target and its new attribute information;
retaining the targets that exist in the first list but not in the prediction list, together with their attribute information;
incrementing by one the number of consecutive frame losses of each target that exists in the prediction list but not in the first list; if the incremented number of consecutive frame losses has not reached a first count threshold, retaining the target, and otherwise discarding it;
and generating a new target list based on the retained targets and their attribute information.
3. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 2, characterized in that: when a target in the prediction list exists in the first list, one match is counted for the target, and the attribute information of the target further includes the number of consecutive matches;
and a target whose number of consecutive matches is not greater than a second count threshold, or whose number of consecutive frame losses is greater than 0, is treated as a potential target.
4. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 2, characterized in that: the attribute information of the target further includes a target source, and the target source is either a single-sensor source or a fusion source;
the target source stored in the prediction list is taken as the original source, and the target source stored in the first list is taken as the new source;
for a target that exists in the first list but not in the prediction list, its target source is a single-sensor source;
for a target that exists in the prediction list but not in the first list, its target source is the original source;
for a target successfully matched between the prediction list and the first list, when its original source is a single-sensor source: if the original source and the new source are the same sensor, the target remains a single-sensor source; if the original source and the new source are different sensors, the target source becomes a fusion source;
and when the original source of the target is a fusion source: if the new sources within a preset time all come from the same sensor, the target source is changed to a single-sensor source; otherwise the fusion source is retained.
5. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 1, wherein the association matching of the prediction list with the first list specifically comprises:
calculating the Euclidean distance between each target in the prediction list and each target in the first list in turn to form a Euclidean distance matrix, and setting a distance threshold;
computing the globally optimal matching of the Euclidean distance matrix with the Hungarian algorithm, and deleting from it the matches whose distance is greater than the distance threshold, to obtain matching pairs;
and taking the targets of the matching pairs as the successfully matched targets.
6. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 1, characterized in that: the attribute information of the target further includes a target ID;
when a target is discarded, its target ID is released, and the released ID becomes an assignable ID the next time the first list is generated.
7. The method of claim 1, further comprising, before a frame of sensor data is received:
converting the coordinate systems of the plurality of sensors into the same coordinate system.
8. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 1, characterized in that: after a frame of sensor data is received, any frame of sensor data collected by any sensor is discarded until the new target list has been generated.
9. The multi-sensor target fusion and tracking method for an intelligent driving vehicle of claim 1, characterized in that: the first list generated from the first received frame of sensor data is used as the initial target list.
10. A system for implementing the intelligent driving vehicle multi-sensor target fusion and tracking method of claim 1, characterized in that it comprises:
a clock processing module for receiving the sensor data and time stamping the frame of sensor data;
a first generating module, configured to generate a first list according to the sensor data after being timestamped, where the first list includes each target and attribute information thereof;
the prediction module is used for predicting and updating the existing target list according to the timestamp of the first list to obtain a prediction list;
the second generation module is used for carrying out association matching on the prediction list and the first list and generating a new target list;
and the output module is used for outputting the corresponding target and the attribute information thereof from the new target list based on a preset output strategy.
CN202110602102.0A 2021-05-31 2021-05-31 Multi-sensor target fusion and tracking method and system for intelligent driving vehicle Active CN113269260B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110602102.0A CN113269260B (en) 2021-05-31 2021-05-31 Multi-sensor target fusion and tracking method and system for intelligent driving vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110602102.0A CN113269260B (en) 2021-05-31 2021-05-31 Multi-sensor target fusion and tracking method and system for intelligent driving vehicle

Publications (2)

Publication Number Publication Date
CN113269260A true CN113269260A (en) 2021-08-17
CN113269260B CN113269260B (en) 2023-02-03

Family

ID=77233720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602102.0A Active CN113269260B (en) 2021-05-31 2021-05-31 Multi-sensor target fusion and tracking method and system for intelligent driving vehicle

Country Status (1)

Country Link
CN (1) CN113269260B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130006576A1 (en) * 2010-03-15 2013-01-03 Bae Systems Plc Target tracking
US20190204433A1 (en) * 2017-12-29 2019-07-04 Viettel Group Method of tracking target by using 2d radar with sensor
CN109116349A (en) * 2018-07-26 2019-01-01 西南电子技术研究所(中国电子科技集团公司第十研究所) Multi-sensor cooperation tracks combined optimization decision-making technique
CN110969178A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Data fusion system and method for automatic driving vehicle and automatic driving system
CN111652914A (en) * 2019-02-15 2020-09-11 初速度(苏州)科技有限公司 Multi-sensor target fusion and tracking method and system
CN110942449A (en) * 2019-10-30 2020-03-31 华南理工大学 Vehicle detection method based on laser and vision fusion
CN111783905A (en) * 2020-09-07 2020-10-16 成都安智杰科技有限公司 Target fusion method and device, storage medium and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113721240A (en) * 2021-08-27 2021-11-30 中国第一汽车股份有限公司 Target association method and device, electronic equipment and storage medium
CN113721240B (en) * 2021-08-27 2024-03-15 中国第一汽车股份有限公司 Target association method, device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN113269260B (en) 2023-02-03

Similar Documents

Publication Publication Date Title
CN113379805A (en) Multi-information resource fusion processing method for traffic nodes
CN110781949B (en) Asynchronous serial multi-sensor-based flight path data fusion method and storage medium
CN109099920B (en) Sensor target accurate positioning method based on multi-sensor association
CN112462381B (en) Multi-laser radar fusion method based on vehicle-road cooperation
CN110245565A (en) Wireless vehicle tracking, device, computer readable storage medium and electronic equipment
KR102168288B1 (en) System and method for tracking multiple object using multi-LiDAR
US20220215673A1 (en) Device, system, and method for generating occupancy grid map
US10793145B2 (en) Object recognition device, object recognition method, and vehicle control system
US20230228593A1 (en) Method, device and system for perceiving multi-site roadbed network and terminal
JP6576511B1 (en) Object recognition apparatus and object recognition method
CN113269260B (en) Multi-sensor target fusion and tracking method and system for intelligent driving vehicle
CN114842445A (en) Target detection method, device, equipment and medium based on multi-path fusion
EP1533628B1 (en) A method for correlating and numbering target tracks from multiple sources
CN112598715A (en) Multi-sensor-based multi-target tracking method, system and computer readable medium
CN112578781B (en) Data processing method, device, chip system and medium
CN112241167A (en) Information processing method and device in automatic driving and storage medium
CN105321380A (en) Updating intensities in a PHD filter based on a sensor track ID
Rieken et al. Sensor scan timing compensation in environment models for automated road vehicles
JP2017075881A (en) Object recognition integration apparatus and object recognition integration method
CN116902005A (en) Fusion processing method, device and medium based on unmanned vehicle
CN113611112B (en) Target association method, device, equipment and storage medium
CN110949380A (en) Location prediction for dynamic objects
Schubert et al. The role of multisensor environmental perception for automated driving
CN110807942B (en) Intelligent driving automobile track updating method and system
CN115014366A (en) Target fusion method and device, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220120

Address after: 430000 No. n3010, 3rd floor, R & D building, building n, Artificial Intelligence Science Park, Wuhan Economic and Technological Development Zone, Hubei Province

Applicant after: Lantu Automobile Technology Co.,Ltd.

Address before: 430056 No. 1 Dongfeng Avenue, Wuhan economic and Technological Development Zone, Hubei, Wuhan

Applicant before: Dongfeng Motor GROUP Co.,Ltd.

GR01 Patent grant