CN115993595A - Target tracking filtering method and device

Info

Publication number
CN115993595A
Authority
CN
China
Prior art keywords
data
motion state
state data
determining
correction coefficient
Legal status
Pending
Application number
CN202211502040.7A
Other languages
Chinese (zh)
Inventor
李坤乾
朱飞亚
吴童
顾翔
Current Assignee
Beijing Jingwei Hirain Tech Co Ltd
Original Assignee
Beijing Jingwei Hirain Tech Co Ltd
Application filed by Beijing Jingwei Hirain Tech Co Ltd
Priority to CN202211502040.7A
Publication of CN115993595A

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The application discloses a target tracking filtering method and device. The method comprises the following steps: acquiring first motion state data, second motion state data and third motion state data of a target object; determining the relative positional relationship among the first motion state data, the second motion state data and the third motion state data by taking any one of them as a reference; determining correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively according to the relative positional relationship; and determining target motion state data of the target object at the next moment according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data. Therefore, the track stability of the target object can be effectively improved in real time, a large amount of storage space is not required, and the algorithm complexity is low.

Description

Target tracking filtering method and device
Technical Field
The application belongs to the technical field of target detection and tracking, and particularly relates to a target tracking filtering method and device.
Background
With the development and progress of technology, it has been possible to determine the movement state of a target object based on movement state data acquired by a sensor.
However, the data collected by the sensor inevitably contains noise, so the motion state data collected by the sensor is generally filtered, and the motion state data of the object at the next moment is predicted based on the filtered motion state data. However, the filter parameters may be set unreasonably in the filtering process, the prediction model may be chosen improperly in the prediction process, and the motion state data acquired by the sensor may even contain abnormal jumps; all of these factors can lead to poor track stability of the target object.
In the related art, fluctuation in the filtered motion state data is usually slowed down by performing a secondary smoothing process on it, which improves the track stability of the target object to a certain extent. However, because a certain amount of data needs to be stored to perform the secondary smoothing process, a certain time delay exists, so the track stability of the target object cannot be effectively improved in real time, and more storage space is occupied.
Disclosure of Invention
The embodiments of the application provide a target tracking filtering method and device, which can effectively improve the track stability of a target object in real time, do not occupy a large amount of storage space, and have low algorithm complexity.
In a first aspect, an embodiment of the present application provides a target tracking filtering method, including:
acquiring first motion state data, second motion state data and third motion state data of the target object, wherein the second motion state data is obtained by tracking and filtering the first motion state data, the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data,
determining a relative positional relationship of the first motion state data, the second motion state data, and the third motion state data based on any one of the first motion state data, the second motion state data, and the third motion state data,
determining correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively according to the relative position relation,
And determining target motion state data of the target object at the next moment according to the correction coefficient, the first motion state data, the second motion state data and the third motion state data.
In a second aspect, an embodiment of the present application provides a target tracking filtering apparatus, including:
an acquisition module for acquiring first motion state data, second motion state data and third motion state data of the target object, wherein the second motion state data is obtained by tracking and filtering the first motion state data, the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data,
a first determination module for determining a relative positional relationship of the first motion state data, the second motion state data, and the third motion state data based on any one of the first motion state data, the second motion state data, and the third motion state data,
a second determining module for determining correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively according to the relative position relation,
And the third determining module is used for determining target motion state data of the target object at the next moment according to the correction coefficient, the first motion state data, the second motion state data and the third motion state data.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory storing computer program instructions,
the processor, when executing the computer program instructions, implements the object tracking filtering method as shown in any of the embodiments of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer storage medium having stored thereon computer program instructions which, when executed by a processor, implement the target tracking filtering method shown in any of the embodiments of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, instructions in which, when executed by a processor of an electronic device, cause the electronic device to perform the object tracking filtering method shown in any one of the embodiments of the first aspect.
According to the target tracking filtering method and device, the first motion state data, the second motion state data and the third motion state data of the target object can be obtained, the relative position relation among the first motion state data, the second motion state data and the third motion state data is determined, correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively are determined according to the relative position relation, the target motion state data of the target object at the next moment is determined according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data, and then the target object is controlled based on the target motion state data. The second motion state data is obtained by tracking and filtering the first motion state data, and the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data. The relative position relation of the first motion state data, the second motion state data and the third motion state data can reflect real-time track deviation, so that the first motion state data, the second motion state data and the third motion state data are corrected according to the correction coefficient determined based on the relative position relation, the real-time track deviation can be corrected in a targeted mode, a certain amount of data are not required to be stored for secondary smoothing processing, the track stability of a target object can be effectively improved in real time, a large amount of storage space is not required to be occupied, and algorithm complexity is low.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described, and it is possible for a person skilled in the art to obtain other drawings according to these drawings without inventive effort.
Fig. 1 is a flow chart of a target tracking filtering method provided in an embodiment of the present application,
Fig. 2 is a graph showing the variation of the correction coefficient with the relative deviation according to an embodiment of the present application,
Fig. 3 is a graphical representation of an experimental result provided in an embodiment of the present application,
Fig. 4 is a schematic structural diagram of a target tracking filtering device according to an embodiment of the present application,
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
Features and exemplary embodiments of various aspects of the present application are described in detail below to make the objects, technical solutions and advantages of the present application more apparent, and to further describe the present application in conjunction with the accompanying drawings and the detailed embodiments. It should be understood that the specific embodiments described herein are intended to be illustrative of the application and are not intended to be limiting. It will be apparent to one skilled in the art that the present application may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present application by showing examples of the present application.
It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
As described in the background, in related fields such as robot navigation and intelligent assisted driving, sensors, for example radars, are indispensable data acquisition devices. In particular, in intelligent driving sensor systems, the millimeter wave radar has good velocity measurement capability for a target and good penetration of interference factors in complex environments such as rain and fog, and has therefore become one of the irreplaceable sensor choices in intelligent assisted driving schemes. Based on the basic principle of radar detection, and considering the interference of factors such as noise in the detection process, the detection data of sensors such as radar are inevitably affected by measurement noise and similar factors, so the measurement data exhibit certain changes or fluctuations. In engineering use, sensors such as radar are used to detect and track targets in the driving scene, and the accuracy of target detection and tracking has a great influence on track stability, making it an important factor affecting the performance of an intelligent assisted driving system.
In order to improve track stability, a conventional target detection and tracking algorithm generally uses a filtering algorithm to filter the motion state data of the target object acquired by the sensor, and predicts the motion state data of the target object at the next moment based on the filtered motion state data. However, in the filtering process there are often problems such as unreasonable filter parameter settings or a mismatch between the prediction motion model and the actually detected motion of the target object, and the acquired motion state data may even contain abnormal jump values (i.e., singular values). These factors can cause the result output by the filtering algorithm to fluctuate to a certain extent or to exhibit abnormal deviations and jumps, which manifests as still-poor track stability of the target object.
In the related art, a common strategy for this phenomenon is to perform secondary smoothing on the filtered motion state data to slow down fluctuations in the filtering result and thereby improve the track stability of the target object. However, this strategy needs to store a certain amount of filtered motion state data to perform the secondary smoothing, which increases data storage and introduces a certain time delay, so the result of the secondary smoothing cannot be output in real time. Therefore, the method adopted in the related art cannot identify fluctuations or deviations of the filtering result in real time, and cannot effectively improve the track stability of the target object in real time.
The embodiment of the application provides a target tracking filtering method and device, which can acquire first motion state data, second motion state data and third motion state data of a target object, determine the relative position relation of the first motion state data, the second motion state data and the third motion state data, determine correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively according to the relative position relation, determine the target motion state data of the target object at the next moment according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data, and then control the target object based on the target motion state data. The second motion state data is obtained by tracking and filtering the first motion state data, and the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data. The relative position relation of the first motion state data, the second motion state data and the third motion state data can reflect real-time track deviation, so that the first motion state data, the second motion state data and the third motion state data are corrected according to the correction coefficient determined based on the relative position relation, the real-time track deviation can be corrected in a targeted mode, a certain amount of data are not required to be stored for secondary smoothing processing, the track stability of a target object can be effectively improved in real time, a large amount of storage space is not required to be occupied, and algorithm complexity is low.
Fig. 1 shows a flowchart of a target tracking filtering method according to an embodiment of the present application.
As shown in fig. 1, the subject of the target tracking filtering method may be a target tracking filtering apparatus, and the target tracking filtering method may include the steps of:
s110, acquiring first motion state data, second motion state data and third motion state data of the target object,
s120, based on the first motion state data, the second motion state data and the third motion state data, determining the relative position relationship of the first motion state data, the second motion state data and the third motion state data by taking any one of the first motion state data, the second motion state data and the third motion state data as a reference,
s130, according to the relative position relation, determining the correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively,
and S140, determining target motion state data of the target object at the next moment according to the correction coefficient, the first motion state data, the second motion state data and the third motion state data.
Therefore, the first motion state data, the second motion state data and the third motion state data of the target object can be obtained, the relative position relation among the first motion state data, the second motion state data and the third motion state data is determined, the correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively are determined according to the relative position relation, the target motion state data of the target object at the next moment is determined according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data, and then the target object is controlled based on the target motion state data. The second motion state data is obtained by tracking and filtering the first motion state data, and the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data. The relative position relation of the first motion state data, the second motion state data and the third motion state data can reflect real-time track deviation, so that the first motion state data, the second motion state data and the third motion state data are corrected according to the correction coefficient determined based on the relative position relation, the real-time track deviation can be corrected in a targeted mode, a certain amount of data are not required to be stored for secondary smoothing processing, the track stability of a target object can be effectively improved in real time, a large amount of storage space is not required to be occupied, and algorithm complexity is low.
Referring to S110, the first motion state data may include, but is not limited to, motion state data such as a position and a speed of the target object at the current time. The second motion state data may be obtained by performing tracking filtering processing on the first motion state data, and the third motion state data may be obtained by predicting a motion state of the target object at a next moment based on the second motion state data.
In some embodiments, in order to more accurately determine the first motion state data, the second motion state data, and the third motion state data, S110 may include:
first motion state data acquired by the sensor is acquired,
tracking and filtering the first motion state data to obtain second motion state data,
and inputting the second motion state data into a target motion state model, predicting the motion state of the target object at the next moment, and outputting to obtain third motion state data.
The first motion state data may be subjected to tracking filtering processing by a filtering algorithm such as mean filtering, Kalman filtering or alpha-beta filtering to obtain the second motion state data. The motion state of the target object at the next moment may be predicted through a constant velocity model or a constant acceleration model.
Of course, other filtering algorithms and motion state models may be used, and are not limited in this regard.
In this way, the first movement state data, the second movement state data, and the third movement state data can be determined more accurately.
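As a non-limiting illustration, the following Python sketch shows one way the three kinds of motion state data could be produced for a one-dimensional quantity such as longitudinal position. The alpha-beta gains, cycle time, and numeric values are assumptions made only for this sketch; the application itself merely names mean filtering, Kalman filtering, alpha-beta filtering, and constant-velocity/constant-acceleration models as options.

```python
# Illustrative sketch (assumed parameters): producing the first, second and
# third motion state data for a 1-D position measurement.

def alpha_beta_filter(z, x_prev, v_prev, dt, alpha=0.5, beta=0.1):
    """One alpha-beta tracking-filter step.

    z       : first motion state data (raw sensor measurement, e.g. position)
    x_prev  : previous filtered position
    v_prev  : previous filtered velocity
    returns : (x_filt, v_filt) -- the second motion state data
    """
    # Predict forward with a constant-velocity model
    x_pred = x_prev + v_prev * dt
    # Correct with the measurement residual
    r = z - x_pred
    x_filt = x_pred + alpha * r
    v_filt = v_prev + (beta / dt) * r
    return x_filt, v_filt

def predict_next(x_filt, v_filt, dt):
    """Third motion state data: constant-velocity prediction for the next moment."""
    return x_filt + v_filt * dt

# Example usage (all values illustrative)
dt = 0.05                      # sensor cycle time (assumed)
z = 10.3                       # first motion state data from the sensor
x_filt, v_filt = alpha_beta_filter(z, x_prev=10.0, v_prev=5.0, dt=dt)
x_pred_next = predict_next(x_filt, v_filt, dt)   # third motion state data
```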
Referring to S120, the relative positional relationship of the first, second, and third motion state data on the coordinate axis may reflect the relative deviation state between the first, second, and third motion state data.
If the first motion state data is X₂, the second motion state data is X₃, and the third motion state data is X₁, the relative positional relationships corresponding to the various deviation states between the first, second, and third motion state data may be as shown in Table 1.
Table 1. Relative positional relationships
State   Relative positional relationship (order on the coordinate axis)
1       X₁  X₂  X₃
2       X₁  X₃  X₂
3       X₂  X₁  X₃
4       X₃  X₁  X₂
5       X₂  X₃  X₁
6       X₃  X₂  X₁
In some deviation states, the second motion state data X₃ clearly deviates from the first motion state data X₂ by a larger distance, for example states 3 and 4 in Table 1. For such deviation states, if the second motion state data X₃ is directly output as the result, the track obviously deviates seriously from the current position, resulting in poor track stability of the target, so cases similar to states 3 and 4 require special optimization processing; for the other deviation states in Table 1, correction compensation can be performed based on the specific relative deviation amounts.
In some embodiments, in order to more accurately determine the relative positional relationship of the first motion state data, the second motion state data, and the third motion state data, S120 may include:
determining a first relative deviation Δ₁ = X₁ − X₂ between the first data X₁ and the second data X₂, and determining a second relative deviation Δ₂ = X₁ − X₃ between the first data X₁ and the third data X₃,
determining the product k = Δ₁ × Δ₂ of the first relative deviation amount and the second relative deviation amount,
determining the relative positional relationship from the product,
wherein the first data may be any one of the first motion state data, the second motion state data, and the third motion state data, the second data may be any one of the first motion state data, the second motion state data, and the third motion state data except the first data, and the third data may be the remaining one of the first motion state data, the second motion state data, and the third motion state data except the first data and the second data.
Here, with any one of the first motion state data, the second motion state data, and the third motion state data taken as a reference, the relative positional relationship of the three on the coordinate axis may be determined from the product of the relative deviation amounts between the reference data and the other two data.
For example, the third motion state data may be used as the reference, that is, the first data X₁ may be the third motion state data. In addition, the second data X₂ may be the first motion state data, and the third data X₃ may be the second motion state data.
First, the first relative deviation Δ₁ between X₁ and X₂ and the second relative deviation Δ₂ between X₁ and X₃ can be determined, specifically:
Δ₁ = X₁ − X₂
Δ₂ = X₁ − X₃
Second, the product k of the first relative deviation Δ₁ and the second relative deviation Δ₂ can be determined, specifically:
k = Δ₁ × Δ₂
Then, the relative positional relationship can be determined from the product k.
In this way, the relative positional relationship of the first movement state data, the second movement state data, and the third movement state data on the coordinate axis can be more accurately determined by the product of the first relative deviation amount and the second relative deviation amount.
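As a minimal Python sketch of this step (variable names are illustrative, not terminology from the application), following the reference assignment in the example above (X₁ = prediction, X₂ = measurement, X₃ = filtered value):

```python
def relative_deviations(x1, x2, x3):
    """Relative deviations with the first data x1 taken as the reference.

    x1: first data (here the prediction), x2: second data, x3: third data.
    Returns (delta1, delta2, k).
    """
    delta1 = x1 - x2      # first relative deviation  Δ1 = X1 - X2
    delta2 = x1 - x3      # second relative deviation Δ2 = X1 - X3
    k = delta1 * delta2   # the sign of k encodes the relative positional relationship
    return delta1, delta2, k
```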
In some embodiments, to more clearly determine the relative positional relationship of the first motion state data, the second motion state data, and the third motion state data, determining the relative positional relationship according to the product may include:
in the case where the product k >0, the relative positional relationship is determined such that the second data and the third data are located on the same side of the first data,
In the case of the product k <0, the relative positional relationship is determined such that the second data and the third data are located on different sides of the first data,
in the case where the product k=0, the relative positional relationship is determined such that at least one of the second data and the third data coincides with the first data.
Here, the relative positional relationship of the first data, the second data, and the third data on the coordinate axis can be classified into three types based on the first data: the second data and the third data are located on the same side of the first data, the second data and the third data are located on different sides of the first data, and at least one of the second data and the third data coincides with the first data.
Specifically, the first relative deviation amount is a relative deviation amount between the first data and the second data, and the second relative deviation amount is a relative deviation amount between the first data and the third data. If the product of the first relative deviation amount and the second relative deviation amount is greater than 0, it may be indicated that the first relative deviation amount is the same sign as the second relative deviation amount, and thus it may be determined that the second data and the third data are on the same side of the first data. If the product of the first relative deviation and the second relative deviation is less than 0, it may be indicated that the sign of the first relative deviation is opposite to the sign of the second relative deviation, and thus it may be determined that the second data and the third data are on different sides of the first data. If the product of the first relative deviation amount and the second relative deviation amount is equal to 0, it may be indicated that at least one of the first relative deviation amount and the second relative deviation amount is 0, and thus it may be determined that at least one of the second data and the third data coincides with the first data.
Illustratively, if k > 0, the relative positional relationship is that X₂ and X₃ are located on the same side of X₁, belonging to one of states 1, 2, 5, and 6 in Table 1; if k < 0, the relative positional relationship is that X₂ and X₃ are located on different sides of X₁, belonging to one of states 3 and 4 in Table 1; if k = 0, at least one of X₂ and X₃ coincides with X₁ (not shown in Table 1).
Further, for k < 0, the signs of Δ₁ and Δ₂ can be used to further distinguish whether the situation belongs to state 3 or state 4 in Table 1: if Δ₁ > 0 and Δ₂ < 0, it belongs to state 3; if Δ₁ < 0 and Δ₂ > 0, it belongs to state 4. Of course, in the target tracking filtering method provided in the embodiments of the present application, it is not necessary to determine whether the situation specifically belongs to state 3 or state 4.
In this way, the relative positional relationship of the first motion state data, the second motion state data, and the third motion state data on the coordinate axis can be more clearly determined by the sign of the product of the first relative deviation amount and the second relative deviation amount.
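Written as a simple sign test in Python (the return labels below are descriptive names chosen for this sketch, not terminology from the application):

```python
def classify_relative_position(k):
    """Classify the relative positional relationship from k = Δ1 * Δ2."""
    if k > 0:
        return "same_side"        # second and third data on the same side of the first data
    if k < 0:
        return "different_sides"  # second and third data on different sides of the first data
    return "coincident"           # at least one of them coincides with the first data
```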
Referring to S130, after determining the relative positional relationship of the first motion state data, the second motion state data, and the third motion state data, correction coefficients corresponding to the first motion state data, the second motion state data, and the third motion state data, respectively, may be determined according to the relative positional relationship.
In some embodiments, in order to more effectively correct the motion state data, the step S130 may include:
in the case that the relative position relationship is that the second data and the third data are positioned on the same side of the first data, determining that the first correction coefficient corresponding to the first data is 1, determining the second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient and the first relative deviation amount, and determining the third correction coefficient corresponding to the third data according to the preset adjustment coefficient, the preset scaling coefficient and the second relative deviation amount,
under the condition that the relative position relationship is that the second data and the third data are positioned on different sides of the first data, determining that the first correction coefficient corresponding to the first data is 1, the third correction coefficient corresponding to the third data is 0, determining the second correction coefficient corresponding to the second data according to the preset adjusting coefficient, the preset scaling coefficient and the first relative deviation,
when the relative positional relationship is that at least one of the second data and the third data coincides with the first data, it is determined that the first correction coefficient corresponding to the first data and the third correction coefficient corresponding to the third data are both 1 and the second correction coefficient corresponding to the second data is 0.
Here, the first data is used as a reference, and therefore the first correction coefficient corresponding to the first data may be 1. Then, the second correction coefficient corresponding to the second data and the third correction coefficient corresponding to the third data may be determined in different manners based on different relative positional relationships.
If the relative position relationship is that the second data and the third data are located on the same side of the first data, the correction coefficients of the second data and the third data can be determined based on the preset adjustment coefficient, the preset scaling coefficient and the relative deviation amount.
The specific values of the preset adjusting coefficient and the preset scaling coefficient can be determined after debugging according to actual conditions.
If the relative positional relationship is that the second data and the third data are located on different sides of the first data, the third data crosses the first data and deviates seriously from the second data; no matter whether the situation belongs to state 3 or state 4 in Table 1, the third correction coefficient corresponding to the third data can be determined to be 0.
If the relative positional relationship is that at least one of the second data and the third data overlaps with the first data, the third data may be directly used as the target motion state data of the target object at the next moment without correction, so that the second correction coefficient corresponding to the second data may be determined to be 0, and the third correction coefficient corresponding to the third data may be determined to be 1.
In this way, the correction coefficient is determined in different ways based on different relative positional relationships, and the correction can be performed in a targeted manner based on the real-time deviation state.
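A minimal Python sketch of the case-by-case selection is given below. The concrete coefficient function is left as a parameter because the closed-form expressions for the second and third correction coefficients appear only as formula images in the original publication; an assumed form is sketched after the formula discussion further below.

```python
def select_correction_coefficients(delta1, delta2, coeff):
    """Pick (first, second, third) correction coefficients for the three cases.

    `coeff` is a callable mapping a relative deviation to a correction
    coefficient in terms of the preset adjustment and scaling coefficients.
    """
    k = delta1 * delta2
    if k > 0:                                  # second and third data on the same side
        return 1.0, coeff(delta1), coeff(delta2)
    if k < 0:                                  # different sides: drop the third data
        return 1.0, coeff(delta1), 0.0
    return 1.0, 0.0, 1.0                       # coincident case
```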
In some embodiments, to more accurately determine the correction coefficient, the preset adjustment coefficient may be any value not less than 0.5 and not more than 2, and the preset scaling coefficient may be any value not less than 0.1 and not more than 0.5.
Here, the preset adjustment coefficient may be used to correct the case where the relative deviation amount is small, and the preset scaling coefficient may be used to correct the case where the relative deviation is large.
Therefore, the abnormal value of the generated correction coefficient can be avoided through the preset adjusting coefficient and the preset scaling coefficient, so that the correction coefficient can be determined more accurately.
In some embodiments, to improve track stability of the target object, determining the second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient, and the first relative deviation amount may include:
determining the second correction coefficient based on the following formula:
[formula presented as an image in the original publication: the second correction coefficient a expressed in terms of the preset adjustment coefficient ω, the preset scaling coefficient α, and |Δ₁|]
determining a third correction coefficient corresponding to the third data according to the preset adjustment coefficient, the preset scaling coefficient and the second relative deviation amount, including:
determining the third correction coefficient based on the following formula:
[formula presented as an image in the original publication: the third correction coefficient b expressed in terms of ω, α, and |Δ₂|]
wherein a may be the second correction coefficient, b may be the third correction coefficient, Δ₁ may be the first relative deviation amount, Δ₂ may be the second relative deviation amount, ω may be the preset adjustment coefficient, α may be the preset scaling coefficient, and e may be the natural constant.
Here, a may be the second correction coefficient corresponding to the second data, and |Δ₁| may be the absolute value of the first relative deviation of the second data from the first data. The larger |Δ₁| is, the more the second data deviates from the first data, and therefore the second correction coefficient a of the second data needs to be reduced. Based on the preset adjustment coefficient ω and the preset scaling coefficient α, a can decrease as |Δ₁| increases and increase as |Δ₁| decreases.
b may be the third correction coefficient corresponding to the third data, and |Δ₂| may be the absolute value of the second relative deviation of the third data from the first data. The larger |Δ₂| is, the more the third data deviates from the first data, and therefore the third correction coefficient b of the third data needs to be reduced. Based on the preset adjustment coefficient ω and the preset scaling coefficient α, b can decrease as |Δ₂| increases and increase as |Δ₂| decreases.
Illustratively, if k > 0, the correction coefficient corresponding to X₁ may be 1, the correction coefficient corresponding to X₂ may be a (given by the formula above in terms of ω, α and |Δ₁|), and the correction coefficient corresponding to X₃ may be b (given by the formula above in terms of ω, α and |Δ₂|).
If k < 0, the correction coefficient corresponding to X₁ may be 1, the correction coefficient corresponding to X₂ may be a, and the correction coefficient corresponding to X₃ may be 0.
If k = 0, the correction coefficient corresponding to X₁ may be 1, the correction coefficient corresponding to X₂ may be 0, and the correction coefficient corresponding to X₃ may be 1.
In this way, by negatively correlating the correction coefficient with the absolute value of the relative deviation amount, the data having a large deviation can be made to have a small specific gravity, and the data having a small deviation can be made to have a large specific gravity, thereby contributing to an improvement in track stability of the target object.
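The exact expressions for a and b cannot be recovered here because they appear only as formula images in the original publication. The sketch below therefore assumes a decaying-exponential form ω·e^(−α·|Δ|), chosen only because it matches the behaviour described above (the coefficient decreases as the absolute deviation increases and is shaped by the preset adjustment coefficient ω and preset scaling coefficient α); ω = 0.75 and α = 0.2 are the example values given later in this description.

```python
import math

def exp_coeff(delta, omega=0.75, alpha=0.2):
    """Assumed stand-in for the correction-coefficient formula shown as an
    image in the original publication: omega * e**(-alpha * |delta|).
    Only the qualitative behaviour (smaller coefficient for larger |delta|)
    is taken from the text; the exact expression is not recoverable here."""
    return omega * math.exp(-alpha * abs(delta))

# Combined with the case selection sketched earlier, e.g.:
#   c1, c2, c3 = select_correction_coefficients(d1, d2, exp_coeff)
```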
In S140, if the first data is the third motion state data, the second data is the first motion state data, and the third data is the second motion state data, the third motion state data may be corrected based on the first correction coefficient, the first motion state data may be corrected based on the second correction coefficient, the second motion state data may be corrected based on the third correction coefficient, and the target motion state data of the target at the next time may be determined.
In some embodiments, in order to more accurately determine the target motion state data, thereby improving the track stability of the target object, S140 may include:
Determining weights corresponding to the first data, the second data and the third data respectively according to the first correction coefficient, the second correction coefficient and the third correction coefficient,
and based on the weight, carrying out weighted calculation on the first data, the second data and the third data to obtain target motion state data.
In some embodiments, in order to determine weights corresponding to the first data, the second data, and the third data more accurately, determining weights corresponding to the first data, the second data, and the third data according to the first correction coefficient, the second correction coefficient, and the third correction coefficient may include:
the first weight corresponding to the first data is determined by a formula presented as an image in the original publication, expressed in terms of a and b,
the second weight corresponding to the second data is determined by a corresponding formula, likewise presented as an image in the original publication,
the third weight corresponding to the third data is determined by a corresponding formula, likewise presented as an image in the original publication,
where a may be the second correction coefficient and b may be the third correction coefficient.
In this way, the weights corresponding to the first data, the second data and the third data respectively can be determined more accurately through the above-mentioned process.
In some embodiments, to determine the target motion state data more accurately, the weighting calculation is performed on the first data, the second data, and the third data based on the weights, so as to obtain the target motion state data, which may include:
determining the target motion state data based on a weighting formula presented as an image in the original publication,
wherein X may be the target motion state data, X₁ may be the first data, X₂ may be the second data, and X₃ may be the third data.
For example, if the preset adjustment coefficient ω = 0.75 and the preset scaling coefficient α = 0.2, the variation curves of the correction coefficients with different relative deviation amounts may be as shown in Fig. 2. X₁, X₂, X₃ and X may be as shown in Fig. 3. As can be seen from the curves, in the related art, directly taking the filtered data X₃ as the final motion state data yields a curve that fluctuates greatly, so the track stability is poor. The curve of the target motion state data X determined based on the target tracking filtering method provided in the embodiments of the present application is obviously smoother, and the track stability is obviously and effectively improved.
Therefore, the weight of each motion state data is respectively determined based on the correction coefficient corresponding to each motion state data, and then weighted summation is carried out, so that the target motion state data of the target object at the next moment can be more accurately determined, and the track stability of the target object can be effectively improved.
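The weight and fusion formulas are likewise only available as images in the original publication. A natural reading of the surrounding text (a weighted summation whose weights are derived from the correction coefficients 1, a and b) is simple normalization, which the sketch below assumes; the numeric values are illustrative and roughly what the assumed exponential form above would give for small deviations.

```python
def fuse_motion_state(x1, x2, x3, c1, c2, c3):
    """Weighted fusion of the first, second and third data into the target
    motion state X. Assumes (not confirmed by the recoverable text) that each
    weight is its correction coefficient normalized by the sum:
        w_i = c_i / (c1 + c2 + c3),  X = w1*X1 + w2*X2 + w3*X3
    """
    s = c1 + c2 + c3   # c1 is always 1, so s is never zero
    return (c1 * x1 + c2 * x2 + c3 * x3) / s

# Illustrative usage: X1 = prediction, X2 = measurement, X3 = filtered value,
# with coefficients (1, a, b) as produced by the sketches above.
x_target = fuse_motion_state(10.6, 10.3, 10.1, 1.0, 0.71, 0.68)
```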
The target tracking filtering method provided by the embodiments of the application can rapidly identify the deviation state of the target track and set targeted correction coefficients for the identified deviation state. The implementation of the whole algorithm involves little calculation, occupies little data storage space, requires no complex computation, and has strong engineering feasibility. Compared with track stability optimization algorithms requiring complex calculation in the related art, the algorithm in the target tracking filtering method provided by the embodiments of the application is greatly improved in terms of real-time performance, algorithm complexity, data calculation amount, and the like, and can perform targeted correction and compensation rapidly and in real time, so that the track stability in the target tracking process is effectively improved.
Based on the same inventive concept, the embodiment of the application also provides a target tracking filtering device. The following describes the target tracking filter apparatus provided in the embodiment of the present application in detail with reference to fig. 4.
Fig. 4 shows a schematic structural diagram of an object tracking filtering device according to an embodiment of the present application.
As shown in fig. 4, the object tracking filtering apparatus may include:
an obtaining module 401, configured to obtain first motion state data, second motion state data, and third motion state data of the target object, where the second motion state data is obtained by performing tracking filtering processing on the first motion state data, the third motion state data is obtained by predicting a motion state of the target object at a next moment based on the second motion state data,
a first determining module 402, configured to determine a relative positional relationship of the first motion state data, the second motion state data, and the third motion state data based on any one of the first motion state data, the second motion state data, and the third motion state data,
a second determining module 403, configured to determine correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively according to the relative positional relationship,
The third determining module 404 is configured to determine target motion state data of the target object at a next moment according to the correction coefficient, the first motion state data, the second motion state data, and the third motion state data.
Therefore, the first motion state data, the second motion state data and the third motion state data of the target object can be obtained, the relative position relation among the first motion state data, the second motion state data and the third motion state data is determined, the correction coefficients corresponding to the first motion state data, the second motion state data and the third motion state data respectively are determined according to the relative position relation, and the target motion state data of the target object at the next moment is determined according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data. The second motion state data is obtained by tracking and filtering the first motion state data, and the third motion state data is obtained by predicting the motion state of the target object at the next moment based on the second motion state data. The relative position relation of the first motion state data, the second motion state data and the third motion state data can reflect real-time track deviation, so that the first motion state data, the second motion state data and the third motion state data are corrected according to the correction coefficient determined based on the relative position relation, the real-time track deviation can be corrected in a targeted mode, a certain amount of data are not required to be stored for secondary smoothing processing, the track stability of a target object can be effectively improved in real time, a large amount of storage space is not required to be occupied, and algorithm complexity is low.
In some embodiments, to more accurately determine the relative positional relationship of the first, second, and third motion state data, the first determination module 402 may include:
a first determination submodule, configured to determine a first relative deviation Δ₁ = X₁ − X₂ between the first data X₁ and the second data X₂, and determine a second relative deviation Δ₂ = X₁ − X₃ between the first data X₁ and the third data X₃,
a second determination submodule, configured to determine the product k = Δ₁ × Δ₂ of the first relative deviation amount and the second relative deviation amount,
A third determining sub-module for determining a relative positional relationship based on the product,
the first data is any one of first motion state data, second motion state data and third motion state data, the second data is any one of the first motion state data, the second motion state data and the third motion state data except the first data, and the third data is the data except the first data and the second data in the first motion state data, the second motion state data and the third motion state data.
In some embodiments, to more clearly determine the relative positional relationship of the first motion state data, the second motion state data, and the third motion state data on the coordinate axes, the third determination submodule may include:
A first determining unit for determining, in the case of the product k >0, that the relative positional relationship is such that the second data and the third data are located on the same side of the first data,
a second determining unit for determining, in the case of the product k <0, that the relative positional relationship is such that the second data and the third data are located on different sides of the first data,
and a third determination unit configured to determine, in a case where the product k=0, that the relative positional relationship is such that at least one of the second data and the third data coincides with the first data.
In some embodiments, to more effectively correct the motion state data, the second determination module 403 may include:
a fourth determining sub-module for determining that the first correction coefficient corresponding to the first data is 1 when the relative positional relationship is that the second data and the third data are located on the same side of the first data, determining the second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient and the first relative deviation amount, and determining the third correction coefficient corresponding to the third data according to the preset adjustment coefficient, the preset scaling coefficient and the second relative deviation amount,
a fifth determining sub-module for determining that the first correction coefficient corresponding to the first data is 1 and the third correction coefficient corresponding to the third data is 0 when the relative positional relationship is that the second data and the third data are located at different sides of the first data, and determining the second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient and the first relative deviation amount,
And the sixth determining submodule is used for determining that the first correction coefficient corresponding to the first data and the third correction coefficient corresponding to the third data are both 1 and the second correction coefficient corresponding to the second data is 0 when the relative position relation is that at least one of the second data and the third data is coincident with the first data.
In some embodiments, to more accurately determine the target motion state data, thereby improving the track stability of the target, the third determination module 404 may include:
a seventh determining submodule for determining weights corresponding to the first data, the second data and the third data respectively according to the first correction coefficient, the second correction coefficient and the third correction coefficient,
and the calculation sub-module is used for carrying out weighted calculation on the first data, the second data and the third data based on the weights to obtain target motion state data.
In some embodiments, to improve stability of the target track, the fourth determination submodule and the fifth determination submodule may be specifically configured to:
determining the second correction coefficient based on the formula described above (presented as an image in the original publication),
the fourth determination submodule may in particular also be used for:
determining the third correction coefficient based on the formula described above (presented as an image in the original publication),
wherein a is the second correction coefficient, b is the third correction coefficient, Δ₁ is the first relative deviation amount, Δ₂ is the second relative deviation amount, ω is the preset adjustment coefficient, α is the preset scaling coefficient, and e is the natural constant.
In some embodiments, to more accurately determine the correction coefficient, the preset adjustment coefficient may be any value not less than 0.5 and not more than 2, and the preset scaling coefficient may be any value not less than 0.1 and not more than 0.5.
In some embodiments, in order to more accurately determine weights respectively corresponding to the first data, the second data, and the third data, the seventh determining submodule may include:
a fourth determining unit, configured to determine the first weight corresponding to the first data by a formula presented as an image in the original publication, expressed in terms of a and b,
a fifth determining unit, configured to determine the second weight corresponding to the second data by a corresponding formula, likewise presented as an image in the original publication,
a sixth determining unit, configured to determine the third weight corresponding to the third data by a corresponding formula, likewise presented as an image in the original publication,
wherein a is the second correction coefficient and b is the third correction coefficient.
In some embodiments, to more accurately determine the target motion state data, the computing sub-module may include:
a calculation unit, configured to determine the target motion state data based on a weighting formula presented as an image in the original publication,
wherein X is the target motion state data.
Fig. 5 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
As shown in Fig. 5, the electronic device 5 can implement the target tracking filtering method and the target tracking filtering apparatus in the embodiments of the present application; the figure shows an exemplary hardware architecture of such an electronic device. The electronic device may refer to the electronic device in the embodiments of the present application.
The electronic device 5 may comprise a processor 501 and a memory 502 storing computer program instructions.
In particular, the processor 501 may include a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or may be configured to implement one or more integrated circuits of embodiments of the present application.
Memory 502 may include mass storage for data or instructions. By way of example, and not limitation, memory 502 may comprise a Hard Disk Drive (HDD), floppy Disk Drive, flash memory, optical Disk, magneto-optical Disk, magnetic tape, or universal serial bus (Universal Serial Bus, USB) Drive, or a combination of two or more of the foregoing. Memory 502 may include removable or non-removable (or fixed) media, where appropriate. Memory 502 may be internal or external to the integrated gateway disaster recovery device, where appropriate. In a particular embodiment, the memory 502 is a non-volatile solid state memory. In particular embodiments, memory 502 may include Read Only Memory (ROM), random Access Memory (RAM), magnetic disk storage media devices, optical storage media devices, flash memory devices, electrical, optical, or other physical/tangible memory storage devices. Thus, in general, memory 502 includes one or more tangible (non-transitory) computer-readable storage media (e.g., memory devices) encoded with software comprising computer-executable instructions and when the software is executed (e.g., by one or more processors) it is operable to perform the operations described with reference to a method according to an aspect of the present application.
The processor 501 implements any one of the target tracking filtering methods of the above embodiments by reading and executing computer program instructions stored in the memory 502.
In one example, the electronic device may also include a communication interface 503 and a bus 504. As shown in fig. 5, the processor 501, the memory 502, and the communication interface 503 are connected to each other via the bus 504 and perform communication with each other.
The communication interface 503 is mainly used to implement communication between each module, apparatus, unit and/or device in the embodiments of the present application.
Bus 504 includes hardware, software, or both, that couple components of the electronic device to one another. By way of example, and not limitation, the buses may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an infiniband interconnect, a Low Pin Count (LPC) bus, a memory bus, a micro channel architecture (MCa) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, a Serial Advanced Technology Attachment (SATA) bus, a video electronics standards association local (VLB) bus, or other suitable bus, or a combination of two or more of the above. Bus 504 may include one or more buses, where appropriate. Although embodiments of the present application describe and illustrate a particular bus, the present application contemplates any suitable bus or interconnect.
The electronic device can execute the target tracking filtering method in the embodiment of the application, so that the target tracking filtering method and the device described in connection with fig. 1 to 4 are realized.
In addition, in combination with the target tracking filtering method in the above embodiment, the embodiment of the application may be implemented by providing a computer storage medium. The computer storage medium has stored thereon computer program instructions which, when executed by a processor, implement any of the target tracking filtering methods of the above embodiments.
It should be clear that the present application is not limited to the particular arrangements and processes described above and illustrated in the drawings. For the sake of brevity, a detailed description of known methods is omitted here. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present application are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications, and additions, or change the order between steps, after appreciating the spirit of the present application.
The functional blocks shown in the above-described structural block diagrams may be implemented in hardware, software, firmware, or a combination thereof. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, a plug-in, a function card, or the like. When implemented in software, the elements of the present application are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine readable medium or transmitted over transmission media or communication links by a data signal carried in a carrier wave. A "machine-readable medium" may include any medium that can store or transfer information. Examples of machine-readable media include electronic circuitry, semiconductor memory devices, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, fiber optic media, radio Frequency (RF) links, and the like. The code segments may be downloaded via computer networks such as the internet, intranets, etc.
It should also be noted that the exemplary embodiments mentioned in this application describe some methods or systems based on a series of steps or devices. However, the present application is not limited to the order of the steps described above; that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from that in the embodiments, or several steps may be performed simultaneously.
Aspects of the present application are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such a processor may be, but is not limited to being, a general purpose processor, a special purpose processor, an application specific processor, or a field programmable logic circuit. It will also be understood that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware which performs the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In the foregoing, only the specific embodiments of the present application are described, and it will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, modules and units described above may refer to the corresponding processes in the foregoing method embodiments, which are not repeated herein. It should be understood that the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the present application, which are intended to be included in the scope of the present application.

Claims (10)

1. A method of target tracking filtering, the method comprising:
acquiring first motion state data, second motion state data and third motion state data of a target object, wherein the second motion state data is obtained by performing tracking filtering processing on the first motion state data, and the third motion state data is obtained by predicting a motion state of the target object at a next moment based on the second motion state data;
determining a relative positional relationship among the first motion state data, the second motion state data and the third motion state data, with any one of the first motion state data, the second motion state data and the third motion state data as a reference;
determining correction coefficients respectively corresponding to the first motion state data, the second motion state data and the third motion state data according to the relative positional relationship;
and determining target motion state data of the target object at the next moment according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data.
2. The method according to claim 1, wherein the determining a relative positional relationship among the first motion state data, the second motion state data and the third motion state data, with any one of the first motion state data, the second motion state data and the third motion state data as a reference, comprises:
determining a first relative deviation Δ1 = X1 - X2 between first data X1 and second data X2, and determining a second relative deviation Δ2 = X1 - X3 between the first data X1 and third data X3;
determining a product k = Δ1 × Δ2 of the first relative deviation and the second relative deviation;
determining the relative positional relationship according to the product;
wherein the first data is any one of the first motion state data, the second motion state data and the third motion state data, the second data is any one of the first motion state data, the second motion state data and the third motion state data other than the first data, and the third data is the one of the first motion state data, the second motion state data and the third motion state data other than the first data and the second data.
3. The method of claim 2, wherein the determining the relative positional relationship according to the product comprises:
in the case where the product k > 0, determining that the relative positional relationship is that the second data and the third data are located on the same side of the first data;
in the case where the product k < 0, determining that the relative positional relationship is that the second data and the third data are located on different sides of the first data; and
in the case where the product k = 0, determining that the relative positional relationship is that at least one of the second data and the third data coincides with the first data.
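For illustration only (this sketch is not part of the claims): the following minimal Python listing shows the relative-position test of claims 2 and 3 for a single scalar component of the motion state data. The function and variable names are the editor's own and do not appear in the specification.

    def relative_position(x1: float, x2: float, x3: float) -> str:
        """Classify where x2 and x3 lie relative to the reference value x1."""
        delta_1 = x1 - x2         # first relative deviation,  Δ1 = X1 - X2
        delta_2 = x1 - x3         # second relative deviation, Δ2 = X1 - X3
        k = delta_1 * delta_2     # product of the two relative deviations
        if k > 0:
            return "same side"        # x2 and x3 lie on the same side of x1
        if k < 0:
            return "different sides"  # x2 and x3 lie on opposite sides of x1
        return "coincident"           # at least one of x2, x3 equals x1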
4. The method according to claim 3, wherein the determining correction coefficients respectively corresponding to the first motion state data, the second motion state data and the third motion state data according to the relative positional relationship comprises:
when the relative positional relationship is that the second data and the third data are located on the same side of the first data, determining that a first correction coefficient corresponding to the first data is 1, determining a second correction coefficient corresponding to the second data according to a preset adjustment coefficient, a preset scaling coefficient and the first relative deviation, and determining a third correction coefficient corresponding to the third data according to the preset adjustment coefficient, the preset scaling coefficient and the second relative deviation;
when the relative positional relationship is that the second data and the third data are located on different sides of the first data, determining that the first correction coefficient corresponding to the first data is 1 and the third correction coefficient corresponding to the third data is 0, and determining the second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient and the first relative deviation; and
when the relative positional relationship is that at least one of the second data and the third data coincides with the first data, determining that the first correction coefficient corresponding to the first data and the third correction coefficient corresponding to the third data are both 1, and that the second correction coefficient corresponding to the second data is 0.
5. The method of claim 4, wherein the determining target motion state data of the target object at the next moment according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data comprises:
determining weights respectively corresponding to the first data, the second data and the third data according to the first correction coefficient, the second correction coefficient and the third correction coefficient; and
performing a weighted calculation on the first data, the second data and the third data based on the weights to obtain the target motion state data.
6. The method of claim 4, wherein the determining a second correction coefficient corresponding to the second data according to the preset adjustment coefficient, the preset scaling coefficient and the first relative deviation comprises:
determining the second correction coefficient based on the following formula:
[formula image FDA0003968131340000034: expression for the second correction coefficient a]
and the determining a third correction coefficient corresponding to the third data according to the preset adjustment coefficient, the preset scaling coefficient and the second relative deviation comprises:
determining the third correction coefficient based on the following formula:
[formula image FDA0003968131340000035: expression for the third correction coefficient b]
wherein a is the second correction coefficient, b is the third correction coefficient, Δ1 is the first relative deviation, Δ2 is the second relative deviation, ω is the preset adjustment coefficient, α is the preset scaling coefficient, and e is the natural constant.
7. The method of claim 6, wherein the preset adjustment coefficient is any value not less than 0.5 and not greater than 2, and the preset scaling coefficient is any value not less than 0.1 and not greater than 0.5.
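Again for illustration only (not part of the claims): the exact expressions recited in claim 6 are contained in formula images that are not reproduced in this text, so the sketch below uses an assumed exponential decay in the magnitude of the relative deviation, consistent only with the variables named in claims 6 and 7 (ω, α, Δ1, Δ2, e); the actual claimed formulas may differ.

    import math

    def correction_coefficients(delta_1, delta_2, relation, omega=1.0, alpha=0.3):
        """Return (c1, a, b): correction coefficients of the first, second and
        third data, following the case split of claim 4.

        relation is "same side", "different sides" or "coincident"; omega and
        alpha are the preset adjustment and scaling coefficients (claim 7
        allows omega in [0.5, 2] and alpha in [0.1, 0.5]).  The exponential
        decay below is an editorial assumption, not the claimed formula.
        """
        c1 = 1.0  # the first data always keeps a correction coefficient of 1
        if relation == "coincident":
            return c1, 0.0, 1.0                      # claim 4, third branch
        a = omega * math.exp(-alpha * abs(delta_1))  # assumed form of claim 6
        if relation == "same side":                  # claim 4, first branch
            b = omega * math.exp(-alpha * abs(delta_2))
        else:                                        # claim 4, second branch
            b = 0.0
        return c1, a, b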
8. The method of claim 5, wherein the determining weights respectively corresponding to the first data, the second data and the third data according to the first correction coefficient, the second correction coefficient and the third correction coefficient comprises:
determining that a first weight corresponding to the first data is:
[formula image FDA0003968131340000031: expression for the first weight]
determining that the second weight corresponding to the second data is:
[formula image FDA0003968131340000032: expression for the second weight]
and determining that a third weight corresponding to the third data is:
[formula image FDA0003968131340000033: expression for the third weight]
wherein a is the second correction coefficient, and b is the third correction coefficient.
9. The method of claim 8, wherein the performing a weighted calculation on the first data, the second data and the third data based on the weights to obtain the target motion state data comprises:
determining the target motion state data based on the following formula:
[formula image FDA0003968131340000041: expression for the target motion state data X]
wherein X is the target motion state data.
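As with the previous sketch, the weight formulas of claim 8 and the fusion formula of claim 9 are only available as formula images; the Python listing below (not part of the claims) therefore assumes a simple normalization 1 / (1 + a + b) so that the three weights sum to one, which is the editor's assumption rather than the claimed expression.

    def fuse(x1, x2, x3, a, b):
        """Weighted fusion of the first, second and third data (claims 5, 8, 9).

        a and b are the second and third correction coefficients; the first
        data keeps a correction coefficient of 1.  The 1 / (1 + a + b)
        normalization is assumed, not quoted from the claims.
        """
        s = 1.0 + a + b
        w1, w2, w3 = 1.0 / s, a / s, b / s   # first, second and third weights
        return w1 * x1 + w2 * x2 + w3 * x3   # target motion state data X

Under this assumption, a = b = 0 returns the first (measured) data unchanged, while a = b = 1 averages the three data with equal weights.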
10. A target tracking filtering apparatus, the apparatus comprising:
an acquisition module, configured to acquire first motion state data, second motion state data and third motion state data of a target object, wherein the second motion state data is obtained by performing tracking filtering processing on the first motion state data, and the third motion state data is obtained by predicting a motion state of the target object at a next moment based on the second motion state data;
a first determining module, configured to determine a relative positional relationship among the first motion state data, the second motion state data and the third motion state data, with any one of the first motion state data, the second motion state data and the third motion state data as a reference;
a second determining module, configured to determine correction coefficients respectively corresponding to the first motion state data, the second motion state data and the third motion state data according to the relative positional relationship; and
a third determining module, configured to determine target motion state data of the target object at the next moment according to the correction coefficients, the first motion state data, the second motion state data and the third motion state data.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211502040.7A CN115993595A (en) 2022-11-28 2022-11-28 Target tracking filtering method and device

Publications (1)

Publication Number Publication Date
CN115993595A 2023-04-21

Family

ID=85991379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211502040.7A Pending CN115993595A (en) 2022-11-28 2022-11-28 Target tracking filtering method and device

Country Status (1)

Country Link
CN (1) CN115993595A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination