CN115187637A - Real-time identification method and track estimation method and device for target motion parameters - Google Patents

Real-time identification method and track estimation method and device for target motion parameters

Info

Publication number
CN115187637A
Authority
CN
China
Prior art keywords
current moment
moment
target
motion parameter
moving
Prior art date
Legal status
Granted
Application number
CN202211107066.1A
Other languages
Chinese (zh)
Other versions
CN115187637B (en)
Inventor
石恒
刘潇翔
王淑一
于强
李洋
贾涛
宫经刚
李建平
Current Assignee
Beijing Institute of Control Engineering
Original Assignee
Beijing Institute of Control Engineering
Priority date
Filing date
Publication date
Application filed by Beijing Institute of Control Engineering
Priority to CN202211107066.1A
Publication of CN115187637A
Application granted
Publication of CN115187637B
Active legal status
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248 Analysis of motion using feature-based methods involving reference images or patches
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10032 Satellite or aerial image; Remote sensing

Abstract

The invention relates to the field of image data processing, and in particular to a real-time identification method for target motion parameters, a track estimation method, and a device. The method comprises the following steps: whenever the high-orbit image at the current moment is obtained, pairing the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the last moment to obtain multiple pairing results; calculating the motion parameter estimation value of each pairing result at the current moment; selecting a final pairing result from the multiple pairing results according to the motion parameter estimation values at the current moment and the motion parameter output values at historical moments; determining the motion parameter output value of each moving target at the current moment according to the final pairing result; and estimating the position coordinates and reachable domain of each moving target at the next moment according to the motion parameter output values. This scheme improves identification accuracy and enables estimation of the future reachable range of the target.

Description

Real-time identification method and track estimation method and device for target motion parameters
Technical Field
Embodiments of the invention relate to the technical field of image data processing, and in particular to a real-time identification method for target motion parameters, a trajectory estimation method, and a trajectory estimation device.
Background
Moving targets are a key focus of remote sensing research, and extracting target motion parameters is a necessary means for studying the motion characteristics of a moving target and estimating its trajectory. As practical applications deepen, remote sensing satellites are increasingly required to perform processing such as autonomous identification and autonomous estimation of target motion parameters.
At present, identification of target motion parameters generally relies only on the currently observed image information. Since a remote sensing satellite undergoes real-time attitude motion, performing parameter identification from the current image alone yields low identification accuracy. In addition, existing target motion parameter identification algorithms cannot estimate the future reachable range of a target in real time, which reduces the rapidity of target tracking planning.
Disclosure of Invention
To address the above problems, embodiments of the present invention provide a real-time identification method for target motion parameters, a trajectory estimation method, and a device, which can improve identification accuracy and estimate the future reachable range of a target.
In a first aspect, an embodiment of the present invention provides a method for identifying motion parameters of a target in real time, where the motion parameters at least include velocity and acceleration; the method comprises the following steps:
whenever the high-orbit image at the current moment is obtained, pairing the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the last moment to obtain multiple pairing results;
calculating the motion parameter estimation value of each pairing result at the current moment;
selecting a final pairing result from the multiple pairing results according to the motion parameter estimation values at the current moment and the motion parameter output values at historical moments;
and determining the motion parameter output value of each moving target at the current moment according to the final pairing result.
In a possible implementation manner, the calculating of the motion parameter estimation value of each pairing result at the current moment includes:
for a pair (j, i) in a pairing result, in which the j-th moving target at the last moment is matched with the i-th moving target at the current moment k, calculating the motion parameter estimates according to the following formulas:

$\hat{v}_i(k) = k_v\,v_j(k-1) + (1-k_v)\,\frac{p_i(k)-p_j(k-1)}{T}$

$\hat{a}_i(k) = k_a\,a_j(k-1) + (1-k_a)\,\frac{\hat{v}_i(k)-v_j(k-1)}{T}$

where $\hat{v}_i(k)$ and $\hat{a}_i(k)$ are the velocity estimate and acceleration estimate of the i-th moving target at the current moment; $p_i(k)$ is the position coordinate of the i-th moving target at the current moment; $p_j(k-1)$ is the position coordinate of the j-th moving target at the last moment; $v_j(k-1)$ and $a_j(k-1)$ are the velocity output value and acceleration output value of the j-th moving target at the last moment; $k_v$ and $k_a$ are the velocity estimation filter coefficient and the acceleration estimation filter coefficient; and T is the interval duration between the current moment and the last moment.
In a possible implementation manner, the selecting of a final pairing result from the multiple pairing results according to the motion parameter estimation values at the current moment and the motion parameter output values at historical moments includes:
for each pairing result, executing: determining the mean square deviation of the acceleration of each moving target at the current moment over its whole-course trajectory in the pairing result, according to the acceleration estimation value at the current moment and the acceleration output values at historical moments; and taking the arithmetic mean of the acceleration mean square deviations as the matching index of the pairing result;
and taking the pairing result corresponding to the minimum matching index as the selected final pairing result.
In a possible implementation manner, after the calculating of the motion parameter estimation value of each pairing result at the current moment, the method further includes:
taking the pairing results whose speed estimation values are not greater than a preset speed threshold as valid pairing results, and selecting the final pairing result among the valid pairing results.
In a possible implementation manner, the determining, according to the final pairing result, of the motion parameter output value of each moving target at the current moment includes:
for a moving target at the current moment that is located in the final pairing result, taking the motion parameter estimation value of the final pairing result at the current moment as its motion parameter output value at the current moment;
and for a moving target at the current moment that is not located in the final pairing result, taking the initial motion parameters as its motion parameter output value at the current moment.
In a second aspect, an embodiment of the present invention further provides a target trajectory estimation method, including:
identifying the motion parameter output value of each moving target at the current moment in real time by using any one of the above real-time identification methods for target motion parameters;
and estimating the position coordinates and reachable domain of each moving target at the next moment according to the motion parameter output value of each moving target at the current moment.
In a possible implementation manner, the estimating of the position coordinates of each moving target at the next moment includes:
adding the product of the speed output value of the moving target at the current moment and the interval duration to the position coordinate of the moving target at the current moment, as the position coordinate of the moving target at the next moment;
and/or,
the estimating of the reachable domain of each moving target at the next moment includes:
taking the vector direction of the speed output value of the moving target at the current moment as the initial direction, calculating the unit vectors corresponding to a plurality of directions around the circumference;
adding the product of the historical maximum speed of the moving target and the unit vector of each direction to the speed output value at the current moment to obtain the envelope velocity vector of each direction;
adding the product of the envelope velocity vector of each direction and the interval duration to the position coordinate of the moving target at the current moment to obtain a reachable position coordinate in each direction;
and connecting the reachable position coordinates in the directions to form the reachable domain of the moving target at the next moment.
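The position propagation and reachable-domain steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the planar 2-vector representation, and the choice of eight circumferential directions are all assumptions.

```python
import numpy as np

def reachable_domain(p, v, v_max_hist, T, n_dirs=8):
    """Estimate next-moment position and reachable domain of one target.

    p, v       : current position and speed output value (2-vectors)
    v_max_hist : historical maximum speed of the target (scalar)
    T          : interval duration
    n_dirs     : number of circumferential directions sampled (assumed)
    """
    p, v = np.asarray(p, float), np.asarray(v, float)
    p_next = p + v * T  # predicted position coordinate at the next moment
    # start the circumferential sweep from the current velocity direction
    theta0 = np.arctan2(v[1], v[0]) if np.any(v) else 0.0
    thetas = theta0 + 2.0 * np.pi * np.arange(n_dirs) / n_dirs
    units = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
    envelope_v = v + v_max_hist * units  # envelope velocity vector per direction
    reachable = p + envelope_v * T       # reachable position per direction
    return p_next, reachable             # connect rows of `reachable` to form the domain

p_next, region = reachable_domain([0.0, 0.0], [1.0, 0.0], v_max_hist=0.5, T=2.0)
```

Connecting the rows of `region` in order yields the polygonal reachable domain described in the last step.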
In a third aspect, an embodiment of the present invention further provides a device for identifying motion parameters of a target in real time, where the motion parameters at least include velocity and acceleration; the device comprises:
a pairing unit, configured to, whenever the high-orbit image at the current moment is obtained, pair the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the last moment to obtain multiple pairing results;
a calculating unit, configured to calculate the motion parameter estimation value of each pairing result at the current moment;
a selecting unit, configured to select a final pairing result from the multiple pairing results according to the motion parameter estimation values at the current moment and the motion parameter output values at historical moments;
and a determining unit, configured to determine the motion parameter output value of each moving target at the current moment according to the final pairing result.
In a fourth aspect, an embodiment of the present invention further provides a target trajectory estimation apparatus, including:
an identification unit, configured to identify the motion parameter output value of each moving target at the current moment in real time by using the above device for identifying motion parameters of a target in real time;
and an estimation unit, configured to estimate the position coordinates and reachable domain of each moving target at the next moment according to the motion parameter output value of each moving target at the current moment.
In a fifth aspect, an embodiment of the present invention further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor executes the computer program to implement the method according to any embodiment of the present specification.
In a sixth aspect, the present invention further provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed in a computer, the computer program causes the computer to execute the method described in any embodiment of the present specification.
Embodiments of the invention provide a real-time identification method for target motion parameters, a trajectory estimation method, and a device. After the high-orbit image at the current moment is obtained, each moving target at the current moment is, with high probability, a moving target from the last moment after it has undergone a motion change; therefore the motion parameters cannot be identified from the high-orbit image at the current moment alone. Instead, the moving targets at the current moment must be paired one-to-one with the moving targets at the last moment, so that each pairing indicates that the current-moment target is the state of the last-moment target after its motion change. Because there are multiple pairing results, the optimal one must be selected as the final pairing result; selecting it according to both the motion parameter estimation values at the current moment and the motion parameter output values at historical moments makes the final pairing result more accurate, and hence the motion parameter output values of each moving target at the current moment determined from it are more accurate. In addition, the identified motion parameter output values are used to estimate the future reachable range of the target.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flowchart of a real-time identification method for a target motion parameter according to an embodiment of the present invention;
FIG. 2 is a flowchart of a final pairing result selection method according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a final pairing result according to an embodiment of the present invention;
FIG. 4 is a flowchart of a target trajectory estimation method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a trajectory estimation result according to an embodiment of the present invention;
FIG. 6 is a diagram of a hardware architecture of an electronic device according to an embodiment of the present invention;
FIG. 7 is a block diagram of a real-time target motion parameter identification apparatus according to an embodiment of the present invention;
FIG. 8 is a diagram of another electronic device according to an embodiment of the present invention;
fig. 9 is a structural diagram of a target trajectory estimation device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer and more complete, the technical solutions in the embodiments of the present invention will be described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention, and based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative efforts belong to the scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a method for identifying motion parameters of a target in real time, where the motion parameters at least include velocity and acceleration; the method comprises the following steps:
step 100, whenever the high-orbit image at the current moment is obtained, pairing the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the last moment to obtain multiple pairing results;
step 102, calculating the motion parameter estimation value of each pairing result at the current moment;
step 104, selecting a final pairing result from the multiple pairing results according to the motion parameter estimation values at the current moment and the motion parameter output values at historical moments;
and step 106, determining the motion parameter output value of each moving target at the current moment according to the final pairing result.
In the embodiment of the invention, after the high-orbit image at the current moment is obtained, each moving target at the current moment is, with high probability, a moving target from the last moment after it has undergone a motion change, so the motion parameters cannot be identified from the high-orbit image at the current moment alone; instead, the moving targets at the current moment must be paired one-to-one with the moving targets at the last moment, so that each pairing indicates that the current-moment target is the state of the last-moment target after its motion change. Because there are multiple pairing results, the optimal one must be selected as the final pairing result; selecting it not only according to the motion parameter estimation values at the current moment but also in combination with the motion parameter output values at historical moments makes the selected final pairing result more accurate, and hence the motion parameter output values of each moving target at the current moment determined from it are more accurate.
The manner in which the various steps shown in fig. 1 are performed is described below.
First, in step 100, whenever the high-orbit image at the current moment is obtained, the moving targets in the high-orbit image at the current moment are paired one-to-one with the moving targets in the high-orbit image at the last moment to obtain multiple pairing results.
In the embodiment of the present invention, the acquisition frequency of the high-orbit images, that is, the interval duration T, may be set in advance: one high-orbit image is acquired every interval T. Each high-orbit image may include at least one moving target. Extracting the moving targets in a high-orbit image and their position coordinates in the imaging field-of-view coordinate system can be realized by existing schemes, which this embodiment does not repeat.
The high-orbit image in the embodiment of the present invention may be a remote sensing image obtained by a satellite, or may be an observation image in another mode.
Since the targets are in motion, the numbers of moving targets included in the high-orbit images at two adjacent moments may be the same or different.
Assuming that the number of moving targets in the high-orbit image at the current moment is N and the number in the high-orbit image at the last moment is M, where both M and N are integers greater than 0, there are two cases when pairing is performed in this step:
Case one: M is greater than or equal to N.
Case two: M is less than N.
In case one, N moving targets need to be taken out of the M moving targets from the last moment (with position coordinates $p_j(k-1)$, $j = 1, \dots, M$) and arranged in one-to-one correspondence with the N moving targets at the current moment (with position coordinates $p_i(k)$, $i = 1, \dots, N$) to form the pairing results; the number of pairing results is the number of all permutations of N elements taken out of M different elements, i.e. $M!/(M-N)!$.
In case two, M moving targets need to be taken out of the N moving targets at the current moment and arranged in one-to-one correspondence with the M moving targets from the last moment to form the pairing results; the number of pairing results is the number of all permutations of M elements taken out of N different elements, i.e. $N!/(N-M)!$.
Taking case one as an example, assume M = 3 and N = 2. Then 2 moving targets need to be selected from the 3 moving targets at the last moment (say, targets M1, M2, and M3); there are three selection manners, and each selection is paired one-to-one with the 2 moving targets at the current moment (say, targets N1 and N2), giving two pairing results per selection. That is, the number of pairing results is the number of all permutations of 2 elements taken out of 3 different elements, i.e. 6. The 6 pairing results are as follows: the first selection is M1 and M2, giving pairing results {M1-N1, M2-N2} and {M2-N1, M1-N2}; the second selection is M1 and M3, giving {M1-N1, M3-N2} and {M3-N1, M1-N2}; the third selection is M2 and M3, giving {M2-N1, M3-N2} and {M3-N1, M2-N2}.
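The pairing enumeration above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and target labels are hypothetical.

```python
from itertools import permutations

def enumerate_pairings(prev_ids, curr_ids):
    """Enumerate all one-to-one pairings between last-moment targets and
    current-moment targets.  Case one (M >= N) orders N of the M previous
    targets; case two (M < N) orders M of the N current targets."""
    if len(prev_ids) >= len(curr_ids):
        # A(M, N) = M!/(M-N)! ordered selections of previous targets
        return [list(zip(chosen, curr_ids))
                for chosen in permutations(prev_ids, len(curr_ids))]
    # A(N, M) ordered selections of current targets
    return [list(zip(prev_ids, chosen))
            for chosen in permutations(curr_ids, len(prev_ids))]

pairings = enumerate_pairings(["M1", "M2", "M3"], ["N1", "N2"])
print(len(pairings))  # A(3, 2) = 6, matching the worked example
```

Each element of `pairings` is one pairing result, a list of (last-moment target, current-moment target) pairs.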
Then, for step 102, the motion parameter estimation value of each pairing result at the current time is calculated.
In an embodiment of the present invention, in order to make the calculated motion parameter estimation value of each moving target at the current moment more accurate, the position coordinates and the motion parameter output values at the last moment may be used for the estimation. Specifically, for a pair (j, i) in one of the pairing results, in which the j-th moving target at the last moment is matched with the i-th moving target at the current moment k, the motion parameter estimates may be calculated according to the following formulas:

$\hat{v}_i(k) = k_v\,v_j(k-1) + (1-k_v)\,\frac{p_i(k)-p_j(k-1)}{T}$

$\hat{a}_i(k) = k_a\,a_j(k-1) + (1-k_a)\,\frac{\hat{v}_i(k)-v_j(k-1)}{T}$

where $\hat{v}_i(k)$ and $\hat{a}_i(k)$ are the velocity estimate and acceleration estimate of the i-th moving target at the current moment; $p_i(k)$ is the position coordinate of the i-th moving target at the current moment; $p_j(k-1)$ is the position coordinate of the j-th moving target at the last moment; $v_j(k-1)$ and $a_j(k-1)$ are the velocity output value and acceleration output value of the j-th moving target at the last moment; $k_v$ and $k_a$ are the velocity estimation filter coefficient and the acceleration estimation filter coefficient; and T is the interval duration between the current moment and the last moment.
In the above formulas, the velocity estimation filter coefficient and the acceleration estimation filter coefficient may both be taken as 0.98.
That is, for each of the 6 pairing results obtained in the above example, each pairing result includes two pairs, and for each pair the velocity estimate and acceleration estimate of the corresponding moving target (N1 or N2) at the current moment can be calculated according to the above formulas.
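The per-pair estimation can be sketched as follows. The source renders the exact formulas only as images, so this sketch assumes a first-order low-pass filter form consistent with the symbol definitions (previous output value blended with a finite-difference measurement); the 0.98 coefficient values are from the text, while the function and variable names are illustrative.

```python
import numpy as np

K_V = 0.98  # velocity estimation filter coefficient (value given in the text)
K_A = 0.98  # acceleration estimation filter coefficient

def estimate_motion(p_i, p_j, v_j, a_j, T):
    """Velocity/acceleration estimates for current-moment target i paired
    with last-moment target j (assumed low-pass filter form)."""
    p_i, p_j, v_j, a_j = map(np.asarray, (p_i, p_j, v_j, a_j))
    v_hat = K_V * v_j + (1.0 - K_V) * (p_i - p_j) / T
    a_hat = K_A * a_j + (1.0 - K_A) * (v_hat - v_j) / T
    return v_hat, a_hat

# A target moving at constant velocity [1, 0] keeps steady estimates:
v_hat, a_hat = estimate_motion([1.0, 0.0], [0.0, 0.0], [1.0, 0.0], [0.0, 0.0], T=1.0)
```

Under this form, a constant-velocity pairing reproduces the previous velocity output and a zero acceleration estimate, which is the behavior the subsequent matching index rewards.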
It should be noted that the position coordinates in the above formulas refer to the position coordinates of the moving target in the orbit reference coordinate system, which can be determined by the following formula:

$p^{orb} = C_{oi}\,p^{img}$

where $C_{oi}$ is the attitude conversion matrix of the orbit reference coordinate system relative to the imaging field-of-view coordinate system, determined by the attitude angle and orbit position of the satellite; and $p^{img}$ is the position coordinate of the moving target in the imaging field-of-view coordinate system.
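The frame conversion can be sketched as a single matrix-vector product. This assumes the attitude conversion matrix acts as a plain linear map; the 90-degree planar rotation used here is a stand-in, not a matrix from the patent.

```python
import numpy as np

def to_orbit_frame(C_oi, p_img):
    """Map a position from the imaging field-of-view frame to the orbit
    reference frame via the attitude conversion matrix C_oi."""
    return np.asarray(C_oi) @ np.asarray(p_img)

# 90-degree planar rotation as a stand-in attitude conversion matrix:
C = np.array([[0.0, -1.0], [1.0, 0.0]])
print(to_orbit_frame(C, [1.0, 0.0]))  # -> [0. 1.]
```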
In the embodiment of the present invention, each pair in a pairing result, for example a pair of last-moment moving target j and current-moment moving target i, assumes that the current-moment target i is the state of the last-moment target j after a motion change within the interval duration T. Therefore, when the motion parameter estimation value of the moving target at the current moment is calculated, the actual motion parameter output values of the moving target at the last moment are fully taken into account, which provides a basis for subsequently judging whether the pairing is appropriate.
Next, in step 104, a final matching result is selected from the plurality of matching results according to the motion parameter estimation value at the current time and the motion parameter output value at the historical time.
In step 102, the motion parameter estimation values of each moving target at the current moment have been obtained for each pairing result; the best pairing result now needs to be selected from the multiple pairing results based on known data. In an embodiment of the present invention, referring to fig. 2, this can be implemented at least in the following manner:
step 200, for each pairing result, executing: determining the mean square deviation of the acceleration of each moving target at the current moment over its whole-course trajectory in the pairing result, according to the acceleration estimation value at the current moment and the acceleration output values at historical moments; and taking the arithmetic mean of the acceleration mean square deviations as the matching index of the pairing result;
and step 202, taking the pairing result corresponding to the minimum matching index as the selected final pairing result.
In the embodiment of the present invention, the motion parameter output values at historical moments may be obtained by invoking steps 100 to 106 with each historical moment in turn taken as the current moment.
Taking the pairing result {M1-N1, M2-N2} as an example: the acceleration estimation values of N1 and N2 at the current moment are known, as are the acceleration output values of M1 and M2 at the last moment, and likewise the acceleration output values of the targets that M1 and M2 were paired with at earlier moments. That is, the whole-course trajectories of moving targets N1 and N2 up to the current moment are known, and from the acceleration output values at the historical moments and the acceleration estimation values at the current moment along these trajectories, the acceleration mean square deviation of each of N1 and N2 over its whole-course trajectory can be obtained. Further, since the pairing result includes two moving targets at the current moment, the arithmetic mean of the acceleration mean square deviations of N1 and N2 can be calculated.
This arithmetic mean is used as the matching index of the pairing result. The smaller the matching index, the more stable the acceleration change, so the pairing result corresponding to the minimum matching index is taken as the final pairing result.
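Steps 200 and 202 can be sketched as follows. This is a minimal illustration: it interprets "mean square deviation" as the standard deviation of each target's acceleration history (historical outputs plus the current estimate), and the labels and data are hypothetical.

```python
import numpy as np

def matching_index(accel_histories):
    """Matching index of one pairing result: the arithmetic mean, over the
    targets it pairs, of the standard deviation of each target's
    acceleration along its whole-course trajectory."""
    return float(np.mean([np.std(np.asarray(h)) for h in accel_histories]))

def select_final_pairing(candidates):
    """candidates maps a pairing-result label to the per-target acceleration
    histories it implies; the minimum matching index wins (step 202)."""
    return min(candidates, key=lambda k: matching_index(candidates[k]))

# Hypothetical data: pairing A implies steady accelerations, pairing B does not.
candidates = {
    "A": [[1.0, 1.0, 1.0], [2.0, 2.0, 2.0]],
    "B": [[1.0, 3.0, 1.0], [2.0, 0.0, 4.0]],
}
print(select_final_pairing(candidates))  # -> A
```

The stable pairing wins because its acceleration histories vary least, which is exactly the "more stable acceleration change" criterion in the text.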
Please refer to fig. 3, which is a diagram illustrating a final pairing result. In fig. 3, there are three moving targets at the previous time and four moving targets at the current time, and after the final pairing result is determined, the target 4 at the current time cannot be paired with the three targets at the previous time, which indicates that the target 4 is a newly appeared target at the current time.
In the embodiment of the invention, the influence of each moving target's whole-course trajectory on the change from the last moment to the current moment is fully considered, so the final pairing result selected in this way is more accurate, and the identification result of the motion parameters is more accurate.
In an embodiment of the present invention, in order to further reduce the workload and the influence of invalid pairing results on the final selection, the method may further include, after step 102 and before step 104: taking the pairing results whose speed estimation values are not greater than a preset speed threshold as valid pairing results, and selecting the final pairing result among the valid pairing results only.
Taking the pairing result {M1-N1, M2-N2} as an example, if the speed estimation value of N1 and/or N2 is greater than the preset speed threshold, the pairing result is invalid and is discarded.
And finally, aiming at the step 106, determining the motion parameter output value of each motion target at the current moment according to the final matching result.
In the embodiment of the present invention, it can be known in step 100 that there may be two cases in the number of moving targets at the previous time and the current time, and if the number is the first case, the moving targets at the current time are both located in the final pairing result; in case two, there is a moving object at the current time that is not located in the final pairing result, such as object 4 in fig. 3.
In either case one or case two, in the embodiment of the present invention, the motion parameter output value of each moving target at the current moment may be determined as follows:
for a moving target at the current moment that is located in the final pairing result, taking the motion parameter estimation value of the final pairing result at the current moment as the motion parameter output value at the current moment;
for a moving target at the current moment that is not located in the final pairing result, taking the initial motion parameters as the motion parameter output values at the current moment.
A moving target at the current moment that is not located in the final pairing result is a target newly appearing at the current moment, so the current moment may be taken as its initial calculation moment, and the initial motion parameters may be: a speed output value and an acceleration output value that are both 0.
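The two cases above can be sketched as follows; the dictionary layout and identifiers are illustrative assumptions, not the patent's data structures:

```python
def determine_outputs(final_pairing, current_targets):
    """final_pairing: dict mapping a current-moment target id to its
    (velocity_estimate, acceleration_estimate) under the final pairing result.
    current_targets: ids of all moving targets detected at the current moment.
    """
    outputs = {}
    for target_id in current_targets:
        if target_id in final_pairing:
            # Paired target: the estimate under the final pairing result
            # becomes the output value at the current moment.
            outputs[target_id] = final_pairing[target_id]
        else:
            # Newly appearing target: initial motion parameters, i.e.
            # velocity and acceleration output values are both 0.
            outputs[target_id] = (0.0, 0.0)
    return outputs

outputs = determine_outputs({"t1": (1.0, 0.1)}, ["t1", "t4"])
```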
In the embodiment of the invention, after the high-orbit image at the current moment is acquired, each moving target at the current moment is, with high probability, a moving target from the last moment after some motion change. The motion parameters therefore cannot be identified from the high-orbit image at the current moment alone; instead, the moving targets at the current moment must be paired one-to-one with the moving targets at the last moment, so that each current-moment target in a pairing is the state of its last-moment counterpart after the motion change. Because there are many candidate pairing results, the optimal one must be selected as the final pairing result. When selecting it, using not only the motion parameter estimation values at the current moment but also the motion parameter output values at the historical moments makes the selected final pairing result more accurate, and hence the motion parameter output values of each moving target at the current moment determined from it are more accurate.
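As a sketch, the one-to-one pairing of last-moment targets with current-moment targets can be enumerated as permutations; the label representation and helper name are illustrative assumptions:

```python
from itertools import permutations

def enumerate_pairings(prev_targets, curr_targets):
    """Enumerate every one-to-one pairing of last-moment targets with
    current-moment targets. Current-moment targets left unpaired would be
    treated as newly appearing (case two in the description)."""
    results = []
    for perm in permutations(curr_targets, len(prev_targets)):
        results.append(list(zip(prev_targets, perm)))
    return results

# Two targets at each moment -> 2 candidate pairing results.
pairings = enumerate_pairings(["M1", "M2"], ["N1", "N2"])
```

Each candidate is then scored by its matching index and the minimum is kept as the final pairing result.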
Referring to fig. 4, an embodiment of the present invention further provides a method for estimating a target track, including:
step 400, identifying the motion parameter output value of each moving target at the current moment in real time by using any one of the above-mentioned target motion parameter real-time identification methods.
Wherein the motion parameter output values include a velocity output value and an acceleration output value.
Step 402, estimating, according to the motion parameter output values of the moving targets at the current moment, the position coordinates and reachable domains of these moving targets at the next moment.
Please refer to fig. 5, which illustrates the track estimation result at the next moment. Since target 4 is a newly appearing target at the current moment, its position coordinate and reachable domain cannot be estimated.
First, estimating the position coordinate at the next moment of each moving target at the current moment may include: adding the product of the speed output value of the moving target at the current moment and the interval duration to the position coordinate of the moving target at the current moment, as the position coordinate of the moving target at the next moment.
Taking a certain moving target at the current moment as an example, the position coordinate of the moving target at the next moment is calculated by the following formula:

P(t+1) = P(t) + V(t)·T

where P(t+1) is the position coordinate of the moving target at the next moment, P(t) is the position coordinate of the moving target at the current moment, V(t) is the speed output value of the moving target at the current moment, and T is the interval duration.
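The single-step position update above can be sketched as follows; the 2-D tuple representation is an assumption for illustration:

```python
def predict_position(position, velocity, interval):
    """Next-moment position = current position + speed output value * interval T."""
    x, y = position
    vx, vy = velocity
    return (x + vx * interval, y + vy * interval)

# A target at (100.0, 50.0) moving at (2.0, -1.0) units per second, T = 4 s.
next_pos = predict_position((100.0, 50.0), (2.0, -1.0), 4.0)
```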
Then, estimating the reachable domain of each moving object at the current moment at the next moment may include S1-S4:
S1, taking the vector direction of the speed output value of the moving target at the current moment as the initial direction, and calculating the unit vectors corresponding to a plurality of directions in the circumferential direction.
Specifically, the directions in the circumferential direction may be determined at a fixed angular interval Δθ, where Δθ preferably takes a value in (0, 5] degrees, for example 1 degree. The unit vectors corresponding to the plurality of directions in step S1 are then e_1, e_2, …, e_N, where N = ⌈360°/Δθ⌉ and ⌈·⌉ denotes rounding up.
In the embodiment of the present invention, the unit vector of the i-th direction may be calculated by rotating the normalized speed vector:

e_i = R(i·Δθ)·V(t)/||V(t)||

where R(·) is the two-dimensional rotation matrix and ||V(t)|| denotes the modulus of V(t).
And S2, adding the product of the historical maximum rate of the moving target and the unit vector of each direction to the speed output value at the current moment, to obtain the envelope velocity vector of each direction.
In the embodiment of the present invention, the motion parameter may further include a historical maximum rate in addition to the velocity and the acceleration, and if a modulus of the velocity output value at the current time is greater than the historical maximum rate of the moving object, the historical maximum rate is updated to the modulus of the velocity output value at the current time.
Specifically, the envelope velocity vector V_i of each direction in step S2 can be calculated by the following formula:

V_i = V(t) + V_max·e_i

where V_max is the historical maximum rate of the moving target.
And S3, adding the product of the envelope velocity vector and the interval duration of each direction to the position coordinate of the moving target at the current moment to obtain the reachable position coordinate in each direction.
Specifically, the reachable position coordinate P_i in each direction in step S3 can be calculated by the following formula:

P_i = P(t) + V_i·T
and S4, connecting the reachable position coordinates in each direction to form a reachable domain of the moving target at the next moment.
In the embodiment of the invention, the high-orbit images of the moving targets are processed continuously to realize track association of moving targets at adjacent moments, so that the position coordinate and reachable range of each moving target at the next moment are estimated, providing input information containing the real-time motion parameters of the moving targets for target tracking planning; this improves both the estimation accuracy of the moving target tracks and the tracking efficiency. In addition, the real-time orbit and attitude motion information of the satellite is considered in the coordinate conversion of the moving targets, unifying the calculation reference of the motion parameters of the plurality of moving targets. When pairing moving targets, the current motion trend and the historical motion characteristics of each target are considered together, improving the accuracy of target pairing. A method for estimating the reachable domain of a moving target is also provided, giving better support for the tracking planning of moving targets.
As shown in fig. 6 and fig. 7, an embodiment of the invention provides a device for real-time identification of target motion parameters. The device embodiments may be implemented by software, by hardware, or by a combination of the two. In terms of hardware, fig. 6 is a hardware architecture diagram of the electronic device where the device is located; besides the processor, memory, network interface, and nonvolatile memory shown in fig. 6, the electronic device may also include other hardware, such as a forwarding chip responsible for processing messages. In terms of software, as shown in fig. 7, the device is formed as a logical device by the CPU of the electronic device reading the corresponding computer program from the nonvolatile memory into the memory and running it. In the device provided by this embodiment, the motion parameters at least comprise speed and acceleration; the device comprises:
a pairing unit 701, configured to pair the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the previous moment each time the high-orbit image at the current moment is obtained, to obtain multiple pairing results;
a calculating unit 702, configured to calculate an estimated value of the motion parameter of each pairing result at the current time;
a selecting unit 703, configured to select a final matching result from the multiple matching results according to the motion parameter estimation value at the current time and the motion parameter output value at the historical time;
a determining unit 704, configured to determine a motion parameter output value of each motion target at the current time according to the final pairing result.
In an embodiment of the present invention, the computing unit is specifically configured to: for a group of pairings in a pairing result, calculate the motion parameter estimation value by a filtering formula in which V̂_j and Â_j are the velocity estimation value and acceleration estimation value of the j-th moving target at the current moment, P_j is the position coordinate of the j-th moving target at the current moment, P'_j is the position coordinate of the j-th moving target at the last moment, V'_j and A'_j are the velocity output value and acceleration output value of the j-th moving target at the last moment, a and b are the velocity filter coefficient and the acceleration filter coefficient, and T is the interval duration between the current moment and the last moment.
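One common estimator consistent with the inputs listed above (positions at the two moments, last-moment output values, filter coefficients a and b, and the interval T) is an alpha-beta-style filter. The exact placement of the coefficients below is an assumption, not necessarily the patent's formula:

```python
def estimate_motion(p_now, p_prev, v_prev, a_prev, a_coef, b_coef, interval):
    """Hedged sketch: blend a finite-difference measurement with the
    previous output values using the filter coefficients a and b."""
    v_meas = (p_now - p_prev) / interval  # velocity measured from positions
    v_est = a_coef * v_meas + (1.0 - a_coef) * (v_prev + a_prev * interval)
    a_meas = (v_est - v_prev) / interval  # acceleration measured from velocities
    a_est = b_coef * a_meas + (1.0 - b_coef) * a_prev
    return v_est, a_est

# With both coefficients at 1.0 the filter reduces to pure finite differences.
v_est, a_est = estimate_motion(10.0, 0.0, 8.0, 0.0, 1.0, 1.0, 2.0)
```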
In an embodiment of the present invention, the selecting unit is specifically configured to:
for each pairing result, performing: determining, according to the acceleration estimation value at the current moment and the acceleration output values at the historical moments, the mean square deviation of the acceleration of each moving target in the pairing result over its whole-course track; and taking the arithmetic mean of these mean square deviations of acceleration as the matching index of the pairing result;
and taking the pairing result corresponding to the minimum matching index as the selected final pairing result.
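A sketch of this selection rule, assuming the population standard deviation is used for the "mean square deviation" (that choice, and the data layout, are assumptions):

```python
import statistics

def matching_index(acc_histories):
    """acc_histories: per-target lists of acceleration values over the
    whole-course track, each ending with the current-moment estimate."""
    deviations = [statistics.pstdev(h) for h in acc_histories]
    return sum(deviations) / len(deviations)  # arithmetic mean of deviations

def select_final_pairing(candidates):
    """candidates: list of (pairing_result, acc_histories) pairs;
    the pairing result with the minimum matching index is selected."""
    return min(candidates, key=lambda c: matching_index(c[1]))[0]

candidates = [
    ("A", [[1.0, 1.0], [2.0, 2.0]]),  # steady accelerations -> index 0.0
    ("B", [[0.0, 4.0], [1.0, 1.0]]),  # one erratic target  -> index 1.0
]
chosen = select_final_pairing(candidates)
```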
In an embodiment of the invention, the selection unit is further configured to: and taking the pairing result of which the speed estimation value is not more than the preset speed threshold value as an effective pairing result, and executing the selection of the final pairing result in the effective pairing result.
In an embodiment of the present invention, the determining unit is specifically configured to: aiming at the moving target at the current moment in the final matching result, taking the motion parameter estimation value of the final matching result at the current moment as the motion parameter output value at the current moment; and aiming at the moving target which is not positioned in the final matching result at the current moment, taking the initial moving parameter as the moving parameter output value at the current moment.
As shown in fig. 8 and fig. 9, an embodiment of the present invention provides a target track estimation device. The device embodiments may be implemented by software, by hardware, or by a combination of the two. In terms of hardware, fig. 8 is a hardware architecture diagram of the electronic device where the device is located; besides the processor, memory, network interface, and nonvolatile memory shown in fig. 8, the electronic device may also include other hardware, such as a forwarding chip responsible for processing messages. Taking a software implementation as an example, as shown in fig. 9, the device is formed as a logical device by the CPU of the electronic device reading the corresponding computer program from the nonvolatile memory into the memory and running it. The target track estimation device provided by this embodiment includes:
an identifying unit 901, configured to identify a motion parameter output value of each moving object at the current time in real time by using the object motion parameter real-time identifying device;
and the estimating unit 902 is configured to estimate, according to the motion parameter output value of each moving object at the current time, a position coordinate and a reachable domain of each moving object at the next time at the current time.
In an embodiment of the present invention, when the estimating unit estimates the position coordinate of each moving target at the next time at the current time, the estimating unit specifically includes: adding the product of the speed output value of the moving target at the current moment and the interval duration to the position coordinate of the moving target at the current moment to be used as the position coordinate of the moving target at the next moment;
in an embodiment of the present invention, when the estimating unit estimates the reachable domain of each moving object at the current time at the next time, the estimating unit specifically includes: calculating unit vectors corresponding to a plurality of directions in the circumferential direction respectively by taking the vector direction of the speed output value of the moving target at the current moment as an initial direction; adding the product of the historical maximum speed of the moving target and the unit vector of each direction to the speed output value of the current moment to obtain the envelope speed vector of each direction; adding the product of the envelope velocity vector and the interval duration of each direction to the position coordinate of the moving target at the current moment to obtain an accessible position coordinate in each direction; and connecting the reachable position coordinates in each direction to form a reachable domain of the moving target at the next moment.
It is understood that the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on a target motion parameter real-time identification device/target trajectory estimation device. In other embodiments of the present invention, a real-time object motion parameter identification/object trajectory estimation device may include more or fewer components than those shown, or some components may be combined, some components may be separated, or different arrangements of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The information interaction and execution processes between the modules in the above devices are based on the same concept as the method embodiments of the present invention; for specific details, refer to the description in the method embodiments, which is not repeated here.
The embodiment of the invention also provides electronic equipment which comprises a memory and a processor, wherein the memory stores a computer program, and the processor executes the computer program to realize the method in any embodiment of the invention.
An embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, causes the processor to execute the method described in any embodiment of the present invention.
Specifically, a system or an apparatus equipped with a storage medium on which software program codes that realize the functions of any of the above-described embodiments are stored may be provided, and a computer (or a CPU or MPU) of the system or the apparatus is caused to read out and execute the program codes stored in the storage medium.
In this case, the program code itself read from the storage medium can realize the functions of any of the above-described embodiments, and thus the program code and the storage medium storing the program code constitute a part of the present invention.
Examples of the storage medium for supplying the program code include a floppy disk, a hard disk, a magneto-optical disk, an optical disk (e.g., CD-ROM, CD-R, CD-RW, DVD-ROM, DVD-RAM, DVD-RW, DVD + RW), a magnetic tape, a nonvolatile memory card, and a ROM. Alternatively, the program code may be downloaded from a server computer via a communications network.
Further, it should be clear that the functions of any one of the above-described embodiments may be implemented not only by executing the program code read out by the computer, but also by causing an operating system or the like operating on the computer to perform a part or all of the actual operations based on instructions of the program code.
Further, it is to be understood that the program code read out from the storage medium is written to a memory provided in an expansion board inserted into the computer or to a memory provided in an expansion module connected to the computer, and then causes a CPU or the like mounted on the expansion board or the expansion module to perform part or all of the actual operations based on instructions of the program code, thereby realizing the functions of any of the above-described embodiments.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of additional identical elements in a process, method, article, or apparatus that comprises the element.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A real-time identification method for target motion parameters is characterized in that the motion parameters at least comprise speed and acceleration; the method comprises the following steps:
when the high-orbit image at the current moment is obtained, carrying out one-to-one corresponding pairing on the moving target in the high-orbit image at the current moment and the moving target in the high-orbit image at the last moment to obtain various pairing results;
calculating the motion parameter estimation value of each pairing result at the current moment;
selecting a final pairing result from the multiple pairing results according to the motion parameter estimation value at the current moment and the motion parameter output value at the historical moment;
and determining the motion parameter output value of each motion target at the current moment according to the final pairing result.
2. The method of claim 1, wherein calculating the motion parameter estimate for each paired result at the current time comprises:
for a group of pairings in a pairing result, the motion parameter estimation value is calculated by a filtering formula in which V̂_j and Â_j are the velocity estimation value and acceleration estimation value of the j-th moving target at the current moment, P_j is the position coordinate of the j-th moving target at the current moment, P'_j is the position coordinate of the j-th moving target at the last moment, V'_j and A'_j are the velocity output value and acceleration output value of the j-th moving target at the last moment, a and b are the velocity filter coefficient and the acceleration filter coefficient, and T is the interval duration between the current moment and the last moment.
3. The method of claim 1, wherein selecting a final matching result from a plurality of matching results according to the estimated value of the motion parameter at the current time and the output value of the motion parameter at the historical time comprises:
for each pairing result, performing: determining the mean square deviation of the acceleration of each moving object at the current moment on the whole-course track in the pairing result according to the acceleration estimated value at the current moment and the acceleration output value at the historical moment; and taking the arithmetic mean of the mean square deviations of the accelerations as the matching index of the pairing result;
and taking the pairing result corresponding to the minimum matching index as the selected final pairing result.
4. The method as claimed in claim 1 or 3, wherein after said calculating the motion parameter estimation value of each pairing result at the current time, further comprising:
and taking the pairing result of which the speed estimation value is not more than the preset speed threshold value as an effective pairing result, and executing the selection of the final pairing result in the effective pairing result.
5. The method of claim 1,
the determining the motion parameter output value of each motion target at the current moment according to the final pairing result comprises:
aiming at the moving target at the current moment in the final matching result, taking the motion parameter estimation value of the final matching result at the current moment as the motion parameter output value at the current moment;
and aiming at the moving target which is not positioned in the final matching result at the current moment, taking the initial moving parameter as the moving parameter output value at the current moment.
6. A target track estimation method, characterized by comprising the following steps:
identifying the motion parameter output value of each motion target at the current moment in real time by using the real-time target motion parameter identification method of any one of claims 1 to 5;
and estimating the position coordinates and reachable domains of the moving targets at the current moment at the next moment according to the motion parameter output values of the moving targets at the current moment.
7. The method of claim 6,
the estimating of the position coordinates of each moving object at the current moment at the next moment includes:
adding the product of the speed output value of the moving target at the current moment and the interval duration to the position coordinate of the moving target at the current moment to serve as the position coordinate of the moving target at the next moment;
and/or the presence of a gas in the gas,
estimating the reachable domain of each moving target at the current moment at the next moment, comprising:
calculating unit vectors respectively corresponding to a plurality of directions in the circumferential direction by taking the vector direction of the speed output value of the moving target at the current moment as an initial direction;
adding the product of the historical maximum speed of the moving target and the unit vector of each direction to the speed output value of the current moment to obtain the envelope speed vector of each direction;
adding the product of the envelope velocity vector and the interval duration of each direction to the position coordinate of the moving target at the current moment to obtain an accessible position coordinate in each direction;
and connecting the reachable position coordinates in each direction to form a reachable domain of the moving target at the next moment.
8. A real-time identification device for motion parameters of a target is characterized in that the motion parameters at least comprise speed and acceleration; the device comprises:
the pairing unit is used for pairing the moving targets in the high-orbit image at the current moment one-to-one with the moving targets in the high-orbit image at the previous moment when the high-orbit image at the current moment is obtained, so as to obtain multiple pairing results;
the computing unit is used for computing the motion parameter estimation value of each matching result at the current moment;
the selection unit is used for selecting a final matching result from the multiple matching results according to the motion parameter estimation value at the current moment and the motion parameter output value at the historical moment;
and the determining unit is used for determining the motion parameter output value of each motion target at the current moment according to the final pairing result.
9. An object trajectory estimation device, comprising:
an identification unit, for identifying the motion parameter output value of each moving object at the current moment in real time by using the object motion parameter real-time identification device of claim 8;
and the estimating unit is used for estimating the position coordinates and the reachable domain of each moving target at the current moment at the next moment according to the motion parameter output value of each moving target at the current moment.
10. An electronic device comprising a memory having stored therein a computer program and a processor that, when executing the computer program, implements the method of any of claims 1-7.
CN202211107066.1A 2022-09-13 2022-09-13 Real-time identification method and track estimation method and device for target motion parameters Active CN115187637B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211107066.1A CN115187637B (en) 2022-09-13 2022-09-13 Real-time identification method and track estimation method and device for target motion parameters

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211107066.1A CN115187637B (en) 2022-09-13 2022-09-13 Real-time identification method and track estimation method and device for target motion parameters

Publications (2)

Publication Number Publication Date
CN115187637A true CN115187637A (en) 2022-10-14
CN115187637B CN115187637B (en) 2022-11-22

Family

ID=83524861

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211107066.1A Active CN115187637B (en) 2022-09-13 2022-09-13 Real-time identification method and track estimation method and device for target motion parameters

Country Status (1)

Country Link
CN (1) CN115187637B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102663743A (en) * 2012-03-23 2012-09-12 西安电子科技大学 Multi-camera cooperative character tracking method in complex scene
CN105913459A (en) * 2016-05-10 2016-08-31 中国科学院自动化研究所 Moving object detection method based on high resolution continuous shooting images
CN109272530A (en) * 2018-08-08 2019-01-25 北京航空航天大学 Method for tracking target and device towards space base monitoring scene
CN109712173A (en) * 2018-12-05 2019-05-03 北京空间机电研究所 A kind of picture position method for estimating based on Kalman filter
WO2019218861A1 (en) * 2018-05-14 2019-11-21 华为技术有限公司 Method for estimating driving road and driving road estimation system
CN111361570A (en) * 2020-03-09 2020-07-03 福建汉特云智能科技有限公司 Multi-target tracking reverse verification method and storage medium
US20200265592A1 (en) * 2019-02-18 2020-08-20 Raytheon Company Three-frame difference target acquisition and tracking using overlapping target images
CN113920156A (en) * 2021-08-30 2022-01-11 暨南大学 Acceleration estimation method, system, device and medium based on video target tracking
CN114091561A (en) * 2020-08-05 2022-02-25 北京万集科技股份有限公司 Target tracking method, device, server and readable storage medium
CN114529602A (en) * 2022-04-24 2022-05-24 北京开运联合信息技术集团股份有限公司 Space multi-target situation monitoring method and device


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fu Xiaoning et al., "Single-Station Passive Ranging Based on Photoelectric Imaging", Opto-Electronic Engineering *

Also Published As

Publication number Publication date
CN115187637B (en) 2022-11-22

Similar Documents

Publication Publication Date Title
CN107728615B (en) self-adaptive region division method and system
CN110866496A (en) Robot positioning and mapping method and device based on depth image
CN111209978B (en) Three-dimensional visual repositioning method and device, computing equipment and storage medium
CN115265523A (en) Robot simultaneous positioning and mapping method, device and readable medium
CN108388649B (en) Method, system, device and storage medium for processing audio and video
CN110866497A (en) Robot positioning and image building method and device based on dotted line feature fusion
CN115423846A (en) Multi-target track tracking method and device
WO2012004387A2 (en) Generalized notion of similarities between uncertain time series
CN115326051A (en) Positioning method and device based on dynamic scene, robot and medium
CN116958267B (en) Pose processing method and device, electronic equipment and storage medium
CN115187637B (en) Real-time identification method and track estimation method and device for target motion parameters
KR20180112374A (en) Feature point-based real-time camera pose estimation method and apparatus therefor
CN109489660A (en) Robot localization method and apparatus
CN109816726B (en) Visual odometer map updating method and system based on depth filter
CN112965076A (en) Multi-radar positioning system and method for robot
CN116958809A (en) Remote sensing small sample target detection method for feature library migration
CN114674328B (en) Map generation method, map generation device, electronic device, storage medium, and vehicle
WO2020084279A1 (en) Data communication
CN113763468B (en) Positioning method, device, system and storage medium
Berkvens et al. Asynchronous, electromagnetic sensor fusion in RatSLAM
US9514256B1 (en) Method and system for modelling turbulent flows in an advection-diffusion process
CN113721240A (en) Target association method and device, electronic equipment and storage medium
CN113033397A (en) Target tracking method, device, equipment, medium and program product
CN113791425A (en) Radar P display interface generation method and device, computer equipment and storage medium
JP3685023B2 (en) Target tracking device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant