CA3158929A1 - Robot tracking method, device, equipment, and computer-readable storage medium - Google Patents

Robot tracking method, device, equipment, and computer-readable storage medium

Info

Publication number
CA3158929A1
Authority
CA
Canada
Prior art keywords
timing
robot
model
state
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CA3158929A
Other languages
French (fr)
Inventor
Xinjiang ZHENG
Minghao LI
Guoxu FAN
Jingquan ZHAO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
10353744 Canada Ltd
Original Assignee
10353744 Canada Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 10353744 Canada Ltd filed Critical 10353744 Canada Ltd
Publication of CA3158929A1 publication Critical patent/CA3158929A1/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12: Target-seeking control

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robot tracking method, device and equipment and a computer readable storage medium, belonging to the field of intelligent robot control. The method comprises: at each moment of tracking, obtaining observation data obtained by observing a robot by at least two ultrasonic arrays (101); and estimating the motion state of the robot of each moment by using a preset augmented IMM-EKF algorithm, specifically including: respectively obtaining, by means of a number m of augmented EKF filters matched with m motion models corresponding to m motion states at k moment, the corresponding state estimation of the robot under each of the motion models at the k moment to obtain m state estimations, and performing weighting calculation on the m state estimations to obtain a state estimation result of the robot at the k moment, wherein each moment is represented by k moment and k and m are integers greater than zero (102). The intelligent robot can be stably and effectively tracked when the motion state of the robot is unknown and changeable, the phenomenon of mistracking or tracking loss is reduced, and the method is suitable for application scenarios with multiple ultrasonic arrays.

Description

ROBOT TRACKING METHOD, DEVICE, EQUIPMENT, AND COMPUTER-READABLE STORAGE MEDIUM
BACKGROUND OF THE INVENTION
Technical Field [0001] The present invention relates to the field of intelligent robot manipulation and control, and more particularly to a robot tracking method, and corresponding device, equipment, and computer-readable storage medium.
Description of Related Art
[0002] At present, intelligent robots have been applied in such various fields as ocean exploration, security, and medical care, and bring about great convenience to the development of science and the everyday life of people; it is therefore necessary to track robots in real time.
However, when an intelligent robot operates underwater or indoors, it is impossible to employ satellite positioning. The method of visual navigation is advantageous in obtaining complete information and surveying wide ranges, so it plays an important role in robot navigation, but its disadvantages rest in the longer time required to process visual images and inferior real-timeliness. Accordingly, scholars in the field have conducted research on positioning moving robots based on the radio frequency identification (RFID) technology; as should be pointed out, however, the precision of such wireless positioning technology is on the order of meters, and cannot meet the requirement of high-precision navigation and positioning of indoor robots. An alternative is to use the intelligent robot to transmit ultrasonic waves: by receiving the sound signals emitted by the robot through multiple ultrasonic arrays, acquiring the observation location of the robot after processing, and finally obtaining the location estimation of the robot through filtering by a tracking algorithm, such a method achieves higher tracking precision while satisfying the demand on real-timeliness.
[0003] Common tracking algorithms mainly include the extended Kalman filter, the unscented Kalman filter, and the particle filter, etc.; these algorithms exhibit excellent tracking effects in the case where the target motion model is known and the motion state remains essentially unchanged. However, in the actual target tracking process, the motion model is usually unknown and the motion state of the robot is frequently subject to change, so the tracking effects of the aforementioned algorithms will be degraded, or even lost. In comparison with a single ultrasonic reception array, a multi-array tracking system can obtain more motion state information of the target, and tracking precision is enhanced by the use of a corresponding fusion algorithm.
SUMMARY OF THE INVENTION
[0004] In order to solve problems pending in the state of the art, embodiments of the present invention provide a robot tracking method and a robot tracking device, whereby tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational amount is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its motion state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking. The technical solutions are as follows.
[0005] According to one aspect, there is provided a robot tracking method that comprises:
[0006] obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
[0007] employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining the state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of state estimations, and performing weighted calculation on the m number of state estimations to obtain a state estimation result of the robot at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0008] Further, the step of obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking includes:
[0009] obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
[0010] Further, the step of employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining the state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of state estimations, and performing weighted calculation on the m number of state estimations to obtain a state estimation result of the robot at timing k, includes:
[0011] a robot tracking system creating step: creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0012] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0013] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0014] $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0015] where $i, j = 1, 2, \ldots, m$ indexes the models, $n = 1, 2, \ldots, n$ indexes the ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $C_{ij}$ represents a probability of the target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents the state transfer matrix of the ith model at timing k, $x_k^i$ represents the target state under the ith motion model at timing k, $A_k^n$ represents the observation matrix of the nth array at timing k, $z_k^n$ represents the target state observation received by the nth array at timing k, $w_k^i$ represents the process noise of model i, $v_k^n$ represents the observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
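The patent leaves the concrete motion models $M_i$ and the transition matrix $C$ abstract. Purely for illustration, a two-model set could be built as follows; the constant-velocity and coordinated-turn models, the sampling interval, and all numeric values below are the editor's assumptions, not taken from the patent:

```python
import numpy as np

T = 0.1  # sampling interval in seconds (illustrative value)

# Model 1: constant-velocity (CV) state transfer matrix, state x = [px, vx, py, vy]^T
F_cv = np.array([[1, T, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 1, T],
                 [0, 0, 0, 1]], dtype=float)

# Model 2: coordinated-turn (CT) model with assumed turn rate w (rad/s)
w = 0.5
F_ct = np.array([[1, np.sin(w*T)/w,       0, -(1-np.cos(w*T))/w],
                 [0, np.cos(w*T),         0, -np.sin(w*T)],
                 [0, (1-np.cos(w*T))/w,   1,  np.sin(w*T)/w],
                 [0, np.sin(w*T),         0,  np.cos(w*T)]], dtype=float)

# Model transition probabilities C[i, j] = P(M_k = M_j | M_{k-1} = M_i)
C = np.array([[0.95, 0.05],
              [0.05, 0.95]])
```

Each row of $C$ sums to one, since the target must occupy some model at timing k.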
[0016] a model input interacting step: letting $x_{k-1|k-1}^i$ be the state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be the corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be the probability of model i at timing k-1, after interactive calculation, the input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0017] $x_{k-1|k-1}^{0j} = \sum_{i=1}^{m} x_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
[0018] $P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{P_{k-1|k-1}^i + [x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}][x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}]^T\} \mu_{k-1}^{i|j}$;
[0019] where $\mu_{k-1}^{i|j} = C_{ij}\mu_{k-1}^i / \bar{c}_j$, $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu_{k-1}^i$;
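The input interaction formulae above admit a direct numpy sketch; the function name `imm_mixing` and the list-based data layout are the editor's illustrative choices, not part of the patent:

```python
import numpy as np

def imm_mixing(x_list, P_list, mu_prev, C):
    """Compute the mixed inputs x^{0j}, P^{0j} for each model-matched filter.

    x_list:  list of m state estimates at timing k-1
    P_list:  list of m covariance matrices at timing k-1
    mu_prev: model probabilities mu^i_{k-1}, shape (m,)
    C:       transition matrix, C[i, j] = P(model j at k | model i at k-1)
    """
    m = len(x_list)
    cbar = C.T @ mu_prev                    # cbar_j = sum_i C_ij * mu^i_{k-1}
    mu_mix = (C * mu_prev[:, None]) / cbar  # mu_mix[i, j] = C_ij * mu^i / cbar_j
    x0, P0 = [], []
    for j in range(m):
        xj = sum(mu_mix[i, j] * x_list[i] for i in range(m))
        Pj = sum(mu_mix[i, j] * (P_list[i]
                 + np.outer(x_list[i] - xj, x_list[i] - xj)) for i in range(m))
        x0.append(xj)
        P0.append(Pj)
    return x0, P0, cbar
```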
[0020] a sub-model filtering step: calculating and obtaining the corresponding inputs $x_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
[0021] a model probability updating step: calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
[0022] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0023] where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu_{k-1}^j$, $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$;
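A minimal sketch of this update, with illustrative names (`likelihoods` holding the $\Lambda_k^i$ values and `cbar` the $\bar{c}_i$ values from the interaction step):

```python
import numpy as np

def imm_update_probs(likelihoods, cbar):
    """Update model probabilities: mu^i_k = Lambda^i_k * cbar_i / c."""
    mu = likelihoods * cbar
    return mu / mu.sum()  # the denominator c normalizes the probabilities
```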
[0024] an estimation fusion outputting step: calculating the state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations, and covariance matrix estimations of the various models, the calculation formulae being as follows:
$x_{k|k} = \sum_{i=1}^{m} x_{k|k}^i \mu_k^i$;
[0025] $P_{k|k} = \sum_{i=1}^{m} \{P_{k|k}^i + [x_{k|k} - x_{k|k}^i][x_{k|k} - x_{k|k}^i]^T\} \mu_k^i$;
[0026] where $x_{k|k}$ represents the target state estimation at timing k, and $P_{k|k}$ represents the target state covariance matrix estimation at timing k.
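The fusion output can be sketched as follows (illustrative implementation; the spread term $[x_{k|k} - x_{k|k}^i][x_{k|k} - x_{k|k}^i]^T$ inflates the covariance to account for disagreement between the models):

```python
import numpy as np

def imm_fuse(x_list, P_list, mu):
    """Probability-weighted fusion of the per-model estimates."""
    m = len(x_list)
    x = sum(mu[i] * x_list[i] for i in range(m))
    P = sum(mu[i] * (P_list[i] + np.outer(x - x_list[i], x - x_list[i]))
            for i in range(m))
    return x, P
```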
[0027] Further, the sub-model filtering step includes:
[0028] a state predicting sub-step: with respect to the various models $i = 1, 2, \ldots, m$, calculating the corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0029] $x_{k|k-1}^i = F_{k-1}^i x_{k-1|k-1}^{0i}$;
[0030] $P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
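This prediction sub-step is the standard Kalman time update applied to the mixed input; a sketch with illustrative names:

```python
import numpy as np

def ekf_predict(x0, P0, F, Q):
    """Time update for one model-matched filter, starting from the mixed input."""
    x_pred = F @ x0
    P_pred = F @ P0 @ F.T + Q
    return x_pred, P_pred
```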
[0031] a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, the formulae of the various corresponding variables being as follows:
[0032] $z_k = [(z_k^1)^T, (z_k^2)^T]^T$;
[0033] $A_k = [(A_k^1)^T, (A_k^2)^T]^T$;
[0034] $R_k = \mathrm{diag}[R_k^1, R_k^2]$;
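The dimension extension stacks the per-array measurements and observation matrices and places the noise covariances on a block diagonal. A sketch generalizing the two-array formulae above to any number of arrays (the helper name is the editor's):

```python
import numpy as np

def extend_dimension(z_list, A_list, R_list):
    """Stack per-array measurements, observation matrices, and noise covariances."""
    z = np.concatenate(z_list)   # [(z^1)^T, (z^2)^T, ...]^T
    A = np.vstack(A_list)        # [(A^1)^T, (A^2)^T, ...]^T
    total = sum(R.shape[0] for R in R_list)
    R_big = np.zeros((total, total))
    off = 0
    for R in R_list:             # R = diag[R^1, R^2, ...]
        n = R.shape[0]
        R_big[off:off+n, off:off+n] = R
        off += n
    return z, A, R_big
```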
[0035] corresponding to the models $i = 1, 2, \ldots, m$, the calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0036] $v_k^i = z_k - A_k x_{k|k-1}^i$;
[0037] $S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
[0038] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows under the supposition that a condition of Gaussian distribution is abided by:
[0039] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2} (v_k^i)^T (S_k^i)^{-1} v_k^i\right]$;
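Under the Gaussian assumption, the likelihood can be evaluated as follows (illustrative sketch; `np.linalg.solve` is used instead of forming the explicit inverse):

```python
import numpy as np

def gaussian_likelihood(v, S):
    """Likelihood of residual v under N(0, S): |2*pi*S|^{-1/2} exp(-v^T S^{-1} v / 2)."""
    norm = np.sqrt(np.linalg.det(2.0 * np.pi * S))
    return np.exp(-0.5 * v @ np.linalg.solve(S, v)) / norm
```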
[0040] a filter updating sub-step: corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0041] $K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
[0042] $x_{k|k}^i = x_{k|k-1}^i + K_k^i v_k^i$;
[0043] $P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
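The gain and update formulae amount to a standard Kalman measurement update against the stacked observation; a sketch with illustrative names:

```python
import numpy as np

def ekf_update(x_pred, P_pred, z, A, R):
    """Measurement update for one model against the stacked observation."""
    v = z - A @ x_pred                    # measurement prediction residual
    S = A @ P_pred @ A.T + R              # measurement covariance
    K = P_pred @ A.T @ np.linalg.inv(S)   # filter gain
    x = x_pred + K @ v                    # state estimation update
    P = P_pred - K @ A @ P_pred           # error covariance matrix
    return x, P, v, S
```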
[0044] According to another aspect, there is provided a robot tracking device that comprises:
[0045] a data obtaining module, for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
[0046] a calculating module, for employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining the state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of state estimations, and performing weighted calculation on the m number of state estimations to obtain a state estimation result at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0047] Further, the data obtaining module is employed for:
[0048] obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
[0049] Further, the calculating module includes a robot tracking system creating module for:
[0050] creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0051] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0052] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0053] $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0054] where $i, j = 1, 2, \ldots, m$ indexes the models, $n = 1, 2, \ldots, n$ indexes the ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $C_{ij}$ represents a probability of the target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents the state transfer matrix of the ith model at timing k, $x_k^i$ represents the target state under the ith motion model at timing k, $A_k^n$ represents the observation matrix of the nth array at timing k, $z_k^n$ represents the target state observation received by the nth array at timing k, $w_k^i$ represents the process noise of model i, $v_k^n$ represents the observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
[0055] a model input interacting module, for letting $x_{k-1|k-1}^i$ be the state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be the corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be the probability of model i at timing k-1, after interactive calculation, the input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0056] $x_{k-1|k-1}^{0j} = \sum_{i=1}^{m} x_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
[0057] $P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{P_{k-1|k-1}^i + [x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}][x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}]^T\} \mu_{k-1}^{i|j}$;
[0058] where $\mu_{k-1}^{i|j} = C_{ij}\mu_{k-1}^i / \bar{c}_j$, $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu_{k-1}^i$;
[0059] a sub-model filtering module, for calculating and obtaining the corresponding inputs $x_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
[0060] a model probability updating module, for calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
[0061] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0062] where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu_{k-1}^j$, $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
[0063] an estimation fusion outputting module, for calculating state estimation and covariance matrix estimation of the target at the current timing according to update probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:

[0064] $x_{k|k} = \sum_{i=1}^{m} x_{k|k}^i \mu_k^i$;
[0065] $P_{k|k} = \sum_{i=1}^{m} \{P_{k|k}^i + [x_{k|k} - x_{k|k}^i][x_{k|k} - x_{k|k}^i]^T\} \mu_k^i$;
[0066] where $x_{k|k}$ represents the target state estimation at timing k, and $P_{k|k}$ represents the target state covariance matrix estimation at timing k.
[0067] Further, the sub-model filtering module includes:
[0068] a state predicting sub-module for, with respect to the various models $i = 1, 2, \ldots, m$, calculating the corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0069] $x_{k|k-1}^i = F_{k-1}^i x_{k-1|k-1}^{0i}$;
[0070] $P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
[0071] a data fusing sub-module, for employing a dimension-extended algorithm to perform data fusion, formulae of various corresponding variables being as follows:
[0072] $z_k = [(z_k^1)^T, (z_k^2)^T]^T$;
[0073] $A_k = [(A_k^1)^T, (A_k^2)^T]^T$;
[0074] $R_k = \mathrm{diag}[R_k^1, R_k^2]$;
[0075] corresponding to the models i=1, 2... m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0076] $v_k^i = z_k - A_k x_{k|k-1}^i$;
[0077] $S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
[0078] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:
[0079] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2} (v_k^i)^T (S_k^i)^{-1} v_k^i\right]$; and
[0080] a filter updating sub-module, for, corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0081] $K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
[0082] $x_{k|k}^i = x_{k|k-1}^i + K_k^i v_k^i$;
[0083] $P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
[0084] According to still another aspect, there is provided a robot tracking equipment that comprises:
[0085] a processor; and
[0086] a memory, for storing an executable instruction of the processor;
wherein
[0087] the processor is configured to execute, via the executable instruction, the steps of the robot tracking method according to any of the aforementioned solutions.
[0088] According to yet another aspect, there is provided a computer-readable storage medium storing a computer program; the steps of the robot tracking method according to any of the aforementioned solutions are realized when the computer program is executed by a processor.
[0089] The technical solutions provided by the embodiments of the present invention bring about the following advantageous effects.
[0090] 1. By arranging multiple ultrasonic arrays, observation data is obtained at each timing of tracking the robot, each step of the iterative process is performed with measurement and dimension extension through a preset dimension-extended IMM-EKF algorithm on the basis of the IMM-EKF algorithm, more target motion state information is obtained, and such a solution is applicable to multiple ultrasonic arrays.
[0091] 2. The primary observation data is fully utilized, the fusion effect is optimized, tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational amount is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its motion state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking.
[0092] 3. The dimension-extended IMM-EKF algorithm is employed to track the robot, whereby it is made possible to effectively weaken the influence of reverberation and noise on the tracking precision; the tracking error is rendered apparently lower than that of the traditional IMM-EKF algorithm, and excellent robustness is also exhibited in tracking scenarios where observation data is missing.
BRIEF DESCRIPTION OF THE DRAWINGS
[0093] To more clearly describe the technical solutions in the embodiments of the present invention, drawings required to illustrate the embodiments are briefly introduced below.
Apparently, the drawings introduced below are merely directed to some embodiments of the present invention, while persons ordinarily skilled in the art may further acquire other drawings on the basis of these drawings without spending creative effort in the process.
[0094] Fig. 1 is a flowchart illustrating the robot tracking method provided by an embodiment of the present invention;
[0095] Fig. 2 is a flowchart illustrating sub-steps of step 102 in Fig. 1;
[0096] Fig. 3 is a flowchart illustrating sub-steps of step 1023 in Fig. 2;
[0097] Fig. 4 is a flow block diagram illustrating the robot tracking method provided by an embodiment of the present invention;
[0098] Fig. 5 is a view schematically illustrating the process of calculating the state estimation result in the robot tracking method provided by an embodiment of the present invention;
[0099] Fig. 6 is a view schematically illustrating the structure of the robot tracking device provided by an embodiment of the present invention;
[0100] Fig. 7 is a view schematically illustrating the formation of the robot tracking equipment provided by an embodiment of the present invention;
[0101] Fig. 8 is a view illustrating the effect comparison of tracking tracks between the robot tracking solution provided by the embodiments of the present invention and a currently available solution; and
[0102] Fig. 9 is a view illustrating the effect comparison of tracking errors between the robot tracking solution provided by the embodiments of the present invention and a currently available solution.
DETAILED DESCRIPTION OF THE INVENTION
[0103] To make more lucid and clear the objectives, technical solutions and advantages of the present invention, the technical solutions in the embodiments of the present invention will be clearly and comprehensively described below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the embodiments as described are merely partial, rather than the entire, embodiments of the present invention.
Any other embodiments makeable by persons ordinarily skilled in the art on the basis of the embodiments in the present invention without creative effort shall all fall within the protection scope of the present invention. The wordings of "plural", "a plurality of" and "multiple" as used in the description of the present invention mean "two or more", unless definitely and specifically defined otherwise.

Date Recue/Date Received 2022-04-25
[0104] In the robot tracking method, and corresponding device, equipment and computer-readable storage medium provided by the embodiments of the present invention, by arranging multiple ultrasonic arrays, observation data is obtained at each timing of tracking the robot, and each step of the iterative process is performed with measurement dimension extension through a preset dimension-extended IMM-EKF algorithm on the basis of the IMM-EKF algorithm, so that more target motion state information is obtained and the solution is applicable to multiple ultrasonic arrays. The primary observation data is fully utilized, the fusion effect is optimized, tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational amount is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its motion state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking. Therefore, the robot tracking method is applicable to the application field of intelligent robot manipulation and control, and particularly applicable to application scenarios with multiple ultrasonic arrays.
[0105] The robot tracking method, and corresponding device, equipment and computer-readable storage medium provided by the embodiments of the present invention are described in greater detail below in conjunction with specific embodiments and accompanying drawings.
[0106] Fig. 1 is a flowchart illustrating the robot tracking method provided by an embodiment of the present invention. Fig. 2 is a flowchart illustrating sub-steps of step 102 in Fig. 1.
Fig. 3 is a flowchart illustrating sub-steps of step 1023 in Fig. 2.
[0107] As shown in Fig. 1, the robot tracking method provided by an embodiment of the present invention comprises the following steps.
[0108] 101 - obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking.

[0109] The step of obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking includes:
[0110] obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of measured robot angle and distance data.
[0111] As is noticeable, the process of step 101 can further be realized via other modes besides the mode specified in the above step, and the specific modes are not restricted in the embodiments of the present invention.
[0112] 102 - employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining the state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of state estimations, and performing weighted calculation on the m number of state estimations to obtain a state estimation result of the robot at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0113] As shown in Fig. 2, step 102 further includes the following sub-steps:
[0114] 1021 - a robot tracking system creating step:
creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0115] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0116] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0117] $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0118] where $i, j = 1, 2, \ldots, m$ indexes the models, $n = 1, 2, \ldots, n$ indexes the ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in N$ represents timing, $C_{ij}$ represents a probability of the target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents the state transfer matrix of the ith model at timing k, $x_k^i$ represents the target state under the ith motion model at timing k, $A_k^n$ represents the observation matrix of the nth array at timing k, $z_k^n$ represents the target state observation received by the nth array at timing k, $w_k^i$ represents the process noise of model i, $v_k^n$ represents the observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
[0119] 1022 - a model input interacting step: letting $x_{k-1|k-1}^i$ be the state estimation of dimension-extended EKF filter i at timing k-1, $P_{k-1|k-1}^i$ be the corresponding covariance matrix estimation, and $\mu_{k-1}^i$ be the probability of model i at timing k-1, after interactive calculation, the input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0120] $x_{k-1|k-1}^{0j} = \sum_{i=1}^{m} x_{k-1|k-1}^i \mu_{k-1}^{i|j}$;
[0121] $P_{k-1|k-1}^{0j} = \sum_{i=1}^{m} \{P_{k-1|k-1}^i + [x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}][x_{k-1|k-1}^i - x_{k-1|k-1}^{0j}]^T\} \mu_{k-1}^{i|j}$;
[0122] where $\mu_{k-1}^{i|j} = C_{ij}\mu_{k-1}^i / \bar{c}_j$, $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu_{k-1}^i$;
[0123] 1023 - a sub-model filtering step: calculating and obtaining the corresponding inputs $x_{k-1|k-1}^{0i}, P_{k-1|k-1}^{0i}$ $(i = 1, 2, \ldots, m)$ at the various dimension-extended EKF filters, and employing the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under the various models;
[0124] 1024 - a model probability updating step: calculating model probabilities of the various models $i = 1, 2, \ldots, m$, the calculation formula being as follows:
[0125] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0126] where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu_{k-1}^j$, $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
[0127] 1025 - an estimation fusion outputting step: calculating the state estimation and covariance matrix estimation of the target at the current timing according to the updated probabilities, state estimations, and covariance matrix estimations of the various models, the calculation formulae being as follows:
[0128] $x_{k|k} = \sum_{i=1}^{m} x_{k|k}^i \mu_k^i$;
[0129] $P_{k|k} = \sum_{i=1}^{m} \{P_{k|k}^i + [x_{k|k} - x_{k|k}^i][x_{k|k} - x_{k|k}^i]^T\} \mu_k^i$;
[0130] where $x_{k|k}$ represents the target state estimation at timing k, and $P_{k|k}$ represents the target state covariance matrix estimation at timing k.
[0131] As shown in Fig. 3, the aforementioned sub-model filtering step further includes the following sub-steps:
[0132] 1023a - a state predicting sub-step: with respect to the various models $i = 1, 2, \ldots, m$, calculating the corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0133] $x_{k|k-1}^i = F_{k-1}^i x_{k-1|k-1}^{0i}$;
[0134] $P_{k|k-1}^i = F_{k-1}^i P_{k-1|k-1}^{0i} (F_{k-1}^i)^T + Q_{k-1}^i$;
[0135] 1023b - a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, the formulae of the various corresponding variables being as follows:
[0136] $z_k = [(z_k^1)^T, (z_k^2)^T]^T$;
[0137] $A_k = [(A_k^1)^T, (A_k^2)^T]^T$;
[0138] $R_k = \mathrm{diag}[R_k^1, R_k^2]$;
[0139] corresponding to the models i=1, 2... m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0140] $v_k^i = z_k - A_k x_{k|k-1}^i$;
[0141] $S_k^i = A_k P_{k|k-1}^i (A_k)^T + R_k$;
[0142] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows under the supposition that a condition of Gaussian distribution is abided by:
[0143] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2} (v_k^i)^T (S_k^i)^{-1} v_k^i\right]$; and
[0144] 1023c - a filter updating sub-step: corresponding to the models $i = 1, 2, \ldots, m$, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0145] $K_k^i = P_{k|k-1}^i (A_k)^T (S_k^i)^{-1}$;
[0146] $x_{k|k}^i = x_{k|k-1}^i + K_k^i v_k^i$;
[0147] $P_{k|k}^i = P_{k|k-1}^i - K_k^i A_k P_{k|k-1}^i$.
[0148] Fig. 4 is a flow block diagram illustrating the robot tracking method provided by an embodiment of the present invention, Fig. 5 is a view schematically illustrating the process of calculating state calculation result in the robot tracking method provided by an embodiment of the present invention, and the two Figures together demonstrate a mode of execution in which two ultrasonic arrays are selected and used.
[0149] As is noticeable, the process of step 102 can further be realized via other modes besides the mode specified in the above step, and the specific modes are not restricted in the embodiments of the present invention.
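For concreteness, the per-timing cycle of steps 1021 through 1025 can be sketched as a single function. This is an editor's illustration under linear-model assumptions, not the patent's implementation; all names and the numpy-based layout are assumed:

```python
import numpy as np

def augmented_imm_ekf_step(x_list, P_list, mu, C,
                           F_list, Q_list, z_list, A_list, R_list):
    """One timing of the dimension-extended IMM-EKF (illustrative sketch)."""
    m = len(F_list)
    # 1021-1022: input interaction (mixing)
    cbar = C.T @ mu
    w = (C * mu[:, None]) / cbar
    x0 = [sum(w[i, j] * x_list[i] for i in range(m)) for j in range(m)]
    P0 = [sum(w[i, j] * (P_list[i]
              + np.outer(x_list[i] - x0[j], x_list[i] - x0[j]))
              for i in range(m)) for j in range(m)]
    # dimension extension of the measurements from all arrays
    z = np.concatenate(z_list)
    A = np.vstack(A_list)
    R = np.zeros((z.size, z.size))
    off = 0
    for Rn in R_list:
        R[off:off + Rn.shape[0], off:off + Rn.shape[0]] = Rn
        off += Rn.shape[0]
    # 1023: per-model predict / update and likelihoods
    xs, Ps, lam = [], [], np.empty(m)
    for j in range(m):
        xp = F_list[j] @ x0[j]
        Pp = F_list[j] @ P0[j] @ F_list[j].T + Q_list[j]
        v = z - A @ xp
        S = A @ Pp @ A.T + R
        K = Pp @ A.T @ np.linalg.inv(S)
        xs.append(xp + K @ v)
        Ps.append(Pp - K @ A @ Pp)
        lam[j] = (np.exp(-0.5 * v @ np.linalg.solve(S, v))
                  / np.sqrt(np.linalg.det(2.0 * np.pi * S)))
    # 1024: model probability update
    mu_new = lam * cbar
    mu_new /= mu_new.sum()
    # 1025: estimation fusion output
    x = sum(mu_new[j] * xs[j] for j in range(m))
    P = sum(mu_new[j] * (Ps[j] + np.outer(x - xs[j], x - xs[j]))
            for j in range(m))
    return x, P, xs, Ps, mu_new
```

With two identical one-dimensional models and two arrays each reporting a measurement of 1, one step moves the estimate from 0 toward the measurements while keeping the model probabilities normalized.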
[0150] Fig. 6 is a view schematically illustrating the structure of the robot tracking device provided by an embodiment of the present invention, as shown in Fig. 6, the robot tracking device provided by an embodiment of the present invention comprises a data obtaining module 1 and a calculating module 2.
[0151] The data obtaining module 1 is employed for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking. Specifically, the data obtaining module 1 is employed for: obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angle and distance data measured by the at least two ultrasonic arrays.

[0152] The calculating module 2 is employed for: employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
[0153] Specifically, the calculating module 2 includes a robot tracking system creating module 21, a model input interacting module 22, a sub-model filtering module 23, a model probability updating module 24 and an estimation fusion outputting module 25.
[0154] The robot tracking system creating module 21 is employed for:
[0155] creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
[0156] the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$;
[0157] the observation equation: $z_k^n = A_k^n x_k + v_k^n$;
[0158] $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
[0159] where i, j = 1, 2, ..., m represents the number of models, n = 1, 2, ..., n represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in \mathbb{N}$ represents timing, $C_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively.
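One concrete instance of the motion and observation equations can be sketched as follows. This is our own illustration, not part of the patent: a constant-velocity model is assumed, and the sampling period `T` and all numeric covariances are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constant-velocity motion model: state [x, y, vx, vy],
# sampling period T (an assumed value, not specified in the text).
T = 0.1
F = np.array([[1, 0, T, 0],
              [0, 1, 0, T],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)   # state transfer matrix F_k^i
Q = 0.01 * np.eye(4)                        # process noise covariance Q_k^i
A = np.array([[1.0, 0, 0, 0],               # observation matrix A_k^n:
              [0, 1.0, 0, 0]])              # the array observes position only
R = 0.05 * np.eye(2)                        # observation noise covariance R_k^n

x = np.array([0.0, 0.0, 1.0, 0.5])
x_next = F @ x + rng.multivariate_normal(np.zeros(4), Q)  # motion equation
z = A @ x_next + rng.multivariate_normal(np.zeros(2), R)  # observation equation
```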

[0160] The model input interacting module 22 is employed for:
[0161] letting $\hat{x}^i_{k-1|k-1}$ be state estimation of dimension-extended EKF filter i at timing k-1, $P^i_{k-1|k-1}$ be corresponding covariance matrix estimation, and $\mu^i_{k-1}$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
[0162] $\hat{x}^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \hat{x}^i_{k-1|k-1} \mu^{i|j}_{k-1}$;
[0163] $P^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \{P^i_{k-1|k-1} + [\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}][\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}]^T\}\mu^{i|j}_{k-1}$;
[0164] where $\mu^{i|j}_{k-1} = C_{ij}\mu^i_{k-1}/\bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu^i_{k-1}$.
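The input interaction formulae above can be sketched as a short NumPy routine. This is an illustrative sketch only; the function name `imm_mixing` and the array layout are our own choices:

```python
import numpy as np

def imm_mixing(x_prev, P_prev, mu_prev, C):
    """IMM input interaction: mix the per-model estimates at timing k-1.

    x_prev : (m, d) per-model state estimates
    P_prev : (m, d, d) per-model covariance estimates
    mu_prev: (m,) model probabilities
    C      : (m, m) transition matrix, C[i, j] = P(model j at k | model i at k-1)
    """
    m = len(mu_prev)
    c_bar = C.T @ mu_prev                          # normalizers c_bar_j
    mix = (C * mu_prev[:, None]) / c_bar[None, :]  # mixing weights mu^{i|j}
    x0 = mix.T @ x_prev                            # mixed states x^{0j}
    P0 = np.zeros_like(P_prev)
    for j in range(m):
        for i in range(m):
            dx = x_prev[i] - x0[j]                 # spread-of-means correction
            P0[j] += mix[i, j] * (P_prev[i] + np.outer(dx, dx))
    return x0, P0, c_bar
```

When all models carry identical estimates, mixing leaves the states and covariances unchanged, which is a quick sanity check on the weights.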
[0165] The sub-model filtering module 23 is employed for:
[0166] calculating and obtaining corresponding inputs $\hat{x}^{0i}_{k-1|k-1}, P^{0i}_{k-1|k-1}$ (i = 1, 2, ..., m) at various dimension-extended EKF filters, and employing the obtained measurement $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under various models.
[0167] The model probability updating module 24 is employed for: calculating model probabilities on various models i=1, 2, ...m, the calculation formula being as follows:
[0168] $\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
[0169] where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu^j_{k-1}$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$.
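The model probability update is a simple normalized product of likelihood and prior weight. A minimal sketch, with our own function name `update_model_probs`:

```python
import numpy as np

def update_model_probs(lik, mu_prev, C):
    """Model probability update: mu_k^i proportional to Lambda_k^i * c_bar_i.

    lik    : (m,) per-model likelihoods from the sub-model filters
    mu_prev: (m,) model probabilities at timing k-1
    C      : (m, m) model transition matrix
    """
    c_bar = C.T @ mu_prev          # c_bar_i = sum_j C_ji * mu_{k-1}^j
    unnorm = lik * c_bar
    return unnorm / unnorm.sum()   # divide by c = sum_i Lambda_k^i * c_bar_i
```

The model whose filter explains the stacked measurement best (largest likelihood) gains probability mass at each timing.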
[0170] The estimation fusion outputting module 25 is employed for:
[0171] calculating state estimation and covariance matrix estimation of the target at the current timing according to update probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
[0172] $\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}^i_{k|k} \mu_k^i$;
[0173] $P_{k|k} = \sum_{i=1}^{m} \{P^i_{k|k} + [\hat{x}_{k|k} - \hat{x}^i_{k|k}][\hat{x}_{k|k} - \hat{x}^i_{k|k}]^T\}\mu_k^i$;
[0174] where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
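The estimation fusion output is a moment-matched weighted combination of the per-model updates. An illustrative sketch (the function name `imm_fuse` is our own):

```python
import numpy as np

def imm_fuse(x_upd, P_upd, mu):
    """Fuse per-model estimates into one output estimate and covariance.

    x_upd: (m, d) updated per-model states
    P_upd: (m, d, d) updated per-model covariances
    mu   : (m,) updated model probabilities
    """
    x_fused = mu @ x_upd                     # probability-weighted state
    P_fused = np.zeros_like(P_upd[0])
    for i in range(len(mu)):
        dx = x_fused - x_upd[i]
        # per-model covariance plus spread-of-means term
        P_fused += mu[i] * (P_upd[i] + np.outer(dx, dx))
    return x_fused, P_fused
```

The spread-of-means term inflates the fused covariance when the models disagree, so the output honestly reflects model uncertainty.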
[0175] Further, the sub-model filtering module 23 includes a state predicting sub-module 231, a data fusing sub-module 232 and a filter updating sub-module 233.
[0176] The state predicting sub-module 231 is employed for:
[0177] with respect to the various models i=1, 2... m, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
[0178] $\hat{x}^i_{k|k-1} = F^i_{k-1} \hat{x}^{0i}_{k-1|k-1}$;
[0179] $P^i_{k|k-1} = F^i_{k-1} P^{0i}_{k-1|k-1} (F^i_{k-1})^T + Q^i_{k-1}$;
[0180] The data fusing sub-module 232 is employed for:
[0181] employing a dimension-extended algorithm to perform data fusion, formulae of various corresponding variables being as follows:
[0182] $Z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
[0183] $A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
[0184] $R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
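The dimension-extension step simply stacks the per-array measurements, observation matrices, and noise covariances. A minimal sketch assuming independent array noise (the function name `stack_measurements` is ours):

```python
import numpy as np

def stack_measurements(z_list, A_list, R_list):
    """Stack n per-array quantities into the dimension-extended Z_k, A_k, R_k."""
    Z = np.concatenate(z_list)              # Z_k: stacked measurement vector
    A = np.vstack(A_list)                   # A_k: stacked observation matrix
    total = sum(Ri.shape[0] for Ri in R_list)
    R = np.zeros((total, total))            # R_k: block-diagonal noise covariance
    ofs = 0
    for Ri in R_list:
        n = Ri.shape[0]
        R[ofs:ofs + n, ofs:ofs + n] = Ri
        ofs += n
    return Z, A, R
```

The block-diagonal form of $R_k$ encodes the assumption that the arrays' observation noises are mutually independent.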
[0185] corresponding to the models i=1, 2... m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
[0186] $v_k^i = Z_k - A_k \hat{x}^i_{k|k-1}$;
[0187] $S_k^i = A_k P^i_{k|k-1} (A_k)^T + R_k$;
[0188] a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:

[0189] $\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i\right]$.
[0190] The filter updating sub-module 233 is employed for:
[0191] corresponding to the models i=1, 2... m, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
[0192] $K_k^i = P^i_{k|k-1} (A_k)^T (S_k^i)^{-1}$;
[0193] $\hat{x}^i_{k|k} = \hat{x}^i_{k|k-1} + K_k^i v_k^i$;
[0194] $P^i_{k|k} = P^i_{k|k-1} - K_k^i A_k P^i_{k|k-1}$.
[0195] Fig. 7 is a view schematically illustrating the formation of the robot tracking equipment provided by an embodiment of the present invention. As shown in Fig. 7, the robot tracking equipment provided by an embodiment of the present invention comprises a processor 3 and a memory 4; the memory 4 is for storing an executable instruction of the processor 3, and the processor 3 is configured to execute, via the executable instruction, steps of the robot tracking method according to any of the aforementioned solutions.
[0196] An embodiment of the present invention further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the robot tracking method according to any of the aforementioned solutions are realized.
[0197] As should be noted, when the robot tracking device provided by the aforementioned embodiment triggers a robot tracking business, the division into the aforementioned various functional modules is merely by way of example; in actual application, the functions may be assigned, as required, to different functional modules for completion, that is to say, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In addition, the robot tracking device, robot tracking equipment and computer-readable storage medium that trigger the robot tracking business provided by the aforementioned embodiments pertain to the same conception as the robot tracking method that triggers the robot tracking business as provided by the method embodiment; see the corresponding method embodiment for their specific realization processes, which will not be repeated in this context.
[0198] All of the aforementioned optional technical solutions can be combined in any manner to form the optional embodiments of the present invention, which will not be described one by one in this context.
[0199] To demonstrate the advantages of the robot tracking solution provided by the embodiments of the present invention in tracking automated equipment indoors, measurement data of the robot is processed through the robot tracking method provided by the embodiments of the present invention, the IMM-EKF method and the weighted IMM-EKF method, to realize state estimation of the robot, and the results are as shown in Fig. 8.
[0200] Fig. 8 is a view illustrating the effect comparison of tracking tracks between the robot tracking solution provided by the embodiments of the present invention and a currently available solution. Fig. 9 is a view illustrating the effect comparison of tracking errors between the robot tracking solution provided by the embodiments of the present invention and a currently available solution.
[0201] As shown in Fig. 9, to further depict the performances of the different methods, tracking errors of the above estimation results are calculated to evaluate the performances. The error formula of state estimation at timing $t_k$ is as follows:
[0202] $RMSE = \sqrt{(\hat{x}_k - x_k)^2 + (\hat{y}_k - y_k)^2}$
[0203] where $(\hat{x}_k, \hat{y}_k)$ represents the position coordinates obtained by the target state estimation at timing $t_k$, and $(x_k, y_k)$ represents the actual position of the target at timing $t_k$.
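The per-timing position error can be computed directly from the estimated and true coordinates. A minimal sketch (the function name `position_error` is our own; averaging these errors over a run yields figures comparable to those in Table 1):

```python
import numpy as np

def position_error(est_xy, true_xy):
    """Euclidean position error sqrt((x_hat - x)^2 + (y_hat - y)^2).

    Accepts a single (x, y) pair or an array of pairs; the last axis
    holds the coordinates.
    """
    est = np.asarray(est_xy, dtype=float)
    true = np.asarray(true_xy, dtype=float)
    return np.sqrt(((est - true) ** 2).sum(axis=-1))
```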
[0204] The following Table 1 shows the target average tracking errors of the three methods:
[0205]
    Tracking Algorithm              Tracking Error (m)
    Dimension-extended IMM-EKF      0.15
    Weighted IMM-EKF                0.22
    IMM-EKF                         3.9
[0206] Table 1
[0207] As can be seen, the tracking precision of the robot tracking method provided by the embodiments of the present invention is apparently better than that of the IMM-EKF algorithm; moreover, in comparison with the weighted IMM-EKF algorithm, the tracking error is reduced by approximately 30%.
[0208] In summary, in comparison with the prior-art technology, the robot tracking method, and corresponding device, equipment and computer-readable storage medium provided by the embodiments of the present invention achieve the following advantageous effects.
[0209] 1. By arranging multiple ultrasonic arrays, observation data is obtained at each timing of tracking the robot, each step of the iterative process is performed with measurement and dimension extension through a preset dimension-extended IMM-EKF algorithm on the basis of the IMM-EKF algorithm, more target motion state information is obtained, and such a solution is applicable to multiple ultrasonic arrays.
[0210] 2. The primary observation data is fully utilized, the fusion effect is optimized, tracking precision in tracking a robot indoors is enhanced, the tracking error is small, and the computational amount is relatively low, so as to realize stable and effective tracking of an intelligent robot even when its state is unknown and variable, and to reduce the phenomenon of erroneous tracking or failed tracking.
[0211] 3. The dimension-extended IMM-EKF algorithm is employed to track the intelligent robot, whereby the influence of reverberation and noise on the tracking precision is effectively weakened, the tracking error is rendered apparently lower than that of the traditional IMM-EKF algorithm, and excellent robustness is also exhibited in tracking scenarios where observation data is missing.
[0212] As understandable by persons ordinarily skilled in the art, realization of the entire or partial steps of the aforementioned embodiments can be completed by hardware, or by a program instructing relevant hardware, the program can be stored in a computer-readable storage medium, and the storage medium can be a read-only memory, a magnetic disk, or an optical disk, etc.
[0213] The embodiments of the present application are described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product embodied in the embodiments of the present application. As should be understood, each flow and/or block in the flowcharts and/or block diagrams, and any combination of flow and/or block in the flowcharts and/or block diagrams can be realized by computer program instructions. These computer program instructions can be supplied to a general computer, a dedicated computer, an embedded processor or a processor of any other programmable data processing device to form a machine, so that the instructions executed by the computer or the processor of any other programmable data processing device generate a device for realizing the functions designated in one or more flow(s) of the flowcharts and/or one or more block(s) of the block diagrams.
[0214] These computer program instructions can also be stored in a computer-readable memory enabling a computer or any other programmable data processing device to operate by a specific mode, so that the instructions stored in the computer-readable memory generate a product containing instructing means, and this instructing means realizes the functions designated in one or more flow(s) of the flowcharts and/or one or more block(s) of the block diagrams.
[0215] These computer program instructions can also be loaded onto a computer or any other programmable data processing device, so as to execute a series of operations and steps on the computer or the any other programmable device to generate computer-realized processing, so that the instructions executed on the computer or the any other programmable device provide steps for realizing the functions designated in one or more flow(s) of the flowcharts and/or one or more block(s) of the block diagrams.
[0216] Although preferred embodiments in the embodiments of the present application have been described so far, it is still possible for persons skilled in the art to make additional modifications and amendments to these embodiments upon learning of the basic inventive conception. Accordingly, the attached Claims are meant to cover the preferred embodiments and all modifications and amendments that fall within the scope of the embodiments of the present application.
[0217] Apparently, persons skilled in the art can make various amendments and modifications to the present invention without departing from the spirit and scope of the present invention.
Thus, if such amendments and modifications to the present invention fall within the Claims of the present invention and equivalent technology, the present invention is also meant to cover these amendments and modifications.
[0218] What is described above is merely directed to preferred embodiments of the present invention, and they are not meant to restrict the present invention. Any amendment, equivalent replacement and improvement that can be made within the spirit and scope of the present invention shall all be covered within the protection scope of the present invention.
Date Recue/Date Received 2022-04-25

Claims (10)

What is claimed is:
1. A robot tracking method, characterized in that the method comprises:
obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking;
and employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
2. The method according to Claim 1, characterized in that the step of obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking includes:
obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
3. The method according to Claim 2, characterized in that the step, employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result of the robot at timing k, includes:
a robot tracking system creating step: creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$; the observation equation: $z_k^n = A_k^n x_k + v_k^n$; $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
where i, j = 1, 2, ..., m represents the number of models, n = 1, 2, ..., n represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in \mathbb{N}$ represents timing, $C_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
a model input interacting step: letting $\hat{x}^i_{k-1|k-1}$ be state estimation of dimension-extended EKF filter i at timing k-1, $P^i_{k-1|k-1}$ be corresponding covariance matrix estimation, and $\mu^i_{k-1}$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
$\hat{x}^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \hat{x}^i_{k-1|k-1} \mu^{i|j}_{k-1}$;
$P^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \{P^i_{k-1|k-1} + [\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}][\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}]^T\}\mu^{i|j}_{k-1}$;
where $\mu^{i|j}_{k-1} = C_{ij}\mu^i_{k-1}/\bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu^i_{k-1}$;
a sub-model filtering step: calculating and obtaining corresponding inputs $\hat{x}^{0i}_{k-1|k-1}, P^{0i}_{k-1|k-1}$ (i = 1, 2, ..., m) at various dimension-extended EKF filters, and employing the obtained measurement $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under various models;
a model probability updating step: calculating model probabilities of the various models i = 1, 2, ..., m, the calculation formula being as follows:
$\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu^j_{k-1}$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
an estimation fusion outputting step: calculating state estimation and covariance matrix estimation of the target at the current timing according to update probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
$\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}^i_{k|k} \mu_k^i$;
$P_{k|k} = \sum_{i=1}^{m} \{P^i_{k|k} + [\hat{x}_{k|k} - \hat{x}^i_{k|k}][\hat{x}_{k|k} - \hat{x}^i_{k|k}]^T\}\mu_k^i$;
where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
4. The method according to Claim 3, characterized in that the sub-model filtering step includes:
a state predicting sub-step: with respect to the various models i = 1, 2, ..., m, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
$\hat{x}^i_{k|k-1} = F^i_{k-1} \hat{x}^{0i}_{k-1|k-1}$;
$P^i_{k|k-1} = F^i_{k-1} P^{0i}_{k-1|k-1} (F^i_{k-1})^T + Q^i_{k-1}$;
a data fusing sub-step: employing a dimension-extended algorithm to perform data fusion, formulae of various corresponding variables being as follows:
$Z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
$A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
$R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
corresponding to the models i = 1, 2, ..., m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
$v_k^i = Z_k - A_k \hat{x}^i_{k|k-1}$;
$S_k^i = A_k P^i_{k|k-1} (A_k)^T + R_k$;
a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:
$\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i\right]$; and
a filter updating sub-step: corresponding to the models i = 1, 2, ..., m, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
$K_k^i = P^i_{k|k-1} (A_k)^T (S_k^i)^{-1}$;
$\hat{x}^i_{k|k} = \hat{x}^i_{k|k-1} + K_k^i v_k^i$;
$P^i_{k|k} = P^i_{k|k-1} - K_k^i A_k P^i_{k|k-1}$.
5. A robot tracking device, characterized in that the device comprises:
a data obtaining module, for obtaining observation data of at least two ultrasonic arrays on a robot at each timing of tracking; and
a calculating module, for employing a preset dimension-extended IMM-EKF algorithm to estimate a motion state of the robot at each timing, obtaining state estimation to which the robot corresponds under each motion model at timing k through m number of dimension-extended EKF filters that match m number of motion models corresponding to m number of motion states at timing k, obtaining m number of states, and performing weighted calculation on the m number of states to obtain a state estimation result at timing k, wherein each timing is expressed by timing k, and k, m are each an integer greater than 0.
6. The device according to Claim 5, characterized in that the data obtaining module is employed for:
obtaining observation data $z_k^1, z_k^2, \ldots, z_k^n$ of at least two ultrasonic arrays on the robot at timing k, wherein k, n are each an integer greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are all vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
7. The device according to Claim 6, characterized in that the calculating module includes a robot tracking system creating module for:
creating the robot tracking system that includes a motion equation and an observation equation of the robot, expressed as:
the motion equation: $x_{k+1}^i = F_k^i x_k^i + w_k^i$; the observation equation: $z_k^n = A_k^n x_k + v_k^n$; $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i)$;
where i, j = 1, 2, ..., m represents the number of models, n = 1, 2, ..., n represents the number of ultrasonic arrays, m, n are each an integer greater than or equal to 1, $k \in \mathbb{N}$ represents timing, $C_{ij}$ represents a probability of a target transferring from model i at timing k-1 to model j at timing k, $F_k^i$ represents an ith model state transfer matrix at timing k, $x_k^i$ represents a target state under an ith motion model at timing k, $A_k^n$ represents an observation matrix of an nth array at timing k, $z_k^n$ represents target state observation received by the nth array at timing k, $w_k^i$ represents process noise of model i, $v_k^n$ represents observation noise of the nth array, and the two noises are both supposed to be white Gaussian noise with a mean value of zero and covariances of $Q_k^i$ and $R_k^n$ respectively;
a model input interacting module, for letting $\hat{x}^i_{k-1|k-1}$ be state estimation of dimension-extended EKF filter i at timing k-1, $P^i_{k-1|k-1}$ be corresponding covariance matrix estimation, and $\mu^i_{k-1}$ be a probability of model i at timing k-1, after interactive calculation, input calculation formulae of dimension-extended EKF filter j at timing k being as follows:
$\hat{x}^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \hat{x}^i_{k-1|k-1} \mu^{i|j}_{k-1}$;
$P^{0j}_{k-1|k-1} = \sum_{i=1}^{m} \{P^i_{k-1|k-1} + [\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}][\hat{x}^i_{k-1|k-1} - \hat{x}^{0j}_{k-1|k-1}]^T\}\mu^{i|j}_{k-1}$;
where $\mu^{i|j}_{k-1} = C_{ij}\mu^i_{k-1}/\bar{c}_j$ and $\bar{c}_j = \sum_{i=1}^{m} C_{ij}\mu^i_{k-1}$;
a sub-model filtering module, for calculating and obtaining corresponding inputs $\hat{x}^{0i}_{k-1|k-1}, P^{0i}_{k-1|k-1}$ (i = 1, 2, ..., m) at various dimension-extended EKF filters, and employing the obtained measurement $z_k^1, z_k^2, \ldots, z_k^n$ to perform corresponding state estimation updates under various models;
a model probability updating module, for calculating model probabilities of the various models i = 1, 2, ..., m, the calculation formula being as follows:
$\mu_k^i = \Lambda_k^i \bar{c}_i / c$;
where $\bar{c}_i = \sum_{j=1}^{m} C_{ji}\mu^j_{k-1}$ and $c = \sum_{i=1}^{m} \Lambda_k^i \bar{c}_i$; and
an estimation fusion outputting module, for calculating state estimation and covariance matrix estimation of the target at the current timing according to update probabilities, state estimations and covariance matrix estimations of the various models, the calculation formulae being as follows:
$\hat{x}_{k|k} = \sum_{i=1}^{m} \hat{x}^i_{k|k} \mu_k^i$;
$P_{k|k} = \sum_{i=1}^{m} \{P^i_{k|k} + [\hat{x}_{k|k} - \hat{x}^i_{k|k}][\hat{x}_{k|k} - \hat{x}^i_{k|k}]^T\}\mu_k^i$;
where $\hat{x}_{k|k}$ represents target state estimation at timing k, and $P_{k|k}$ represents target state covariance matrix estimation at timing k.
8. The device according to Claim 7, characterized in that the sub-model filtering module includes:
a state predicting sub-module, for, with respect to the various models i = 1, 2, ..., m, calculating corresponding prediction states and prediction covariance matrixes respectively, the calculation formulae being as follows:
$\hat{x}^i_{k|k-1} = F^i_{k-1} \hat{x}^{0i}_{k-1|k-1}$;
$P^i_{k|k-1} = F^i_{k-1} P^{0i}_{k-1|k-1} (F^i_{k-1})^T + Q^i_{k-1}$;
a data fusing sub-module, for employing a dimension-extended algorithm to perform data fusion, formulae of various corresponding variables being as follows:
$Z_k = [(z_k^1)^T, (z_k^2)^T, \ldots, (z_k^n)^T]^T$;
$A_k = [(A_k^1)^T, (A_k^2)^T, \ldots, (A_k^n)^T]^T$;
$R_k = \mathrm{diag}[R_k^1, R_k^2, \ldots, R_k^n]$;
corresponding to the models i = 1, 2, ..., m, calculation formulae of their respective measurement prediction residuals and measurement covariances being as follows:
$v_k^i = Z_k - A_k \hat{x}^i_{k|k-1}$;
$S_k^i = A_k P^i_{k|k-1} (A_k)^T + R_k$;
a likelihood function corresponding to model i being simultaneously calculated, the likelihood function being as follows in the supposition that a condition of Gaussian distribution is abided by:
$\Lambda_k^i = \frac{1}{\sqrt{|2\pi S_k^i|}} \exp\left[-\frac{1}{2}(v_k^i)^T (S_k^i)^{-1} v_k^i\right]$; and
a filter updating sub-module, for, corresponding to the models i = 1, 2, ..., m, respectively calculating their respective filter gains, state estimation updates, and error covariance matrixes, the calculation formulae being as follows:
$K_k^i = P^i_{k|k-1} (A_k)^T (S_k^i)^{-1}$;
$\hat{x}^i_{k|k} = \hat{x}^i_{k|k-1} + K_k^i v_k^i$;
$P^i_{k|k} = P^i_{k|k-1} - K_k^i A_k P^i_{k|k-1}$.
9. A robot tracking equipment, characterized in comprising:
a processor; and a memory, for storing an executable instruction of the processor; wherein the processor is configured to execute steps of the robot tracking method according to any one of Claims 1 to 4 via the executable instruction.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program, and the steps of the robot tracking method according to any one of Claims 1 to 4 are realized when the computer program is executed by a processor.
CA3158929A 2019-10-29 2020-07-30 Robot tracking method, device, equipment, and computer-readable storage medium Pending CA3158929A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201911048673.3 2019-10-29
CN201911048673.3A CN110849369B (en) 2019-10-29 2019-10-29 Robot tracking method, device, equipment and computer readable storage medium
PCT/CN2020/105997 WO2021082571A1 (en) 2019-10-29 2020-07-30 Robot tracking method, device and equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CA3158929A1 true CA3158929A1 (en) 2021-05-06

Family

ID=69599184

Family Applications (1)

Application Number Title Priority Date Filing Date
CA3158929A Pending CA3158929A1 (en) 2019-10-29 2020-07-30 Robot tracking method, device, equipment, and computer-readable storage medium

Country Status (3)

Country Link
CN (1) CN110849369B (en)
CA (1) CA3158929A1 (en)
WO (1) WO2021082571A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110849369B (en) * 2019-10-29 2022-03-29 苏宁云计算有限公司 Robot tracking method, device, equipment and computer readable storage medium
CN113534164B (en) * 2021-05-24 2023-12-12 中船海洋探测技术研究院有限公司 Target path tracking method based on active-passive combined sonar array
CN113805141B (en) * 2021-08-31 2023-06-30 西北工业大学 Single-station passive positioning method based on signal intensity
CN114021073A (en) * 2021-09-24 2022-02-08 西北工业大学 Multi-sensor cooperative target tracking method based on federal IMM
CN114018250B (en) * 2021-10-18 2024-05-03 杭州鸿泉物联网技术股份有限公司 Inertial navigation method, electronic device, storage medium and computer program product
CN114035154B (en) * 2021-11-10 2024-05-24 中国人民解放军空军工程大学 Single-station radio frequency signal positioning method assisted by motion parameters
CN114445456B (en) * 2021-12-23 2023-04-07 西北工业大学 Data-driven intelligent maneuvering target tracking method and device based on partial model
CN114488116B (en) * 2022-01-17 2024-04-26 武汉大学 3D target tracking method based on two-part two-coordinate exogenous radar system
CN115166635B (en) * 2022-06-24 2023-03-28 江南大学 Robot positioning method based on risk sensitive FIR filtering
CN115792796B (en) * 2023-02-13 2023-06-06 鹏城实验室 Co-location method, device and terminal based on relative observation equivalent model
CN116383966B (en) * 2023-03-30 2023-11-21 中国矿业大学 Multi-unmanned system distributed cooperative positioning method based on interaction multi-model
CN117906601B (en) * 2023-12-04 2024-08-30 国家基础地理信息中心 Multi-source sensor fusion-oriented adaptive interactive navigation positioning filtering method

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5325098A (en) * 1993-06-01 1994-06-28 The United States Of America As Represented By The Secretary Of The Navy Interacting multiple bias model filter system for tracking maneuvering targets
JP2009244234A (en) * 2008-03-31 2009-10-22 New Industry Research Organization Ultrasonic array sensor and signal processing method
CN101610567B (en) * 2009-07-10 2012-05-30 华南理工大学 Dynamic group scheduling method based on wireless sensor network
CN101661104B (en) * 2009-09-24 2012-04-25 北京航空航天大学 Target tracking method based on radar/infrared measurement data coordinate conversion
DE102010027972A1 (en) * 2010-04-20 2011-10-20 Robert Bosch Gmbh Arrangement for determining the distance and the direction to an object
CN101894278B (en) * 2010-07-16 2012-06-27 西安电子科技大学 Human motion tracing method based on variable structure multi-model
AU2012203094B2 (en) * 2010-11-19 2016-08-04 Commonwealth Scientific And Industrial Research Organisation Tracking location of mobile devices in a wireless network
CN103853908B (en) * 2012-12-04 2017-11-14 中国科学院沈阳自动化研究所 A kind of maneuvering target tracking method of adaptive interaction formula multi-model
JP2014228278A (en) * 2013-05-17 2014-12-08 日本精工株式会社 Ultrasonic proximity sensor device and object detection method
CN104252178B (en) * 2014-09-12 2017-11-03 西安电子科技大学 It is a kind of based on strong motor-driven method for tracking target
CN104316058B (en) * 2014-11-04 2017-01-18 东南大学 Interacting multiple model adopted WSN-INS combined navigation method for mobile robot
US10572640B2 (en) * 2015-11-16 2020-02-25 Personnus System for identity verification
CN106093951B (en) * 2016-06-06 2018-11-13 清华大学 Object tracking methods based on array of ultrasonic sensors
WO2018010099A1 (en) * 2016-07-12 2018-01-18 深圳大学 Target tracking method for turn maneuver, and system for same
WO2018119912A1 (en) * 2016-12-29 2018-07-05 深圳大学 Target tracking method and device based on parallel fuzzy gaussian and particle filter
CN109029243B (en) * 2018-07-04 2021-02-26 南京理工大学 Improved agricultural machinery working area measuring terminal and method
CN109781118A (en) * 2019-03-08 2019-05-21 兰州交通大学 A kind of location tracking method of unmanned vehicle
CN110095728A (en) * 2019-05-23 2019-08-06 合肥工业大学智能制造技术研究院 Battery SOC, SOH combined estimation method based on interactive multi-model
CN110849369B (en) * 2019-10-29 2022-03-29 苏宁云计算有限公司 Robot tracking method, device, equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN110849369A (en) 2020-02-28
WO2021082571A1 (en) 2021-05-06
CN110849369B (en) 2022-03-29

Similar Documents

Publication Publication Date Title
CA3158929A1 (en) Robot tracking method, device, equipment, and computer-readable storage medium
Couturier et al. A review on absolute visual localization for UAV
Kovvali et al. An introduction to kalman filtering with matlab examples
US9798929B2 (en) Real-time pose estimation system using inertial and feature measurements
CN108037520B (en) Neural network-based direct positioning deviation correction method under array amplitude-phase error condition
Eustice et al. Exactly sparse delayed-state filters
Niedfeldt et al. Multiple target tracking using recursive RANSAC
US20150341723A1 (en) Multitask learning method for broadband source-location mapping of acoustic sources
Mahboub Variance component estimation in errors-in-variables models and a rigorous total least-squares approach
Olofsson et al. Multi-agent informed path planning using the probability hypothesis density
Batstone et al. Robust time-of-arrival self calibration with missing data and outliers
WO2022045982A1 (en) Unmanned aerial vehicle and localization method for unmanned aerial vehicle
Ghazvinian Zanjani et al. Modality-agnostic topology aware localization
US20140028489A1 (en) Target tracking apparatus, storage medium stored a target tracking program, target tracking system, and target tracking method
Witzgall et al. Single platform passive Doppler geolocation with unknown emitter frequency
CN112567203B (en) Method and apparatus for assisting in the navigation of a fleet of vehicles using a invariant Kalman filter
Masmitja et al. Underwater mobile target tracking with particle filter using an autonomous vehicle
Zhang et al. Learning cross-scale visual representations for real-time image geo-localization
CN113093093B (en) Vehicle positioning method based on linear array direction of arrival estimation
Smith et al. Stochastic modeling and control for tracking the periodic movement of marine animals via AUVs
Tsyganov et al. Adaptive eetimation of a moving object trajectory using sequential hypothesis testing
Wang et al. Particle Filter with Hybrid Proposal Distribution for Nonlinear State Estimation.
Gonzales–Martínez et al. Faster R-CNN with a cross-validation approach to object detection in radar images
Abouzahir et al. FastSLAM 2.0 running on a low-cost embedded architecture
Smith On Doppler measurements for tracking

Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20220916
