WO2021082571A1 - Robot tracking method, device and equipment and computer readable storage medium - Google Patents

Robot tracking method, device and equipment and computer readable storage medium

Info

Publication number
WO2021082571A1
WO2021082571A1 (PCT/CN2020/105997)
Authority
WO
WIPO (PCT)
Prior art keywords
time
model
robot
state
motion
Prior art date
Application number
PCT/CN2020/105997
Other languages
French (fr)
Chinese (zh)
Inventor
郑鑫江
李铭浩
樊锅旭
赵井全
Original Assignee
苏宁云计算有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 苏宁云计算有限公司 filed Critical 苏宁云计算有限公司
Priority to CA3158929A priority Critical patent/CA3158929A1/en
Publication of WO2021082571A1 publication Critical patent/WO2021082571A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 Target-seeking control

Definitions

  • the invention relates to the field of intelligent robot manipulation, and in particular to a robot tracking method, device, equipment and computer readable storage medium.
  • The intelligent robot is made to transmit ultrasonic waves.
  • The acoustic signals emitted by the robot are received by multiple ultrasonic arrays, the observed position of the robot is obtained through processing, and finally the position estimate of the robot is obtained by filtering with the tracking algorithm.
  • This method meets the real-time requirement while achieving high tracking accuracy.
  • Common tracking algorithms mainly include extended Kalman filter, unscented Kalman filter, particle filter, etc. These algorithms have a good tracking effect when the target motion model is known and the motion state is basically unchanged. However, in the actual target tracking process, the motion model is often unknown, and the motion state of the robot often changes. The tracking effect of the above algorithm will decrease or even diverge. Compared with the single ultrasonic receiving array, the multi-array tracking system can obtain more information about the target's motion state, and use the corresponding fusion algorithm to improve the tracking accuracy.
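For contrast with the interacting-multiple-model approach described below, a single predict/update cycle of the Kalman/EKF family mentioned above can be sketched as follows. This is a minimal linear-model sketch; the function and variable names are illustrative and not taken from the patent:

```python
import numpy as np

def ekf_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a (linear-model) Kalman/EKF step.

    x, P : prior state estimate and covariance
    z    : measurement vector
    F, H : state-transition and observation matrices (the Jacobians
           in the general EKF case)
    Q, R : process- and measurement-noise covariances
    """
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R             # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

A single filter of this kind performs well only while the motion model F stays valid, which is exactly the limitation the multi-model scheme below addresses.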
  • the embodiments of the present invention provide a robot tracking method and device, which improve the tracking accuracy when tracking a robot indoors.
  • the tracking error is small and the computational cost is relatively low, so the intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing the occurrence of mistracking or tracking loss.
  • the technical solution is as follows:
  • a robot tracking method includes:
  • the preset extended-dimension IMM-EKF algorithm is used to estimate the motion state of the robot at each time;
  • through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, the state estimate of the robot under each motion model at time k is obtained, yielding m state estimates; a weighted calculation over the m estimates gives the robot state estimation result at time k, where each time is denoted by k, and k and m are integers greater than 0.
  • the observation data of at least two ultrasonic arrays on the robot is acquired, including:
  • the preset extended-dimension IMM-EKF algorithm is used to estimate the motion state of the robot at each time; through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, the corresponding state estimate of the robot under each motion model at time k is obtained, yielding m states, and a weighted calculation is performed on the m states to obtain the robot state estimation result at time k, including:
  • where i, j = 1, 2, ..., m index the motion models and n = 1, 2, ..., n indexes the ultrasonic arrays; m and n are integers greater than or equal to 1, and k ∈ N denotes time;
  • C_ij = P(M_k = M_j | M_{k-1} = M_i) is the probability that the target switches from model i at time k-1 to model j at time k; F_k^i is the state transition matrix of the i-th model at time k; X_k^i is the target state under the i-th motion model at time k; H_k^n is the observation matrix of the n-th array at time k; Z_k^n is the target state observation received by the n-th array at time k;
  • W_k^i is the process noise of model i and V_k^n is the observation noise of the n-th array; both are assumed to be zero-mean Gaussian white noise with covariances Q_k^i and R_k^n respectively;
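As an illustration of the kind of model set such a tracking system might use, the sketch below defines a constant-velocity transition matrix, a position-only observation matrix for one array, and a model transition probability matrix C_ij. All numeric values and names are assumptions for illustration, not taken from the patent:

```python
import numpy as np

T = 0.1  # sampling interval in seconds (illustrative)

# Constant-velocity motion model: state x = [px, vx, py, vy]
F_cv = np.array([[1, T, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 1, T],
                 [0, 0, 0, 1]], dtype=float)

# Position-only observation matrix H_k^n of a single array
# (Cartesian form; a real ultrasonic array reports angle/distance)
H = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0]], dtype=float)

# Model transition probabilities C_ij = P(M_k = M_j | M_{k-1} = M_i)
# for a two-model set; each row sums to 1
C = np.array([[0.95, 0.05],
              [0.05, 0.95]])
```

A second model (e.g. constant-turn) would supply its own F matrix; the IMM machinery then switches between them via C.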
  • Model input interaction step: let X̂_{k-1|k-1}^i be the state estimate of extended-dimension EKF filter i at time k-1, P_{k-1|k-1}^i the corresponding covariance estimate, and μ_{k-1}^i the probability of model i at time k-1;
  • after the interaction calculation, the input of extended-dimension EKF filter j at time k is computed as follows:
  • Sub-model filtering step: each extended-dimension EKF filter, given its computed input, uses the acquired measurements to update the corresponding state estimate under its model;
  • Estimation fusion output step: according to the updated probability, state estimate, and covariance estimate of each model, compute the state estimate and covariance estimate of the target at the current time.
  • the calculation formula is as follows, where X̂_{k|k} denotes the estimate of the target state at time k and P_{k|k} the estimate of the covariance matrix of the target state at time k:
  • X̂_{k|k} = Σ_j μ_k^j X̂_{k|k}^j;  P_{k|k} = Σ_j μ_k^j [P_{k|k}^j + (X̂_{k|k}^j − X̂_{k|k})(X̂_{k|k}^j − X̂_{k|k})^T]
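The probability-weighted fusion of this output step can be sketched as follows. `combine_estimates` is a hypothetical helper name, not from the patent:

```python
import numpy as np

def combine_estimates(states, covs, probs):
    """Probability-weighted combination of the m per-model estimates,
    the final output step of an IMM cycle.

    states : list of m state vectors (per-model filtered estimates)
    covs   : list of m covariance matrices
    probs  : model probabilities, summing to 1
    """
    # Weighted mean of the per-model states
    x = sum(mu * xj for mu, xj in zip(probs, states))
    # Weighted covariance, with a spread-of-means correction term
    P = sum(mu * (Pj + np.outer(xj - x, xj - x))
            for mu, xj, Pj in zip(probs, states, covs))
    return x, P
```

The outer-product term inflates the covariance when the models disagree, so the fused uncertainty honestly reflects model ambiguity.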
  • sub-model filtering step includes:
  • Data fusion sub-step: data fusion is carried out using the dimension-expansion algorithm, and the formula for each corresponding variable is as follows:
  • the likelihood function corresponding to model i is calculated. Under the assumption that it obeys the Gaussian distribution, the likelihood function is as follows:
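Under the zero-mean Gaussian assumption stated above, the per-model likelihood and the resulting model probability update can be sketched as below. The helper names are hypothetical, and the probability update uses the standard IMM form μ_i(k) ∝ Λ_i(k) · c̄_i:

```python
import numpy as np

def gaussian_likelihood(innov, S):
    """Likelihood of a model given its innovation vector and
    innovation covariance, under the zero-mean Gaussian assumption."""
    d = len(innov)
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(S))
    return float(np.exp(-0.5 * innov @ np.linalg.solve(S, innov)) / norm)

def update_model_probs(likelihoods, cbar):
    """Model probability update: mu_i(k) proportional to
    likelihood_i(k) * cbar_i, normalized to sum to one."""
    w = np.asarray(likelihoods, dtype=float) * np.asarray(cbar, dtype=float)
    return w / w.sum()
```

A model whose predicted measurement matches the observations well receives a high likelihood and so gains probability mass at the next fusion step.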
  • a robot tracking device which includes:
  • the data acquisition module is used to: acquire observation data of the robot by at least two ultrasonic arrays at each moment of tracking;
  • the calculation module is used to estimate the motion state of the robot at each time using the preset extended-dimension IMM-EKF algorithm: through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, the state estimate of the robot under each motion model at time k is obtained, yielding m state estimates, and a weighted calculation over the m estimates gives the state estimation result at time k, where each time is denoted by k, and k and m are integers greater than 0.
  • the data acquisition module is used for:
  • the calculation module includes a robot tracking system establishment module for:
  • the robot tracking system is established, and the robot tracking system includes the robot's motion equation and observation equation as follows:
  • where i, j = 1, 2, ..., m index the motion models and n = 1, 2, ..., n indexes the ultrasonic arrays; m and n are integers greater than or equal to 1, and k ∈ N denotes time;
  • C_ij = P(M_k = M_j | M_{k-1} = M_i) is the probability that the target switches from model i at time k-1 to model j at time k; F_k^i is the state transition matrix of the i-th model at time k; X_k^i is the target state under the i-th motion model at time k; H_k^n is the observation matrix of the n-th array at time k; Z_k^n is the target state observation received by the n-th array at time k;
  • W_k^i is the process noise of model i and V_k^n is the observation noise of the n-th array; both are assumed to be zero-mean Gaussian white noise with covariances Q_k^i and R_k^n respectively;
  • Model input interaction module, used to: let X̂_{k-1|k-1}^i be the state estimate of extended-dimension EKF filter i at time k-1, P_{k-1|k-1}^i the corresponding covariance estimate, and μ_{k-1}^i the probability of model i at time k-1; after the interaction calculation, the input of extended-dimension EKF filter j at time k is computed as follows:
  • the sub-model filtering module is used to: compute the corresponding input in each extended-dimension EKF filter and use the acquired measurements to update the corresponding state estimate under each model;
  • the estimation fusion output module is used to: compute the state estimate and covariance estimate of the target at the current time according to the updated probability, state estimate, and covariance estimate of each model.
  • the calculation formula is as follows, where X̂_{k|k} denotes the estimate of the target state at time k and P_{k|k} the estimate of the covariance matrix of the target state at time k:
  • X̂_{k|k} = Σ_j μ_k^j X̂_{k|k}^j;  P_{k|k} = Σ_j μ_k^j [P_{k|k}^j + (X̂_{k|k}^j − X̂_{k|k})(X̂_{k|k}^j − X̂_{k|k})^T]
  • sub-model filtering module includes:
  • the data fusion sub-module is used for data fusion using the dimension-expansion algorithm, and the formulas for each corresponding variable are as follows:
  • the likelihood function corresponding to model i is calculated. Under the assumption that it obeys the Gaussian distribution, the likelihood function is as follows:
  • the calculation formula is as follows:
  • a robot tracking device includes:
  • a memory for storing executable instructions of the processor
  • the processor is configured to execute the steps of the robot tracking method according to any one of the above solutions via the executable instructions.
  • a computer-readable storage medium stores a computer program that, when executed by a processor, implements the steps of the robot tracking method according to any one of the above solutions.
  • the observation data is acquired at each time of the robot tracking, and the preset dimension-expansion IMM-EKF algorithm performs dimension expansion on the basis of the IMM-EKF algorithm, augmenting the measurements at each step of the iterative process to obtain more target motion state information; the method is suitable for multiple ultrasonic arrays.
  • Fig. 1 is a flowchart of a robot tracking method provided by an embodiment of the present invention
  • FIG. 2 is a flowchart of sub-steps of step 102 in Figure 1;
  • FIG. 3 is a flowchart of sub-steps of step 1023 in Figure 2;
  • FIG. 5 is a schematic diagram of a calculation process of a state calculation result in a robot tracking method provided by an embodiment of the present invention
  • Figure 6 is a schematic structural diagram of a robot tracking device provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of the composition of a robot tracking device provided by an embodiment of the present invention.
  • FIG. 8 is a comparison diagram of the tracking effect of the robot tracking solution provided by the embodiment of the present invention and the existing solution;
  • FIG. 9 is a comparison diagram of the tracking error effect of the robot tracking solution provided by the embodiment of the present invention and the existing solution.
  • the robot tracking method, device, equipment, and computer-readable storage medium provided by the embodiments of the present invention obtain observation data at each time of robot tracking by arranging multiple ultrasonic arrays, and use the preset extended-dimension IMM-EKF algorithm, which augments the measurements at each step of the iterative process on the basis of the IMM-EKF algorithm to obtain more target motion state information. The scheme is suitable for multiple ultrasonic arrays, makes full use of the original observation data, achieves an optimal fusion effect, and improves tracking accuracy when tracking the robot indoors; the tracking error is small and the computational cost is relatively low, so the intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing the occurrence of mistracking or tracking loss. The robot tracking method is therefore suitable for application fields involving intelligent robot manipulation, and especially for application scenarios with multiple ultrasonic arrays.
  • Fig. 1 is a flowchart of a robot tracking method provided by an embodiment of the present invention.
  • Fig. 2 is a flowchart of sub-steps of step 102 in Fig. 1.
  • Fig. 3 is a flowchart of sub-steps of step 1023 in Fig. 2.
  • the robot tracking method provided by the embodiment of the present invention includes the following steps:
  • At each time of tracking, obtain the observation data of the robot from at least two ultrasonic arrays, including:
  • step 101 may also be implemented in other manners, and the embodiment of the present invention does not limit the specific manner.
  • the above step 102 further includes the following sub-steps:
  • Robot tracking system establishment step: the robot tracking system is established; it includes the motion equation and the observation equation of the robot, expressed as follows:
  • where i, j = 1, 2, ..., m index the motion models and n = 1, 2, ..., n indexes the ultrasonic arrays; m and n are integers greater than or equal to 1, and k ∈ N denotes time;
  • C_ij = P(M_k = M_j | M_{k-1} = M_i) is the probability that the target switches from model i at time k-1 to model j at time k; F_k^i is the state transition matrix of the i-th model at time k; X_k^i is the target state under the i-th motion model at time k; H_k^n is the observation matrix of the n-th array at time k; Z_k^n is the target state observation received by the n-th array at time k;
  • W_k^i is the process noise of model i and V_k^n is the observation noise of the n-th array; both are assumed to be zero-mean Gaussian white noise with covariances Q_k^i and R_k^n respectively;
  • 1023 - Sub-model filtering step: each extended-dimension EKF filter computes its corresponding input and uses the acquired measurements to update the state estimate under its model;
  • 1025 - Estimation fusion output step: according to the updated probability, state estimate, and covariance estimate of each model, compute the state estimate and covariance estimate of the target at the current time.
  • the calculation formula is as follows, where X̂_{k|k} denotes the estimate of the target state at time k and P_{k|k} the estimate of the covariance matrix of the target state at time k:
  • X̂_{k|k} = Σ_j μ_k^j X̂_{k|k}^j;  P_{k|k} = Σ_j μ_k^j [P_{k|k}^j + (X̂_{k|k}^j − X̂_{k|k})(X̂_{k|k}^j − X̂_{k|k})^T]
  • the aforementioned sub-model filtering step further includes the following sub-steps:
  • the likelihood function corresponding to model i is calculated. Under the assumption that it obeys the Gaussian distribution, the likelihood function is as follows:
  • FIG. 4 is a flowchart of a robot tracking method provided by an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of the calculation process of the state calculation result in the robot tracking method provided by an embodiment of the present invention; FIG. 4 and FIG. 5 together illustrate an implementation in which two ultrasonic arrays are selected.
  • step 102 may also be implemented in other ways, and the embodiment of the present invention does not limit the specific manner.
  • FIG. 6 is a schematic structural diagram of a robot tracking device provided by an embodiment of the present invention. As shown in FIG. 6, the robot tracking device provided by an embodiment of the present invention includes a data acquisition module 1 and a calculation module 2.
  • the data acquisition module 1 is used to acquire the observation data of the robot from at least two ultrasonic arrays at each time of tracking. Specifically, the data acquisition module 1 is used to: at time k, acquire the observation data Z_k^n of the robot from at least two ultrasonic arrays, where k and n are integers greater than 0 and each Z_k^n is a vector of the robot angle and distance data measured by the arrays.
  • the calculation module 2 is used to: estimate the motion state of the robot at each time using the preset extended-dimension IMM-EKF algorithm; through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, obtain the state estimate of the robot under each motion model at time k, yielding m state estimates, and perform a weighted calculation on the m estimates to obtain the state estimation result at time k, where each time is denoted by k, and k and m are integers greater than 0.
  • the calculation module 2 includes a robot tracking system establishment module 21, a model input interaction module 22, a sub-model filtering module 23, a model probability update module 24, and an estimated fusion output module 25.
  • the robot tracking system establishment module 21 is used for:
  • the robot tracking system is established, and the robot tracking system includes the robot's motion equation and observation equation as follows:
  • where i, j = 1, 2, ..., m index the motion models and n = 1, 2, ..., n indexes the ultrasonic arrays; m and n are integers greater than or equal to 1, and k ∈ N denotes time;
  • C_ij = P(M_k = M_j | M_{k-1} = M_i) is the probability that the target switches from model i at time k-1 to model j at time k; F_k^i is the state transition matrix of the i-th model at time k; X_k^i is the target state under the i-th motion model at time k; H_k^n is the observation matrix of the n-th array at time k; Z_k^n is the target state observation received by the n-th array at time k;
  • W_k^i is the process noise of model i and V_k^n is the observation noise of the n-th array; both are assumed to be zero-mean Gaussian white noise with covariances Q_k^i and R_k^n respectively.
  • the model input interaction module 22 is used to:
  • the sub-model filtering module 23 is used to:
  • the estimated fusion output module 25 is used for:
  • where X̂_{k|k} denotes the estimate of the target state at time k and P_{k|k} denotes the estimate of the covariance matrix of the target state at time k.
  • sub-model filtering module 23 includes a state prediction sub-module 231, a data fusion sub-module 232, and a filtering update sub-module 233.
  • the state prediction sub-module 231 is used to:
  • the data fusion sub-module 232 is used for:
  • the likelihood function corresponding to model i is calculated. Under the assumption that it obeys the Gaussian distribution, the likelihood function is as follows:
  • the filtering update sub-module 233 is used to:
  • FIG. 7 is a schematic diagram of the composition of a robot tracking device provided by an embodiment of the present invention.
  • the robot tracking device provided by an embodiment of the present invention includes a processor 3 and a memory 4; the memory 4 is used to store instructions executable by the processor 3, and the processor 3 is configured to perform the steps of the robot tracking method of any of the above solutions via the executable instructions.
  • the embodiment of the present invention also provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the steps of the robot tracking method described in any of the above solutions are implemented.
  • when the robot tracking device provided in the above embodiments triggers the robot tracking service, the division into the above functional modules is only an example; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
  • the robot tracking device, robot tracking equipment, and computer-readable storage medium provided in the above embodiments belong to the same concept as the robot tracking method that triggers the robot tracking service; for the specific implementation process, refer to the method embodiment, which is not repeated here.
  • the robot tracking method provided by the embodiments of the present invention, the IMM-EKF method, and the weighted IMM-EKF method are each used to process the robot measurement data.
  • the state of the robot is estimated, and the result is shown in Figure 8.
  • FIG. 8 is a comparison diagram of the tracking effect of the robot tracking solution provided by the embodiment of the present invention and the existing solution.
  • FIG. 9 is a comparison diagram of the tracking error effect of the robot tracking solution provided by the embodiment of the present invention and the existing solution.
  • Table 1 below shows the average target tracking error of the three methods:
  • the tracking accuracy of the robot tracking method provided by the embodiment of the present invention is significantly better than the IMM-EKF algorithm, and compared with the weighted IMM-EKF algorithm, the tracking error is reduced by nearly 50%.
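The average tracking error used in such comparisons is typically the mean Euclidean distance between the true and estimated positions over the track. A sketch, assuming (T, 2) position arrays; the function name is illustrative:

```python
import numpy as np

def mean_tracking_error(true_pos, est_pos):
    """Average Euclidean position error over a track.

    true_pos, est_pos : arrays of shape (T, 2) holding the true and
    estimated planar positions at each of T time steps.
    """
    return float(np.mean(np.linalg.norm(true_pos - est_pos, axis=1)))
```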
  • the robot tracking method, device, device, and computer-readable storage medium provided by the embodiments of the present invention have the following beneficial effects compared with the prior art:
  • the observation data is acquired at each time of the robot tracking, and the preset dimension-expansion IMM-EKF algorithm performs dimension expansion on the basis of the IMM-EKF algorithm, augmenting the measurements at each step of the iterative process to obtain more target motion state information; the method is suitable for multiple ultrasonic arrays.
  • the program can be stored in a computer-readable storage medium.
  • the storage medium mentioned can be a read-only memory, a magnetic disk or an optical disk, etc.
  • These computer program instructions may be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing equipment to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing equipment produce a device that implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions can also be stored in a computer-readable memory that can guide a computer or other programmable data processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including the instruction device.
  • the instruction device implements the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing; the instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A robot tracking method, device and equipment and a computer readable storage medium, belonging to the field of intelligent robot control. The method comprises: at each moment of tracking, obtaining observation data obtained by observing a robot by at least two ultrasonic arrays (101); and estimating the motion state of the robot of each moment by using a preset augmented IMM-EKF algorithm, specifically including: respectively obtaining, by means of a number m of augmented EKF filters matched with m motion models corresponding to m motion states at k moment, the corresponding state estimation of the robot under each of the motion models at the k moment to obtain m state estimations, and performing weighting calculation on the m state estimations to obtain a state estimation result of the robot at the k moment, wherein each moment is represented by k moment and k and m are integers greater than zero (102). The intelligent robot can be stably and effectively tracked when the motion state of the robot is unknown and changeable, the phenomenon of mistracking or tracking loss is reduced, and the method is suitable for application scenarios with multiple ultrasonic arrays.

Description

Robot tracking method, device, equipment and computer readable storage medium

Technical field
The invention relates to the field of intelligent robot manipulation, and in particular to a robot tracking method, device, equipment and computer-readable storage medium.
Background technique
At present, intelligent robots are applied in fields such as ocean exploration, security, and medical care, bringing great convenience to technological development and people's lives, so it is necessary to track robots in real time. However, when an intelligent robot works underwater or indoors, satellite positioning cannot be used. Visual navigation acquires complete information over a wide detection range and occupies an important position in robot navigation, but visual image processing is slow and its real-time performance is poor. Researchers have therefore studied mobile robot positioning based on radio frequency identification (RFID) technology; however, the accuracy of such wireless positioning is at the meter level and cannot meet the requirements of high-precision indoor navigation and positioning. The intelligent robot is therefore made to transmit ultrasonic waves: the acoustic signals emitted by the robot are received by multiple ultrasonic arrays, the observed position of the robot is obtained through processing, and the position estimate of the robot is finally obtained by tracking-algorithm filtering. This method meets the real-time requirement while achieving high tracking accuracy.
Common tracking algorithms mainly include the extended Kalman filter, the unscented Kalman filter, and the particle filter. These algorithms track well when the target motion model is known and the motion state is essentially unchanged. In practical target tracking, however, the motion model is often unknown and the robot's motion state changes frequently, so the tracking performance of these algorithms degrades or even diverges. Compared with a single ultrasonic receiving array, a multi-array tracking system can obtain more information about the target's motion state and use a corresponding fusion algorithm to improve tracking accuracy.
Summary of the invention
In order to solve the problems of the prior art, the embodiments of the present invention provide a robot tracking method and device that improve tracking accuracy when tracking a robot indoors, with small tracking error and relatively low computational cost, so that an intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing mistracking and tracking loss. The technical solution is as follows:
In one aspect, a robot tracking method is provided, the method including:
at each time of tracking, acquiring the observation data of the robot from at least two ultrasonic arrays;
estimating the motion state of the robot at each time using the preset extended-dimension IMM-EKF algorithm: through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, the state estimate of the robot under each motion model at time k is obtained, yielding m state estimates; a weighted calculation over the m estimates gives the robot state estimation result at time k, where each time is denoted by k, and k and m are integers greater than 0.
Further, acquiring the observation data of the robot from at least two ultrasonic arrays at each time of tracking includes:
At time k, the observation data Z_k^n of the robot from at least two ultrasonic arrays is acquired, where k and n are integers greater than 0 and each Z_k^n is a vector of the robot angle and distance data measured by the n-th ultrasonic array.
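The dimension expansion over n arrays amounts to stacking the per-array measurements, observation matrices, and noise covariances into one augmented measurement model, so that a single EKF update fuses all arrays at once. A sketch under the assumption that the arrays observe independently; the function name and shapes are illustrative:

```python
import numpy as np

def stack_measurements(z_list, H_list, R_list):
    """Stack the observations of n ultrasonic arrays into one augmented
    ('dimension-expanded') measurement model.

    z_list : per-array measurement vectors
    H_list : per-array observation matrices
    R_list : per-array measurement-noise covariances
    """
    z = np.concatenate(z_list)   # augmented measurement vector
    H = np.vstack(H_list)        # augmented observation matrix
    dim = len(z)
    # Arrays observe independently, so the joint noise covariance is
    # block-diagonal with one block per array
    R = np.zeros((dim, dim))
    r0 = 0
    for Ri in R_list:
        d = Ri.shape[0]
        R[r0:r0 + d, r0:r0 + d] = Ri
        r0 += d
    return z, H, R
```

The stacked (z, H, R) can then be fed to an ordinary EKF update in place of a single array's measurement.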
Further, estimating the motion state of the robot at each time using the preset extended-dimension IMM-EKF algorithm, obtaining through the m extended-dimension EKF filters the corresponding state estimate of the robot under each motion model at time k, obtaining m states, and performing a weighted calculation on the m states to obtain the robot state estimation result at time k, includes:
Robot tracking system establishment step: the robot tracking system is established; it includes the motion equation and the observation equation of the robot, expressed as follows:
Motion equation: X_k^i = F_k^i X_{k-1}^i + W_k^i

Observation equation: Z_k^n = H_k^n X_k + V_k^n
C_ij = P(M_k = M_j | M_{k-1} = M_i);
where i, j = 1, 2, ..., m index the motion models and n = 1, 2, ..., n indexes the ultrasonic arrays; m and n are integers greater than or equal to 1, and k ∈ N denotes time; C_ij is the probability that the target switches from model i at time k-1 to model j at time k; F_k^i is the state transition matrix of the i-th model at time k; X_k^i is the target state under the i-th motion model at time k; H_k^n is the observation matrix of the n-th array at time k; Z_k^n is the target state observation received by the n-th array at time k; W_k^i is the process noise of model i and V_k^n is the observation noise of the n-th array; both noises are assumed to be zero-mean Gaussian white noise with covariances Q_k^i and R_k^n respectively;
Model input interaction step: let
Figure PCTCN2020105997-appb-000012
be the state estimate of dimension-extended EKF filter i at time k-1,
Figure PCTCN2020105997-appb-000013
the corresponding covariance matrix estimate, and
Figure PCTCN2020105997-appb-000014
the probability of model i at time k-1; after the interaction computation, the input of dimension-extended EKF filter j at time k is computed as follows:
Figure PCTCN2020105997-appb-000015
Figure PCTCN2020105997-appb-000016
where
Figure PCTCN2020105997-appb-000017
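The input interaction above is the standard IMM mixing stage. A minimal NumPy sketch (the transition matrix C and the per-model estimates below are illustrative assumptions, not from the original):

```python
import numpy as np

def imm_mixing(C, mu_prev, x_prev, P_prev):
    """IMM input interaction: mix the per-model estimates of time k-1 into
    the initial state/covariance of each filter j at time k.

    C       : (m, m) transition probabilities, C[i, j] = P(M_k=j | M_{k-1}=i)
    mu_prev : (m,)   model probabilities at time k-1
    x_prev  : (m, d) per-model state estimates at time k-1
    P_prev  : (m, d, d) per-model covariance estimates at time k-1
    """
    m = len(mu_prev)
    c_bar = C.T @ mu_prev                        # predicted model probabilities
    w = (C * mu_prev[:, None]) / c_bar[None, :]  # mixing weights mu_{i|j}
    x_mix = np.array([w[:, j] @ x_prev for j in range(m)])
    P_mix = np.zeros_like(P_prev)
    for j in range(m):
        for i in range(m):
            dx = x_prev[i] - x_mix[j]
            P_mix[j] += w[i, j] * (P_prev[i] + np.outer(dx, dx))
    return x_mix, P_mix, c_bar

C = np.array([[0.9, 0.1], [0.1, 0.9]])
mu = np.array([0.6, 0.4])
x = np.array([[0.0, 0.0], [1.0, 1.0]])
P = np.stack([np.eye(2), np.eye(2)])
x_mix, P_mix, c_bar = imm_mixing(C, mu, x, P)
```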
Sub-model filtering step: after each dimension-extended EKF filter has computed its corresponding input
Figure PCTCN2020105997-appb-000018
Figure PCTCN2020105997-appb-000019
use the obtained measurements
Figure PCTCN2020105997-appb-000020
to update the corresponding state estimate under each model;
Model probability update step: compute the model probability for each model i = 1, 2, ..., m as follows:
Figure PCTCN2020105997-appb-000021
where
Figure PCTCN2020105997-appb-000022
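This update follows the standard IMM recursion: each model's predicted probability is re-weighted by its measurement likelihood and then renormalized. A sketch with illustrative values:

```python
import numpy as np

def update_model_probs(likelihoods, c_bar):
    """IMM model probability update: mu_k(i) is proportional to the
    measurement likelihood of filter i times the predicted probability
    of model i from the mixing stage, renormalized to sum to 1."""
    mu = np.asarray(likelihoods, dtype=float) * np.asarray(c_bar, dtype=float)
    return mu / mu.sum()

mu_k = update_model_probs([0.2, 0.8], [0.5, 0.5])
```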
Estimation fusion output step: based on the updated probability, the state estimate, and the estimated covariance matrix of each model, compute the state estimate and the covariance matrix estimate of the target at the current time, as follows:
Figure PCTCN2020105997-appb-000023
Figure PCTCN2020105997-appb-000024
x k|k denotes the target state estimate at time k, and P k|k denotes the estimate of the target state covariance matrix at time k.
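The fusion stage is a probability-weighted combination of the per-model estimates, with the spread of the model means added to the fused covariance. A sketch with illustrative values:

```python
import numpy as np

def imm_fuse(mu, x_models, P_models):
    """IMM output fusion: fused state x_{k|k} and covariance P_{k|k}."""
    mu = np.asarray(mu, dtype=float)
    x = mu @ x_models                       # weighted mean of model estimates
    P = np.zeros_like(P_models[0])
    for i in range(len(mu)):
        dx = x_models[i] - x
        P += mu[i] * (P_models[i] + np.outer(dx, dx))  # spread-of-means term
    return x, P

x_f, P_f = imm_fuse([0.5, 0.5],
                    np.array([[0.0, 0.0], [2.0, 2.0]]),
                    np.stack([np.eye(2), np.eye(2)]))
```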
Further, the sub-model filtering step includes:
State prediction sub-step: for each model i = 1, 2, ..., m, compute the corresponding predicted state and predicted covariance matrix as follows:
Figure PCTCN2020105997-appb-000025
Figure PCTCN2020105997-appb-000026
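For a linear motion model this prediction reduces to the familiar Kalman time update; for a nonlinear model, the Jacobian of the motion function plays the role of the transition matrix. A sketch with an assumed 1-D constant-velocity model (all values illustrative):

```python
import numpy as np

def ekf_predict(x, P, F, Q):
    """Time update for one motion model: predicted state and covariance.
    For a nonlinear model, F is the Jacobian evaluated at x."""
    return F @ x, F @ P @ F.T + Q

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state [position, velocity]
Q = 0.01 * np.eye(2)
x_pred, P_pred = ekf_predict(np.array([0.0, 1.0]), np.eye(2), F, Q)
```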
Data fusion sub-step: perform data fusion using the dimension-extension algorithm; the corresponding variables are given by:
Figure PCTCN2020105997-appb-000027
Figure PCTCN2020105997-appb-000028
Figure PCTCN2020105997-appb-000029
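In effect, the dimension-extension fusion stacks the observation matrices and measurements of all arrays and forms a block-diagonal joint noise covariance, the arrays' noises being mutually independent. A sketch (matrices and readings are illustrative assumptions):

```python
import numpy as np

def extend_measurement(H_list, R_list, z_list):
    """Stack per-array observation matrices, noise covariances, and
    measurements into one dimension-extended observation system."""
    H = np.vstack(H_list)
    dims = [R.shape[0] for R in R_list]
    R = np.zeros((sum(dims), sum(dims)))
    off = 0
    for Ri in R_list:                      # block-diagonal joint noise
        d = Ri.shape[0]
        R[off:off + d, off:off + d] = Ri
        off += d
    return H, R, np.concatenate(z_list)

H, R, z = extend_measurement([np.eye(2), np.eye(2)],
                             [0.1 * np.eye(2), 0.2 * np.eye(2)],
                             [np.array([0.5, 3.0]), np.array([-0.3, 2.8])])
```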
For each model i = 1, 2, ..., m, the measurement prediction residual and measurement covariance are computed as follows:
Figure PCTCN2020105997-appb-000030
Figure PCTCN2020105997-appb-000031
At the same time, the likelihood function corresponding to model i is computed; under the assumption that the residual follows a Gaussian distribution, the likelihood function is:
Figure PCTCN2020105997-appb-000032
Filter update sub-step: for each model i = 1, 2, ..., m, compute the filter gain, the state estimate update, and the error covariance matrix as follows:
Figure PCTCN2020105997-appb-000033
Figure PCTCN2020105997-appb-000034
Figure PCTCN2020105997-appb-000035
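With the stacked measurement, the per-model update is a standard (E)KF measurement update. A sketch (values illustrative; for a nonlinear observation model H would be the Jacobian of the observation function):

```python
import numpy as np

def ekf_update(x_pred, P_pred, z, H, R):
    """Measurement update: residual, residual covariance, filter gain, and
    the updated state estimate and error covariance."""
    v = z - H @ x_pred                      # measurement prediction residual
    S = H @ P_pred @ H.T + R                # measurement (residual) covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # filter gain
    x = x_pred + K @ v
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P, v, S

x, P, v, S = ekf_update(np.zeros(2), np.eye(2),
                        np.array([1.0]), np.array([[1.0, 0.0]]),
                        np.array([[1.0]]))
```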
In another aspect, a robot tracking device is provided, the device comprising:
a data acquisition module, configured to: at each time instant of tracking, acquire observation data of the robot from at least two ultrasonic arrays;
a calculation module, configured to: estimate the motion state of the robot at each time instant using the preset dimension-extended IMM-EKF algorithm; through m dimension-extended EKF filters matched to the m motion models corresponding to the m motion states at time k, obtain the state estimate of the robot under each motion model at time k, yielding m states; and perform a weighted calculation on the m states to obtain the state estimation result at time k, where each time instant is denoted by k, and k and m are both integers greater than 0.
Further, the data acquisition module is configured to:
at time k, acquire the observation data of the robot from at least two ultrasonic arrays
Figure PCTCN2020105997-appb-000036
where k and n are both integers greater than 0, and
Figure PCTCN2020105997-appb-000037
are the vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
Further, the calculation module includes a robot tracking system establishment module, configured to:
establish the robot tracking system, which comprises the motion equation and the observation equation of the robot, expressed as follows:
Motion equation:
Figure PCTCN2020105997-appb-000038
Observation equation:
Figure PCTCN2020105997-appb-000039
C ij = P(M k = M j | M k-1 = M i );
where i, j = 1, 2, ..., m index the motion models (m models in total), n = 1, 2, ... indexes the ultrasonic arrays (n arrays in total), m and n are both integers greater than or equal to 1, k ∈ N denotes the time instant, and C ij denotes the probability that the target switches from model i at time k-1 to model j at time k;
Figure PCTCN2020105997-appb-000040
denotes the state transition matrix of the i-th model at time k;
Figure PCTCN2020105997-appb-000041
denotes the target state under the i-th motion model at time k;
Figure PCTCN2020105997-appb-000042
denotes the observation matrix of the n-th array at time k;
Figure PCTCN2020105997-appb-000043
denotes the target state observation received by the n-th array at time k;
Figure PCTCN2020105997-appb-000044
denotes the process noise of model i, and
Figure PCTCN2020105997-appb-000045
denotes the observation noise of the n-th array; both noises are assumed to be zero-mean Gaussian white noise with covariances
Figure PCTCN2020105997-appb-000046
respectively;
a model input interaction module, configured to: let
Figure PCTCN2020105997-appb-000047
be the state estimate of dimension-extended EKF filter i at time k-1,
Figure PCTCN2020105997-appb-000048
the corresponding covariance matrix estimate, and
Figure PCTCN2020105997-appb-000049
the probability of model i at time k-1; after the interaction computation, the input of dimension-extended EKF filter j at time k is computed as follows:
Figure PCTCN2020105997-appb-000050
Figure PCTCN2020105997-appb-000051
where
Figure PCTCN2020105997-appb-000052
a sub-model filtering module, configured to: after each dimension-extended EKF filter has computed its corresponding input
Figure PCTCN2020105997-appb-000053
use the obtained measurements
Figure PCTCN2020105997-appb-000054
to update the corresponding state estimate under each model;
a model probability update module, configured to: compute the model probability for each model i = 1, 2, ..., m as follows:
Figure PCTCN2020105997-appb-000055
where
Figure PCTCN2020105997-appb-000056
an estimation fusion output module, configured to: based on the updated probability, the state estimate, and the covariance matrix estimate of each model, compute the state estimate and the covariance matrix estimate of the target at the current time, as follows:
Figure PCTCN2020105997-appb-000057
Figure PCTCN2020105997-appb-000058
x k|k denotes the target state estimate at time k, and P k|k denotes the estimate of the target state covariance matrix at time k.
Further, the sub-model filtering module includes:
a state prediction sub-module, configured to: for each model i = 1, 2, ..., m, compute the corresponding predicted state and predicted covariance matrix as follows:
Figure PCTCN2020105997-appb-000059
Figure PCTCN2020105997-appb-000060
a data fusion sub-module, configured to: perform data fusion using the dimension-extension algorithm; the corresponding variables are given by:
Figure PCTCN2020105997-appb-000061
Figure PCTCN2020105997-appb-000062
Figure PCTCN2020105997-appb-000063
For each model i = 1, 2, ..., m, the measurement prediction residual and measurement covariance are computed as follows:
Figure PCTCN2020105997-appb-000064
Figure PCTCN2020105997-appb-000065
At the same time, the likelihood function corresponding to model i is computed; under the assumption that the residual follows a Gaussian distribution, the likelihood function is:
Figure PCTCN2020105997-appb-000066
a filter update sub-module, configured to: for each model i = 1, 2, ..., m, compute the filter gain, the state estimate update, and the error covariance matrix as follows:
Figure PCTCN2020105997-appb-000067
Figure PCTCN2020105997-appb-000068
Figure PCTCN2020105997-appb-000069
In yet another aspect, a robot tracking apparatus is provided, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to execute, via the executable instructions, the steps of the robot tracking method according to any one of the above solutions.
In yet another aspect, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the steps of the robot tracking method according to any one of the above solutions.
The technical solutions provided by the embodiments of the present invention bring the following beneficial effects:
1. By arranging multiple ultrasonic arrays, observation data are acquired at every time instant of robot tracking; on the basis of the IMM-EKF algorithm, the preset dimension-extended IMM-EKF algorithm extends the measurement dimension at every iteration step, obtaining more information about the target motion state, which makes it suitable for multiple ultrasonic arrays;
2. The original observation data are fully utilized and the fusion effect is optimal, improving the tracking accuracy when tracking a robot indoors; the tracking error is small and the computational cost relatively low, so that the intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing mistracking and loss of the target;
3. Using the dimension-extended IMM-EKF algorithm to track the intelligent robot effectively weakens the influence of reverberation and noise on tracking accuracy; the tracking error is significantly smaller than that of the traditional IMM-EKF algorithm, and the method is also robust in tracking scenarios with missing observation data.
Description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative work.
Fig. 1 is a flowchart of a robot tracking method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of the sub-steps of step 102 in Fig. 1;
Fig. 3 is a flowchart of the sub-steps of step 1023 in Fig. 2;
Fig. 4 is a block flow diagram of a robot tracking method provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the calculation process of the state calculation result in the robot tracking method provided by an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a robot tracking device provided by an embodiment of the present invention;
Fig. 7 is a schematic diagram of the composition of a robot tracking apparatus provided by an embodiment of the present invention;
Fig. 8 is a comparison of tracking trajectories between the robot tracking solution provided by an embodiment of the present invention and an existing solution;
Fig. 9 is a comparison of tracking errors between the robot tracking solution provided by an embodiment of the present invention and an existing solution.
Detailed description
In order to make the objectives, technical solutions, and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the protection scope of the present invention. In the description of the present invention, "plurality" means two or more, unless otherwise specifically defined.
The robot tracking method, device, apparatus, and computer-readable storage medium provided by the embodiments of the present invention acquire observation data at every time instant of robot tracking by arranging multiple ultrasonic arrays; on the basis of the IMM-EKF algorithm, the preset dimension-extended IMM-EKF algorithm extends the measurement dimension at every iteration step to obtain more information about the target motion state, making it suitable for multiple ultrasonic arrays. The original observation data are fully utilized, the fusion effect is optimal, and the tracking accuracy when tracking a robot indoors is improved, with a small tracking error and relatively low computational cost, so that the intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing mistracking and loss of the target. Therefore, the robot tracking method is suitable for application fields involving intelligent robot manipulation, and is especially suitable for application scenarios with multiple ultrasonic arrays.
The robot tracking method, device, apparatus, and computer-readable storage medium provided by the embodiments of the present invention are described in detail below with reference to specific embodiments and the accompanying drawings.
Fig. 1 is a flowchart of a robot tracking method provided by an embodiment of the present invention. Fig. 2 is a flowchart of the sub-steps of step 102 in Fig. 1. Fig. 3 is a flowchart of the sub-steps of step 1023 in Fig. 2.
As shown in Fig. 1, the robot tracking method provided by the embodiment of the present invention includes the following steps:
101. At each time instant of tracking, acquire observation data of the robot from at least two ultrasonic arrays.
Acquiring, at each time instant of tracking, observation data of the robot from at least two ultrasonic arrays includes:
at time k, acquiring the observation data of the robot from at least two ultrasonic arrays
Figure PCTCN2020105997-appb-000070
where k and n are both integers greater than 0, and
Figure PCTCN2020105997-appb-000071
are the vectors of measured robot angle and distance data.
It is worth noting that the process of step 101 may also be implemented in other manners besides the manner described above; the embodiment of the present invention does not limit the specific manner.
102. Use the preset dimension-extended IMM-EKF algorithm to estimate the motion state of the robot at each time instant: through m dimension-extended EKF filters matched to the m motion models corresponding to the m motion states at time k, obtain the state estimate of the robot under each motion model at time k, yielding m states; perform a weighted calculation on the m states to obtain the robot state estimation result at time k, where each time instant is denoted by k, and k and m are both integers greater than 0.
As shown in Fig. 2, the above step 102 further includes the following sub-steps:
1021 - Stochastic hybrid system calculation step (robot tracking system establishment step): establish the robot tracking system, which comprises the motion equation and the observation equation of the robot, expressed as follows:
Motion equation:
Figure PCTCN2020105997-appb-000072
Observation equation:
Figure PCTCN2020105997-appb-000073
C ij = P(M k = M j | M k-1 = M i );
where i, j = 1, 2, ..., m index the motion models (m models in total), n = 1, 2, ... indexes the ultrasonic arrays (n arrays in total), m and n are both integers greater than or equal to 1, k ∈ N denotes the time instant, and C ij denotes the probability that the target switches from model i at time k-1 to model j at time k;
Figure PCTCN2020105997-appb-000074
denotes the state transition matrix of the i-th model at time k;
Figure PCTCN2020105997-appb-000075
denotes the target state under the i-th motion model at time k;
Figure PCTCN2020105997-appb-000076
denotes the observation matrix of the n-th array at time k;
Figure PCTCN2020105997-appb-000077
denotes the target state observation received by the n-th array at time k;
Figure PCTCN2020105997-appb-000078
denotes the process noise of model i, and
Figure PCTCN2020105997-appb-000079
denotes the observation noise of the n-th array; both noises are assumed to be zero-mean Gaussian white noise with covariances
Figure PCTCN2020105997-appb-000080
respectively;
1022 - Model input interaction step: let
Figure PCTCN2020105997-appb-000081
be the state estimate of dimension-extended EKF filter i at time k-1,
Figure PCTCN2020105997-appb-000082
the corresponding covariance matrix estimate, and
Figure PCTCN2020105997-appb-000083
the probability of model i at time k-1; after the interaction computation, the input of dimension-extended EKF filter j at time k is computed as follows:
Figure PCTCN2020105997-appb-000084
Figure PCTCN2020105997-appb-000085
where
Figure PCTCN2020105997-appb-000086
1023 - Sub-model filtering step: after each dimension-extended EKF filter has computed its corresponding input
Figure PCTCN2020105997-appb-000087
use the obtained measurements
Figure PCTCN2020105997-appb-000088
to update the corresponding state estimate under each model;
1024 - Model probability update step: compute the model probability for each model i = 1, 2, ..., m as follows:
Figure PCTCN2020105997-appb-000089
where
Figure PCTCN2020105997-appb-000090
1025 - Estimation fusion output step: based on the updated probability, the state estimate, and the estimated covariance matrix of each model, compute the state estimate and the covariance matrix estimate of the target at the current time, as follows:
Figure PCTCN2020105997-appb-000091
Figure PCTCN2020105997-appb-000092
x k|k denotes the target state estimate at time k, and P k|k denotes the estimate of the target state covariance matrix at time k.
As shown in Fig. 3, the above sub-model filtering step further includes the following sub-steps:
1023a - State prediction sub-step: for each model i = 1, 2, ..., m, compute the corresponding predicted state and predicted covariance matrix as follows:
Figure PCTCN2020105997-appb-000093
Figure PCTCN2020105997-appb-000094
1023b - Data fusion sub-step: perform data fusion using the dimension-extension algorithm; the corresponding variables are given by:
Figure PCTCN2020105997-appb-000095
Figure PCTCN2020105997-appb-000096
Figure PCTCN2020105997-appb-000097
For each model i = 1, 2, ..., m, the measurement prediction residual and measurement covariance are computed as follows:
Figure PCTCN2020105997-appb-000098
Figure PCTCN2020105997-appb-000099
At the same time, the likelihood function corresponding to model i is computed; under the assumption that the residual follows a Gaussian distribution, the likelihood function is:
Figure PCTCN2020105997-appb-000100
1023c - Filter update sub-step: for each model i = 1, 2, ..., m, compute the filter gain, the state estimate update, and the error covariance matrix as follows:
Figure PCTCN2020105997-appb-000101
Figure PCTCN2020105997-appb-000102
Figure PCTCN2020105997-appb-000103
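Putting sub-steps 1021 to 1025 together, one iteration of the dimension-extended IMM-EKF recursion can be sketched as follows. Linear models, two motion models, and two arrays observing the full state are assumed for brevity; every matrix below is an illustrative assumption, not taken from the original:

```python
import numpy as np

def imm_ekf_step(C, mu, xs, Ps, Fs, Qs, H, R, z):
    """One k-step: mixing, per-model predict/update with the stacked
    measurement z, likelihood evaluation, probability update, fusion."""
    m, d = xs.shape
    c_bar = C.T @ mu
    w = (C * mu[:, None]) / c_bar[None, :]           # mixing weights
    lik = np.zeros(m)
    xs_new, Ps_new = np.zeros_like(xs), np.zeros_like(Ps)
    for j in range(m):
        # 1022: input interaction
        x0 = w[:, j] @ xs
        P0 = sum(w[i, j] * (Ps[i] + np.outer(xs[i] - x0, xs[i] - x0))
                 for i in range(m))
        # 1023: predict, then update with the dimension-extended measurement
        xp, Pp = Fs[j] @ x0, Fs[j] @ P0 @ Fs[j].T + Qs[j]
        v = z - H @ xp
        S = H @ Pp @ H.T + R
        K = Pp @ H.T @ np.linalg.inv(S)
        xs_new[j] = xp + K @ v
        Ps_new[j] = (np.eye(d) - K @ H) @ Pp
        # Gaussian likelihood of the residual
        lik[j] = np.exp(-0.5 * v @ np.linalg.solve(S, v)) / \
                 np.sqrt(np.linalg.det(2.0 * np.pi * S))
    # 1024: model probability update; 1025: fusion
    mu_new = lik * c_bar
    mu_new /= mu_new.sum()
    x_fused = mu_new @ xs_new
    P_fused = sum(mu_new[i] * (Ps_new[i] +
                               np.outer(xs_new[i] - x_fused,
                                        xs_new[i] - x_fused))
                  for i in range(m))
    return mu_new, xs_new, Ps_new, x_fused, P_fused

C = np.array([[0.9, 0.1], [0.1, 0.9]])
mu = np.array([0.5, 0.5])
xs = np.array([[0.0, 0.0], [0.1, 0.0]])
Ps = np.stack([np.eye(2), np.eye(2)])
Fs = np.stack([np.eye(2), np.eye(2)])
Qs = np.stack([0.01 * np.eye(2), 0.01 * np.eye(2)])
H = np.vstack([np.eye(2), np.eye(2)])   # two arrays observing the state
R = 0.1 * np.eye(4)
z = np.array([0.05, 0.0, 0.05, 0.0])
mu_new, xs_new, Ps_new, x_f, P_f = imm_ekf_step(C, mu, xs, Ps, Fs, Qs, H, R, z)
```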
Fig. 4 is a block flow diagram of the robot tracking method provided by an embodiment of the present invention, and Fig. 5 is a schematic diagram of the calculation process of the state calculation result in the robot tracking method provided by an embodiment of the present invention; together they demonstrate an implementation in which two ultrasonic arrays are selected.
It is worth noting that the process of step 102 may also be implemented in other manners besides the manner described above; the embodiment of the present invention does not limit the specific manner.
Fig. 6 is a schematic structural diagram of the robot tracking device provided by an embodiment of the present invention. As shown in Fig. 6, the robot tracking device provided by the embodiment of the present invention includes a data acquisition module 1 and a calculation module 2.
The data acquisition module 1 is configured to: at each time instant of tracking, acquire observation data of the robot from at least two ultrasonic arrays. Specifically, the data acquisition module 1 is configured to: at time k, acquire the observation data of the robot from at least two ultrasonic arrays
Figure PCTCN2020105997-appb-000104
where k and n are both integers greater than 0, and
Figure PCTCN2020105997-appb-000105
are the vectors of robot angle and distance data measured by the at least two ultrasonic arrays.
The calculation module 2 is configured to: estimate the motion state of the robot at each time instant using the preset dimension-extended IMM-EKF algorithm; through m dimension-extended EKF filters matched to the m motion models corresponding to the m motion states at time k, obtain the state estimate of the robot under each motion model at time k, yielding m states; and perform a weighted calculation on the m states to obtain the state estimation result at time k, where each time instant is denoted by k, and k and m are both integers greater than 0.
Specifically, the calculation module 2 includes a robot tracking system establishment module 21, a model input interaction module 22, a sub-model filtering module 23, a model probability update module 24, and an estimation fusion output module 25.
The robot tracking system establishment module 21 is configured to:
establish the robot tracking system, which comprises the motion equation and the observation equation of the robot, expressed as follows:
Motion equation:
Figure PCTCN2020105997-appb-000106
Observation equation:
Figure PCTCN2020105997-appb-000107
C ij = P(M k = M j | M k-1 = M i );
where $i, j = 1, 2, \ldots, m$ index the motion models, $n$ indexes the ultrasonic arrays, m and n are integers greater than or equal to 1, and $k \in \mathbb{N}$ denotes the time. $C_{ij}$ is the probability that the target switches from model i at time k-1 to model j at time k, $F_k^i$ is the state transition matrix of the i-th model at time k, $x_k^i$ is the target state under the i-th motion model at time k, $H_k^n$ is the observation matrix of the n-th array at time k, and $z_k^n$ is the target state observation received by the n-th array at time k. $w_k^i$ is the process noise of model i and $v_k^n$ is the observation noise of the n-th array; both noises are assumed to be zero-mean Gaussian white noise with covariances $Q_k^i$ and $R_k^n$, respectively.
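To make the model set concrete, the sketch below builds two candidate transition matrices $F_k^i$ for a planar state $[x, y, v_x, v_y]$: a constant-velocity model and a coordinated-turn model. The state layout, sampling interval `dt`, and turn rate `omega` are illustrative assumptions, not details fixed by this description.

```python
import numpy as np

def cv_model(dt):
    """Constant-velocity transition matrix for state [x, y, vx, vy]."""
    return np.array([[1.0, 0.0, dt, 0.0],
                     [0.0, 1.0, 0.0, dt],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def ct_model(dt, omega):
    """Coordinated-turn transition matrix with turn rate omega (rad/s)."""
    s, c = np.sin(omega * dt), np.cos(omega * dt)
    return np.array([[1.0, 0.0, s / omega, -(1.0 - c) / omega],
                     [0.0, 1.0, (1.0 - c) / omega, s / omega],
                     [0.0, 0.0, c, -s],
                     [0.0, 0.0, s, c]])
```

An IMM filter bank would pair each such $F_k^i$ with its own process noise covariance $Q_k^i$.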
The model input interaction module 22 is used to:
Let $\hat{x}_{k-1|k-1}^i$ be the state estimate of extended-dimension EKF filter i at time k-1, $P_{k-1|k-1}^i$ the corresponding covariance matrix estimate, and $\mu_{k-1}^i$ the probability of model i at time k-1. After the interaction calculation, the input of extended-dimension EKF filter j at time k is computed as:

$\hat{x}_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \hat{x}_{k-1|k-1}^i$

$P_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \left[ P_{k-1|k-1}^i + \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right) \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right)^{\mathrm{T}} \right]$

where $\mu_{k-1}^{i|j} = C_{ij} \mu_{k-1}^i \Big/ \sum_{i=1}^{m} C_{ij} \mu_{k-1}^i$.
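In code, the interaction (mixing) calculation above might look as follows. This is a minimal sketch, not the patented implementation; the array shapes and variable names are assumptions.

```python
import numpy as np

def imm_mixing(x_prev, P_prev, mu_prev, C):
    """IMM interaction step: mix the m per-model estimates into filter inputs.

    x_prev:  (m, d) per-model state estimates at time k-1
    P_prev:  (m, d, d) per-model covariance estimates at time k-1
    mu_prev: (m,) model probabilities at time k-1
    C:       (m, m) transition matrix, C[i, j] = P(model j at k | model i at k-1)
    """
    m, d = x_prev.shape
    c_bar = C.T @ mu_prev                      # normalizers c_bar[j] = sum_i C[i,j]*mu[i]
    mu_cond = (C * mu_prev[:, None]) / c_bar   # mu_cond[i, j] = mu_{i|j}
    x0 = mu_cond.T @ x_prev                    # mixed state inputs, shape (m, d)
    P0 = np.zeros_like(P_prev)
    for j in range(m):
        for i in range(m):
            dx = x_prev[i] - x0[j]
            P0[j] += mu_cond[i, j] * (P_prev[i] + np.outer(dx, dx))
    return x0, P0, c_bar
```

When the per-model estimates agree, the mixing leaves them unchanged; the outer-product term only contributes spread when the models disagree.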
The sub-model filtering module 23 is used to:
after each extended-dimension EKF filter has computed its corresponding inputs $\hat{x}_{0j,k-1|k-1}$ and $P_{0j,k-1|k-1}$, use the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to update the corresponding state estimate under each model.
The model probability update module 24 is used to compute the model probability for each model $i = 1, 2, \ldots, m$ as follows:
$\mu_k^i = \Lambda_k^i \bar{c}_i \Big/ \sum_{j=1}^{m} \Lambda_k^j \bar{c}_j$

where $\bar{c}_i = \sum_{j=1}^{m} C_{ji} \mu_{k-1}^j$ and $\Lambda_k^i$ is the likelihood of model i at time k.
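A sketch of this probability update, assuming the per-model likelihoods $\Lambda_k^i$ have already been produced by the filters (names and shapes are illustrative):

```python
import numpy as np

def update_model_probabilities(likelihoods, c_bar):
    """Normalize the likelihood-weighted predicted model probabilities.

    likelihoods: (m,) per-model likelihoods from the filters
    c_bar:       (m,) predicted model probabilities, c_bar[i] = sum_j C[j, i] * mu_prev[j]
    """
    w = likelihoods * c_bar
    return w / w.sum()
```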
The estimation fusion output module 25 is used to:
compute the state estimate and covariance matrix estimate of the target at the current time from the updated probability, state estimate, and covariance matrix estimate of each model, as follows:
$\hat{x}_{k|k} = \sum_{i=1}^{m} \mu_k^i \hat{x}_{k|k}^i$

$P_{k|k} = \sum_{i=1}^{m} \mu_k^i \left[ P_{k|k}^i + \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right) \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right)^{\mathrm{T}} \right]$

$\hat{x}_{k|k}$ denotes the target state estimate at time k, and $P_{k|k}$ denotes the target state covariance matrix estimate at time k.
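The fusion output is a moment-matched mixture of the per-model estimates; a minimal sketch (array shapes are assumed):

```python
import numpy as np

def imm_fuse(x_models, P_models, mu):
    """Combine the m per-model estimates into one state and covariance.

    x_models: (m, d) per-model state estimates at time k
    P_models: (m, d, d) per-model covariance estimates at time k
    mu:       (m,) updated model probabilities
    """
    x = mu @ x_models                            # probability-weighted state
    P = np.zeros_like(P_models[0])
    for mu_i, x_i, P_i in zip(mu, x_models, P_models):
        dx = x_i - x
        P += mu_i * (P_i + np.outer(dx, dx))     # covariance plus spread term
    return x, P
```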
Further, the sub-model filtering module 23 includes a state prediction sub-module 231, a data fusion sub-module 232, and a filtering update sub-module 233.
The state prediction sub-module 231 is used to:
compute, for each model $i = 1, 2, \ldots, m$, the corresponding predicted state and predicted covariance matrix as follows:
$\hat{x}_{k|k-1}^i = F_k^i \hat{x}_{0i,k-1|k-1}$

$P_{k|k-1}^i = F_k^i P_{0i,k-1|k-1} (F_k^i)^{\mathrm{T}} + Q_k^i$
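The prediction step above, sketched for one model (shapes assumed):

```python
import numpy as np

def predict(x0, P0, F, Q):
    """Per-model prediction from the mixed inputs x0, P0."""
    x_pred = F @ x0                # predicted state
    P_pred = F @ P0 @ F.T + Q      # predicted covariance
    return x_pred, P_pred
```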
The data fusion sub-module 232 is used to:
perform data fusion using the dimension-extension algorithm; the corresponding augmented variables are:
$Z_k = \left[ (z_k^1)^{\mathrm{T}}, (z_k^2)^{\mathrm{T}}, \ldots, (z_k^n)^{\mathrm{T}} \right]^{\mathrm{T}}$

$H_k = \left[ (H_k^1)^{\mathrm{T}}, (H_k^2)^{\mathrm{T}}, \ldots, (H_k^n)^{\mathrm{T}} \right]^{\mathrm{T}}$

$R_k = \operatorname{diag}\left( R_k^1, R_k^2, \ldots, R_k^n \right)$
For each model $i = 1, 2, \ldots, m$, the measurement prediction residual and measurement covariance are computed as:

$\nu_k^i = Z_k - H_k \hat{x}_{k|k-1}^i$

$S_k^i = H_k P_{k|k-1}^i H_k^{\mathrm{T}} + R_k$
At the same time, the likelihood function corresponding to model i is computed; under the assumption of a Gaussian distribution, it is:

$\Lambda_k^i = \dfrac{1}{\sqrt{\left| 2\pi S_k^i \right|}} \exp\left( -\dfrac{1}{2} (\nu_k^i)^{\mathrm{T}} (S_k^i)^{-1} \nu_k^i \right)$
The filtering update sub-module 233 is used to:
compute, for each model $i = 1, 2, \ldots, m$, the respective filter gain, state estimate update, and error covariance matrix as follows:
$K_k^i = P_{k|k-1}^i H_k^{\mathrm{T}} (S_k^i)^{-1}$

$\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i \nu_k^i$

$P_{k|k}^i = \left( I - K_k^i H_k \right) P_{k|k-1}^i$
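Taken together, the augmentation, residual, likelihood, and update steps for one model can be sketched as below, under the linear-observation assumption. Variable names and shapes are illustrative, and a real implementation would likely replace the explicit matrix inverse with a solver.

```python
import numpy as np

def block_diag(mats):
    """Stack square matrices along the diagonal (augmented noise covariance)."""
    n = sum(m.shape[0] for m in mats)
    out = np.zeros((n, n))
    r = 0
    for m in mats:
        k = m.shape[0]
        out[r:r + k, r:r + k] = m
        r += k
    return out

def augmented_update(x_pred, P_pred, z_list, H_list, R_list):
    """One measurement-augmented KF update over all ultrasonic arrays."""
    Z = np.concatenate(z_list)                 # augmented measurement Z_k
    H = np.vstack(H_list)                      # augmented observation matrix H_k
    R = block_diag(R_list)                     # augmented noise covariance R_k
    nu = Z - H @ x_pred                        # measurement prediction residual
    S = H @ P_pred @ H.T + R                   # measurement covariance
    K = P_pred @ H.T @ np.linalg.inv(S)        # filter gain
    x_new = x_pred + K @ nu
    P_new = (np.eye(len(x_pred)) - K @ H) @ P_pred
    d = len(nu)                                # Gaussian likelihood of the residual
    lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) / np.sqrt(
        (2.0 * np.pi) ** d * np.linalg.det(S))
    return x_new, P_new, lik
```

The returned likelihood is what the model probability update step consumes as $\Lambda_k^i$.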
FIG. 7 is a schematic diagram of the composition of a robot tracking device provided by an embodiment of the present invention. As shown in FIG. 7, the robot tracking device includes a processor 3 and a memory 4; the memory 4 stores executable instructions of the processor 3, and the processor 3 is configured to execute, via the executable instructions, the steps of the robot tracking method of any of the above solutions.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the robot tracking method of any of the above solutions.
It should be noted that the division into the above functional modules is merely illustrative of how the robot tracking apparatus of the above embodiments triggers the robot tracking service; in practical applications, these functions may be allocated to different functional modules as required, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the robot tracking apparatus, robot tracking device, and computer-readable storage medium for triggering the robot tracking service provided by the above embodiments belong to the same concept as the robot tracking method embodiments; for their specific implementation, refer to the method embodiments, which are not repeated here.
All of the above optional technical solutions may be combined in any manner to form optional embodiments of the present invention, which are not enumerated here one by one.
To illustrate the advantages of the robot tracking solution provided by the embodiments of the present invention for indoor tracking of automated equipment, robot measurement data were processed with the robot tracking method of the embodiments of the present invention, the IMM-EKF method, and the weighted IMM-EKF method to estimate the robot state; the results are shown in FIG. 8.
FIG. 8 compares the tracking trajectories of the robot tracking solution of the embodiment of the present invention with those of the existing solutions. FIG. 9 compares their tracking errors.
As shown in FIG. 9, to further characterize the performance of the different methods, the tracking errors of the above estimation results are computed for performance evaluation. The state estimation error at time $t_k$ is:
$e_k = \sqrt{ \left( \hat{x}_k - x_k \right)^2 + \left( \hat{y}_k - y_k \right)^2 }$
where $(\hat{x}_k, \hat{y}_k)$ are the position coordinates obtained from the target state estimate at time $t_k$, and $(x_k, y_k)$ is the true position of the target at time $t_k$.
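The error metric above is the per-step Euclidean distance between the estimated and true positions; a small sketch:

```python
import numpy as np

def tracking_error(est_xy, true_xy):
    """Per-step Euclidean position error between estimate and ground truth."""
    est = np.asarray(est_xy, dtype=float)
    truth = np.asarray(true_xy, dtype=float)
    return np.linalg.norm(est - truth, axis=-1)
```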
Table 1 below shows the average target tracking error of the three methods:
[Table 1. Average target tracking error of the three methods; rendered as an image in the original publication.]
It can be seen that the tracking accuracy of the robot tracking method provided by the embodiment of the present invention is clearly superior to that of the IMM-EKF algorithm, and, compared with the weighted IMM-EKF algorithm, the tracking error is reduced by nearly 50%.
In summary, compared with the prior art, the robot tracking method, apparatus, device, and computer-readable storage medium provided by the embodiments of the present invention have the following beneficial effects:
1. By arranging multiple ultrasonic arrays, observation data are acquired at every tracking time; on the basis of the IMM-EKF algorithm, the preset extended-dimension IMM-EKF algorithm augments the measurement dimension at every iteration step, so that more target motion state information is obtained, making the method suitable for multiple ultrasonic arrays.
2. The original observation data are fully utilized and the fusion effect is optimal, improving the accuracy of indoor robot tracking; the tracking error is small and the computational cost relatively low, so an intelligent robot can be tracked stably and effectively even when its state is unknown and changeable, reducing mistracking and tracking loss.
3. Tracking the intelligent robot with the extended-dimension IMM-EKF algorithm effectively attenuates the influence of reverberation and noise on tracking accuracy; the tracking error is clearly smaller than that of the traditional IMM-EKF algorithm, and the method also remains robust in tracking scenarios with missing observation data.
A person of ordinary skill in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The embodiments of the present application are described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a specific manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operational steps are executed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although preferred embodiments of the present application have been described, those skilled in the art, once aware of the basic inventive concept, may make additional changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as including the preferred embodiments and all changes and modifications falling within the scope of the embodiments of the present application.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of the present invention and their technical equivalents, the present invention is intended to include them as well.
The above are merely preferred embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall fall within its scope of protection.

Claims (10)

  1. A robot tracking method, characterized in that the method comprises:
    at each tracking time, acquiring observation data of a robot from at least two ultrasonic arrays;
    estimating the motion state of the robot at each time using a preset extended-dimension IMM-EKF algorithm: through m extended-dimension EKF filters matched to m motion models corresponding to m motion states at time k, obtaining the state estimate of the robot under each motion model at time k to yield m states, and performing a weighted calculation on the m states to obtain the robot state estimation result at time k, wherein each time is denoted by time k, and k and m are integers greater than 0.
  2. The method according to claim 1, characterized in that acquiring, at each tracking time, observation data of the robot from at least two ultrasonic arrays comprises:
    at time k, acquiring observation data $z_k^1, z_k^2, \ldots, z_k^n$ of the robot from the at least two ultrasonic arrays, wherein k and n are integers greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are vectors of the robot angle and distance data measured by the at least two ultrasonic arrays.
  3. The method according to claim 2, characterized in that estimating the motion state of the robot at each time using the preset extended-dimension IMM-EKF algorithm, obtaining, through m extended-dimension EKF filters matched to the m motion models corresponding to the m motion states at time k, the state estimate of the robot under each motion model at time k to yield m states, and performing a weighted calculation on the m states to obtain the robot state estimation result at time k, comprises:
    a robot tracking system establishment step: establishing the robot tracking system, which comprises the motion equation and observation equation of the robot, expressed as follows:
    motion equation:

    $x_k^i = F_k^i x_{k-1}^i + w_k^i$

    observation equation:

    $z_k^n = H_k^n x_k + v_k^n$

    $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i);$
    where $i, j = 1, 2, \ldots, m$ index the motion models, $n$ indexes the ultrasonic arrays, m and n are integers greater than or equal to 1, and $k \in \mathbb{N}$ denotes the time; $C_{ij}$ is the probability that the target switches from model i at time k-1 to model j at time k, $F_k^i$ is the state transition matrix of the i-th model at time k, $x_k^i$ is the target state under the i-th motion model at time k, $H_k^n$ is the observation matrix of the n-th array at time k, and $z_k^n$ is the target state observation received by the n-th array at time k; $w_k^i$ is the process noise of model i and $v_k^n$ is the observation noise of the n-th array, both noises being assumed to be zero-mean Gaussian white noise with covariances $Q_k^i$ and $R_k^n$, respectively;
    a model input interaction step: letting $\hat{x}_{k-1|k-1}^i$ be the state estimate of extended-dimension EKF filter i at time k-1, $P_{k-1|k-1}^i$ the corresponding covariance matrix estimate, and $\mu_{k-1}^i$ the probability of model i at time k-1, after the interaction calculation the input of extended-dimension EKF filter j at time k is computed as:

    $\hat{x}_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \hat{x}_{k-1|k-1}^i$

    $P_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \left[ P_{k-1|k-1}^i + \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right) \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right)^{\mathrm{T}} \right]$

    where $\mu_{k-1}^{i|j} = C_{ij} \mu_{k-1}^i \Big/ \sum_{i=1}^{m} C_{ij} \mu_{k-1}^i$;
    a sub-model filtering step: after each extended-dimension EKF filter has computed its corresponding inputs $\hat{x}_{0j,k-1|k-1}$ and $P_{0j,k-1|k-1}$, using the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to update the corresponding state estimate under each model;
    a model probability update step: computing the model probability for each model $i = 1, 2, \ldots, m$ as:

    $\mu_k^i = \Lambda_k^i \bar{c}_i \Big/ \sum_{j=1}^{m} \Lambda_k^j \bar{c}_j$

    where $\bar{c}_i = \sum_{j=1}^{m} C_{ji} \mu_{k-1}^j$ and $\Lambda_k^i$ is the likelihood of model i at time k;
    an estimation fusion output step: computing the state estimate and covariance matrix estimate of the target at the current time from the updated probability, state estimate, and covariance matrix estimate of each model, as:

    $\hat{x}_{k|k} = \sum_{i=1}^{m} \mu_k^i \hat{x}_{k|k}^i$

    $P_{k|k} = \sum_{i=1}^{m} \mu_k^i \left[ P_{k|k}^i + \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right) \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right)^{\mathrm{T}} \right]$

    where $\hat{x}_{k|k}$ denotes the target state estimate at time k, and $P_{k|k}$ denotes the target state covariance matrix estimate at time k.
  4. The method according to claim 3, characterized in that the sub-model filtering step comprises:
    a state prediction sub-step: for each model $i = 1, 2, \ldots, m$, computing the corresponding predicted state and predicted covariance matrix as:

    $\hat{x}_{k|k-1}^i = F_k^i \hat{x}_{0i,k-1|k-1}$

    $P_{k|k-1}^i = F_k^i P_{0i,k-1|k-1} (F_k^i)^{\mathrm{T}} + Q_k^i$
    a data fusion sub-step: performing data fusion using the dimension-extension algorithm, with the corresponding augmented variables:

    $Z_k = \left[ (z_k^1)^{\mathrm{T}}, (z_k^2)^{\mathrm{T}}, \ldots, (z_k^n)^{\mathrm{T}} \right]^{\mathrm{T}}$

    $H_k = \left[ (H_k^1)^{\mathrm{T}}, (H_k^2)^{\mathrm{T}}, \ldots, (H_k^n)^{\mathrm{T}} \right]^{\mathrm{T}}$

    $R_k = \operatorname{diag}\left( R_k^1, R_k^2, \ldots, R_k^n \right)$
    for each model $i = 1, 2, \ldots, m$, computing the respective measurement prediction residual and measurement covariance as:

    $\nu_k^i = Z_k - H_k \hat{x}_{k|k-1}^i$

    $S_k^i = H_k P_{k|k-1}^i H_k^{\mathrm{T}} + R_k$

    and at the same time computing the likelihood function corresponding to model i, which, under the assumption of a Gaussian distribution, is:

    $\Lambda_k^i = \dfrac{1}{\sqrt{\left| 2\pi S_k^i \right|}} \exp\left( -\dfrac{1}{2} (\nu_k^i)^{\mathrm{T}} (S_k^i)^{-1} \nu_k^i \right)$
    a filtering update sub-step: for each model $i = 1, 2, \ldots, m$, computing the respective filter gain, state estimate update, and error covariance matrix as:

    $K_k^i = P_{k|k-1}^i H_k^{\mathrm{T}} (S_k^i)^{-1}$

    $\hat{x}_{k|k}^i = \hat{x}_{k|k-1}^i + K_k^i \nu_k^i$

    $P_{k|k}^i = \left( I - K_k^i H_k \right) P_{k|k-1}^i$
  5. A robot tracking apparatus, characterized in that the apparatus comprises:
    a data acquisition module, used to acquire, at each tracking time, observation data of a robot from at least two ultrasonic arrays;
    a calculation module, used to estimate the motion state of the robot at each time using a preset extended-dimension IMM-EKF algorithm: through m extended-dimension EKF filters matched to m motion models corresponding to m motion states at time k, obtain the state estimate of the robot under each motion model at time k to yield m states, and perform a weighted calculation on the m states to obtain the state estimation result at time k, wherein each time is denoted by time k, and k and m are integers greater than 0.
  6. The apparatus according to claim 5, characterized in that the data acquisition module is used to:
    acquire, at time k, observation data $z_k^1, z_k^2, \ldots, z_k^n$ of the robot from the at least two ultrasonic arrays, wherein k and n are integers greater than 0, and $z_k^1, z_k^2, \ldots, z_k^n$ are vectors of the robot angle and distance data measured by the at least two ultrasonic arrays.
  7. The apparatus according to claim 6, characterized in that the calculation module comprises a robot tracking system establishment module, used to:
    establish the robot tracking system, which comprises the motion equation and observation equation of the robot, expressed as follows:
    motion equation:

    $x_k^i = F_k^i x_{k-1}^i + w_k^i$

    observation equation:

    $z_k^n = H_k^n x_k + v_k^n$

    $C_{ij} = P(M_k = M_j \mid M_{k-1} = M_i);$
    where $i, j = 1, 2, \ldots, m$ index the motion models, $n$ indexes the ultrasonic arrays, m and n are integers greater than or equal to 1, and $k \in \mathbb{N}$ denotes the time; $C_{ij}$ is the probability that the target switches from model i at time k-1 to model j at time k, $F_k^i$ is the state transition matrix of the i-th model at time k, $x_k^i$ is the target state under the i-th motion model at time k, $H_k^n$ is the observation matrix of the n-th array at time k, and $z_k^n$ is the target state observation received by the n-th array at time k; $w_k^i$ is the process noise of model i and $v_k^n$ is the observation noise of the n-th array, both noises being assumed to be zero-mean Gaussian white noise with covariances $Q_k^i$ and $R_k^n$, respectively;
    a model input interaction module, used to: letting $\hat{x}_{k-1|k-1}^i$ be the state estimate of extended-dimension EKF filter i at time k-1, $P_{k-1|k-1}^i$ the corresponding covariance matrix estimate, and $\mu_{k-1}^i$ the probability of model i at time k-1, compute, after the interaction calculation, the input of extended-dimension EKF filter j at time k as:

    $\hat{x}_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \hat{x}_{k-1|k-1}^i$

    $P_{0j,k-1|k-1} = \sum_{i=1}^{m} \mu_{k-1}^{i|j} \left[ P_{k-1|k-1}^i + \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right) \left( \hat{x}_{k-1|k-1}^i - \hat{x}_{0j,k-1|k-1} \right)^{\mathrm{T}} \right]$

    where $\mu_{k-1}^{i|j} = C_{ij} \mu_{k-1}^i \Big/ \sum_{i=1}^{m} C_{ij} \mu_{k-1}^i$;
    a sub-model filtering module, used to: after each extended-dimension EKF filter has computed its corresponding inputs $\hat{x}_{0j,k-1|k-1}$ and $P_{0j,k-1|k-1}$, use the obtained measurements $z_k^1, z_k^2, \ldots, z_k^n$ to update the corresponding state estimate under each model;
    a model probability update module, used to compute the model probability for each model $i = 1, 2, \ldots, m$ as:

    $\mu_k^i = \Lambda_k^i \bar{c}_i \Big/ \sum_{j=1}^{m} \Lambda_k^j \bar{c}_j$

    where $\bar{c}_i = \sum_{j=1}^{m} C_{ji} \mu_{k-1}^j$ and $\Lambda_k^i$ is the likelihood of model i at time k;
    an estimation fusion output module, used to compute the state estimate and covariance matrix estimate of the target at the current time from the updated probability, state estimate, and covariance matrix estimate of each model, as:

    $\hat{x}_{k|k} = \sum_{i=1}^{m} \mu_k^i \hat{x}_{k|k}^i$

    $P_{k|k} = \sum_{i=1}^{m} \mu_k^i \left[ P_{k|k}^i + \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right) \left( \hat{x}_{k|k}^i - \hat{x}_{k|k} \right)^{\mathrm{T}} \right]$

    where $\hat{x}_{k|k}$ denotes the target state estimate at time k, and $P_{k|k}$ denotes the target state covariance matrix estimate at time k.
  8. The apparatus according to claim 7, characterized in that the sub-model filtering module comprises:
    a state prediction sub-module, used to compute, for each model $i = 1, 2, \ldots, m$, the corresponding predicted state and predicted covariance matrix as:

    $\hat{x}_{k|k-1}^i = F_k^i \hat{x}_{0i,k-1|k-1}$

    $P_{k|k-1}^i = F_k^i P_{0i,k-1|k-1} (F_k^i)^{\mathrm{T}} + Q_k^i$
    a data fusion sub-module, configured to: fuse the data using a dimension-extension algorithm, the corresponding variables being given by
    Figure PCTCN2020105997-appb-100061
    Figure PCTCN2020105997-appb-100062
    Figure PCTCN2020105997-appb-100063
    for each model i = 1, 2, ..., m, the respective measurement prediction residual and measurement covariance are computed according to
    Figure PCTCN2020105997-appb-100064
    Figure PCTCN2020105997-appb-100065
    and the likelihood function corresponding to model i is computed at the same time; under the assumption of a Gaussian distribution, the likelihood function is
    Figure PCTCN2020105997-appb-100066
    a filter update sub-module, configured to: for each model i = 1, 2, ..., m, compute the respective filter gain, state estimate update, and error covariance matrix according to
    Figure PCTCN2020105997-appb-100067
    Figure PCTCN2020105997-appb-100068
    Figure PCTCN2020105997-appb-100069
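The three sub-modules above describe one filter cycle per model: predict, stack the sensor measurements into one dimension-extended vector, evaluate the Gaussian likelihood of the stacked residual, then apply the Kalman gain. A hypothetical sketch of that cycle, using a linear measurement model for brevity where the patent's EKF would linearize h(x) (the function and variable names are assumptions):

```python
import numpy as np

def ekf_submodel_step(x, P, F, Q, z_stacked, H_stacked, R_stacked):
    """One sub-model filter cycle with dimension-extended measurements:
    several sensors' readings are stacked into one vector z, with block
    matrices H and R, so a single update fuses them."""
    # State prediction: x_{k|k-1} = F x_{k-1|k-1},  P_{k|k-1} = F P F^T + Q
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q

    # Measurement prediction residual and its covariance
    v = z_stacked - H_stacked @ x_pred                 # residual
    S = H_stacked @ P_pred @ H_stacked.T + R_stacked   # innovation covariance

    # Gaussian likelihood of the stacked measurement under this model
    n = v.size
    lik = np.exp(-0.5 * v @ np.linalg.solve(S, v)) / np.sqrt(
        (2 * np.pi) ** n * np.linalg.det(S))

    # Filter gain, state estimate update, and error covariance update
    K = P_pred @ H_stacked.T @ np.linalg.inv(S)
    x_new = x_pred + K @ v
    P_new = (np.eye(len(x)) - K @ H_stacked) @ P_pred
    return x_new, P_new, lik
```

The returned likelihood is what the model probability update module of claim 7 would consume for this model.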
  9. A robot tracking device, characterized by comprising:
    a processor; and
    a memory for storing executable instructions of the processor;
    wherein the processor is configured to execute the steps of the robot tracking method according to any one of claims 1 to 4 via the executable instructions.
  10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, implements the steps of the robot tracking method according to any one of claims 1 to 4.
PCT/CN2020/105997 2019-10-29 2020-07-30 Robot tracking method, device and equipment and computer readable storage medium WO2021082571A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA3158929A CA3158929A1 (en) 2019-10-29 2020-07-30 Robot tracking method, device, equipment, and computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911048673.3A CN110849369B (en) 2019-10-29 2019-10-29 Robot tracking method, device, equipment and computer readable storage medium
CN201911048673.3 2019-10-29

Publications (1)

Publication Number Publication Date
WO2021082571A1 true WO2021082571A1 (en) 2021-05-06

Family

ID=69599184

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/105997 WO2021082571A1 (en) 2019-10-29 2020-07-30 Robot tracking method, device and equipment and computer readable storage medium

Country Status (3)

Country Link
CN (1) CN110849369B (en)
CA (1) CA3158929A1 (en)
WO (1) WO2021082571A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113534164A (en) * 2021-05-24 2021-10-22 中船海洋探测技术研究院有限公司 Target path tracking method based on active and passive combined sonar array
CN113805141A (en) * 2021-08-31 2021-12-17 西北工业大学 Single-station passive positioning method based on signal intensity
CN114021073A (en) * 2021-09-24 2022-02-08 西北工业大学 Multi-sensor cooperative target tracking method based on federal IMM
CN114018250A (en) * 2021-10-18 2022-02-08 杭州鸿泉物联网技术股份有限公司 Inertial navigation method, electronic device, storage medium, and computer program product
CN114445456A (en) * 2021-12-23 2022-05-06 西北工业大学 Data-driven intelligent maneuvering target tracking method and device based on partial model
CN114488116A (en) * 2022-01-17 2022-05-13 武汉大学 3D target tracking method based on two-coordinate exogenous radar systems
CN115166635A (en) * 2022-06-24 2022-10-11 江南大学 Robot positioning method based on risk sensitive FIR filtering
CN115792796A (en) * 2023-02-13 2023-03-14 鹏城实验室 Cooperative positioning method, device and terminal based on relative observation equivalent model
CN116383966A (en) * 2023-03-30 2023-07-04 中国矿业大学 Multi-unmanned system distributed cooperative positioning method based on interaction multi-model

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN110849369B (en) * 2019-10-29 2022-03-29 苏宁云计算有限公司 Robot tracking method, device, equipment and computer readable storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
CN101661104A (en) * 2009-09-24 2010-03-03 北京航空航天大学 Target tracking method based on radar/infrared measurement data coordinate conversion
WO2012065233A1 (en) * 2010-11-19 2012-05-24 Commonwealth Scientific And Industrial Research Organisation Tracking location of mobile devices in a wireless network
CN103853908A (en) * 2012-12-04 2014-06-11 中国科学院沈阳自动化研究所 Self-adapting interactive multiple model mobile target tracking method
CN104252178A (en) * 2014-09-12 2014-12-31 西安电子科技大学 Strong maneuver-based target tracking method
CN104316058A (en) * 2014-11-04 2015-01-28 东南大学 Interacting multiple model adopted WSN-INS combined navigation method for mobile robot
US20170140141A1 (en) * 2015-11-16 2017-05-18 Personnus System for identity verification
CN109781118A (en) * 2019-03-08 2019-05-21 兰州交通大学 A kind of location tracking method of unmanned vehicle
CN110849369A (en) * 2019-10-29 2020-02-28 苏宁云计算有限公司 Robot tracking method, device, equipment and computer readable storage medium

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
US5325098A (en) * 1993-06-01 1994-06-28 The United States Of America As Represented By The Secretary Of The Navy Interacting multiple bias model filter system for tracking maneuvering targets
JP2009244234A (en) * 2008-03-31 2009-10-22 New Industry Research Organization Ultrasonic array sensor and signal processing method
CN101610567B (en) * 2009-07-10 2012-05-30 华南理工大学 Dynamic group scheduling method based on wireless sensor network
DE102010027972A1 (en) * 2010-04-20 2011-10-20 Robert Bosch Gmbh Arrangement for determining the distance and the direction to an object
CN101894278B (en) * 2010-07-16 2012-06-27 西安电子科技大学 Human motion tracing method based on variable structure multi-model
JP2014228278A (en) * 2013-05-17 2014-12-08 日本精工株式会社 Ultrasonic proximity sensor device and object detection method
CN106093951B (en) * 2016-06-06 2018-11-13 清华大学 Object tracking methods based on array of ultrasonic sensors
WO2018010099A1 (en) * 2016-07-12 2018-01-18 深圳大学 Target tracking method for turn maneuver, and system for same
WO2018119912A1 (en) * 2016-12-29 2018-07-05 深圳大学 Target tracking method and device based on parallel fuzzy gaussian and particle filter
CN109029243B (en) * 2018-07-04 2021-02-26 南京理工大学 Improved agricultural machinery working area measuring terminal and method
CN110095728A (en) * 2019-05-23 2019-08-06 合肥工业大学智能制造技术研究院 Battery SOC, SOH combined estimation method based on interactive multi-model


Cited By (15)

Publication number Priority date Publication date Assignee Title
CN113534164A (en) * 2021-05-24 2021-10-22 中船海洋探测技术研究院有限公司 Target path tracking method based on active and passive combined sonar array
CN113534164B (en) * 2021-05-24 2023-12-12 中船海洋探测技术研究院有限公司 Target path tracking method based on active-passive combined sonar array
CN113805141B (en) * 2021-08-31 2023-06-30 西北工业大学 Single-station passive positioning method based on signal intensity
CN113805141A (en) * 2021-08-31 2021-12-17 西北工业大学 Single-station passive positioning method based on signal intensity
CN114021073A (en) * 2021-09-24 2022-02-08 西北工业大学 Multi-sensor cooperative target tracking method based on federal IMM
CN114018250A (en) * 2021-10-18 2022-02-08 杭州鸿泉物联网技术股份有限公司 Inertial navigation method, electronic device, storage medium, and computer program product
CN114018250B (en) * 2021-10-18 2024-05-03 杭州鸿泉物联网技术股份有限公司 Inertial navigation method, electronic device, storage medium and computer program product
CN114445456A (en) * 2021-12-23 2022-05-06 西北工业大学 Data-driven intelligent maneuvering target tracking method and device based on partial model
CN114488116A (en) * 2022-01-17 2022-05-13 武汉大学 3D target tracking method based on two-coordinate exogenous radar systems
CN114488116B (en) * 2022-01-17 2024-04-26 武汉大学 3D target tracking method based on two-part two-coordinate exogenous radar system
CN115166635A (en) * 2022-06-24 2022-10-11 江南大学 Robot positioning method based on risk sensitive FIR filtering
CN115166635B (en) * 2022-06-24 2023-03-28 江南大学 Robot positioning method based on risk sensitive FIR filtering
CN115792796A (en) * 2023-02-13 2023-03-14 鹏城实验室 Cooperative positioning method, device and terminal based on relative observation equivalent model
CN116383966A (en) * 2023-03-30 2023-07-04 中国矿业大学 Multi-unmanned system distributed cooperative positioning method based on interaction multi-model
CN116383966B (en) * 2023-03-30 2023-11-21 中国矿业大学 Multi-unmanned system distributed cooperative positioning method based on interaction multi-model

Also Published As

Publication number Publication date
CN110849369B (en) 2022-03-29
CA3158929A1 (en) 2021-05-06
CN110849369A (en) 2020-02-28

Similar Documents

Publication Publication Date Title
WO2021082571A1 (en) Robot tracking method, device and equipment and computer readable storage medium
CN107038717B (en) A method of 3D point cloud registration error is automatically analyzed based on three-dimensional grid
US11138742B2 (en) Event-based feature tracking
CN111354022B (en) Target Tracking Method and System Based on Kernel Correlation Filtering
Agrawal et al. PCE-SLAM: A real-time simultaneous localization and mapping using LiDAR data
Feng et al. Visual map construction using RGB-D sensors for image-based localization in indoor environments
CN111812978B (en) Cooperative SLAM method and system for multiple unmanned aerial vehicles
Li et al. Indoor multi-sensor fusion positioning based on federated filtering
CN114063056A (en) Ship track fusion method, system, medium and equipment
Deng et al. Long-range binocular vision target geolocation using handheld electronic devices in outdoor environment
Dong-Si et al. Consistency analysis for sliding-window visual odometry
Masmitja et al. Underwater mobile target tracking with particle filter using an autonomous vehicle
Le et al. Human detection and tracking for autonomous human-following quadcopter
CN113947636B (en) Laser SLAM positioning system and method based on deep learning
CN115962773A (en) Method, device and equipment for synchronous positioning and map construction of mobile robot
Nawaf et al. Guided underwater survey using semi-global visual odometry
Zhang et al. OW-LOAM: Observation-weighted LiDAR odometry and mapping
CN114613002B (en) Dynamic object detection method and system under motion visual angle based on light projection principle
Wang et al. Improved simultaneous localization and mapping by stereo camera and SURF
Li et al. Research on image feature extraction and matching algorithms for simultaneous localization and mapping
Ren An improved binocular LSD_SLAM method for object localization
Myasnikov Impact of mobile device sensors errors on SLAM problem solution
Ge A real-time stereo visual servoing for moving object grasping based parallel algorithms
Baur et al. On Runtime Reduction in 3D Extended Object Tracking by Measurement Downsampling
Nguyen et al. A robust localization method for mobile robots based on ceiling landmarks

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20882375

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3158929

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20882375

Country of ref document: EP

Kind code of ref document: A1