CN114624688B - Tracking and positioning method based on multi-sensor combination - Google Patents

Tracking and positioning method based on multi-sensor combination Download PDF

Info

Publication number
CN114624688B
Authority
CN
China
Prior art keywords
target
observation station
observation
coordinate
matrix
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210254096.9A
Other languages
Chinese (zh)
Other versions
CN114624688A (en)
Inventor
李静玲
李改有
魏逸凡
陈奕琪
高林
魏平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202210254096.9A priority Critical patent/CN114624688B/en
Publication of CN114624688A publication Critical patent/CN114624688A/en
Application granted granted Critical
Publication of CN114624688B publication Critical patent/CN114624688B/en

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 - Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/02 - Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 11/00 - Systems for determining distance or velocity not using reflection or reradiation
    • G01S 11/02 - Systems for determining distance or velocity not using reflection or reradiation using radio waves
    • G01S 11/04 - Systems for determining distance or velocity not using reflection or reradiation using radio waves using angle measurements
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 3/00 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S 3/02 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using radio waves
    • G01S 3/14 - Systems for determining direction or deviation from predetermined direction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/25 - Fusion techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 - Computer-aided design [CAD]
    • G06F 30/20 - Design optimisation, verification or simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Hardware Design (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the technical field of tracking and positioning, and particularly relates to a tracking and positioning method based on a multi-sensor combination. The method first smooths the measured data by filtering and then performs joint positioning with multi-sensor confidence fusion, effectively avoiding the explicit data association and hard decisions that conventional methods require. The whole system is built on a Bayesian framework and information is passed in probabilistic form, so the proposed scheme has good performance, environmental adaptability and robustness, and can meet engineering design requirements.

Description

Tracking and positioning method based on multi-sensor combination
Technical Field
The invention belongs to the technical field of tracking and positioning, and particularly relates to a tracking and positioning method based on multi-sensor combination.
Background
Traditional radiation-source tracking first sorts and identifies the targets in a detection area, then performs frequency correlation, and finally performs positioning and tracking. Data association and hard decisions are usually made during the initial sorting and identification; results obtained in this way cannot be corrected in later stages, the computational load grows exponentially with the target and measurement dimensionalities, and effective real-time tracking is difficult in complex scenes. Traditional methods therefore handle a single target first and only then track it.
In recent years, tracking algorithms based on the random finite set framework have attracted wide attention: they can quickly track an unknown number of targets without explicit association between measurements and targets. The Probability Hypothesis Density (PHD) filter in particular is widely used in multi-target tracking because of its low computational complexity and easy implementation. In the present method, the PHD filter is mainly used to smooth the measured information, the direction of arrival (DOA), and so reduce the influence of clutter; the target positions are then estimated from the smoothed results by joint positioning across multiple sensors. Tracking by smoothing first and positioning afterwards avoids hard decisions, improves the adaptability and robustness of the algorithm in complex scenes, and improves multi-target tracking performance.
Disclosure of Invention
To address these problems, the invention provides a multi-sensor joint tracking and positioning algorithm that tracks an unknown number of radiation sources when the initial positions of the targets are also unknown. The algorithm offers good performance, environmental adaptability and robustness, and meets engineering design requirements.
The technical scheme adopted by the invention is as follows:
the method adopts the joint positioning of firstly smoothing the measured data and then fusing the confidence degrees of the multiple sensors, thereby effectively avoiding the generation of the required target. The whole system is based on a Bayesian framework, and the information transmission is based on a probability description form, so that the proposed scheme has strong robustness and expansibility.
Let the total observation sampling time be K, the total number of observation stations be M and the actual number of targets be $N_k$; the total number of targets detected by the i-th observation station is $N_k^{i}$. At time k ($1 \le k \le K$), the state vector $\mathbf{x}_k^{i,j}$ of target j extracted from the signal received by observation station i comprises the angle of arrival $\theta_k^{i,j}$ and the angle-of-arrival acceleration of target j relative to observation station i, together with the signal frequency $f_k^{i,j}$ of target j measured at observation station i; the corresponding confidence is $w_k^{i,j}$. Let the coordinate of target j at time k be $(x_k^{j},\,y_k^{j})$ and the coordinate of observation station i be $(x_i,\,y_i)$, and define

$$\theta_{k}^{i,j}=\arctan\frac{y_{k}^{j}-y_{i}}{x_{k}^{j}-x_{i}}+n_{i},$$

where $n_i$ is the angle-measurement error of the i-th observation station, obeying a zero-mean distribution with variance $\sigma_{i}^{2}$, $1 \le i \le M$. The tracking and positioning method comprises the following steps:
S1. Smooth the measured data with a PHD filtering algorithm, specifically as follows:
S11. Assume the targets and the observation stations lie in the XY plane; at time k, observation station i acquires the observation $\mathbf{z}_{k}^{i,j}$ of target j.
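For illustration, a minimal Python sketch of the observation model above is given here; it generates the noisy bearing of one target as seen from one observation station. The function name and the one-degree noise level are assumptions made for the example only.

    import numpy as np

    def doa_observation(target_xy, station_xy, sigma_rad=np.pi / 180, rng=np.random.default_rng()):
        """Noisy angle of arrival: theta = arctan((y_t - y_i) / (x_t - x_i)) + n_i."""
        dx = target_xy[0] - station_xy[0]
        dy = target_xy[1] - station_xy[1]
        theta = np.arctan2(dy, dx)                  # true bearing (arctan2 resolves the quadrant)
        return theta + rng.normal(0.0, sigma_rad)   # add zero-mean angle-measurement error n_i

For example, doa_observation((100.0, 200.0), (-3000.0, -7000.0)) returns one bearing perturbed by noise with standard deviation pi/180 rad, the same level used later in the simulation example.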
S12, smoothing the measured data by using a Gaussian mixture probability hypothesis density filter, and the steps are as follows:
s121, defining the total number of targets of the observation station i at the time k-1 as
Figure BDA0003548129760000027
The posterior intensity of the target of observation station i
Figure BDA0003548129760000028
In the form of a mixture of gaussians:
Figure BDA0003548129760000029
wherein
Figure BDA00035481297600000210
A gaussian function with mean m and variance P is defined. />
Figure BDA00035481297600000211
The confidence, state vector and covariance matrix of the corresponding target j are indicated, respectively.
S122, defining a multi-target intensity function of the observation station i at the k moment to be predicted, wherein the multi-target intensity function conforms to a Gaussian mixture form:
Figure BDA00035481297600000212
the expression is divided into two parts, namely the posterior intensity of the survival target
Figure BDA00035481297600000213
Figure BDA00035481297600000214
Figure BDA00035481297600000215
And posterior intensity of newborn target
Figure BDA00035481297600000216
Wherein p is S,k To target survival probability, F k|k-1 Being a state transition matrix, Q k-1 Is a process noise covariance matrix;
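For illustration, the Gaussian-mixture prediction of step S122 can be sketched in Python as follows, with each component stored as a (weight, mean, covariance) tuple; this representation and the function name are assumptions for the example, not the patented implementation.

    import numpy as np

    def gm_phd_predict(components, F, Q, p_S, birth_components):
        """Predict a GM-PHD intensity: propagate surviving components, then append births."""
        predicted = []
        for w, m, P in components:
            m_pred = F @ m                               # m_{S,k|k-1} = F_{k|k-1} m_{k-1}
            P_pred = F @ P @ F.T + Q                     # P_{S,k|k-1} = F P F^T + Q_{k-1}
            predicted.append((p_S * w, m_pred, P_pred))  # weight scaled by the survival probability
        predicted.extend(birth_components)               # newborn-target intensity gamma_k
        return predicted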
S123. From the predicted PHD, combined with the measurement set $Z_{k}^{i}$ obtained at the current time, the updated posterior intensity of observation station i at time k is again a Gaussian mixture:

$$v_{k}^{i}(\mathbf{x})=(1-p_{D,k})\,v_{k|k-1}^{i}(\mathbf{x})+\sum_{\mathbf{z}\in Z_{k}^{i}}\sum_{j=1}^{J_{k|k-1}^{i}} w_{k}^{i,j}(\mathbf{z})\,\mathcal{N}\!\left(\mathbf{x};\,\mathbf{m}_{k|k}^{i,j}(\mathbf{z}),\,P_{k|k}^{i,j}\right),$$
$$w_{k}^{i,j}(\mathbf{z})=\frac{p_{D,k}\,w_{k|k-1}^{i,j}\,q_{k}^{i,j}(\mathbf{z})}{\kappa_{k}(\mathbf{z})+p_{D,k}\sum_{l=1}^{J_{k|k-1}^{i}} w_{k|k-1}^{i,l}\,q_{k}^{i,l}(\mathbf{z})},$$
$$q_{k}^{i,j}(\mathbf{z})=\mathcal{N}\!\left(\mathbf{z};\,H_{k}\mathbf{m}_{k|k-1}^{i,j},\,H_{k}P_{k|k-1}^{i,j}H_{k}^{\mathrm T}+R_{k}\right),$$
$$\mathbf{m}_{k|k}^{i,j}(\mathbf{z})=\mathbf{m}_{k|k-1}^{i,j}+K_{k}^{i,j}\left(\mathbf{z}-H_{k}\mathbf{m}_{k|k-1}^{i,j}\right),$$
$$P_{k|k}^{i,j}=\left(I-K_{k}^{i,j}H_{k}\right)P_{k|k-1}^{i,j},$$
$$K_{k}^{i,j}=P_{k|k-1}^{i,j}H_{k}^{\mathrm T}\left(H_{k}P_{k|k-1}^{i,j}H_{k}^{\mathrm T}+R_{k}\right)^{-1},$$

where $H_k$ is the observation matrix, $R_k$ is the observation noise covariance matrix, $p_{D,k}$ is the target detection probability, and $\kappa_k(\mathbf{z})$ is the clutter probability density.
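In the same spirit, a compact Python sketch of the update of step S123 under the same (weight, mean, covariance) representation; SciPy's multivariate normal density stands in for $q_{k}^{i,j}(\mathbf{z})$, and a constant clutter density is assumed.

    import numpy as np
    from scipy.stats import multivariate_normal

    def gm_phd_update(predicted, measurements, H, R, p_D, clutter_density):
        """GM-PHD update: missed-detection terms plus one weighted component per (z, j) pair."""
        updated = [((1.0 - p_D) * w, m, P) for w, m, P in predicted]       # missed detections
        for z in measurements:
            detections = []
            for w, m, P in predicted:
                S = H @ P @ H.T + R                                # innovation covariance
                K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain K_k
                q = multivariate_normal.pdf(z, mean=H @ m, cov=S)  # likelihood q_k(z)
                m_upd = m + K @ (z - H @ m)
                P_upd = (np.eye(len(m)) - K @ H) @ P
                detections.append((p_D * w * q, m_upd, P_upd))
            norm = clutter_density + sum(w for w, _, _ in detections)  # kappa_k(z) + sum of detection weights
            updated.extend((w / norm, m, P) for w, m, P in detections)
        return updated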
S13. After one iteration, the output target state parameters are $\left\{\left(w_{k}^{i,j},\,\mathbf{m}_{k}^{i,j},\,P_{k}^{i,j}\right)\right\}_{j=1}^{J_{k}^{i}}$.
S2, positioning the target according to the ML algorithm and the frequency correlation, and specifically comprising the following steps:
s21, defining that the target and the observation station are positioned on an XY plane, and knowing the position coordinate (x) of the observation station i ,y i ) I is more than or equal to 1 and less than or equal to M, and all observation azimuth angles containing measurement errors of all observation stations are
Figure BDA0003548129760000039
S22, dividing the target plane into grids of a Q multiplied by R range, wherein each grid point represents one position coordinate (p) in the target plane q ,p r ) Where Q =1,2,., Q, R =1,2,.., R, traverses each grid point in the grid plane, calculates a point (p) q ,p r ) Azimuth angle with respect to each observatory:
Figure BDA00035481297600000310
S23. For each search point $(p_q,\,p_r)$, compute the error $e_{k}^{i,(q,r)}$ between its azimuth $\alpha_{k}^{i,(q,r)}$ relative to observation station i and the azimuth angles $\theta_{k}^{i,j}$ observed by that station:

$$e_{k}^{i,(q,r)}=\min_{j}\left|\alpha_{k}^{i,(q,r)}-\theta_{k}^{i,j}\right|,\qquad j=1,2,\dots,N_{k}^{i},$$

and assign a weight to the azimuth of the search point relative to each observation station:

$$c_{k}^{i,(q,r)}=p_{D,k}\,w_{k}^{i,j_{\min}},$$

where $j_{\min}$ is the value of j that minimizes $\left|\alpha_{k}^{i,(q,r)}-\theta_{k}^{i,j}\right|$ and $p_{D,k}$ is the target detection probability.
S24, calculating a cost matrix T (q, r) consisting of total errors obtained by each search:
Figure BDA0003548129760000043
wherein: q =1,2,. Wherein Q, R =1,2,. Wherein R;
calculating a weight matrix C consisting of the total weights obtained by each search, wherein the matrix elements are as follows:
Figure BDA0003548129760000044
wherein c is k i,(q,r) Searching for a grid point (p) for time k q ,p r ) Weight relative to ith observation station
S25, traversing each grid point in the grid plane to obtain T (Q, R), wherein Q =1,2,.., Q, R =1,2,.., R, and then obtaining a pseudo spectrum of the position of the target, and obtaining a plurality of target estimated positions through the peak value of the pseudo spectrum of the target position by combining the weight matrix T
Figure BDA0003548129760000047
Figure BDA0003548129760000048
Figure BDA0003548129760000049
Representing the total number of targets estimated at the k-th moment;
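As a rough illustration of the grid search of steps S22 to S25, the following Python sketch builds the error matrix T and the weight matrix C from the smoothed bearings of each station. How the pseudo-spectrum combines T and C and how its peaks are extracted are assumptions for this example, since the text above only states that peaks are taken in combination with the weight matrix.

    import numpy as np

    def grid_localize(stations, bearings, weights, p_D, grid_x, grid_y):
        """stations: list of (x_i, y_i); bearings[i], weights[i]: smoothed DOAs and confidences of station i."""
        Q, R = len(grid_x), len(grid_y)
        T = np.zeros((Q, R))    # total angular error per grid point
        C = np.zeros((Q, R))    # total fused confidence per grid point
        for (xi, yi), th_i, w_i in zip(stations, bearings, weights):
            for q, px in enumerate(grid_x):
                for r, py in enumerate(grid_y):
                    alpha = np.arctan2(py - yi, px - xi)          # azimuth of the grid point
                    j_min = int(np.argmin(np.abs(alpha - th_i)))  # closest observed bearing (wrap-around ignored)
                    T[q, r] += abs(alpha - th_i[j_min])           # e_k^{i,(q,r)}
                    C[q, r] += p_D * w_i[j_min]                   # c_k^{i,(q,r)}
        spectrum = C / (T + 1e-6)                                 # assumed pseudo-spectrum: high confidence, low error
        peaks = np.argwhere(spectrum > 0.9 * spectrum.max())      # crude peak extraction
        return [(grid_x[q], grid_y[r]) for q, r in peaks]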
S3. Based on the estimated target coordinates, eliminate false points in the multi-target positioning with a frequency-correlation algorithm, thereby screening out the real target coordinates, specifically as follows:
S31. For each estimated target coordinate $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$, obtain the corresponding frequency from its azimuth relative to each observation station:

$$f_{k}^{i,j'}=f_{k}^{i,j_{\min}},\qquad j_{\min}=\arg\min_{j}\left|\arctan\frac{\hat{y}_{k}^{j'}-y_{i}}{\hat{x}_{k}^{j'}-x_{i}}-\theta_{k}^{i,j}\right|.$$

S32. If the estimated coordinate $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$ satisfies the condition that the difference between any two of the frequencies $f_{k}^{i,j'}$ is less than a set value, then $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$ is taken as a real target coordinate; otherwise it is a false coordinate.
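A small Python sketch of the frequency-correlation screening of step S3 follows; the tolerance tol_hz is an assumed value standing in for the "set value" of S32.

    import numpy as np

    def is_real_target(candidate, stations, bearings, freqs, tol_hz=1.0e3):
        """Keep the candidate only if the frequencies associated with it agree across all stations."""
        cx, cy = candidate
        picked = []
        for (xi, yi), th_i, f_i in zip(stations, bearings, freqs):
            alpha = np.arctan2(cy - yi, cx - xi)          # azimuth of the candidate from station i
            j_min = int(np.argmin(np.abs(alpha - th_i)))  # nearest observed bearing
            picked.append(f_i[j_min])                     # f_k^{i,j'}
        # real target: any two associated frequencies differ by less than the set value
        return float(np.ptp(picked)) < tol_hz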
The method has the advantage of solving the joint tracking and positioning of multiple radiation sources when the number of radiation sources is unknown, with strong robustness and good performance.
Drawings
FIG. 1 shows the sensor positions and the real trajectories of the targets.
FIG. 2 is a confidence map of the grid.
FIG. 3 shows the target position estimates.
FIG. 4 shows the estimated number of targets.
FIG. 5 shows the target OSPA error.
Detailed Description
The present invention will be described in detail with reference to examples below:
examples
In this embodiment, MATLAB is used to verify the above multi-sensor joint tracking and positioning algorithm; for simplicity, the following assumptions are made for the algorithm model:
the effectiveness of the invention is illustrated below with reference to the figures and simulation examples.
Simulation conditions and parameters
Simulation environment: for ease of illustration, consider a representative two-dimensional scene in which the monitored area is [-1000, 1000] x [-1000, 1000] (m) and three observation stations located at [-3000, -7000], [5000, -7000] and [9000, -7000] sense an unknown, time-varying number of targets. The target state vector $\mathbf{x}_k$ comprises the position $(p_{x,k},\,p_{y,k})$, the velocity $(\dot{p}_{x,k},\,\dot{p}_{y,k})$ and the frequency $f_k$ of the target, and the measurements are the target's frequency and DOA angle. The state equation and measurement equation of a single target in the two-dimensional plane are, respectively, $\mathbf{x}_{k+1}=F_{k}\mathbf{x}_{k}+G\mathbf{w}_{k}$ and $\mathbf{y}_{k+1}=h(\mathbf{x}_{k+1})+\mathbf{v}_{k+1}$, where $\mathbf{w}_{k}$ and $\mathbf{v}_{k}$ are the process noise and the measurement noise: zero-mean Gaussian noise vectors with covariances $Q_{k}$ and $R_{k}$.
The survival probability of each target is $p_{S,k}=0.99$. The state transition matrix and process noise matrix of the linear-Gaussian motion equation correspond to a constant-velocity model with sampling period Δ = 1 s. The detection probability of each target is $p_{D,k}=0.98$, and $R_{k}=\left[(\pi/180)\,\mathrm{rad}\right]^{2}$ is the measurement noise variance. The measurement function is

$$h(\mathbf{x}_{k})=\left[\arctan\frac{p_{y,k}-y_{i}}{p_{x,k}-x_{i}},\ \ f_{k}\right]^{\mathrm T}.$$
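To make the simulated dynamics concrete, here is a hedged Python sketch of one step of the state and measurement equations above, with Δ = 1 s. It assumes a standard constant-velocity layout [p_x, v_x, p_y, v_y, f] for the state vector; the exact matrices of the original embodiment are not reproduced here.

    import numpy as np

    def simulate_step(x, station_xy, dt=1.0, sigma_v=1.0, sigma_th=np.pi / 180, rng=np.random.default_rng()):
        """One step of x_{k+1} = F x_k + G w_k and y_{k+1} = h(x_{k+1}) + v_{k+1}."""
        F = np.array([[1, dt, 0, 0, 0],
                      [0, 1,  0, 0,  0],
                      [0, 0,  1, dt, 0],
                      [0, 0,  0, 1,  0],
                      [0, 0,  0, 0,  1]], dtype=float)
        G = np.array([[dt ** 2 / 2, 0], [dt, 0], [0, dt ** 2 / 2], [0, dt], [0, 0]])
        x_next = F @ x + G @ rng.normal(0.0, sigma_v, size=2)                 # process noise w_k on both axes
        doa = np.arctan2(x_next[2] - station_xy[1], x_next[0] - station_xy[0])
        y_next = np.array([doa + rng.normal(0.0, sigma_th), x_next[4]])       # measured DOA (with angle noise) and frequency
        return x_next, y_next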
the 4 targets are newly generated at 1s,10s and 20s, respectively. The target appears from four fixed points. Poisson RFS gamma of target neogenesis model k The strength of (a) is as follows:
Figure BDA0003548129760000067
wherein the content of the first and second substances,
Figure BDA0003548129760000063
Figure BDA0003548129760000064
Figure BDA0003548129760000065
Figure BDA0003548129760000066
P γ =diag[1,2,1] 2
for simulating
Figure BDA0003548129760000071
And &>
Figure BDA0003548129760000072
The natural growth of the vicinity.
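For illustration, the four-component birth intensity gamma_k could be assembled as below, written in the three-component filter state of step S1 so that P_gamma is 3 x 3 as stated above. The birth means and weights listed here are hypothetical placeholders, not the values of the original embodiment; the returned components plug directly into the birth term of the prediction sketch given after step S122.

    import numpy as np

    # Hypothetical birth means (angle of arrival in rad, its rate term, frequency in Hz); placeholders only.
    BIRTH_MEANS = [np.array([0.5, 0.0, 1.0e6]),
                   np.array([1.0, 0.0, 1.2e6]),
                   np.array([-0.8, 0.0, 0.9e6]),
                   np.array([1.5, 0.0, 1.1e6])]
    P_GAMMA = np.diag([1.0, 2.0, 1.0]) ** 2    # P_gamma = diag([1, 2, 1])^2
    W_GAMMA = 0.03                             # assumed birth weight per component

    def birth_components():
        """gamma_k(x) as a list of (weight, mean, covariance) Gaussian components."""
        return [(W_GAMMA, m.copy(), P_GAMMA.copy()) for m in BIRTH_MEANS]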
Simulation content and result analysis
As can be seen from the grid confidence map of FIG. 2, during intersection-based positioning the fused confidence locks the estimate onto the correct radiation source, consistent with the target position map of FIG. 3. By accumulating results over time, the algorithm obtains an estimated trajectory for each radiation source. FIGS. 4 and 5 show that both the detection performance and the OSPA performance meet the requirements. The proposed algorithm therefore has strong robustness and adaptability to complex environments.

Claims (1)

1. A tracking and positioning method based on a multi-sensor combination, in which the total observation sampling time is K, the total number of observation stations is M and the actual number of targets is $N_k$; the total number of targets detected by the i-th observation station is $N_k^{i}$; at time k ($1 \le k \le K$) the state vector $\mathbf{x}_k^{i,j}$ of target j extracted from the signal received by observation station i comprises the angle of arrival $\theta_k^{i,j}$ and the angle-of-arrival acceleration of target j relative to observation station i, together with the signal frequency $f_k^{i,j}$ of target j measured at observation station i, with corresponding confidence $w_k^{i,j}$; the coordinate of target j at time k is $(x_k^{j},\,y_k^{j})$ and the coordinate of observation station i is $(x_i,\,y_i)$; defining

$$\theta_{k}^{i,j}=\arctan\frac{y_{k}^{j}-y_{i}}{x_{k}^{j}-x_{i}}+n_{i},$$

where $n_i$ is the angle-measurement error of the i-th observation station, obeying a zero-mean distribution with variance $\sigma_{i}^{2}$, $1 \le i \le M$; the method is characterized by comprising the following steps:
S1. Smooth the measured data with a PHD filtering algorithm, specifically as follows:
S11. Assume the targets and the observation stations lie in the XY plane; at time k, observation station i acquires the observation $\mathbf{z}_{k}^{i,j}$ of target j.
S12. Smooth the measured data with a Gaussian-mixture probability hypothesis density filter, as follows:
S121. Let the total number of targets of observation station i at time k-1 be $J_{k-1}^{i}$. The posterior intensity $v_{k-1}^{i}(\mathbf{x})$ of the targets of observation station i takes the form of a Gaussian mixture:

$$v_{k-1}^{i}(\mathbf{x})=\sum_{j=1}^{J_{k-1}^{i}} w_{k-1}^{i,j}\,\mathcal{N}\!\left(\mathbf{x};\,\mathbf{m}_{k-1}^{i,j},\,P_{k-1}^{i,j}\right),$$

where $\mathcal{N}(\cdot\,;\mathbf{m},P)$ denotes a Gaussian density with mean $\mathbf{m}$ and covariance $P$, and $w_{k-1}^{i,j}$, $\mathbf{m}_{k-1}^{i,j}$ and $P_{k-1}^{i,j}$ are the confidence (weight), state vector and covariance matrix of the corresponding target j;
S122. The predicted multi-target intensity of observation station i at time k takes a Gaussian-mixture form:

$$v_{k|k-1}^{i}(\mathbf{x})=v_{S,k|k-1}^{i}(\mathbf{x})+\gamma_{k}^{i}(\mathbf{x}).$$

The expression is divided into two parts: the predicted intensity of the surviving targets,

$$v_{S,k|k-1}^{i}(\mathbf{x})=p_{S,k}\sum_{j=1}^{J_{k-1}^{i}} w_{k-1}^{i,j}\,\mathcal{N}\!\left(\mathbf{x};\,\mathbf{m}_{S,k|k-1}^{i,j},\,P_{S,k|k-1}^{i,j}\right),$$
$$\mathbf{m}_{S,k|k-1}^{i,j}=F_{k|k-1}\,\mathbf{m}_{k-1}^{i,j},$$
$$P_{S,k|k-1}^{i,j}=F_{k|k-1}\,P_{k-1}^{i,j}\,F_{k|k-1}^{\mathrm T}+Q_{k-1},$$

and the intensity of newborn targets $\gamma_{k}^{i}(\mathbf{x})$, where $p_{S,k}$ is the target survival probability, $F_{k|k-1}$ is the state transition matrix and $Q_{k-1}$ is the process noise covariance matrix;
S123. From the predicted PHD, combined with the measurement set $Z_{k}^{i}$ obtained at the current time, the updated posterior intensity of observation station i at time k is again a Gaussian mixture:

$$v_{k}^{i}(\mathbf{x})=(1-p_{D,k})\,v_{k|k-1}^{i}(\mathbf{x})+\sum_{\mathbf{z}\in Z_{k}^{i}}\sum_{j=1}^{J_{k|k-1}^{i}} w_{k}^{i,j}(\mathbf{z})\,\mathcal{N}\!\left(\mathbf{x};\,\mathbf{m}_{k|k}^{i,j}(\mathbf{z}),\,P_{k|k}^{i,j}\right),$$
$$w_{k}^{i,j}(\mathbf{z})=\frac{p_{D,k}\,w_{k|k-1}^{i,j}\,q_{k}^{i,j}(\mathbf{z})}{\kappa_{k}(\mathbf{z})+p_{D,k}\sum_{l=1}^{J_{k|k-1}^{i}} w_{k|k-1}^{i,l}\,q_{k}^{i,l}(\mathbf{z})},$$
$$q_{k}^{i,j}(\mathbf{z})=\mathcal{N}\!\left(\mathbf{z};\,H_{k}\mathbf{m}_{k|k-1}^{i,j},\,H_{k}P_{k|k-1}^{i,j}H_{k}^{\mathrm T}+R_{k}\right),$$
$$\mathbf{m}_{k|k}^{i,j}(\mathbf{z})=\mathbf{m}_{k|k-1}^{i,j}+K_{k}^{i,j}\left(\mathbf{z}-H_{k}\mathbf{m}_{k|k-1}^{i,j}\right),$$
$$P_{k|k}^{i,j}=\left(I-K_{k}^{i,j}H_{k}\right)P_{k|k-1}^{i,j},$$
$$K_{k}^{i,j}=P_{k|k-1}^{i,j}H_{k}^{\mathrm T}\left(H_{k}P_{k|k-1}^{i,j}H_{k}^{\mathrm T}+R_{k}\right)^{-1},$$

where $H_k$ is the observation matrix, $R_k$ is the observation noise covariance matrix, $p_{D,k}$ is the target detection probability, and $\kappa_k(\mathbf{z})$ is the clutter probability density;
S13. After one iteration, output the target state parameters $\left\{\left(w_{k}^{i,j},\,\mathbf{m}_{k}^{i,j},\,P_{k}^{i,j}\right)\right\}_{j=1}^{J_{k}^{i}}$;
S2, positioning the target according to the ML algorithm and the frequency correlation, and specifically comprising the following steps:
s21, defining that the target and the observation station are positioned on an XY plane, and knowing the position coordinate (x) of the observation station i ,y i ) I is more than or equal to 1 and less than or equal to M, and all observation azimuth angles containing measurement errors of all observation stations are
Figure FDA00035481297500000213
S22, dividing the target plane into grids of a Q multiplied by R range, wherein each grid point represents one position coordinate (p) in the target plane q ,p r ) Where Q =1,2,., Q, R =1,2,.., R, traverses each grid point in the grid plane, calculates a point (p) q ,p r ) Azimuth angle with respect to each observation station:
Figure FDA0003548129750000031
S23. For each search point $(p_q,\,p_r)$, compute the error $e_{k}^{i,(q,r)}$ between its azimuth $\alpha_{k}^{i,(q,r)}$ relative to observation station i and the azimuth angles $\theta_{k}^{i,j}$ observed by that station:

$$e_{k}^{i,(q,r)}=\min_{j}\left|\alpha_{k}^{i,(q,r)}-\theta_{k}^{i,j}\right|,\qquad j=1,2,\dots,N_{k}^{i},$$

and assign a weight to the azimuth of the search point relative to each observation station:

$$c_{k}^{i,(q,r)}=p_{D,k}\,w_{k}^{i,j_{\min}},$$

where $j_{\min}$ is the value of j that minimizes $\left|\alpha_{k}^{i,(q,r)}-\theta_{k}^{i,j}\right|$ and $p_{D,k}$ is the target detection probability;
S24. Compute the cost matrix T(q, r) formed by the total error obtained in each search:

$$T(q,r)=\sum_{i=1}^{M} e_{k}^{i,(q,r)},\qquad q=1,2,\dots,Q,\ \ r=1,2,\dots,R,$$

and compute the weight matrix C formed by the total weight obtained in each search, whose elements are

$$C(q,r)=\sum_{i=1}^{M} c_{k}^{i,(q,r)},$$

where $c_{k}^{i,(q,r)}$ is the weight of grid point $(p_q,\,p_r)$ relative to the i-th observation station at time k;
S25. Traverse every grid point in the grid plane to obtain T(q, r), q = 1,2,...,Q, r = 1,2,...,R, which constitutes a pseudo-spectrum of the target positions; combining it with the weight matrix C, the peaks of the pseudo-spectrum yield a number of estimated target positions $\left(\hat{x}_{k}^{j},\,\hat{y}_{k}^{j}\right)$, $j=1,2,\dots,\hat{N}_{k}$, where $\hat{N}_{k}$ is the total number of targets estimated at the k-th moment;
S3. Based on the estimated target coordinates, eliminate false points in the multi-target positioning with a frequency-correlation algorithm, thereby screening out the real target coordinates, specifically as follows:
S31. For each estimated target coordinate $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$, obtain the corresponding frequency from its azimuth relative to each observation station:

$$f_{k}^{i,j'}=f_{k}^{i,j_{\min}},\qquad j_{\min}=\arg\min_{j}\left|\arctan\frac{\hat{y}_{k}^{j'}-y_{i}}{\hat{x}_{k}^{j'}-x_{i}}-\theta_{k}^{i,j}\right|;$$

S32. If the estimated coordinate $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$ satisfies the condition that the difference between any two of the frequencies $f_{k}^{i,j'}$ is less than a set value, then $\left(\hat{x}_{k}^{j'},\,\hat{y}_{k}^{j'}\right)$ is taken as a real target coordinate; otherwise it is a false coordinate.
CN202210254096.9A 2022-03-15 2022-03-15 Tracking and positioning method based on multi-sensor combination Active CN114624688B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210254096.9A CN114624688B (en) 2022-03-15 2022-03-15 Tracking and positioning method based on multi-sensor combination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210254096.9A CN114624688B (en) 2022-03-15 2022-03-15 Tracking and positioning method based on multi-sensor combination

Publications (2)

Publication Number Publication Date
CN114624688A CN114624688A (en) 2022-06-14
CN114624688B true CN114624688B (en) 2023-04-07

Family

ID=81902485

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210254096.9A Active CN114624688B (en) 2022-03-15 2022-03-15 Tracking and positioning method based on multi-sensor combination

Country Status (1)

Country Link
CN (1) CN114624688B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249228A (en) * 2016-06-30 2016-12-21 杭州电子科技大学 A kind of cycle vibration source based on fundamental frequency energy-distributing feature distance intelligent detecting method
CN111457918A (en) * 2020-05-06 2020-07-28 辽宁工程技术大学 Continuous miner navigation and positioning system based on multi-sensor information fusion
CN111983636A (en) * 2020-08-12 2020-11-24 深圳华芯信息技术股份有限公司 Pose fusion method, pose fusion system, terminal, medium and mobile robot
CN112016612A (en) * 2020-08-26 2020-12-01 四川阿泰因机器人智能装备有限公司 Monocular depth estimation-based multi-sensor fusion SLAM method
CN112083403A (en) * 2020-07-21 2020-12-15 青岛小鸟看看科技有限公司 Positioning tracking error correction method and system for virtual scene
WO2021007293A1 (en) * 2019-07-08 2021-01-14 Strong Force Vcn Portfolio 2019, Llc Systems and methods for detecting occupancy using radio signals
CN112556689A (en) * 2020-10-30 2021-03-26 郑州联睿电子科技有限公司 Positioning method integrating accelerometer and ultra-wideband ranging
CN113822335A (en) * 2021-08-20 2021-12-21 杭州电子科技大学 GPB 1-GM-PHD-based sequential fusion target tracking method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2505715A1 (en) * 2004-05-03 2005-11-03 Her Majesty In Right Of Canada As Represented By The Minister Of National Defence Volumetric sensor for mobile robotics
US11150322B2 (en) * 2018-09-20 2021-10-19 International Business Machines Corporation Dynamic, cognitive hybrid method and system for indoor sensing and positioning

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249228A (en) * 2016-06-30 2016-12-21 杭州电子科技大学 A kind of cycle vibration source based on fundamental frequency energy-distributing feature distance intelligent detecting method
WO2021007293A1 (en) * 2019-07-08 2021-01-14 Strong Force Vcn Portfolio 2019, Llc Systems and methods for detecting occupancy using radio signals
CN111457918A (en) * 2020-05-06 2020-07-28 辽宁工程技术大学 Continuous miner navigation and positioning system based on multi-sensor information fusion
CN112083403A (en) * 2020-07-21 2020-12-15 青岛小鸟看看科技有限公司 Positioning tracking error correction method and system for virtual scene
CN111983636A (en) * 2020-08-12 2020-11-24 深圳华芯信息技术股份有限公司 Pose fusion method, pose fusion system, terminal, medium and mobile robot
CN112016612A (en) * 2020-08-26 2020-12-01 四川阿泰因机器人智能装备有限公司 Monocular depth estimation-based multi-sensor fusion SLAM method
CN112556689A (en) * 2020-10-30 2021-03-26 郑州联睿电子科技有限公司 Positioning method integrating accelerometer and ultra-wideband ranging
CN113822335A (en) * 2021-08-20 2021-12-21 杭州电子科技大学 GPB 1-GM-PHD-based sequential fusion target tracking method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Li Yunsheng. Auto-recognition Pedestrians Research Based on HOG Feature and SVM Classifier for Vehicle Images. 2020 IEEE International Conference on Real-time Computing and Robotics (RCAR), 2020. *
Sun Jianqiang. Research on Multi-source Fusion Seamless Indoor and Outdoor Positioning Technology. China Master's Theses Full-text Database, Information Science and Technology, 2022, full text. *
Hu Fuguo. Design and Implementation of Dynamic Gating Scheduling on a Satellite Moving Platform. Space Electronic Technology, Vol. 19, 2022, full text. *
Tan Weiqian. Research on Particle Filter Algorithms for Multi-station Bearings-only Passive Tracking. China Master's Theses Full-text Database, Information Science and Technology, 2010, full text. *

Also Published As

Publication number Publication date
CN114624688A (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN107861123B (en) Method for real-time tracking of multiple moving targets by through-wall radar in complex environment
CN110823217A (en) Integrated navigation fault-tolerant method based on self-adaptive federal strong tracking filtering
CN110749891B (en) Self-adaptive underwater single beacon positioning method capable of estimating unknown effective sound velocity
CN110794409B (en) Underwater single beacon positioning method capable of estimating unknown effective sound velocity
CN107436427B (en) Spatial target motion track and radiation signal correlation method
CN112613532B (en) Moving target tracking method based on radar and cyclic neural network complement infrared fusion
CN106932771A (en) A kind of radar simulation targetpath tracking and system
CN109886305A (en) A kind of non-sequential measurement asynchronous fusion method of multisensor based on GM-PHD filtering
CN107346020B (en) Distributed batch estimation fusion method for asynchronous multi-base radar system
CN110516193B (en) Maneuvering target tracking method based on transformation Rayleigh filter under Cartesian coordinate system
CN111829505A (en) Multi-sensor track quality extrapolation track fusion method
Aernouts et al. Combining TDoA and AoA with a particle filter in an outdoor LoRaWAN network
CN108717174A (en) The quick covariance of prediction based on information theory interacts combining passive co-located method
CN110646783A (en) Underwater beacon positioning method of underwater vehicle
CN111757258B (en) Self-adaptive positioning fingerprint database construction method under complex indoor signal environment
Sun et al. Vessel velocity estimation and tracking from Doppler echoes of T/RR composite compact HFSWR
CN114624688B (en) Tracking and positioning method based on multi-sensor combination
CN112333634A (en) Hybrid node positioning method based on UAV
CN111624549B (en) Passive filtering tracking method under non-common-view condition
CN116866752A (en) Design realization method of two-layer information fusion model of distributed microphone array
CN110191422A (en) Ocean underwater sensor network target tracking method
CN113534164A (en) Target path tracking method based on active and passive combined sonar array
CN112946695A (en) Satellite positioning suppression interference identification method based on singular value decomposition
Woischneck et al. Localization and velocity estimation based on multiple bistatic measurements
CN114548159B (en) Ultra-wideband accurate positioning method under signal interference

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant