CN110596643A - Multi-sound-array moving target detection and positioning method - Google Patents

Multi-sound-array moving target detection and positioning method Download PDF

Info

Publication number
CN110596643A
CN110596643A (Application No. CN201910740948.3A)
Authority
CN
China
Prior art keywords
target
state
sensor
sensors
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201910740948.3A
Other languages
Chinese (zh)
Inventor
刘伟峰
田正旺
茹心锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority to CN201910740948.3A priority Critical patent/CN110596643A/en
Publication of CN110596643A publication Critical patent/CN110596643A/en
Withdrawn legal-status Critical Current


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/20Position of source determined by a plurality of spaced direction-finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/22Position of source determined by co-ordinating a plurality of position lines defined by path-difference measurements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10Complex mathematical operations
    • G06F17/18Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis

Abstract

The invention relates to a multi-sound-array moving target detection and positioning method. In traditional methods the measurement data are mixed with uncertainties such as missed detections and false alarms, and the resulting data-association problem is unavoidable. The invention studies a target tracking and positioning problem based on labeled random finite sets, with sound-array sensors as the main measurement means, and resolves the uncertain relation between signal sources and targets. First, the time-difference measurement of each group of sensors receiving a signal is calculated from a generalized cross-correlation function, and the angle-difference measurement of each group of sensors is calculated from the signal arrival directions; the array measurement information is then fused, the targets are detected by the Gibbs-generalized labeled multi-Bernoulli algorithm combined with a Gaussian mixture model, and the states, tracks and number of the targets are estimated. The method uses real sound signals to compute the array measurements, and with the Gibbs-generalized labeled multi-Bernoulli algorithm it runs faster and yields more reliable tracking results.

Description

Multi-sound-array moving target detection and positioning method
Technical Field
The invention belongs to the technical field of multi-sensor multi-target tracking, and particularly relates to a multi-sound-array moving target detection and positioning method based on the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm.
Background
In the field of target tracking, although a sound-source signal carries far less information than optical observation, its greatest advantage is that sound-array sensors can still measure under obstructed conditions, that is, in complex target environments such as smoke, dust, darkness, transparent objects, reflected light and nuclear radiation. In underwater target detection in particular, sound-array sensors are the primary means of locating targets, which is the main reason this class of sensors is so widely used.
Such sensors accurately record changes in an observed target and are suited to dark or light-polluted environments, which gives them considerable research value. Compared with ordinary image or video observation data, the greatest disadvantage of a sound signal is that pixel-level information is missing from the observations, and the target observations are easily disturbed by echoes from surrounding buildings. Compared with a visual sensor, an acoustic sensor is therefore characterized by a small data volume, loss of target information and complex environmental interference, which increases the difficulty of tracking the target. Prior information must be added to compensate, including parameter information, model information and initial conditions, so that the state (motion) estimation problem of the target can be solved effectively.
During observation by multiple pairs of sensors, traditional methods inevitably have to handle data-association problems: targets are born and die, targets are missed or falsely detected by the sensors, the observation equipment introduces errors, and prior observation information is lacking. As a result, false measurements are mixed into the measurement data, the source of each measurement cannot be determined, and the correspondence between observed targets and measurements is broken. The invention studies a target tracking and positioning problem based on labeled random finite sets (RFS), with sound-array sensors as the main measurement means, and solves the problem of associating measurements with their sources, so that the targets generating the observed information are determined through data association.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a multi-sound-array moving target detection and positioning method based on the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm.
The method specifically comprises the following steps:
(1) building a model;
(1-1) modeling background:
for multiple targets, the target states and the sensor measurements at time $k$ can be represented as random finite sets: $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}\in\mathcal{E}_s$ and $Z_k=\{z_{k,1},\ldots,z_{k,M(k)}\}\in\mathcal{E}_o$, where $\mathcal{E}_s$ is the state space of the targets, $\mathcal{E}_o$ is the measurement space observed by the sensors, $N(k)$ is the number of surviving targets at time $k$, $M(k)$ is the number of measurements at time $k$, and $q$ is the index of a sensor group; $X_k$ and $Z_k$ are the finite sets of target state vectors and measurement vectors at time $k$;
the single target state is represented as $\mathbf{x}=(x,l)$, where $x$ is the target state and $l$ is the target label; a change of the target state does not affect the label.
(1-2) modeling a system, which comprises establishing a state equation and an observation equation;
the state equation is $X_k=AX_{k-1}+B\omega_k$, where $A$ is the target state-transition matrix, $B$ is the noise matrix, and $\omega_k$ is the process noise, which obeys a standard Gaussian distribution;
where $T=1$ is the sampling period;
$X_k=\{x_{k,1},\ldots,x_{k,i},\ldots,x_{k,N(k)}\}$ is the state set of the multiple targets, $i\in[1,N(k)]$, $N(k)$ is the number of surviving targets at time $k$, and $x_{k,i}=(x_{k,i},\dot{x}_{k,i},y_{k,i},\dot{y}_{k,i})^T$ is the state vector of the $i$-th target at time $k$, where $x_{k,i}$ is the x-direction coordinate, $\dot{x}_{k,i}$ is the x-direction velocity, $y_{k,i}$ is the y-direction coordinate, $\dot{y}_{k,i}$ is the y-direction velocity, and $T$ denotes transposition;
the observation equation is $z_k^q=(\tau_q,\delta_q)^T+\varepsilon_k^q$, $q=1,2,\ldots,Q$, where $\tau_q$ is the time difference observed by a pair of sensors, $\delta_q$ is the angle difference of the signals received by the pair of sensors, $q$ is the index of the sensor group, and there are $Q$ groups of sensors in total; $\varepsilon_k^q$ is the measurement noise, which obeys a standard Gaussian distribution with mean 0.
(1-3) model environment:
within the detection range there are $Q$ groups of sensor arrays, denoted $S_{1:Q}=\{s_1,\ldots,s_q,\ldots,s_Q\}$, $q\in[1,Q]$, where $s_q=\{s_{q,1},\ldots,s_{q,j},\ldots,s_{q,N}\}$ denotes all $N$ sensors of the $q$-th group, $j\in[1,N]$; $s_{q,j}=(x_{q,j},y_{q,j})$ is the position coordinate of each sensor; the target position state is denoted $X_i=(x,y)$;
the time difference for each pair of sensors is expressed as $\tau_q=\big(\|X_i-s_{q,1}\|-\|X_i-s_{q,2}\|\big)/v$, where $X_i$ is the target coordinate position and $s_{q,j}$ is the coordinate position of each sensor of the $q$-th group, both in a Cartesian rectangular coordinate system; $\|\cdot\|$ denotes the 2-norm and $v$ is the speed of sound;
the angle difference for each pair of sensors is expressed as $\delta_q=|\arctan(X_i-s_{q,1})-\arctan(X_i-s_{q,2})|$, where $\delta_q$ represents the difference between the azimuth angles from the target to the two sensors.
(1-4) calculating the sound signal time difference:
the signals observed by a pair of sensors are represented by the mathematical model
$z_1(t)=\alpha_1 s(t)+n_1(t)$, $z_2(t)=\alpha_2 s(t-\tau_{1:2})+n_2(t)$, where $z_1(t)$ and $z_2(t)$ are the signals received by the two sensors, $s(t)$ is the true signal, $n_1(t)$ and $n_2(t)$ are the ambient noise, $\tau_{1:2}$ is the time difference between the signals detected by the two sensors, and $\alpha_1$ and $\alpha_2$ are the amplitudes of the received signals;
the time difference is estimated through a cross-correlation calculation: $\tau_{1:2}=\arg\max R_{1:2}(\tau_{1:2})$, where $R_{1:2}(\tau_{1:2})$ is the cross-correlation of the two received signals; the $\tau_{1:2}$ at which $R_{1:2}(\tau_{1:2})$ attains its maximum is the time difference with which the same signal is received by the different sensors;
a Fourier transform is applied to the received signals to convert the time domain into the frequency domain and simplify the problem; the cross-correlation function $R_{gcc}(\tau_q)$ of the two signals is expressed as $R_{gcc}(\tau_q)=\int \psi_{1,2}(\omega)Z_1(\omega)Z_2^{*}(\omega)e^{j\omega\tau_q}\,d\omega$;
$R_{gcc}(\tau_q)$ is defined as the generalized cross-correlation function, where $Z_1(\omega)$ and $Z_2(\omega)$ are the Fourier transforms of $z_1(t)$ and $z_2(t)$, $(\cdot)^{*}$ denotes the complex conjugate, and $\psi_{1,2}(\omega)$ is the phase-transform (PHAT) weighting function of the generalized cross-correlation.
(2) Gibbs-generalized label multi-bernoulli filtering;
the generalized labeled multi-Bernoulli random finite set is a labeled random finite set on the state space $\mathbb{X}$ with label space $\mathbb{L}$, distributed as $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{\xi\in\Xi}\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\xi$ is a set of discrete indices, and $\omega^{(\xi)}(L)$ and $p^{(\xi)}$ satisfy $\sum_{L\subseteq\mathbb{L}}\sum_{\xi\in\Xi}\omega^{(\xi)}(L)=1$ and $\int p^{(\xi)}(x,l)\,dx=1$; the weight $\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))$ depends only on the label set of the multi-target state, while the multi-target exponential $\big[p^{(\xi)}\big]^{\mathbf{X}}$ depends on the whole multi-target state;
given a multi-target state $\mathbf{X}$, where each target satisfies $(x,l)\in\mathbf{X}$, the detection probability of $\mathbf{X}$ is $p_{D,m}(x,l)$, and the measurement $z$ produced by each state is represented by the likelihood function $g(z|x,l)$; the mapping relation between the multiple sensors and the multiple targets is defined as a function $\theta$ such that $\theta(i)=\theta(i')>0$ implies $i=i'$; $\theta(i,l)$ denotes the $i$-th associated pair in $\theta(I)$, the set $\Theta$ denotes the overall association-map set of the multiple sensors, and its subset with domain $I$ is denoted $\Theta(I)$; assuming that target detections and clutter are generated independently, the multi-sensor multi-target likelihood function is:
where the clutter is modeled by a Poisson-distributed clutter function and $p_{D,m}(x,l)$ is the detection probability of the sensor numbered $m$;
the δ-GLMB filter satisfies $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{(I,\xi)\in\mathcal{F}(\mathbb{L})\times\Xi}\omega^{(I,\xi)}\delta_{I}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\Xi$ is a set of discrete spaces;
the δ-GLMB filter is a multi-target Bayes filter based on the generalized labeled multi-Bernoulli distribution, and the forward-propagation expression of the δ-GLMB is as follows:
given the δ-GLMB forward propagation at time $k$, its expression at the next time $k+1$, obtained through the joint update and prediction step, is defined as:
where $\xi\in\Xi$, $\theta_+\in\Theta_+$, and:
where $\mathbb{B}_+$ is the label space of newborn targets, $\mathbb{L}_+$ is the label space of the targets, $I_+$ is the label set of the target tracks at the next time instant, $r_{B,+}(l_+)$ is the birth probability of the label $l_+$, $p_{B,+}(x_+,l_+)$ is the motion-state distribution of a newborn target, $f_+(x_+|\cdot,l_+)$ is the Markov state-transition equation, and the probability density of a surviving target is determined by the prior probability $p^{(\xi)}(\cdot,l)$; the overall expression enumerates, through the new measurement-association hypotheses and labels, all cases of target birth, death and survival.
(3) Gibbs sampling estimation:
with the covariance data and the parameter distributions known, assume the target state is $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ and satisfies the probability distribution $\pi$; it is obtained by Gibbs sampling as follows:
initialization: $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$;
sampling: $x_{2,1}\sim\pi(\cdot\,|\,x_{1,2:N(1)})$;
sampling: $x_{2,2}\sim\pi(\cdot\,|\,x_{2,1},x_{1,3:N(1)})$;
......
sampling: $x_{2,n}\sim\pi(\cdot\,|\,x_{2,1:n-1},x_{1,n+1:N(1)})$;
thereby realizing the sampling from $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$ to $X_2=\{x_{2,1},\ldots,x_{2,N(1)}\}$;
repeating the above steps yields the target state distribution $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ at time $k$.
The invention has the beneficial effects that: the method detects the sound signals emitted by a plurality of targets with a multi-sensor array, obtains the time differences with which signals emitted by the same target reach different sensors through the generalized cross-correlation method, calculates the angle difference of each group of sensors from the angle at which each sensor receives the signal, and establishes a target observation model based on random finite set theory. After the observation information is obtained, multi-target positioning and tracking is performed by the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm.
Drawings
FIG. 1 is a detection model of the present invention;
FIG. 2. acoustic signals of each of three experimental targets;
FIG. 3. cross-correlation waveform with time difference of 0.02 s;
FIG. 4. motion trajectory tracking for the Gibbs-generalized Label Multi-Bernoulli (Gibbs-GLMB) algorithm;
FIG. 5. number of targets estimation (100 MCs);
FIG. 6. OSPA distance of target (100 MCs).
Detailed Description
The invention is further described below with reference to the accompanying drawings:
a multi-sound array moving target detection and positioning method based on a Gibbs-generalized label multi-Bernoulli (Gibbs-GLMB) algorithm specifically comprises the following steps:
(1) building a model;
(1-1) Modeling background: for multiple targets, the target states and the sensor measurements at time $k$ can be expressed as Random Finite Sets (RFS): $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}\in\mathcal{E}_s$ and $Z_k=\{z_{k,1},\ldots,z_{k,M(k)}\}\in\mathcal{E}_o$, where $\mathcal{E}_s$ is the state space of the targets, $\mathcal{E}_o$ is the measurement space observed by the sensors, $N(k)$ is the number of surviving targets at time $k$, $M(k)$ is the number of measurements at time $k$, and $q$ is the index of a sensor group. $X_k$ and $Z_k$ are the finite sets of target state vectors and measurement vectors at time $k$.
The multi-target state is defined as $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}\in\mathcal{E}_s$, where a single target state is defined as $\mathbf{x}=(x,l)$,
where $x$ is the target state and $l$ is the target label (distinguishing it from the other targets); a change in the target state does not affect the label.
(1-2) modeling a system, which comprises establishing a state equation and an observation equation.
The state equation is expressed as $X_k=AX_{k-1}+B\omega_k$, where $A$ is the target state-transition matrix, $B$ is the noise matrix, and $\omega_k$ is the process noise, which obeys a standard Gaussian distribution;
where $T=1$ is the sampling period;
$X_k=\{x_{k,1},\ldots,x_{k,i},\ldots,x_{k,N(k)}\}$ is the state set of the multiple targets, $i\in[1,N(k)]$, $N(k)$ is the number of surviving targets at time $k$, and $x_{k,i}=(x_{k,i},\dot{x}_{k,i},y_{k,i},\dot{y}_{k,i})^T$ is the state vector of the $i$-th target at time $k$, where $x_{k,i}$ is the x-direction coordinate, $\dot{x}_{k,i}$ is the x-direction velocity, $y_{k,i}$ is the y-direction coordinate, $\dot{y}_{k,i}$ is the y-direction velocity, and $T$ denotes transposition.
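The concrete forms of $A$ and $B$ are not reproduced in this text; the following is a minimal Python sketch that assumes the standard constant-velocity forms with sampling period $T=1$ s, consistent with the CV motion model used in the experiments. The variable names and the numpy-based formulation are illustrative assumptions, not the literal implementation of the invention.

```python
import numpy as np

T = 1.0  # sampling period (the text sets T = 1)

# Assumed standard constant-velocity (CV) matrices; the state is [x, vx, y, vy]^T
A = np.array([[1, T, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, T],
              [0, 0, 0, 1]], dtype=float)
B = np.array([[T**2 / 2, 0],
              [T,        0],
              [0, T**2 / 2],
              [0,        T]], dtype=float)

def propagate(x_prev, sigma_w=0.1, rng=np.random.default_rng()):
    """One step of X_k = A X_{k-1} + B w_k with Gaussian process noise."""
    w = rng.normal(0.0, sigma_w, size=2)      # accelerations in x and y
    return A @ x_prev + B @ w

x0 = np.array([100.0, -1.0, 0.0, 1.0])        # e.g. the first target's initial state
x1 = propagate(x0)
```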
The observation equation is expressed as $z_k^q=(\tau_q,\delta_q)^T+\varepsilon_k^q$, $q=1,2,\ldots,Q$,
where $\tau_q$ is the time difference observed by a pair of sensors, $\delta_q$ is the angle difference of the signals received by the pair of sensors, $q$ is the index of the sensor group, and there are $Q$ groups of sensors in total; $\varepsilon_k^q$ is the measurement noise, which obeys a standard Gaussian distribution with mean 0.
(1-3) Model environment: within the detection range there are $Q$ groups of sensor arrays, denoted $S_{1:Q}=\{s_1,\ldots,s_q,\ldots,s_Q\}$, $q\in[1,Q]$, where $s_q=\{s_{q,1},\ldots,s_{q,j},\ldots,s_{q,N}\}$ denotes all $N$ sensors of the $q$-th group, $j\in[1,N]$, and $s_{q,j}=(x_{q,j},y_{q,j})$ is the position coordinate of each sensor. The target position state is denoted $X_i=(x,y)$.
The time difference for each pair of sensors is expressed as $\tau_q=\big(\|X_i-s_{q,1}\|-\|X_i-s_{q,2}\|\big)/v$,
where $X_i$ is the target coordinate position and $s_{q,j}$ is the coordinate position of each sensor of the $q$-th group, both in a Cartesian rectangular coordinate system; $\|\cdot\|$ denotes the 2-norm and $v$ is the speed of sound.
The angle difference for each pair of sensors is expressed as $\delta_q=|\arctan(X_i-s_{q,1})-\arctan(X_i-s_{q,2})|$,
where $\delta_q$ represents the difference between the azimuth angles from the target to the two sensors.
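To make the two measurement components concrete, the following sketch evaluates the time difference and the azimuth-angle difference for one sensor pair from the geometry above. The target position and the speed of sound $v=343$ m/s are assumed example values only.

```python
import numpy as np

def pair_measurement(target_xy, s1, s2, v=343.0):
    """Return tau_q = (||X_i - s_{q,1}|| - ||X_i - s_{q,2}||) / v and
    delta_q = |angle(X_i - s_{q,1}) - angle(X_i - s_{q,2})| for one sensor pair."""
    x = np.asarray(target_xy, dtype=float)
    s1, s2 = np.asarray(s1, dtype=float), np.asarray(s2, dtype=float)
    tau = (np.linalg.norm(x - s1) - np.linalg.norm(x - s2)) / v
    d1, d2 = x - s1, x - s2
    delta = abs(np.arctan2(d1[1], d1[0]) - np.arctan2(d2[1], d2[0]))
    return tau, delta

# First sensor pair from the experiment section and an assumed target position
tau_q, delta_q = pair_measurement([50.0, 40.0], [100.0, 95.0], [95.0, 100.0])
```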
(1-4) calculating the sound signal time difference:
The signals observed by a pair of sensors are represented by the mathematical model
$z_1(t)=\alpha_1 s(t)+n_1(t)$, $z_2(t)=\alpha_2 s(t-\tau_{1:2})+n_2(t)$,
where $z_1(t)$ and $z_2(t)$ are the signals received by the two sensors, $s(t)$ is the true signal, $n_1(t)$ and $n_2(t)$ are the ambient noise, $\tau_{1:2}$ is the time difference between the signals detected by the two sensors, and $\alpha_1$ and $\alpha_2$ are the amplitudes of the received signals.
The time difference is estimated through a cross-correlation calculation: $\tau_{1:2}=\arg\max R_{1:2}(\tau_{1:2})$, where $R_{1:2}(\tau_{1:2})$ is the cross-correlation of the two received signals; the $\tau_{1:2}$ at which $R_{1:2}(\tau_{1:2})$ attains its maximum is the time difference with which the same signal is received by the different sensors.
For convenience of calculation, a Fourier transform is applied to the received signals to convert the time domain into the frequency domain and simplify the problem; the cross-correlation function $R_{gcc}(\tau_q)$ of the two signals is expressed as $R_{gcc}(\tau_q)=\int \psi_{1,2}(\omega)Z_1(\omega)Z_2^{*}(\omega)e^{j\omega\tau_q}\,d\omega$.
$R_{gcc}(\tau_q)$ is defined as the generalized cross-correlation function, where $Z_1(\omega)$ and $Z_2(\omega)$ are the Fourier transforms of $z_1(t)$ and $z_2(t)$, $(\cdot)^{*}$ denotes the complex conjugate, and $\psi_{1,2}(\omega)$ is the phase-transform (PHAT) weighting function of the generalized cross-correlation.
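A minimal frequency-domain sketch of this estimate is given below. It assumes the PHAT choice of weighting, $\psi_{1,2}(\omega)=1/|Z_1(\omega)Z_2^{*}(\omega)|$, which is one common option; the sampling rate, test signal and function names are illustrative assumptions.

```python
import numpy as np

def gcc_phat_delay(z1, z2, fs):
    """Estimate tau_{1:2} = argmax R_gcc(tau) via the PHAT-weighted
    generalized cross-correlation computed in the frequency domain."""
    n = len(z1) + len(z2)
    Z1, Z2 = np.fft.rfft(z1, n=n), np.fft.rfft(z2, n=n)
    cross = Z1 * np.conj(Z2)
    psi = 1.0 / np.maximum(np.abs(cross), 1e-12)          # assumed PHAT weighting
    r = np.fft.irfft(psi * cross, n=n)
    max_lag = n // 2
    r = np.concatenate((r[-max_lag:], r[:max_lag + 1]))   # lags -max_lag .. +max_lag
    return (np.argmax(np.abs(r)) - max_lag) / fs

fs = 8000                                    # assumed sampling rate
rng = np.random.default_rng(0)
s = rng.standard_normal(fs)                  # 1 s of a noise-like source signal
d = int(0.02 * fs)                           # 0.02 s delay, as in Fig. 3
tau_hat = gcc_phat_delay(s[d:], s[:-d], fs)  # |tau_hat| ≈ 0.02 s; sign follows the lag convention
```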
Description of the symbols:
single target states are represented by lower-case letters (e.g., $x$);
multi-target states are indicated by italic capital letters (e.g., $X$);
labeled distributions or states are in bold letters (e.g., $\boldsymbol{\pi}$, $\mathbf{x}$, $\mathbf{X}$);
spaces are written in blackboard-bold letters, e.g. the state space is $\mathbb{X}$ and the observation space is $\mathbb{Z}$;
$\mathcal{F}(\mathbb{X})$ represents the collection of finite subsets of $\mathbb{X}$.
The inner product abbreviation is $\langle f,g\rangle\triangleq\int f(x)\,g(x)\,dx$.
The exponential of a real-valued function $h$ over a set $X$ is defined as $[h]^{X}\triangleq\prod_{x\in X}h(x)$, with $[h]^{\emptyset}=1$.
The generalized Kronecker delta function is $\delta_{Y}(X)\triangleq 1$ if $X=Y$ and $0$ otherwise.
The inclusion function is $1_{Y}(X)\triangleq 1$ if $X\subseteq Y$ and $0$ otherwise.
(2) gibbs-generalized label multi-bernoulli filtering;
The generalized labeled multi-Bernoulli (GLMB) random finite set (RFS) is built from Bernoulli components. A single target state $\mathbf{x}$ is described by a single-Bernoulli RFS whose probability density distribution is $\pi(X)=1-r$ for $X=\emptyset$ and $\pi(X)=r\,p(x)$ for $X=\{x\}$,
where $r$ represents the existence probability of the single target and $p(x)$ is the probability density of the target $x$. A single-Bernoulli RFS on the space $\mathbb{X}$ is empty with probability $1-r$ and, with existence probability $r$, contains a single target whose probability density $p$ is defined on $\mathbb{X}$.
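As a small illustration of this definition, the sketch below evaluates the single-Bernoulli RFS density for the empty set and for a singleton, taking the single-target density $p(x)$ to be Gaussian purely as an assumption; all names are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def bernoulli_rfs_density(X, r, mean, cov):
    """pi(X) = 1 - r if X is empty, r * p(x) if X = {x}, and 0 otherwise,
    with p assumed here to be the Gaussian N(mean, cov)."""
    if len(X) == 0:
        return 1.0 - r
    if len(X) == 1:
        return r * multivariate_normal.pdf(X[0], mean=mean, cov=cov)
    return 0.0

r = 0.9
mean = np.array([50.0, 0.0, 50.0, 0.0])
cov = np.diag([10.0, 1.0, 10.0, 1.0])
print(bernoulli_rfs_density([], r, mean, cov))       # 1 - r
print(bernoulli_rfs_density([mean], r, mean, cov))   # r * p(mean)
```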
The generalized labeled multi-Bernoulli random finite set is a labeled random finite set on the state space $\mathbb{X}$ with label space $\mathbb{L}$, distributed as $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{\xi\in\Xi}\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\xi$ is a set of discrete indices, and $\omega^{(\xi)}(L)$ and $p^{(\xi)}$ satisfy $\sum_{L\subseteq\mathbb{L}}\sum_{\xi\in\Xi}\omega^{(\xi)}(L)=1$ and $\int p^{(\xi)}(x,l)\,dx=1$; the weight $\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))$ depends only on the label set of the multi-target state, while the multi-target exponential $\big[p^{(\xi)}\big]^{\mathbf{X}}$ depends on the whole multi-target state.
Given a multi-target state $\mathbf{X}$, where each target satisfies $(x,l)\in\mathbf{X}$, the detection probability of $\mathbf{X}$ is $p_{D,m}(x,l)$, and the measurement $z$ produced by each state is represented by the likelihood function $g(z|x,l)$. The mapping relation between the multiple sensors and the multiple targets is defined as a function $\theta$ such that $\theta(i)=\theta(i')>0$ implies $i=i'$; $\theta(i,l)$ denotes the $i$-th associated pair in $\theta(I)$, the set $\Theta$ denotes the overall association-map set of the multiple sensors, and its subset with domain $I$ is denoted $\Theta(I)$. Assuming that target detections and clutter are generated independently, the multi-sensor multi-target likelihood function is:
where the clutter is modeled by a Poisson-distributed clutter function and $p_{D,m}(x,l)$ is the detection probability of the sensor numbered $m$;
the δ-GLMB filter satisfies $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{(I,\xi)\in\mathcal{F}(\mathbb{L})\times\Xi}\omega^{(I,\xi)}\delta_{I}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\Xi$ is a set of discrete spaces.
The δ-GLMB filter is a multi-target Bayes filter based on the GLMB distribution, and the forward-propagation expression of the δ-GLMB is as follows:
Through Bayesian recursion, the posterior probability of the estimated target state is computed by an update step and a prediction step:
prediction step: $\pi_{k|k-1}(X_k|Z_{1:k-1})=\int f_{k|k-1}(X_k|X_{k-1})\,\pi_{k-1}(X_{k-1}|Z_{1:k-1})\,\delta X_{k-1}$;
update step: $\pi_{k}(X_k|Z_{1:k})=\dfrac{g_k(Z_k|X_k)\,\pi_{k|k-1}(X_k|Z_{1:k-1})}{\int g_k(Z_k|X)\,\pi_{k|k-1}(X|Z_{1:k-1})\,\delta X}$;
where $\pi_k(\cdot|Z_{1:k})$ denotes the multi-target posterior probability density at time $k$, $f_{k|k-1}(\cdot|\cdot)$ is the multi-target transition density, and $g_k(\cdot|\cdot)$ is the likelihood function of each target among the multiple targets.
The integral is the set integral defined on the finite subsets of $\mathbb{X}$: $\int f(X)\,\delta X\triangleq\sum_{i=0}^{\infty}\frac{1}{i!}\int f(\{x_1,\ldots,x_i\})\,d(x_1,\ldots,x_i)$.
The multi-target filtering probability density contains all information about the multi-target state, such as the number of targets and their states at the current time. For convenience of writing, the time index $k$ is omitted below and the subscript $+$ denotes the next time instant.
In the update step and the prediction step, the weights are obtained by computing optimal paths and optimal assignments; because the two steps share the same computational structure, performing them separately is computationally inefficient. By combining the update step and the prediction step of the Bayesian recursion into a joint step, the generation of invalid and repeated components during the two separate truncations is reduced, thereby reducing the amount of computation.
Given the forward propagation at time k, the expression k +1 at the next time through the joint update step and the prediction step is defined as:
whereinξ∈Ξ,θ+∈Θ+And, and:
whereinIs the label space of the new object,is the label space of the object, I+Set of labels, r, of the target trajectory at the next time instantB,+(l+) Indicates a label of l+Probability of birth of, pB,+(x+,l+) Is the distribution of motion states, f+(x+|·,l+) Is a markov state transition equation,is determined by a prior probability p(ξ)(. l) probability density of surviving objects. The total expression lists all cases of birth, death and survival by the new measurement hypothesis tags.
(3) Gibbs sampling estimation:
The probability density is approximated by truncation. With the covariance data and the parameter distributions known, assume the target state is $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ and satisfies the probability distribution $\pi$, obtained by Gibbs sampling as follows:
initialization: $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$;
sampling: $x_{2,1}\sim\pi(\cdot\,|\,x_{1,2:N(1)})$;
sampling: $x_{2,2}\sim\pi(\cdot\,|\,x_{2,1},x_{1,3:N(1)})$;
......
sampling: $x_{2,n}\sim\pi(\cdot\,|\,x_{2,1:n-1},x_{1,n+1:N(1)})$;
thereby realizing the sampling from $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$ to $X_2=\{x_{2,1},\ldots,x_{2,N(1)}\}$;
repeating the above steps yields the target state distribution $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ at time $k$.
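The component-wise conditional sampling described above can be sketched as follows for a generic distribution π. Purely as an illustrative assumption, π is taken here to be a zero-mean bivariate Gaussian with correlation ρ, whose conditionals are available in closed form; the loop structure is the same sweep $x_{2,n}\sim\pi(\cdot|x_{2,1:n-1},x_{1,n+1:N(1)})$ used by the method.

```python
import numpy as np

def gibbs_bivariate_gaussian(n_sweeps, rho=0.8, seed=0):
    """Gibbs sampling of a zero-mean bivariate Gaussian with correlation rho:
    each component is drawn from its conditional given the other component."""
    rng = np.random.default_rng(seed)
    x = np.zeros(2)                     # initialization (X_1)
    samples = []
    sigma = np.sqrt(1.0 - rho**2)
    for _ in range(n_sweeps):
        x[0] = rng.normal(rho * x[1], sigma)   # x[0] | x[1]
        x[1] = rng.normal(rho * x[0], sigma)   # x[1] | updated x[0]
        samples.append(x.copy())
    return np.asarray(samples)

chain = gibbs_bivariate_gaussian(5000)
print(np.corrcoef(chain[1000:].T))      # empirical correlation ≈ rho
```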
To verify the effectiveness of the invention, three groups of acoustic sensor arrays were set up, each group containing two sensors, with a tracking area of $[0,100]\times[0,100]\,\mathrm{m}^2$. The sensor positions are (100 m, 95 m) and (95 m, 100 m); (5 m, 100 m) and (0 m, 95 m); and (0 m, 5 m) and (5 m, 0 m). The targets move with uniform velocity in the simulation area. The detection probability is $P_D=0.98$, the survival probability is $P_S=0.99$, the clutter intensity is denoted $\lambda_c$, the detection time is 100 s, the maximum number of targets is 3, and the motion model of the targets is the constant-velocity (CV) linear motion model.
The multi-target state set is $X_k=\{x_{k,1},\ldots,x_{k,i},\ldots,x_{k,N(k)}\}$, $i\in[1,N(k)]$, with $x_{k,i}=(x_{k,i},\dot{x}_{k,i},y_{k,i},\dot{y}_{k,i})^T$,
where $x_{k,i}$ represents the x-direction coordinate, $\dot{x}_{k,i}$ the x-direction velocity, $y_{k,i}$ the y-direction coordinate, and $\dot{y}_{k,i}$ the y-direction velocity; $i$ denotes the $i$-th target.
The survival time of target 1 is 1 s-80 s, that of target 2 is 20 s-100 s, and that of target 3 is 25 s-100 s. The initial states of the three targets are:
$x_1=[100\,\mathrm{m},\,-1\,\mathrm{m/s},\,0\,\mathrm{m},\,1\,\mathrm{m/s}]^T$;
$x_2=[50\,\mathrm{m},\,0.1\,\mathrm{m/s},\,0\,\mathrm{m},\,1\,\mathrm{m/s}]^T$;
$x_3=[0\,\mathrm{m},\,0.8\,\mathrm{m/s},\,50\,\mathrm{m},\,-0.2\,\mathrm{m/s}]^T$.
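From the survival intervals and initial states listed above, the noise-free ground-truth tracks shown in Fig. 1 can be reproduced with a sketch like the following; the step-wise CV propagation and the variable names are assumptions made only for illustration.

```python
import numpy as np

A = np.array([[1, 1, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1]], dtype=float)    # CV transition with T = 1 s

targets = {                                  # initial state, birth time, death time
    1: (np.array([100.0, -1.0,  0.0,  1.0]),  1,  80),
    2: (np.array([ 50.0,  0.1,  0.0,  1.0]), 20, 100),
    3: (np.array([  0.0,  0.8, 50.0, -0.2]), 25, 100),
}

tracks = {}
for tid, (x0, t_birth, t_death) in targets.items():
    x, states = x0.copy(), []
    for _ in range(t_birth, t_death + 1):
        states.append(x.copy())
        x = A @ x                            # uniform (noise-free) motion
    tracks[tid] = np.asarray(states)         # columns: [x, vx, y, vy]
```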
For the experiment, the collected sound signals of the three targets are read as three audio files through Matlab software: sample1.wav, sample2.wav and sample3.wav.
The true positions of the multi-sensor arrays and the true trajectories of the multiple targets over 1 s-100 s are shown in Fig. 1. The three groups of sensors are independent of each other; the sensors are indicated by circles, and the target positions are unknown but lie within the detection range of the 6 sensors. In the figure, p1, p2 and p3 are the trajectories of the three detected targets.
Fig. 2 is a graph of the spectrum of an audio signal read by Matlab software for three different targets.
Fig. 3 shows, for the three different targets, the cross-correlations of the signals obtained by the generalized cross-correlation algorithm, taking a sound-wave time difference of τ = 0.02 s as an example.
The black lines in Fig. 4 are the real motion trajectories of the targets, and the black points are the estimated target positions obtained by the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm. As can be seen from the figure, the labeled random-set theory preserves the independence of the individual targets, the measurement information is fused, and the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm effectively tracks the three targets in the experiment.
Fig. 5 shows the estimation of the number of targets in the detection area. The black line is the true number of targets, and the dotted line is the average obtained over 100 Monte Carlo runs of the Gibbs-generalized labeled multi-Bernoulli (Gibbs-GLMB) algorithm. The estimate of the number of targets fluctuates noticeably when the true number of targets increases, but when the true number does not change, the estimated number is essentially consistent with the true value.
Fig. 6: to evaluate the tracking effectiveness of the invention, the optimal sub-pattern assignment (OSPA) distance is used, defined for $|X|=m\le|\hat X|=n$ as
$\bar d_p^{(c)}(X,\hat X)=\Big(\frac{1}{n}\big(\min_{\pi\in\Pi_n}\sum_{i=1}^{m}d^{(c)}(x_i,\hat x_{\pi(i)})^p+c^p(n-m)\big)\Big)^{1/p}$,
where $X$ and $\hat X$ are the sets of true and estimated target states, $\Pi_n$ denotes all permutations of the set $\{1,\ldots,n\}$, $c$ is the cutoff parameter, $d^{(c)}(x,\hat x)=\min(c,\|x-\hat x\|)$ is the cutoff distance, and $p$ is the order parameter with value range $1\le p\le\infty$. The experiment selects $c=100$ and $p=1$. As can be seen from Fig. 6, some deviations appear where the estimate of the number of targets changes, but in general the overall tracking effect of the method is good and essentially agrees with the true values.
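A sketch of the OSPA computation with c = 100 and p = 1 is given below; realizing the minimization over permutations with an optimal assignment (scipy's linear_sum_assignment) is an implementation assumption.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ospa(X, Y, c=100.0, p=1.0):
    """OSPA distance between the true states X and the estimates Y
    (each a 2-D array of positions), with cutoff c and order p."""
    m, n = len(X), len(Y)
    if m == 0 and n == 0:
        return 0.0
    if m == 0 or n == 0:
        return c
    if m > n:                                  # keep |X| <= |Y|
        X, Y, m, n = Y, X, n, m
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    d = np.minimum(d, c) ** p                  # cutoff distance d^(c)
    row, col = linear_sum_assignment(d)        # optimal sub-pattern assignment
    cost = d[row, col].sum() + (c ** p) * (n - m)
    return (cost / n) ** (1.0 / p)

ospa_val = ospa(np.array([[10.0, 20.0], [30.0, 40.0]]),
                np.array([[11.0, 19.0]]))
```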
The above description is only intended to illustrate the technical solution of the invention and not to limit its scope; modifications or equivalent substitutions may be made to the technical solution of the invention without departing from its purpose and scope, and such modifications shall be covered by the claims of the invention.

Claims (1)

1. A method for detecting and positioning a moving target of a multi-sound array is characterized by specifically comprising the following steps:
(1) building a model;
(1-1) modeling background:
for multiple targets, the target states and the sensor measurements at time $k$ can be represented as random finite sets: $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}\in\mathcal{E}_s$ and $Z_k=\{z_{k,1},\ldots,z_{k,M(k)}\}\in\mathcal{E}_o$, where $\mathcal{E}_s$ is the state space of the targets, $\mathcal{E}_o$ is the measurement space observed by the sensors, $N(k)$ is the number of surviving targets at time $k$, $M(k)$ is the number of measurements at time $k$, and $q$ is the index of a sensor group; $X_k$ and $Z_k$ are the finite sets of target state vectors and measurement vectors at time $k$;
the single target state is represented as $\mathbf{x}=(x,l)$, where $x$ is the target state and $l$ is the target label; a change of the target state does not affect the label;
(1-2) modeling a system, which comprises establishing a state equation and an observation equation;
the state equation is $X_k=AX_{k-1}+B\omega_k$, where $A$ is the target state-transition matrix, $B$ is the noise matrix, and $\omega_k$ is the process noise, which obeys a standard Gaussian distribution;
where $T=1$ is the sampling period;
$X_k=\{x_{k,1},\ldots,x_{k,i},\ldots,x_{k,N(k)}\}$ is the state set of the multiple targets, $i\in[1,N(k)]$, $N(k)$ is the number of surviving targets at time $k$, and $x_{k,i}=(x_{k,i},\dot{x}_{k,i},y_{k,i},\dot{y}_{k,i})^T$ is the state vector of the $i$-th target at time $k$, where $x_{k,i}$ is the x-direction coordinate, $\dot{x}_{k,i}$ is the x-direction velocity, $y_{k,i}$ is the y-direction coordinate, and $\dot{y}_{k,i}$ is the y-direction velocity;
the observation equation is $z_k^q=(\tau_q,\delta_q)^T+\varepsilon_k^q$, $q=1,2,\ldots,Q$, where $\tau_q$ is the time difference observed by a pair of sensors, $\delta_q$ is the angle difference of the signals received by the pair of sensors, $q$ is the index of the sensor group, and there are $Q$ groups of sensors in total; $\varepsilon_k^q$ is the measurement noise, which obeys a standard Gaussian distribution with mean 0;
(1-3) model environment:
within the detection range there are $Q$ groups of sensor arrays, denoted $S_{1:Q}=\{s_1,\ldots,s_q,\ldots,s_Q\}$, $q\in[1,Q]$, where $s_q=\{s_{q,1},\ldots,s_{q,j},\ldots,s_{q,N}\}$ denotes all $N$ sensors of the $q$-th group, $j\in[1,N]$; $s_{q,j}=(x_{q,j},y_{q,j})$ is the position coordinate of each sensor; the target position state is denoted $X_i=(x,y)$;
the time difference for each pair of sensors is expressed as $\tau_q=\big(\|X_i-s_{q,1}\|-\|X_i-s_{q,2}\|\big)/v$, where $X_i$ is the target coordinate position and $s_{q,j}$ is the coordinate position of each sensor of the $q$-th group, both in a Cartesian rectangular coordinate system; $\|\cdot\|$ denotes the 2-norm and $v$ is the speed of sound;
the angle difference for each pair of sensors is expressed as $\delta_q=|\arctan(X_i-s_{q,1})-\arctan(X_i-s_{q,2})|$, where $\delta_q$ represents the difference between the azimuth angles from the target to the two sensors;
(1-4) calculating the sound signal time difference:
the signals observed by a pair of sensors are represented by the mathematical model:
$z_1(t)=\alpha_1 s(t)+n_1(t)$, $z_2(t)=\alpha_2 s(t-\tau_{1:2})+n_2(t)$; where $z_1(t)$ and $z_2(t)$ are the signals received by the two sensors, $s(t)$ is the true signal, $n_1(t)$ and $n_2(t)$ are the ambient noise, $\tau_{1:2}$ is the time difference between the signals detected by the two sensors, and $\alpha_1$ and $\alpha_2$ are the amplitudes of the received signals;
the time difference is estimated through a cross-correlation calculation: $\tau_{1:2}=\arg\max R_{1:2}(\tau_{1:2})$; where $R_{1:2}(\tau_{1:2})$ is the cross-correlation of the two received signals, and the $\tau_{1:2}$ at which $R_{1:2}(\tau_{1:2})$ attains its maximum is the time difference with which the same signal is received by the different sensors;
a Fourier transform is applied to the received signals to convert the time domain into the frequency domain and simplify the problem; the cross-correlation function $R_{gcc}(\tau_q)$ of the two signals is expressed as $R_{gcc}(\tau_q)=\int \psi_{1,2}(\omega)Z_1(\omega)Z_2^{*}(\omega)e^{j\omega\tau_q}\,d\omega$;
$R_{gcc}(\tau_q)$ is defined as the generalized cross-correlation function, where $Z_1(\omega)$ and $Z_2(\omega)$ are the Fourier transforms of $z_1(t)$ and $z_2(t)$, $(\cdot)^{*}$ denotes the complex conjugate, and $\psi_{1,2}(\omega)$ is the phase-transform (PHAT) weighting function of the generalized cross-correlation;
(2) gibbs-generalized label multi-bernoulli filtering;
the generalized labeled multi-Bernoulli random finite set is a labeled random finite set on the state space $\mathbb{X}$ with label space $\mathbb{L}$, distributed as $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{\xi\in\Xi}\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\xi$ is a set of discrete indices, and $\omega^{(\xi)}(L)$ and $p^{(\xi)}$ satisfy $\sum_{L\subseteq\mathbb{L}}\sum_{\xi\in\Xi}\omega^{(\xi)}(L)=1$ and $\int p^{(\xi)}(x,l)\,dx=1$; the weight $\omega^{(\xi)}(\mathcal{L}(\mathbf{X}))$ depends only on the label set of the multi-target state, while the multi-target exponential $\big[p^{(\xi)}\big]^{\mathbf{X}}$ depends on the whole multi-target state;
given a multi-target state $\mathbf{X}$, where each target satisfies $(x,l)\in\mathbf{X}$, the detection probability of $\mathbf{X}$ is $p_{D,m}(x,l)$, and the measurement $z$ produced by each state is represented by the likelihood function $g(z|x,l)$; the mapping relation between the multiple sensors and the multiple targets is defined as a function $\theta$ such that $\theta(i)=\theta(i')>0$ implies $i=i'$; $\theta(i,l)$ denotes the $i$-th associated pair in $\theta(I)$, the set $\Theta$ denotes the overall association-map set of the multiple sensors, and its subset with domain $I$ is denoted $\Theta(I)$; assuming that target detections and clutter are generated independently, the multi-sensor multi-target likelihood function is:
where the clutter is modeled by a Poisson-distributed clutter function and $p_{D,m}(x,l)$ is the detection probability of the sensor numbered $m$;
the δ-GLMB filter satisfies $\pi(\mathbf{X})=\Delta(\mathbf{X})\sum_{(I,\xi)\in\mathcal{F}(\mathbb{L})\times\Xi}\omega^{(I,\xi)}\delta_{I}(\mathcal{L}(\mathbf{X}))\big[p^{(\xi)}\big]^{\mathbf{X}}$, where $\Xi$ is a set of discrete spaces;
the δ-GLMB filter is a multi-target Bayes filter based on the generalized labeled multi-Bernoulli distribution, and the forward-propagation expression of the δ-GLMB is as follows:
given the δ-GLMB forward propagation at time $k$, its expression at the next time $k+1$, obtained through the joint update and prediction step, is defined as:
where $\xi\in\Xi$, $\theta_+\in\Theta_+$, and:
where $\mathbb{B}_+$ is the label space of newborn targets, $\mathbb{L}_+$ is the label space of the targets, $I_+$ is the label set of the target tracks at the next time instant, $r_{B,+}(l_+)$ is the birth probability of the label $l_+$, $p_{B,+}(x_+,l_+)$ is the motion-state distribution of a newborn target, $f_+(x_+|\cdot,l_+)$ is the Markov state-transition equation, and the probability density of a surviving target is determined by the prior probability $p^{(\xi)}(\cdot,l)$; the overall expression enumerates, through the new measurement-association hypotheses and labels, all cases of target birth, death and survival;
(3) gibbs sampling estimation:
with the covariance data and the parameter distributions known, assume the target state is $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ and satisfies the probability distribution $\pi$, obtained by Gibbs sampling as follows:
initialization: $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$;
sampling: $x_{2,1}\sim\pi(\cdot\,|\,x_{1,2:N(1)})$;
sampling: $x_{2,2}\sim\pi(\cdot\,|\,x_{2,1},x_{1,3:N(1)})$;
......
sampling: $x_{2,n}\sim\pi(\cdot\,|\,x_{2,1:n-1},x_{1,n+1:N(1)})$;
thereby realizing the sampling from $X_1=\{x_{1,1},\ldots,x_{1,N(1)}\}$ to $X_2=\{x_{2,1},\ldots,x_{2,N(1)}\}$;
repeating the above steps yields the target state distribution $X_k=\{x_{k,1},\ldots,x_{k,N(k)}\}$ at time $k$.
CN201910740948.3A 2019-08-12 2019-08-12 Multi-sound-array moving target detection and positioning method Withdrawn CN110596643A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910740948.3A CN110596643A (en) 2019-08-12 2019-08-12 Multi-sound-array moving target detection and positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910740948.3A CN110596643A (en) 2019-08-12 2019-08-12 Multi-sound-array moving target detection and positioning method

Publications (1)

Publication Number Publication Date
CN110596643A true CN110596643A (en) 2019-12-20

Family

ID=68853904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910740948.3A Withdrawn CN110596643A (en) 2019-08-12 2019-08-12 Multi-sound-array moving target detection and positioning method

Country Status (1)

Country Link
CN (1) CN110596643A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504326A (en) * 2020-04-30 2020-08-07 江苏理工学院 Robust GLMB multi-target tracking method based on T distribution
CN111610492A (en) * 2020-06-03 2020-09-01 电子科技大学 Multi-acoustic sensor array intelligent sensing method and system
CN111812637A (en) * 2020-06-02 2020-10-23 杭州电子科技大学 L-RFS mixed target structure modeling and estimation method with type probability
CN111929645A (en) * 2020-09-23 2020-11-13 深圳市友杰智新科技有限公司 Method and device for positioning sound source of specific human voice and computer equipment
CN113390406A (en) * 2021-06-16 2021-09-14 电子科技大学 Multi-target data association and positioning method based on passive multi-sensor system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105007057A (en) * 2015-07-09 2015-10-28 杭州电子科技大学 Uniformly dense clutter sparse method aiming at finite set tracking filter
CN106199581A (en) * 2016-06-30 2016-12-07 电子科技大学 A kind of multiple maneuver target tracking methods under random set theory
CN107102295A (en) * 2017-04-13 2017-08-29 杭州电子科技大学 The multisensor TDOA passive location methods filtered based on GLMB
CN104820993B (en) * 2015-03-27 2017-12-01 浙江大学 It is a kind of to combine particle filter and track the underwater weak signal target tracking for putting preceding detection
CN107677997A (en) * 2017-09-28 2018-02-09 杭州电子科技大学 Extension method for tracking target based on GLMB filtering and Gibbs samplings
CN108615070A (en) * 2018-04-30 2018-10-02 国网四川省电力公司电力科学研究院 A kind of TDOA and AOA hybrid locating methods based on Chaos particle swarm optimization algorithm
US20190035088A1 (en) * 2017-07-31 2019-01-31 National Technology & Engineering Solutions Of Sandia, Llc Data-driven delta-generalized labeled multi-bernoulli tracker
CN109901106A (en) * 2019-04-02 2019-06-18 北京理工大学 A kind of TDOA/AOA hybrid locating method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104820993B (en) * 2015-03-27 2017-12-01 浙江大学 It is a kind of to combine particle filter and track the underwater weak signal target tracking for putting preceding detection
CN105007057A (en) * 2015-07-09 2015-10-28 杭州电子科技大学 Uniformly dense clutter sparse method aiming at finite set tracking filter
CN106199581A (en) * 2016-06-30 2016-12-07 电子科技大学 A kind of multiple maneuver target tracking methods under random set theory
CN107102295A (en) * 2017-04-13 2017-08-29 杭州电子科技大学 The multisensor TDOA passive location methods filtered based on GLMB
US20190035088A1 (en) * 2017-07-31 2019-01-31 National Technology & Engineering Solutions Of Sandia, Llc Data-driven delta-generalized labeled multi-bernoulli tracker
CN107677997A (en) * 2017-09-28 2018-02-09 杭州电子科技大学 Extension method for tracking target based on GLMB filtering and Gibbs samplings
CN108615070A (en) * 2018-04-30 2018-10-02 国网四川省电力公司电力科学研究院 A kind of TDOA and AOA hybrid locating methods based on Chaos particle swarm optimization algorithm
CN109901106A (en) * 2019-04-02 2019-06-18 北京理工大学 A kind of TDOA/AOA hybrid locating method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JING WU et al.: "Tracking Multiple Targets from Multi-static Doppler", 2017 International Conference on Control, Automation and Information Sciences (ICCAIS) *
王煦东: "Research on multi-sensor array target localization, tracking and interception methods based on random finite set theory", China Master's Theses Full-text Database, Information Science and Technology Series *
臧天建: "Research on indoor positioning technology based on mobile devices", China Master's Theses Full-text Database, Information Science and Technology Series *
袁吉鲁 et al.: "Oil and Gas Well Location Surveying", 31 March 2014, Petroleum University Press *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111504326A (en) * 2020-04-30 2020-08-07 江苏理工学院 Robust GLMB multi-target tracking method based on T distribution
CN111504326B (en) * 2020-04-30 2023-10-27 江苏理工学院 Robust GLMB multi-target tracking method based on T distribution
CN111812637A (en) * 2020-06-02 2020-10-23 杭州电子科技大学 L-RFS mixed target structure modeling and estimation method with type probability
CN111812637B (en) * 2020-06-02 2022-12-02 杭州电子科技大学 L-RFS mixed target structure modeling and estimation method with type probability
CN111610492A (en) * 2020-06-03 2020-09-01 电子科技大学 Multi-acoustic sensor array intelligent sensing method and system
CN111929645A (en) * 2020-09-23 2020-11-13 深圳市友杰智新科技有限公司 Method and device for positioning sound source of specific human voice and computer equipment
CN113390406A (en) * 2021-06-16 2021-09-14 电子科技大学 Multi-target data association and positioning method based on passive multi-sensor system
CN113390406B (en) * 2021-06-16 2022-05-24 电子科技大学 Multi-target data association and positioning method based on passive multi-sensor system

Similar Documents

Publication Publication Date Title
CN110596643A (en) Multi-sound-array moving target detection and positioning method
Chen et al. A modified probabilistic data association filter in a real clutter environment
CN107102295A (en) The multisensor TDOA passive location methods filtered based on GLMB
CN112305530B (en) Target detection method for unmanned aerial vehicle group, electronic equipment and storage medium
Cheng et al. A novel radar point cloud generation method for robot environment perception
Gunes et al. Joint underwater target detection and tracking with the Bernoulli filter using an acoustic vector sensor
CN103729859A (en) Probability nearest neighbor domain multi-target tracking method based on fuzzy clustering
CN111999735B (en) Dynamic and static target separation method based on radial speed and target tracking
CN110187337B (en) LS and NEU-ECEF space-time registration-based high maneuvering target tracking method and system
CN107037423A (en) Multi-object tracking method is filtered with reference to the PHD of amplitude information
Tian et al. Feature-aided passive tracking of noncooperative multiple targets based on the underwater sensor networks
Sun et al. Track-to-Track association based on maximum likelihood estimation for T/RR composite compact HFSWR
CN113189575B (en) Detection method and device for positioning personnel in smoke scene
George et al. A finite point process approach to multi-target localization using transient measurements
Ristic et al. Gaussian mixture multitarget–multisensor Bernoulli tracker for multistatic sonobuoy fields
Zhang et al. Localization of multiple emitters based on the sequential PHD filter
CN113093174B (en) PHD filter radar fluctuation weak multi-target-based pre-detection tracking method
Fang et al. A fast implementation of dynamic programming based track-before-detect for radar system
Zhang et al. Underwater multi-source DOA tracking using uniform linear array based on improved GM-PHD filter
CN108981707B (en) Passive tracking multi-target method based on time difference measurement box particle PHD
Li et al. Improved cardinalized probability hypothesis density filtering algorithm
CN102707278B (en) Multi-target tracking method for singular value decomposition
CN112114286A (en) Multi-target tracking method based on line spectrum life cycle and single-vector hydrophone
CN102707279B (en) Multi-target tracking method for sequence UD decomposition
Musicki et al. Efficient active sonar multitarget tracking

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
WW01 Invention patent application withdrawn after publication

Application publication date: 20191220