CN110111360A - Through-wall radar human action characterization method based on a self-organizing map network - Google Patents

Through-wall radar human action characterization method based on a self-organizing map network Download PDF

Info

Publication number
CN110111360A
Authority
CN
China
Prior art keywords
aen
indicate
data
range profile
network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910316233.5A
Other languages
Chinese (zh)
Other versions
CN110111360B (en)
Inventor
杨晓波
王明阳
高绪宇
陈朋云
黄华宾
程璨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201910316233.5A priority Critical patent/CN110111360B/en
Publication of CN110111360A publication Critical patent/CN110111360A/en
Application granted granted Critical
Publication of CN110111360B publication Critical patent/CN110111360B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/088 Non-supervised learning, e.g. competitive learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Disclosed herein is a through-wall radar human action characterization method based on a self-organizing map (SOM) network. The invention relates to the field of through-wall radar human action characterization, and in particular to the problem of characterizing the actions of a human target hidden behind an opaque obstacle such as a wall. The raw range profiles are first pre-processed; an autoencoder network (AEN) with three fully connected layers is then used to reduce the data dimension and extract a feature from each single-cycle range profile. Finally, an SOM network is attached to the trained AEN to establish a word-to-index mapping, so that the time-sequential single-cycle range profile data of through-wall human motion are converted into an integer sequence carrying action semantic information. The advantage of the invention is that it can accurately estimate the specific type of human action behind an opaque obstacle such as a wall using only a single-transmitter single-receiver stepped-frequency radar, with a small data processing load.

Description

Through-wall radar human action characterization method based on a self-organizing map network
Technical field
The present invention relates to the field of through-wall radar human action characterization, and in particular to the problem of characterizing the actions of a human target hidden behind an opaque obstacle such as a wall.
Background art
Through-wall radar human action characterization has significant application value in fields such as fall detection, security monitoring, counter-terrorism operations and hostage rescue. Many research institutions at home and abroad have carried out research on through-wall radar human action characterization and have proposed a series of ways of characterizing human actions. The team of Prof. G. Li at Tsinghua University collected human actions with a multistatic radar, obtained the corresponding micro-Doppler signatures, and proposed person recognition and gait classification based on deep convolutional neural networks (Z. Chen, G. Li, F. Fioranelli, et al. Personnel recognition and gait classification based on multistatic micro-doppler signatures using deep convolutional neural networks [J]. IEEE Geoscience and Remote Sensing Letters, 2018: 1-5). The team of Prof. M. Amin at Villanova University in the United States studied radar-based fall detection for the elderly, acquiring fall features with linear and bilinear time-frequency analysis methods such as the short-time Fourier transform and the wavelet transform and thereby recognizing fall actions (M. Amin, Y. Zhang, F. Ahmad, et al. Radar signal processing for elderly fall detection: The future for in-home monitoring [J]. IEEE Signal Processing Magazine, 2016, 33(2): 71-80). Researchers at the University of Missouri-Columbia, USA, used mel-frequency cepstral coefficients (MFCC) to represent the Doppler signatures of various human activities, such as walking, bending over and falling, and used two different classifiers, SVM and kNN, to automatically detect fall actions from the extracted MFCC features (L. Liu, M. Popescu, M. Skubic, et al. Automatic fall detection based on Doppler radar motion signature [C]. International Conference on Pervasive Computing Technologies for Healthcare and Workshops, Dublin, Ireland, 2011, 222-225).
The studies above use signal transformation methods, transforming the raw radar signal into various signal domains such as the Doppler domain, the wavelet domain and the cepstral domain, thereby decomposing the through-wall human action echo into several relatively simple components. However, these components may lose some essential characteristics that are difficult to detect, and therefore cannot fully represent the intrinsic features of human actions.
Summary of the invention
To solve the through-wall radar human action characterization problem, the present invention proposes a new method for characterizing through-wall human actions based on an autoencoder-self-organizing map network. First, the raw range profiles are pre-processed so that they satisfy the requirements of a deep-learning training data set. Then an autoencoder network (auto encoder network, AEN) with three fully connected layers is used to reduce the dimension of the range profile data and extract range profile features. Finally, a self-organizing map (self organized mapping, SOM) network is attached to the trained AEN to establish a mapping from the instantaneous postures of the hidden human action to indices, converting the time-sequential range profile data of the through-wall human action into an integer sequence carrying action semantic information.
The technical solution of the present invention is as follows: a through-wall radar human action characterization method based on a self-organizing map network, comprising the following steps:
Step 1: radar echo data processing
Step 1.1: radar echo signal modeling
A single-transmitter single-receiver through-wall radar is used to detect the movement of a single human target hidden behind a wall of a given thickness. The stepped-frequency waveform transmitted by the radar is s(t), expressed as
where K denotes the total number of frequency points of the stepped-frequency signal, k denotes the frequency-point index, T denotes the duration of each frequency point, f_0 denotes the start frequency, Δf denotes the frequency step, and the function rect(·) is
The echo reflected at time t by the target at distance R_tar from the radar is then expressed as
where p indexes the p-th of the P strong scattering points on the target surface, σ_p denotes the scattering intensity of the p-th strong scattering point, τ_p denotes the round-trip time delay between the p-th strong scattering point and the radar, and ψ(t) denotes clutter and noise;
The echo s_r(t) is mixed with s(t); after the second- and higher-order harmonic frequencies are filtered out, the result is sampled at interval T, giving the following discrete vector
s_r = [s_{r,0}, s_{r,1}, …, s_{r,K-1}]^T    (4)
where
ψ_k denotes the clutter and noise at the k-th sampling instant;
The vector s_r in formula (4) is passed through a Hamming window to suppress range sidelobe levels and zero-filled from 0 Hz up to f_0 Hz at interval Δf; an N-point inverse fast Fourier transform is then applied, yielding the range profile of one slow-time cycle (hereafter referred to as the single-cycle range profile), expressed as
s_RP = [s_RP(0), s_RP(1), …, s_RP(N-1)]^T    (6)
where
s_RP(n) denotes the range profile datum corresponding to the n-th range cell, n denotes the range-cell index with value range [0, N-1], K_0 denotes the number of zeros filled from 0 Hz to f_0 Hz, and N denotes the total number of points of the inverse fast Fourier transform.
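As an illustration of step 1.1, the sketch below forms one single-cycle range profile from the sampled mixer outputs of formula (4) by Hamming windowing, zero-filling and an N-point inverse FFT. The function name and the default parameter values (taken from the specific embodiment further below) are illustrative assumptions, not part of the claimed method.

```python
import numpy as np

def single_cycle_range_profile(s_r, f0=1.6e9, delta_f=2e6, n_fft=8192):
    """Form one single-cycle range profile from the sampled stepped-frequency
    echo vector s_r (formula (4)): Hamming window -> zero-fill from 0 Hz up to
    f0 at interval delta_f -> N-point inverse FFT (formula (6))."""
    k0 = int(round(f0 / delta_f))            # K_0: number of zeros filled from 0 Hz to f0
    windowed = s_r * np.hamming(len(s_r))    # suppress range sidelobe levels
    spectrum = np.concatenate([np.zeros(k0, dtype=complex), windowed])
    return np.fft.ifft(spectrum, n=n_fft)    # s_RP of length N
```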
Step 1.2: on the basis of the single-cycle range profile obtained in step 1.1, a moving target indication (moving target indicator, MTI) algorithm, normalization and thresholding are used to improve the signal-to-noise ratio (signal-to-noise ratio, SNR) of the range profile data; the threshold can be calculated by the following formula:
where the quantities in the formula denote, respectively, the maximum noise level in decibels, the maximum decibel value of all sample data, and the minimum decibel value of all sample data;
Step 1.3: range truncation of the single-cycle range profile. To reduce the computational load and make the data meet practical needs, the single-cycle range profile is truncated; the number of retained points is calculated as
where N_d denotes the number of retained range profile points, R_tar,max denotes the farthest distance at which the hidden human target's movement occurs, R_radar,max denotes the farthest detectable one-way range of the through-wall radar, and N denotes the total number of points of the inverse fast Fourier transform;
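A minimal sketch of the pre-processing in steps 1.2 and 1.3 is given below, assuming a two-pulse canceller for the MTI stage and a simple min-max normalization; the patent text does not spell out the exact MTI filter or the threshold formula, so these choices and the function name are assumptions.

```python
import numpy as np

def preprocess_range_profiles(rp_matrix, threshold, n_d=512):
    """Steps 1.2-1.3 sketch: MTI cancellation, dB conversion, normalization,
    thresholding and range truncation of a slow-time stack of range profiles.
    rp_matrix: complex array of shape (num_cycles, N)."""
    mti = rp_matrix[1:] - rp_matrix[:-1]                 # two-pulse canceller: removes static clutter such as the wall
    power_db = 20.0 * np.log10(np.abs(mti) + 1e-12)      # magnitude in decibels
    norm = (power_db - power_db.min()) / (power_db.max() - power_db.min())
    norm[norm < threshold] = 0.0                         # thresholding to raise the SNR
    return norm[:, :n_d]                                 # keep only the first N_d range cells
```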
Step 2: range profile feature pre-extraction based on the AEN;
Since the hidden human action changes over time, the human posture at each slow-time instant corresponds to one single-cycle range profile, so the entire human action corresponds to a multi-cycle range profile. Moreover, each single-cycle range profile contains hundreds of range cells or more, so the single-cycle range profile data have a high processing dimension, and the pre-processed single-cycle range profile data therefore need dimensionality reduction. In order to mine as much of the human posture information in the single-cycle range profile as possible while avoiding the loss of important information, the present invention uses an AEN to pre-extract posture features from the single-cycle range profile;
Step 2.1: training the AEN:
The AEN is built from multiple fully connected layers and consists of an encoding sub-network and a decoding sub-network. The two parts are mirror-symmetric in structure and share a common layer called the feature layer. When a single-cycle range profile action datum is input, the encoding sub-network applies its weights to the data and produces an encoded feature at its output; the decoding sub-network then applies its weights to this encoded feature and attempts to recover the original single-cycle range profile action datum. Therefore, by adjusting the weights in the AEN so that the error between the output and the input of the AEN is minimized, an effective low-dimensional feature representation of the single-cycle range profile action data is obtained at the feature layer;
Let the range profile of the m-th slow-time cycle be denoted s_RP[m]. When it is input into the AEN, the output of the l-th hidden layer of the AEN is expressed as
where the first two quantities denote the output and the input of the l-th hidden layer respectively, W^[l-1,l] denotes the connection weight matrix between the (l-1)-th and the l-th hidden layer, b^[l-1,l] denotes the bias vector of the l-th hidden layer, and f denotes the activation function; the ReLU function is chosen as the activation function, expressed as
Therefore, training the AEN is equivalent to solving the following optimization problem
where W and b denote all the weights and all the bias vectors in the AEN respectively; the function dist(x, y) denotes the distance between x and y, usually taken to be the mean squared error (mean squared error, MSE) function; and AEN(s_RP[m], W, b) denotes the estimate of the input s_RP[m] under the weights W and biases b;
Step 2.2: after the AEN training is complete, only the encoding sub-network of the AEN is retained; dimensionality reduction and feature extraction of the single-cycle range profile action data are achieved by taking the corresponding output of the AEN feature layer;
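The sketch below is one possible realization of the AEN in step 2, written with tf.keras; the layer sizes (256, 128, 64 and a 16-dimensional feature layer), the Adadelta optimizer with learning rate 1.0 and the MSE loss follow the specific embodiment further below, while the output activation, batch size and epoch count are illustrative assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_aen(input_dim=512, feature_dim=16):
    """Mirror-symmetric fully connected autoencoder: encoder 256-128-64,
    feature layer of 16 neurons, decoder 64-128-256, ReLU activations."""
    inp = layers.Input(shape=(input_dim,))
    x = layers.Dense(256, activation='relu')(inp)
    x = layers.Dense(128, activation='relu')(x)
    x = layers.Dense(64, activation='relu')(x)
    feat = layers.Dense(feature_dim, activation='relu', name='feature_layer')(x)
    x = layers.Dense(64, activation='relu')(feat)
    x = layers.Dense(128, activation='relu')(x)
    x = layers.Dense(256, activation='relu')(x)
    out = layers.Dense(input_dim, activation='linear')(x)
    return Model(inp, out), Model(inp, feat)   # full AEN and encoder-only model (step 2.2)

aen, encoder = build_aen()
aen.compile(optimizer=tf.keras.optimizers.Adadelta(learning_rate=1.0), loss='mse')
# s_rp_train: array of shape (num_samples, 512) of pre-processed single-cycle range profiles
# aen.fit(s_rp_train, s_rp_train, epochs=100, batch_size=64)
# features = encoder.predict(s_rp_train)      # 16-dimensional feature per range profile
```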
Step 3: time-sequential characterization of the human action type based on the SOM network
After feature extraction by the pre-trained AEN, each single-cycle range profile has been converted into a low-dimensional feature representation. The multi-cycle range profile corresponding to a hidden human action is thus converted into a time series of the corresponding low-dimensional feature vectors. Since these feature vectors are numerically continuous, the corresponding feature space is a finite-dimensional continuous space. However, the postures of the same hidden human action in adjacent slow-time cycles are similar, and the types of hidden human action are discrete. The present invention therefore uses an SOM network to map the continuous feature space onto another, new finite-dimensional discrete space, so that similar posture features receive a unified representation; this further simplifies the characterization of through-wall human actions and highlights the state transitions of an action over time.
The SOM network is an unsupervised learning method that uses a competition mechanism to learn the topological structure of the input data. It consists of a single input layer and a single competition layer, which are fully connected by a large number of weight connections; the dimension of each weight connection equals the dimension of the SOM network input.
The competition layer consists of multiple neurons arranged in an array on a two-dimensional plane. These neurons compete with each other: only one neuron in the competition layer is activated by an input vector, and all the remaining neurons are suppressed. Inputs with similar features activate neurons that are close to each other in the competition layer, while inputs with different features activate neurons that are far apart. The SOM network is therefore essentially a clustering method: it clusters the extracted single-cycle range profile feature vectors and represents each cluster center by the two-dimensional coordinate of its neuron.
Step 3.1: training the SOM network:
Let the data set formed by the range profile feature vectors extracted from the trained AEN contain M samples;
Data: training data set X_M; M, the number of samples in the training data set;
Input: competition layer size L = Q × Q; number of iterations I; learning decay rate η;
Output: the weights W of the SOM network;
S1. Randomly initialize W_j(i), where j = 1, …, L and i = 0;
S2. Initialize the learning rate α(i) = 1, i = 0;
S3. Initialize the neighborhood scale σ_Λ(i) = Q/2, i = 0;
S4. For the i-th iteration, perform the following:
S5. Draw a data sample X(i) at random from X_M;
S6. Solve for the best matching unit j*(i)
S7. Update the learning rate α(i) = α(0)·e^(−η(i−1));
S8. Update the neighborhood function Λ(j, j*(i), i);
S9. Compute the error
S10. Update the weights
S11. Repeat steps S5–S10 for I iterations.
Here j*(i) is known as the best matching unit (Best Matching Unit, BMU) of the i-th iteration; its connection weight vector is the most similar to the input data of the i-th iteration. To further reduce the error between the weight vector of the BMU and the input data, the weight-update process adjusts the current weights according to the error computed in S9. Since the weight difference between the BMU and increasingly distant neurons grows, only the weight vectors of neurons near the BMU, within an adjustable distance range, are updated; this is expressed in S8 by the neighborhood function Λ(j, j*(i), i). In general, Λ(j, j*(i), i) is taken to be a Gaussian function, expressed as:
where
Then, the weight vectors are updated according to step S10, and the final weights are obtained after I iterations;
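For concreteness, the following sketch implements the training procedure S1-S11 above for a Q × Q competition layer. Because the error and weight-update formulas themselves are not reproduced in this text, the sketch uses the standard SOM update w_j ← w_j + α·Λ·(x − w_j) and an exponentially decaying neighborhood scale as assumptions.

```python
import numpy as np

def train_som(x_m, q=5, iterations=500, eta=1.0, seed=0):
    """SOM training sketch (steps S1-S11). x_m: array of shape (M, d) of AEN
    feature vectors. Returns the weights W (L x d) and the 2-D neuron coordinates."""
    rng = np.random.default_rng(seed)
    l, d = q * q, x_m.shape[1]
    w = rng.random((l, d))                                    # S1: random initialization
    coords = np.array([(r, c) for r in range(q) for c in range(q)], dtype=float)
    sigma0 = q / 2.0                                          # S3: initial neighborhood scale
    for i in range(1, iterations + 1):                        # S4/S11: I iterations
        x = x_m[rng.integers(len(x_m))]                       # S5: random sample X(i)
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))        # S6: best matching unit j*(i)
        alpha = np.exp(-eta * (i - 1))                        # S7: alpha(i) = alpha(0) e^(-eta(i-1))
        sigma = sigma0 * np.exp(-eta * (i - 1))               # assumed decay of sigma_Lambda(i)
        dist2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        nbh = np.exp(-dist2 / (2.0 * sigma ** 2 + 1e-12))     # S8: Gaussian neighborhood Lambda
        w += alpha * nbh[:, None] * (x - w)                   # S9-S10: error-driven weight update
    return w, coords
```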
Step 3.2: after the SOM network has been trained, the range profile feature vectors extracted by the AEN in step 2 are mapped onto the coordinates of their BMUs in the competition layer. Through the trained SOM network, two similar actions are thus mapped, via their range profile feature vectors, into two sentences with similar semantics, which appear as two similar three-dimensional spatial trajectories.
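A short sketch of this step 3.2 mapping follows: each feature vector in the time-ordered sequence of one action is replaced by the index (or, equivalently, the 2-D coordinate) of its BMU, producing the integer sequence that carries the action "semantics". The function name is an illustrative assumption.

```python
import numpy as np

def action_to_integer_sequence(feature_sequence, w):
    """Map the time-ordered AEN feature vectors of one action to the indices of
    their best matching units in the trained SOM competition layer."""
    return [int(np.argmin(np.linalg.norm(w - x, axis=1))) for x in feature_sequence]

# Example usage (with train_som from the sketch above):
# w, coords = train_som(training_features, q=5, iterations=500, eta=1.0)
# sequence = action_to_integer_sequence(test_action_features, w)
# trajectory = [coords[idx] for idx in sequence]   # 2-D BMU coordinates over slow time
```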
The beneficial effects of the present invention are as follows:
The invention proposes a through-wall human action characterization method based on an autoencoder-self-organizing map network. By analogy with natural language, each single-cycle range profile is treated as a word and the hidden human action range profile data over multiple time frames are treated as a sentence, so that the echo signal of through-wall human motion is interpreted as a simple integer sequence carrying action semantics, which simplifies the representation of through-wall human actions while preserving its validity.
Brief description of the drawings
Fig. 1 is a schematic diagram of the scenario in which a single-transmitter single-receiver through-wall radar detects human actions behind a wall;
Fig. 2 is a schematic diagram of the through-wall human action characterization framework based on the autoencoder-self-organizing map network;
Fig. 3 is a schematic diagram of the autoencoder network structure;
Fig. 4 is a schematic diagram of the self-organizing map network structure;
Fig. 5 is a schematic diagram of the experimental scene in the specific embodiment;
Fig. 6 shows range profile examples of the four action types in the specific embodiment: (a) boxing; (b) leg raising without arm swinging; (c) picking up; (d) raising and lowering both arms in parallel;
Fig. 7 shows, for the specific embodiment, examples of the raw range profiles of the four action types, the range profiles estimated by the AEN, and the extracted features: (a)-(d) are the raw range profiles of the four actions boxing, leg raising without arm swinging, picking up, and raising and lowering both arms in parallel; (e)-(h) are the range profiles estimated by the trained AEN for (a)-(d); (i)-(l) are the learned 16-dimensional range profile feature vectors corresponding to (a)-(d);
Fig. 8 shows the final sequence representations of test samples of the four action types together with the corresponding raw range profiles: (a)-(d) are the raw range profiles of the four actions boxing, leg raising without arm swinging, picking up, and raising and lowering both arms in parallel; (e)-(h) are the converted integer sequences corresponding to (a)-(d), visualized in three dimensions.
Specific embodiment
A specific embodiment of the invention, based on a concrete set of experimental data, is given below:
The specific embodiment uses a self-developed single-transmitter single-receiver through-wall radar transmitting a stepped-frequency signal from 1.6 GHz to 2.2 GHz to perform through-wall human action detection. Data were collected in Scientific Research Building A of the University of Electronic Science and Technology of China to establish the data set. The experimental scene is shown in Fig. 5. Four volunteers were invited for data collection; each stood within the radar's field of view at a distance of 1.5 m from the wall and performed the following four actions: (a) boxing; (b) leg raising without arm swinging; (c) bending over to pick up; (d) raising and lowering both arms in parallel. Fig. 6 gives range profile examples of the four action types. Each subject repeated each action 54 times, with a different duration each time, and the corresponding range profile data were obtained after pre-processing the radar echoes.
Step 1: radar echo data processing
Step 1.1: radar echo signal modeling
A single-transmitter single-receiver through-wall radar is used to detect the movement of a single human target hidden behind a wall of thickness d_w = 70 cm. The transmitted stepped-frequency signal has K = 301 frequency points, each lasting T = 100 μs, with start frequency f_0 = 1.6 GHz and frequency step Δf = 2 MHz. The transmitted stepped-frequency waveform s(t) is expressed as
where the function rect(·) is
The echo reflected at time t by the target at distance R_tar from the radar can then be expressed as
The echo s_r(t) is mixed with s(t); after the second- and higher-order harmonic frequencies are filtered out, the result is sampled at interval T = 10^-4 s, giving the following discrete vector
s_r = [s_{r,0}, s_{r,1}, …, s_{r,300}]^T    (19)
where
ψ_k denotes the clutter and noise at the k-th sampling instant.
The sampled vector s_r is passed through a Hamming window to suppress sidelobe levels and zero-filled from 0 Hz up to f_0 = 1.6 GHz at interval Δf = 2 MHz, with K_0 = f_0/Δf = 1.6 GHz / 2 MHz = 800 zeros; an N = 8192-point inverse fast Fourier transform is then applied, yielding the single-cycle range profile in the slow-time dimension.
Step 1.2: on the basis of the range profile obtained in step 1.1, the MTI algorithm, normalization and thresholding are used to improve the signal-to-noise ratio of the range profile data. For the collected sample database, the threshold is set as:
In addition, in order to make the action characterization produced by the framework robust, data augmentation is performed by adding additive white Gaussian noise at 11 different levels to the original radar echoes, extending the signal-to-noise ratio from 8 dB to 18 dB in steps of 1 dB. The data set therefore contains (1+11) × 54 × 4 × 4 = 10368 samples.
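The data augmentation described here can be sketched as follows; the noise is drawn as complex white Gaussian noise scaled to a prescribed SNR, and the function name is an illustrative assumption.

```python
import numpy as np

def add_awgn(echo, snr_db, rng=None):
    """Add complex additive white Gaussian noise to a radar echo at the given SNR in dB."""
    rng = rng or np.random.default_rng()
    signal_power = np.mean(np.abs(echo) ** 2)
    noise_power = signal_power / (10.0 ** (snr_db / 10.0))
    noise = np.sqrt(noise_power / 2.0) * (rng.standard_normal(echo.shape)
                                          + 1j * rng.standard_normal(echo.shape))
    return echo + noise

# 11 noisy copies per echo, SNR from 8 dB to 18 dB in 1 dB steps
# augmented = [add_awgn(echo, snr) for snr in range(8, 19)]
```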
Step 1.3: range truncation of the single-cycle range profile. Since each test subject performs each action type at a distance of 1.5 m and an 8192-point inverse fast Fourier transform is used, the range profile of each example is truncated to 512 points per cycle in order to reduce the computational load and make the data meet practical needs. The correspondence between the number of retained points and the actual range is
Step 2: range profile feature pre-extraction based on the AEN
The collected data set is divided into a training sub-data set X_M containing 8448 examples and a test sub-data set containing 1920 examples.
An AEN is constructed to reduce the dimension of the pre-processed range profiles and to perform feature extraction. Four-fold cross-validation is used to select suitable hyperparameters for the AEN. For AEN training, the Adadelta algorithm with a learning rate of 1.0 is chosen as the optimizer, and the MSE function is chosen as the loss between the raw range profile data and the AEN estimate. The best hyperparameters finally chosen are: 3 encoding dense layers with 256, 128 and 64 neurons respectively, and 1 feature layer with 16 neurons.
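The hyperparameter selection described here could be run as in the sketch below, which reuses the build_aen() function from the sketch in step 2 of the description together with scikit-learn's KFold splitter; the epoch count and batch size are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf
from sklearn.model_selection import KFold

def cross_validate_aen(build_fn, s_rp_train, folds=4, epochs=50):
    """Four-fold cross-validation of an AEN configuration: returns the mean
    validation MSE over the folds, used to compare hyperparameter choices."""
    losses = []
    for train_idx, val_idx in KFold(n_splits=folds, shuffle=True).split(s_rp_train):
        aen, _ = build_fn()                                   # fresh model per fold
        aen.compile(optimizer=tf.keras.optimizers.Adadelta(learning_rate=1.0), loss='mse')
        aen.fit(s_rp_train[train_idx], s_rp_train[train_idx],
                epochs=epochs, batch_size=64, verbose=0)
        losses.append(aen.evaluate(s_rp_train[val_idx], s_rp_train[val_idx], verbose=0))
    return float(np.mean(losses))
```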
Fig. 7 shows examples of the converted low-dimensional range profiles for the four motion categories. In Fig. 7, each range profile is converted into a short, fixed-length vector, and similar range profiles correspond to similar features. The AEN can therefore reduce the size of the raw range profile without significant loss of information.
Step 3: time-sequential characterization of the human action type based on the SOM network
The size of the competition layer in the SOM network is set to 5 × 5, because a larger size brings a huge computational cost and a mode-separation problem, while a smaller size gives the SOM network a lower pattern-recognition capability. The features extracted from the AEN, corresponding to the training data, are used as input, and TensorFlow is used to build the computation graph on two NVIDIA GeForce GTX 1080 Ti GPUs, each with 11 GB of memory, to speed up the training of the constructed SOM network.
Data: training sub-data set X_M; the number of samples in the training data set, M = 8448;
Input: competition layer size L = 5 × 5; number of iterations I = 500; learning decay rate η = 1;
Output: the weights W of the SOM network;
S1. Randomly initialize W_j(i), where j = 1, …, L and i = 0;
S2. Initialize the learning rate α(i) = 1, i = 0;
S3. Initialize the neighborhood scale σ_Λ(i) = 5/2, i = 0;
S4. For the i-th iteration:
S5. Draw a data sample X(i) at random from X_M;
S6. Solve for the best matching unit j*(i)
S7. Update the learning rate α(i) = α(0)·e^(−η(i−1));
S8. Update the neighborhood function Λ(j, j*(i), i);
S9. Compute the error
S10. Update the weights
S11. Repeat steps S5–S10 for I iterations.
During training, the maximum number of epochs is set to 500 and the initial learning rate to 1.0. After 500 training iterations, the SOM network has learned the similarity between any two range profiles and has established an index for each raw range profile datum. Therefore, when a person performs a specific type of motion, the corresponding consecutive range profile data can be converted into a series of integers, which constitute the "semantics" of the action.
To verify its validity, the proposed framework was evaluated on the test data set. Fig. 8 shows the final sequence representations of the four through-wall action types and the corresponding raw test range profiles. From Fig. 8 it can be seen that (a) any original time-series range profile of each motion type can be converted into an integer sequence extracted by the proposed AEN-SOM framework, and (b) different motion types correspond to different integer sequences. The proposed through-wall human motion representation framework is therefore effective.
Table 1  Radar parameters in the specific embodiment
Parameter                    Value
Frequency range              1.6 GHz to 2.2 GHz
Number of frequency points   301
Frequency point duration     100 μs
Single-cycle duration        30.1 ms

Claims (1)

1. A through-wall radar human action characterization method based on a self-organizing map network, comprising the following steps:
Step 1: radar echo data processing
Step 1.1: radar echo signal modeling
A single-transmitter single-receiver through-wall radar is used to detect the movement of a single human target hidden behind a wall of a given thickness. The stepped-frequency waveform transmitted by the radar is s(t), expressed as
where K denotes the total number of frequency points of the stepped-frequency signal, k denotes the frequency-point index, T denotes the duration of each frequency point, f_0 denotes the start frequency, Δf denotes the frequency step, and the function rect(·) is
The echo reflected at time t by the target at distance R_tar from the radar is then expressed as
where p indexes the p-th of the P strong scattering points on the target surface, σ_p denotes the scattering intensity of the p-th strong scattering point, τ_p denotes the round-trip time delay between the p-th strong scattering point and the radar, and ψ(t) denotes clutter and noise;
The echo s_r(t) is mixed with s(t); after the second- and higher-order harmonic frequencies are filtered out, the result is sampled at interval T, giving the following discrete vector
s_r = [s_{r,0}, s_{r,1}, …, s_{r,K-1}]^T    (4)
where
ψ_k denotes the clutter and noise at the k-th sampling instant;
The vector s_r in formula (4) is passed through a Hamming window to suppress range sidelobe levels and zero-filled from 0 Hz up to f_0 Hz at interval Δf; an N-point inverse fast Fourier transform is then applied, yielding the range profile of one slow-time cycle (hereafter referred to as the single-cycle range profile), expressed as
s_RP = [s_RP(0), s_RP(1), …, s_RP(N-1)]^T    (6)
where
s_RP(n) denotes the range profile datum corresponding to the n-th range cell, n denotes the range-cell index with value range [0, N-1], K_0 denotes the number of zeros filled from 0 Hz to f_0 Hz, and N denotes the total number of points of the inverse fast Fourier transform.
Step 1.2: on the basis of the single-cycle range profile obtained in step 1.1, a moving target detection algorithm, normalization and thresholding are used to improve the signal-to-noise ratio of the range profile data; the threshold can be calculated by the following formula:
where the quantities in the formula denote, respectively, the maximum noise level in decibels, the maximum decibel value of all sample data, and the minimum decibel value of all sample data;
Step 1.3: range truncation of the single-cycle range profile. To reduce the computational load and make the data meet practical needs, the single-cycle range profile is truncated; the number of retained points is calculated as
where N_d denotes the number of retained range profile points, R_tar,max denotes the farthest distance at which the hidden human target's movement occurs, R_radar,max denotes the farthest detectable one-way range of the through-wall radar, and N denotes the total number of points of the inverse fast Fourier transform;
Step 2: range profile feature pre-extraction based on the AEN;
Step 2.1: training the AEN:
The AEN is built from multiple fully connected layers and consists of an encoding sub-network and a decoding sub-network. The two parts are mirror-symmetric in structure and share a common layer called the feature layer. When a single-cycle range profile action datum is input, the encoding sub-network applies its weights to the data and produces an encoded feature at its output; the decoding sub-network then applies its weights to this encoded feature and attempts to recover the original single-cycle range profile action datum. Therefore, by adjusting the weights in the AEN so that the error between the output and the input of the AEN is minimized, an effective low-dimensional feature representation of the single-cycle range profile action data is obtained at the feature layer;
Let the range profile of the m-th slow-time cycle be denoted s_RP[m]. When it is input into the AEN, the output of the l-th hidden layer of the AEN is expressed as
where the first two quantities denote the output and the input of the l-th hidden layer respectively, W^[l-1,l] denotes the connection weight matrix between the (l-1)-th and the l-th hidden layer, b^[l-1,l] denotes the bias vector of the l-th hidden layer, and f denotes the activation function; the ReLU function is chosen as the activation function, expressed as
Therefore, training the AEN is equivalent to solving the following optimization problem
where W and b denote all the weights and all the bias vectors in the AEN respectively; the function dist(x, y) denotes the distance between x and y, usually taken to be the mean squared error function; and AEN(s_RP[m], W, b) denotes the estimate of the input s_RP[m] under the weights W and biases b;
Step 2.2: after the AEN training is complete, only the encoding sub-network of the AEN is retained; dimensionality reduction and feature extraction of the single-cycle range profile action data are achieved by taking the corresponding output of the AEN feature layer;
Step 3: time-sequential characterization of the human action type based on the SOM network
Step 3.1: training the SOM network:
Let the data set formed by the range profile feature vectors extracted from the trained AEN contain M samples;
Data: training data set X_M; M, the number of samples in the training data set;
Input: competition layer size L = Q × Q; number of iterations I; learning decay rate η;
Output: the weights W of the SOM network;
S1. Randomly initialize W_j(i), where j = 1, …, L and i = 0;
S2. Initialize the learning rate α(i) = 1, i = 0;
S3. Initialize the neighborhood scale σ_Λ(i) = Q/2, i = 0;
S4. For the i-th iteration, perform the following:
S5. Draw a data sample X(i) at random from X_M;
S6. Solve for the best matching unit j*(i)
S7. Update the learning rate α(i) = α(0)·e^(−η(i−1));
S8. Update the neighborhood function Λ(j, j*(i), i);
S9. Compute the error
S10. Update the weights
S11. Repeat steps S5–S10 for I iterations.
Here j*(i) is known as the best matching unit (BMU) of the i-th iteration; its connection weight vector is the most similar to the input data of the i-th iteration. To further reduce the error between the weight vector of the BMU and the input data, the weight-update process adjusts the current weights according to the error computed in S9. Since the weight difference between the BMU and increasingly distant neurons grows, only the weight vectors of neurons near the BMU, within an adjustable distance range, are updated; this is expressed in S8 by the neighborhood function Λ(j, j*(i), i). In general, Λ(j, j*(i), i) is taken to be a Gaussian function, expressed as:
where
Then, the weight vectors are updated according to step S10, and the final weights are obtained after I iterations;
Step 3.2: after the SOM network has been trained, the range profile feature vectors extracted by the AEN in step 2 are mapped onto the coordinates of their BMUs in the competition layer. Through the trained SOM network, two similar actions are thus mapped, via their range profile feature vectors, into two sentences with similar semantics, which appear as two similar three-dimensional spatial trajectories.
CN201910316233.5A 2019-04-19 2019-04-19 Through-wall radar human body action characterization method based on self-organizing mapping network Active CN110111360B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910316233.5A CN110111360B (en) 2019-04-19 2019-04-19 Through-wall radar human body action characterization method based on self-organizing mapping network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910316233.5A CN110111360B (en) 2019-04-19 2019-04-19 Through-wall radar human body action characterization method based on self-organizing mapping network

Publications (2)

Publication Number Publication Date
CN110111360A true CN110111360A (en) 2019-08-09
CN110111360B CN110111360B (en) 2022-05-03

Family

ID=67485775

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910316233.5A Active CN110111360B (en) 2019-04-19 2019-04-19 Through-wall radar human body action characterization method based on self-organizing mapping network

Country Status (1)

Country Link
CN (1) CN110111360B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111796272A (en) * 2020-06-08 2020-10-20 桂林电子科技大学 Real-time gesture recognition method and computer equipment for through-wall radar human body image sequence

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090262005A1 (en) * 2005-04-14 2009-10-22 L-3 Communications Cyterra Corporation Moving-entity detection
US20100026550A1 (en) * 2007-07-17 2010-02-04 Rosenbury Erwin T Handheld Instrument Capable of Measuring Heartbeat and Breathing Motion at a Distance
CN106127110A (en) * 2016-06-15 2016-11-16 中国人民解放军第四军医大学 A kind of human body fine granularity motion recognition method based on UWB radar with optimum SVM
CN107132512A (en) * 2017-03-22 2017-09-05 中国人民解放军第四军医大学 UWB radar human motion micro-Doppler feature extracting method based on multichannel HHT
CN107219522A (en) * 2017-05-08 2017-09-29 电子科技大学 A kind of united through-wall radar object localization method of ellipse-hyperbolic
CN107657243A (en) * 2017-10-11 2018-02-02 电子科技大学 Neutral net Radar range profile's target identification method based on genetic algorithm optimization
CN108776336A (en) * 2018-06-11 2018-11-09 电子科技大学 A kind of adaptive through-wall radar static human body object localization method based on EMD

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090262005A1 (en) * 2005-04-14 2009-10-22 L-3 Communications Cyterra Corporation Moving-entity detection
US20100026550A1 (en) * 2007-07-17 2010-02-04 Rosenbury Erwin T Handheld Instrument Capable of Measuring Heartbeat and Breathing Motion at a Distance
CN106127110A (en) * 2016-06-15 2016-11-16 中国人民解放军第四军医大学 A kind of human body fine granularity motion recognition method based on UWB radar with optimum SVM
CN107132512A (en) * 2017-03-22 2017-09-05 中国人民解放军第四军医大学 UWB radar human motion micro-Doppler feature extracting method based on multichannel HHT
CN107219522A (en) * 2017-05-08 2017-09-29 电子科技大学 A kind of united through-wall radar object localization method of ellipse-hyperbolic
CN107657243A (en) * 2017-10-11 2018-02-02 电子科技大学 Neutral net Radar range profile's target identification method based on genetic algorithm optimization
CN108776336A (en) * 2018-06-11 2018-11-09 电子科技大学 A kind of adaptive through-wall radar static human body object localization method based on EMD

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
MINGMIN ZHAO et al.: "Through-Wall Human Pose Estimation Using Radio Signals", 2018 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION *
XIAOLIN LIANG et al.: "Through-wall human being detection using UWB impulse radar", EURASIP JOURNAL ON WIRELESS COMMUNICATIONS AND NETWORKING *
FU QINGXIA et al.: "Moving target detection for through-wall radar based on cascaded wavelet and EMD", RADAR SCIENCE AND TECHNOLOGY *
LI SONGLIN et al.: "Imaging and tracking algorithm for moving human targets with through-wall radar based on improved Camshift", JOURNAL OF COMPUTER APPLICATIONS *
JIA YONG et al.: "Multi-view building layout imaging with through-wall radar", JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111796272A (en) * 2020-06-08 2020-10-20 桂林电子科技大学 Real-time gesture recognition method and computer equipment for through-wall radar human body image sequence

Also Published As

Publication number Publication date
CN110111360B (en) 2022-05-03

Similar Documents

Publication Publication Date Title
CN109522857B (en) People number estimation method based on generation type confrontation network model
CN109993280B (en) Underwater sound source positioning method based on deep learning
CN108226892B (en) Deep learning-based radar signal recovery method in complex noise environment
CN109389058B (en) Sea clutter and noise signal classification method and system
CN109683161B (en) Inverse synthetic aperture radar imaging method based on depth ADMM network
CN104459668A (en) Radar target recognition method based on deep learning network
CN104820993B (en) It is a kind of to combine particle filter and track the underwater weak signal target tracking for putting preceding detection
Liu et al. Deep learning and recognition of radar jamming based on CNN
CN104899567A (en) Small weak moving target tracking method based on sparse representation
CN105738888B (en) Bicharacteristic offshore floating small target detecting method based on ocean clutter cancellation
CN112861813B (en) Method for identifying human behavior behind wall based on complex value convolution neural network
CN110109080A (en) Method for detecting weak signals based on IA-SVM model
CN103714331A (en) Facial expression feature extraction method based on point distribution model
Qu et al. Human activity recognition based on WRGAN-GP-synthesized micro-Doppler spectrograms
CN110933633A (en) Onboard environment indoor positioning method based on CSI fingerprint feature migration
CN110223342B (en) Space target size estimation method based on deep neural network
Zhang et al. Multi-source information fused generative adversarial network model and data assimilation based history matching for reservoir with complex geologies
CN109086667A (en) Similar active recognition methods based on intelligent terminal
CN116269259A (en) Human body breathing and heartbeat detection method based on improved BP neural network
CN110275147A (en) Human behavior micro-Doppler classification and identification method based on migration depth neural network
Shao et al. Deep learning methods for personnel recognition based on micro-Doppler features
CN114428234A (en) Radar high-resolution range profile noise reduction identification method based on GAN and self-attention
CN110111360A (en) A kind of through-wall radar human action characterizing method based on self-organized mapping network
Wang et al. Through-wall human motion representation via autoencoder-self organized mapping network
Li et al. Sea/land clutter recognition for over-the-horizon radar via deep CNN

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant