CN110275163A - A kind of millimetre-wave radar detection target imaging method neural network based - Google Patents

A kind of millimetre-wave radar detection target imaging method neural network based

Info

Publication number
CN110275163A
CN110275163A CN201910574031.0A
Authority
CN
China
Prior art keywords
millimetre-wave radar
matrix
convolution
layer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910574031.0A
Other languages
Chinese (zh)
Other versions
CN110275163B (en)
Inventor
张雷
张博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingsi Microelectronics Nanjing Co ltd
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201910574031.0A priority Critical patent/CN110275163B/en
Publication of CN110275163A publication Critical patent/CN110275163A/en
Application granted granted Critical
Publication of CN110275163B publication Critical patent/CN110275163B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/10 Image enhancement or restoration using non-spatial domain filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/08 Indexing scheme for image data processing or generation, in general, involving all processing steps from image acquisition to 3D model generation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20056 Discrete and fast Fourier transform, [DFT, FFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geometry (AREA)
  • Electromagnetism (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention relates to a neural-network-based millimeter-wave radar array imaging method, belonging to the technical field of autonomous driving. The method first acquires raw signal data through a millimeter-wave radar array and environment point cloud data through a lidar. An initial feature matrix is obtained from the raw signals by Fourier transform, and point cloud data with the installation position of the millimeter-wave radar array as the coordinate origin is obtained by applying a coordinate transform to the lidar point cloud. A millimeter-wave radar array imaging model is then constructed using a convolutional neural network and trained with the feature matrix of the millimeter-wave radar signals and the origin-transformed point cloud data. By building the millimeter-wave radar array imaging model with a neural network, the method solves the problem that radar antenna sidelobe interference signals are difficult to decouple in conventional radar signal modeling methods.

Description

Millimeter wave radar detection target imaging method based on neural network
Technical field
The present invention relates to a neural-network-based millimeter-wave radar target detection and imaging method, and belongs to the technical field of autonomous driving.
Background art
In recent years, environment sensing for autonomous driving systems has become a research hotspot. Millimeter-wave radar has advantages such as all-weather operation and low cost, and is commonly used for target ranging, velocity measurement, and angle measurement. Lidar has the advantage of high precision and is used for point cloud imaging, but it is expensive and difficult to mass-produce. To address this, a small number of enterprises worldwide are researching methods that use millimeter-wave radar arrays to realize point cloud imaging. Because a millimeter-wave radar antenna cannot completely suppress sidelobe signals during the design process, the signals received by a millimeter-wave radar array often contain sidelobe crosstalk. Models for decoupling the signals of different antennas generally depend on the actual conditions of the millimeter-wave radar product and its antennas, and traditional radar signal processing algorithms have difficulty decoupling the sidelobe crosstalk signals. A more universal method is therefore needed to quickly realize the mapping from millimeter-wave radar array signals to point cloud imaging data and target detection data.
Neural network algorithms are widely used to construct system models with complex mapping relations between input and output data. Their advantage is that no principle-level analysis of the mapping relation is required: through data training, they can exceed the performance of traditional system modeling methods. Their disadvantage is that training a neural network requires a large amount of data in order to accurately compute the loss value of each training iteration; in supervised learning, these data samples generally require manual labeling.
Summary of the invention
The purpose of the present invention is to propose a neural-network-based millimeter-wave radar target detection and imaging method that exploits the advantages of neural networks and millimeter-wave radar. Point cloud data in the same working environment as the millimeter-wave radar is acquired, transformed into the corresponding coordinate system, and used as matched samples to train the neural network, realizing a high-precision mapping from radar data to radar point cloud data and target detection data.
The neural-network-based millimeter-wave radar target detection and imaging method proposed by the present invention comprises the following steps:
(1) Acquire millimeter-wave radar data and preprocess it to obtain a feature matrix R of the millimeter-wave radar, specifically comprising the following steps:
(1-1) Place the millimeter-wave radar at the origin o of a rectangular coordinate system xyz. The millimeter-wave radar is equipped with N groups of antennas; the angle between the detection direction of the i-th antenna group of the millimeter-wave radar and the plane xoy of the coordinate system is set to α_i.
(1-2) Acquire the raw signal Z_i of T millimeter-wave radar transmissions issued by the i-th antenna group of the millimeter-wave radar.
(1-3) Apply a Fourier transform to the raw signal Z_i of step (1-2) to obtain a matrix F_i, a K × T matrix where K is the raw signal length; apply a two-dimensional Fourier transform to F_i to obtain the feature matrix r_i.
(1-4) Traverse the N antenna groups of the millimeter-wave radar, repeating step (1-3) to obtain the feature matrices of all N groups; combine all feature matrices into the feature matrix R of the millimeter-wave radar, i.e. R = {r_1; r_2; r_3; …; r_N}, where R is a K × T × N matrix.
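As an illustration, the preprocessing of step (1) can be sketched in a few lines of NumPy. The per-antenna FFT followed by a two-dimensional FFT follows the description above; the array layout, variable names, and test dimensions are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def feature_matrix(raw_signals):
    """Build the radar feature matrix R from raw antenna signals (step (1)).

    raw_signals: complex array of shape (N, K, T) -- N antenna groups,
    K samples per transmission, T transmissions (layout is an assumption).
    Returns R with shape (K, T, N).
    """
    N, K, T = raw_signals.shape
    R = np.empty((K, T, N), dtype=complex)
    for i in range(N):
        # Fourier transform of the raw signal Z_i gives the K x T matrix F_i
        F_i = np.fft.fft(raw_signals[i], axis=0)
        # A further two-dimensional Fourier transform yields r_i (step (1-3))
        R[:, :, i] = np.fft.fft2(F_i)
    return R

rng = np.random.default_rng(0)
R = feature_matrix(rng.standard_normal((4, 64, 32))
                   + 1j * rng.standard_normal((4, 64, 32)))
print(R.shape)  # (64, 32, 4)
```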
(2) Set a data collection point and acquire point cloud data of the detection target to construct a three-dimensional point cloud matrix P, specifically comprising the following steps:
(2-1) Set a data collection point H located in the rectangular coordinate system xyz of step (1-1) at coordinates x = 0, y = λ, z = 0. Acquire the point cloud data E of the detection target from point H. E contains M point cloud position entries, with M set according to the required detection accuracy, M > 100. The position of the ε-th point p_ε among the M entries is expressed as (l_ε, β_ε, φ_ε), where β_ε is the angle between the line from point H to point p_ε and the plane xoy of the coordinate system xyz, ε = 1, 2, …, M, and β_ε is restricted to one of the angles α_i, i = 1, 2, …, N; φ_ε is the angle between the line from point H to point p_ε and the plane yoz; l_ε is the distance from point H to point p_ε.
(2-2) Using the following formula, process each of the M position entries (l_ε, β_ε, φ_ε) to obtain M new position entries:
β_new,ε = β_ε
(2-3) Among the M new position entries of step (2-2), since β_new,ε = β_ε, β_new,ε can only take N values. According to the value of β_new,ε, divide the M new position entries of step (2-2) into N groups of M/N entries each, and arrange the data into the three-dimensional point cloud matrix P.
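The grouping of step (2-3) can be sketched as follows. The exact layout of the matrix P is not fully legible in the source text, so an (N, M/N, 3) arrangement is assumed here; names and dimensions are illustrative.

```python
import numpy as np

def point_cloud_matrix(points, alphas):
    """Group point cloud entries by their angle beta, as in step (2-3).

    points: array of shape (M, 3) with columns (l, beta, phi); each beta
    equals one of the N antenna angles in `alphas`, so the M entries split
    into N groups of M/N entries.  The (N, M/N, 3) layout of P is an
    assumption -- the source does not legibly state the dimensions.
    """
    groups = [points[points[:, 1] == a] for a in alphas]
    return np.stack(groups)  # shape (N, M // N, 3)

alphas = np.array([0.0, 0.5, 1.0])           # illustrative antenna angles
pts = np.array([[d, a, 0.1] for a in alphas for d in (1.0, 2.0)])
P = point_cloud_matrix(pts, alphas)
print(P.shape)  # (3, 2, 3)
```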
(3) Construct a convolutional neural network and use it to perform automatic feature extraction on the three-dimensional feature matrix R of step (1); the final output is a point cloud estimate, constituting the millimeter-wave radar target detection imaging model, specifically comprising the following steps:
(3-1) Construct the input layer of the convolutional neural network. The input matrix is the K × T × N three-dimensional feature matrix R of step (1). The input-layer convolutional network has 100 convolution kernels; the g-th kernel of the input layer is denoted W_g0, g = 1, 2, 3, …, 100. Each kernel has size 3 × 3 × N and convolution stride 1, and the activation function of the input-layer convolutional network is the RELU function. The output matrix D^1 of the input-layer convolutional network has size K × T × 100; the operation of each kernel is:
D_g = RELU(R * W_g0)
where * is the convolution operator, the kernel W_g0 is a parameter to be trained, D_g is the output of the convolution with each kernel, g = 1, 2, …, 100, and all D_g combine into D^1.
(3-2) Construct the middle layers of the convolutional neural network, comprising Q convolutional layers. The input matrix of the q-th layer is the output matrix D^(q-1) of layer q-1, q = 2, …, Q; the input matrix of the first middle layer is the output matrix D^1 of the input layer. Each layer has 100 convolution kernels; the g-th kernel of layer q is denoted W_gq, with kernel size 3 × 3 × 100 and convolution stride 1. The activation function of each layer is the RELU function, and the output matrix D^q of layer q has size K × T × 100. The operation of each kernel of layer q is:
D_g^q = RELU(D^(q-1) * W_gq)
where * is the convolution operator, the kernel W_gq is a parameter to be trained, D_g^q is the output of the convolution with each kernel, g = 1, 2, …, 100, and all D_g^q combine into D^q.
(3-3) Construct the output layer of the convolutional neural network, a single convolutional layer whose input matrix is the output matrix D^Q of the last middle layer. The output layer has 3M convolution kernels, denoted W_e, e = 1, 2, …, 3M, each of size K × T × 100. The operation of each kernel is:
D_e = D^Q * W_e
where * is the convolution operator, the kernel W_e is a parameter to be trained, D_e is the result of the convolution with each kernel, e = 1, 2, …, 3M, and all D_e combine into a three-dimensional matrix, which is the output of the millimeter-wave radar target detection imaging model.
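A sketch of the network of steps (3-1)-(3-3), written here with PyTorch. "Same" padding is an assumption (the patent states that the K × T map size is preserved at stride 1 but names no padding scheme), and the small hyper-parameter values are only for illustration.

```python
import torch
import torch.nn as nn

class RadarImagingNet(nn.Module):
    """Sketch of the patent's CNN: an input layer and Q middle layers of
    3x3 convolutions with stride 1 and RELU activation (100 kernels each
    in the patent), then an output layer whose 3M kernels each span the
    whole K x T map, so the network emits 3M scalars -- the point cloud
    estimate.  Padding and hyper-parameters here are assumptions."""

    def __init__(self, N, K, T, M, Q=2, channels=16):
        super().__init__()
        layers = [nn.Conv2d(N, channels, 3, stride=1, padding=1), nn.ReLU()]  # (3-1)
        for _ in range(Q):                                                    # (3-2)
            layers += [nn.Conv2d(channels, channels, 3, stride=1, padding=1),
                       nn.ReLU()]
        self.features = nn.Sequential(*layers)
        # Output layer (3-3): 3M kernels covering the full K x T map.
        self.head = nn.Conv2d(channels, 3 * M, kernel_size=(K, T))

    def forward(self, R):                       # R: (batch, N, K, T)
        return self.head(self.features(R)).flatten(1)   # (batch, 3M)

net = RadarImagingNet(N=4, K=16, T=8, M=6)
out = net(torch.randn(2, 4, 16, 8))
print(out.shape)  # torch.Size([2, 18])
```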
(4) Repeat steps (1) and (2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P; iteratively train the convolutional neural network of step (3) using gradient descent to obtain the millimeter-wave radar target detection imaging model, specifically comprising the following steps:
(4-1) Repeat steps (1)-(2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P.
(4-2) Using gradient descent, traverse the s sample pairs of the feature matrix R and the point cloud matrix P collected in (4-1), repeating step (3); train on the training set of step (4-1) to obtain the parameters to be trained of the model of step (3), namely W_g0, W_gq and W_e, and the millimeter-wave radar target detection imaging model corresponding to these parameters.
(4-3) Repeat step (4-2) η times, 50 ≤ η ≤ 100; the millimeter-wave radar target detection imaging model corresponding to the final training parameters W_g0, W_gq and W_e is the final millimeter-wave radar array imaging model.
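The training procedure of step (4) amounts to repeated gradient descent passes over the s sample pairs. A minimal sketch, assuming an MSE loss (the patent only says a loss value is computed per iteration) and a stand-in linear model in place of the CNN:

```python
import torch

def train(model, samples, eta=5, lr=1e-2):
    """Plain gradient descent over the s sample pairs (R, P), repeated
    eta times.  Step (4-3) prescribes 50 <= eta <= 100; a small eta is
    used here only to keep the demo fast.  The MSE loss is an assumption."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(eta):                 # step (4-3): repeat the pass eta times
        for R, P in samples:             # step (4-2): traverse the s samples
            opt.zero_grad()
            loss = torch.mean((model(R) - P) ** 2)
            loss.backward()
            opt.step()
    return model

torch.manual_seed(0)
model = torch.nn.Linear(8, 6)            # stand-in for the CNN of step (3)
samples = [(torch.randn(2, 8), torch.randn(2, 6)) for _ in range(4)]
train(model, samples)
```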
(5) Use the final millimeter-wave radar target detection imaging model obtained in (4) to realize imaging of the millimeter-wave radar detection target.
The neural-network-based millimeter-wave radar target detection and imaging method proposed by the present invention has the following advantages:
1. Traditional radar target imaging methods have difficulty, during signal processing, resolving interference from antenna sidelobe noise. The neural-network-based millimeter-wave radar target detection and imaging method of the present invention does not need to consider the interference caused by antenna sidelobe noise and realizes self-decoupling of the interference noise.
2. By training a neural network, the method of the present invention realizes an automatic mapping between the signals of different antennas and the target image, simplifying the millimeter-wave radar target detection imaging process.
Brief description of the drawings
Fig. 1 and Fig. 2 are schematic diagrams of the position of the millimeter-wave radar in the rectangular coordinate system involved in the method of the present invention.
Fig. 3 is a schematic diagram of the data collection point involved in the method of the present invention.
In Fig. 1-Fig. 3, 1 is the millimeter-wave radar and 2 is an antenna; the angle between the detection direction of antenna 2 and the plane xoy is α_i; β_ε is the angle between the line from point H to point p_ε and the plane xoy of the rectangular coordinate system xyz; φ_ε is the angle between the line from point H to point p_ε and the plane yoz; l_ε is the distance from point H to point p_ε.
Specific embodiment
The neural-network-based millimeter-wave radar target detection and imaging method proposed by the present invention comprises the following steps:
(1) Acquire millimeter-wave radar data and preprocess it to obtain a feature matrix R of the millimeter-wave radar, specifically comprising the following steps:
(1-1) Place the millimeter-wave radar at the origin o of a rectangular coordinate system xyz. The millimeter-wave radar is equipped with N groups of antennas; the angle between the detection direction of the i-th antenna group of the millimeter-wave radar and the plane xoy of the coordinate system is set to α_i. As shown in Fig. 1 and Fig. 2, the millimeter-wave radar 1 is located at the origin o of the rectangular coordinate system xyz, the millimeter-wave radar antennas 2 are distributed around the radar, and the angle between the detection direction of the i-th antenna group 2 of the millimeter-wave radar 1 and the plane xoy of the coordinate system is α_i.
(1-2) Acquire the raw signal Z_i of T millimeter-wave radar transmissions issued by the i-th antenna group of the millimeter-wave radar.
(1-3) Apply a Fourier transform to the raw signal Z_i of step (1-2) to obtain a matrix F_i, a K × T matrix where K is the raw signal length; apply a two-dimensional Fourier transform to F_i to obtain the feature matrix r_i.
(1-4) Traverse the N antenna groups of the millimeter-wave radar, repeating step (1-3) to obtain the feature matrices of all N groups; combine all feature matrices into the feature matrix R of the millimeter-wave radar, i.e. R = {r_1; r_2; r_3; …; r_N}, where R is a K × T × N matrix.
(2) Set a data collection point and acquire point cloud data of the detection target to construct a three-dimensional point cloud matrix P, specifically comprising the following steps:
(2-1) Set a data collection point H located in the rectangular coordinate system xyz of step (1-1) at coordinates x = 0, y = λ, z = 0. Acquire the point cloud data E of the detection target from point H. E contains M point cloud position entries, with M set according to the required detection accuracy, M > 100. The position of the ε-th point p_ε among the M entries is expressed as (l_ε, β_ε, φ_ε), where β_ε is the angle between the line from point H to point p_ε and the plane xoy of the coordinate system xyz, ε = 1, 2, …, M, and β_ε is restricted to one of the angles α_i, i = 1, 2, …, N; φ_ε is the angle between the line from point H to point p_ε and the plane yoz; l_ε is the distance from point H to point p_ε, as shown in Fig. 3.
(2-2) Using the following formula, process each of the M position entries (l_ε, β_ε, φ_ε) to obtain M new position entries:
β_new,ε = β_ε
(2-3) Among the M new position entries of step (2-2), since β_new,ε = β_ε, β_new,ε can only take N values. According to the value of β_new,ε, divide the M new position entries of step (2-2) into N groups of M/N entries each, and arrange the data into the three-dimensional point cloud matrix P.
(3) Construct a convolutional neural network and use it to perform automatic feature extraction on the three-dimensional feature matrix R of step (1); the final output is a point cloud estimate, constituting the millimeter-wave radar target detection imaging model, specifically comprising the following steps:
(3-1) Construct the input layer of the convolutional neural network. The input matrix is the K × T × N three-dimensional feature matrix R of step (1). The input-layer convolutional network has 100 convolution kernels (the convolution kernel is a well-known term in the machine learning field); the g-th kernel of the input layer is denoted W_g0, g = 1, 2, 3, …, 100. Each kernel has size 3 × 3 × N and convolution stride 1, and the activation function of the input-layer convolutional network is the RELU function (the RELU function is a well-known function in the machine learning field). The output matrix D^1 of the input-layer convolutional network has size K × T × 100; the operation of each kernel is:
D_g = RELU(R * W_g0)
where * is the convolution operator, the kernel W_g0 is a parameter to be trained, D_g is the output of the convolution with each kernel, g = 1, 2, …, 100, and all D_g combine into D^1.
(3-2) Construct the middle layers of the convolutional neural network, comprising Q convolutional layers. The input matrix of the q-th layer is the output matrix D^(q-1) of layer q-1, q = 2, …, Q; the input matrix of the first middle layer is the output matrix D^1 of the input layer. Each layer has 100 convolution kernels; the g-th kernel of layer q is denoted W_gq, with kernel size 3 × 3 × 100 and convolution stride 1. The activation function of each layer is the RELU function, and the output matrix D^q of layer q has size K × T × 100. The operation of each kernel of layer q is:
D_g^q = RELU(D^(q-1) * W_gq)
where * is the convolution operator, the kernel W_gq is a parameter to be trained, D_g^q is the output of the convolution with each kernel, g = 1, 2, …, 100, and all D_g^q combine into D^q.
(3-3) Construct the output layer of the convolutional neural network, a single convolutional layer whose input matrix is the output matrix D^Q of the last middle layer. The output layer has 3M convolution kernels, denoted W_e, e = 1, 2, …, 3M, each of size K × T × 100. The operation of each kernel is:
D_e = D^Q * W_e
where * is the convolution operator, the kernel W_e is a parameter to be trained, D_e is the result of the convolution with each kernel, e = 1, 2, …, 3M, and all D_e combine into a three-dimensional matrix, which is the output of the millimeter-wave radar target detection imaging model.
(4) Repeat steps (1) and (2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P; iteratively train the convolutional neural network of step (3) using gradient descent to obtain the millimeter-wave radar target detection imaging model, specifically comprising the following steps:
(4-1) Repeat steps (1)-(2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P.
(4-2) Using gradient descent, traverse the s sample pairs of the feature matrix R and the point cloud matrix P collected in (4-1), repeating step (3); train on the training set of step (4-1) to obtain the parameters to be trained of the model of step (3), namely W_g0, W_gq and W_e, and the millimeter-wave radar target detection imaging model corresponding to these parameters.
(4-3) Repeat step (4-2) η times, 50 ≤ η ≤ 100; in one embodiment of the present invention, the value of η is 100. The millimeter-wave radar target detection imaging model corresponding to the final training parameters W_g0, W_gq and W_e is the final millimeter-wave radar array imaging model.
(5) Use the final millimeter-wave radar target detection imaging model obtained in (4) to realize imaging of the millimeter-wave radar detection target.
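Once trained, step (5) reduces to preprocessing a new radar frame as in step (1) and evaluating the model on it. A minimal sketch, with a toy stand-in callable in place of the trained CNN:

```python
import numpy as np

def image_target(raw_signals, trained_model):
    """Step (5): one forward pass.  Build the feature matrix R from a new
    radar frame (as in step (1)) and evaluate the trained model to obtain
    the point cloud estimate.  `trained_model` is a stand-in callable here;
    in practice it is the CNN trained in step (4)."""
    F = np.fft.fft(raw_signals, axis=1)      # per-antenna FFT, step (1-3)
    R = np.fft.fft2(F, axes=(1, 2))          # 2-D FFT -> feature matrices r_i
    return trained_model(R)

est = image_target(np.random.randn(4, 16, 8),
                   lambda R: np.abs(R).mean(axis=0))  # toy stand-in model
print(est.shape)  # (16, 8)
```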

Claims (1)

1. A neural-network-based millimeter-wave radar target detection and imaging method, characterized in that the method comprises the following steps:
(1) Acquire millimeter-wave radar data and preprocess it to obtain a feature matrix R of the millimeter-wave radar, specifically comprising the following steps:
(1-1) Place the millimeter-wave radar at the origin o of a rectangular coordinate system xyz. The millimeter-wave radar is equipped with N groups of antennas; the angle between the detection direction of the i-th antenna group of the millimeter-wave radar and the plane xoy of the coordinate system is set to α_i.
(1-2) Acquire the raw signal Z_i of T millimeter-wave radar transmissions issued by the i-th antenna group of the millimeter-wave radar.
(1-3) Apply a Fourier transform to the raw signal Z_i of step (1-2) to obtain a matrix F_i, a K × T matrix where K is the raw signal length; apply a two-dimensional Fourier transform to F_i to obtain the feature matrix r_i.
(1-4) Traverse the N antenna groups of the millimeter-wave radar, repeating step (1-3) to obtain the feature matrices of all N groups; combine all feature matrices into the feature matrix R of the millimeter-wave radar, i.e. R = {r_1; r_2; r_3; …; r_N}, where R is a K × T × N matrix.
(2) Set a data collection point and acquire point cloud data of the detection target to construct a three-dimensional point cloud matrix P, specifically comprising the following steps:
(2-1) Set a data collection point H located in the rectangular coordinate system xyz of step (1-1) at coordinates x = 0, y = λ, z = 0. Acquire the point cloud data E of the detection target from point H. E contains M point cloud position entries, with M set according to the required detection accuracy, M > 100. The position of the ε-th point p_ε among the M entries is expressed as (l_ε, β_ε, φ_ε), where β_ε is the angle between the line from point H to point p_ε and the plane xoy of the coordinate system xyz, ε = 1, 2, …, M, and β_ε is restricted to one of the angles α_i, i = 1, 2, …, N; φ_ε is the angle between the line from point H to point p_ε and the plane yoz; l_ε is the distance from point H to point p_ε.
(2-2) Using the following formula, process each of the M position entries (l_ε, β_ε, φ_ε) to obtain M new position entries:
β_new,ε = β_ε
(2-3) Among the M new position entries of step (2-2), since β_new,ε = β_ε, β_new,ε can only take N values. According to the value of β_new,ε, divide the M new position entries of step (2-2) into N groups of M/N entries each, and arrange the data into the three-dimensional point cloud matrix P.
(3) convolutional neural networks are constructed, the three-dimensional feature matrix R in step (1) is carried out using the convolutional neural networks Automatic Feature Extraction, final outputDimension point cloud estimates data evaluationIt constitutes millimetre-wave radar and detects target imaging mould Type, specifically includes the following steps:
(3-1) constructs the input layer of convolutional neural networks, and the input matrix of convolutional neural networks is that the size of step (1) is K × T The three-dimensional feature matrix R of × N, the input layer convolutional network of convolutional neural networks have 100 convolution kernels, g-th of convolution of input layer Core Wg0It indicates, g=1,2,3 ... 100, the size of convolution kernel is 3 × 3 × N, and convolution step-length is 1, the activation letter of input layer convolutional network Number is RELU function, the output matrix of input layer convolutional networkSize be K × T × 100, the operational formula of each convolution kernel are as follows:
Dg=RELU (R*Wg0)
Wherein, * is convolution operator, convolution kernel Wg0For to training parameter, DgAfter doing convolution algorithm using each convolution kernel Output, g=1,2 ..., 100, all DgIt is combined into
(3-2) constructs the middle layer of convolutional neural networks, and the middle layer of convolutional neural networks includes Q layers of convolutional network, wherein q The input matrix of layer convolutional network is the output matrix of q-1 layers of convolutional networkThe 1st layer in middle layer The input matrix of convolutional network is the output matrix of convolutional neural networks input layerEach layer of convolutional network has 100 convolution Core, g-th of convolution kernel W of q layergqIt indicates, convolution kernel is having a size of 3 × 3 × 100, and convolution step-length is 1, each layer of convolutional network Activation primitive be RELU function, the output matrix use of q layer convolutional networkIt indicates, having a size of K × T × 100, Q layers of convolution The operational formula of each convolution kernel of network are as follows:
Wherein, * is convolution operator, convolution kernel WgqFor to training parameter,After doing convolution algorithm using each convolution kernel Output, g=1,2 ..., 100, it is allIt is combined into sequence
(3-3) Construct the output layer of the convolutional neural network. The output layer is a single convolutional layer whose input matrix is the output matrix D(Q) of the last middle layer. The output layer has 3M convolution kernels, denoted We, e = 1, 2, …, 3M, each of size K × T × 100. Each kernel performs the operation:
De = D(Q) * We
where * is the convolution operator, the kernels We are parameters to be trained, and De is the result of the convolution with the e-th kernel, e = 1, 2, …, 3M. All De together form the estimate of the point cloud matrix P of step (2), i.e. the output of the millimetre-wave radar detection target imaging model.
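Putting steps (3-1)-(3-3) together, the forward pass can be sketched end to end. Sizes are toy stand-ins, zero padding is assumed as above, and reshaping the 3M output-layer values into an M × 3 point cloud is an assumption made here for illustration (the patent combines the De into the point-cloud estimate without spelling out the layout); `forward` and `conv3x3` are illustrative names:

```python
import numpy as np

def conv3x3(X, W):
    """Zero-padded stride-1 convolution with ReLU; X is K x T x C,
    W has shape (G, 3, 3, C), the result is K x T x G."""
    K, T, C = X.shape
    Xp = np.pad(X, ((1, 1), (1, 1), (0, 0)))
    out = np.empty((K, T, W.shape[0]))
    for g in range(W.shape[0]):
        for i in range(K):
            for j in range(T):
                out[i, j, g] = np.sum(Xp[i:i + 3, j:j + 3, :] * W[g])
    return np.maximum(out, 0.0)

def forward(R, W_in, W_mid, W_out, M):
    D = conv3x3(R, W_in)        # (3-1): input layer, K x T x G
    for Wq in W_mid:            # (3-2): Q middle layers, each K x T x G
        D = conv3x3(D, Wq)
    # (3-3): each output kernel spans the whole K x T x G map, so the
    # convolution collapses to a single weighted sum per kernel
    De = np.array([np.sum(D * We) for We in W_out])
    return De.reshape(M, 3)     # assumed M x 3 point-cloud layout

# toy sizes: K=4, T=5, N=2, G=3 kernels per layer, Q=2 middle layers, M=2 points
rng = np.random.default_rng(2)
K, T, N, G, Q, M = 4, 5, 2, 3, 2, 2
R = rng.standard_normal((K, T, N))
W_in = rng.standard_normal((G, 3, 3, N))
W_mid = rng.standard_normal((Q, G, 3, 3, G))
W_out = rng.standard_normal((3 * M, K, T, G))
P_hat = forward(R, W_in, W_mid, W_out, M)  # point-cloud estimate, M x 3
```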
(4) Repeat steps (1) and (2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P, then use gradient descent to iteratively train the convolutional neural network of step (3), obtaining the millimetre-wave radar detection target imaging model. This comprises the following steps:
(4-1) Repeat steps (1) and (2) s times to obtain s sample pairs of the feature matrix R and the point cloud matrix P;
(4-2) Using gradient descent, traverse the s sample pairs of R and P collected in (4-1) and train the network of step (3) on this training set, obtaining the training parameters required by the model of step (3), i.e. Wg0, Wgq and We, together with the millimetre-wave radar detection target imaging model corresponding to these parameters;
(4-3) Repeat step (4-2) η times, with 50 ≤ η ≤ 100; the millimetre-wave radar detection target imaging model corresponding to the parameters Wg0, Wgq and We obtained in the last training pass is taken as the final millimetre-wave radar array imaging model.
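Training all of Wg0, Wgq and We requires backpropagation through every layer. As a minimal, hedged illustration of the gradient-descent update in step (4-2), the sketch below updates only the output-layer kernels We under a mean-squared-error loss between the network output and the target point cloud P (toy sizes; `gd_step` is an illustrative name and the learning rate is arbitrary):

```python
import numpy as np

def out_layer(DQ, W):
    # Step (3-3): each full-size kernel reduces to one weighted sum.
    return np.array([np.sum(DQ * We) for We in W])

def gd_step(DQ, W, P, lr):
    """One gradient-descent update of the output-layer kernels under
    the loss 0.5 * sum((out_layer(DQ, W) - P)**2)."""
    residual = out_layer(DQ, W) - P
    grad = residual[:, None, None, None] * DQ[None, ...]
    return W - lr * grad

# toy stand-ins: a 4 x 5 x 6 feature map and 3M = 9 output kernels
rng = np.random.default_rng(1)
DQ = rng.standard_normal((4, 5, 6))
W = rng.standard_normal((9, 4, 5, 6))
P = rng.standard_normal(9)              # flattened target point cloud
loss_before = np.sum((out_layer(DQ, W) - P) ** 2)
for _ in range(50):                     # iterative training, as in (4-2)/(4-3)
    W = gd_step(DQ, W, P, lr=1e-3)
loss_after = np.sum((out_layer(DQ, W) - P) ** 2)
```

In the method itself, the same loss would be backpropagated through the middle and input layers so that Wg0 and Wgq are updated jointly with We.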
(5) Use the final millimetre-wave radar detection target imaging model obtained in step (4) to image the millimetre-wave radar detection target.
CN201910574031.0A 2019-06-28 2019-06-28 Millimeter wave radar detection target imaging method based on neural network Active CN110275163B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910574031.0A CN110275163B (en) 2019-06-28 2019-06-28 Millimeter wave radar detection target imaging method based on neural network


Publications (2)

Publication Number Publication Date
CN110275163A true CN110275163A (en) 2019-09-24
CN110275163B CN110275163B (en) 2020-11-27

Family

ID=67962562


Country Status (1)

Country Link
CN (1) CN110275163B (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102713665A (en) * 2009-09-17 2012-10-03 曼彻斯特城市大学 Detection of objects
CN205679762U (en) * 2016-02-24 2016-11-09 闻鼓通信科技股份有限公司 Dangerous goods detecting devices hidden by millimetre-wave radar
CN106371105A (en) * 2016-08-16 2017-02-01 长春理工大学 Vehicle targets recognizing method, apparatus and vehicle using single-line laser radar
CN107229918A (en) * 2017-05-26 2017-10-03 西安电子科技大学 A kind of SAR image object detection method based on full convolutional neural networks
WO2018125928A1 (en) * 2016-12-29 2018-07-05 DeepScale, Inc. Multi-channel sensor simulation for autonomous control systems
CN108596961A (en) * 2018-04-17 2018-09-28 浙江工业大学 Point cloud registration method based on Three dimensional convolution neural network
CN109871787A (en) * 2019-01-30 2019-06-11 浙江吉利汽车研究院有限公司 A kind of obstacle detection method and device
CN109902702A (en) * 2018-07-26 2019-06-18 华为技术有限公司 The method and apparatus of target detection


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
XIAO TANG ET AL.: "SAR image despeckling with a multilayer perceptron neural network", 《INTERNATIONAL JOURNAL OF DIGITAL EARTH》 *
XIAO HUAITIE ET AL.: "A new time-delay neural network method for wideband millimetre-wave radar target recognition", 《红外与毫米波学报》 (Journal of Infrared and Millimeter Waves) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111353581A (en) * 2020-02-12 2020-06-30 北京百度网讯科技有限公司 Lightweight model acquisition method and device, electronic equipment and storage medium
CN111353581B (en) * 2020-02-12 2024-01-26 北京百度网讯科技有限公司 Lightweight model acquisition method and device, electronic equipment and storage medium
CN111833395A (en) * 2020-06-04 2020-10-27 西安电子科技大学 Direction-finding system single target positioning method and device based on neural network model
CN111833395B (en) * 2020-06-04 2022-11-29 西安电子科技大学 Direction-finding system single target positioning method and device based on neural network model
CN111752754A (en) * 2020-06-05 2020-10-09 清华大学 Method for recovering radar image data in memory



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210329

Address after: Room 301, block C, Yingying building, 99 Tuanjie Road, yanchuangyuan, Jiangbei new district, Nanjing, Jiangsu, 211800

Patentee after: Qingsi Microelectronics (Nanjing) Co.,Ltd.

Address before: 100084 No. 1 Tsinghua Yuan, Beijing, Haidian District

Patentee before: TSINGHUA University