CN113050797A - Method for realizing gesture recognition through millimeter wave radar - Google Patents

Method for realizing gesture recognition through millimeter wave radar

Info

Publication number
CN113050797A
Authority
CN
China
Prior art keywords
target
radar
distance
module
clustering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110326016.1A
Other languages
Chinese (zh)
Inventor
姚衡
邹毅
王彦杰
王凌云
赵瑞峰
张义军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huajie Zhitong Technology Co ltd
Original Assignee
Shenzhen Huajie Zhitong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huajie Zhitong Technology Co ltd filed Critical Shenzhen Huajie Zhitong Technology Co ltd
Priority to CN202110326016.1A priority Critical patent/CN113050797A/en
Publication of CN113050797A publication Critical patent/CN113050797A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88: Radar or analogous systems specially adapted for specific applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10: Complex mathematical operations
    • G06F17/14: Fourier, Walsh or analogous domain transformations, e.g. Laplace, Hilbert, Karhunen-Loeve, transforms
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00: Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Pure & Applied Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Algebra (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a method for realizing gesture recognition through a millimeter wave radar, which comprises the following steps: first, the distance, velocity, azimuth angle and pitch angle of a target are measured and calculated by the radar; the analog signal is converted into a digital signal by an analog-to-digital converter; fast Fourier transforms are performed on data of the same distance dimension and the same velocity dimension by a Range FFT module and a Doppler FFT module; the target is detected by a detector at a constant false alarm probability; an Angle Estimation module arranged on the radar calculates the phase difference of the signals received by the receiving antennas to obtain the angle information of the target; a Clustering module clusters the multiple radar targets into one target to obtain a vector representing the distance, velocity, azimuth angle and pitch angle of the hand, and three additional values are added by the Clustering module; finally, the assembled feature vector is input into a machine learning network for gesture classification. The invention is reasonably designed and, by adding new feature values as constraints, improves gesture recognition accuracy and reduces the false recognition rate.

Description

Method for realizing gesture recognition through millimeter wave radar
Technical Field
The invention relates to the technical field of gesture recognition, in particular to a method for realizing gesture recognition through a millimeter wave radar.
Background
In computer science, gesture recognition is the problem of recognizing human gestures through mathematical algorithms. Gestures can originate from the motion of any part of the human body, but generally refer to movements of the face and hands; a user can control or interact with a device using simple gestures, allowing a computer to interpret human behavior. The core technologies of gesture recognition are gesture segmentation, gesture analysis and gesture recognition. Existing gesture recognition methods are mainly realized with a single optical lens, or with an optical lens combined with technologies such as infrared, TOF and millimeter wave radar. In recent years, millimeter wave radar chip technology has gradually matured, including high integration of the radar radio frequency unit, the signal processing unit and the MCU unit, and the realization of on-chip antenna arrays, so millimeter wave radar is increasingly embedded and applied in fields such as autonomous driving and new intelligent terminals. Millimeter wave radar has the advantages of protecting privacy, being unaffected by the weather environment, achieving high detection resolution with a small antenna aperture and a narrow beam, and resisting interference thanks to its large bandwidth. With the continuous advance of smart homes and smart cities, applying millimeter wave radar to intelligent terminals has huge potential.
Current machine learning networks lack interpretability, and because hyper-parameter settings differ, the optimum found by the network is not necessarily the true optimum; as a result, such networks suffer from a degree of misrecognition and low recognition accuracy, and therefore have certain shortcomings.
Disclosure of Invention
In view of the above shortcomings of the prior art, an object of the present invention is to provide a method for realizing gesture recognition through a millimeter wave radar, so as to improve gesture recognition accuracy and reduce the false recognition rate.
In order to achieve this objective, the invention adopts the following technical solution:
a method for realizing gesture recognition through a millimeter wave radar is characterized by comprising the following steps:
S100, radar measurement: measuring and calculating the distance, velocity, azimuth angle and pitch angle information of a target through a radar;
S200, clustering: clustering the plurality of radar targets generated by one real target into a single target;
S300, target information expansion: obtaining a vector representing the distance, velocity, azimuth angle and pitch angle of the hand through a Clustering module, and additionally adding three values through the module to represent the velocity sign, the azimuth angle sign and the pitch angle sign respectively;
S400, gesture recognition: inputting the assembled feature vector into a machine learning network and performing gesture classification.
In the above method for realizing gesture recognition through a millimeter wave radar, for step S100, the radar measurement process comprises the following steps:
s101, ADC data sampling: obtaining radar baseband digital signals through ADC sampling;
s102, distance dimension transformation: performing Fourier transform in a distance dimension;
s103, velocity dimension conversion: performing Fourier transform in a velocity dimension;
s104, target detection: detecting the target by a detector under the constant false alarm probability;
s105, information extraction: and extracting azimuth angle and pitch angle information of the target.
In a preferred embodiment of the present invention, for the data sampling of step S101, the ADC data are the input to this process; the data are in matrix form, data in the same row belong to the same distance dimension, and data in the same column belong to the same velocity dimension.
In a preferred embodiment of the present invention, for the target detection of step S104, the constant false alarm detector estimates the target to be detected using training data collected from the range cells near it, with the protection unit and the reference unit arranged symmetrically, in sequence, on the outside of the target to be detected.
In a preferred embodiment of the present invention, for the information extraction of step S105, an Angle Estimation module is provided on the millimeter wave radar; the module comprises two receiving antennas and calculates the azimuth angle and pitch angle information of each detected point from the phase difference of the signals received by the receiving antennas.
In a preferred embodiment of the present invention, for the clustering of step S200, the Clustering module clusters target points that are close to one another into a single target point, and obtains the distance, velocity, azimuth angle and pitch angle information of the clustered target point.
Through the technical scheme, the invention has the following beneficial effects:
the invention has reasonable design, the radar measures the state information of the target, including the information of the distance, the speed, the azimuth angle, the pitch angle and the like of the target, carries out clustering processing on the information, and adds a new characteristic value to carry out constraint, thereby being convenient for inputting the state information of the target into a machine learning network to judge the category of the current gesture, improving the gesture recognition precision and reducing the false recognition rate.
Drawings
FIG. 1 is a flow chart of a method of gesture recognition by millimeter wave radar according to the present invention;
FIG. 2 is a radar measurement flow chart of a method for achieving gesture recognition by a millimeter wave radar according to the present invention;
FIG. 3 is a diagram of an ADC data structure of a method for implementing gesture recognition by millimeter wave radar according to the present invention;
FIG. 4 is a detection model diagram of a CFAR detection module of the method for gesture recognition by millimeter wave radar according to the present invention;
FIG. 5 is an auxiliary diagram for azimuth angle and pitch angle calculation in the method for realizing gesture recognition by a millimeter wave radar of the present invention.
In the drawings, the components represented by the respective reference numerals are listed below:
1. reference unit; 2. protection unit; 3. target to be detected.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
As shown in fig. 1, a method for implementing gesture recognition by millimeter wave radar according to the present invention is shown, which includes the steps of:
S100, radar measurement: measuring and calculating the distance, velocity, azimuth angle and pitch angle information of a target through a radar;
S200, clustering: clustering the plurality of radar targets generated by one real target into a single target;
S300, target information expansion: obtaining a vector representing the distance, velocity, azimuth angle and pitch angle of the hand through a Clustering module, and additionally adding three values through the module to represent the velocity sign, the azimuth angle sign and the pitch angle sign respectively;
S400, gesture recognition: inputting the assembled feature vector into a machine learning network and performing gesture classification.
The method of the present invention is described in detail below with reference to the accompanying drawings.
Referring to fig. 2, the radar measurement process of step S100 includes:
s101, ADC data sampling: and obtaining the radar baseband digital signal through ADC sampling.
In particular, the radar measures the target, and an analog-to-digital converter (ADC) converts real-world analog signals (e.g., temperature, pressure, sound, or images) into digital signals, making them easier to store, process and transmit. The ADC samples the input analog signal at fixed time intervals and compares each sampled value with a series of reference levels, successively refining the digital code until the two are equal.
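By way of illustration only (not part of the original disclosure), the successive-approximation behaviour described above can be sketched in a few lines of Python; the resolution and reference voltage here are arbitrary assumptions.

```python
def sar_adc_sample(v_in, v_ref=3.3, n_bits=12):
    """Successive-approximation quantization of one analog sample.

    The candidate digital code is refined bit by bit: each trial level is
    compared with the input and kept only if it does not exceed it.
    """
    code = 0
    for bit in range(n_bits - 1, -1, -1):
        trial = code | (1 << bit)                   # tentatively set this bit
        if trial * v_ref / (1 << n_bits) <= v_in:   # compare with the input level
            code = trial                            # keep the bit
    return code

# Example: digitize a 1.2 V sample with a 12-bit, 3.3 V full-scale converter.
print(sar_adc_sample(1.2))   # ≈ 1489, i.e. within one LSB of 1.2 / 3.3 * 4096
```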
As shown in fig. 3, the ADC data are in matrix form: data in the same row belong to the same distance dimension and data in the same column belong to the same velocity dimension; the distance dimension of the data is assumed to be N and the velocity dimension M.
S102, distance dimension transformation: and performing Fourier transform in a distance dimension.
Specifically, the Range FFT module performs a fast Fourier transform (FFT) on data of the same distance dimension; because the ADC data matrix is of size N × M, M distance-dimension FFTs are required, and the Range FFT result is again an N × M matrix.
S103, velocity dimension conversion: and performing Fourier transform in a velocity dimension.
Specifically, the Doppler FFT module performs a fast Fourier transform on data of the same velocity dimension; its input is the output of the Range FFT module. Because that matrix is of size N × M, N velocity-dimension FFTs are required, and the Doppler FFT result is again an N × M matrix.
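The two FFT stages can be pictured with the following minimal numpy sketch, which is not taken from the patent; the matrix sizes N and M and the normalized beat and Doppler frequencies of the single simulated target are invented for illustration. It builds an N × M ADC matrix, runs the M distance-dimension FFTs and N velocity-dimension FFTs described above, and reads off the peak cell.

```python
import numpy as np

N, M = 64, 32                      # distance-dimension and velocity-dimension sizes (assumed)
f_beat, f_dopp = 0.20, 0.10        # normalized beat / Doppler frequency of one point target (assumed)

n = np.arange(N)[:, None]          # fast-time sample index within a chirp
m = np.arange(M)[None, :]          # slow-time index (chirp number)
adc = np.exp(2j * np.pi * (f_beat * n + f_dopp * m))   # N x M ADC data matrix

range_fft = np.fft.fft(adc, axis=0)        # M FFTs of length N (Range FFT)
rd_map = np.fft.fft(range_fft, axis=1)     # N FFTs of length M (Doppler FFT)

row, col = np.unravel_index(np.argmax(np.abs(rd_map)), rd_map.shape)
print(row, col)    # -> 13 3, the bins nearest 0.20 * 64 and 0.10 * 32
```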
S104, target detection: the target is detected by a CFAR detection module. CFAR stands for constant false alarm rate: in radar signal detection, when the intensity of external interference changes, the radar automatically adjusts its own sensitivity so that its false alarm probability remains unchanged; this property is called the constant false alarm rate characteristic. The target 3 to be detected is detected by a detector at a constant false alarm probability, with protection units 2 and reference units 1 arranged symmetrically, in sequence, on the outside of the target 3 to be detected.
Specifically, as shown in fig. 4, the CFAR module estimates the target to be detected using training data collected from the range cells near the target 3 to be detected; the target 3 to be detected lies at the center, and protection units 2 and reference units 1 are arranged symmetrically, in sequence, on either side of it. The CFAR module first assumes a type of radar clutter; it then estimates the clutter parameters at the target 3 to be detected from the data of the nearby reference units 1, normalizes the clutter at the target 3 to be detected to form a quantity independent of the clutter parameters, and compares this quantity with a threshold related to the false alarm probability and the number of reference units 1 to decide whether a target is present: if the input signal exceeds the threshold, a target is present; otherwise it is not.
The specific detection process is as follows: first, the mean values of the reference units 1 on the left and right sides are computed and recorded as x_average1 and x_average2; then the mean of x_average1 and x_average2 is computed and recorded as x_noise; finally, the value of the target 3 to be detected is compared with K * x_noise, where K is called the threshold value. If the value of the target 3 to be detected is greater than K * x_noise, it is a target point; otherwise it is not. After the CFAR detection module, a set of detected points is obtained, and the row and column of each detected point in the matrix are recorded.
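The averaging-and-threshold procedure just described can be sketched as a one-dimensional cell-averaging CFAR in numpy; this is an illustrative reconstruction rather than the patent's implementation (which operates on the range-Doppler matrix), and the numbers of reference cells, guard cells and the threshold K are assumptions.

```python
import numpy as np

def ca_cfar_1d(power, n_ref=8, n_guard=2, K=4.0):
    """1-D cell-averaging CFAR over a power profile.

    For each cell under test, x_average1 / x_average2 are the means of the
    reference cells on its left and right (guard cells excluded); the cell is
    declared a target when its power exceeds K * x_noise.
    """
    detections = []
    half = n_ref // 2                              # reference cells per side
    for i in range(half + n_guard, len(power) - half - n_guard):
        left = power[i - n_guard - half : i - n_guard]
        right = power[i + n_guard + 1 : i + n_guard + 1 + half]
        x_average1, x_average2 = left.mean(), right.mean()
        x_noise = 0.5 * (x_average1 + x_average2)
        if power[i] > K * x_noise:
            detections.append(i)
    return detections

# Example: flat noise floor with one strong cell at index 30.
profile = np.ones(64)
profile[30] = 20.0
print(ca_cfar_1d(profile))   # -> [30]
```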
S105, information extraction: extracting the azimuth angle and pitch angle information of the target. An Angle Estimation module is arranged on the millimeter wave radar; it comprises two receiving antennas and calculates the azimuth angle and pitch angle information of each detected point from the phase difference of the signals received by the receiving antennas.
Specifically, as shown in fig. 5, the millimeter wave radar is provided with two receiving antennas RX. For a target at angle θ, if the phase difference between the signals received by the two receiving antennas RX is ω, then

ω = 2π · d_RX · sin(θ) / λ

so that

θ = arcsin(ω · λ / (2π · d_RX))

where d_RX is the spacing between the two receiving antennas of the millimeter wave radar and λ is the radar wavelength; from this, the azimuth angle and pitch angle information of each detected point is calculated.
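Assuming the standard two-antenna relation reconstructed above, a small Python sketch of turning the measured phase difference ω into an angle could look like the following; the antenna spacing, wavelength and measured phase difference are invented values for illustration.

```python
import numpy as np

def angle_from_phase(omega, d_rx, wavelength):
    """Angle of arrival from the phase difference of two receiving antennas.

    omega = 2*pi*d_rx*sin(theta)/wavelength  =>  theta = arcsin(omega*wavelength/(2*pi*d_rx))
    """
    return np.arcsin(omega * wavelength / (2 * np.pi * d_rx))

# Example with half-wavelength antenna spacing (a common choice, assumed here).
wavelength = 3.9e-3            # roughly a 77 GHz millimeter wave
d_rx = wavelength / 2
omega = np.pi / 4              # measured phase difference of one detected point
print(np.degrees(angle_from_phase(omega, d_rx, wavelength)))   # ≈ 14.5 degrees
```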
For step S200, clustering: the Clustering module clusters target points that are close together into a single target point, and obtains the distance, velocity, azimuth angle and pitch angle of the clustered target point. Clustering is the process of dividing a collection of physical or abstract objects into classes composed of similar objects; the clusters generated are collections of data objects that are similar to other objects in the same cluster and dissimilar to objects in other clusters.
Specifically, the distance, azimuth angle and pitch angle of each target point in a polar coordinate system are obtained through the Angle Estimation module, and the x, y and z values of the target in a rectangular coordinate system are obtained through coordinate conversion. The Clustering module clusters target points that are close together into a single target point and obtains the distance r, velocity v, azimuth angle α and pitch angle β of the clustered target point, so that the target point information can conveniently be fed into the machine learning network for gesture classification.
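A hedged sketch of this step, not the patent's algorithm, is given below: the distance threshold, the greedy single-link merging rule and the averaging of cluster members are all assumptions. It converts each detected point to rectangular coordinates and merges points that lie close together.

```python
import numpy as np

def to_xyz(r, azimuth, pitch):
    """Polar (range, azimuth, pitch) to rectangular x, y, z coordinates."""
    x = r * np.cos(pitch) * np.sin(azimuth)
    y = r * np.cos(pitch) * np.cos(azimuth)
    z = r * np.sin(pitch)
    return np.array([x, y, z])

def cluster_points(points, max_dist=0.10):
    """Greedy single-link clustering: a point joins a cluster if it lies within
    max_dist (meters) of any point already in that cluster.
    points: list of dicts with keys r, v, azimuth, pitch."""
    clusters = []
    for p in points:
        xyz = to_xyz(p["r"], p["azimuth"], p["pitch"])
        for c in clusters:
            if any(np.linalg.norm(xyz - q) < max_dist for q in c["xyz"]):
                c["xyz"].append(xyz)
                c["members"].append(p)
                break
        else:
            clusters.append({"xyz": [xyz], "members": [p]})
    # One representative target per cluster: mean r, v, azimuth, pitch.
    return [{k: float(np.mean([m[k] for m in c["members"]]))
             for k in ("r", "v", "azimuth", "pitch")} for c in clusters]

detections = [
    {"r": 0.50, "v": 0.30, "azimuth": 0.10, "pitch": 0.05},
    {"r": 0.52, "v": 0.28, "azimuth": 0.12, "pitch": 0.06},   # same hand
    {"r": 1.80, "v": 0.00, "azimuth": -0.40, "pitch": 0.00},  # unrelated reflection
]
print(cluster_points(detections))   # -> two clustered targets
```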
For step S300, target information extension:
specifically, a vector representing the distance, the speed, the azimuth angle and the pitch angle of the hand is obtained through a Clustering module, and three symbols respectively representing the speed, the azimuth angle and the pitch angle are additionally added through the module.
If the velocity value (or azimuth angle, or pitch angle) is greater than 0, the corresponding sign value is 1; if the value equals 0, the sign value is 0; if the value is less than 0, the sign value is -1. Through this module, the radar output is expanded from 4 values to 7 values: distance, velocity, azimuth angle, pitch angle, velocity sign, azimuth angle sign and pitch angle sign, so the number of input feature values of the original ANN module increases from 4 to 7.
Specific values are given by way of example in a table in the original specification.
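The expansion from 4 to 7 feature values can also be written down directly; the following minimal sketch is illustrative only and uses numpy's sign convention, which matches the rule stated above.

```python
import numpy as np

def expand_features(r, v, azimuth, pitch):
    """Expand (distance, velocity, azimuth, pitch) with their three sign values:
    +1 if the value is > 0, 0 if it equals 0, -1 if it is < 0."""
    return np.array([r, v, azimuth, pitch,
                     np.sign(v), np.sign(azimuth), np.sign(pitch)])

# Example (invented numbers): hand moving toward the radar, slightly left and upward.
print(expand_features(0.45, -0.8, -0.15, 0.10))
# -> [ 0.45 -0.8  -0.15  0.1  -1.   -1.    1.  ]
```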
for step S400, gesture recognition:
specifically, the ANN module is an artificial neural network, which is one type of machine learning network, and a complex network structure formed by connecting a large number of processing units (neurons) is an abstraction, simplification and simulation of a human brain organization structure and an operation mechanism; the artificial neural network simulates the neuron activity by a mathematical model, is an information processing system established based on the simulation of the structure and the function of the cerebral neural network, and classifies and identifies gestures through the machine learning network.
Gesture classification and recognition are then performed according to the velocity sign together with the azimuth sign (or the pitch sign), as shown in the following tables:
if a gesture to the left:
velocity symbol Azimuth sign
-1 -1
-1 -1
1 1
1 1
The second and third rows of the table, the velocity sign and the azimuth sign, are simultaneously switched from-1 to 1.
For a gesture to the right:

Velocity sign    Azimuth sign
-1                1
-1                1
 1               -1
 1               -1

Between the second and third data rows of the table, the velocity sign switches from -1 to 1 while the azimuth sign switches from 1 to -1.
For an upward gesture:

Velocity sign    Pitch sign
-1                1
-1                1
 1               -1
 1               -1

Between the second and third data rows of the table, the velocity sign switches from -1 to 1 while the pitch sign switches from 1 to -1.
For a downward gesture:

Velocity sign    Pitch sign
-1               -1
-1               -1
 1                1
 1                1

Between the second and third data rows of the table, the velocity sign switches from -1 to 1 while the pitch sign switches from -1 to 1.
The gestures can thus be recognized and classified quickly and accurately according to their corresponding sign patterns, effectively solving the problems of low recognition accuracy and even false recognition found in prior-art machine learning networks.
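The sign-transition rules in the tables above could also be checked directly; the following sketch is not claimed by the patent (which feeds the features to the ANN), and the assumption that only the first and last frames of a swipe matter is an illustrative simplification.

```python
def classify_by_signs(velocity_signs, azimuth_signs, pitch_signs):
    """Match the -1 -> 1 sign transitions described in the tables above.

    Each argument is the sequence of sign values over the frames of one swipe."""
    v_switch = velocity_signs[0] == -1 and velocity_signs[-1] == 1
    if not v_switch:
        return "unknown"
    if azimuth_signs[0] == -1 and azimuth_signs[-1] == 1:
        return "left"
    if azimuth_signs[0] == 1 and azimuth_signs[-1] == -1:
        return "right"
    if pitch_signs[0] == 1 and pitch_signs[-1] == -1:
        return "up"
    if pitch_signs[0] == -1 and pitch_signs[-1] == 1:
        return "down"
    return "unknown"

# Example matching the "left" table: velocity and azimuth signs both switch from -1 to 1.
print(classify_by_signs([-1, -1, 1, 1], [-1, -1, 1, 1], [0, 0, 0, 0]))   # -> left
```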
The invention is reasonably designed: the radar measures the state information of the target, including its distance, velocity, azimuth angle and pitch angle; this information is clustered, and new feature values are added as constraints, so that the target state information can be fed into a machine learning network to determine the category of the current gesture, improving gesture recognition accuracy and reducing the false recognition rate.
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes, modifications, omissions and equivalent substitutions in form and detail may be made without departing from the spirit and scope of the invention; all such changes, modifications and equivalent arrangements made using the technical content disclosed above fall within the scope of protection of the technical solution of the present invention.

Claims (6)

1. A method for realizing gesture recognition through a millimeter wave radar is characterized by comprising the following steps:
S100, radar measurement: measuring and calculating the distance, velocity, azimuth angle and pitch angle information of a target through a radar;
S200, clustering: clustering the plurality of radar targets generated by one real target into a single target;
S300, target information expansion: obtaining a vector representing the distance, velocity, azimuth angle and pitch angle of a hand through a Clustering module, and additionally adding, through the module, three sign values respectively representing the signs of the velocity, the azimuth angle and the pitch angle, as feature values input to a machine learning network;
S400, gesture recognition: inputting the assembled feature vector into a machine learning network and performing gesture classification.
2. The method for realizing gesture recognition through a millimeter wave radar according to claim 1, wherein in step S100, the radar measurement process comprises the following steps:
s101, ADC data sampling: obtaining radar baseband digital signals through ADC sampling;
s102, distance dimension transformation: performing Fourier transform in a distance dimension;
s103, velocity dimension conversion: performing Fourier transform in a velocity dimension;
s104, target detection: detecting the target by a detector under the constant false alarm probability;
s105, information extraction: and extracting azimuth angle and pitch angle information of the target.
3. The method of claim 2, wherein in the data sampling in step S101, the ADC data is input data of the process, the data is in a matrix form, the data in the same row is referred to as the same distance dimension, and the data in the same column is referred to as the same speed dimension.
4. The method of claim 2, wherein in the step S104, the constant false alarm detector estimates the target to be detected according to the training data collected by the distance unit near the target to be detected, the target to be detected is located at the center, and the protection unit and the reference unit are symmetrically arranged outside the target to be detected in sequence.
5. The method as claimed in claim 2, wherein in the step S105, an Angle Estimation module is installed on the millimeter wave radar, and the module includes two receiving antennas, and the module calculates the azimuth Angle and the pitch Angle information of each passing detection point according to the phase difference between the signals received by the receiving antennas.
6. The method of claim 1, wherein in the Clustering in step S200, the Clustering module clusters the target points with a short distance into one target point, and obtains the distance, velocity, azimuth angle and pitch angle information of the clustered target points.
CN202110326016.1A 2021-03-26 2021-03-26 Method for realizing gesture recognition through millimeter wave radar Pending CN113050797A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110326016.1A CN113050797A (en) 2021-03-26 2021-03-26 Method for realizing gesture recognition through millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110326016.1A CN113050797A (en) 2021-03-26 2021-03-26 Method for realizing gesture recognition through millimeter wave radar

Publications (1)

Publication Number Publication Date
CN113050797A true CN113050797A (en) 2021-06-29

Family

ID=76515580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110326016.1A Pending CN113050797A (en) 2021-03-26 2021-03-26 Method for realizing gesture recognition through millimeter wave radar

Country Status (1)

Country Link
CN (1) CN113050797A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114280565A (en) * 2021-11-12 2022-04-05 苏州豪米波技术有限公司 Gesture recognition method based on millimeter wave radar
CN114970618A (en) * 2022-05-17 2022-08-30 西北大学 Environmental robust sign language identification method and system based on millimeter wave radar
CN115009220A (en) * 2022-06-21 2022-09-06 无锡威孚高科技集团股份有限公司 Kicking type induction tail gate control system and method based on millimeter wave radar
CN115345908A (en) * 2022-10-18 2022-11-15 四川启睿克科技有限公司 Human body posture recognition method based on millimeter wave radar
CN117331047A (en) * 2023-12-01 2024-01-02 德心智能科技(常州)有限公司 Human behavior data analysis method and system based on millimeter wave radar

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107907870A (en) * 2017-09-28 2018-04-13 西安空间无线电技术研究所 A kind of signal creating method for being used to verify spacecrafts rendezvous microwave radar angle measurement function
CN108268132A (en) * 2017-12-26 2018-07-10 北京航空航天大学 A kind of gesture identification method and human-computer interaction device based on gloves acquisition
KR20200023670A (en) * 2018-08-14 2020-03-06 전자부품연구원 Apparatus and Method for Recognizing Gestures and Finger Spellings using Cluster Radar and 3D Convolution Neural Network
CN111830508A (en) * 2020-06-23 2020-10-27 北京航空航天大学 Barrier gate anti-smashing system and method adopting millimeter wave radar
US20200348761A1 (en) * 2018-01-26 2020-11-05 lUCF-HYU (Industry-University Cooperation Foundation Hanyang University) Gesture recognition device and method using radar
CN112034446A (en) * 2020-08-27 2020-12-04 南京邮电大学 Gesture recognition system based on millimeter wave radar

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107907870A (en) * 2017-09-28 2018-04-13 西安空间无线电技术研究所 A kind of signal creating method for being used to verify spacecrafts rendezvous microwave radar angle measurement function
CN108268132A (en) * 2017-12-26 2018-07-10 北京航空航天大学 A kind of gesture identification method and human-computer interaction device based on gloves acquisition
US20200348761A1 (en) * 2018-01-26 2020-11-05 lUCF-HYU (Industry-University Cooperation Foundation Hanyang University) Gesture recognition device and method using radar
KR20200023670A (en) * 2018-08-14 2020-03-06 전자부품연구원 Apparatus and Method for Recognizing Gestures and Finger Spellings using Cluster Radar and 3D Convolution Neural Network
CN111830508A (en) * 2020-06-23 2020-10-27 北京航空航天大学 Barrier gate anti-smashing system and method adopting millimeter wave radar
CN112034446A (en) * 2020-08-27 2020-12-04 南京邮电大学 Gesture recognition system based on millimeter wave radar

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114280565A (en) * 2021-11-12 2022-04-05 苏州豪米波技术有限公司 Gesture recognition method based on millimeter wave radar
CN114970618A (en) * 2022-05-17 2022-08-30 西北大学 Environmental robust sign language identification method and system based on millimeter wave radar
CN114970618B (en) * 2022-05-17 2024-03-19 西北大学 Sign language identification method and system based on millimeter wave radar and with environment robustness
CN115009220A (en) * 2022-06-21 2022-09-06 无锡威孚高科技集团股份有限公司 Kicking type induction tail gate control system and method based on millimeter wave radar
CN115345908A (en) * 2022-10-18 2022-11-15 四川启睿克科技有限公司 Human body posture recognition method based on millimeter wave radar
CN115345908B (en) * 2022-10-18 2023-03-07 四川启睿克科技有限公司 Human body posture recognition method based on millimeter wave radar
CN117331047A (en) * 2023-12-01 2024-01-02 德心智能科技(常州)有限公司 Human behavior data analysis method and system based on millimeter wave radar

Similar Documents

Publication Publication Date Title
CN113050797A (en) Method for realizing gesture recognition through millimeter wave radar
CN113093170B (en) Millimeter wave radar indoor personnel detection method based on KNN algorithm
CN107300698B (en) Radar target track starting method based on support vector machine
Zhang et al. Pedestrian detection method based on Faster R-CNN
CN114859339B (en) Multi-target tracking method based on millimeter wave radar
CN113156391B (en) Radar signal multi-dimensional feature intelligent sorting method
Liu et al. Deep learning and recognition of radar jamming based on CNN
CN110456320B (en) Ultra-wideband radar identity recognition method based on free space gait time sequence characteristics
CN114564982B (en) Automatic identification method for radar signal modulation type
CN113313040B (en) Human body posture identification method based on FMCW radar signal
CN111401168A (en) Multi-layer radar feature extraction and selection method for unmanned aerial vehicle
CN113064483A (en) Gesture recognition method and related device
CN116338684A (en) Human body falling detection method and system based on millimeter wave radar and deep learning
CN111368930A (en) Radar human body posture identification method and system based on multi-class spectrogram fusion and hierarchical learning
Qian et al. Parallel lstm-cnn network with radar multispectrogram for human activity recognition
CN108932468B (en) Face recognition method suitable for psychology
Wang et al. A survey of hand gesture recognition based on FMCW radar
CN114994656A (en) Indoor personnel tracking method based on millimeter wave radar
Tian et al. Indoor device-free passive localization for intrusion detection using multi-feature PNN
CN113486917A (en) Radar HRRP small sample target identification method based on metric learning
CN113064489A (en) Millimeter wave radar gesture recognition method based on L1-Norm
Feng et al. DAMUN: A domain adaptive human activity recognition network based on multimodal feature fusion
CN115937977A (en) Few-sample human body action recognition method based on multi-dimensional feature fusion
CN115909086A (en) SAR target detection and identification method based on multistage enhanced network
CN116304966A (en) Track association method based on multi-source data fusion

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination