CN111813224B - Method for establishing and identifying fine gesture library based on ultrahigh-resolution radar


Info

Publication number
CN111813224B
CN111813224B
Authority
CN
China
Prior art keywords
gesture
dynamic
radar
finger
motion
Prior art date
Legal status
Active
Application number
CN202010655795.5A
Other languages
Chinese (zh)
Other versions
CN111813224A (en)
Inventor
曹宗杰
王星
崔宗勇
闵锐
李晋
皮亦鸣
Current Assignee
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China
Priority to CN202010655795.5A
Publication of CN111813224A
Application granted
Publication of CN111813224B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/044Recurrent networks, e.g. Hopfield networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention belongs to the field of radar-based gesture recognition for human-computer interaction, and in particular relates to a method for establishing and recognizing a fine gesture library based on an ultrahigh-resolution radar. Taking into account both the special control requirements of human-computer interaction and the conventional meanings of gestures in daily activities, 43 fine gestures suited to ultrahigh-resolution radar recognition are designed and assembled into a gesture recognition sample library. On this basis, a method for recognizing a single gesture sample from the library is proposed that exploits the differences and characteristics of the radar echoes of different gestures: first, the coarse category of the gesture sample to be recognized is judged from the variance of its instantaneous energy sequence and its radial motion variation; then, category-specific gesture features are extracted for classification and recognition. The method achieves high recognition accuracy over the gesture library with a simple recognition procedure and low computational complexity, matches practical control requirements, and can be widely applied in human-computer interaction scenarios.

Description

Method for establishing and identifying fine gesture library based on ultrahigh-resolution radar
Technical Field
The invention belongs to the technical field of radar gesture recognition for human-computer interaction, and in particular relates to a method for establishing and recognizing a fine gesture library based on an ultrahigh-resolution radar.
Background
Gestures are a common form of human communication and also open a path for human-computer interaction. Gesture recognition allows a computer to understand human instructions without traditional interactive hardware such as a mouse and keyboard. For example, a gesture-based user interface in an automobile can improve driver safety: it lets the driver keep attention on the road while interacting with in-vehicle information, entertainment or control devices, reducing the probability of traffic accidents.
Compared with traditional gesture recognition, radar-based gesture recognition has several advantages: stronger robustness to illumination conditions; low computational complexity, because moving targets are detected directly; tolerance to occlusion, owing to the penetrating capability of electromagnetic waves; and high flexibility and comfort, since no auxiliary equipment needs to be worn. Radar gesture recognition is therefore expected to become a new generation of human-machine interaction, improving the naturalness and efficiency of the interaction process.
Within the effective distance for human-computer interaction, large-amplitude gestures do not match people's operating habits, and some gestures must be accurately quantized to realize fine control instructions. The design of the gestures therefore directly influences the accuracy and real-time performance of instruction transmission and control. In ultrahigh-resolution radar gesture recognition, the first consideration is how to design fine gestures. Given the sensitivity of the ultrahigh-resolution radar to small object motions and the conventional meanings people assign to gestures in daily life, a fine gesture recognition sample library urgently needs to be established to better serve research on radar gesture recognition methods for human-computer interaction. At the same time, for the established library, the question remains how to accurately recognize each fine gesture by analyzing the collected ultrahigh-resolution radar echo signals. This is an important issue for the popularization and application of radar gesture recognition.
Disclosure of Invention
To address these problems, the control requirements placed on equipment by ultrahigh-resolution radar gesture recognition are considered, and different types of fine gestures, performed mainly with a single hand, are designed to construct a fine gesture recognition sample library. Based on the established library, the differences between the ultrahigh-resolution radar echo signals generated by different gestures are then exploited to recognize the various gestures.
The method is realized by the following design steps; the constructed fine gesture library and the recognition flow for a single gesture sample are shown in Figures 1 and 3.
Step 1: considering the human habit of using everyday gestures during human-computer interaction and the methods for recognizing ultrahigh-resolution radar gesture signals, 43 gestures are designed that match users' interaction habits and can be applied in various control scenarios, and a fine gesture library for ultrahigh-resolution radar gesture recognition is constructed. The defined gestures, grouped into six classes (also enumerated in the code sketch following the list), are:
dynamic palm class: pushing the palm forward, pulling the palm backward, sliding left, sliding right, sliding up, sliding down, circling clockwise and circling counterclockwise;
dynamic combination class: moving after a five-finger grab and moving after an index-finger click;
dynamic multi-finger class: five fingers clenching into a fist, five fingers opening, five fingers grasping, four fingers bending, four fingers unfolding, two fingers zooming in and two fingers zooming out;
dynamic single-finger class: the index finger sliding left, right, up, down, counterclockwise and clockwise, single click, double click and drawing a cross;
static number combination class: the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9 and 10;
static multi-finger combination class: thumb up, thumb down, thumb and index finger forming a ring, middle finger straightened, thumb crossed with index finger, index and middle fingers closed and pointing up, and thumb and little finger straightened.
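As an illustration of the library's structure, the following is a minimal sketch in Python of the six coarse classes and their 43 member gestures; the English labels are illustrative transcriptions of the list above, not identifiers from the patent.

```python
# Minimal sketch of the six coarse classes and their 43 member gestures.
FINE_GESTURE_LIBRARY = {
    "dynamic_palm": [
        "push forward", "pull back", "slide left", "slide right",
        "slide up", "slide down", "circle clockwise", "circle counterclockwise",
    ],
    "dynamic_combination": [
        "five-finger grab then move", "index-finger click then move",
    ],
    "dynamic_multi_finger": [
        "five-finger fist", "five fingers open", "five-finger grasp",
        "four fingers bend", "four fingers unfold",
        "two-finger zoom in", "two-finger zoom out",
    ],
    "dynamic_single_finger": [
        "index slide left", "index slide right", "index slide up",
        "index slide down", "index counterclockwise", "index clockwise",
        "single click", "double click", "draw a cross",
    ],
    "static_number": [str(n) for n in range(1, 11)],   # the numbers 1-10
    "static_multi_finger": [
        "thumb up", "thumb down", "thumb-index ring", "middle finger straight",
        "thumb crossed with index", "index and middle closed upward",
        "thumb and little finger straight",
    ],
}

# The library should contain 43 fine gestures in total.
assert sum(len(v) for v in FINE_GESTURE_LIBRARY.values()) == 43
```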
Step 2: based on the gesture recognition library established in step 1, for a single gesture sample to be recognized, a static gesture is distinguished from a dynamic gesture, according to the difference in gesture motion states, by detecting the variance of the instantaneous energy sequence in the sample data.
For a gesture data sample to be recognized, n echoes are selected in sequence to form a single data frame; Fourier transforms are performed along the fast-time and slow-time dimensions to obtain the corresponding range-Doppler spectrum, and its amplitudes are summed to obtain one instantaneous energy value, so that the successive frames yield an instantaneous energy value sequence. An instantaneous-energy-sequence variance threshold λ is set: if the variance obtained from a single gesture data sample is greater than λ, the gesture is judged to be dynamic, otherwise static.
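For illustration, a minimal sketch of this static/dynamic test, assuming the echoes are available as a complex slow-time by fast-time matrix and that the frame length n and the threshold λ (`lam`) are calibrated elsewhere on training data:

```python
import numpy as np

def is_dynamic_gesture(echoes, n, lam):
    """Static/dynamic test from step 2: split the echo matrix into frames of
    n echoes, compute a range-Doppler spectrum per frame, sum its magnitudes
    into one instantaneous energy value, and threshold the variance of the
    resulting energy sequence against lambda (`lam`).

    echoes: complex array of shape (T, N) -- T slow-time echoes, N fast-time samples.
    """
    T = echoes.shape[0]
    energies = []
    for start in range(0, T - n + 1, n):
        frame = echoes[start:start + n]        # single data frame of n echoes
        rd = np.fft.fft(frame, axis=1)         # FFT along fast time (range)
        rd = np.fft.fft(rd, axis=0)            # FFT along slow time (Doppler)
        energies.append(np.abs(rd).sum())      # instantaneous energy value
    return np.var(energies) > lam              # dynamic if variance exceeds lambda
```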
Step 3: based on the judgment result of step 2, if the gesture is a static gesture, then owing to the ultrahigh resolution of the radar, the high-resolution range profile (HRRP) of the gesture echo contains rich features related to the geometric structure of the gesture. Several features of the gesture target's HRRP observed by the radar are extracted: the number of scattering center points, the entropy, the standard deviation and the echo power, and a support vector machine (SVM) is used to classify and recognize the static gestures in the fine gesture library.
The echo obtained from radar observation of the target is sampled at N points in range to obtain the corresponding HRRP, recorded as X = [x_1, x_2, …, x_N]. The scattering center point number feature is defined as:
f_1 = Σ_{i=1}^{N} U(x_i − x̄)
wherein
x̄ = (1/N) Σ_{i=1}^{N} x_i
and U(·) is the unit step function. This feature mainly reflects the number of range cells in the HRRP that exceed the mean. The entropy is defined as:
f_2 = −Σ_{i=1}^{N} p_i ln p_i, where p_i = x_i² / Σ_{j=1}^{N} x_j²
and reflects how evenly the energy is distributed over the target's scattering centers. The standard deviation is defined as:
f_3 = sqrt( (1/N) Σ_{i=1}^{N} (x_i − x̄)² )
and reflects the degree to which the target's scattering centers deviate from the mean. The echo power is defined as:
f_4 = Σ_{i=1}^{N} x_i²
and reflects the size of the target's radar scattering cross-section.
Based on these HRRP geometric-structure features, the feature sequences corresponding to the static gestures in the fine gesture library are extracted, an SVM classifier is trained on them, and the trained model then outputs the gesture class of the sample to be recognized.
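A sketch of the step-3 feature extraction and SVM classification follows; the entropy and echo-power expressions use the standard energy-based definitions reconstructed above, and the scikit-learn SVC with an RBF kernel is an assumed stand-in, since the patent does not fix the SVM configuration:

```python
import numpy as np
from sklearn.svm import SVC

def hrrp_features(x):
    """Features f1-f4 from step 3 for one HRRP x (real amplitudes, length N)."""
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    f1 = np.sum(x > mean)                  # scattering centers: cells above the mean
    p = x**2 / np.sum(x**2)                # normalized energy distribution
    f2 = -np.sum(p * np.log(p + 1e-12))    # entropy of scattering-center energy
    f3 = x.std()                           # standard deviation about the mean
    f4 = np.sum(x**2)                      # echo power (RCS proxy)
    return np.array([f1, f2, f3, f4])

# Usage with labeled static-gesture HRRPs (X_train, y_train assumed available):
# clf = SVC(kernel="rbf").fit([hrrp_features(x) for x in X_train], y_train)
# label = clf.predict([hrrp_features(x_test)])
```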
Step 4: based on the judgment result of step 2, if the gesture is a dynamic gesture, the classification into dynamic palm and dynamic combination gestures on one hand, and dynamic multi-finger and dynamic single-finger gestures on the other, can be made from the range of radial motion variation.
Compared with the other dynamic gestures, dynamic palm and dynamic combination gestures have larger motion amplitude, strong echo reflections and larger variation in radial distance, so a radial motion variation threshold d is set. The variation range of the motion gesture in radial distance is detected from the HRRP sequence of a single gesture sample, {X_t}, t = 1, 2, …, T, where T is the number of echoes: summing along the slow-time dimension gives the variation-range curve of the motion gesture over radial distance,
S(n) = Σ_{t=1}^{T} |x_{t,n}|, n = 1, 2, …, N.
The width of the peak of this curve is compared with the radial motion variation threshold d: if it is greater than d, the gesture is judged to be a dynamic palm or dynamic combination gesture, otherwise a dynamic multi-finger or dynamic single-finger gesture.
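A sketch of this step-4 coarse test; measuring the peak width at a fraction of the curve's maximum (`level`) is an assumption, since the patent specifies only "the width of the peak":

```python
import numpy as np

def coarse_dynamic_class(hrrp_seq, d_thresh, level=0.5):
    """Step-4 test: sum the HRRP sequence along slow time to get the radial
    variation curve, measure the width of its peak, and compare with the
    radial-motion threshold d.

    hrrp_seq: array of shape (T, N) -- T echoes, N range cells.
    Returns "palm_or_combination" or "multi_or_single_finger".
    """
    curve = np.abs(hrrp_seq).sum(axis=0)     # sum along slow time -> range curve S(n)
    above = curve > level * curve.max()      # range cells inside the peak
    width = np.count_nonzero(above)          # peak width in range cells
    return "palm_or_combination" if width > d_thresh else "multi_or_single_finger"
```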
Step 5: based on the judgment result of step 4, if the gesture is a dynamic palm or dynamic combination gesture: these gestures produce time-range curves with large motion ranges and shape differences in the ultrahigh-resolution radar, and their radar echoes are strong. The following gesture sample features are therefore extracted by fitting the upper and lower boundaries of the motion curve: radial movement distance, average speed, movement area and width sequence variance, and a K-nearest-neighbor (KNN) classifier is used to classify this type of gesture. Finally, because the palm sliding and circle-drawing gestures have highly similar motion characteristics, the gesture category is further determined by detecting the angle variation during the gesture motion with the radar.
For the two-dimensional radar echo matrix of a single gesture, a fast Fourier transform (FFT) is performed along the fast-time dimension to obtain the corresponding HRRP sequence {X_t}, t = 1, 2, …, T. The change of the position of the HRRP amplitude intensity over time forms the corresponding motion curve; the motion processes and radial geometric differences of different gestures lead to differences between their motion curves. The upper and lower boundary position indices of the motion curve are fitted to form the upper boundary sequence
u = [u_1, u_2, …, u_T]
and the lower boundary sequence
d = [d_1, d_2, …, d_T]
where u_i is the maximum distance between a scattering point on the hand and the radar on the i-th echo according to the fitted curve, d_i is the shortest such distance, i = 1, 2, …, T, and u and d respectively represent the farthest and nearest range extents of the gesture estimated at the different echo instants. Four features are then extracted in turn: radial movement distance, average speed, movement area and width sequence variance:
f_5 = max(d) − min(u)
f_6 = f_5 / T
f_7 = Σ_{i=1}^{T} (u_i − d_i)
f_8 = var(d − u)
based on the gesture motion curve characteristics, a KNN classifier is adopted to train the model, and then gesture classification judgment of the sample to be recognized is carried out. And if the gesture sample to be recognized is judged to be a palm sliding gesture and a circle drawing gesture, further judgment of the movement direction is carried out by combining the movement angle change of the gesture detected by the radar.
Step 6: based on the judgment result of step 4, if the gesture is a dynamic multi-finger or dynamic single-finger gesture: the motion range of these gestures is small and the radar echo signals are weak, so the Doppler frequency-shift variation produced during the gesture motion is extracted to amplify the gesture motion state. The extracted Doppler frequency-shift features are then fed to a long short-term memory (LSTM) network to recognize and classify the gestures.
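A sketch of this step-6 branch; the per-frame Doppler-magnitude sequence is one plausible reading of the "Doppler frequency-shift variation" feature, and the LSTM layer sizes are illustrative assumptions, since the patent does not fix the network dimensions:

```python
import numpy as np
import torch
import torch.nn as nn

class GestureLSTM(nn.Module):
    """Minimal LSTM over a Doppler feature sequence plus a linear classifier."""
    def __init__(self, n_doppler_bins, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_doppler_bins, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                     # x: (batch, frames, n_doppler_bins)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])          # classify from the last time step

def doppler_sequence(echoes, n):
    """Per-frame Doppler spectrum magnitudes, summed over range, as the
    frequency-shift feature sequence fed to the LSTM (n Doppler bins per frame)."""
    frames = [echoes[s:s + n] for s in range(0, echoes.shape[0] - n + 1, n)]
    feats = [np.abs(np.fft.fft(f, axis=0)).sum(axis=1) for f in frames]
    return torch.tensor(np.stack(feats), dtype=torch.float32)
```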
The beneficial effects of the method are as follows. To better satisfy the control requirements of radar-based gesture recognition in human-computer interaction, a sample library containing 43 fine gestures is established. The ultrahigh-resolution radar gesture recognition sample library established by the invention fully considers the agreed meanings and familiarity of the gestures in people's daily activities, so that operation remains simple in human-computer interaction applications and most control requirements are met. For the established sample library, the differences and characteristics of different gestures in the ultrahigh-resolution radar echo are fully exploited: following an idea similar to a decision tree, a gesture sample to be recognized is first assigned to a coarse gesture class according to its characteristics, and the corresponding feature extraction and classifier are then applied to recognize and classify the gesture. The proposed recognition process is simple, has low computational complexity, and can be widely applied in human-computer interaction scenarios.
Drawings
FIG. 1 is an overview of a fine gesture library;
FIG. 2 is a schematic diagram of a full fine gesture;
FIG. 3 is a flow chart of single gesture sample recognition.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
In the invention, the establishment of the fine gesture library takes into account the differences and connections between dynamic and static gestures. On this basis, gestures are designed for the palm, multi-finger, single-finger and continuous combined motions of dynamic gestures, and for the number combinations and multi-finger combinations of static gestures; an overview of the fine gesture library is shown in Figure 1 and the corresponding gesture schematics in Figure 2. It should be noted in particular that during human-computer interaction all designed fine gestures should face the beam emitted by the radar antenna, and their distances can be set flexibly according to the application requirements and the gesture recognition research method.
In the design of the dynamic gestures, the motion trajectories of many gestures are similar and differ only in trajectory and direction, for example the up and down sliding of the palm, two-finger zooming, and clockwise versus counterclockwise rotation of the index finger. Gestures whose differences lie only in the direction of the motion trajectory can be defined as sub-class gestures. For the directional control requirements of human-computer interaction, improving the recognition accuracy of such sub-class gestures has positive significance.
In designing the continuous combination gestures, the problem of deciding whether a gesture is a single dynamic gesture or a continuous combination must be considered. A continuous combination gesture is essentially a temporal concatenation of single dynamic gestures, the same or different, and the direction of movement of the subsequent single gestures within the combination is also flexible. This places further demands on the accuracy of the radar gesture recognition method, and also makes it possible to realize complex control requirements in human-computer interaction.
Based on the established fine gesture sample library, and considering the differences among the radar echo signals of different gestures, the invention provides a recognition method for the different gesture types; the specific recognition flow is shown in Figure 3. For a gesture sample to be recognized, the coarse gesture category is first judged using the instantaneous-energy-sequence variance threshold and the radial-motion variation threshold, and gesture features specific to each category are then extracted for classification and recognition. Under the recognition framework of the whole gesture library, an adequate amount of radar echo data is first collected for each defined gesture to form a model training set, and the three classifiers for the different coarse categories are trained separately. Once a sufficient recognition accuracy is reached during training, each model can be used to classify single gesture samples of the corresponding coarse category.
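Wiring the pieces together, the following is a sketch of the Figure 3 decision-tree flow under the same assumptions as the earlier sketches, with the three classifiers assumed to be pre-trained on the coarse-class training sets:

```python
import numpy as np

def recognize_gesture(echoes, n, lam, d_thresh, svm_static, knn_palm, lstm_finger):
    """Decision-tree recognition flow of Fig. 3, built from the earlier sketches.
    svm_static, knn_palm and lstm_finger are assumed pre-trained models."""
    hrrp_seq = np.fft.fft(echoes, axis=1)              # fast-time FFT -> HRRP sequence
    if not is_dynamic_gesture(echoes, n, lam):         # static branch (step 3)
        mean_hrrp = np.abs(hrrp_seq).mean(axis=0)
        return svm_static.predict([hrrp_features(mean_hrrp)])[0]
    if coarse_dynamic_class(hrrp_seq, d_thresh) == "palm_or_combination":
        return knn_palm.predict([motion_curve_features(hrrp_seq)])[0]   # step 5
    feats = doppler_sequence(echoes, n).unsqueeze(0)   # step 6, batch of one
    return lstm_finger(feats).argmax(dim=1).item()
```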
In summary, the invention constructs, on the basis of an ultrahigh-resolution radar, a fine gesture recognition sample library that matches everyday gesture habits and meets various interaction control requirements, and provides a recognition flow suitable for single gesture samples, achieving a high gesture recognition rate across diverse human-computer interaction scenarios.

Claims (6)

1. A method for establishing and identifying a fine gesture library based on an ultrahigh-resolution radar is characterized by comprising the following steps:
s1, constructing a fine gesture library suitable for ultra-high resolution radar gesture recognition, wherein the fine gesture library comprises a plurality of gestures for controlling interaction, and each gesture is defined as a dynamic gesture or a static gesture according to the action characteristics of the gesture, the dynamic gesture comprises a dynamic palm type, a dynamic combination type, a dynamic multi-finger type and a dynamic single-finger type, and the static gesture comprises a static digital combination type and a static multi-finger combination type;
s2, after the ultrahigh resolution radar acquires the gesture signal, for a single gesture sample to be recognized, judging a static gesture or a dynamic gesture by detecting the instantaneous energy sequence variance in the gesture sample data to be recognized according to the difference of the gesture motion states, and if the gesture sample data is the static gesture, entering the step S3; if the gesture is a dynamic gesture, go to step S4;
s3, extracting a plurality of features of the gesture target high resolution range profile HRRP observed by the radar: scattering center point number, entropy, standard deviation and echo power, and performing classification and identification on static gestures in a fine gesture library by using a support vector machine;
s4, realizing classification and judgment of dynamic palm type, dynamic combination type, dynamic multi-finger type and dynamic single-finger type gestures through the radial motion change range, if the dynamic palm type or dynamic combination type gestures are adopted, entering the step S5, and if the dynamic multi-finger type or dynamic single-finger type gestures are adopted, entering the step S6;
s5, extracting the following gesture sample characteristics by fitting the upper and lower boundaries of the motion curve: the gesture recognition method comprises the following steps of (1) carrying out classification recognition on gestures of the type by adopting a K-nearest neighbor classifier according to the radial movement distance, the average speed, the movement area and the width sequence variance;
and S6, extracting Doppler frequency shift change generated in the gesture motion process to amplify the gesture motion state, and sending the extracted Doppler frequency shift change characteristics to a long-term and short-term memory network to recognize and classify the gestures.
2. The method for establishing and recognizing the ultra-high resolution radar fine gesture library according to claim 1, wherein the fine gesture library in step S1 comprises:
dynamic palm class: pushing the palm forward, pulling the palm backward, sliding left, sliding right, sliding up, sliding down, circling clockwise and circling counterclockwise;
dynamic combination class: moving after a five-finger grab and moving after an index-finger click;
dynamic multi-finger class: five fingers clenching into a fist, five fingers opening, five fingers grasping, four fingers bending, four fingers unfolding, two fingers zooming in and two fingers zooming out;
dynamic single-finger class: the index finger sliding left, right, up, down, counterclockwise and clockwise, single click, double click and drawing a cross;
static number combination class: the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9 and 10;
static multi-finger combination class: thumb up, thumb down, thumb and index finger forming a ring, middle finger straightened, thumb crossed with index finger, index and middle fingers closed and pointing up, and thumb and little finger straightened.
3. The method for establishing and recognizing the ultra-high resolution radar fine gesture library according to claim 2, wherein the specific method of the step S2 is as follows:
for a gesture data sample to be identified, n echoes are sequentially selected to form a single data frame, Fourier transforms are respectively carried out along the fast-time and slow-time dimensions to obtain the corresponding range-Doppler spectrum, and the amplitudes of the range-Doppler spectrum are summed to obtain an instantaneous energy value sequence; an instantaneous energy sequence variance threshold λ is set, and if the instantaneous energy sequence variance obtained from a single gesture data sample is greater than λ, the gesture is judged to be a dynamic gesture, otherwise a static gesture.
4. The method for establishing and identifying the ultra-high resolution radar fine gesture library according to claim 3, wherein the specific method for extracting the plurality of features of the gesture target high resolution range profile HRRP observed by the radar in the step S3 is as follows:
an echo obtained by radar observation of the target is sampled at N points in range to obtain the corresponding HRRP, recorded as X = [x_1, x_2, …, x_i, …, x_N], where each element x_i, 1 ≤ i ≤ N, is the value of the i-th of the N sample points, and N is a power of 2;
the scattering center point number feature is defined as:
f_1 = Σ_{i=1}^{N} U(x_i − x̄)
wherein
x̄ = (1/N) Σ_{i=1}^{N} x_i
is the mean of the entire sequence X and U(·) is the unit step function; this feature reflects the number of range cells in the HRRP that exceed the mean;
the entropy is defined as:
f_2 = −Σ_{i=1}^{N} p_i ln p_i, where p_i = x_i² / Σ_{j=1}^{N} x_j²
and reflects how evenly the energy is distributed over the target's scattering centers;
the standard deviation is defined as:
f_3 = sqrt( (1/N) Σ_{i=1}^{N} (x_i − x̄)² )
and reflects the degree to which the target's scattering centers deviate from the mean;
the echo power is defined as:
f_4 = Σ_{i=1}^{N} x_i²
and reflects the size of the target's radar scattering cross-section.
5. The method for establishing and identifying the ultra-high resolution radar fine gesture library according to claim 4, wherein the specific method of the step S4 is as follows:
setting a radial motion variation threshold d and detecting the variation range of the motion gesture in radial distance: for the HRRP sequence of a single gesture sample, {X_t}, t = 1, 2, …, T, where T is the number of echoes, summing along the slow-time dimension gives the variation-range curve of the motion gesture over radial distance,
S(n) = Σ_{t=1}^{T} |x_{t,n}|, n = 1, 2, …, N;
the width of the peak of this curve is detected and compared with the radial motion variation threshold d: if the width is greater than d, the gesture is judged to be a dynamic palm or dynamic combination gesture, otherwise it is judged to be a dynamic multi-finger or dynamic single-finger gesture.
6. The method for establishing and recognizing the ultra-high resolution radar fine gesture library according to claim 5, wherein the specific method of the step S5 is as follows:
for the two-dimensional radar echo matrix of a single gesture, performing a fast Fourier transform along the fast-time dimension to obtain the corresponding HRRP sequence {X_t}, t = 1, 2, …, T; the change of the position of the HRRP amplitude intensity over time forms the corresponding motion curve, and the motion processes and radial geometric differences of different gestures lead to the differences between motion curves; fitting the upper and lower boundary position indices of the motion curve to form the upper boundary sequence
u = [u_1, u_2, …, u_T]
and the lower boundary sequence
d = [d_1, d_2, …, d_T]
wherein u_i is the maximum distance between a scattering point on the hand and the radar on the i-th echo according to the fitted curve, d_i is the shortest such distance, i = 1, 2, …, T, and u and d respectively represent the farthest and nearest range extents of the gesture estimated at the different echo instants;
sequentially extracting the four features of radial movement distance, average speed, movement area and width sequence variance:
f_5 = max(d) − min(u)
f_6 = f_5 / T
f_7 = Σ_{i=1}^{T} (u_i − d_i)
f_8 = var(d − u)
and, based on these gesture motion curve features, training the model with a KNN classifier and then judging the gesture class of the sample to be recognized.

Priority Applications (1)

Application Number: CN202010655795.5A
Priority Date: 2020-07-09
Filing Date: 2020-07-09
Title: Method for establishing and identifying fine gesture library based on ultrahigh-resolution radar

Publications (2)

Publication Number Publication Date
CN111813224A CN111813224A (en) 2020-10-23
CN111813224B true CN111813224B (en) 2022-03-25

Family

ID=72843257

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010655795.5A Active CN111813224B (en) 2020-07-09 2020-07-09 Method for establishing and identifying fine gesture library based on ultrahigh-resolution radar

Country Status (1)

Country Link
CN (1) CN111813224B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112415510B (en) * 2020-11-05 2023-08-04 深圳大学 Dual-station radar gesture recognition method, device, system and storage medium
CN114661142B (en) * 2020-12-22 2024-08-27 华为技术有限公司 Gesture recognition method and device
CN112926454B (en) * 2021-02-26 2023-01-06 重庆长安汽车股份有限公司 Dynamic gesture recognition method
CN113208566B (en) * 2021-05-17 2023-06-23 深圳大学 Data processing method and device, electronic equipment and storage medium
CN113849068B (en) * 2021-09-28 2024-03-29 中国科学技术大学 Understanding and interaction method and system for multi-modal information fusion of gestures
CN117218716B (en) * 2023-08-10 2024-04-09 中国矿业大学 DVS-based automobile cabin gesture recognition system and method


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10168785B2 (en) * 2015-03-03 2019-01-01 Nvidia Corporation Multi-sensor based user interface
US10817065B1 (en) * 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105258444A (en) * 2015-10-08 2016-01-20 合肥美的电冰箱有限公司 Refrigerator operating control method and system
CN109032349A (en) * 2018-07-10 2018-12-18 哈尔滨工业大学 A kind of gesture identification method and system based on millimetre-wave radar
CN109583436A (en) * 2019-01-29 2019-04-05 杭州朗阳科技有限公司 A kind of gesture recognition system based on millimetre-wave radar
CN110741385A (en) * 2019-06-26 2020-01-31 Oppo广东移动通信有限公司 Gesture recognition method and device and location tracking method and device
CN110765974A (en) * 2019-10-31 2020-02-07 复旦大学 Micro-motion gesture recognition method based on millimeter wave radar and convolutional neural network

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhi Zhou et al., "Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures," Sensors, 21 December 2017, pp. 1-15. *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant