CN111368762A - Robot gesture recognition method based on improved K-means clustering algorithm - Google Patents

Robot gesture recognition method based on improved K-means clustering algorithm

Info

Publication number
CN111368762A
Authority
CN
China
Prior art keywords
robot
sample
improved
category
euclidean distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010157400.9A
Other languages
Chinese (zh)
Inventor
杨忠
宋爱国
徐宝国
吴有龙
唐玉娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jinling Institute of Technology
Original Assignee
Jinling Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jinling Institute of Technology filed Critical Jinling Institute of Technology
Priority to CN202010157400.9A
Publication of CN111368762A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G06F18/232: Non-hierarchical techniques
    • G06F18/2321: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions, with fixed number of clusters, e.g. K-means clustering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches, based on distances to training or reference patterns
    • G06F18/24147: Distances to closest patterns, e.g. nearest neighbour classification

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Probability & Statistics with Applications (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Image Analysis (AREA)

Abstract

A robot gesture recognition method based on an improved K-means clustering algorithm. Step 1, collecting hand-movement data using a glove embedded with a micro-nano optical fiber sensor, the data collected by the sensor being 6-dimensional; step 2, uploading the data acquired by the micro-nano optical fiber sensor to a robot through a WIFI module on the glove; step 3, the robot determining in advance, with an improved K-means clustering algorithm, the clustering centers corresponding to different gestures; step 4, calculating the Euclidean distances from the data currently acquired by the micro-nano sensor to the predetermined clustering centers of the different gestures; step 5, comparing each calculated Euclidean distance with the threshold of the corresponding category: if the Euclidean distance is below the category threshold, the sample is judged to belong to that category, otherwise the model is retrained; and step 6, the robot completing the corresponding action according to the judgment result, finishing a complete closed loop. The invention effectively realizes accurate recognition of various gestures by the robot through the improved K-means clustering algorithm.

Description

Robot gesture recognition method based on improved K-means clustering algorithm
Technical Field
The invention relates to the field of robot gesture recognition, in particular to a robot gesture recognition method based on an improved K-means clustering algorithm.
Background
With the continuous development of artificial intelligence and virtual reality technology, human-computer interaction systems have become a research hotspot. As an emerging mode of human-computer interaction, gesture recognition is now valued by many researchers, has produced a series of effective results, and is widely applied in devices such as intelligent robots and intelligent driving systems. Put simply, gesture recognition means that a machine, with the assistance of a vision or sensor acquisition system, understands the ideas a person wants to express; that is, the interaction is completed in a non-contact manner, so that the robot carries out the corresponding action and intelligence is realized in a true sense.
For the problem of robot gesture recognition, one related domestic patent is "A cooperative robot gesture recognition method and device based on depth vision" (201910176271.5): a gesture template set is obtained in advance, and several depth maps of the gesture to be recognized are obtained at the same time; for each gesture template, the distance between the gesture to be recognized and the template is computed, the template with the minimum distance is taken as the recognition result, and the cooperative robot is then controlled according to the control parameters corresponding to that result. Another, "A gesture recognition method based on an intelligent robot" (201910118356.8), calls the camera carried by the robot to obtain a gesture image and establishes a gesture template; the gesture is segmented by skin-color detection and maximum between-class variance; the segmented gesture image is denoised with a median filtering algorithm and the gesture edge contour is extracted; a recognition result is then obtained by Euclidean-distance template matching against the gesture template and the edge contour. Both of these patents recognize gestures from pictures, and the excessive dimensionality of picture data increases the difficulty of training the model on the one hand, and the time the model needs for judgment in actual application on the other.
Disclosure of Invention
In order to solve these problems, the invention provides a robot gesture recognition method based on an improved K-means clustering algorithm, built on a micro-nano optical fiber sensor and the K-means clustering algorithm. First, data corresponding to the different gestures are collected with the micro-nano optical fiber sensor; then the clustering centers and category thresholds corresponding to the different gestures are determined in turn with the improved K-means clustering algorithm; meanwhile, the model supports online updating and optimization, which greatly improves its generalization; finally, the method has been successfully applied in practice, and the robot can accurately recognize different gestures. To this end, the invention provides a robot gesture recognition method based on an improved K-means clustering algorithm, which comprises the following specific steps:
step 1, collecting data of hand movement by using a glove embedded with a micro-nano optical fiber sensor, wherein the dimension of the data collected by the sensor is 6 dimensions;
step 2, uploading data acquired by the micro-nano optical fiber sensor to a robot through a WIFI module on the glove;
step 3, the robot determines clustering centers corresponding to different gestures in advance by combining an improved K-means clustering algorithm;
step 4, calculating Euclidean distances from data acquired by the current micro-nano sensor to predetermined clustering centers of different gestures;
step 5, comparing each calculated Euclidean distance with the threshold value of the corresponding category: if the Euclidean distance is below the category threshold, the sample is judged to belong to that category; otherwise, the model is retrained;
and 6, completing corresponding actions by the robot according to the judgment result, and finishing a complete closed loop.
Further, the specific steps in step 3 of predetermining the clustering centers corresponding to different gestures with the improved K-means clustering algorithm are as follows:
step 3.1, arbitrarily selecting one sample point from all the sample points as the initial clustering center c_1 of the first category;
step 3.2, for the entire training sample set X = {x_j | j = 1, 2, ..., n}, calculating the distance from each sample x_j to the existing clustering centers, and taking the sample corresponding to the maximum distance as a new clustering center;
step 3.3, repeating step 3.2 until k clustering centers c_i (1 ≤ i ≤ k) are determined;
step 3.4, for the whole training sample set X = {x_j | j = 1, 2, ..., n}, calculating the Euclidean distance from each sample point x_j to the k cluster centers c_i (1 ≤ i ≤ k) determined in step 3.3; for an s-dimensional sample, the Euclidean distance from x_j to the i-th class center c_i is:

$$d(x_j, c_i) = \sqrt{\sum_{t=1}^{s} (x_{jt} - c_{it})^2}$$
step 3.5, classifying each sample into the category whose center is nearest in Euclidean distance, and traversing the whole sample space to complete the construction of the k class clusters;
step 3.6, for each class cluster, taking the mean vector of all sample points in the cluster as the new cluster center, i.e., the update criterion of the class cluster is:

$$c_i = \frac{1}{m_i} \sum_{x_j \in C_i} x_j$$

where c_i is the updated cluster center, m_i denotes the total number of samples in the i-th class cluster, and the summation is the dimension-wise sum of all sample vectors in the cluster;
step 3.7, repeating steps 3.4-3.6 until the squared error function converges or the number of iterations reaches the set limit, where the expression of the squared error function is:

$$E = \sum_{i=1}^{k} \sum_{x_j \in C_i} \left\| x_j - c_i \right\|^2$$
Further, if in step 5 the Euclidean distance from the data acquired in real time to every cluster center is greater than the corresponding category threshold, the model is retrained, specifically as follows:
the sample acquired in real time is labeled using prior knowledge, and the data are then fed back into the trained model to update and correct it: the clustering centers of all categories and the corresponding category thresholds are updated. Because the model supports this updating and optimization, its generalization is greatly improved.
The robot gesture recognition method based on the improved K-means clustering algorithm has the following beneficial technical effects:
1. The invention uses a six-dimensional micro-nano optical fiber sensor to acquire data for the current gesture in real time. Compared with the traditional picture-based data acquisition for gesture recognition, the sample dimension is lower, which shortens the training time of the model and speeds up its judgment while still ensuring high precision;
2. The improved K-means clustering algorithm is used for cluster analysis of the different gestures. Compared with the traditional K-means clustering method, it is better at avoiding poor initial clustering centers, and can accurately classify the different gestures and determine the category thresholds;
3. The robot gesture recognition model of the invention supports optimization and updating; that is, when the trained model cannot classify data collected in real time, those data are taken as training data and substituted into the model for retraining, updating the class center and class threshold corresponding to each gesture category and greatly improving the generalization of the model.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of identifying the clustering centers and category thresholds of different gestures using the improved K-means clustering algorithm of the present invention.
Detailed Description
The invention is described in further detail below with reference to the following detailed description and accompanying drawings:
the invention provides a robot gesture recognition method based on an improved K-means clustering algorithm, and aims to simply and efficiently realize accurate recognition of different gestures by a robot.
FIG. 1 is a flow chart of the present invention. The steps of the present invention will be described in detail with reference to the flow chart.
Step 1, collecting data of hand movement by using a glove embedded with a micro-nano optical fiber sensor, wherein the dimension of the data collected by the sensor is 6 dimensions;
step 2, uploading data acquired by the micro-nano optical fiber sensor to a robot through a WIFI module on the glove;
step 3, the robot determines clustering centers corresponding to different gestures in advance by combining an improved K-means clustering algorithm;
step 3.1, arbitrarily selecting one sample point from all the sample points as the initial clustering center c_1 of the first category;
step 3.2, for the entire training sample set X = {x_j | j = 1, 2, ..., n}, calculating the distance from each sample x_j to the existing clustering centers, and taking the sample corresponding to the maximum distance as a new clustering center;
step 3.3, repeating step 3.2 until k clustering centers c_i (1 ≤ i ≤ k) are determined;
step 3.4, for the whole training sample set X = {x_j | j = 1, 2, ..., n}, calculating the Euclidean distance from each sample point x_j to the k cluster centers c_i (1 ≤ i ≤ k) determined in step 3.3; for an s-dimensional sample, the Euclidean distance from x_j to the i-th class center c_i is:

$$d(x_j, c_i) = \sqrt{\sum_{t=1}^{s} (x_{jt} - c_{it})^2}$$
step 3.5, classifying each sample into the category whose center is nearest in Euclidean distance, and traversing the whole sample space to complete the construction of the k class clusters;
step 3.6, for each class cluster, taking the mean vector of all sample points in the cluster as the new cluster center, i.e., the update criterion of the class cluster is:

$$c_i = \frac{1}{m_i} \sum_{x_j \in C_i} x_j$$

where c_i is the updated cluster center, m_i denotes the total number of samples in the i-th class cluster, and the summation is the dimension-wise sum of all sample vectors in the cluster;
step 3.7, repeating steps 3.4-3.6 until the squared error function converges or the number of iterations reaches the set limit, where the expression of the squared error function is:

$$E = \sum_{i=1}^{k} \sum_{x_j \in C_i} \left\| x_j - c_i \right\|^2$$
step 4, calculating Euclidean distances from data acquired by the current micro-nano sensor to predetermined clustering centers of different gestures;
and 5, comparing each calculated Euclidean distance with a threshold value of a corresponding category, judging the Euclidean distance as the category if the Euclidean distance is lower than the threshold value of the category, and retraining the model if not. The specific description of the retraining model is as follows: labeling a sample acquired in real time through prior knowledge, and then bringing the data into a trained model to update and correct the model: and updating the clustering centers of all the categories and the corresponding category threshold values. The model supports updating optimization, and the generalization of the model is greatly improved.
And 6, completing corresponding actions by the robot according to the judgment result, and finishing a complete closed loop.
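For completeness, a hedged end-to-end loop through steps 1-6, reusing improved_kmeans and classify_or_flag from the sketches above; read_glove_sample and robot_do are purely hypothetical stand-ins for the glove's WIFI feed and the robot's action interface, neither of which the patent specifies:

```python
import numpy as np

def read_glove_sample():
    """Hypothetical stand-in for one 6-D micro-nano fiber reading over WIFI (steps 1-2)."""
    return np.random.default_rng().normal(size=6)

def robot_do(category):
    """Hypothetical stand-in for the robot's action interface (step 6)."""
    print(f"executing action for gesture category {category}")

def control_loop(X_train, k, n_rounds=100):
    centers, thresholds, _ = improved_kmeans(X_train, k)          # step 3 (offline)
    for _ in range(n_rounds):
        sample = read_glove_sample()                               # steps 1-2
        category = classify_or_flag(sample, centers, thresholds)  # steps 4-5
        if category is not None:
            robot_do(category)                                     # step 6: close the loop
        else:
            # step 5 fallback: treat the flagged sample as new training data
            # (the prior-knowledge labeling is omitted in this sketch) and retrain
            X_train = np.vstack([X_train, sample])
            centers, thresholds, _ = improved_kmeans(X_train, k)
```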
FIG. 2 is a schematic diagram of identifying the cluster centers and category thresholds of the different classes using the improved K-means clustering algorithm. As can be seen from the figure, the improved K-means clustering algorithm simply and effectively determines the clustering centers and boundaries of the different classes, from which the corresponding category thresholds are obtained.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, but any modifications or equivalent variations made according to the technical spirit of the present invention are within the scope of the present invention as claimed.

Claims (3)

1. A robot gesture recognition method based on an improved K-means clustering algorithm, comprising the following specific steps:
step 1, collecting data of hand movement by using a glove embedded with a micro-nano optical fiber sensor, wherein the dimension of the data collected by the sensor is 6 dimensions;
step 2, uploading data acquired by the micro-nano optical fiber sensor to a robot through a WIFI module on the glove;
step 3, the robot determines clustering centers corresponding to different gestures in advance by combining an improved K-means clustering algorithm;
step 4, calculating Euclidean distances from data acquired by the current micro-nano sensor to predetermined clustering centers of different gestures;
step 5, comparing each calculated Euclidean distance with the threshold value of the corresponding category: if the Euclidean distance is below the category threshold, the sample is judged to belong to that category; otherwise, the model is retrained;
and 6, completing corresponding actions by the robot according to the judgment result, and finishing a complete closed loop.
2. The robot gesture recognition method based on the improved K-means clustering algorithm of claim 1, characterized in that: the specific steps of utilizing the improved K-means clustering algorithm to predetermine the clustering centers corresponding to different gestures in the step 3 are as follows:
step 3.1, arbitrarily selecting one sample point from all the sample points as the initial clustering center c_1 of the first category;
step 3.2, for the entire training sample set X = {x_j | j = 1, 2, ..., n}, calculating the distance from each sample x_j to the existing clustering centers, and taking the sample corresponding to the maximum distance as a new clustering center;
step 3.3, repeating step 3.2 until k clustering centers c_i (1 ≤ i ≤ k) are determined;
step 3.4, for the whole training sample set X = {x_j | j = 1, 2, ..., n}, calculating the Euclidean distance from each sample point x_j to the k cluster centers c_i (1 ≤ i ≤ k) determined in step 3.3; for an s-dimensional sample, the Euclidean distance from x_j to the i-th class center c_i is:

$$d(x_j, c_i) = \sqrt{\sum_{t=1}^{s} (x_{jt} - c_{it})^2}$$
step 3.5, classifying each sample into the category whose center is nearest in Euclidean distance, and traversing the whole sample space to complete the construction of the k class clusters;
step 3.6, for each class cluster, taking the mean vector of all sample points in the cluster as the new cluster center, i.e., the update criterion of the class cluster is:

$$c_i = \frac{1}{m_i} \sum_{x_j \in C_i} x_j$$

where c_i is the updated cluster center, m_i denotes the total number of samples in the i-th class cluster, and the summation is the dimension-wise sum of all sample vectors in the cluster;
step 3.7, repeating steps 3.4-3.6 until the squared error function converges or the number of iterations reaches the set limit, where the expression of the squared error function is:

$$E = \sum_{i=1}^{k} \sum_{x_j \in C_i} \left\| x_j - c_i \right\|^2$$
3. The robot gesture recognition method based on the improved K-means clustering algorithm of claim 1, characterized in that: in step 5, if the Euclidean distance from the data acquired in real time to every cluster center is greater than the corresponding category threshold, the model is retrained, specifically as follows:
the sample acquired in real time is labeled using prior knowledge, and the data are then fed back into the trained model to update and correct it: the clustering centers of all categories and the corresponding category thresholds are updated. Because the model supports this updating and optimization, its generalization is greatly improved.
CN202010157400.9A 2020-03-09 2020-03-09 Robot gesture recognition method based on improved K-means clustering algorithm Pending CN111368762A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010157400.9A CN111368762A (en) 2020-03-09 2020-03-09 Robot gesture recognition method based on improved K-means clustering algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010157400.9A CN111368762A (en) 2020-03-09 2020-03-09 Robot gesture recognition method based on improved K-means clustering algorithm

Publications (1)

Publication Number Publication Date
CN111368762A 2020-07-03

Family

ID=71206751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010157400.9A Pending CN111368762A (en) 2020-03-09 2020-03-09 Robot gesture recognition method based on improved K-means clustering algorithm

Country Status (1)

Country Link
CN (1) CN111368762A (en)



Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105571045A (en) * 2014-10-10 2016-05-11 青岛海尔空调电子有限公司 Somatosensory identification method, apparatus and air conditioner controller
CN105956604A (en) * 2016-04-20 2016-09-21 广东顺德中山大学卡内基梅隆大学国际联合研究院 Action identification method based on two layers of space-time neighborhood characteristics
CN106845348A (en) * 2016-12-20 2017-06-13 南京信息工程大学 A kind of gesture identification method based on arm surface electromyographic signal
CN107014411A (en) * 2017-04-05 2017-08-04 浙江大学 A kind of flexible micro-nano fiber angle sensor chip and sensor and preparation method
CN109032337A (en) * 2018-06-28 2018-12-18 济南大学 A kind of KEM Gesture Recognition Algorithm based on data glove
CN108983973A (en) * 2018-07-03 2018-12-11 东南大学 A kind of humanoid dexterous myoelectric prosthetic hand control method based on gesture identification
CN109547136A (en) * 2019-01-28 2019-03-29 北京邮电大学 Distributed collaborative frequency spectrum sensing method based on minimax apart from sub-clustering
CN110147754A (en) * 2019-05-17 2019-08-20 金陵科技学院 A kind of dynamic gesture identification method based on VR technology
CN110348323A (en) * 2019-06-19 2019-10-18 广东工业大学 A kind of wearable device gesture identification method based on Neural Network Optimization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
王道东: "Research on Gesture Recognition Method Based on Human-Computer Interaction System", China Master's Theses Full-text Database (Information Science and Technology) *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112016621A (en) * 2020-08-28 2020-12-01 上海第一财经数据科技有限公司 Training method of classification model, color classification method and electronic equipment
CN112035663A (en) * 2020-08-28 2020-12-04 京东数字科技控股股份有限公司 Cluster analysis method, device, equipment and storage medium
CN112016621B (en) * 2020-08-28 2023-11-24 上海应帆数字科技有限公司 Training method of classification model, color classification method and electronic equipment
CN112035663B (en) * 2020-08-28 2024-05-17 京东科技控股股份有限公司 Cluster analysis method, device, equipment and storage medium
CN112446296A (en) * 2020-10-30 2021-03-05 杭州易现先进科技有限公司 Gesture recognition method and device, electronic device and storage medium
CN113608074A (en) * 2021-06-17 2021-11-05 国网浙江省电力有限公司营销服务中心 Automatic online monitoring method and system for multi-epitope voltage withstand test device
CN113608074B (en) * 2021-06-17 2024-05-28 国网浙江省电力有限公司营销服务中心 Automatic online monitoring method and system for multi-epitope withstand voltage test device
CN113516063A (en) * 2021-06-29 2021-10-19 北京精密机电控制设备研究所 Motion mode identification method based on K-Means and gait cycle similarity

Similar Documents

Publication Publication Date Title
CN111368762A (en) Robot gesture recognition method based on improved K-means clustering algorithm
CN107168527B (en) The first visual angle gesture identification and exchange method based on region convolutional neural networks
CN106682598B (en) Multi-pose face feature point detection method based on cascade regression
CN107150347B (en) Robot perception and understanding method based on man-machine cooperation
Xu et al. Online dynamic gesture recognition for human robot interaction
Li et al. Robust visual tracking based on convolutional features with illumination and occlusion handing
CN110232308B (en) Robot-following gesture track recognition method based on hand speed and track distribution
CN103886619B (en) A kind of method for tracking target merging multiple dimensioned super-pixel
CN110781829A (en) Light-weight deep learning intelligent business hall face recognition method
Pandey et al. Hand gesture recognition for sign language recognition: A review
CN108846359A (en) It is a kind of to divide the gesture identification method blended with machine learning algorithm and its application based on skin-coloured regions
CN106778501A (en) Video human face ONLINE RECOGNITION method based on compression tracking with IHDR incremental learnings
CN110728694A (en) Long-term visual target tracking method based on continuous learning
CN109508686B (en) Human behavior recognition method based on hierarchical feature subspace learning
CN106599785A (en) Method and device for building human body 3D feature identity information database
CN106611158A (en) Method and equipment for obtaining human body 3D characteristic information
CN110717385A (en) Dynamic gesture recognition method
KR20120089948A (en) Real-time gesture recognition using mhi shape information
CN105701486A (en) Method for realizing human face information analysis and extraction in video camera
CN109753922A (en) Anthropomorphic robot expression recognition method based on dense convolutional neural networks
CN109409231A (en) Multiple features fusion sign Language Recognition Method based on adaptive hidden Markov
Ikram et al. Real time hand gesture recognition using leap motion controller based on CNN-SVM architechture
CN108257148A (en) The target of special object suggests window generation method and its application in target following
CN112084898A (en) Assembling operation action recognition method based on static and dynamic separation
CN115035592A (en) Gesture recognition method and device for online education of artworks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2020-07-03