CN101807114A - Natural interactive method based on three-dimensional gestures - Google Patents

Natural interactive method based on three-dimensional gestures

Info

Publication number
CN101807114A
CN101807114A (application CN201010139526A)
Authority
CN
China
Prior art keywords
point
finger tip
dimensional
profile
fingertip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201010139526
Other languages
Chinese (zh)
Other versions
CN101807114B (en)
Inventor
潘志庚
郭康德
邵兴旦
李扬
李光霞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201010139526XA
Publication of CN101807114A
Application granted
Publication of CN101807114B
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a natural interaction method based on three-dimensional gestures. The method uses computer vision to obtain local features of the hand through foreground segmentation and fingertip detection; these local features include the fingertip positions, the palm contour, and the palm-center position. Using stereo vision, hand features such as the fingertip positions and the palm-center position are reconstructed in three-dimensional space. The fingertip positions, palm-center position, and related features in three-dimensional space are parameterized, and a three-dimensional interaction model based on points, lines, and surfaces is defined, thereby realizing various three-dimensional gestures such as fingertip clicking, fingertip pinching, palm flipping, and fingertip pointing. The method requires only two ordinary webcams to meet the demands of real-time human-computer interaction.

Description

A natural interaction method based on three-dimensional gestures
Technical field
The present invention relates to computer vision and human-computer interaction technology, and in particular to a human-computer interaction method based on three-dimensional gestures.
Background technology
Traditional two-dimensional human-computer interaction (based on the mouse, keyboard, gamepad, and windowed interfaces) has matured in both its software tools and its interaction modes. However, as the demands placed on human-computer interaction keep growing, including users' (called players in games) increasing pursuit of immersive experience in virtual scenes, the limitations of the traditional two-dimensional interaction mode become more and more apparent, mainly as follows:
In terms of information representation, it cannot express complex multi-dimensional relationships in a natural way. To express a multi-dimensional relationship, a multi-window environment has to be used to present the application information, and the division into multiple windows forces the user to spend considerable cognitive effort to build a complete and consistent mental model.
In terms of interaction mode, for applications in some fields (such as roaming in a virtual world), traditional interaction cannot express the user's intent naturally and reasonably. When three-dimensional interaction with an object is needed, it can only be achieved by combining different input modalities. This approach is unnatural, increases the user's interaction difficulty, and adds to the burden of integrating interaction tasks.
Compared with these limitations, three-dimensional human-computer interaction has inherent advantages and can naturally satisfy the needs of three-dimensional interaction, mainly as follows:
In terms of information representation, it can express complex multi-dimensional relationships in a natural way. A three-dimensional representation of application information is intuitive and close to objects in the real world, so such a representation is easier for people to perceive and to raise to rational knowledge.
In terms of interaction mode, three-dimensional interaction simulates the way people interact with objects in the real world and lets a real person or object interact directly with virtual three-dimensional objects. This interaction mode expresses interaction semantics naturally and clearly, and the fusion of the real and the virtual produces a mixed interactive world that makes human-computer interaction more engaging.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art and to provide a natural interaction method based on three-dimensional gestures.
The object of the invention is achieved through the following technical solution: a natural interaction method based on three-dimensional gestures comprises the following steps:
(1) Input video images from two cameras and apply an online-trained skin-color detection algorithm to the images to obtain the foreground image of the hand.
(2) Apply a fingertip detection method to the obtained hand foreground image to detect the fingertip positions.
(3) Reconstruct the three-dimensional fingertip positions and define three-dimensional gesture interaction semantics from them.
The beneficial effects of the invention are as follows: the natural interaction method based on three-dimensional gestures uses computer vision to obtain local features of the hand through foreground segmentation and fingertip detection; these local features include the fingertip positions, the palm contour, and the palm-center position. Using stereo vision, hand features such as the fingertip positions and the palm-center position are reconstructed in three-dimensional space. These three-dimensional features are parameterized, and a set of three-dimensional interaction models based on points, lines, and surfaces is defined; with this set of models, various three-dimensional gestures such as fingertip clicking, fingertip pinching, palm flipping, and fingertip pointing are realized. The invention requires only two ordinary webcams to meet the demands of real-time human-computer interaction.
Description of drawings
Fig. 1 is the system architecture diagram of the natural interaction method based on three-dimensional gestures;
Fig. 2 is a schematic diagram of the palm contour;
Fig. 3 is a schematic diagram of the computation of the K vector.
Embodiment
The natural interaction method based on three-dimensional gestures of the present invention interacts naturally with virtual objects through three-dimensional gestures in a virtual-reality or augmented-reality environment. It comprises the following steps:
One. Input video images from two cameras and apply an online-trained skin-color detection algorithm to the images to obtain the foreground image of the hand. The online-trained skin-color detection algorithm exploits the clustering property of human skin color in the YCbCr color space and segments the hand foreground from the video image by thresholding a particular color range. During skin-color detection, an online training method that learns the current skin color in real time is used to reduce the influence of illumination changes on the detection.
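As an illustration of this step, the following Python/OpenCV sketch (not the patent's exact algorithm) segments skin-colored pixels in OpenCV's YCrCb representation of the YCbCr space and adapts the color bounds online from the pixels it currently classifies as skin; the initial Cr/Cb ranges, the 2.5-sigma bounds, and the update rate are illustrative assumptions.

```python
import cv2
import numpy as np

# Initial skin-color bounds in (Y, Cr, Cb) order; these values are illustrative assumptions.
lower = np.array([0, 133, 77], dtype=np.uint8)
upper = np.array([255, 173, 127], dtype=np.uint8)

def segment_hand(frame_bgr, lower, upper):
    """Return a binary mask of skin-colored (hand) pixels in the YCrCb space."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Morphological cleanup to suppress small noise in the foreground mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def update_skin_model(ycrcb_pixels, lower, upper, rate=0.05):
    """Online update: nudge the bounds toward the statistics of the currently
    detected skin pixels, reducing sensitivity to illumination changes.
    The running-average update rule is an assumption for illustration."""
    if len(ycrcb_pixels) == 0:
        return lower, upper
    mean = ycrcb_pixels.mean(axis=0)
    std = ycrcb_pixels.std(axis=0)
    new_lower = np.clip(mean - 2.5 * std, 0, 255)
    new_upper = np.clip(mean + 2.5 * std, 0, 255)
    lower = ((1 - rate) * lower + rate * new_lower).astype(np.uint8)
    upper = ((1 - rate) * upper + rate * new_upper).astype(np.uint8)
    return lower, upper
```

A caller would typically pass ycrcb[mask > 0] from the previous frame as ycrcb_pixels so that the skin model tracks gradual illumination changes.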
Two. Apply a fingertip detection method to the obtained hand foreground image to detect the fingertip positions. The fingertip detection method fuses single-fingertip detection and multi-fingertip detection. Even under illumination changes it can still locate the fingertips accurately, so it has strong stability and adaptability to the environment.
The fingertip detection method proceeds as follows (an illustrative code sketch is given after the K vector definition below):
1. Extract the contours of the hand foreground image.
2. Find the longest contour; this contour is taken as the hand contour region, so that the fingertip points can still be found even when the foreground segmentation is poor.
3. If the contour length is less than a threshold (related to the distance between the hand and the camera; set here to 100), the foreground segmentation is considered to have failed and is performed again; otherwise proceed to the next step.
4. Apply polygonal approximation to the contour obtained above, which removes some noise points. Compute the convexity defects of the contour (the concave parts of the hand contour, as shown in Fig. 2). If the number of defects is 0, go to step 5 and continue; otherwise mark all convex-defect locations and obtain their start points, end points, and depth points; the start points and end points are candidate fingertip points. Then examine the angle formed at the depth point between the start point and the end point; if it is less than 120 degrees, the start point and the end point are taken as fingertip positions.
5. Compute the K vector value of each contour point and obtain the center of the contour region. If the fingertip is being sought for the first time, sort the points by K vector value in descending order, take the first N (set to 10) K-vector extreme points, compare the distance from each extreme point to the center, take the point with the largest distance as the fingertip point, and record its position. Otherwise, after obtaining the first N K-vector extreme points, compare the distance between each of the N extreme points and the previously recorded fingertip point: if it is less than a threshold (set to 5), that point is the fingertip point; otherwise compare the distances from the N extreme points to the center and take the point with the largest distance as the fingertip point. Finally record the fingertip position.
The K vector is defined as follows: for each pixel v on the contour, take v as the starting point, let v1 be the point at contour distance K from v in the clockwise direction along the contour, and let v2 be the point at contour distance K from v in the counterclockwise direction along the contour; then the K vector of v is:
\frac{v_1 - v}{\lVert v_1 - v \rVert} \times \frac{v_2 - v}{\lVert v_2 - v \rVert}
Fig. 3 illustrates the computation of the K vector.
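The following Python/OpenCV sketch illustrates steps 1-5 under stated assumptions: cv2.findContours, cv2.approxPolyDP and cv2.convexityDefects stand in for the contour, polygonal-approximation and convexity-defect operations described above; the contour-length threshold (100) and the 120-degree angle test come from the text, while the K step size (16) and the polygon tolerance (3) are illustrative choices.

```python
import cv2
import numpy as np

def k_vector(contour, i, K=16):
    """K vector of contour point i: cross product of the unit vectors pointing to the
    contour points K steps away in each direction (K = 16 is an assumed step size)."""
    n = len(contour)
    v = contour[i][0].astype(float)
    v1 = contour[(i + K) % n][0].astype(float)
    v2 = contour[(i - K) % n][0].astype(float)
    a = (v1 - v) / (np.linalg.norm(v1 - v) + 1e-9)
    b = (v2 - v) / (np.linalg.norm(v2 - v) + 1e-9)
    return float(np.cross(a, b))

def detect_fingertips(mask, min_len=100, max_angle_deg=120):
    """Steps 1-4: extract the longest contour, reject contours shorter than min_len,
    approximate the contour with a polygon, and use convexity defects to collect
    fingertip candidates. Uses the OpenCV 4 return signature of cv2.findContours."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return [], None, None
    contour = max(contours, key=lambda c: cv2.arcLength(c, True))   # step 2
    if cv2.arcLength(contour, True) < min_len:                      # step 3
        return [], None, None
    approx = cv2.approxPolyDP(contour, 3, True)                     # step 4
    tips = []
    hull = cv2.convexHull(approx, returnPoints=False)
    if len(hull) > 3:
        defects = cv2.convexityDefects(approx, hull)
        if defects is not None:
            for s, e, f, _ in defects[:, 0]:
                start, end, far = approx[s][0], approx[e][0], approx[f][0]
                a, b = start - far, end - far
                cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
                angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
                if angle < max_angle_deg:       # narrow concavity -> fingertip pair
                    tips.extend([tuple(start), tuple(end)])
    M = cv2.moments(contour)
    center = np.array([M["m10"] / (M["m00"] + 1e-9), M["m01"] / (M["m00"] + 1e-9)])
    return tips, contour, center

def fingertip_by_k_vector(contour, center, N=10):
    """Step 5 (first detection): take the N points with the largest K vector values
    and choose the one farthest from the contour center as the fingertip."""
    order = sorted(range(len(contour)), key=lambda i: k_vector(contour, i), reverse=True)
    best = max(order[:N], key=lambda i: np.linalg.norm(contour[i][0] - center))
    return tuple(contour[best][0])
```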
Three. Reconstruct the three-dimensional fingertip positions and define three-dimensional gesture interaction semantics from them.
1. First calibrate the extrinsic parameters of the cameras, thereby obtaining the OpenGL model-view matrix.
2. Using the obtained model-view matrices and the fingertip positions produced by fingertip detection, apply the three-dimensional reconstruction algorithm from computer vision to reconstruct the three-dimensional position of each fingertip point in the marker-based world coordinate system (see the sketch after this list).
3. From the reconstructed three-dimensional fingertip points, define the three-dimensional interaction semantics and thereby realize three-dimensional interaction. The three-dimensional interaction semantics are defined from the fingertip positions and the palm contour. The implemented semantics include: fingertip clicking, two-finger pinching, fingertip pointing, three-dimensional fingertip velocity parameterization, palm flipping, and two-hand fingertip-distance control. Based on these interaction semantics, a set of point-line-surface three-dimensional gesture interaction models based on hand features is established.
4. Through the defined three-dimensional gesture interaction semantics, various three-dimensional applications can be realized, for example directly controlling a virtual character in a game, thereby achieving natural and harmonious human-computer interaction.
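As a minimal sketch of the reconstruction and of one interaction semantic, the following Python/OpenCV code triangulates a fingertip seen in both cameras and tests a two-finger pinch. It assumes the calibration of step 1 has already produced the 3x4 projection matrices P1 and P2 of the two cameras in the marker-based world coordinate system; the pinch distance threshold is an illustrative assumption.

```python
import cv2
import numpy as np

def reconstruct_fingertip(P1, P2, tip_cam1, tip_cam2):
    """Triangulate one fingertip from its pixel positions in the two camera images.
    P1 and P2 are the 3x4 projection matrices of the calibrated cameras."""
    pts1 = np.array(tip_cam1, dtype=float).reshape(2, 1)
    pts2 = np.array(tip_cam2, dtype=float).reshape(2, 1)
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # homogeneous 4x1 result
    X = (X_h[:3] / X_h[3]).ravel()                    # world coordinates (x, y, z)
    return X

def is_pinch(tip_a, tip_b, threshold=20.0):
    """Example interaction semantic: a two-finger pinch is reported when the 3D
    distance between two reconstructed fingertips falls below a threshold
    (20 world units is an illustrative assumption)."""
    return np.linalg.norm(np.asarray(tip_a) - np.asarray(tip_b)) < threshold
```

In a complete system the reconstructed points would then be expressed in the same coordinate frame as the OpenGL model-view matrix obtained in step 1, so that the fingertips and the virtual objects can interact directly.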

Claims (3)

1. A natural interaction method based on three-dimensional gestures, characterized in that it comprises the following steps:
(1) inputting video images from two cameras and applying an online-trained skin-color detection algorithm to the images to obtain the foreground image of the hand;
(2) applying a fingertip detection method to the obtained hand foreground image to detect the fingertip positions;
(3) reconstructing the three-dimensional fingertip positions and defining three-dimensional gesture interaction semantics from them.
2. The natural interaction method based on three-dimensional gestures according to claim 1, characterized in that said step (2) is specifically as follows:
(A) extracting the contours of the hand foreground image;
(B) finding the longest contour, this contour being the hand contour region; if the contour length is less than a threshold, the foreground segmentation is considered to have failed and is performed again;
(C) applying polygonal approximation to the obtained contour and computing the number of convexity defects of the contour; if the number is 0, going to (D) and continuing; otherwise marking all convex-defect locations and obtaining their start points, end points and depth points, the start points and end points being candidate fingertip points; then examining the angle formed at the depth point between the start point and the end point, and if it is less than 120 degrees, taking the start point and the end point as fingertip positions;
(D) computing the K vector value of each contour point and obtaining the center of the contour region; if the fingertip is sought for the first time, sorting the points by K vector value in descending order, taking the first N K-vector extreme points, comparing the distance from each extreme point to the center, taking the point with the largest distance as the fingertip point, and recording its position; otherwise, after obtaining the first N K-vector extreme points, comparing the distance between each of the N extreme points and the previously recorded fingertip point, and if it is less than a threshold, taking that point as the fingertip point; otherwise comparing the distances from the N extreme points to the center and taking the point with the largest distance as the fingertip point; finally recording the fingertip position.
The K vector is defined as follows: for each pixel v on the contour, taking v as the starting point, the point at contour distance K from v in the clockwise direction along the contour being v1, and the point at contour distance K from v in the counterclockwise direction along the contour being v2, the K vector of v is:
\frac{v_1 - v}{\lVert v_1 - v \rVert} \times \frac{v_2 - v}{\lVert v_2 - v \rVert}.
3. The natural interaction method based on three-dimensional gestures according to claim 1, characterized in that said step (3) is specifically as follows:
(a) first calibrating the extrinsic parameters of the cameras, thereby obtaining the OpenGL model-view matrix;
(b) using the obtained model-view matrices and the fingertip positions produced by fingertip detection, applying the three-dimensional reconstruction algorithm from computer vision to reconstruct the three-dimensional position of each fingertip point in the marker-based world coordinate system;
(c) from the reconstructed three-dimensional fingertip points, defining the three-dimensional interaction semantics, thereby realizing three-dimensional interaction;
(d) through the defined three-dimensional gesture interaction semantics, realizing three-dimensional applications.
CN201010139526XA 2010-04-02 2010-04-02 Natural interactive method based on three-dimensional gestures Active CN101807114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201010139526XA CN101807114B (en) 2010-04-02 2010-04-02 Natural interactive method based on three-dimensional gestures

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201010139526XA CN101807114B (en) 2010-04-02 2010-04-02 Natural interactive method based on three-dimensional gestures

Publications (2)

Publication Number Publication Date
CN101807114A true CN101807114A (en) 2010-08-18
CN101807114B CN101807114B (en) 2011-12-07

Family

ID=42608927

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201010139526XA Active CN101807114B (en) 2010-04-02 2010-04-02 Natural interactive method based on three-dimensional gestures

Country Status (1)

Country Link
CN (1) CN101807114B (en)

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102402276A (en) * 2010-09-13 2012-04-04 大同股份有限公司 Embedded device capable of identifying nonspecific gesture in real time and identification method thereof
CN102521567A (en) * 2011-11-29 2012-06-27 Tcl集团股份有限公司 Human-computer interaction fingertip detection method, device and television
CN102663364A (en) * 2012-04-10 2012-09-12 四川长虹电器股份有限公司 Imitated 3D gesture recognition system and method
CN102736733A (en) * 2011-04-15 2012-10-17 英吉尼克斯公司 Electronic systems with touch free input devices and associated methods
CN102799318A (en) * 2012-08-13 2012-11-28 深圳先进技术研究院 Human-machine interaction method and system based on binocular stereoscopic vision
CN102902356A (en) * 2012-09-18 2013-01-30 华南理工大学 Gesture control system and control method thereof
CN102981623A (en) * 2012-11-30 2013-03-20 深圳先进技术研究院 Method and system for triggering input instruction
CN103092376A (en) * 2011-10-27 2013-05-08 联想(北京)有限公司 Method and device for generating control commands and electronic equipment
CN103092491A (en) * 2011-10-27 2013-05-08 联想(北京)有限公司 Method and device for generating control commands and electronic equipment
CN103430215A (en) * 2011-03-21 2013-12-04 Lg电子株式会社 Display device and method of controlling the same
CN103475886A (en) * 2012-06-05 2013-12-25 纬创资通股份有限公司 Stereoscopic depth image establishing system and method thereof
CN103488972A (en) * 2013-09-09 2014-01-01 西安交通大学 Method for detection fingertips based on depth information
CN103544472A (en) * 2013-08-30 2014-01-29 Tcl集团股份有限公司 Processing method and processing device based on gesture images
CN103608844A (en) * 2011-06-22 2014-02-26 微软公司 Fully automatic dynamic articulated model calibration
CN103777754A (en) * 2014-01-10 2014-05-07 上海大学 Hand motion tracking device and method based on binocular infrared vision
CN104102347A (en) * 2014-07-09 2014-10-15 东莞万士达液晶显示器有限公司 Fingertip positioning method and fingertip positioning terminal
US8866781B2 (en) 2012-05-21 2014-10-21 Huawei Technologies Co., Ltd. Contactless gesture-based control method and apparatus
CN104461324A (en) * 2013-09-16 2015-03-25 联想(北京)有限公司 Information processing method and electronic equipment
CN104699377A (en) * 2013-12-04 2015-06-10 联想(北京)有限公司 Control method and electronic equipment
CN104766503A (en) * 2014-01-08 2015-07-08 财团法人工业技术研究院 cardiopulmonary resuscitation teaching system and method
CN105844705A (en) * 2016-03-29 2016-08-10 联想(北京)有限公司 Three-dimensional virtual object model generation method and electronic device
CN105867638A (en) * 2016-05-10 2016-08-17 华南理工大学 Embedded virtual keyboard based on binocular vision and method
US9525906B2 (en) 2013-04-08 2016-12-20 Hon Hai Precision Industry Co., Ltd. Display device and method of controlling the display device
CN106340039A (en) * 2016-08-16 2017-01-18 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106340040A (en) * 2016-08-16 2017-01-18 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106355591A (en) * 2016-08-16 2017-01-25 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN104102332B (en) * 2013-04-08 2017-07-28 鸿富锦精密工业(深圳)有限公司 Display device and its control system and method
CN109359566A (en) * 2018-09-29 2019-02-19 河南科技大学 The gesture identification method of hierarchical classification is carried out using finger characteristic
CN109395375A (en) * 2018-09-18 2019-03-01 华南理工大学 A kind of 3d gaming method of interface interacted based on augmented reality and movement
CN111596757A (en) * 2020-04-02 2020-08-28 林宗宇 Gesture control method and device based on fingertip interaction
CN111626364A (en) * 2020-05-28 2020-09-04 中国联合网络通信集团有限公司 Gesture image classification method and device, computer equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090160767A1 (en) * 2007-12-20 2009-06-25 University Of Central Florida Research Foundation Systems and Methods of Camera-Based Fingertip Tracking
CN101567093A (en) * 2009-05-25 2009-10-28 济南大学 Method for initializing three-dimension gesture model

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090160767A1 (en) * 2007-12-20 2009-06-25 University Of Central Florida Research Foundation Systems and Methods of Camera-Based Fingertip Tracking
CN101567093A (en) * 2009-05-25 2009-10-28 济南大学 Method for initializing three-dimension gesture model

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Computer Engineering, Vol. 34, No. 16, 31 Aug. 2008, 应宏微, 王蔚, 宋加涛, 任小波: "Single-fingertip tracking based on the Digiclops stereo vision system", pp. 207-209 (relevant to claims 1, 3) *

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102402276A (en) * 2010-09-13 2012-04-04 大同股份有限公司 Embedded device capable of identifying nonspecific gesture in real time and identification method thereof
CN103430215A (en) * 2011-03-21 2013-12-04 Lg电子株式会社 Display device and method of controlling the same
CN103430215B (en) * 2011-03-21 2017-03-22 Lg电子株式会社 Display device and method of controlling the same
CN102736733A (en) * 2011-04-15 2012-10-17 英吉尼克斯公司 Electronic systems with touch free input devices and associated methods
CN103608844A (en) * 2011-06-22 2014-02-26 微软公司 Fully automatic dynamic articulated model calibration
CN103608844B (en) * 2011-06-22 2016-07-06 微软技术许可有限责任公司 The full-automatic model calibration that dynamically joint connects
CN103092491A (en) * 2011-10-27 2013-05-08 联想(北京)有限公司 Method and device for generating control commands and electronic equipment
CN103092376A (en) * 2011-10-27 2013-05-08 联想(北京)有限公司 Method and device for generating control commands and electronic equipment
CN103092491B (en) * 2011-10-27 2017-02-01 联想(北京)有限公司 Method and device for generating control commands and electronic equipment
CN103092376B (en) * 2011-10-27 2017-07-25 联想(北京)有限公司 Generate the method and apparatus and electronic equipment of control command
CN102521567B (en) * 2011-11-29 2013-10-23 Tcl集团股份有限公司 Human-computer interaction fingertip detection method, device and television
CN102521567A (en) * 2011-11-29 2012-06-27 Tcl集团股份有限公司 Human-computer interaction fingertip detection method, device and television
CN102663364A (en) * 2012-04-10 2012-09-12 四川长虹电器股份有限公司 Imitated 3D gesture recognition system and method
US8866781B2 (en) 2012-05-21 2014-10-21 Huawei Technologies Co., Ltd. Contactless gesture-based control method and apparatus
CN103475886A (en) * 2012-06-05 2013-12-25 纬创资通股份有限公司 Stereoscopic depth image establishing system and method thereof
CN103475886B (en) * 2012-06-05 2017-12-08 纬创资通股份有限公司 Stereoscopic depth image establishing system and method thereof
CN102799318B (en) * 2012-08-13 2015-07-29 深圳先进技术研究院 A kind of man-machine interaction method based on binocular stereo vision and system
CN102799318A (en) * 2012-08-13 2012-11-28 深圳先进技术研究院 Human-machine interaction method and system based on binocular stereoscopic vision
CN102902356A (en) * 2012-09-18 2013-01-30 华南理工大学 Gesture control system and control method thereof
CN102902356B (en) * 2012-09-18 2015-08-26 华南理工大学 A kind of gestural control system and control method thereof
CN102981623A (en) * 2012-11-30 2013-03-20 深圳先进技术研究院 Method and system for triggering input instruction
CN104102332B (en) * 2013-04-08 2017-07-28 鸿富锦精密工业(深圳)有限公司 Display device and its control system and method
US9525906B2 (en) 2013-04-08 2016-12-20 Hon Hai Precision Industry Co., Ltd. Display device and method of controlling the display device
CN103544472B (en) * 2013-08-30 2018-06-19 Tcl集团股份有限公司 A kind of processing method and processing unit based on images of gestures
CN103544472A (en) * 2013-08-30 2014-01-29 Tcl集团股份有限公司 Processing method and processing device based on gesture images
CN103488972A (en) * 2013-09-09 2014-01-01 西安交通大学 Method for detection fingertips based on depth information
CN103488972B (en) * 2013-09-09 2016-07-06 西安交通大学 Fingertip Detection based on depth information
CN104461324B (en) * 2013-09-16 2017-12-29 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN104461324A (en) * 2013-09-16 2015-03-25 联想(北京)有限公司 Information processing method and electronic equipment
CN104699377A (en) * 2013-12-04 2015-06-10 联想(北京)有限公司 Control method and electronic equipment
CN104766503A (en) * 2014-01-08 2015-07-08 财团法人工业技术研究院 cardiopulmonary resuscitation teaching system and method
CN103777754B (en) * 2014-01-10 2017-01-11 上海大学 Hand motion tracking device and method based on binocular infrared vision
CN103777754A (en) * 2014-01-10 2014-05-07 上海大学 Hand motion tracking device and method based on binocular infrared vision
CN104102347A (en) * 2014-07-09 2014-10-15 东莞万士达液晶显示器有限公司 Fingertip positioning method and fingertip positioning terminal
CN105844705A (en) * 2016-03-29 2016-08-10 联想(北京)有限公司 Three-dimensional virtual object model generation method and electronic device
CN105844705B (en) * 2016-03-29 2018-11-09 联想(北京)有限公司 A kind of three-dimensional object model generation method and electronic equipment
CN105867638A (en) * 2016-05-10 2016-08-17 华南理工大学 Embedded virtual keyboard based on binocular vision and method
CN106355591A (en) * 2016-08-16 2017-01-25 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106340040A (en) * 2016-08-16 2017-01-18 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106340039A (en) * 2016-08-16 2017-01-18 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106340040B (en) * 2016-08-16 2019-06-14 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106355591B (en) * 2016-08-16 2019-06-21 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN106340039B (en) * 2016-08-16 2019-06-21 广州视源电子科技股份有限公司 Method and device for tracking finger contour
CN109395375A (en) * 2018-09-18 2019-03-01 华南理工大学 A kind of 3d gaming method of interface interacted based on augmented reality and movement
CN109359566A (en) * 2018-09-29 2019-02-19 河南科技大学 The gesture identification method of hierarchical classification is carried out using finger characteristic
CN109359566B (en) * 2018-09-29 2022-03-15 河南科技大学 Gesture recognition method for hierarchical classification by using finger characteristics
CN111596757A (en) * 2020-04-02 2020-08-28 林宗宇 Gesture control method and device based on fingertip interaction
CN111626364A (en) * 2020-05-28 2020-09-04 中国联合网络通信集团有限公司 Gesture image classification method and device, computer equipment and storage medium
CN111626364B (en) * 2020-05-28 2023-09-01 中国联合网络通信集团有限公司 Gesture image classification method, gesture image classification device, computer equipment and storage medium

Also Published As

Publication number Publication date
CN101807114B (en) 2011-12-07

Similar Documents

Publication Publication Date Title
CN101807114B (en) Natural interactive method based on three-dimensional gestures
CN106598227B (en) Gesture identification method based on Leap Motion and Kinect
WO2021103648A1 (en) Hand key point detection method, gesture recognition method, and related devices
CN105389539B (en) A kind of three-dimension gesture Attitude estimation method and system based on depth data
CN107728792B (en) Gesture recognition-based augmented reality three-dimensional drawing system and drawing method
CN100407798C (en) Three-dimensional geometric mode building system and method
JP5887775B2 (en) Human computer interaction system, hand-to-hand pointing point positioning method, and finger gesture determination method
Bonnici et al. Sketch-based interaction and modeling: where do we stand?
Yin et al. Finger identification and hand posture recognition for human–robot interaction
Zhang et al. A practical robotic grasping method by using 6-D pose estimation with protective correction
CN106951840A (en) A kind of facial feature points detection method
Shin et al. Gesture recognition using Bezier curves for visualization navigation from registered 3-D data
CN109977833A (en) Object tracking method, object tracking device, storage medium and electronic equipment
Wu et al. Robust fingertip detection in a complex environment
He et al. Real-time gesture recognition using 3D depth camera
CN103598870A (en) Optometry method based on depth-image gesture recognition
Bhattacharjee et al. A survey on sketch based content creation: from the desktop to virtual and augmented reality
CN105354812B (en) Multi-Kinect cooperation-based depth threshold segmentation algorithm contour recognition interaction method
CN103426000B (en) A kind of static gesture Fingertip Detection
CN104808790A (en) Method of obtaining invisible transparent interface based on non-contact interaction
CN104899591A (en) Wrist point and arm point extraction method based on depth camera
Yin et al. Estimation of the fundamental matrix from uncalibrated stereo hand images for 3D hand gesture recognition
JP2016167268A (en) Gesture modeling device, gesture modeling method, program for gesture modeling system, and gesture modeling system
CN104123008B (en) A kind of man-machine interaction method and system based on static gesture
Lan et al. Data fusion-based real-time hand gesture recognition with Kinect V2

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant