CN102999152B - Gesture motion recognition method and system - Google Patents
Gesture motion recognition method and system
- Publication number
- CN102999152B (application CN201110160290.2A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- User Interface Of Digital Computer (AREA)
- Image Analysis (AREA)
Abstract
The present invention relates to a gesture motion recognition method and system. An infrared structured-light emitting unit projects a structured-light plane; an infrared structured-light sensor then receives the sensed image formed by the infrared signal reflected by the user; a depth image processing unit obtains the depth information in the sensed image; a hand target tracking unit selects and records, from the depth information, the information nearest the infrared structured-light emitting unit, which constitutes the hand motion trajectory information; and a gesture motion recognition unit identifies the hand motion trajectory information and determines the corresponding user gesture motion control command. The invention efficiently separates the user's dynamic hand motion from a complex background and recognizes it on the basis of spatial direction segmentation, completing the gesture recognition process without prior gesture modeling or training and thereby improving the user experience.
Description
Technical field
The present invention relates to human-computer interaction technology, and in particular to a gesture motion recognition method and system.
Background art
Human-computer interaction has been a very active research field in recent years. The mouse and keyboard are traditional human-computer interaction devices, but various new interaction modes such as touch control, voice control and gesture control have appeared, improving the naturalness and friendliness of the user experience. Non-contact interaction, of which gesture control is the most representative form, uses various sensor devices to recognize hand motions in real time or within a short period of time and converts them into commands that a host device such as a computer can recognize; it is currently a popular mode of human-computer interaction.
Gesture motion recognition methods fall into two classes: techniques based on data gloves and techniques based on machine vision. A data glove can sense and recognize fine hand motions accurately, but the user must wear a special glove, so data gloves are currently used mainly in special fields such as scientific research, precision control and robotics. This approach is cumbersome, the user experience is poor, and the gestures are inflexible.
Machine-vision based gesture recognition, in contrast, must identify the hand in images, which requires the hand to be fairly close to the image sensor so that image frames adequate for recognition can be obtained. The hand is usually identified using a color histogram based on skin-color statistics, because skin color is easily separated from other regions; however, this method places rather high demands on ambient lighting, clothing color and the like. Another approach is to attach markers of a special color to specific positions on the fingers and to recognize the marker patches instead of the hand itself; this reduces the image-processing workload to some extent, but the user must wear special markers on the hand, which is also inconvenient.
A traditional gesture recognition process involves several complex steps such as gesture modeling, gesture segmentation and gesture analysis. Gesture recognition can further be divided into static and dynamic gesture recognition: a static gesture corresponds to a point in the gesture model parameter space, while a dynamic gesture corresponds to a trajectory in that space and therefore involves temporal and spatial context. For dynamic gestures in particular, different users differ in speed, trajectory and proficiency when performing a gesture, so the modeled gesture trajectory fluctuates nonlinearly along the time axis; eliminating these nonlinear fluctuations is difficult and complicated, so the recognition rate of traditional dynamic gesture recognition based on two-dimensional images is generally not high enough.
Summary of the invention
The main object of the present invention is to overcome the shortcomings of the prior art by disclosing a gesture motion recognition method and system that adopt an infrared depth image sensor and depth image processing technology, greatly improving dynamic gesture recognition efficiency and enhancing the user experience.
The invention discloses a gesture motion recognition method, comprising the following steps:
A. emitting a structured-light plane toward the user and the surrounding environment; receiving, through an optical sensor, the structured-light signal returned by the user and the surrounding environment to obtain a sensed image; and then obtaining the depth information in the sensed image through a depth image processing unit;
B. periodically selecting and recording, from the depth information, the information nearest the structured-light unit to obtain hand motion trajectory information;
C. identifying the hand motion trajectory information through a gesture motion recognition unit and determining the corresponding user gesture motion control command.
The gesture motion recognition method disclosed by the invention may further include:
D. converting the user gesture motion control command into a control command that a host device can recognize.
Step B may further include: according to a set interval, periodically saving the initial trajectory coordinates into a coordinate data cache array, grouping these initial trajectory coordinates according to the segmentation directions of the X/Y plane, and removing adjacent cached data that belong to the same group, thereby obtaining the hand motion trajectory information;
In step C, the hand motion trajectory information is then matched against the gesture motion templates in order of the group identifiers.
The gesture motion recognition method disclosed by the invention may also use an interpolation algorithm to fill in, as required, discontinuities in the grouped data of the hand motion trajectory information caused by frame skipping or acquisition device errors.
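The patent only states that an interpolation algorithm is used; the following minimal sketch assumes trajectory samples are collected at a fixed interval as (x, y) pairs, with None marking samples lost to frame skipping or acquisition errors, and fills the gaps by simple linear interpolation. The function name and data layout are illustrative assumptions, not taken from the patent.

```python
def fill_gaps(samples):
    """Linearly interpolate missing (x, y) samples marked as None.

    samples: list such as [(x0, y0), None, None, (x3, y3), ...] collected at a
    fixed interval; the gaps come from frame skipping or acquisition errors.
    """
    filled = list(samples)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1
            if i == 0 or j == len(filled):
                i = j          # gap at the start or end: nothing to interpolate between
                continue
            (x0, y0), (x1, y1) = filled[i - 1], filled[j]
            for k in range(i, j):
                t = (k - i + 1) / (j - i + 1)
                filled[k] = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
            i = j
        else:
            i += 1
    return [p for p in filled if p is not None]
```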
The invention also discloses a gesture motion recognition system, including a central processor unit for coordinating the operation of the system through software and peripheral circuits, and further including:
a structured-light unit for emitting a structured-light plane toward the user's position;
a structured-light sensor for receiving the sensed image formed by the structured-light signal reflected by the user;
a depth image processing unit for obtaining the depth information in the sensed image;
a hand target tracking unit for selecting and recording, from the depth information, the information nearest the structured-light unit, which constitutes the hand motion trajectory information;
a gesture motion recognition unit for identifying the hand motion trajectory information and determining the corresponding user gesture motion control command.
The gesture motion recognition system disclosed by the invention may further include:
a control command converting unit for converting the user gesture motion control command into a control command that a host device can recognize.
The gesture motion recognition system disclosed by the invention may also include a memory for storing the coordinate data cache array; according to a set interval, the hand target tracking unit periodically saves the initial trajectory coordinates into this memory.
Because the gesture motion recognition method and system disclosed by the invention adopt an infrared depth image sensor and depth image processing technology, the user's hand motion can be efficiently separated from a complex background and tracked in real time; the hand motion trajectory is then compared with gesture recognition templates based on spatial direction segmentation to complete the gesture recognition process, without prior gesture modeling or training, which greatly improves dynamic gesture recognition efficiency and the user experience.
Brief description of the drawings
Fig. 1 is a block diagram of the electrical structure of an embodiment of the gesture motion recognition system of the present invention.
Fig. 2 is a schematic diagram of the X/Y plane segmentation used in the gesture motion recognition method of the present invention.
Fig. 3 is a flow chart of an embodiment of the dynamic gesture motion recognition method of the present invention.
Fig. 4 is a gesture template matching flow chart of an embodiment of the dynamic gesture motion recognition method of the present invention.
Detailed description of the invention
The present invention is described in further detail below in conjunction with the drawings and specific embodiments.
The gesture motion recognition method of the present invention is based on an infrared depth image sensor and a depth image processing unit. It can efficiently separate the user's dynamic hand motion from a complex background and track the hand motion in real time; the hand motion trajectory is then compared with gesture recognition templates based on spatial direction segmentation to complete the gesture recognition process, without prior gesture modeling or training. The problems of traditional gesture recognition can thus be largely avoided, and dynamic gesture recognition efficiency is greatly improved.
Fig. 1 is a block diagram of the electrical structure of an embodiment of the gesture motion recognition device of the present invention, whose main components include:
Infrared structured-light emitting unit: emits a coded structured-light plane toward the user.
Infrared structured-light sensor unit (also called the infrared structured-light sensing unit): receives and senses the infrared structured light reflected back by the user and the surrounding environment.
Depth image processing unit: based on the principle of structured-light coding, obtains the depth information of the scene within the visual range of the structured-light sensor by comparing the structured-light code observed on the three-dimensional objects in the environment with the original planar structured-light code.
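The patent does not give the decoding formula; as a rough illustration only, structured-light depth sensors commonly recover depth from the shift (disparity) between the observed pattern and the reference pattern by triangulation. The focal length and baseline values below are illustrative assumptions, not parameters from the patent.

```python
# Illustrative sketch of structured-light triangulation; not the patent's algorithm.
def depth_from_disparity(disparity_px: float, focal_px: float = 580.0,
                         baseline_m: float = 0.075) -> float:
    """Convert the observed pattern shift (in pixels) into depth (in meters): z = f * b / d."""
    if disparity_px <= 0:
        return float("inf")   # no measurable shift: point effectively at infinity
    return focal_px * baseline_m / disparity_px

# Example: a 15-pixel pattern shift corresponds to roughly 2.9 m.
print(round(depth_from_disparity(15.0), 2))
```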
Hand target tracking unit: tracks the user's hand target trajectory against the environmental background. Without loss of generality, we assume that in a practical usage environment the user is moving relative to the other parts of the scene. In the present invention the user's hand motion range is by default limited to the space in front of the user's body, so the moving pixel target closest to the depth image sensor is the user's hand. We therefore only need to track the moving pixel target with the smallest depth across consecutive depth image frames to achieve hand target tracking. Here, a depth image frame is the depth image information recognized and stored by the depth sensor, in which each pixel value is the distance between the corresponding target and the sensor.
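A minimal sketch of this tracking rule, assuming depth frames arrive as 2-D NumPy arrays of per-pixel distances in millimeters with 0 marking invalid pixels; the motion threshold and frame format are illustrative assumptions, not values from the patent.

```python
import numpy as np

def track_hand(prev_frame: np.ndarray, cur_frame: np.ndarray,
               motion_thresh: float = 30.0):
    """Return the (row, col) of the moving pixel closest to the sensor, or None.

    A pixel counts as 'moving' if its depth changed by more than motion_thresh
    between consecutive frames; pixels with value 0 are treated as invalid.
    """
    valid = (cur_frame > 0) & (prev_frame > 0)
    moving = valid & (np.abs(cur_frame.astype(float) - prev_frame.astype(float)) > motion_thresh)
    if not moving.any():
        return None                      # no hand motion detected in this frame
    depths = np.where(moving, cur_frame, np.inf)
    return np.unravel_index(np.argmin(depths), depths.shape)
```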
Gesture motion recognition unit: compares the hand motion trajectory tracking result from the hand target tracking unit with the predefined gesture motion templates based on spatial direction segmentation, and sends the comparison result to the control command converting unit as the gesture motion recognition result.
Control command converting unit: converts a specific gesture motion instruction into a control command that the host system can recognize, which is used to control the working state of the gesture motion control device of the present invention.
Fig. 2 is a schematic diagram of the X/Y plane segmentation used in the gesture motion recognition method of the present invention. Without loss of generality, the X/Y plane is taken to be a plane perpendicular to the ground, parallel to the direction in which the user stands, and about 30 centimeters in front of the user. The point where the user's extended hand meets this plane is the coordinate origin; the axis perpendicular to the ground is the Y axis and the axis parallel to the ground is the X axis. The plane is divided into 8 parts by the X axis, the Y axis and the two dividing lines inclined at plus and minus 45 degrees through the origin; starting from straight up and numbering clockwise, the parts are S1, S2, S3, S4, S5, S6, S7 and S8, for a total of 8 regions.
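A minimal sketch of this sector assignment, assuming trajectory points are expressed as (x, y) offsets from the origin of the X/Y plane; the helper name is hypothetical, but the numbering follows the clockwise-from-top convention described above.

```python
import math

def sector_of(x: float, y: float) -> str:
    """Map an (x, y) point on the X/Y plane to one of the 8 sectors S1..S8.

    The sectors are 45-degree wedges numbered clockwise starting from straight up:
    S1 is centered on the +Y axis, S3 on the +X axis, S5 on the -Y axis, S7 on the -X axis.
    """
    angle = math.degrees(math.atan2(x, y)) % 360.0    # clockwise angle from "up"
    index = int(((angle + 22.5) % 360.0) // 45.0)      # 0..7
    return f"S{index + 1}"

# Example: a point up and to the right of the origin falls in S2.
print(sector_of(1.0, 1.0))   # -> "S2"
```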
Fig. 3 is a flow chart of the dynamic gesture motion recognition method of the present invention. As an embodiment of the present invention, the main steps include:
1. The infrared structured-light emitting unit emits coded planar structured light toward the user.
2. The infrared structured-light sensor unit receives and senses the infrared structured light reflected by the user and the surrounding environment.
3. Based on the principle of structured-light coding, the depth image processing unit obtains the depth information of the sensed image by comparing the structured-light code of the three-dimensional objects in the environment with the original planar structured-light code, and stores it as consecutive depth image frames.
4. From the consecutive depth image frames, the hand target tracking unit tracks the moving pixel target with the smallest depth, i.e. the user's gesture motion trajectory.
5. The gesture motion recognition unit compares the user's continuous gesture motion trajectory with the preset gesture motion templates based on spatial direction segmentation and identifies the user's gesture motion.
Here, a gesture motion template based on spatial direction segmentation means that, without loss of generality, different gesture motions are defined as different templates. For example, the template corresponding to waving to the left is {S2-S1-S8-S1}, named template M1; the template corresponding to waving to the right is {S7-S8-S1-S2}, named template M2; the template for clockwise rotation is {S7-S8-S1-S2-S3-S4-S5-S6}, named template M3; and so on (a minimal sketch of one way to represent such templates follows step 6 below).
6. The action instruction identified by the gesture motion recognition unit is sent to the control command converting unit, which converts the specific gesture motion instruction into a control command that the host system can recognize.
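A minimal sketch of how such templates might be stored and compared with a recognized sector sequence; the dictionary layout and the exact-equality matching rule are illustrative assumptions rather than requirements of the patent.

```python
# Gesture templates from the description: ordered sector sequences keyed by name.
GESTURE_TEMPLATES = {
    "M1_wave_left":  ["S2", "S1", "S8", "S1"],
    "M2_wave_right": ["S7", "S8", "S1", "S2"],
    "M3_clockwise":  ["S7", "S8", "S1", "S2", "S3", "S4", "S5", "S6"],
}

def match_template(sector_sequence):
    """Return the name of the first template equal to the observed sequence, or None."""
    for name, template in GESTURE_TEMPLATES.items():
        if sector_sequence == template:
            return name
    return None
```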
Fig. 4 is the gesture template matching flow chart of the dynamic gesture motion recognition method of the present invention. As an embodiment of the present invention, the process of comparing a gesture motion trajectory with the templates is as follows:
At a set interval, for example every 30 milliseconds, the trajectory coordinates produced by the hand target tracking unit are placed in a coordinate data cache array. Because the resolution of the depth image is generally very low, the whole hand occupies only a few pixels in the depth image, so the motion of the hand corresponds to the motion of a small number of adjacent pixels in the depth image sequence; each initial trajectory coordinate in the coordinate data cache array is the average of the positions of these pixels.
These initial trajectory coordinates are then grouped according to the segmentation directions of the X/Y plane, and adjacent cached data belonging to the same group are removed. Discontinuities in the grouped data caused by frame skipping, acquisition device errors and the like are then filled in as required, generally by using an interpolation algorithm to smooth the discontinuities in the data.
Finally, the group identifiers of the actually acquired hand target trajectory are matched in order against the gesture motion templates to obtain the matching result, from which the user's gesture motion control command is determined.
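A minimal end-to-end sketch of this matching flow, reusing the hypothetical sector_of and match_template helpers sketched above; the 30 ms sampling and the pre-averaged coordinates are assumptions about the input, and the interpolation step is omitted here for brevity.

```python
def recognize_gesture(samples):
    """samples: list of (x, y) trajectory coordinates taken every 30 ms, each
    already averaged over the hand's few neighboring pixels.
    Returns the name of the matched gesture template, or None."""
    # 1. Group each coordinate by its spatial segmentation direction (S1..S8).
    sectors = [sector_of(x, y) for (x, y) in samples]

    # 2. Remove adjacent entries that fall in the same group.
    collapsed = []
    for s in sectors:
        if not collapsed or collapsed[-1] != s:
            collapsed.append(s)

    # 3. Match the ordered group identifiers against the gesture motion templates.
    return match_template(collapsed)

# Example: an upper-right -> top -> upper-left -> top trajectory matches "wave left".
trace = [(0.4, 0.5), (0.3, 0.5), (0.05, 0.6), (-0.3, 0.5), (-0.05, 0.6)]
print(recognize_gesture(trace))   # -> "M1_wave_left"
```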
Because the present invention is based on an infrared depth image sensor and a depth image processing unit, the user's dynamic hand motion can be efficiently separated from a complex background and tracked in real time; the hand motion trajectory is then compared with gesture recognition templates based on spatial direction segmentation to complete the gesture recognition process without prior gesture modeling or training, so the problems of traditional gesture recognition are largely avoided, dynamic gesture recognition efficiency is greatly improved and the user experience is enhanced.
The foregoing is only a preferred embodiment of the present invention and is not intended to limit the present invention; any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.
Claims (4)
1. A gesture motion recognition method, characterized in that it comprises the following steps:
A. emitting a structured-light plane toward the user and the surrounding environment; receiving, through an optical sensor, the structured-light signal returned by the user and the surrounding environment to obtain a sensed image; and then obtaining the depth information in the sensed image through a depth image processing unit;
B. periodically selecting and recording, from the depth information, the information nearest the structured-light unit to obtain hand motion trajectory information;
C. identifying the hand motion trajectory information through a gesture motion recognition unit and determining the corresponding user gesture motion control command;
D. converting the user gesture motion control command into a control command that a host device can recognize;
step B further comprising: according to a set interval, periodically saving initial trajectory coordinates into a coordinate data cache array, grouping these initial trajectory coordinates according to the segmentation directions of an X/Y plane, and removing adjacent cached data belonging to the same group, thereby obtaining the hand motion trajectory information, the X/Y plane being perpendicular to the ground and parallel to the direction in which the user stands;
in step C, the hand motion trajectory information then being matched against the gesture motion templates in order of the group identifiers.
2. The gesture motion recognition method as claimed in claim 1, characterized in that an interpolation algorithm is also used to fill in, as required, discontinuities in the grouped data of the hand motion trajectory information caused by frame skipping or acquisition device errors.
3. A gesture motion recognition system, including a central processor unit for coordinating the operation of the system through software and peripheral circuits, characterized in that it further includes:
a structured-light unit for emitting a structured-light plane toward the user's position;
a structured-light sensor for receiving the sensed image formed by the structured-light signal reflected by the user;
a depth image processing unit for obtaining the depth information in the sensed image;
a hand target tracking unit for selecting and recording, from the depth information, the information nearest the structured-light unit, which constitutes the hand motion trajectory information; according to a set interval, the unit periodically saves initial trajectory coordinates into a coordinate data cache array, groups these initial trajectory coordinates according to the segmentation directions of an X/Y plane, and removes adjacent cached data belonging to the same group, thereby obtaining the hand motion trajectory information, the X/Y plane being perpendicular to the ground and parallel to the direction in which the user stands;
a gesture motion recognition unit for identifying the hand motion trajectory information and determining the corresponding user gesture motion control command, the group identifiers of the actually acquired hand target trajectory being matched in order against the gesture motion templates.
4. The system as claimed in claim 3, characterized in that it further includes:
a control command converting unit for converting the user gesture motion control command into a control command that a host device can recognize.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201110160290.2A CN102999152B (en) | 2011-09-09 | 2011-09-09 | Gesture motion recognition method and system
Publications (2)
Publication Number | Publication Date |
---|---|
CN102999152A CN102999152A (en) | 2013-03-27 |
CN102999152B true CN102999152B (en) | 2016-06-29 |
Family
ID=47927810
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201110160290.2A Expired - Fee Related CN102999152B (en) | 2011-09-09 | 2011-09-09 | Gesture motion recognition method and system
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102999152B (en) |
Families Citing this family (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103336967B (en) * | 2013-05-27 | 2016-12-28 | 东软集团股份有限公司 | A kind of hand motion trail detection and device |
CN103268482B (en) * | 2013-05-31 | 2016-02-24 | 清华大学 | A kind of gesture of low complex degree is extracted and gesture degree of depth acquisition methods |
CN103744602A (en) * | 2013-11-14 | 2014-04-23 | 深圳市至高通信技术发展有限公司 | Display screen operating method and terminal device |
CN104616028B (en) * | 2014-10-14 | 2017-12-12 | 北京中科盘古科技发展有限公司 | Human body limb gesture actions recognition methods based on space segmentation study |
CN104700096B (en) * | 2015-03-30 | 2018-07-13 | 北京奇艺世纪科技有限公司 | A kind of user action identified areas based on image determines method and device |
CN105069409B (en) * | 2015-07-24 | 2018-05-15 | 上海科勒电子科技有限公司 | A kind of three-dimensional gesture recognition method and system based on infrared technique |
CN105117000A (en) * | 2015-07-29 | 2015-12-02 | 青岛海信医疗设备股份有限公司 | Method and device for processing medical three-dimensional image |
CN105224214B (en) * | 2015-08-26 | 2018-09-04 | 广东欧珀移动通信有限公司 | A kind of method for controlling dialing and smartwatch |
CN107025830A (en) * | 2016-01-29 | 2017-08-08 | 北京新唐思创教育科技有限公司 | The analogy method and device of a kind of teaching experiment |
CN107309883A (en) * | 2016-04-27 | 2017-11-03 | 王方明 | Intelligent robot |
CN106774850B (en) * | 2016-11-24 | 2020-06-30 | 深圳奥比中光科技有限公司 | Mobile terminal and interaction control method thereof |
WO2018106276A1 (en) * | 2016-12-05 | 2018-06-14 | Youspace, Inc. | Systems and methods for gesture-based interaction |
CN106933347A (en) * | 2017-01-20 | 2017-07-07 | 深圳奥比中光科技有限公司 | The method for building up and equipment in three-dimensional manipulation space |
CN106919261A (en) * | 2017-03-08 | 2017-07-04 | 广州致远电子股份有限公司 | A kind of infrared gesture identification method and device based on zone sequence reconstruct |
CN107358171B (en) * | 2017-06-22 | 2019-08-02 | 华中师范大学 | A kind of gesture identification method based on COS distance and dynamic time warping |
CN107357356A (en) * | 2017-07-04 | 2017-11-17 | 北京有初科技有限公司 | Miniature projection computer and the method using gesture control Miniature projection computer page turning |
CN107589834B (en) * | 2017-08-09 | 2020-08-07 | Oppo广东移动通信有限公司 | Terminal device operation method and device and terminal device |
CN107818585B (en) * | 2017-09-27 | 2020-05-29 | 歌尔科技有限公司 | Method and device for determining finger position information of user, projector and projection system |
CN108229391B (en) | 2018-01-02 | 2021-12-24 | 京东方科技集团股份有限公司 | Gesture recognition device, server thereof, gesture recognition system and gesture recognition method |
CN108648225B (en) | 2018-03-31 | 2022-08-02 | 奥比中光科技集团股份有限公司 | Target image acquisition system and method |
CN108540717A (en) * | 2018-03-31 | 2018-09-14 | 深圳奥比中光科技有限公司 | Target image obtains System and method for |
CN108549878B (en) * | 2018-04-27 | 2020-03-24 | 北京华捷艾米科技有限公司 | Depth information-based hand detection method and system |
CN108549489B (en) * | 2018-04-27 | 2019-12-13 | 哈尔滨拓博科技有限公司 | gesture control method and system based on hand shape, posture, position and motion characteristics |
CN108961314B (en) * | 2018-06-29 | 2021-09-17 | 北京微播视界科技有限公司 | Moving image generation method, moving image generation device, electronic device, and computer-readable storage medium |
CN109397286A (en) * | 2018-09-29 | 2019-03-01 | Oppo广东移动通信有限公司 | Robot control method, device, electronic equipment and computer readable storage medium |
CN109407842A (en) * | 2018-10-22 | 2019-03-01 | Oppo广东移动通信有限公司 | Interface operation method, device, electronic equipment and computer readable storage medium |
CN110633666A (en) * | 2019-09-10 | 2019-12-31 | 江南大学 | Gesture track recognition method based on finger color patches |
CN110889390A (en) * | 2019-12-05 | 2020-03-17 | 北京明略软件系统有限公司 | Gesture recognition method, gesture recognition device, control equipment and machine-readable storage medium |
KR20220010885A (en) | 2020-07-20 | 2022-01-27 | 에스케이하이닉스 주식회사 | Apparatus for recognizing motion by using ToF sensor, and method for operating the same |
CN112306237B (en) * | 2020-10-21 | 2024-10-18 | 广州朗国电子科技股份有限公司 | Three-dimensional touch method based on electromagnetic wave reflection, touch equipment and storage medium |
CN113052043B (en) * | 2021-03-17 | 2024-06-07 | 深圳荆虹科技有限公司 | Hand detection method and device for reducing false detection rate |
CN113111738B (en) * | 2021-03-26 | 2023-12-19 | 常州工学院 | Dynamic gesture recognition method and device based on video image processing |
CN114460881A (en) * | 2022-02-11 | 2022-05-10 | 广东好太太智能家居有限公司 | Clothes airing equipment control device and method and clothes airing equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6252623B1 (en) * | 1998-05-15 | 2001-06-26 | 3Dmetrics, Incorporated | Three dimensional imaging system |
CN1842824A (en) * | 2004-08-03 | 2006-10-04 | 松下电器产业株式会社 | Human identification apparatus and human searching/tracking apparatus |
EP1796043A2 (en) * | 2005-12-07 | 2007-06-13 | Nissan Motor Co., Ltd. | Object detection |
CN101388114A (en) * | 2008-09-03 | 2009-03-18 | 北京中星微电子有限公司 | Method and system for estimating human body attitudes |
CN102012778A (en) * | 2009-09-04 | 2011-04-13 | 索尼公司 | Display control apparatus, display control method, and display control program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8659658B2 (en) * | 2010-02-09 | 2014-02-25 | Microsoft Corporation | Physical interaction zone for gesture-based user interfaces |
- 2011-09-09: CN application CN201110160290.2A filed; patent CN102999152B granted (status: not active, Expired - Fee Related)
Also Published As
Publication number | Publication date |
---|---|
CN102999152A (en) | 2013-03-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102999152B (en) | Gesture motion recognition method and system | |
Memo et al. | Head-mounted gesture controlled interface for human-computer interaction | |
Ma et al. | Kinect Sensor‐Based Long‐Distance Hand Gesture Recognition and Fingertip Detection with Depth Information | |
Betancourt et al. | The evolution of first person vision methods: A survey | |
CN112926423B (en) | Pinch gesture detection and recognition method, device and system | |
CN105739702B (en) | Multi-pose finger tip tracking for natural human-computer interaction | |
Kragic et al. | Vision for robotic object manipulation in domestic settings | |
CN103941866B (en) | Three-dimensional gesture recognizing method based on Kinect depth image | |
TWI684136B (en) | Robot, control system and method for operating the robot | |
US8442307B1 (en) | Appearance augmented 3-D point clouds for trajectory and camera localization | |
CN111328396A (en) | Pose estimation and model retrieval for objects in images | |
Sun et al. | Magichand: Interact with iot devices in augmented reality environment | |
KR102285915B1 (en) | Real-time 3d gesture recognition and tracking system for mobile devices | |
Zhang et al. | A practical robotic grasping method by using 6-D pose estimation with protective correction | |
CN102426480A (en) | Man-machine interactive system and real-time gesture tracking processing method for same | |
US10755422B2 (en) | Tracking system and method thereof | |
CN103353935A (en) | 3D dynamic gesture identification method for intelligent home system | |
CN103105924B (en) | Man-machine interaction method and device | |
CN111444764A (en) | Gesture recognition method based on depth residual error network | |
CN112507918B (en) | Gesture recognition method | |
CN102566827A (en) | Method and system for detecting object in virtual touch screen system | |
CN102799271A (en) | Method and system for identifying interactive commands based on human hand gestures | |
CN103713730A (en) | Mid-air gesture recognition method and device applied to intelligent terminal | |
Afif et al. | Vision-based tracking technology for augmented reality: a survey | |
CN105205786B (en) | A kind of picture depth restoration methods and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
TR01 | Transfer of patent right |
Effective date of registration: 20170322
Address after: 518053 No. 7, Shantou Street, Overseas Chinese Town, Nanshan District, Guangdong
Patentee after: Shenzhen KONKA Telecommunications Technology Co., Ltd.
Address before: 518053 Overseas Chinese Town, Nanshan District, Shenzhen, Guangdong
Patentee before: KONKA GROUP Co., Ltd.
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20160629
Termination date: 20210909