CN102663364A - Imitated 3D gesture recognition system and method - Google Patents

Imitated 3D gesture recognition system and method

Info

Publication number
CN102663364A
CN102663364A
Authority
CN
China
Prior art keywords
gesture
camera
gesture identification
television terminal
gesture motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012101032104A
Other languages
Chinese (zh)
Inventor
宋立立
贾汇东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Changhong Electric Co Ltd
Original Assignee
Sichuan Changhong Electric Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Changhong Electric Co Ltd filed Critical Sichuan Changhong Electric Co Ltd
Priority to CN2012101032104A
Publication of CN102663364A
Legal status: Pending

Landscapes

  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to gesture recognition technology in the field of smart TVs and smart set-top boxes (STBs), and discloses an imitated 3D gesture recognition system that addresses the problems of the prior art: high cost, a limited range of recognizable actions and a low gesture recognition rate. The system comprises a gesture recognition unit, a main camera and an auxiliary camera. The main camera collects, in front of the TV terminal screen, gesture motion information parallel to the plane of the screen and transmits it to the gesture recognition unit; the auxiliary camera collects, in front of the TV terminal screen, gesture motion information perpendicular to the plane of the screen and transmits it to the gesture recognition unit; the gesture recognition unit analyzes the gesture motion information collected by the main camera together with the gesture information collected by the auxiliary camera, recognizes the gesture motion in three-dimensional space, translates it into the corresponding command and sends the command to the TV terminal. The invention further provides an imitated 3D gesture recognition method, and is suitable for digital TV.

Description

Imitated 3D gesture recognition system and method
Technical field
The present invention relates to gesture recognition technology for smart TVs and smart set-top boxes, and in particular to an imitated 3D gesture recognition system and gesture recognition method.
Background art
Intelligence is the development trend of television terminal equipment. Every terminal manufacturer is planning for it actively and carefully, yet the development of the smart-TV industry faces formidable challenges; beyond the challenges brought by usage and by the industrial chain, the challenge brought by user experience is perhaps more direct and crucial. Traditional human-computer interaction by means of mouse, keyboard, remote control and the like can no longer satisfy current needs. Against this background, gesture recognition technology, as a form of intelligent human-machine interaction, is developing rapidly. As an intuitive and natural input method, gestures free people from traditional contact-based input devices and allow them to interact with smart devices in a more natural way. Gesture recognition suited to large-screen interaction therefore has broad application prospects in household-appliance fields such as TVs and set-top boxes.
At present there are mainly two gesture recognition schemes applied to television terminals: 1. a single 2D camera captures the gesture motion, which is then recognized; because a 2D camera can only capture gesture motion in a two-dimensional plane, the recognition rate is low and the range of recognizable actions is limited; 2. a 3D camera captures the gesture motion, which is then recognized; this approach can capture gesture motion in three-dimensional space, the recognition rate is high and the range of recognizable actions is wide, but the cost is too high.
Current gesture recognition technology therefore faces either a low recognition rate and a limited range of recognizable actions, or, if a 3D camera is adopted, the restriction of excessive cost. The present invention realizes imitated 3D gesture recognition with two ordinary 2D cameras and two sets of models, improving the recognition rate and reliability while keeping cost well under control.
Summary of the invention
The technical problem to be solved by the present invention is to propose an imitated 3D gesture recognition system and recognition method that overcome the low recognition rate, the limited range of recognizable actions and the high cost of gesture motion recognition in the prior art.
The technical scheme adopted by the present invention to solve the above technical problem is as follows:
An imitated 3D gesture recognition system comprises a gesture recognition unit, a main camera and an auxiliary camera.
The main camera is used to collect, in front of the TV terminal screen, gesture motion information parallel to the plane of the screen, and to transmit it to the gesture recognition unit.
The auxiliary camera is used to collect, in front of the TV terminal screen, gesture motion information perpendicular to the plane of the screen, and to transmit it to the gesture recognition unit.
The gesture recognition unit is used to analyze the gesture motion information collected by the main camera in combination with the gesture information collected by the auxiliary camera, recognize the user's gesture motion in three-dimensional space, convert it into the corresponding command and transmit the command to the TV terminal.
Further, the main camera is installed on the TV terminal, with its lens facing directly forward or at a certain angle to the direction directly in front of the TV terminal; the auxiliary camera is placed on the ground, with its lens facing upward, parallel to the plane of the TV terminal screen or at a certain angle to that plane.
Further, the main camera and the auxiliary camera are both 2D cameras.
Further, the gesture recognition unit comprises:
a gesture modeling module, used to perform mathematical modeling of gesture motions, map them into a model parameter space, establish the relevant functional requirements, processing principles and associations, and establish the cooperation flow and rules between the main camera and the auxiliary camera;
an initialization module, used to perform the relevant parameter initialization automatically, according to the placement and usage environment of the main camera and the auxiliary camera, when the imitated 3D gesture recognition system is used for the first time;
an image preprocessing module, used to perform the corresponding preprocessing on the images collected by the main camera and the auxiliary camera;
a gesture detection and tracking module, used to segment the gesture motion from the preprocessed images, determine the hand motion region and track it;
a gesture feature extraction module, used to extract the static features and dynamic features of the gesture motion;
a gesture recognition module, used to recognize the gesture motion by combining the extracted static features and dynamic features, convert it into the corresponding command and transmit the command to the TV terminal.
An imitated 3D gesture recognition method comprises the following steps:
a. the user issues a gesture command toward the TV terminal;
b. the main camera and the auxiliary camera collect images containing the gesture motion information from different viewing angles and transmit them to the gesture recognition unit, which performs the corresponding preprocessing;
c. the gesture recognition unit segments the gesture motion from the preprocessed images, determines the hand motion region and tracks it;
d. the gesture recognition unit extracts the static features and dynamic features of the gesture motion;
e. the gesture recognition unit recognizes the gesture motion by combining its static features and dynamic features, converts it into the corresponding command and transmits the command to the TV terminal;
f. the TV terminal receives the corresponding command and executes it.
Further, before step a, the method also comprises the step:
a0. when imitated 3D gesture recognition is performed for the first time, the gesture recognition unit performs the relevant parameter initialization automatically according to the placement and usage environment of the main camera and the auxiliary camera.
Further, in step b, the corresponding preprocessing comprises noise reduction, enhancement and color conversion of the images.
Further, in step c, segmenting the gesture motion from the preprocessed images and tracking it means separating and locking the hand candidate region in the preprocessed images according to hand skin color, shape and area parameters.
Further, in step e, recognizing the gesture motion by combining its static features and dynamic features means comparing the static feature information of the gesture motion with the gesture feature information in a template feature library established in advance, and computing the dynamic feature information of the gesture motion with an HMM algorithm, thereby recognizing the gesture motion.
Further, in step d, the static features comprise hand contour, position, area and finger distribution information; the dynamic features comprise the spatial and temporal change information of the gesture motion.
The beneficial effects of the invention are as follows: two 2D cameras collect images containing the gesture motion from different viewing angles, and the gesture motion is analyzed, its features are extracted and it is recognized, thereby realizing imitated 3D gesture recognition. Compared with using a single 2D camera as in the prior art, more gesture motions can be recognized and the recognition rate is higher; compared with directly using a 3D camera as in the prior art, cost is saved.
Description of drawings
Fig. 1 is a structural diagram of the imitated 3D gesture recognition system of the present invention;
Fig. 2 is a schematic diagram of the working principle of the gesture recognition unit;
Fig. 3 is a flow chart of the imitated 3D gesture recognition method of the present invention.
Embodiment
To address the low recognition rate, the limited range of recognizable actions and the high cost of gesture motion recognition in the prior art, the present invention proposes an imitated 3D gesture recognition system and recognition method. It uses two 2D cameras to collect images containing the gesture motion from different viewing angles; by combining two streams of image input, two sets of models and two sets of gesture algorithms, it can not only capture gesture changes in the two-dimensional plane but also track gesture changes in depth, realizing dynamic imitated 3D gesture recognition. Compared with using a single 2D camera as in the prior art, more gesture motions can be recognized and the recognition rate is higher; compared with directly using a 3D camera, cost is saved.
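The patent describes the two-camera idea only at the block level; purely as an illustration (not part of the patent text), the minimal Python/OpenCV sketch below shows how a frontal camera and an upward-facing camera could be fused into a pseudo-3D hand position, the frontal view supplying x/y and the upward view supplying a depth-like coordinate. The camera indices, the locate_hand helper and the skin-color thresholds are assumptions introduced here.

```python
# Illustrative sketch only (OpenCV 4.x assumed): fuse a frontal and an
# upward-facing 2D camera into a pseudo-3D (x, y, z) hand position.
# Camera indices and the skin-color range are assumptions, not patent values.
import cv2

def locate_hand(frame):
    """Return the centroid of the largest skin-colored blob, or None."""
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))  # rough skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

main_cam = cv2.VideoCapture(0)  # frontal camera: screen-plane x, y
aux_cam = cv2.VideoCapture(1)   # upward-facing camera: distance to the screen

while True:
    ok_main, front = main_cam.read()
    ok_aux, below = aux_cam.read()
    if not (ok_main and ok_aux):
        break
    xy = locate_hand(front)
    depth_pt = locate_hand(below)
    if xy and depth_pt:
        x, y = xy
        z = depth_pt[1]  # row index in the upward view approximates distance from screen
        print(f"pseudo-3D hand position: x={x:.0f}, y={y:.0f}, z={z:.0f}")
```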
As shown in Fig. 1, the system comprises a gesture recognition unit, a main camera and an auxiliary camera.
The main camera is used to collect, in front of the TV terminal screen, gesture motion information parallel to the plane of the screen, and to transmit it to the gesture recognition unit.
The auxiliary camera is used to collect, in front of the TV terminal screen, gesture motion information perpendicular to the plane of the screen, and to transmit it to the gesture recognition unit.
The gesture recognition unit is used to analyze the gesture motion information collected by the main camera in combination with the gesture information collected by the auxiliary camera, recognize the user's gesture motion in three-dimensional space, convert it into the corresponding command and transmit the command to the TV terminal.
The main camera and the auxiliary camera can be installed in many ways; in practice the user can determine the installation positions of the two cameras according to the usage environment, for example: the main camera is installed on the TV terminal, with its lens facing directly forward or at a certain angle to the direction directly in front of the TV terminal; the auxiliary camera is placed on the ground, with its lens facing upward, parallel to the plane of the TV terminal screen or at a certain angle to that plane.
The composition of the gesture recognition unit is shown in Fig. 2; it comprises a gesture modeling module (not shown in the figure), an initialization module, an image preprocessing module, a gesture detection and tracking module, a gesture feature extraction module and a gesture recognition module.
Gesture modeling module: performs mathematical modeling of gestures, maps gesture motions into a model parameter space, establishes the relevant functional requirements, processing principles and associations, and determines the cooperation flow and rules between the main and auxiliary cameras.
Initialization module: on first use, performs the relevant parameter initialization automatically according to the placement and usage environment of the two cameras.
Image preprocessing module: performs noise reduction, enhancement, color conversion and similar operations on the input images, making them suitable for the gesture detection and tracking module and the gesture feature extraction module.
Gesture detection and tracking module: segments the gesture from the image and tracks it, using a method that combines features with motion information. Human skin color is used as a feature to classify image regions similar to skin color as candidate regions; part of the noise is filtered out according to the area parameters of the hand and face; most of the remaining noise regions can then be filtered out using the shape parameters (aspect ratio, etc.) of the hand and face, separating the hand and face regions. Finally, a frame-difference method is used to judge motion information: the current frame is subtracted from the previous frame (or several previous frames), and the difference between the two frames determines the hand motion region. (An illustrative sketch follows this module list.)
Gesture feature extraction module: extracts static features (contour, position, area, finger distribution, etc.) and dynamic features (spatial and temporal changes).
Gesture recognition module: adopts a combination of static and dynamic recognition. Static recognition uses template matching: static gestures are collected as samples and their features are extracted to form a template feature library, and the input gesture is matched against the template library. Dynamic recognition adopts an HMM algorithm.
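To make the gesture detection and tracking module more concrete, the following is a minimal sketch (again not part of the patent) of skin-color candidate segmentation with area and shape filtering, plus frame differencing for the motion cue. The skin range, minimum area, aspect-ratio limit and difference threshold are all illustrative assumptions.

```python
# Illustrative sketch only (OpenCV 4.x assumed): skin-color candidates filtered
# by area/shape, combined with frame differencing, loosely following the
# detection and tracking module described above. All thresholds are assumptions.
import cv2

SKIN_LOW, SKIN_HIGH = (0, 133, 77), (255, 173, 127)  # YCrCb skin range (assumed)
MIN_AREA, MAX_ASPECT = 1500, 3.0                      # area / shape filters (assumed)

def hand_candidates(frame):
    """Return bounding boxes of skin-colored regions that pass area/shape filters."""
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = []
    for c in contours:
        if cv2.contourArea(c) < MIN_AREA:               # drop small noise by area
            continue
        x, y, w, h = cv2.boundingRect(c)
        if max(w, h) / max(1, min(w, h)) > MAX_ASPECT:  # drop elongated non-hand shapes
            continue
        boxes.append((x, y, w, h))
    return boxes

def moving_mask(prev_gray, cur_gray, thresh=25):
    """Frame difference: keep the pixels that changed between consecutive frames."""
    diff = cv2.absdiff(cur_gray, prev_gray)
    _, motion = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    return motion

# Usage idea: intersect hand_candidates(frame) with moving_mask(...) so the moving
# skin-colored region (the hand) is kept while the largely static face is discarded.
```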
The imitated 3D gesture recognition method of the present invention can be carried out in the following steps:
1. Initialization: on first use, the initial parameters are set automatically according to the placement positions and angles of the two cameras, the usage environment, and so on. Following the on-screen prompts, the user makes several gesture motions (such as up, down, left, right and forward) to complete the initialization.
2. Image collection: the two cameras collect images from different viewing angles, and the images are preprocessed; the preprocessing includes noise reduction, enhancement and color conversion.
3. Gesture segmentation: hand candidate regions are separated and locked in the image according to parameters such as hand skin color, shape and area. Human skin color falls within a particular range in color space, so skin color is used as a feature to classify image regions similar to skin color as candidate regions. Within the separated candidate regions, area directly distinguishes the hand and face from noise regions, so part of the noise is filtered out according to area parameters. Finally, most of the remaining noise regions can be filtered out using the shape parameters of the hand (such as finger distribution characteristics), separating out the hand region.
4. Gesture tracking: a frame-difference method is used to isolate the hand region and track the hand position; the current frame is subtracted from the previous frame (or several previous frames), and the difference between the two frames is used to determine the starting region of the hand motion and the palm position.
5. Feature extraction: features such as hand shape, three-dimensional position and finger distribution are extracted from the images of the two cameras using two sets of algorithms (an illustrative sketch follows this list). The contour of the hand can be represented by a sequence of points: the edges of the image region are found with an edge-detection algorithm and processed with smoothing and polygon-fitting algorithms to obtain a sequence of polygon vertices. The position feature of the hand is the centroid of the palm, computed with the MeanShift algorithm. The palm area feature is obtained by computing the area of the polygon enclosed by the hand contour. Using the rule that the fingers are farthest from the palm center, the coordinates of the contour points are converted into a polar coordinate system with the palm center as origin; the local maxima of the contour curve are then analyzed and finally mapped to fingertip positions by a mapping rule, determining the finger distribution features.
6. Gesture recognition: static library matching recognizes changes of hand shape and the like, which are converted into specific instructions; a dynamic algorithm performs spatial localization, cursor locking and the corresponding operations. Static gestures are collected as samples and their features extracted to form a template feature library; the input gesture is matched against the template library. Dynamic gestures vary in time and space; the gesture features are quantized and encoded into a temporal symbol sequence, and dynamic recognition adopts an HMM algorithm (an illustrative HMM sketch follows the next paragraph).
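As a hedged illustration of step 5, the sketch below extracts the static features named there (contour polygon, palm centroid, palm area, finger count via polar local maxima) from a binary hand mask. The patent computes the palm centroid with the MeanShift algorithm; image moments are substituted here only to keep the sketch short, and the function name and all thresholds are assumptions.

```python
# Illustrative sketch only (OpenCV 4.x assumed): static hand features from a
# binary hand mask. The centroid uses image moments instead of MeanShift, and
# the fingertip rule (radius > 1.5 * mean radius) is an illustrative assumption.
import cv2
import numpy as np

def static_hand_features(hand_mask):
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)

    # Contour as a polygon vertex sequence (smoothing via polygon fitting).
    polygon = cv2.approxPolyDP(contour, 0.01 * cv2.arcLength(contour, True), True)

    # Palm centroid from image moments (stand-in for the MeanShift position feature).
    m = cv2.moments(contour)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]

    # Palm area from the contour.
    area = cv2.contourArea(contour)

    # Finger distribution: contour points in polar coordinates around the centroid;
    # fingertips are local maxima of the radius that stand well above the mean.
    pts = contour.reshape(-1, 2).astype(np.float64)
    radius = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
    fingers = 0
    for i in range(len(radius)):
        prev_r, next_r = radius[i - 1], radius[(i + 1) % len(radius)]
        if radius[i] > prev_r and radius[i] > next_r and radius[i] > 1.5 * radius.mean():
            fingers += 1

    return {"polygon": polygon, "centroid": (cx, cy), "area": area, "fingers": fingers}
```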
The gesture information recognized by the above steps is converted into the corresponding control command and sent to the TV terminal, so that the TV terminal can be controlled accordingly. In the present invention, control of the TV is realized mainly by mapping the gesture trajectory to mouse events: for example, moving the gesture up, down, left or right corresponds to the position of the mouse, and moving the gesture forward or backward corresponds to a mouse click. Taking basic control of a TV program as an example: while watching a program, drawing a circle clockwise with the palm brings up the control interface; sliding the palm moves the cursor to the corresponding control icon; and pushing the palm forward executes the control of the corresponding parameter, such as channel switching or volume adjustment.
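Step 6 states that dynamic gestures are quantized into temporal symbol sequences and recognized with an HMM algorithm, but gives no model parameters. Under that assumption only, a scaled forward-algorithm scorer for one gesture model might look like the following; the state set, the quantized motion symbols and every probability matrix are placeholders, and in practice one trained model per gesture would be scored and the most likely one chosen.

```python
# Illustrative sketch only: forward algorithm of a discrete-observation HMM,
# used to score a quantized gesture symbol sequence against one gesture model.
# All probabilities below are placeholder values, not parameters from the patent.
import numpy as np

def hmm_forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Return log P(obs | model) using the scaled forward recursion."""
    alpha = start_p * emit_p[:, obs[0]]                # initialization
    log_lik = 0.0
    for t in range(1, len(obs)):
        alpha = (alpha @ trans_p) * emit_p[:, obs[t]]  # induction step
        scale = alpha.sum()                            # rescale to avoid underflow
        log_lik += np.log(scale)
        alpha /= scale
    return log_lik + np.log(alpha.sum())

# Toy 3-state model over 4 quantized motion symbols (e.g. up/down/left/right).
start_p = np.array([0.6, 0.3, 0.1])
trans_p = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.7, 0.2],
                    [0.1, 0.2, 0.7]])
emit_p = np.array([[0.7, 0.1, 0.1, 0.1],
                   [0.1, 0.7, 0.1, 0.1],
                   [0.1, 0.1, 0.4, 0.4]])

observed = [0, 0, 1, 1, 2]  # assumed quantized gesture trajectory
print(hmm_forward_log_likelihood(observed, start_p, trans_p, emit_p))
```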

Claims (10)

1. An imitated 3D gesture recognition system, characterized in that it comprises a gesture recognition unit, a main camera and an auxiliary camera;
the main camera is used to collect, in front of the TV terminal screen, gesture motion information parallel to the plane of the screen, and to transmit it to the gesture recognition unit;
the auxiliary camera is used to collect, in front of the TV terminal screen, gesture motion information perpendicular to the plane of the screen, and to transmit it to the gesture recognition unit;
the gesture recognition unit is used to analyze the gesture motion information collected by the main camera in combination with the gesture information collected by the auxiliary camera, recognize the user's gesture motion in three-dimensional space, convert it into the corresponding command and transmit the command to the TV terminal.
2. The imitated 3D gesture recognition system as claimed in claim 1, characterized in that the main camera is installed on the TV terminal, with its lens facing directly forward or at a certain angle to the direction directly in front of the TV terminal; the auxiliary camera is placed on the ground, with its lens facing upward, parallel to the plane of the TV terminal screen or at a certain angle to that plane.
3. The imitated 3D gesture recognition system as claimed in claim 1 or 2, characterized in that the main camera and the auxiliary camera are both 2D cameras.
4. The imitated 3D gesture recognition system as claimed in claim 1 or 2, characterized in that the gesture recognition unit comprises:
a gesture modeling module, used to perform mathematical modeling of gesture motions, map them into a model parameter space, establish the relevant functional requirements, processing principles and associations, and establish the cooperation flow and rules between the main camera and the auxiliary camera;
an initialization module, used to perform the relevant parameter initialization automatically, according to the placement and usage environment of the main camera and the auxiliary camera, when the imitated 3D gesture recognition system is used for the first time;
an image preprocessing module, used to perform the corresponding preprocessing on the images collected by the main camera and the auxiliary camera;
a gesture detection and tracking module, used to segment the gesture motion from the preprocessed images, determine the hand motion region and track it;
a gesture feature extraction module, used to extract the static features and dynamic features of the gesture motion;
a gesture recognition module, used to recognize the gesture motion by combining the extracted static features and dynamic features, convert it into the corresponding command and transmit the command to the TV terminal.
5. An imitated 3D gesture recognition method, characterized in that it comprises the following steps:
a. the user issues a gesture command toward the TV terminal;
b. the main camera and the auxiliary camera collect images containing the gesture motion information from different viewing angles and transmit them to the gesture recognition unit, which performs the corresponding preprocessing;
c. the gesture recognition unit segments the gesture motion from the preprocessed images, determines the hand motion region and tracks it;
d. the gesture recognition unit extracts the static features and dynamic features of the gesture motion;
e. the gesture recognition unit recognizes the gesture motion by combining its static features and dynamic features, converts it into the corresponding command and transmits the command to the TV terminal;
f. the TV terminal receives the corresponding command and executes it.
6. The imitated 3D gesture recognition method as claimed in claim 5, characterized in that, before step a, it further comprises the step:
a0. when imitated 3D gesture recognition is performed for the first time, the gesture recognition unit performs the relevant parameter initialization automatically according to the placement and usage environment of the main camera and the auxiliary camera.
7. The imitated 3D gesture recognition method as claimed in claim 5 or 6, characterized in that, in step b, the corresponding preprocessing comprises noise reduction, enhancement and color conversion of the images.
8. The imitated 3D gesture recognition method as claimed in claim 5 or 6, characterized in that, in step c, segmenting the gesture motion from the preprocessed images and tracking it means separating and locking the hand candidate region in the preprocessed images according to hand skin color, shape and area parameters.
9. The imitated 3D gesture recognition method as claimed in claim 5 or 6, characterized in that, in step e, recognizing the gesture motion by combining its static features and dynamic features means comparing the static feature information of the gesture motion with the gesture feature information in a template feature library established in advance, and computing the dynamic feature information of the gesture motion with an HMM algorithm, thereby recognizing the gesture motion.
10. The imitated 3D gesture recognition method as claimed in claim 5 or 6, characterized in that, in step d, the static features comprise hand contour, position, area and finger distribution information, and the dynamic features comprise the spatial and temporal change information of the gesture motion.
CN2012101032104A 2012-04-10 2012-04-10 Imitated 3D gesture recognition system and method Pending CN102663364A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012101032104A CN102663364A (en) 2012-04-10 2012-04-10 Imitated 3D gesture recognition system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012101032104A CN102663364A (en) 2012-04-10 2012-04-10 Imitated 3D gesture recognition system and method

Publications (1)

Publication Number Publication Date
CN102663364A true CN102663364A (en) 2012-09-12

Family

ID=46772848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012101032104A Pending CN102663364A (en) 2012-04-10 2012-04-10 Imitated 3D gesture recognition system and method

Country Status (1)

Country Link
CN (1) CN102663364A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101729808A (en) * 2008-10-14 2010-06-09 Tcl集团股份有限公司 Remote control method for television and system for remotely controlling television by same
CN101807114A (en) * 2010-04-02 2010-08-18 浙江大学 Natural interactive method based on three-dimensional gestures

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107256089B (en) * 2012-10-17 2020-07-03 原相科技股份有限公司 Gesture recognition method by natural image
CN103778405A (en) * 2012-10-17 2014-05-07 原相科技股份有限公司 Method for gesture recognition through natural images
CN107256089A (en) * 2012-10-17 2017-10-17 原相科技股份有限公司 The gesture identification method carried out with natural image
CN103778405B (en) * 2012-10-17 2017-07-04 原相科技股份有限公司 With the gesture identification that natural image is carried out
CN103019389A (en) * 2013-01-12 2013-04-03 福建华映显示科技有限公司 Gesture recognition system and gesture recognition method
CN104007918B (en) * 2013-02-27 2018-03-27 联想(北京)有限公司 A kind of data processing method and a kind of electronic equipment
CN104007918A (en) * 2013-02-27 2014-08-27 联想(北京)有限公司 Data processing method and electronic device
CN104143074A (en) * 2013-05-07 2014-11-12 李东舸 Method and equipment for generating motion feature codes on the basis of motion feature information
CN104142939A (en) * 2013-05-07 2014-11-12 李东舸 Method and device for matching feature codes based on motion feature information
CN104142939B (en) * 2013-05-07 2019-07-02 杭州智棱科技有限公司 A kind of method and apparatus based on body dynamics information matching characteristic code
CN104216623A (en) * 2013-05-29 2014-12-17 鸿富锦精密工业(深圳)有限公司 Household man-machine interaction control system
CN104238721A (en) * 2013-06-06 2014-12-24 由田新技股份有限公司 Interface editing method for editable media interaction device and media interaction platform
CN103345627B (en) * 2013-07-23 2016-03-30 清华大学 Action identification method and device
CN103345627A (en) * 2013-07-23 2013-10-09 清华大学 Action recognition method and device
CN103413080A (en) * 2013-08-20 2013-11-27 苏州跨界软件科技有限公司 Password protection realization method based on gesture
CN104038799A (en) * 2014-05-21 2014-09-10 南京大学 Three-dimensional television-oriented gesture manipulation method
CN105323619A (en) * 2014-08-04 2016-02-10 深圳市同方多媒体科技有限公司 Gesture control method and gesture control television based on analog button board
CN105487646A (en) * 2014-09-16 2016-04-13 洪永川 Intelligent terminal system with gesture remote function
CN105491425A (en) * 2014-09-16 2016-04-13 洪永川 Methods for gesture recognition and television remote control
CN104486654A (en) * 2014-12-15 2015-04-01 四川长虹电器股份有限公司 Method for providing guidance and television
CN105278667A (en) * 2014-12-16 2016-01-27 维沃移动通信有限公司 Data interaction method, data interaction system and mobile terminal
WO2016095641A1 (en) * 2014-12-16 2016-06-23 维沃移动通信有限公司 Data interaction method and system, and mobile terminal
CN104820472A (en) * 2015-05-13 2015-08-05 瑞声声学科技(深圳)有限公司 A mobile terminal
CN106650554A (en) * 2015-10-30 2017-05-10 成都理想境界科技有限公司 Static hand gesture identification method
TWI618027B (en) * 2016-05-04 2018-03-11 國立高雄應用科技大學 3d hand gesture image recognition method and system thereof with ga
CN106295531A (en) * 2016-08-01 2017-01-04 乐视控股(北京)有限公司 A kind of gesture identification method and device and virtual reality terminal
CN106371682A (en) * 2016-09-13 2017-02-01 努比亚技术有限公司 Gesture recognition system based on proximity sensor and method thereof
CN107272893A (en) * 2017-06-05 2017-10-20 上海大学 Man-machine interactive system and method based on gesture control non-touch screen
CN108229391A (en) * 2018-01-02 2018-06-29 京东方科技集团股份有限公司 Gesture identifying device and its server, gesture recognition system, gesture identification method
US10725553B2 (en) 2018-01-02 2020-07-28 Boe Technology Group Co., Ltd. Gesture recognition device, gesture recognition method, and gesture recognition system
CN114627561A (en) * 2022-05-16 2022-06-14 南昌虚拟现实研究院股份有限公司 Dynamic gesture recognition method and device, readable storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN102663364A (en) Imitated 3D gesture recognition system and method
CN103353935B (en) A kind of 3D dynamic gesture identification method for intelligent domestic system
CN106598226B (en) A kind of unmanned plane man-machine interaction method based on binocular vision and deep learning
CN102184021B (en) Television man-machine interaction method based on handwriting input and fingertip mouse
CN105425964B (en) A kind of gesture identification method and system
CN100487636C (en) Game control system and method based on stereo vision
CN103605466A (en) Facial recognition control terminal based method
CN102402289B (en) Mouse recognition method for gesture based on machine vision
CN107894836B (en) Human-computer interaction method for processing and displaying remote sensing image based on gesture and voice recognition
CN105739702A (en) Multi-posture fingertip tracking method for natural man-machine interaction
CN103294996A (en) 3D gesture recognition method
CN104428732A (en) Multimodal interaction with near-to-eye display
CN103995595A (en) Game somatosensory control method based on hand gestures
CN102426480A (en) Man-machine interactive system and real-time gesture tracking processing method for same
Song et al. Design of control system based on hand gesture recognition
CN103105924B (en) Man-machine interaction method and device
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
CN103000054B (en) Intelligent teaching machine for kitchen cooking and control method thereof
CN102831408A (en) Human face recognition method
CN109395375A (en) A kind of 3d gaming method of interface interacted based on augmented reality and movement
CN106469268A (en) A kind of contactless unlocking system and method based on gesture identification
KR101525011B1 (en) tangible virtual reality display control device based on NUI, and method thereof
Hu et al. Efficient face and gesture recognition techniques for robot control
KR101289883B1 (en) System and method for generating mask image applied in each threshold in region
Puri Gesture recognition based mouse events

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C12 Rejection of a patent application after its publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20120912