CN102426480A - Man-machine interactive system and real-time gesture tracking processing method for same - Google Patents


Info

Publication number
CN102426480A
Authority
CN
China
Prior art keywords
gesture
man
move
staff
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2011103429725A
Other languages
Chinese (zh)
Inventor
刘远民
陈大炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konka Group Co Ltd
Original Assignee
Konka Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konka Group Co Ltd filed Critical Konka Group Co Ltd
Priority to CN2011103429725A priority Critical patent/CN102426480A/en
Publication of CN102426480A publication Critical patent/CN102426480A/en
Pending legal-status Critical Current

Abstract

The invention discloses a man-machine interactive system and a real-time gesture tracking processing method for the same. The method comprises the following steps: obtaining image information of the user side; performing hand detection with a gesture detection unit to separate the gesture from the background, and automatically determining, by a vision algorithm, a small rectangular frame surrounding the hand as the region of interest in the image; calculating the hand contour state of every frame in the video sequence with a gesture tracking unit; checking the validity of the hand action according to the calculated contour states and determining the gesture motion the user has completed; and generating the corresponding gesture control instruction according to the determined gesture motion, to which the three-dimensional user interface gives corresponding feedback. By sensing all or part of the user's gesture motions, the system and method accomplish accurate gesture tracking and thus provide a real-time, robust solution for an effective gesture-based human-machine interface built on an ordinary vision sensor.

Description

Man-machine interactive system and real-time gesture tracking processing method thereof
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a man-machine interactive system and a real-time gesture tracking processing method thereof.
Background technology
Human-computer interaction technology is one of the fastest-developing areas of current user-interface research, and every country attaches great importance to it. The United States lists the human-machine interface among its national key technologies as one of the six key information technologies, alongside software and computers; in the U.S. defense key-technology list, the human-machine interface is not only an important part of software technology but also one of eleven key technologies alongside computers and software. The European Community's information technology research and development strategic plan (ESPRIT) likewise set up a dedicated user-interface technology project, including a multimodal human-computer interaction interface (Multi-Modal Interface for Man-Machine Interaction). Keeping a leading position in this field is vital to intelligent computer systems as a whole.
About 80% of the information a human obtains comes from vision; therefore, from the standpoint of cognitive psychology, human-computer interaction based on machine vision is an important means of solving the interaction problem. Gesture is a very natural and intuitive communication channel in human-computer interaction, so studying gesture detection, tracking, and recognition not only helps realize natural human-computer interaction but also helps robots acquire skills by imitating a user's demonstrated movements.
Because gestures are diverse, ambiguous, and variable in both time and space, and because the human hand is a complex deformable body and vision itself is an ill-posed problem, vision-based gesture recognition is a multidisciplinary and challenging research topic.
At present there are three main approaches to gesture-based human-computer interaction. The first, represented by MIT, uses devices such as data gloves and data suits to track the motion of the hand and body to accomplish the interaction. The second is the somatosensory game represented by Microsoft, which uses a depth camera and an RGB camera to track the position of the hand and body. Both of these are expensive and thus unsuitable for wide use by enterprises, especially fiercely competitive household-appliance makers. The third is the well-known HandVu, which works with an ordinary camera and has the advantages of low cost and good real-time performance, but it is strongly affected by the external environment during tracking and cannot solve the tracking failures caused by illumination and complex backgrounds.
The Kinect somatosensory game system released by Microsoft in 2010 is favored by consumers for its natural and intuitive interaction. It uses dual cameras (a depth camera and an RGB camera) for multi-sensor information fusion and therefore achieves high gesture detection and tracking accuracy, but at high cost. By contrast, a real-time gesture detection and tracking system based on an ordinary single camera has a strong advantage in this respect, but its hand tracking and detection accuracy are deficient, mainly because: (1) the hand itself is not a rigid body and may deform to varying degrees during motion; (2) illumination conditions affect it and vary; (3) there is no confidence module for target tracking, so the tracking failures caused when the system tracks other targets are hard to resolve.
The prior art therefore awaits improvement and development.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the above defects of the prior art, to provide a man-machine interactive system and a real-time gesture tracking processing method thereof. The invention solves the inaccurate tracking and detection of non-rigid targets such as the hand under an ordinary single camera, as well as the tracking failures caused by illumination and complex backgrounds. It uses computer vision and image processing techniques to realize automatic hand detection, tracking, and gesture recognition that is real-time, robust, and easy to implement and operate, allowing the computer user to interact with the computer through hand postures in a more natural, intuitive, and intelligent way.
The technical scheme adopted by the present invention to solve the technical problem is as follows:
A real-time gesture tracking processing method of a man-machine interactive system, comprising the steps of:
A. obtaining image information of the user side and performing corresponding image noise reduction and enhancement;
B. performing hand detection on the processed image information by a gesture detection unit to separate the gesture from the background, and automatically determining, by a vision algorithm, a small rectangular frame surrounding the hand in the image as the region of interest;
C. performing, by a gesture tracking unit, sub-pixel tracking of gesture feature points within the region of interest, and calculating the hand contour state of every frame in the video sequence;
D. checking the validity of the hand action according to the calculated hand contour states, performing gesture recognition to classify the trajectory of a predefined gesture completed by the user, and determining the gesture motion the user has completed;
E. generating the corresponding gesture control instruction according to the determined gesture motion, and sending it to the three-dimensional user interface;
F. the three-dimensional user interface giving corresponding feedback according to the gesture control instruction.
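Purely as an illustrative sketch (not part of the claimed method), steps A–F above can be organized as a per-frame processing loop; every helper name below is a hypothetical placeholder rather than the patented implementation:

```python
# Sketch of the A-F pipeline; all helper functions are invented stand-ins.

def process_frame(frame, tracker_state):
    """Run one frame through the gesture pipeline (steps A-E)."""
    frame = denoise_and_enhance(frame)                  # step A
    roi = detect_hand(frame)                            # step B: box or None
    if roi is None:
        return tracker_state, None
    contour = track_contour(frame, roi, tracker_state)  # step C
    tracker_state = tracker_state + [contour]
    gesture = classify_gesture(tracker_state)           # step D
    command = gesture_to_command(gesture)               # step E
    return tracker_state, command

# Trivial stand-ins so the sketch runs end to end.
def denoise_and_enhance(frame): return frame
def detect_hand(frame): return (10, 10, 50, 50)         # (x, y, w, h)
def track_contour(frame, roi, state): return {"pos": roi[:2]}
def classify_gesture(state): return "move_left" if len(state) > 1 else None
def gesture_to_command(g): return {"move_left": "LEFT"}.get(g)
```

Step F is then the interface's reaction to the returned command; the real units would replace the stand-ins with the detection, tracking, and recognition described below.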
In the real-time gesture tracking processing method of the man-machine interactive system, before step A there is also: a. a stereoscopic display unit displaying a three-dimensional stereoscopic image and a three-dimensional graphical user interface.
In the method, step A specifically comprises:
A1. a video image acquiring unit obtaining depth image information of the environment where the user is located;
A2. an image processing unit performing denoising and target enhancement on the image information obtained by the video image acquiring unit.
In the method, the hand contour state in step C comprises: position, rotation angle, scaling amount, and the length and angle of each finger.
In the method, step D also comprises: the criterion for judging whether a gesture motion has begun is that, among the hand detection results of 20 consecutive frames, more than 12 frames detect the hand at the same position.
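The 12-of-20-frames start criterion can be sketched as a sliding-window check; the position tolerance `tol` is an assumed parameter, since the text leaves "same position" unquantified:

```python
def gesture_started(detections, window=20, needed=12, tol=10):
    """Return True when, among the last `window` frames, more than `needed`
    detections fall at (nearly) the same position.
    `detections` is one (x, y) hand centre or None per frame; `tol` is an
    assumed pixel tolerance, not a value given in the source."""
    recent = detections[-window:]
    if len(recent) < window:
        return False
    points = [p for p in recent if p is not None]
    if not points:
        return False
    # Count detections within `tol` of the most recent detection.
    ref = points[-1]
    close = sum(1 for (x, y) in points
                if abs(x - ref[0]) <= tol and abs(y - ref[1]) <= tol)
    return close > needed
```

A caller would feed this one detection result per frame and begin trajectory recording once it returns True.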
In the method, the gesture motions in step D comprise: moving left, moving right, moving up, and moving down.
In the method, generating the corresponding gesture control instruction according to the determined gesture motion in step E comprises:
E1. recognizing a stationary hand position as a click, and generating the corresponding click control instruction;
E2. recognizing fast left, right, up, and down movements of the hand position as left, right, up, and down commands, and generating the corresponding move-left, move-right, move-up, and move-down control instructions;
E3. recognizing a waving motion of the hand position as a close action, and generating the corresponding close control instruction.
A man-machine interactive system, comprising:
a video image acquiring unit for obtaining depth image information of the environment where the user is located;
an image processing unit for performing denoising and target enhancement on the image information obtained by the video image acquiring unit;
a gesture detection unit for performing hand detection on the processed image information, separating the gesture from the background, and automatically determining, by a vision algorithm, a small rectangular frame surrounding the hand in the image as the region of interest;
a gesture tracking unit for performing sub-pixel tracking of gesture feature points within the region of interest and calculating the hand contour state of every frame in the video sequence;
a gesture validity detection and gesture motion confirmation unit for checking the validity of the hand action according to the calculated hand contour states, performing gesture recognition to classify the trajectory of a predefined gesture completed by the user, and determining the gesture motion the user has completed;
a gesture control instruction generation unit for generating the corresponding gesture control instruction according to the determined gesture motion and sending it to the three-dimensional user interface;
a stereoscopic display unit for displaying a three-dimensional stereoscopic image and a three-dimensional graphical user interface and for giving corresponding feedback according to the gesture control instruction.
In the system, the video image acquiring unit is a camera.
In the system, the hand contour state comprises: position, rotation angle, scaling amount, and the length and angle of each finger.
The gesture control instruction generation unit further comprises:
a first generation module for recognizing a stationary hand position as a click and generating the corresponding click control instruction;
a second generation module for recognizing fast left, right, up, and down movements of the hand position as left, right, up, and down commands and generating the corresponding move-left, move-right, move-up, and move-down control instructions;
a third generation module for recognizing a waving motion of the hand position as a close action and generating the corresponding close control instruction.
In the man-machine interactive system and real-time gesture tracking processing method provided by the present invention, the image sensing and processing units on the three-dimensional stereoscopic display device sense all or part of the user's gesture motions and accomplish accurate gesture tracking, thereby providing an effective real-time gesture tracking solution for a gesture-based human-machine interface built on an ordinary vision sensor. The invention uses computer vision and image processing techniques to realize automatic hand detection, tracking, and gesture recognition that is real-time, robust, and easy to implement and operate, allowing the computer user to interact with the computer through hand postures in a more natural, intuitive, and intelligent way. It can be used in applications such as intelligent appliances, human-computer interaction, and VR platforms, including intelligent televisions, the interaction of other intelligent appliance products, various somatosensory games, and related VR-platform products; the invention therefore also has great economic worth and practical value.
Description of drawings
Fig. 1 is a flowchart of the real-time gesture tracking processing method of the man-machine interactive system of the embodiment of the invention.
Fig. 2 shows the cascade structure of the hand classifier of the embodiment of the invention.
Fig. 3 is a functional schematic block diagram of the man-machine interactive system of the embodiment of the invention.
Fig. 4 is a schematic diagram of the hardware configuration of the man-machine interactive system of the embodiment of the invention.
Embodiment
To make the objects, technical scheme, and advantages of the invention clearer and more definite, the man-machine interactive system and real-time gesture tracking processing method provided by the invention are further explained below through embodiments with reference to the accompanying drawings. It should be appreciated that the specific embodiments described here are only for explaining the invention and are not intended to limit it.
The hardware required in the embodiment of the invention is shown in Fig. 4: a computer 300 and a video image acquisition device 400. The real-time gesture tracking processing method of a man-machine interactive system provided by the embodiment of the invention is shown in Fig. 1 and comprises the steps:
Step S110: the stereoscopic display unit displays a three-dimensional stereoscopic image and a three-dimensional graphical user interface.
For example, the display screen of the computer 300 of the man-machine interactive system shows a three-dimensional stereoscopic image and a three-dimensional graphical user interface on which human-computer interaction can be realized.
Step S120: obtain image information of the user side and perform corresponding image noise reduction and enhancement.
For example, when human-computer interaction is needed, the depth image information of the environment where the user (500 in Fig. 4) is located can be obtained by the video image acquiring unit (such as a camera), and the image processing unit performs denoising and target enhancement on the image information obtained by the video image acquiring unit, providing an effective guarantee for the following gesture detection and tracking. Then proceed to step S130.
Step S130: perform hand detection on the processed image information by the gesture detection unit, separate the gesture from the background, and automatically determine, by a vision algorithm, a small rectangular frame surrounding the hand in the image as the region of interest.
Separating the gesture from the background in this step makes feature-point extraction convenient for target tracking, and setting the region of interest guarantees the real-time requirement of the system.
Hand detection in the embodiment of the invention adopts the histogram of oriented gradients (HOG) feature and is realized by a statistical learning method based on Adaboost.
The statistical learning method used to learn the hand pattern is the Adaboost algorithm, a mature algorithm used extremely widely in face detection. By repeatedly invoking a weak learner, it concentrates on the hard-to-learn samples in the training set and thereby reaches high generalization accuracy. Its main procedure is: first, a training sample set is given; the sample set is then iterated over, each iteration training a weak classifier on the selected features; the error rate of this hypothesis is computed, the weight of each example is changed according to that error rate, and the next iteration begins; finally, several weak classifiers are cascaded into a strong classifier. The final classifier is formed by a cascade of such strong classifiers, and its classification ability grows with the number of strong classifiers in the cascade. In Fig. 2, 1, 2, ..., M are the cascaded strong classifiers; T indicates that a candidate region is accepted by a strong classifier (i.e., considered a hand region), and F indicates that it is rejected and excluded (i.e., considered a non-hand region). Only a candidate region accepted by all the strong classifiers is taken as a real hand region; rejection by any strong classifier marks it as a non-hand region.
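The cascade decision rule of Fig. 2 (T = accepted, F = rejected) can be sketched as follows; the toy stages and their "features" are invented for illustration and are not the trained HOG/Adaboost classifiers of the invention:

```python
def cascade_classify(region, stages):
    """Cascade decision rule as in Fig. 2: a candidate region counts as a
    hand only if every strong classifier (stage) accepts it; the first
    rejection (F) discards it immediately.
    Each stage is (weights, weak_classifiers, threshold); each weak
    classifier maps a region to +1 or -1."""
    for weights, weaks, threshold in stages:
        score = sum(w * h(region) for w, h in zip(weights, weaks))
        if score < threshold:
            return False          # F: rejected -> non-hand region
    return True                   # T: accepted by all stages -> hand region

# Toy stages: weak learners thresholding two invented region "features".
stage1 = ([1.0], [lambda r: 1 if r["brightness"] > 0.5 else -1], 0.0)
stage2 = ([0.7, 0.3],
          [lambda r: 1 if r["edges"] > 10 else -1,
           lambda r: 1 if r["brightness"] > 0.3 else -1], 0.0)
```

The early-rejection structure is what makes the cascade fast: most non-hand regions fail the cheap first stage and never reach the later ones.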
Step S140: through the gesture tracking unit, perform sub-pixel tracking of the gesture feature points within the region of interest of the image information, and calculate the hand contour state of every frame in the video sequence.
Gesture tracking provides the information for the next step of gesture analysis, where the hand contour state comprises: position, rotation angle, scaling amount, and the length and angle of each finger.
The sub-pixel is understood as follows: on the imaging surface of an area-array camera, the pixel is the smallest unit. A certain CMOS camera chip, for example, has a pixel pitch of 5.2 microns. When the camera shoots, the continuous physical world is discretized: each pixel represents only the color near it on the imaging surface, and how "near" is hard to state precisely. Two adjacent pixels are 5.2 microns apart; macroscopically they appear joined together, but microscopically there is still infinitely finer detail between them, which we call the "sub-pixel". The sub-pixel does exist physically; the hardware simply has no sensor fine enough to detect it, so it is computed approximately in software.
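One common way to compute such a sub-pixel position approximately in software (an assumption here, not necessarily the method of the invention) is parabolic interpolation of a matching score around the best integer pixel:

```python
def subpixel_peak(scores):
    """Refine an integer peak position to sub-pixel accuracy by fitting a
    parabola through the peak sample and its two neighbours.
    `scores` is a 1-D list of matching scores; returns a float index.
    A common approximation, not necessarily the patent's method."""
    i = max(range(len(scores)), key=scores.__getitem__)
    if i == 0 or i == len(scores) - 1:
        return float(i)           # peak at the border: no refinement
    left, centre, right = scores[i - 1], scores[i], scores[i + 1]
    denom = left - 2 * centre + right
    if denom == 0:
        return float(i)
    # Vertex of the parabola through the three samples.
    return i + 0.5 * (left - right) / denom
```

A 2-D feature point would apply the same fit independently along each axis of the score surface.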
Step S150: check the validity of the hand action according to the calculated hand contour states, perform gesture recognition to classify the trajectory of a predefined gesture completed by the user, and determine the gesture motion the user has completed. In the present embodiment, the gesture motions comprise: moving left, moving right, moving up, and moving down. The criterion for judging whether a gesture motion has begun is that, among the hand detection results of 20 consecutive frames, more than 12 frames detect the hand at the same position.
Gesture recognition in this embodiment of the invention is realized by a hidden Markov model (HMM).
The steps of the gesture recognition of the invention comprise:
Step 151: preprocess the gesture trajectory obtained from contour tracking to remove dense points, obtaining the preprocessed trajectory;
Step 152: extract direction-encoding features from the preprocessed trajectory and normalize the features;
Step 153: compute, with the forward recursion algorithm, the probabilities of the features obtained in step 152 under all classes of gesture models; the model with the maximum probability gives the recognition result.
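Step 153's forward recursion can be sketched for a discrete HMM as follows; the two-state toy models are invented purely for illustration:

```python
def forward_probability(obs, pi, A, B):
    """Forward-recursion likelihood P(obs | model) for a discrete HMM.
    pi: initial state probabilities; A[i][j]: transition probability;
    B[i][k]: probability of emitting symbol k from state i."""
    alpha = [pi[i] * B[i][obs[0]] for i in range(len(pi))]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(len(pi))) * B[j][o]
                 for j in range(len(pi))]
    return sum(alpha)

def recognize(obs, models):
    """Step 153: score every gesture model and return the most probable."""
    return max(models, key=lambda name: forward_probability(obs, *models[name]))

# Toy two-state models (invented): "left" tends to emit direction code 0,
# "right" tends to emit direction code 1.
MODELS = {
    "left":  ([1.0, 0.0], [[0.9, 0.1], [0.1, 0.9]], [[0.9, 0.1], [0.1, 0.9]]),
    "right": ([1.0, 0.0], [[0.9, 0.1], [0.1, 0.9]], [[0.1, 0.9], [0.9, 0.1]]),
}
```

Here `obs` would be the normalized direction-encoding sequence from step 152; a production system would train one model per gesture class rather than hand-pick the probabilities.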
The hand contour tracking of the invention combines conditional probability density propagation with a heuristic scanning technique. The steps of the contour tracking algorithm are as follows:
Step 51: track the translation, rotation, and scaling motion components of the contour with the conditional probability density propagation (Condensation) algorithm, obtaining several candidate contours whose finger state components are not yet determined;
Step 52: for each candidate contour whose translation, rotation, and scaling components have been determined, progressively adjust the length and angle of each finger to obtain the finger motion state components, thereby producing final candidate contours with all state components determined;
Step 53: from all final candidate contours, select one contour as the tracking result.
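A minimal sketch of one Condensation (particle-filter) update for the translation component alone (steps 51 and 53), assuming a caller-supplied weighting function; a real tracker would also carry rotation, scaling, and the finger state of step 52:

```python
import random

def condensation_step(particles, weigh, jitter=1.0):
    """One Condensation update: resample candidate positions by weight,
    perturb them with process noise, re-weigh, and return the weighted
    mean as the tracking result.
    `particles` is a list of (x, y) hypotheses; `weigh` scores one."""
    weights = [weigh(p) for p in particles]
    total = sum(weights)
    probs = [w / total for w in weights]
    # Resample proportionally to weight, then add Gaussian process noise.
    resampled = random.choices(particles, weights=probs, k=len(particles))
    moved = [(x + random.gauss(0, jitter), y + random.gauss(0, jitter))
             for x, y in resampled]
    new_weights = [weigh(p) for p in moved]
    wsum = sum(new_weights)
    est = (sum(p[0] * w for p, w in zip(moved, new_weights)) / wsum,
           sum(p[1] * w for p, w in zip(moved, new_weights)) / wsum)
    return moved, est
```

In a contour tracker, `weigh` would measure how well the contour placed at a hypothesis matches the image edges; here any peaked function works for demonstration.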
Step S160: generate the corresponding gesture control instruction according to the determined gesture motion, and send it to the three-dimensional user interface.
Generating the corresponding gesture control instruction according to the determined gesture motion in this step comprises:
E1. recognizing a stationary hand position as a click, and generating the corresponding click control instruction;
E2. recognizing fast left, right, up, and down movements of the hand position as left, right, up, and down commands, and generating the corresponding move-left, move-right, move-up, and move-down control instructions;
E3. recognizing a waving motion of the hand position as a close action, and generating the corresponding close control instruction.
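The E1–E3 mapping can be sketched as a rule over a short window of hand positions; the pixel tolerance and the reversal-counting test for "waving" are assumptions for illustration, not values given in the invention:

```python
def gesture_to_command(positions, still_tol=5):
    """Map a window of (x, y) hand positions (image coordinates, y grows
    downward) to a UI command per E1-E3. `still_tol` is an assumed
    pixel tolerance; waving is detected as repeated direction reversals."""
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    xs = [p[0] for p in positions]
    reversals = sum(1 for a, b, c in zip(xs, xs[1:], xs[2:])
                    if (b - a) * (c - b) < 0)
    if reversals >= 2:
        return "CLOSE"                        # E3: waving motion
    if abs(dx) <= still_tol and abs(dy) <= still_tol:
        return "CLICK"                        # E1: hand held still
    if abs(dx) >= abs(dy):                    # E2: dominant axis wins
        return "LEFT" if dx < 0 else "RIGHT"
    return "UP" if dy < 0 else "DOWN"
```

The three-dimensional user interface of step S170 would then consume the returned command string.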
Step S170: the three-dimensional user interface gives corresponding feedback according to the gesture control instruction. For example, under the control of the user's gesture motion, the three-dimensional user interface shown by the stereoscopic display unit performs the corresponding action display, etc.
This embodiment of the invention therefore senses all or part of the user's gesture motions to accomplish accurate gesture tracking, thereby providing an effective real-time, robust solution for a gesture-based human-machine interface built on an ordinary vision sensor.
Based on the foregoing embodiment, the embodiment of the invention also provides a man-machine interactive system, shown in Fig. 3, which mainly comprises:
a video image acquiring unit 210 for obtaining depth image information of the environment where the user is located, specifically as described in step S120 above; the video image acquiring unit is a camera;
an image processing unit 220 for performing denoising and target enhancement on the image information obtained by the video image acquiring unit, specifically as described in step S120 above;
a gesture detection unit 230 for performing hand detection on the processed image information, separating the gesture from the background, and automatically determining, by a vision algorithm, a small rectangular frame surrounding the hand in the image as the region of interest, specifically as described in step S130 above;
a gesture tracking unit 240 for performing sub-pixel tracking of gesture feature points within the region of interest and calculating the hand contour state of every frame in the video sequence, specifically as described in step S140 above; the hand contour state comprises: position, rotation angle, scaling amount, and the length and angle of each finger;
a gesture validity detection and gesture motion confirmation unit 250 for checking the validity of the hand action according to the calculated hand contour states, performing gesture recognition to classify the trajectory of a predefined gesture completed by the user, and determining the gesture motion the user has completed, specifically as described in step S150 above;
a gesture control instruction generation unit 260 for generating the corresponding gesture control instruction according to the determined gesture motion and sending it to the three-dimensional user interface, specifically as described in step S160 above;
a stereoscopic display unit 270 for displaying a three-dimensional stereoscopic image and a three-dimensional graphical user interface and for giving corresponding feedback according to the gesture control instruction, specifically as described in step S170 above.
The gesture control instruction generation unit further comprises:
a first generation module for recognizing a stationary hand position as a click and generating the corresponding click control instruction;
a second generation module for recognizing fast left, right, up, and down movements of the hand position as left, right, up, and down commands and generating the corresponding move-left, move-right, move-up, and move-down control instructions;
a third generation module for recognizing a waving motion of the hand position as a close action and generating the corresponding close control instruction.
In summary, in the man-machine interactive system and real-time gesture tracking processing method provided by the present invention, the image sensing and processing units on the three-dimensional stereoscopic display device sense all or part of the user's gesture motions and accomplish accurate gesture tracking, thereby providing an effective real-time gesture tracking solution for a gesture-based human-machine interface built on an ordinary vision sensor. The invention uses computer vision and image processing techniques to realize automatic hand detection, tracking, and gesture recognition that is real-time, robust, and easy to implement and operate, allowing the computer user to interact with the computer through hand postures in a more natural, intuitive, and intelligent way. It can be used in applications such as intelligent appliances, human-computer interaction, and VR platforms, including intelligent televisions, the interaction of other intelligent appliance products, various somatosensory games, and related VR-platform products; the invention therefore also has great economic worth and practical value.
It should be understood that the application of the invention is not limited to the above examples; those of ordinary skill in the art can make improvements or transformations according to the above description, and all such improvements and transformations shall fall within the protection scope of the appended claims of the invention.

Claims (10)

1. A real-time gesture tracking processing method of a man-machine interactive system, characterized by comprising the steps of:
A. obtaining image information of the user side and performing corresponding image noise reduction and enhancement;
B. performing hand detection on the processed image information by a gesture detection unit to separate the gesture from the background, and automatically determining, by a vision algorithm, a small rectangular frame surrounding the hand in the image as the region of interest;
C. performing, by a gesture tracking unit, sub-pixel tracking of gesture feature points within the region of interest, and calculating the hand contour state of every frame in the video sequence;
D. checking the validity of the hand action according to the calculated hand contour states, performing gesture recognition to classify the trajectory of a predefined gesture completed by the user, and determining the gesture motion the user has completed;
E. generating the corresponding gesture control instruction according to the determined gesture motion, and sending it to the three-dimensional user interface;
F. the three-dimensional user interface giving corresponding feedback according to the gesture control instruction.
2. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that before step A there is also: a. a stereoscopic display unit displaying a three-dimensional stereoscopic image and a three-dimensional graphical user interface.
3. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that step A specifically comprises:
A1. a video image acquiring unit obtaining depth image information of the environment where the user is located;
A2. an image processing unit performing denoising and target enhancement on the image information obtained by the video image acquiring unit.
4. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that the hand contour state in step C comprises: position, rotation angle, scaling amount, and the length and angle of each finger.
5. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that step D also comprises: the criterion for judging whether a gesture motion has begun is that, among the hand detection results of 20 consecutive frames, more than 12 frames detect the hand at the same position.
6. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that the gesture actions in said step D comprise: moving left, moving right, moving up, and moving down.
7. The real-time gesture tracking processing method of the man-machine interactive system according to claim 1, characterized in that generating the corresponding gesture action control instruction according to the determined gesture action in said step E comprises:
E1. identifying the action as a click command when the gesture position remains stationary, and generating a corresponding click control instruction;
E2. identifying left, right, up and down commands from fast leftward, rightward, upward and downward movements of the gesture position, and generating corresponding move-left, move-right, move-up and move-down control instructions;
E3. identifying a close action from a waving motion of the gesture position, and generating a corresponding close control instruction.
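The command mapping of E1–E3 amounts to a small classifier over the tracked displacement of the gesture position. The speed threshold and the axis-dominance rule below are illustrative assumptions; only the command set itself comes from the claims:

```python
def classify_gesture(dx, dy, waving=False, speed_threshold=40.0):
    """Map a hand displacement (dx, dy), in pixels per detection window, to
    one of the control commands named in the claims.  `speed_threshold` and
    the axis-dominance tie-break are assumptions, not taken from the patent.
    Image coordinates are assumed: y grows downward."""
    if waving:
        return "close"          # E3: waving motion -> close command
    if abs(dx) < speed_threshold and abs(dy) < speed_threshold:
        return "click"          # E1: stationary gesture position -> click
    if abs(dx) >= abs(dy):      # E2: dominant axis picks the direction
        return "move_left" if dx < 0 else "move_right"
    return "move_up" if dy < 0 else "move_down"
```

In practice the threshold would be tuned to the camera frame rate and resolution, since "fast" movement in the claim is relative to both.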
8. A man-machine interactive system, characterized by comprising:
a video image acquiring unit, configured to obtain depth image information of the user and the environment the user belongs to;
a graphics processing unit, configured to perform denoising and target enhancement processing on the image information obtained by the video image acquiring unit;
a gesture detection unit, configured to perform hand detection on the processed image information, complete the separation of the gesture from the background, and automatically determine, through a vision algorithm, a smaller rectangular frame surrounding the hand in the image information as the region of interest;
a gesture tracking unit, configured to perform sub-pixel tracking of the gesture feature points within the region of interest of said image information, and to calculate the hand contour state of every frame in the video sequence;
a gesture validity detection and gesture action confirmation unit, configured to perform a validity check on the hand action according to the calculated hand contour state, and to perform gesture recognition and classification on the trajectory of the predefined gesture completed by the user, thereby determining the gesture action completed by the user;
a gesture control instruction generation unit, configured to generate a corresponding gesture action control instruction according to the determined gesture action, and to send the gesture action control instruction to the three-dimensional user interface; and
a stereoscopic image display unit, configured to display a three-dimensional stereoscopic image and a three-dimensional graphical user interface, and to make corresponding feedback according to said gesture action control instruction.
9. The man-machine interactive system according to claim 8, characterized in that said video image acquiring unit is a camera.
10. The man-machine interactive system according to claim 8, characterized in that said hand contour state comprises: position, rotation angle, scaling amount, and the length and angle of each finger; and
the gesture control instruction generation unit further comprises:
a first generation module, configured to identify the action as a click command when the gesture position remains stationary, and to generate a corresponding click control instruction;
a second generation module, configured to identify left, right, up and down commands from fast leftward, rightward, upward and downward movements of the gesture position, and to generate corresponding move-left, move-right, move-up and move-down control instructions;
a third generation module, configured to identify a close action from a waving motion of the gesture position, and to generate a corresponding close control instruction.
CN2011103429725A 2011-11-03 2011-11-03 Man-machine interactive system and real-time gesture tracking processing method for same Pending CN102426480A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011103429725A CN102426480A (en) 2011-11-03 2011-11-03 Man-machine interactive system and real-time gesture tracking processing method for same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2011103429725A CN102426480A (en) 2011-11-03 2011-11-03 Man-machine interactive system and real-time gesture tracking processing method for same

Publications (1)

Publication Number Publication Date
CN102426480A true CN102426480A (en) 2012-04-25

Family

ID=45960476

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2011103429725A Pending CN102426480A (en) 2011-11-03 2011-11-03 Man-machine interactive system and real-time gesture tracking processing method for same

Country Status (1)

Country Link
CN (1) CN102426480A (en)

Cited By (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693084A (en) * 2012-05-08 2012-09-26 上海鼎为软件技术有限公司 Mobile terminal and method for response operation of mobile terminal
CN102722249A (en) * 2012-06-05 2012-10-10 上海鼎为软件技术有限公司 Manipulating method, manipulating device and electronic device
CN102769802A (en) * 2012-06-11 2012-11-07 西安交通大学 Man-machine interactive system and man-machine interactive method of smart television
CN102789568A (en) * 2012-07-13 2012-11-21 浙江捷尚视觉科技有限公司 Gesture identification method based on depth information
CN102799263A (en) * 2012-06-19 2012-11-28 深圳大学 Posture recognition method and posture recognition control system
CN102945078A (en) * 2012-11-13 2013-02-27 深圳先进技术研究院 Human-computer interaction equipment and human-computer interaction method
CN102981742A (en) * 2012-11-28 2013-03-20 无锡市爱福瑞科技发展有限公司 Gesture interaction system based on computer visions
CN102982557A (en) * 2012-11-06 2013-03-20 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN103139627A (en) * 2013-02-07 2013-06-05 上海集成电路研发中心有限公司 Intelligent television and gesture control method thereof
CN103136541A (en) * 2013-03-20 2013-06-05 上海交通大学 Double-hand three-dimensional non-contact type dynamic gesture identification method based on depth camera
CN103227962A (en) * 2013-03-29 2013-07-31 上海集成电路研发中心有限公司 Method capable of identifying distance of line formed by image sensors
CN103377554A (en) * 2012-04-27 2013-10-30 卢颖 Induction type pedestrian crossing controlling facility optimization design based on kinect
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN103399629A (en) * 2013-06-29 2013-11-20 华为技术有限公司 Method and device for capturing gesture displaying coordinates
CN103679154A (en) * 2013-12-26 2014-03-26 中国科学院自动化研究所 Three-dimensional gesture action recognition method based on depth images
CN103713738A (en) * 2013-12-17 2014-04-09 武汉拓宝电子系统有限公司 Man-machine interaction method based on visual tracking and gesture recognition
CN103777744A (en) * 2012-10-23 2014-05-07 中国移动通信集团公司 Method and device for achieving input control and mobile terminal
CN103869986A (en) * 2014-04-02 2014-06-18 中国电影器材有限责任公司 Dynamic data generating method based on KINECT
CN104143075A (en) * 2013-05-08 2014-11-12 光宝科技股份有限公司 Gesture judging method applied to electronic device
WO2014183262A1 (en) * 2013-05-14 2014-11-20 Empire Technology Development Llc Detection of user gestures
CN104182169A (en) * 2013-05-23 2014-12-03 三星电子株式会社 Method and apparatus for user interface based on gesture
CN104252231A (en) * 2014-09-23 2014-12-31 河南省辉耀网络技术有限公司 Camera based motion sensing recognition system and method
CN104281265A (en) * 2014-10-14 2015-01-14 京东方科技集团股份有限公司 Application program control method, application program control device and electronic equipment
CN104331149A (en) * 2014-09-29 2015-02-04 联想(北京)有限公司 Control method, control device and electronic equipment
CN104662561A (en) * 2012-06-27 2015-05-27 若威尔士有限公司 Skin-based user recognition
CN104754311A (en) * 2015-04-28 2015-07-01 刘凌霞 Device for identifying object with computer vision and system thereof
CN104777900A (en) * 2015-03-12 2015-07-15 广东威法科技发展有限公司 Gesture trend-based graphical interface response method
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN104978014A (en) * 2014-04-11 2015-10-14 维沃移动通信有限公司 Method for quickly calling application program or system function, and mobile terminal thereof
CN105045399A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Electronic device with 3D camera assembly
CN105046249A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Human-computer interaction method
CN105068662A (en) * 2015-09-07 2015-11-18 哈尔滨市一舍科技有限公司 Electronic device used for man-machine interaction
CN105069444A (en) * 2015-09-07 2015-11-18 哈尔滨市一舍科技有限公司 Gesture recognition device
CN105160323A (en) * 2015-09-07 2015-12-16 哈尔滨市一舍科技有限公司 Gesture identification method
CN105701442A (en) * 2014-12-15 2016-06-22 黄安琪 Passenger car stopping prompting method and device
CN105718052A (en) * 2016-01-18 2016-06-29 京东方科技集团股份有限公司 Instruction method and apparatus for correcting somatosensory interaction tracking failure
CN105745606A (en) * 2013-09-24 2016-07-06 惠普发展公司,有限责任合伙企业 Identifying a target touch region of a touch-sensitive surface based on an image
CN105917356A (en) * 2014-01-14 2016-08-31 微软技术许可有限责任公司 Contour-based classification of objects
CN105988566A (en) * 2015-02-11 2016-10-05 联想(北京)有限公司 Information processing method and electronic device
CN106020433A (en) * 2015-12-09 2016-10-12 展视网(北京)科技有限公司 3D vehicle terminal man-machine interactive system and interaction method
WO2017075932A1 (en) * 2015-11-02 2017-05-11 深圳奥比中光科技有限公司 Gesture-based control method and system based on three-dimensional displaying
CN106774842A (en) * 2016-11-24 2017-05-31 中国科学技术大学 Driving-situation assistant's gesture intersection control routine
CN106846564A (en) * 2016-12-29 2017-06-13 湖南拓视觉信息技术有限公司 A kind of intelligent access control system and control method
WO2017161496A1 (en) * 2016-03-22 2017-09-28 广东虚拟现实科技有限公司 Fringe set searching method, device and system
CN107430431A (en) * 2015-01-09 2017-12-01 雷蛇(亚太)私人有限公司 Gesture identifying device and gesture identification method
CN107895161A (en) * 2017-12-22 2018-04-10 北京奇虎科技有限公司 Real-time attitude recognition methods and device, computing device based on video data
CN108363482A (en) * 2018-01-11 2018-08-03 江苏四点灵机器人有限公司 A method of the three-dimension gesture based on binocular structure light controls smart television
CN108375913A (en) * 2018-03-28 2018-08-07 山东大学 It is a kind of based on the smart home things system and its operation method of NAO robots and application
CN108416420A (en) * 2018-02-11 2018-08-17 北京光年无限科技有限公司 Limbs exchange method based on visual human and system
CN108459702A (en) * 2017-02-22 2018-08-28 天津锋时互动科技有限公司深圳分公司 Man-machine interaction method based on gesture identification and visual feedback and system
CN108549489A (en) * 2018-04-27 2018-09-18 哈尔滨拓博科技有限公司 A kind of gestural control method and system based on hand form, posture, position and motion feature
CN109194899A (en) * 2018-11-22 2019-01-11 维沃移动通信有限公司 A kind of method and terminal of audio-visual synchronization
CN109343701A (en) * 2018-09-03 2019-02-15 电子科技大学 A kind of intelligent human-machine interaction method based on dynamic hand gesture recognition
CN109416570A (en) * 2015-12-31 2019-03-01 微软技术许可有限责任公司 Use the hand gestures API of finite state machine and posture language discrete value
WO2019085060A1 (en) * 2017-10-30 2019-05-09 南京阿凡达机器人科技有限公司 Method and system for detecting waving of robot, and robot
CN109782850A (en) * 2019-01-04 2019-05-21 北京灵优智学科技有限公司 Support the full interactive intelligence intelligent education machine of multiple network access
WO2019120290A1 (en) * 2017-12-22 2019-06-27 北京市商汤科技开发有限公司 Dynamic gesture recognition method and device, and gesture interaction control method and device
CN109960980A (en) * 2017-12-22 2019-07-02 北京市商汤科技开发有限公司 Dynamic gesture identification method and device
CN110109535A (en) * 2019-03-18 2019-08-09 国网浙江省电力有限公司信息通信分公司 Augmented reality generation method and device
CN111199169A (en) * 2018-11-16 2020-05-26 北京微播视界科技有限公司 Image processing method and device
CN111079588B (en) * 2019-12-03 2021-09-10 北京字节跳动网络技术有限公司 Image processing method, device and storage medium
CN114167980A (en) * 2021-11-18 2022-03-11 深圳市鸿合创新信息技术有限责任公司 Gesture processing method and device, electronic equipment and readable storage medium
CN114167978A (en) * 2021-11-11 2022-03-11 广州大学 Human-computer interaction system carried on construction robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999034327A2 (en) * 1997-12-23 1999-07-08 Koninklijke Philips Electronics N.V. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
CN101689244A (en) * 2007-05-04 2010-03-31 格斯图尔泰克股份有限公司 Camera-based user input for compact devices
CN101763515A (en) * 2009-09-23 2010-06-30 中国科学院自动化研究所 Real-time gesture interaction method based on computer vision
US20100166258A1 (en) * 2008-12-30 2010-07-01 Xiujuan Chai Method, apparatus and computer program product for providing hand segmentation for gesture analysis
CN102081918A (en) * 2010-09-28 2011-06-01 北京大学深圳研究生院 Video image display control method and video image display device
CN102117117A (en) * 2010-01-06 2011-07-06 致伸科技股份有限公司 System and method for control through identifying user posture by image extraction device
CN102789568A (en) * 2012-07-13 2012-11-21 浙江捷尚视觉科技有限公司 Gesture identification method based on depth information


Cited By (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103377554A (en) * 2012-04-27 2013-10-30 卢颖 Induction type pedestrian crossing controlling facility optimization design based on kinect
CN103389793A (en) * 2012-05-07 2013-11-13 深圳泰山在线科技有限公司 Human-computer interaction method and human-computer interaction system
CN102693084A (en) * 2012-05-08 2012-09-26 上海鼎为软件技术有限公司 Mobile terminal and method for response operation of mobile terminal
CN102693084B (en) * 2012-05-08 2016-08-03 上海鼎为电子科技(集团)有限公司 Mobile terminal and the method for response operation thereof
CN102722249A (en) * 2012-06-05 2012-10-10 上海鼎为软件技术有限公司 Manipulating method, manipulating device and electronic device
CN102722249B (en) * 2012-06-05 2016-03-30 上海鼎为电子科技(集团)有限公司 Control method, actuation means and electronic installation
CN102769802A (en) * 2012-06-11 2012-11-07 西安交通大学 Man-machine interactive system and man-machine interactive method of smart television
CN102799263A (en) * 2012-06-19 2012-11-28 深圳大学 Posture recognition method and posture recognition control system
CN102799263B (en) * 2012-06-19 2016-05-25 深圳大学 A kind of gesture recognition method and gesture recognition control system
CN104662561A (en) * 2012-06-27 2015-05-27 若威尔士有限公司 Skin-based user recognition
CN102789568B (en) * 2012-07-13 2015-03-25 浙江捷尚视觉科技股份有限公司 Gesture identification method based on depth information
CN102789568A (en) * 2012-07-13 2012-11-21 浙江捷尚视觉科技有限公司 Gesture identification method based on depth information
CN103777744A (en) * 2012-10-23 2014-05-07 中国移动通信集团公司 Method and device for achieving input control and mobile terminal
CN102982557B (en) * 2012-11-06 2015-03-25 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN102982557A (en) * 2012-11-06 2013-03-20 桂林电子科技大学 Method for processing space hand signal gesture command based on depth camera
CN102945078A (en) * 2012-11-13 2013-02-27 深圳先进技术研究院 Human-computer interaction equipment and human-computer interaction method
CN102981742A (en) * 2012-11-28 2013-03-20 无锡市爱福瑞科技发展有限公司 Gesture interaction system based on computer visions
CN103139627A (en) * 2013-02-07 2013-06-05 上海集成电路研发中心有限公司 Intelligent television and gesture control method thereof
CN103136541B (en) * 2013-03-20 2015-10-14 上海交通大学 Based on the both hands 3 D non-contacting type dynamic gesture identification method of depth camera
CN103136541A (en) * 2013-03-20 2013-06-05 上海交通大学 Double-hand three-dimensional non-contact type dynamic gesture identification method based on depth camera
CN103227962B (en) * 2013-03-29 2018-11-09 上海集成电路研发中心有限公司 Identify the method at a distance from imaging sensor line formed
CN103227962A (en) * 2013-03-29 2013-07-31 上海集成电路研发中心有限公司 Method capable of identifying distance of line formed by image sensors
CN104143075A (en) * 2013-05-08 2014-11-12 光宝科技股份有限公司 Gesture judging method applied to electronic device
US10268279B2 (en) 2013-05-14 2019-04-23 Empire Technology Development Llc Detection of user gestures
WO2014183262A1 (en) * 2013-05-14 2014-11-20 Empire Technology Development Llc Detection of user gestures
US9740295B2 (en) 2013-05-14 2017-08-22 Empire Technology Development Llc Detection of user gestures
CN104182169A (en) * 2013-05-23 2014-12-03 三星电子株式会社 Method and apparatus for user interface based on gesture
CN103399629A (en) * 2013-06-29 2013-11-20 华为技术有限公司 Method and device for capturing gesture displaying coordinates
CN105745606A (en) * 2013-09-24 2016-07-06 惠普发展公司,有限责任合伙企业 Identifying a target touch region of a touch-sensitive surface based on an image
CN105745606B (en) * 2013-09-24 2019-07-26 惠普发展公司,有限责任合伙企业 Target touch area based on image recognition touch sensitive surface
US10324563B2 (en) 2013-09-24 2019-06-18 Hewlett-Packard Development Company, L.P. Identifying a target touch region of a touch-sensitive surface based on an image
CN103713738B (en) * 2013-12-17 2016-06-29 武汉拓宝科技股份有限公司 A kind of view-based access control model follows the tracks of the man-machine interaction method with gesture identification
CN103713738A (en) * 2013-12-17 2014-04-09 武汉拓宝电子系统有限公司 Man-machine interaction method based on visual tracking and gesture recognition
CN103679154A (en) * 2013-12-26 2014-03-26 中国科学院自动化研究所 Three-dimensional gesture action recognition method based on depth images
CN105917356A (en) * 2014-01-14 2016-08-31 微软技术许可有限责任公司 Contour-based classification of objects
CN103869986A (en) * 2014-04-02 2014-06-18 中国电影器材有限责任公司 Dynamic data generating method based on KINECT
CN104978014A (en) * 2014-04-11 2015-10-14 维沃移动通信有限公司 Method for quickly calling application program or system function, and mobile terminal thereof
CN104978014B (en) * 2014-04-11 2018-05-11 维沃移动通信有限公司 A kind of method and its mobile terminal of quick calling application program or systemic-function
CN104252231A (en) * 2014-09-23 2014-12-31 河南省辉耀网络技术有限公司 Camera based motion sensing recognition system and method
CN104331149A (en) * 2014-09-29 2015-02-04 联想(北京)有限公司 Control method, control device and electronic equipment
CN104331149B (en) * 2014-09-29 2018-08-10 联想(北京)有限公司 A kind of control method, device and electronic equipment
CN104281265A (en) * 2014-10-14 2015-01-14 京东方科技集团股份有限公司 Application program control method, application program control device and electronic equipment
KR20160060003A (en) * 2014-10-14 2016-05-27 보에 테크놀로지 그룹 컴퍼니 리미티드 A method, a device, and an electronic equipment for controlling an Application Program
WO2016058303A1 (en) * 2014-10-14 2016-04-21 京东方科技集团股份有限公司 Application control method and apparatus and electronic device
KR101718837B1 (en) 2014-10-14 2017-03-22 보에 테크놀로지 그룹 컴퍼니 리미티드 A method, a device, and an electronic equipment for controlling an Application Program
CN104281265B (en) * 2014-10-14 2017-06-16 京东方科技集团股份有限公司 A kind of control method of application program, device and electronic equipment
CN105701442A (en) * 2014-12-15 2016-06-22 黄安琪 Passenger car stopping prompting method and device
CN105701442B (en) * 2014-12-15 2019-12-03 孟圣慧 Passenger car stopping prompting method and device
CN107430431A (en) * 2015-01-09 2017-12-01 雷蛇(亚太)私人有限公司 Gesture identifying device and gesture identification method
CN105988566A (en) * 2015-02-11 2016-10-05 联想(北京)有限公司 Information processing method and electronic device
CN105988566B (en) * 2015-02-11 2019-05-31 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN104777900A (en) * 2015-03-12 2015-07-15 广东威法科技发展有限公司 Gesture trend-based graphical interface response method
CN104754311A (en) * 2015-04-28 2015-07-01 刘凌霞 Device for identifying object with computer vision and system thereof
CN104915011A (en) * 2015-06-28 2015-09-16 合肥金诺数码科技股份有限公司 Open environment gesture interaction game system
CN105045399A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Electronic device with 3D camera assembly
CN105045399B (en) * 2015-09-07 2018-08-14 哈尔滨市一舍科技有限公司 A kind of electronic equipment with 3D camera assemblies
CN105160323A (en) * 2015-09-07 2015-12-16 哈尔滨市一舍科技有限公司 Gesture identification method
CN105068662B (en) * 2015-09-07 2018-03-06 哈尔滨市一舍科技有限公司 A kind of electronic equipment for man-machine interaction
CN105069444B (en) * 2015-09-07 2018-09-11 哈尔滨市一舍科技有限公司 A kind of gesture identifying device
CN105069444A (en) * 2015-09-07 2015-11-18 哈尔滨市一舍科技有限公司 Gesture recognition device
CN105046249B (en) * 2015-09-07 2018-09-11 哈尔滨市一舍科技有限公司 A kind of man-machine interaction method
CN105046249A (en) * 2015-09-07 2015-11-11 哈尔滨市一舍科技有限公司 Human-computer interaction method
CN105160323B (en) * 2015-09-07 2018-11-27 哈尔滨市一舍科技有限公司 A kind of gesture identification method
CN105068662A (en) * 2015-09-07 2015-11-18 哈尔滨市一舍科技有限公司 Electronic device used for man-machine interaction
WO2017075932A1 (en) * 2015-11-02 2017-05-11 深圳奥比中光科技有限公司 Gesture-based control method and system based on three-dimensional displaying
CN106020433A (en) * 2015-12-09 2016-10-12 展视网(北京)科技有限公司 3D vehicle terminal man-machine interactive system and interaction method
CN109416570B (en) * 2015-12-31 2022-04-05 微软技术许可有限责任公司 Hand gesture API using finite state machines and gesture language discrete values
CN109416570A (en) * 2015-12-31 2019-03-01 微软技术许可有限责任公司 Use the hand gestures API of finite state machine and posture language discrete value
CN105718052B (en) * 2016-01-18 2018-09-11 京东方科技集团股份有限公司 A kind of indicating means and device for correcting body feeling interaction tracking failure
US9990031B2 (en) 2016-01-18 2018-06-05 Boe Technology Group Co., Ltd. Indicating method and device for correcting failure of motion-sensing interaction tracking
CN105718052A (en) * 2016-01-18 2016-06-29 京东方科技集团股份有限公司 Instruction method and apparatus for correcting somatosensory interaction tracking failure
WO2017161496A1 (en) * 2016-03-22 2017-09-28 广东虚拟现实科技有限公司 Fringe set searching method, device and system
US10795456B2 (en) 2016-03-22 2020-10-06 Guangdong Virtual Reality Technology Co., Ltd. Method, device and terminal for determining effectiveness of stripe set
CN106774842A (en) * 2016-11-24 2017-05-31 中国科学技术大学 Driving-situation assistant's gesture intersection control routine
CN106846564A (en) * 2016-12-29 2017-06-13 湖南拓视觉信息技术有限公司 A kind of intelligent access control system and control method
CN108459702B (en) * 2017-02-22 2024-01-26 深圳巧牛科技有限公司 Man-machine interaction method and system based on gesture recognition and visual feedback
CN108459702A (en) * 2017-02-22 2018-08-28 天津锋时互动科技有限公司深圳分公司 Man-machine interaction method based on gesture identification and visual feedback and system
WO2019085060A1 (en) * 2017-10-30 2019-05-09 南京阿凡达机器人科技有限公司 Method and system for detecting waving of robot, and robot
WO2019120290A1 (en) * 2017-12-22 2019-06-27 北京市商汤科技开发有限公司 Dynamic gesture recognition method and device, and gesture interaction control method and device
US11221681B2 (en) 2017-12-22 2022-01-11 Beijing Sensetime Technology Development Co., Ltd Methods and apparatuses for recognizing dynamic gesture, and control methods and apparatuses using gesture interaction
CN107895161A (en) * 2017-12-22 2018-04-10 北京奇虎科技有限公司 Real-time attitude recognition methods and device, computing device based on video data
CN109960980B (en) * 2017-12-22 2022-03-15 北京市商汤科技开发有限公司 Dynamic gesture recognition method and device
CN109960980A (en) * 2017-12-22 2019-07-02 北京市商汤科技开发有限公司 Dynamic gesture identification method and device
CN107895161B (en) * 2017-12-22 2020-12-11 北京奇虎科技有限公司 Real-time attitude identification method and device based on video data and computing equipment
JP2020508511A (en) * 2017-12-22 2020-03-19 ベイジン センスタイム テクノロジー デベロップメント カンパニー, リミテッド Dynamic gesture recognition method and apparatus, gesture dialogue control method and apparatus
CN108363482A (en) * 2018-01-11 2018-08-03 江苏四点灵机器人有限公司 A method of the three-dimension gesture based on binocular structure light controls smart television
CN108416420A (en) * 2018-02-11 2018-08-17 北京光年无限科技有限公司 Limbs exchange method based on visual human and system
CN108375913A (en) * 2018-03-28 2018-08-07 山东大学 It is a kind of based on the smart home things system and its operation method of NAO robots and application
CN108549489A (en) * 2018-04-27 2018-09-18 哈尔滨拓博科技有限公司 A kind of gestural control method and system based on hand form, posture, position and motion feature
CN109343701A (en) * 2018-09-03 2019-02-15 电子科技大学 A kind of intelligent human-machine interaction method based on dynamic hand gesture recognition
CN111199169A (en) * 2018-11-16 2020-05-26 北京微播视界科技有限公司 Image processing method and device
CN109194899A (en) * 2018-11-22 2019-01-11 维沃移动通信有限公司 A kind of method and terminal of audio-visual synchronization
CN109782850A (en) * 2019-01-04 2019-05-21 北京灵优智学科技有限公司 Support the full interactive intelligence intelligent education machine of multiple network access
CN110109535A (en) * 2019-03-18 2019-08-09 国网浙江省电力有限公司信息通信分公司 Augmented reality generation method and device
CN111079588B (en) * 2019-12-03 2021-09-10 北京字节跳动网络技术有限公司 Image processing method, device and storage medium
CN114167978A (en) * 2021-11-11 2022-03-11 广州大学 Human-computer interaction system carried on construction robot
CN114167980A (en) * 2021-11-18 2022-03-11 深圳市鸿合创新信息技术有限责任公司 Gesture processing method and device, electronic equipment and readable storage medium
CN114167980B (en) * 2021-11-18 2024-05-07 深圳市鸿合创新信息技术有限责任公司 Gesture processing method, gesture processing device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
CN102426480A (en) Man-machine interactive system and real-time gesture tracking processing method for same
Chen et al. Pose guided structured region ensemble network for cascaded hand pose estimation
WO2021129064A9 (en) Posture acquisition method and device, and key point coordinate positioning model training method and device
Kim et al. Simultaneous gesture segmentation and recognition based on forward spotting accumulative HMMs
Ma et al. Kinect sensor-based long-distance hand gesture recognition and fingertip detection with depth information
Zhang et al. Real-time multiple human perception with color-depth cameras on a mobile robot
Choi et al. A general framework for tracking multiple people from a moving camera
Gu et al. Human gesture recognition through a kinect sensor
US11107242B2 (en) Detecting pose using floating keypoint(s)
CN103295016B (en) Behavior recognition method based on depth and RGB information and multi-scale and multidirectional rank and level characteristics
Huang et al. Deepfinger: A cascade convolutional neuron network approach to finger key point detection in egocentric vision with mobile camera
CN111444764A (en) Gesture recognition method based on depth residual error network
Chen et al. Using FTOC to track shuttlecock for the badminton robot
US10401947B2 (en) Method for simulating and controlling virtual sphere in a mobile device
Zhang et al. Handsense: smart multimodal hand gesture recognition based on deep neural networks
Wang et al. Immersive human–computer interactive virtual environment using large-scale display system
Nan et al. Learning to infer human attention in daily activities
Ding et al. Simultaneous body part and motion identification for human-following robots
Liu et al. Fingertip in the eye: A cascaded cnn pipeline for the real-time fingertip detection in egocentric videos
CN112861808B (en) Dynamic gesture recognition method, device, computer equipment and readable storage medium
CN107918507A (en) A kind of virtual touchpad method based on stereoscopic vision
CN110490165B (en) Dynamic gesture tracking method based on convolutional neural network
Singh et al. Some contemporary approaches for human activity recognition: A survey
Shaker et al. Real-time finger tracking for interaction
CN106599901B (en) Collaboration Target Segmentation and Activity recognition method based on depth Boltzmann machine

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20120425