CN104407694A - Man-machine interaction method and device combining human face and gesture control - Google Patents

Man-machine interaction method and device combining human face and gesture control

Info

Publication number
CN104407694A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410597018.4A
Other languages
Chinese (zh)
Other versions
CN104407694B (en)
Inventor
张海霞
尚蕾
刘治
孙彬
金蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University
Priority to CN201410597018.4A
Publication of CN104407694A
Application granted
Publication of CN104407694B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a human-computer interaction method and device combining face and gesture control. The method is used to interact with a human-computer interaction device: the device acquires hand region information and face region information, computes from the hand region and the face region a hand center point and a face center point that initiate the interaction, and, from the motion trajectories of these two center points, judges whether the hand action is a static or a dynamic gesture and whether the face action is a static or a dynamic face action. The device then recognizes the action, obtains the corresponding control instruction, and uses the instruction to drive a terminal device to complete the corresponding operation. The method improves on conventional human-computer interaction methods and solves their existing problems; compared with them, its steps are designed more reasonably and effectively, its results are evident, and it has high practical value.

Description

Human-computer interaction method and device combining face and gesture control
Technical field
The present invention relates to a human-computer interaction method and device combining face and gesture control, and belongs to the technical fields of artificial intelligence and image processing.
Background art
With the development of science and technology, smart devices are used ever more widely, and interaction between people and smart devices grows accordingly. Human-computer interaction comes in two forms, contact and contactless. Contact-based interaction is relatively mature and complete, whereas contactless interaction is still at an early research stage. In recent years contactless interaction has been both a focus and a difficulty of human-computer interaction research; studying it well has high practical value, especially when applied in settings with special requirements.
Like other biological features of the human body, the face is innate; it is unique and difficult to replicate. As a three-dimensional object, the face is jointly affected by illumination changes, pose, expression, and other factors. Face recognition has in recent years been one of the hot topics in pattern recognition, image processing, machine vision, neural networks, and cognitive science.
Gestures are an intuitive and natural mode of interaction: they are quick to express, rich in meaning, and an important tool for human information exchange. Gesture recognition identifies the content a gesture expresses and offers strong consistency and extensibility. In applications, gesture recognition systems can enable communication between deaf-mute people and others, and can directly control software or virtual objects in applications. Structurally, a gesture involves the fingers and the palm; in three-dimensional motion, the hand can move horizontally, vertically, and in depth. Different numbers of extended fingers, combined with palm changes and movements in different directions, can form different gestures that realize operations such as control and movement. How to combine face and gesture to achieve better human-to-human or human-computer interaction has in recent years been a focus and difficulty of research.
Summary of the invention
To solve the above problems, the invention provides a human-computer interaction method combining face and gesture control. The method obtains gesture control information more precisely, enriches the diversity of human-computer interaction, and realizes more efficient interactive operation.
The invention also provides a device for implementing the above human-computer interaction method combining face and gesture control.
Explanation of terms:
1. Depth information: the three-dimensional characteristic information carried by a depth image. Unlike an ordinary two-dimensional image, each pixel of a depth image represents a relative depth, so each pixel carries coordinate values in a three-dimensional coordinate system, and the depth image data reflect the three-dimensional information of the scene surface.
2. Morphological image processing: the use of mathematical morphology as a tool to extract from an image the components useful for representing and describing region shape, such as boundaries, skeletons, and convex hulls, as well as morphological filtering, thinning, and pruning for pre-processing or post-processing.
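To make the second term concrete, the following is a minimal sketch of morphological denoising of a binary hand mask, assuming an OpenCV/NumPy stack; the elliptical 5x5 kernel and the opening-then-closing order are illustrative assumptions, not prescribed by the text.

    import cv2
    import numpy as np

    def denoise_mask(mask: np.ndarray) -> np.ndarray:
        """Clean a binary segmentation mask with morphological operations.

        An opening removes small isolated noise blobs; a closing then
        fills small holes inside the retained region. The kernel shape
        and size are illustrative choices.
        """
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        opened = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        return cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)
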
The technical scheme of the invention is as follows:
A human-computer interaction method combining face and gesture control, the method comprising: operating a human-computer interaction device through a set of human body actions, the action set comprising hand-only actions, face-only actions, and actions combining hand and face;
the human-computer interaction device acquires face region information and hand region information respectively, wherein the hand region lies between the human body and the device, a hand action is a hand action within the hand region, and a face action is a face action within the face region;
the device computes from the hand region information a hand center point that initiates the interaction and serves as the control point of the hand region moving in three-dimensional space; by judging the motion trajectory of the hand center point in three-dimensional space, the device determines whether the hand action is a static gesture or a dynamic gesture; the device likewise computes from the face region information a face center point that initiates the interaction and serves as the control point of the face region moving in three-dimensional space, and by judging the motion trajectory of the face center point in three-dimensional space determines whether the face action is a static face action or a dynamic face action;
the device recognizes and obtains the control instruction corresponding to the action in the action set, and the control instruction drives a terminal device to complete the corresponding operation.
Preferably, the process by which the human-computer interaction device acquires the hand region information comprises:
first, the device computes the skin color features of the human body from the acquired face region information; it then obtains the hand region information from hand depth information combined with the skin color features; finally, it optimizes the hand region information with an optimization algorithm combined with morphological image processing.
Further preferably, the optimization algorithm is a Blob labeling (connected-component labeling) algorithm.
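A minimal sketch of this segmentation step, assuming OpenCV: skin color bounds derived from the face region are intersected with a depth band in which the hand is expected to sit between the user and the device, the mask is cleaned with the denoise_mask helper sketched above, and Blob labeling keeps the largest connected component. The YCrCb color space and the depth band limits are assumptions.

    import cv2
    import numpy as np

    def segment_hand(bgr, depth, skin_lo, skin_hi, near_mm=400, far_mm=900):
        """Combine skin-color and depth cues, then keep the largest blob.

        skin_lo / skin_hi: YCrCb bounds derived from the detected face.
        near_mm / far_mm: assumed depth band (millimeters) in which the
        hand lies in front of the body, between user and device.
        """
        ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
        skin = cv2.inRange(ycrcb, skin_lo, skin_hi)
        in_band = cv2.inRange(depth, near_mm, far_mm)
        mask = denoise_mask(cv2.bitwise_and(skin, in_band))

        # Blob labeling: label connected components, keep the largest.
        n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
        if n <= 1:                      # background only, no hand found
            return np.zeros_like(mask)
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
        return np.where(labels == largest, 255, 0).astype(np.uint8)
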
Preferably, the hand center point is set to the center of the maximum inscribed circle of the palm or to the center of the minimum circumscribed circle of the whole hand; the face center point is set to the center of the minimum circumscribed circle or of the maximum inscribed circle of the face.
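Both candidate center points can be computed from a binary region mask. A sketch under the assumption of an OpenCV stack: the maximum inscribed circle center is the interior point farthest from the boundary, found with a distance transform, and the minimum circumscribed circle comes from cv2.minEnclosingCircle; these are standard constructions, not mandated by the text.

    import cv2
    import numpy as np

    def max_inscribed_center(mask):
        """Center of the maximum inscribed circle of a binary region:
        the interior point with the largest distance to the boundary."""
        dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
        _, radius, _, center = cv2.minMaxLoc(dist)
        return center, radius

    def min_circumscribed_center(mask):
        """Center of the minimum circumscribed circle of the region."""
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        largest = max(contours, key=cv2.contourArea)
        center, radius = cv2.minEnclosingCircle(largest)
        return center, radius
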
Preferably, the motion trajectory of the hand center point in three-dimensional space is judged by comparing the magnitude of the displacement of the hand center point in the three-dimensional coordinate system with a threshold: the hand center point corresponds to a three-dimensional coordinate in space; the combined motion vector of its trajectory along the X, Y, and Z axes of the coordinate system is computed; if the magnitude of this combined motion vector is less than a preset threshold, the hand action is a static gesture; if it is greater than the preset threshold, the hand action is a dynamic gesture.
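Read literally, the rule compares the Euclidean norm of the combined displacement vector with the preset threshold (called M for the hand and N for the face in the embodiments below); the same test reappears unchanged for the face center point. A minimal sketch, assuming the center points are tracked as (x, y, z) samples over an observation window:

    import math

    def is_dynamic(track, threshold):
        """Classify a center-point trajectory as dynamic (True) or static.

        track: sequence of (x, y, z) positions of the hand or face center
        point. The combined motion vector is the displacement from the
        first to the last sample; its magnitude is compared with the
        preset threshold (M for the hand, N for the face).
        """
        (x0, y0, z0), (x1, y1, z1) = track[0], track[-1]
        dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
        return math.sqrt(dx * dx + dy * dy + dz * dz) > threshold
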
Further preferably, the static gestures comprise gestures determined by recognizing combinations of extended and bent fingers, gestures determined by the ratio of the hand contour area to its circumscribed circle area, gestures determined by the number of effective convexity defects of the hand together with the characteristic angles of those defects, and gestures determined by one-dimensional feature vectors of the hand image; the dynamic gestures comprise gestures determined solely by the displacement of the hand center point, and gestures determined by that displacement combined with a gesture change. The benefit of this design is that dynamic gestures cover not only the control brought by displacement of the hand center point up/down, left/right, and forward/backward, but also the control brought by combining that displacement with posture changes of the fingers or palm, which shortens gesture recognition time and increases the diversity of operations available during interaction.
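A sketch of two of these static-gesture features, assuming OpenCV: the contour-to-circumscribed-circle area ratio (a fist fills its enclosing circle far more than a spread hand) and the count of effective convexity defects (roughly the gaps between extended fingers). The depth cutoff that makes a defect "effective" is an assumption.

    import cv2
    import numpy as np

    def static_gesture_features(mask, min_defect_depth=20.0):
        """Return (area_ratio, n_defects) for a binary hand mask."""
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        hand = max(contours, key=cv2.contourArea)

        # Feature 1: hand contour area / circumscribed circle area.
        _, radius = cv2.minEnclosingCircle(hand)
        area_ratio = cv2.contourArea(hand) / (np.pi * radius ** 2)

        # Feature 2: number of effective convexity defects.
        hull = cv2.convexHull(hand, returnPoints=False)
        defects = cv2.convexityDefects(hand, hull)
        n_defects = 0
        if defects is not None:
            # Defect depths are fixed-point values scaled by 256.
            depths = defects[:, 0, 3] / 256.0
            n_defects = int(np.sum(depths > min_defect_depth))
        return area_ratio, n_defects
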
Preferably, the motion trajectory of the face center point in three-dimensional space is judged by comparing the magnitude of the displacement of the face center point in the three-dimensional coordinate system with a threshold: the face center point corresponds to a three-dimensional coordinate in space; the combined motion vector of its trajectory along the X, Y, and Z axes is computed; if the magnitude of this combined motion vector is less than a preset threshold, the face action is a static face action; if it is greater than the preset threshold, the face action is a dynamic face action.
Further preferably, the static face actions comprise changes of the facial features, and the dynamic face actions comprise actions caused by changes of head position.
Preferably, the method further comprises the device periodically acquiring and updating the skin color features. The purpose of this design is that, during interaction, illumination changes at the device inevitably affect the observed skin color; by arranging timed image extraction and periodically updating the skin color features, the device can keep detecting gesture actions in the hand region accurately.
A device for implementing the above human-computer interaction method combining face and gesture control comprises an image acquisition module, an action processing module, a function realization module, and a terminal device, connected in sequence. After the image acquisition module acquires the face region information and the hand region information, the action processing module computes from them and sets the face center point and hand center point that initiate the interaction; it then recognizes the human body action set from the motion trajectories of the two center points combined with the gesture and the face action, and passes the result to the function realization module, which is loaded with the control instruction corresponding to the action and applies the instruction to operate the terminal device.
Preferably, the device further comprises a timed extraction module, which periodically reacquires the face region information obtained by the image acquisition module and updates the skin color features.
The beneficial effects of the invention are:
1. The method changes the conventional approach in which terminal devices are operated from gesture images alone: by combining face information with hand information, it correspondingly enriches the control instructions of the human-computer interaction device and diversifies the operations that can be performed on the terminal.
2. By adding face information to the interaction, the device can locate the hand region more accurately through the extracted skin color features; gestures performed in an accurately located hand region allow the device to operate the terminal more efficiently, overcoming the low gesture recognition rates and the resulting misoperations and repeated operations of conventional interaction methods.
3. Unlike conventional interaction devices, which recognize gesture actions without any prior classification, the method first divides gesture and face actions into static or dynamic gestures and static or dynamic face actions before recognition, improving recognition effectiveness; at the same time, this division makes it easier to define classes of operations under different gestures and face actions, improving the practicality and controllability of the device.
4. By adding the timed acquisition and update of skin color features, the method reduces the influence of illumination at the image acquisition equipment on the observed skin color, yields a more accurate hand region, and thus makes gesture recognition more accurate.
5. The method improves on conventional human-computer interaction methods and solves their existing problems; compared with them, its steps are designed more reasonably and effectively, its results are more evident, and it has higher practical value.
Brief description of the drawings
Fig. 1 is a schematic structural diagram of the human-computer interaction device during interaction;
Fig. 2 is a block diagram of the parts of the human-computer interaction device during interaction;
Fig. 3 is a flow chart of the interaction method.
Embodiments
The invention is further described below through embodiments with reference to the accompanying drawings, without being limited thereto.
Embodiment 1:
A human-computer interaction method combining face and gesture control, the method comprising:
First, the face region is locked in the image acquired by the human-computer interaction device. This step can use physiological knowledge, for example a face aspect-ratio range of (0.9, 2.0), combined with the Adaboost algorithm to obtain a reliable face region; skin color features are then extracted from the obtained face region by post-processing at the image acquisition device.
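A sketch of this step, assuming OpenCV's stock Haar cascade, which is an Adaboost-trained detector; the cascade file and detector parameters are assumptions, and the aspect-ratio filter follows the (0.9, 2.0) range given above.

    import cv2

    def lock_face_region(bgr):
        """Detect a face with an Adaboost (Haar cascade) detector and keep
        the first candidate whose height/width ratio is plausible."""
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1,
                                         minNeighbors=5)
        for (x, y, w, h) in faces:
            if 0.9 < h / w < 2.0:       # aspect-ratio range from the text
                return bgr[y:y + h, x:x + w]
        return None
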
Second, during actual operation the hand region lies between the human body and the human-computer interaction device. The hand region information is obtained from hand depth information combined with the skin color features, and is then denoised with the Blob labeling optimization algorithm combined with morphological image processing operations, yielding more accurate hand region information.
The device computes from the hand region information the hand center point that initiates the interaction; in this embodiment the hand center point is the center of the maximum inscribed circle of the palm and serves as the control point of the hand region moving in three-dimensional space. The device likewise computes the face center point that initiates the interaction; the face center point is chosen as the center of the minimum circumscribed circle of the face.
The motion trajectory of the hand center point in three-dimensional space is judged by comparing the displacement of the hand center point in the three-dimensional coordinate system with a threshold M preset by the device. The detailed process is: the hand center point corresponds to a three-dimensional coordinate; the combined motion vector of its trajectory along the X, Y, and Z axes is computed; if the magnitude of this vector is less than the preset threshold M, the hand action is a static gesture; if it is greater than M, the hand action is a dynamic gesture.
The static gestures include gestures determined by combinations of extended and bent fingers, such as stretching out three fingers; gestures determined by the ratio of the hand contour area to its circumscribed circle area, such as a clenched fist; gestures determined by the number of effective convexity defects of the hand together with their characteristic angles; and gestures determined by one-dimensional feature vectors of the hand image. The dynamic gestures include gestures determined by the displacement of the hand center point, such as the hand moving left and right, forward and backward, or up and down, and gestures determined by that displacement combined with a gesture change, such as stretching out three fingers while moving the hand left and right, or clenching a fist while moving the hand forward and backward.
Dynamic gestures thus cover not only the control brought by displacement of the hand center point up/down, left/right, and forward/backward, but also the control brought by combining the position change of the hand center point with posture changes of the fingers or palm, increasing the diversity of operations available during interaction.
The motion trajectory of the face center point in three-dimensional space is judged by comparing the displacement of the face center point in the three-dimensional coordinate system with a threshold N preset by the device, which determines whether the face action is static or dynamic. The detailed process is: the face center point corresponds to a three-dimensional coordinate; the combined motion vector of its trajectory along the X, Y, and Z axes is computed; if the magnitude of this vector is less than the preset threshold N, the face action is a static face action, i.e. a mere change of the facial features, such as blinking, opening the mouth, or frowning; if it is greater than N, the face action is a dynamic face action caused by a change of head position, such as raising the head, lowering the head, nodding, or shaking the head.
Finally, the human-computer interaction device recognizes the action in the action set and obtains the corresponding control instruction, which is used to make the terminal device complete the corresponding operation, such as grabbing, picking and placing, switching, starting, or pausing. The corresponding instructions are pre-stored in a database by the device; when the user performs an action, the device recognizes it and retrieves the matching control instruction from the database, and the instruction makes the terminal device carry out the specific operation.
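A minimal sketch of the instruction lookup; the action names and instruction strings below are purely illustrative placeholders, since the text only states that instructions are pre-stored and retrieved by recognized action.

    from typing import Optional

    # Hypothetical pre-stored table mapping recognized actions to
    # control instructions for the terminal device.
    INSTRUCTION_DB = {
        ("three_fingers", "face_static"): "START",
        ("fist", "face_static"): "PAUSE",
        ("hand_move_left", "head_nod"): "SWITCH",
        ("open_palm", "head_raise"): "GRAB",
    }

    def dispatch(hand_action: str, face_action: str) -> Optional[str]:
        """Retrieve the control instruction for a recognized action pair;
        return None when the pair has no stored instruction."""
        return INSTRUCTION_DB.get((hand_action, face_action))
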
Embodiment 2:
A human-computer interaction method combining face and gesture control as described in embodiment 1, differing in that the hand center point is taken as the center of the minimum circumscribed circle of the whole hand, and the face center point is chosen as the center of the maximum inscribed circle of the face.
Embodiment 3:
A human-computer interaction method combining face and gesture control as described in embodiment 1, differing in that the method further comprises the device periodically acquiring and updating the skin color features. During interaction, illumination changes at the device inevitably affect the observed skin color; by arranging a timed image extraction device and periodically updating the skin color features, the device can keep detecting gesture actions in the hand region accurately.
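A sketch of the timed update, reusing the lock_face_region helper sketched above; the percentile rule for deriving skin color bounds from the face patch and the refresh period are assumptions.

    import time
    import cv2
    import numpy as np

    def skin_bounds_from_face(face_bgr):
        """Derive YCrCb skin-color bounds from the current face patch
        (illustrative percentile rule, not specified in the text)."""
        ycrcb = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2YCrCb).reshape(-1, 3)
        lo = np.percentile(ycrcb, 5, axis=0).astype(np.uint8)
        hi = np.percentile(ycrcb, 95, axis=0).astype(np.uint8)
        return lo, hi

    class SkinModel:
        """Holds the current skin bounds; refreshes them every `period`
        seconds to track illumination changes."""

        def __init__(self, period=10.0):
            self.period = period
            self.bounds = None
            self._last = 0.0

        def maybe_update(self, frame):
            if time.monotonic() - self._last < self.period:
                return
            face = lock_face_region(frame)
            if face is not None:
                self.bounds = skin_bounds_from_face(face)
                self._last = time.monotonic()
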
Embodiment 4:
A device for the human-computer interaction method combining face and gesture control comprises an image acquisition module, an action processing module, a function realization module, and a terminal device, connected in sequence. After the image acquisition module acquires the face region information and the hand region information, the action processing module computes from them and sets the face center point and hand center point that initiate the interaction; it then recognizes the human body action set from the motion trajectories of the two center points combined with the gesture and the face action, and passes the result to the function realization module, which is loaded with the control instruction corresponding to the action and applies the instruction to operate the terminal device.
The device further comprises a timed extraction module, which periodically reacquires the face region information obtained by the image acquisition module and updates the skin color features, so that the device can accurately detect gesture actions in the hand region.
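The module chain can be sketched as three cooperating classes feeding a terminal; all class and method names are hypothetical, chosen only to mirror the module names in the text, and dispatch refers to the lookup sketched in embodiment 1.

    class ImageAcquisitionModule:
        def capture(self):
            """Return (bgr_frame, depth_frame) from the camera (stub)."""
            raise NotImplementedError

    class ActionProcessingModule:
        def recognize(self, bgr, depth):
            """Locate the face and hand regions, set the two center
            points, classify static vs. dynamic actions, and return a
            (hand_action, face_action) pair (stub)."""
            raise NotImplementedError

    class FunctionRealizationModule:
        def __init__(self, terminal):
            self.terminal = terminal

        def execute(self, hand_action, face_action):
            """Map the recognized action pair to a control instruction
            and have the terminal device perform it."""
            instruction = dispatch(hand_action, face_action)
            if instruction is not None:
                self.terminal.perform(instruction)
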

Claims (10)

1. A human-computer interaction method combining face and gesture control, the method comprising: operating a human-computer interaction device through a set of human body actions, the action set comprising hand-only actions, face-only actions, and actions combining hand and face;
the human-computer interaction device acquires face region information and hand region information respectively, wherein the hand region lies between the human body and the device, a hand action is a hand action within the hand region, and a face action is a face action within the face region;
the device computes from the hand region information a hand center point that initiates the interaction and serves as the control point of the hand region moving in three-dimensional space; by judging the motion trajectory of the hand center point in three-dimensional space, the device determines whether the hand action is a static gesture or a dynamic gesture; the device likewise computes from the face region information a face center point that initiates the interaction and serves as the control point of the face region moving in three-dimensional space, and by judging the motion trajectory of the face center point determines whether the face action is a static face action or a dynamic face action;
the device recognizes and obtains the control instruction corresponding to the action in the action set, and the control instruction drives a terminal device to complete the corresponding operation.
2. The human-computer interaction method combining face and gesture control of claim 1, wherein the process by which the device acquires the hand region information comprises:
first, the device computes the skin color features of the human body from the acquired face region information; it then obtains the hand region information from hand depth information and the skin color features; finally, it optimizes the hand region information with an optimization algorithm combined with morphological image processing.
3. The human-computer interaction method combining face and gesture control of claim 2, wherein the optimization algorithm is a Blob labeling algorithm.
4. The human-computer interaction method combining face and gesture control of claim 1, wherein the hand center point is set to the center of the maximum inscribed circle of the palm or to the center of the minimum circumscribed circle of the whole hand, and the face center point is set to the center of the minimum circumscribed circle or of the maximum inscribed circle of the face.
5. The human-computer interaction method combining face and gesture control of claim 1, wherein the motion trajectory of the hand center point in three-dimensional space is judged by comparing the magnitude of the displacement of the hand center point in the three-dimensional coordinate system with a threshold: the hand center point corresponds to a three-dimensional coordinate; the combined motion vector of its trajectory along the X, Y, and Z axes is computed; if the magnitude of this combined motion vector is less than a preset threshold, the hand action is a static gesture; if it is greater than the preset threshold, the hand action is a dynamic gesture.
6. The human-computer interaction method combining face and gesture control of claim 5, wherein the static gestures comprise gestures determined by recognizing combinations of extended and bent fingers, gestures determined by the ratio of the hand contour area to its circumscribed circle area, gestures determined by the number of effective convexity defects of the hand together with the characteristic angles of those defects, and gestures determined by one-dimensional feature vectors of the hand image; and the dynamic gestures comprise gestures determined solely by the displacement of the hand center point, and gestures determined by that displacement combined with a gesture change.
7. The human-computer interaction method combining face and gesture control of claim 1, wherein the motion trajectory of the face center point in three-dimensional space is judged by comparing the magnitude of the displacement of the face center point in the three-dimensional coordinate system with a threshold: the face center point corresponds to a three-dimensional coordinate; the combined motion vector of its trajectory along the X, Y, and Z axes is computed; if the magnitude of this combined motion vector is less than a preset threshold, the face action is a static face action; if it is greater than the preset threshold, the face action is a dynamic face action.
8. The human-computer interaction method combining face and gesture control of claim 7, wherein the static face actions comprise changes of the facial features and the dynamic face actions comprise actions caused by changes of head position; and the method further comprises the device periodically acquiring and updating the skin color features.
9. A device for implementing the human-computer interaction method combining face and gesture control of any one of claims 1 to 8, comprising an image acquisition module, an action processing module, a function realization module, and a terminal device connected in sequence; after the image acquisition module acquires the face region information and the hand region information, the action processing module computes from them and sets the face center point and hand center point that initiate the interaction, recognizes the human body action set from the motion trajectories of the two center points combined with the gesture and the face action, and passes the result to the function realization module, which is loaded with the control instruction corresponding to the action and applies the instruction to operate the terminal device.
10. The device of claim 9, further comprising a timed extraction module that periodically reacquires the face region information obtained by the image acquisition module and updates the skin color features.
CN201410597018.4A 2014-10-29 2014-10-29 Man-machine interaction method and device combining face and gesture control Active CN104407694B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410597018.4A CN104407694B (en) 2014-10-29 2014-10-29 Man-machine interaction method and device combining face and gesture control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410597018.4A CN104407694B (en) 2014-10-29 2014-10-29 Man-machine interaction method and device combining face and gesture control

Publications (2)

Publication Number Publication Date
CN104407694A 2015-03-11
CN104407694B CN104407694B (en) 2018-02-23

Family

Family ID: 52645331

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410597018.4A Active CN104407694B (en) Man-machine interaction method and device combining face and gesture control

Country Status (1)

Country Link
CN (1) CN104407694B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102073436A (en) * 2009-11-23 2011-05-25 英业达股份有限公司 Picture operating method and electronic device using same
CN102117117A (en) * 2010-01-06 2011-07-06 致伸科技股份有限公司 System and method for control through identifying user posture by image extraction device
CN103562822A (en) * 2011-04-28 2014-02-05 Nec软件系统科技有限公司 Information processing device, information processing method, and recording medium
US20140210704A1 (en) * 2013-01-29 2014-07-31 Wistron Corporation Gesture recognizing and controlling method and device thereof

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105653037A (en) * 2015-12-31 2016-06-08 张小花 Interactive system and method based on behavior analysis
CN107450717A (en) * 2016-05-31 2017-12-08 联想(北京)有限公司 A kind of information processing method and Wearable
CN106020227B (en) * 2016-08-12 2019-02-26 北京奇虎科技有限公司 The control method of unmanned plane, device
CN106020227A (en) * 2016-08-12 2016-10-12 北京奇虎科技有限公司 Control method and device for unmanned aerial vehicle
CN107066081A (en) * 2016-12-23 2017-08-18 歌尔科技有限公司 The interaction control method and device and virtual reality device of a kind of virtual reality system
CN107066081B (en) * 2016-12-23 2023-09-15 歌尔科技有限公司 Interactive control method and device of virtual reality system and virtual reality equipment
CN106686231A (en) * 2016-12-27 2017-05-17 广东小天才科技有限公司 Message playing method of wearable device and wearable device
CN107255942A (en) * 2017-06-02 2017-10-17 昆山锐芯微电子有限公司 The control method of smart machine, apparatus and system, storage medium
CN107837031B (en) * 2017-07-12 2020-03-06 聊城市东昌府区妇幼保健院 Closestool for puerpera after delivery
CN107837031A (en) * 2017-07-12 2018-03-27 李海英 A kind of method that puerpera goes to toilet after convenient childbirth
CN107678551B (en) * 2017-10-19 2021-12-28 京东方科技集团股份有限公司 Gesture recognition method and device and electronic equipment
US11402918B2 (en) 2017-10-19 2022-08-02 Boe Technology Group Co., Ltd. Method for controlling terminal apparatus, apparatus for controlling terminal apparatus, and computer-program product
CN107678551A (en) * 2017-10-19 2018-02-09 京东方科技集团股份有限公司 Gesture identification method and device, electronic equipment
CN109961454A (en) * 2017-12-22 2019-07-02 北京中科华正电气有限公司 Human-computer interaction device and processing method in a kind of embedded intelligence machine
WO2019223056A1 (en) * 2018-05-22 2019-11-28 深圳市鹰硕技术有限公司 Gesture recognition-based teaching and learning method and apparatus
CN109032345A (en) * 2018-07-04 2018-12-18 百度在线网络技术(北京)有限公司 Apparatus control method, device, equipment, server-side and storage medium
WO2020156469A1 (en) * 2019-01-31 2020-08-06 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
US10885322B2 (en) 2019-01-31 2021-01-05 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
US11393254B2 (en) 2019-01-31 2022-07-19 Huawei Technologies Co., Ltd. Hand-over-face input sensing for interaction with a device having a built-in camera
CN109977906A (en) * 2019-04-04 2019-07-05 睿魔智能科技(深圳)有限公司 Gesture identification method and system, computer equipment and storage medium
CN109977906B (en) * 2019-04-04 2021-06-01 睿魔智能科技(深圳)有限公司 Gesture recognition method and system, computer device and storage medium
CN110827414A (en) * 2019-11-05 2020-02-21 江西服装学院 Virtual digital library experience device based on VR technique
CN110996052A (en) * 2019-11-26 2020-04-10 绍兴天宏激光科技有限公司 Emergency alarm method and system based on image recognition
CN111062312B (en) * 2019-12-13 2023-10-27 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control device, medium and terminal equipment
CN111062312A (en) * 2019-12-13 2020-04-24 RealMe重庆移动通信有限公司 Gesture recognition method, gesture control method, device, medium and terminal device
CN111429519A (en) * 2020-03-27 2020-07-17 贝壳技术有限公司 Three-dimensional scene display method and device, readable storage medium and electronic equipment
CN111429519B (en) * 2020-03-27 2021-07-16 贝壳找房(北京)科技有限公司 Three-dimensional scene display method and device, readable storage medium and electronic equipment
CN112506342A (en) * 2020-12-04 2021-03-16 郑州中业科技股份有限公司 Man-machine interaction method and system based on dynamic gesture recognition
CN113282164A (en) * 2021-03-01 2021-08-20 联想(北京)有限公司 Processing method and device
CN115185381A (en) * 2022-09-15 2022-10-14 北京航天奥祥通风科技股份有限公司 Method and device for controlling terminal based on motion trail of head

Also Published As

Publication number Publication date
CN104407694B (en) 2018-02-23

Similar Documents

Publication Publication Date Title
CN104407694A (en) Man-machine interaction method and device combining human face and gesture control
CN104331158A (en) Gesture-controlled human-computer interaction method and device
US20210365492A1 (en) Method and apparatus for identifying input features for later recognition
Sun et al. Magichand: Interact with iot devices in augmented reality environment
Zhu et al. Vision based hand gesture recognition
CN102402289B (en) Mouse recognition method for gesture based on machine vision
CN105739702A (en) Multi-posture fingertip tracking method for natural man-machine interaction
CN106502570A (en) A kind of method of gesture identification, device and onboard system
CN103984928A (en) Finger gesture recognition method based on field depth image
CN107357428A (en) Man-machine interaction method and device based on gesture identification, system
CN103995595A (en) Game somatosensory control method based on hand gestures
Vivek Veeriah et al. Robust hand gesture recognition algorithm for simple mouse control
Li et al. Hand gesture tracking and recognition based human-computer interaction system and its applications
CN107450717A (en) A kind of information processing method and Wearable
CN108108648A (en) A kind of new gesture recognition system device and method
Itkarkar et al. A study of vision based hand gesture recognition for human machine interaction
Thomas et al. A comprehensive review on vision based hand gesture recognition technology
Ardizzone et al. Pose classification using support vector machines
Gong et al. A multi-objective optimization model and its evolution-based solutions for the fingertip localization problem
CN114296543A (en) Fingertip force detection and gesture recognition intelligent interaction system and intelligent ring
Kakade et al. Dynamic hand gesture recognition: a literature review
Jiang et al. Gesture recognition based on depth information and convolutional neural network
Jiang et al. A robust method of fingertip detection in complex background
Prabhakar et al. AI And Hand Gesture Recognition Based Virtual Mouse
Ye et al. 3D Dynamic Hand Gesture Recognition with Fused RGB and Depth Images.

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant