CN103324274A - Method and device for man-machine interaction - Google Patents

Method and device for man-machine interaction Download PDF

Info

Publication number
CN103324274A
CN103324274A CN2012100784857A CN201210078485A
Authority
CN
China
Prior art keywords
user
gesture feature
directive command
sets
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012100784857A
Other languages
Chinese (zh)
Inventor
陈柯
杨锦平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN2012100784857A priority Critical patent/CN103324274A/en
Publication of CN103324274A publication Critical patent/CN103324274A/en
Pending legal-status Critical Current

Abstract

Embodiments of the invention provide a method and device for man-machine interaction that make man-machine interaction by hand more accurate. The method comprises the following steps: acquiring the gesture features of a current user by collecting the hand vein image information of the user; determining the command corresponding to the gesture features of the current user by comparing them with preset gesture features; and executing the command. The embodiments of the invention are applicable to the technical field of man-machine interaction.

Description

Man-machine interaction method and device
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a man-machine interaction method and device based on vein recognition.
Background art
In the prior art, users interact with machines and equipment mainly through conventional input devices (such as a keyboard, a mouse, or a touch screen), which has become a bottleneck of present man-machine interaction.
To better use the human hand directly as a human-computer interaction device for a computer, gesture interaction methods have also been proposed in the prior art, for example using specifically defined gestures to realize interaction between the user and the equipment. With this method, however, because the machine cannot reliably distinguish the color of the hand from the color of some backgrounds, the machine cannot recognize gestures accurately and in a timely manner.
Summary of the invention
Embodiments of the invention provide a man-machine interaction method and device that make man-machine interaction by hand more accurate.
To achieve the above object, embodiments of the invention adopt the following technical solutions:
A man-machine interaction method comprises:
acquiring a gesture feature of a current user by collecting hand vein image information of the user;
comparing the gesture feature of the current user with preset user gesture features, determining the command corresponding to the gesture feature of the current user, and executing the command.
A man-machine interaction device comprises: an acquiring unit, a determining unit, and an execution unit;
the acquiring unit is configured to acquire a user gesture feature by collecting hand vein image information of the user;
the determining unit is configured to compare the gesture feature of the current user obtained by the acquiring unit with preset user gesture features, and to determine the command corresponding to the gesture feature of the current user;
the execution unit is configured to execute the command determined by the determining unit.
Embodiments of the invention thus provide a man-machine interaction method and device in which a gesture feature of a current user is acquired by collecting hand vein image information of the user, the gesture feature of the current user is compared with preset user gesture features, the command corresponding to the gesture feature of the current user is determined, and the command is executed. Man-machine interaction based on hand vein recognition in this way is more accurate.
Brief description of the drawings
In order to illustrate the technical solutions in the embodiments of the invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings described below show only some embodiments of the invention, and those of ordinary skill in the art may derive other drawings from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a man-machine interaction method based on vein recognition according to Embodiment 1 of the invention;
Fig. 2 is a structural block diagram of a man-machine interaction device based on vein recognition according to Embodiment 1 of the invention;
Fig. 3 is a schematic flowchart of another man-machine interaction method based on vein recognition according to Embodiment 2 of the invention;
Fig. 4 is a structural block diagram of another man-machine interaction device based on vein recognition according to Embodiment 3 of the invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the invention. Apparently, the described embodiments are only some rather than all of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without creative effort fall within the protection scope of the invention.
Embodiment 1
An embodiment of the invention provides a man-machine interaction method, as shown in Fig. 1, comprising:
S101. Acquire a gesture feature of a current user by collecting hand vein image information of the user.
Vein recognition relies on the fact that hemoglobin in the blood absorbs infrared light: a camera sensitive to near-infrared light images the hand, an image of the shadow cast by the vein vessels is captured, the vein vessel image is then digitally processed, and the feature values of the vein vessel image are extracted.
According to this principle, after the user makes a gesture, a recognition device equipped with a near-infrared-sensitive camera images the user, collects the hand vein image information, and acquires the gesture feature of the current user.
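For illustration only, the following is a minimal Python sketch of such a feature-extraction step; the thresholding scheme and the function name extract_vein_features are assumptions made for the example and are not part of the disclosed method.

```python
import numpy as np

def extract_vein_features(ir_image: np.ndarray) -> np.ndarray:
    """Derive a crude vein-pattern feature vector from a near-infrared hand image.

    Veins absorb near-infrared light and therefore appear darker than the
    surrounding tissue; here the dark-pixel layout is summarized per row and
    per column as a simple stand-in for the "feature values" mentioned above.
    """
    img = ir_image.astype(np.float32)
    img = (img - img.min()) / (np.ptp(img) + 1e-6)    # normalize to [0, 1]
    vein_mask = img < img.mean()                       # darker pixels = candidate vein pixels
    row_profile = vein_mask.mean(axis=1)               # dark-pixel density per row
    col_profile = vein_mask.mean(axis=0)               # dark-pixel density per column
    return np.concatenate([row_profile, col_profile])
```

Feeding this a frame from the near-infrared camera yields a fixed-length vector that can be compared against stored gesture templates.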
The near-infrared-sensitive camera may be, for example, an infrared CCD camera.
S102. Compare the gesture feature of the current user with preset user gesture features, determine the command corresponding to the gesture feature of the current user, and execute the command.
After the recognition device with the infrared-sensitive camera acquires the gesture feature of the current user, it compares this gesture feature with the preset user gesture features and determines the command corresponding to the gesture feature of the current user. The correspondence between user gesture features and commands is preset in the recognition device. The recognition device then executes the determined command.
For example, suppose the gesture of the back of the right hand is preset to correspond to the shutdown command of a computer. After the user holds out the back of the right hand, the recognition device collects the vein image information of the back of the right hand, determines that the gesture feature of the current user is the back-of-the-right-hand gesture, determines from this gesture feature that the corresponding command is the "shutdown" command, and executes the shutdown command, shutting the recognition device down.
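A minimal sketch of this compare-and-map step, assuming preset features are stored as vectors and matched by Euclidean distance (the threshold, gesture names, and distance metric are illustrative assumptions):

```python
import numpy as np
from typing import Optional

def determine_command(current: np.ndarray,
                      presets: dict[str, np.ndarray],
                      gesture_to_command: dict[str, str],
                      threshold: float = 5.0) -> Optional[str]:
    """Find the preset gesture feature closest to the current one and map it to a command.

    Returns None when no preset feature is close enough, so unknown gestures are ignored.
    """
    best_name, best_dist = None, float("inf")
    for name, stored in presets.items():
        dist = float(np.linalg.norm(current - stored))
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_name is None or best_dist >= threshold:
        return None
    return gesture_to_command.get(best_name)

# Example usage with made-up feature vectors (real ones would come from enrollment):
presets = {"right_hand_back": np.array([0.9, 0.1, 0.8])}
commands = {"right_hand_back": "shutdown"}
print(determine_command(np.array([0.88, 0.12, 0.79]), presets, commands))  # -> shutdown
```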
Further, the recognition device may be any device capable of performing man-machine interaction by the method provided in the embodiment of the invention; it may be a computer with an infrared-sensitive camera, or another device with an infrared-sensitive camera.
An embodiment of the invention also provides a man-machine interaction device 20, described here taking a computer as an example. As shown in Fig. 2, the device comprises: an acquiring unit 21, a determining unit 22, and an execution unit 23.
The acquiring unit 21 is configured to acquire a user gesture feature by collecting hand vein image information of the user.
As described above, vein recognition images the hand with a near-infrared-sensitive camera, captures the shadow image of the vein vessels, digitally processes the vein vessel image, and extracts the feature values of the vein vessel image.
According to this principle, after the user makes a gesture, the acquiring unit 21 images the user, collects the hand vein image information, and acquires the gesture feature of the current user. To collect the vein image of the user's hand, the acquiring unit 21 may in particular include a near-infrared-sensitive camera.
The near-infrared-sensitive camera may be an infrared CCD camera.
The determining unit 22 is configured to compare the gesture feature of the current user obtained by the acquiring unit with preset user gesture features, and to determine the command corresponding to the gesture feature of the current user.
After the acquiring unit 21 obtains the gesture feature of the current user, the determining unit 22 compares this gesture feature with the preset user gesture features and determines the command corresponding to the gesture feature of the current user. The correspondence between user gesture features and commands is preset in the device 20.
For example, suppose the device 20 presets the back-of-the-right-hand gesture to correspond to the shutdown command of the computer. After the user holds out the back of the right hand, the acquiring unit 21 collects the vein image information of the back of the right hand and determines that the gesture feature of the current user is the back-of-the-right-hand gesture, and the determining unit 22 determines from this gesture feature that the corresponding command is the "shutdown" command.
The execution unit 23 is configured to execute the command determined by the determining unit.
Further, the device 20 may be a computer with an infrared-sensitive camera, or another device with an infrared-sensitive camera.
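To show how the three units might cooperate, the following sketch wires the acquiring, determining, and execution steps together, reusing the helper functions sketched above; the class structure and the camera method capture_infrared_frame() are assumptions made for the example, not part of the disclosure.

```python
class InteractionDevice:
    """Minimal sketch of device 20: acquiring unit, determining unit, execution unit."""

    def __init__(self, camera, presets, gesture_to_command):
        self.camera = camera                          # assumed near-infrared camera interface
        self.presets = presets                        # preset gesture feature vectors
        self.gesture_to_command = gesture_to_command  # preset gesture -> command mapping

    def acquire(self):
        # Acquiring unit 21: capture an infrared frame and extract vein features.
        frame = self.camera.capture_infrared_frame()  # hypothetical camera call
        return extract_vein_features(frame)

    def determine(self, features):
        # Determining unit 22: compare with preset features and map to a command.
        return determine_command(features, self.presets, self.gesture_to_command)

    def execute(self, command):
        # Execution unit 23: carry out the determined command (placeholder action).
        if command is not None:
            print(f"executing: {command}")

    def handle_gesture(self):
        self.execute(self.determine(self.acquire()))
```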
Embodiments of the invention thus provide a man-machine interaction method and device in which a gesture feature of a current user is acquired by collecting hand vein image information of the user, the gesture feature of the current user is compared with preset user gesture features, the command corresponding to the gesture feature of the current user is determined, and the command is executed. Man-machine interaction based on hand vein recognition in this way is more accurate.
Embodiment 2
An embodiment of the invention provides a man-machine interaction method, as shown in Fig. 3, comprising:
S301. Preset the commands corresponding to the user's gesture features, and obtain the correspondence between the preset user gesture features and the commands.
As described above, vein recognition images the hand with a near-infrared-sensitive camera, captures the shadow image of the vein vessels, digitally processes the vein vessel image, and extracts the feature values of the vein vessel image.
According to this principle, the user can make various gestures to interact with the recognition device. For example, the recognition device may preset the gesture of holding out the right palm to correspond to its "power on" command, the gesture of holding out the back of the right hand to correspond to its "shutdown" command, the gesture of holding out the left palm to correspond to its "cancel" command, the gesture of holding out the back of the left hand to correspond to its "restart" command, the gesture of holding out both the left palm and the right palm to correspond to its "lock" command, the gesture of holding out the backs of both hands to correspond to its "unlock" command, and so on.
Because the vein image feature values of the same hand region differ from user to user, the vein image information the recognition device collects for the same gesture also differs between users. For different users, the recognition device can therefore personalize the correspondence between gesture features and commands.
For example, the recognition device may set the command corresponding to user A's right-palm gesture feature to its "power on" command, while setting the command corresponding to user B's right-palm gesture feature to its "shutdown" command.
Further, the recognition device may also preset the correspondence between user gesture features and commands for a specific application. For example, for a game, the gesture feature of extending the right index finger may be preset to correspond to a "move forward" command; other commands may be set in the same way. This is only an example and is not detailed further here.
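One way to represent such per-user and per-application mappings is a nested table keyed by user and application; the sketch below is only illustrative, and the user, application, and gesture names are assumptions.

```python
from typing import Optional

# Hypothetical nested mapping: user -> application -> gesture name -> command.
# Each gesture name stands in for the enrolled vein-feature template of that user.
command_table = {
    "user_a": {
        "system": {"right_palm": "power_on", "right_hand_back": "shutdown"},
        "game":   {"right_index_finger": "move_forward"},
    },
    "user_b": {
        "system": {"right_palm": "shutdown"},
    },
}

def lookup_command(user: str, application: str, gesture: str) -> Optional[str]:
    """Resolve a gesture to a command for a given user and application, if preset."""
    return command_table.get(user, {}).get(application, {}).get(gesture)

print(lookup_command("user_a", "game", "right_index_finger"))  # -> move_forward
print(lookup_command("user_b", "system", "right_palm"))        # -> shutdown
```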
S302. Acquire a gesture feature of a current user by collecting hand vein image information of the user.
S303. Determine the identity information of the user from the collected hand vein image information, and confirm whether the user is an authorized user.
If the user is an authorized user, proceed to S304; otherwise, return to S302.
A user's vein information is unique, so the collected hand vein image information is compared with the preset vein information of users to check whether matching preset information exists. If matching preset information is found, the recognition device confirms that the user is an authorized user.
The recognition device may preset the vein information of a specific part of the hand for confirming the user's identity; alternatively, the user's identity may be confirmed from the vein information corresponding to the preset user gesture features.
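A minimal sketch of this identity check, assuming enrolled vein templates are stored per user and matched with the same distance-based comparison used above (names and threshold are assumptions):

```python
import numpy as np
from typing import Optional

def verify_user(current_features: np.ndarray,
                enrolled: dict[str, np.ndarray],
                threshold: float = 5.0) -> Optional[str]:
    """Return the name of the user whose enrolled vein template matches the
    collected features, or None if no template is close enough (the user is
    then treated as unauthorized and acquisition is repeated)."""
    for user, template in enrolled.items():
        if float(np.linalg.norm(current_features - template)) < threshold:
            return user
    return None
```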
S304. Compare the gesture feature of the current user with the preset user gesture features, determine the command corresponding to the gesture feature of the current user, and execute the command.
After the correspondence between user gesture features and commands has been preset and the user makes a gesture, the recognition device compares the current gesture feature of the user with the preset user gesture features and determines the command corresponding to the gesture feature of the current user. Specifically, the recognition device compares the collected vein image information of the user's current gesture feature with the vein image information of the preset user gesture features.
Then, after determining the command corresponding to the hand gesture input by the user, the recognition device executes that command.
For example, suppose the gesture feature of the back of the right hand is preset to correspond to the shutdown command of the recognition device. After the user holds out the back of the right hand and the recognition device acquires this gesture feature, it determines that the gesture feature corresponds to the "shutdown" command and executes the shutdown command.
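Putting S301 through S304 together, the overall flow might be sketched as the loop below, reusing the helpers sketched earlier; the camera call and the loop structure are illustrative assumptions rather than the disclosed implementation.

```python
def interaction_loop(camera, enrolled, presets, gesture_to_command):
    """Sketch of the Embodiment 2 flow: acquire (S302), verify identity (S303),
    then determine and execute the command (S304); unauthorized users cause a
    return to acquisition."""
    while True:
        features = extract_vein_features(camera.capture_infrared_frame())   # S302
        user = verify_user(features, enrolled)                               # S303
        if user is None:
            continue                                                         # not authorized: back to S302
        command = determine_command(features, presets, gesture_to_command)   # S304
        if command is not None:
            print(f"user {user}: executing {command}")
```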
Further, the recognition device may be any device capable of performing man-machine interaction by the method provided in the embodiment of the invention; it may be a computer with an infrared-sensitive camera, or another device with an infrared-sensitive camera.
The embodiment of the invention thus provides a man-machine interaction method in which a gesture feature of a current user is acquired by collecting hand vein image information of the user, the gesture feature of the current user is compared with preset user gesture features, the command corresponding to the gesture feature of the current user is determined, and the command is executed. Man-machine interaction based on hand vein recognition in this way is more accurate.
Embodiment 3
An embodiment of the invention provides a man-machine interaction device 40, as shown in Fig. 4, comprising: a setting unit 41, an acquiring unit 42, a determining unit 43, and an execution unit 44.
The setting unit 41 is configured to preset the commands corresponding to the user's gesture features and to obtain the correspondence between the preset user gesture features and the commands.
As described above, vein recognition images the hand with a near-infrared-sensitive camera, captures the shadow image of the vein vessels, digitally processes the vein vessel image, and extracts the feature values of the vein vessel image.
According to this principle, the user can make various gestures to interact with the device 40. For example, the setting unit 41 may preset the gesture of holding out the right palm to correspond to the "power on" command of the device 40, the gesture of holding out the back of the right hand to correspond to the "shutdown" command of the device 40, the gesture of holding out the left palm to correspond to the "cancel" command of the device 40, the gesture of holding out the back of the left hand to correspond to the "restart" command of the device 40, the gesture of holding out both the left palm and the right palm to correspond to the "lock" command of the device 40, the gesture of holding out the backs of both hands to correspond to the "unlock" command of the device 40, and so on.
Because the vein image feature values of the same hand region differ from user to user, the vein image information the device 40 collects for the same gesture also differs between users. For different users, the setting unit 41 can therefore personalize the correspondence between gesture features and commands.
For example, the setting unit 41 may set the command corresponding to user A's right-palm gesture feature to the "power on" command of the device 40, while setting the command corresponding to user B's right-palm gesture feature to the "shutdown" command.
Further, the setting unit 41 may also preset the correspondence between user gesture features and commands for a specific application. For example, for a game, the gesture feature of extending the right index finger may be preset to correspond to a "move forward" command; other commands may be set in the same way. This is only an example and is not detailed further here.
The acquiring unit 42 is configured to acquire a gesture feature of a current user by collecting hand vein image information of the user.
The determining unit 43 is configured to compare the gesture feature of the current user obtained by the acquiring unit with the preset user gesture features, and to determine the command corresponding to the gesture feature of the current user.
After the setting unit 41 has preset the correspondence between user gesture features and commands, and the user makes a gesture, the determining unit 43 compares the current gesture feature of the user with the preset user gesture features and determines the command corresponding to the gesture feature of the current user. Specifically, the determining unit 43 compares the collected vein image information of the user's current gesture feature with the vein image information of the preset user gesture features.
Then, after the determining unit 43 has determined the command corresponding to the hand gesture input by the user, the command is executed accordingly.
The execution unit 44 is configured to execute the command determined by the determining unit.
For example, suppose the setting unit 41 presets the back-of-the-right-hand gesture feature to correspond to the shutdown command of the device 40. After the user holds out the back of the right hand and the acquiring unit 42 acquires this gesture feature, the determining unit 43 determines that the gesture feature corresponds to the "shutdown" command, and the execution unit 44 executes the shutdown command.
Further, the determining unit 43 is also configured to determine the identity information of the user from the collected hand vein image information and to confirm whether the user is an authorized user; if the user is an authorized user, the gesture feature of the current user is compared with the preset user gesture features, and the command corresponding to the gesture feature of the current user is determined.
A user's vein information is unique, so the hand vein image information collected by the acquiring unit 42 is compared with the preset vein information of users, and the determining unit 43 checks whether matching preset information exists. If matching preset information is found, the determining unit 43 confirms that the user is an authorized user.
The setting unit 41 may preset the vein information of a specific part of the hand for confirming the user's identity; alternatively, the user's identity may be confirmed from the vein information corresponding to the user gesture features preset by the setting unit 41.
Further, the device 40 may be a computer with an infrared-sensitive camera, or another device with an infrared-sensitive camera.
The embodiment of the invention thus provides a gesture interaction device: the setting unit presets the commands corresponding to the user's gesture features and obtains the correspondence between the preset user gesture features and the commands; the acquiring unit acquires the gesture feature of the current user by collecting hand vein image information of the user; the determining unit then compares the gesture feature of the current user obtained by the acquiring unit with the preset user gesture features and determines the command corresponding to the gesture feature of the current user; and the execution unit executes the command. Man-machine interaction based on hand vein recognition in this way is more accurate.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The program may be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments. The storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
The above are only specific embodiments of the invention, but the protection scope of the invention is not limited thereto. Any change or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the invention shall fall within the protection scope of the invention. Therefore, the protection scope of the invention shall be subject to the protection scope of the claims.

Claims (8)

1. A man-machine interaction method, characterized by comprising:
acquiring a gesture feature of a current user by collecting hand vein image information of the user;
comparing the gesture feature of the current user with preset user gesture features, determining the command corresponding to the gesture feature of the current user, and executing the command.
2. The method according to claim 1, characterized in that, after the acquiring a gesture feature of a current user by collecting hand vein image information of the user, the method further comprises:
determining the identity information of the user from the collected hand vein image information, and confirming whether the user is an authorized user;
wherein the comparing the gesture feature of the current user with preset user gesture features, determining the command corresponding to the gesture feature of the current user, and executing the command is specifically: if the user is an authorized user, comparing the gesture feature of the current user with the preset user gesture features, determining the command corresponding to the gesture feature of the current user, and executing the command.
3. The method according to claim 2, characterized in that, before the acquiring a gesture feature of a current user by collecting hand vein image information of the user, the method further comprises:
presetting the commands corresponding to the user's gesture features, and obtaining the correspondence between the preset user gesture features and the commands.
4. The method according to claim 3, characterized in that the presetting the commands corresponding to the user's gesture features and obtaining the correspondence between the preset user gesture features and the commands is specifically: presetting the commands corresponding to the gesture features of different users, and obtaining the correspondence between the preset gesture features of the different users and the commands.
5. A man-machine interaction device, characterized by comprising: an acquiring unit, a determining unit, and an execution unit;
wherein the acquiring unit is configured to acquire a user gesture feature by collecting hand vein image information of the user;
the determining unit is configured to compare the gesture feature of the current user obtained by the acquiring unit with preset user gesture features, and to determine the command corresponding to the gesture feature of the current user;
the execution unit is configured to execute the command determined by the determining unit.
6. The device according to claim 5, characterized in that the determining unit is further configured to determine the identity information of the user from the collected hand vein image information and to confirm whether the user is an authorized user; and, if the user is an authorized user, to compare the gesture feature of the current user with the preset user gesture features and determine the command corresponding to the gesture feature of the current user.
7. The device according to claim 6, characterized by further comprising: a setting unit configured to preset the commands corresponding to the user's gesture features and to obtain the correspondence between the preset user gesture features and the commands.
8. The device according to claim 7, characterized in that the setting unit is further configured to preset the commands corresponding to the gesture features of different users and to obtain the correspondence between the preset gesture features of the different users and the commands.
CN2012100784857A 2012-03-22 2012-03-22 Method and device for man-machine interaction Pending CN103324274A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2012100784857A CN103324274A (en) 2012-03-22 2012-03-22 Method and device for man-machine interaction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2012100784857A CN103324274A (en) 2012-03-22 2012-03-22 Method and device for man-machine interaction

Publications (1)

Publication Number Publication Date
CN103324274A true CN103324274A (en) 2013-09-25

Family

ID=49193075

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012100784857A Pending CN103324274A (en) 2012-03-22 2012-03-22 Method and device for man-machine interaction

Country Status (1)

Country Link
CN (1) CN103324274A (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103995998A (en) * 2014-05-19 2014-08-20 华为技术有限公司 Non-contact gesture command authentication method and user device
CN105320276A (en) * 2014-07-31 2016-02-10 三星电子株式会社 Wearable device and method of operating the same
CN105807912A (en) * 2015-01-21 2016-07-27 现代自动车株式会社 Vehicle, method for controlling the same and gesture recognition apparatus therein
CN106648102A (en) * 2016-12-26 2017-05-10 珠海市魅族科技有限公司 Method and system of controlling terminal equipment through non-touch gesture
CN107450717A (en) * 2016-05-31 2017-12-08 联想(北京)有限公司 A kind of information processing method and Wearable
CN107493428A (en) * 2017-08-09 2017-12-19 广东欧珀移动通信有限公司 Filming control method and device
CN107615270A (en) * 2016-04-28 2018-01-19 华为技术有限公司 A kind of man-machine interaction method and its device
CN108594995A (en) * 2018-04-13 2018-09-28 广东小天才科技有限公司 A kind of electronic device method and electronic equipment based on gesture identification
CN110244844A (en) * 2019-06-10 2019-09-17 Oppo广东移动通信有限公司 Control method and relevant apparatus
CN110334561A (en) * 2018-03-31 2019-10-15 广州卓腾科技有限公司 A kind of gestural control method of control object rotation
CN111052047A (en) * 2017-09-29 2020-04-21 苹果公司 Vein scanning device for automatic gesture and finger recognition
CN111273777A (en) * 2020-02-11 2020-06-12 Oppo广东移动通信有限公司 Virtual content control method and device, electronic equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101630193A (en) * 2008-07-15 2010-01-20 张雪峰 Hand induction equipment
CN201927095U (en) * 2010-12-27 2011-08-10 北京天公瑞丰科技有限公司 Entrance guard system based on palm vein authentication
CN102385693A (en) * 2010-09-03 2012-03-21 洪西进 System and method for identifying finger vein

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101630193A (en) * 2008-07-15 2010-01-20 张雪峰 Hand induction equipment
CN102385693A (en) * 2010-09-03 2012-03-21 洪西进 System and method for identifying finger vein
CN201927095U (en) * 2010-12-27 2011-08-10 北京天公瑞丰科技有限公司 Entrance guard system based on palm vein authentication

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015176494A1 (en) * 2014-05-19 2015-11-26 华为技术有限公司 Non-contact gesture command authentication method and user equipment
CN103995998A (en) * 2014-05-19 2014-08-20 华为技术有限公司 Non-contact gesture command authentication method and user device
CN105320276A (en) * 2014-07-31 2016-02-10 三星电子株式会社 Wearable device and method of operating the same
US10867023B2 (en) 2014-07-31 2020-12-15 Samsung Electronics Co., Ltd. Wearable device and method of operating the same
CN105320276B (en) * 2014-07-31 2020-10-13 三星电子株式会社 Wearable device and method of operating wearable device
US10552598B2 (en) 2014-07-31 2020-02-04 Samsung Electronics Co., Ltd. Wearable device and method of operating the same
CN105807912A (en) * 2015-01-21 2016-07-27 现代自动车株式会社 Vehicle, method for controlling the same and gesture recognition apparatus therein
CN105807912B (en) * 2015-01-21 2020-10-20 现代自动车株式会社 Vehicle, method for controlling the same, and gesture recognition apparatus therein
CN107615270B (en) * 2016-04-28 2020-04-14 华为技术有限公司 Man-machine interaction method and device
US11868710B2 (en) 2016-04-28 2024-01-09 Honor Device Co., Ltd. Method and apparatus for displaying a text string copied from a first application in a second application
CN107615270A (en) * 2016-04-28 2018-01-19 华为技术有限公司 A kind of man-machine interaction method and its device
US10853564B2 (en) 2016-04-28 2020-12-01 Huawei Technologies Co., Ltd. Operation for copied content
CN107450717A (en) * 2016-05-31 2017-12-08 联想(北京)有限公司 A kind of information processing method and Wearable
CN106648102A (en) * 2016-12-26 2017-05-10 珠海市魅族科技有限公司 Method and system of controlling terminal equipment through non-touch gesture
CN107493428A (en) * 2017-08-09 2017-12-19 广东欧珀移动通信有限公司 Filming control method and device
CN111052047A (en) * 2017-09-29 2020-04-21 苹果公司 Vein scanning device for automatic gesture and finger recognition
CN111052047B (en) * 2017-09-29 2022-04-19 苹果公司 Vein scanning device for automatic gesture and finger recognition
CN110334561A (en) * 2018-03-31 2019-10-15 广州卓腾科技有限公司 A kind of gestural control method of control object rotation
CN108594995A (en) * 2018-04-13 2018-09-28 广东小天才科技有限公司 A kind of electronic device method and electronic equipment based on gesture identification
CN110244844A (en) * 2019-06-10 2019-09-17 Oppo广东移动通信有限公司 Control method and relevant apparatus
CN111273777A (en) * 2020-02-11 2020-06-12 Oppo广东移动通信有限公司 Virtual content control method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN103324274A (en) Method and device for man-machine interaction
CN105117056B (en) A kind of method and apparatus of operation touch-screen
CN103870156B (en) A kind of method and device of process object
US20130050076A1 (en) Method of recognizing a control command based on finger motion and mobile device using the same
CN104063128B (en) A kind of information processing method and electronic equipment
TW201145184A (en) Vision-based hand movement recognition system and method thereof
CN102768595B (en) A kind of method and device identifying touch control operation instruction on touch-screen
CN103365402A (en) Control method and device for display equipment
CN105760084B (en) The control method and device of voice input
DE112012006448T5 (en) Translate a touch input into a local input based on a translation profile for an application
KR101891306B1 (en) Method and Apparatus for Realizaing Human-Machine Interaction
KR20170033757A (en) Touch screen device for moving or copying of an object based on the touch input and operating method thereof
CN105579945A (en) Digital device and control method thereof
CN104239844A (en) Image recognition system and image recognition method
US20160098087A1 (en) Systems and methods for gesture recognition
CN110688190A (en) Control method and device of intelligent interactive panel
CN103744609B (en) A kind of data extraction method and device
CN107450717A (en) A kind of information processing method and Wearable
DE102014118225A1 (en) Desktop gestures to mimic a mouse control
CN104615984A (en) User task-based gesture identification method
CN106446643B (en) Terminal control method and device
CN104216517A (en) Information processing method and electronic equipment
CN104216563B (en) A kind of terminal
CN204203932U (en) A kind of human-computer interaction device based on radio-frequency (RF) identification
CN105867685A (en) Control method and device for terminal equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130925