CN107390881A - Gesture control method - Google Patents

Gesture control method Download PDF

Info

Publication number
CN107390881A
CN107390881A
Authority
CN
China
Prior art keywords
hand
gesture
user
recognition system
gesture recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710845960.1A
Other languages
Chinese (zh)
Inventor
闫立俊
Original Assignee
Xi'an Lingxun Excellent Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Lingxun Excellent Information Technology Co Ltd filed Critical Xi'an Lingxun Excellent Information Technology Co Ltd
Priority to CN201710845960.1A priority Critical patent/CN107390881A/en
Publication of CN107390881A publication Critical patent/CN107390881A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The invention discloses a gesture control method, characterized by comprising a two-hand gesture operation scenario and a one-hand gesture operation scenario. The two-hand gesture operation scenario comprises a gesture-capture camera, a host computer system and a proximal gesture recognition system; the one-hand gesture operation scenario comprises a voice or brain-wave control device with its host computer system, a gesture-capture camera and a proximal gesture recognition system. By combining a head-mounted device with gestures, or by combining two-hand gestures, the technical solution of the invention prevents the system from mistaking the user's own or other people's unintentional gestures for system commands, thereby improving the security of the system.

Description

Gesture control method
Technical field
The invention belongs to the technical field of human-computer interaction, and in particular relates to a gesture control method.
Background technology
With the development of computer technology, human-computer interaction has become increasingly intelligent and personalized. Interaction has gradually evolved from traditional means such as the mouse, keyboard and touch screen toward contactless control methods such as gesture control.
In the computing field, including but not limited to virtual reality, augmented reality and mixed reality scenarios, current gesture interaction covers operations on both two-dimensional and three-dimensional display objects. The principle is generally as follows: a camera of the gesture recognition system captures images or video of the user's gesture; a computer algorithm then identifies the static meaning or motion meaning of the gesture; the result is mapped to an internal system command; and finally the system performs the corresponding operation on the display object according to that command.
Prior art, gesture operation on two-dimensional display objects: for the two-dimensional plane display objects of a traditional computer, replacing physical devices such as the mouse, keyboard and touch screen with gesture operation yields a more natural mode of human-computer interaction. For example, Project Prague, a project of Microsoft Research's advanced technology laboratory in Israel, captures user gesture operations through a camera device and maps them to internal Windows commands to perform specific operations.
Gesture operation on three-dimensional display objects: for example, Microsoft's HoloLens smart glasses support gesture interaction with virtual three-dimensional models. The gestures supported by HoloLens include: bloom, air tap, tap-and-hold, scroll, drag, zoom, pin, release, uninstalling an application, and so on; for the specific operations refer to Microsoft's website.
For the gesture operations on two-dimensional and three-dimensional display objects summarized above, the prior art has the following problems:
1) Gesture operation is not discreet enough: the user must raise a hand so that the main camera can "see" the gesture. When the user operates a virtual display object, bystanders who cannot see the virtual image may find the behavior puzzling, especially in public;
2) Many existing systems rely on one-hand gesture recognition, which carries a certain security risk: when many people are nearby, the system can easily mistake someone else's unintentional gesture for a system command, causing system errors.
Summary of the invention
The object of the invention is to provide a gesture control method that solves the above problems.
To achieve this object, the invention adopts the following technical solution:
A gesture control method comprises a two-hand gesture operation scenario and a one-hand gesture operation scenario;
The two-hand gesture operation scenario comprises a gesture-capture camera, a host computer system and a proximal gesture recognition system. The gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system. After recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system and drives the host computer system's original control logic; the host computer system is the original computer system of the device that supports gesture operation. The scenario specifically comprises the following steps:
Step 1: the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate a virtual display object;
Step 2: the user extends the other hand, hand B, near the body, to carry out specific operations on the virtual display object;
Step 3: only when the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture do hand B's gesture operations take effect;
Step 4: after the proximal gesture recognition system obtains a valid hand B operation, it interacts with the host computer system and drives the host computer system's original control logic;
The operations include the following:
1) The user extends hand A near the body, palm flat, with the palm or the back of the hand facing up; hand A's palm or back of the hand indicates that the user is starting to operate a two-dimensional plane object;
2) The user extends the other hand, hand B, to carry out specific operations on the two-dimensional plane object, including the following cases:
Hand B extends one finger and, on hand A's palm or back of the hand, performs single-finger click or double-click operations on the two-dimensional object; the proximal gesture recognition system maps these to click and double-click commands on the two-dimensional plane object;
Hand B extends two fingers and, on hand A's palm or back of the hand, performs two-finger spread or pinch operations; the proximal gesture recognition system maps these to zoom-in and zoom-out commands on the two-dimensional plane object;
Hand B extends a certain finger B1 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a scroll command on the two-dimensional plane object;
Hand B extends a certain finger B2 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a drag command on the two-dimensional plane object;
Hand B extends a certain finger and taps a certain finger of hand A; the proximal gesture recognition system maps this to a shortcut command on the two-dimensional plane object or three-dimensional object;
Hand B extends a certain finger and, through the shortcut command of tapping a certain finger of hand A, starts handwriting recognition mode; the user then writes English letters or Chinese characters with hand B on hand A's palm or back of the hand, and the proximal gesture recognition system maps the writing to English-letter or Chinese-character input;
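The two-hand rule above can be sketched as a simple gating function; all gesture labels and command names below are hypothetical stand-ins for illustration, not part of the patent:

```python
from typing import Optional

# Hypothetical hand-B gesture labels and the commands they map to.
B_COMMANDS = {
    "single_finger_tap": "CLICK",
    "single_finger_double_tap": "DOUBLE_CLICK",
    "two_finger_spread": "ZOOM_IN",
    "two_finger_pinch": "ZOOM_OUT",
    "finger_b1_slide": "SCROLL",
    "finger_b2_slide": "DRAG",
}

def map_two_hand(hand_a_posture: str, hand_b_gesture: str) -> Optional[str]:
    """Hand B's gesture takes effect only while hand A holds the
    'start operating' posture; anything else is ignored."""
    if hand_a_posture != "start_operating":
        return None  # guards against bystanders' unintentional gestures
    return B_COMMANDS.get(hand_b_gesture)
```

The security property claimed by the patent falls out of the guard clause: no hand-B gesture, however well-formed, becomes a command unless hand A simultaneously signals intent.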
The one-hand gesture operation scenario comprises a voice or brain-wave control device with its host computer system, a gesture-capture camera and a proximal gesture recognition system. The voice or brain-wave control device collects the user's voice or brain-wave information in real time and recognizes the user's voice or brain-wave signal. The gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system. The host computer system of the voice or brain-wave control device is connected to the proximal gesture recognition system. After recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system of the voice or brain-wave control device and drives the host computer system's original control logic. The scenario specifically comprises the following steps:
First, the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate the object;
Second, the gesture-capture camera passes the gesture to the proximal gesture recognition system. When the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture, it interacts with the host computer system of the voice or brain-wave control device; only then does the voice or brain-wave control device start analyzing the user's voice or brain-wave signal, and according to the analyzed signal it sends the corresponding control signal to other devices;
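The one-hand scenario applies the same gate to a different signal source: voice or EEG analysis only runs while the start posture is held. A minimal sketch, with all names hypothetical:

```python
from typing import Callable, Optional

def process_control_signal(hand_a_is_start: bool, raw_signal: bytes,
                           analyze: Callable[[bytes], str]) -> Optional[str]:
    """Voice/EEG analysis runs only while the proximal gesture recognizer
    reports hand A's 'start operating' posture; otherwise the raw signal
    is dropped, so idle speech or brain activity cannot trigger commands."""
    if not hand_a_is_start:
        return None
    return analyze(raw_signal)  # device-specific voice/EEG recognition
```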
Further, in the two-hand gesture operation scenario, hand B uses the standard operating gestures of Microsoft's HoloLens holographic glasses, and the proximal gesture recognition system maps them to internal commands for execution.
Further, the posture taken by hand A is not limited to a clenched fist or a flat palm.
Further, the computer system of the proximal gesture recognition system may be unified with the computer system of the main device.
Compared with the prior art, the invention has the following technical effects:
Most gesture operations can be completed near the user's body, which reduces the disturbance to others caused by the user raising a hand to operate in the air and improves the discreetness of gesture operation.
By combining two-hand gestures, the technical solution prevents the system from mistaking other people's unintentional gestures for system commands when others are present in the surroundings, thereby improving the security of the system.
By combining a voice or brain-wave control device with gestures, the technical solution prevents the user's voice interaction or brain activity from being mistaken for system commands when the user is not in a control state, improving the compatibility of the system; the cooperation of the voice control device and gestures also spares the user from frequently saying a wake word.
The invention completes gesture operations with the user's own two hands and can replace devices such as the mouse and keyboard, so it has the characteristics of low cost and easy implementation.
Embodiments
The invention is described in detail below with reference to embodiments:
A gesture control method comprises a two-hand gesture operation scenario and a one-hand gesture operation scenario;
The two-hand gesture operation scenario comprises a gesture-capture camera, a host computer system and a proximal gesture recognition system. The gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system. After recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system and drives the host computer system's original control logic; the host computer system is the original computer system of the device that supports gesture operation. The scenario specifically comprises the following steps:
Step 1: the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate a virtual display object;
Step 2: the user extends the other hand, hand B, near the body, to carry out specific operations on the virtual display object;
Step 3: only when the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture do hand B's gesture operations take effect;
Step 4: after the proximal gesture recognition system obtains a valid hand B operation, it interacts with the host computer system and drives the host computer system's original control logic;
The operations include the following:
1) The user extends hand A near the body, palm flat, with the palm or the back of the hand facing up; hand A's palm or back of the hand indicates that the user is starting to operate a two-dimensional plane object;
2) The user extends the other hand, hand B, to carry out specific operations on the two-dimensional plane object, including the following cases:
Hand B extends one finger and, on hand A's palm or back of the hand, performs single-finger click or double-click operations on the two-dimensional object; the proximal gesture recognition system maps these to click and double-click commands on the two-dimensional plane object;
Hand B extends two fingers and, on hand A's palm or back of the hand, performs two-finger spread or pinch operations; the proximal gesture recognition system maps these to zoom-in and zoom-out commands on the two-dimensional plane object;
Hand B extends a certain finger B1 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a scroll command on the two-dimensional plane object;
Hand B extends a certain finger B2 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a drag command on the two-dimensional plane object;
Hand B extends a certain finger and taps a certain finger of hand A; the proximal gesture recognition system maps this to a shortcut command on the two-dimensional plane object or three-dimensional object;
Hand B extends a certain finger and, through the shortcut command of tapping a certain finger of hand A, starts handwriting recognition mode; the user then writes English letters or Chinese characters with hand B on hand A's palm or back of the hand, and the proximal gesture recognition system maps the writing to English-letter or Chinese-character input for the browser;
The one-hand gesture operation scenario comprises a voice or brain-wave control device with its host computer system, a gesture-capture camera and a proximal gesture recognition system. The voice or brain-wave control device collects the user's voice or brain-wave information in real time and recognizes the user's voice or brain-wave signal. The gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system. The host computer system of the voice or brain-wave control device is connected to the proximal gesture recognition system. After recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system of the voice or brain-wave control device and drives the host computer system's original control logic. The scenario specifically comprises the following steps:
Step 1: the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate the object;
Step 2: the gesture-capture camera passes the gesture to the proximal gesture recognition system. When the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture, it interacts with the host computer system of the voice or brain-wave control device; only then does the voice or brain-wave control device start analyzing the user's voice or brain-wave signal, and according to the analyzed signal it sends the corresponding control signal to other devices;
In the two-hand gesture operation scenario, hand B uses the standard operating gestures of Microsoft's HoloLens holographic glasses, and the proximal gesture recognition system maps them to internal commands for execution.
The posture taken by hand A is not limited to a clenched fist or a flat palm.
The computer system of the proximal gesture recognition system may be unified with the computer system of the main device.
Embodiment 1: a smart-glasses user operates a two-dimensional display object.
Step 1: the smart glasses start up and the user browses content;
Step 2: the display unit of the smart glasses presents a two-dimensional image; for example, the user opens the Windows Edge browser and the display unit shows the Edge browser interface;
Step 3: the user extends one hand, hand A, in front of the body, for example near the belly or chest, with the palm or back of the hand facing up; hand A's palm or back of the hand indicates the Edge browser to be operated;
Step 4: the user extends the other hand, hand B, to carry out specific operations on the Edge browser, including the following three cases:
1) Hand B extends one finger and, on hand A's palm or back of the hand, performs click or double-click operations; the system maps these to click and double-click commands on the Edge browser content;
The mapping process of the gesture recognition system is as follows: from the gesture images, image sequences or video shot by one or more downward-facing cameras of the smart glasses, the computer system processes and analyzes the data to derive the specific gesture motion (mature gesture-image and gesture-motion recognition algorithms in the industry may be referred to here). Then, according to preset mapping relations between gesture motions and system control commands, the gesture recognition system converts the gesture into an internal control command.
2) Hand B extends two fingers and, on hand A's palm or back of the hand, performs two-finger spread or pinch operations; the system maps these to zoom-in and zoom-out commands on the Edge browser content;
Further, when hand B extends a certain finger B1 and slides it, the system maps this to a scroll command on the Edge browser content; when hand B extends a certain finger B2 and slides it, the system maps this to a drag command on the Edge browser window;
Further, hand B extends a certain finger and taps a certain finger of hand A; the system maps this to a shortcut command on the Edge browser interface. For example, hand B tapping hand A's thumb means closing the window;
Further, hand B extends a certain finger and, on hand A's palm or back of the hand, issues the shortcut command of tapping a certain finger of hand A, for example the ring finger, to start handwriting recognition mode; the user then writes English letters or Chinese characters with hand B's finger on hand A's palm or back of the hand, and the system maps the writing to English-letter or Chinese-character input for the Edge browser.
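The dispatch described in Embodiment 1, including the handwriting-mode toggle, can be sketched as a small stateful mapper. All gesture labels and command strings here are hypothetical, chosen only to mirror the cases above:

```python
# Hypothetical stateful mapper for Embodiment 1: hand-B gestures on hand A's
# palm or back become browser commands; tapping hand A's ring finger toggles
# handwriting mode, in which sliding strokes become text input.
class ProximalGestureMapper:
    PLAIN = {
        "tap": "CLICK",
        "double_tap": "DOUBLE_CLICK",
        "two_finger_spread": "ZOOM_IN",
        "two_finger_pinch": "ZOOM_OUT",
        "b1_slide": "SCROLL",
        "b2_slide": "DRAG",
        "tap_thumb": "CLOSE_WINDOW",
    }

    def __init__(self) -> None:
        self.handwriting = False

    def map(self, gesture: str) -> str:
        if gesture == "tap_ring_finger":  # shortcut: toggle handwriting mode
            self.handwriting = not self.handwriting
            return "TOGGLE_HANDWRITING"
        if self.handwriting and gesture.startswith("stroke:"):
            # a real system would feed the stroke to a handwriting recognizer
            return "INPUT_TEXT:" + gesture[len("stroke:"):]
        return self.PLAIN.get(gesture, "NO_OP")
```

The only state the mapper needs is the handwriting flag; everything else is a lookup, which matches the patent's description of preset gesture-to-command mapping relations.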
Embodiment 2: a smart-glasses user operates a three-dimensional display object.
Step 1: the smart glasses start up and the user browses content;
Step 2: the display unit of the smart glasses presents an operation menu containing a series of three-dimensional object models; the following steps take the user operating a "globe" model as an example;
Step 3: the user extends one hand, hand A, in front of the body, for example near the belly or chest; the posture taken by hand A is not limited to a clenched fist or a flat palm, and hand A's specific posture indicates that the user is starting to operate, for example a clenched fist indicating the start of operation;
Step 4: the user extends the other hand, hand B, to carry out specific operations on the globe model. Hand B uses the standard operating gestures of Microsoft's HoloLens smart glasses, except that, unlike standard HoloLens operation, the operating position is near the body; the proximal gesture control system maps the gesture operations to internal commands for execution. For example, in coordination with the user's gaze focus, a tap gesture selects the "globe" model, which is then placed at a particular location in the virtual environment.
Embodiment 3: a brain-wave head-mounted device cooperates with gestures.
The brain-wave head-mounted device is illustrated by the Kether of Hangzhou carriage return Electronic Science and Technology Co., Ltd.
A typical usage scene of Kether, the brain-wave-controlled racing car of Shenzhen carriage return science and technology, is the user driving the car's motion through a brain-wave device. After the user puts on the EEG signal collection device and concentrates, once the attention value reaches a set threshold the car runs on the track; the higher the user's attention value, the faster the car travels. When the user's attention value drops below the threshold, the car stops.
In the present technical solution, the steps by which the brain-wave device and gesture operation cooperate are:
Step 1: one or more downward-facing cameras for gesture recognition are integrated into the brain-wave device;
Step 2: the user extends one hand, hand A, near the body, for example close to the belly or chest; a specific posture of hand A (not limited to a clenched fist or a flat palm) indicates that the user is starting to operate the car; meanwhile, the user concentrates and starts to control the car's speed with brain waves;
Step 3: the proximal gesture control system recognizes hand A's gesture. When it judges that the gesture is the one for starting to operate the car, the proximal gesture recognition system interacts with the host computer of the brain-wave device; the brain-wave device starts analyzing the user's brain-wave control signal and sends a control signal to the car according to the EEG signal intensity. If it judges that the gesture is not the one for starting to operate the car, no control signal is sent to the car;
Step 4: the car adjusts its own travelling speed according to the control signal sent by the brain-wave device.
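Embodiment 3 reduces to a threshold function gated by the start gesture. The sketch below assumes an attention value on a 0-100 scale and a linear speed mapping above the setpoint; both are illustrative choices, not specified by the patent:

```python
def car_speed(start_gesture_held: bool, attention: float,
              threshold: float = 50.0) -> float:
    """Return a normalized speed command in [0, 1]; 0.0 means the car stops
    (or that no signal is sent at all when the start gesture is absent)."""
    if not start_gesture_held:
        return 0.0  # gesture gate: EEG alone cannot drive the car
    if attention < threshold:
        return 0.0  # attention below the setpoint: the car halts
    # Above the setpoint, higher attention -> faster car (linear sketch).
    return (attention - threshold) / (100.0 - threshold)
```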
Embodiment 4: a voice control device cooperates with gestures.
A device supporting voice control, such as Google Glass: configured with gesture operation, Google Glass no longer needs the user to frequently say a wake word during voice interaction. In the present technical solution, the steps by which Google Glass and gesture operation cooperate are:
Step 1: one or more downward-facing cameras for gesture recognition are integrated into the Google Glass device;
Step 2: the user extends one hand, hand A, near the body, for example close to the belly or chest; a specific posture of hand A (not limited to a clenched fist or a flat palm) indicates that the user is starting voice interaction;
Step 3: the user issues a voice command; for example, the user says the "photograph" command;
Step 4: the proximal gesture recognition system recognizes hand A's gesture. When it judges that the gesture is the one for starting operation, the proximal gesture recognition system interacts with the host computer system of Google Glass; only then does Google Glass perform speech recognition and control the main camera to complete the photographing action.
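In Embodiment 4 the start gesture stands in for the wake word, so the utterance handler only needs a gate plus a command table. A minimal sketch; the command table and labels are hypothetical:

```python
from typing import Optional

VOICE_COMMANDS = {"photograph": "TAKE_PHOTO"}  # hypothetical command table

def handle_utterance(start_gesture_held: bool,
                     utterance: str) -> Optional[str]:
    """Speech recognition runs only while hand A's start posture is held,
    so the gesture replaces the wake word entirely."""
    if not start_gesture_held:
        return None  # ordinary conversation is never treated as a command
    return VOICE_COMMANDS.get(utterance.strip().lower())
```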

Claims (4)

1. A gesture control method, characterized by comprising a two-hand gesture operation scenario and a one-hand gesture operation scenario;
The two-hand gesture operation scenario comprises a gesture-capture camera, a host computer system and a proximal gesture recognition system; the gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system; after recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system and drives the host computer system's original control logic; the host computer system is the original computer system of the device that supports gesture operation; the scenario specifically comprises the following steps:
Step 1: the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate a virtual display object;
Step 2: the user extends the other hand, hand B, near the body, to carry out specific operations on the virtual display object;
Step 3: only when the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture do hand B's gesture operations take effect;
Step 4: after the proximal gesture recognition system obtains a valid hand B operation, it interacts with the host computer system and drives the host computer system's original control logic;
The operations include the following:
1) The user extends hand A near the body, palm flat, with the palm or the back of the hand facing up; hand A's palm or back of the hand indicates that the user is starting to operate a two-dimensional plane object;
2) The user extends the other hand, hand B, to carry out specific operations on the two-dimensional plane object, including the following cases:
Hand B extends one finger and, on hand A's palm or back of the hand, performs single-finger click or double-click operations on the two-dimensional object; the proximal gesture recognition system maps these to click and double-click commands on the two-dimensional plane object;
Hand B extends two fingers and, on hand A's palm or back of the hand, performs two-finger spread or pinch operations; the proximal gesture recognition system maps these to zoom-in and zoom-out commands on the two-dimensional plane object;
Hand B extends a certain finger B1 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a scroll command on the two-dimensional plane object;
Hand B extends a certain finger B2 and slides it on hand A's palm or back of the hand; the proximal gesture recognition system maps this to a drag command on the two-dimensional plane object;
Hand B extends a certain finger and taps a certain finger of hand A; the proximal gesture recognition system maps this to a shortcut command on the two-dimensional plane object or three-dimensional object;
Hand B extends a certain finger and, through the shortcut command of tapping a certain finger of hand A, starts handwriting recognition mode; the user then writes English letters or Chinese characters with hand B on hand A's palm or back of the hand, and the proximal gesture recognition system maps the writing to English-letter or Chinese-character input;
The one-hand gesture operation scenario comprises a voice or brain-wave control device with its host computer system, a gesture-capture camera and a proximal gesture recognition system; the voice or brain-wave control device collects the user's voice or brain-wave information in real time and recognizes the user's voice or brain-wave signal; the gesture-capture camera captures the user's gestures and is connected to the proximal gesture recognition system; the host computer system of the voice or brain-wave control device is connected to the proximal gesture recognition system; after recognizing the user's proximal gesture operation, the proximal gesture recognition system interacts with the host computer system of the voice or brain-wave control device and drives the host computer system's original control logic; the scenario specifically comprises the following steps:
First, the user extends one hand, hand A, near the body; the posture taken by hand A indicates that the user is starting to operate the object;
Second, the gesture-capture camera passes the gesture to the proximal gesture recognition system; when the proximal gesture recognition system judges that hand A's gesture is the "start operating" gesture, it interacts with the host computer system of the voice or brain-wave control device; only then does the voice or brain-wave control device start analyzing the user's voice or brain-wave signal, and according to the analyzed signal it sends the corresponding control signal to other devices.
2. The gesture control method according to claim 1, characterized in that in the two-hand gesture operation scenario hand B uses the standard operating gestures of Microsoft's HoloLens holographic glasses, and the proximal gesture recognition system maps them to internal commands for execution.
3. The gesture control method according to claim 1, characterized in that the posture taken by hand A is not limited to a clenched fist or a flat palm.
4. The gesture control method according to claim 1, characterized in that the computer system of the proximal gesture recognition system can be unified with the computer system of the main device.
CN201710845960.1A 2017-09-14 2017-09-14 Gesture control method Pending CN107390881A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710845960.1A CN107390881A (en) 2017-09-14 2017-09-14 Gesture control method


Publications (1)

Publication Number Publication Date
CN107390881A true CN107390881A (en) 2017-11-24

Family

ID=60350971

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710845960.1A Pending CN107390881A (en) Gesture control method

Country Status (1)

Country Link
CN (1) CN107390881A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109032358A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 The control method and device of AR interaction dummy model based on gesture identification
CN109725722A (en) * 2018-12-29 2019-05-07 百度在线网络技术(北京)有限公司 There are the gestural control method and device of screen equipment
CN109933194A (en) * 2019-03-05 2019-06-25 郑州万特电气股份有限公司 To the exchange method of virtual target object in a kind of mixed reality environment
CN112750437A (en) * 2021-01-04 2021-05-04 欧普照明股份有限公司 Control method, control device and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102221891A (en) * 2011-07-13 2011-10-19 广州视源电子科技有限公司 Method and system for realizing optical image gesture recognition
CN103226443A (en) * 2013-04-02 2013-07-31 百度在线网络技术(北京)有限公司 Method and device for controlling intelligent glasses and intelligent glasses
CN103257713A (en) * 2013-05-31 2013-08-21 华南理工大学 Gesture control method
US20140201684A1 (en) * 2013-01-15 2014-07-17 Leap Motion, Inc. Dynamic user interactions for display control and manipulation of display objects
CN106959746A * 2016-01-12 2017-07-18 百度在线网络技术(北京)有限公司 Method and device for processing speech data

Similar Documents

Publication Publication Date Title
CN111680594B (en) Gesture recognition-based augmented reality interaction method
Wexelblat An approach to natural gesture in virtual environments
Murthy et al. A review of vision based hand gestures recognition
CN102854983B A kind of man-machine interaction method based on gesture recognition
TWI411935B (en) System and method for generating control instruction by identifying user posture captured by image pickup device
CN108845668B (en) Man-machine interaction system and method
CN107390881A (en) A kind of gestural control method
CN106569613A (en) Multi-modal man-machine interaction system and control method thereof
US20130335318A1 (en) Method and apparatus for doing hand and face gesture recognition using 3d sensors and hardware non-linear classifiers
CN104063039A (en) Human-computer interaction method of wearable computer intelligent terminal
CN101510121A (en) Interface roaming operation method and apparatus based on gesture identification
Geer Will gesture recognition technology point the way?
CN114821753B (en) Eye movement interaction system based on visual image information
Nooruddin et al. HGR: Hand-gesture-recognition based text input method for AR/VR wearable devices
Baig et al. Qualitative analysis of a multimodal interface system using speech/gesture
CN111860086A (en) Gesture recognition method, device and system based on deep neural network
Soroni et al. Hand Gesture Based Virtual Blackboard Using Webcam
CN109753154A Gesture control method and device for a screen-equipped device
Chaudhary Finger-stylus for non touch-enable systems
CN113807280A (en) Kinect-based virtual ship cabin system and method
Rustagi et al. Virtual Control Using Hand-Tracking
JPH09237151A (en) Graphical user interface
CN113961067A (en) Non-contact graffiti drawing method and recognition interaction system based on deep learning
Morajkar et al. Hand gesture and voice-controlled mouse for physically challenged using computer vision
Annachhatre et al. Virtual Mouse Using Hand Gesture Recognition-A Systematic Literature Review

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200924

Address after: Building 1, No.8, Jinye 2nd Road, Yanta District, Xi'an City, Shaanxi Province

Applicant after: Qiang Chunjuan

Address before: Room 12332, Unit 1, Building 1, Garden Plaza (Phase 2), No. 11 Tangyan Road, High-tech Zone, Xi'an, Shaanxi Province, 710065

Applicant before: XI'AN LINGXUN ZHUOYUE INFORMATION TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20171124