CN110442233A - Augmented reality keyboard-and-mouse system based on gesture interaction - Google Patents

Augmented reality keyboard-and-mouse system based on gesture interaction

Info

Publication number
CN110442233A
Authority
CN
China
Prior art keywords
module
augmented reality
mems
emg
gesture
Prior art date
Legal status
Granted
Application number
CN201910524900.9A
Other languages
Chinese (zh)
Other versions
CN110442233B (en)
Inventor
印二威
谢良
秦伟
鹿迎
邓宝松
闫野
Current Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Original Assignee
Tianjin (binhai) Intelligence Military-Civil Integration Innovation Center
National Defense Technology Innovation Institute PLA Academy of Military Science
Priority date
Filing date
Publication date
Application filed by Tianjin (Binhai) Intelligence Military-Civil Integration Innovation Center and National Defense Technology Innovation Institute, PLA Academy of Military Science
Priority to CN201910524900.9A
Publication of CN110442233A
Application granted
Publication of CN110442233B
Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015: Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Dermatology (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present invention provides an augmented reality keyboard-and-mouse system based on gesture interaction, comprising: a MEMS module, an EMG module, an infrared light module, an electrotactile stimulation module, an auditory feedback module, a depth image acquisition module, a vision enhancement module, and an integrated processing module. The integrated processing module is separately connected to the MEMS module, EMG module, infrared light module, electrotactile stimulation module, auditory feedback module, depth image acquisition module, and vision enhancement module; the auditory feedback module, depth image acquisition module, and vision enhancement module are mounted on the augmented reality glasses. Compared with a conventional keyboard and mouse, the present invention enables more natural human-computer interaction, offers a stronger sense of reality, and has a low manufacturing cost and a short production cycle. It is simple and light to wear, highly versatile, and suitable for a variety of application scenarios. The interaction method, which combines multimodal MEMS and EMG gesture recognition with tactile and auditory feedback, is more stable than single-modality interaction and gives a better user experience.

Description

Augmented reality keyboard-and-mouse system based on gesture interaction
Technical field
The present invention relates generally to the field of human-computer interaction, and in particular to an augmented reality keyboard and three-dimensional mouse human-machine interface system based on gesture interaction.
Background technique
Augmented reality (AR) is an interactive experience of the real world in which the objects residing in the real world are "enhanced" by computer-generated perceptual information such as vision, hearing, and touch. The superimposed sensory information can supplement the natural environment or mask it, and is seamlessly interwoven with the physical world so that the user gains an immersive sense of interaction. Augmented reality serves to enhance the natural environment or situation and to provide a perceptually rich experience. Through augmented reality, the information of the real world around the user becomes interactive and digitally manipulable.
In recent years, augmented reality has developed very rapidly, gradually attracting the attention of more and more researchers and high-tech companies, and has shown enormous application prospects in fields such as medicine, education, gaming, and the military. In the near future, augmented reality is also expected to become the next generation of mainstream human-computer interaction devices, replacing the PC and the smartphone.
Since users are usually in motion and generally cannot carry bulky interaction-control hardware, realising keyboard input and 3D mouse control in augmented reality is very challenging. Traditional keyboards and mice clearly cannot meet the needs of mixed real-virtual interaction, so a novel wearable control system with keyboard and mouse functions needs to be developed to achieve more natural human-machine cooperation.
At present, the main interaction means used by augmented reality systems are gesture interaction based on depth image information and voice interaction. However, image-based methods are easily disturbed by natural light and are therefore prone to failure outdoors. Moreover, because the hand moves over a large range in space, the fingers can occlude and interfere with one another, and the surface of the hand lacks highly distinguishable features, so image-based dynamic hand-tracking systems have low accuracy and are prone to ambiguity. Voice-based interaction methods place harsh demands on the background noise of the environment and on the cloud-computing network; in crowded public places or under unstable network conditions, recognition accuracy drops considerably. In recent years, MEMS (Micro-Electro-Mechanical System) acceleration sensors and EMG (electromyography) sensors, by virtue of their small size, light weight, low power consumption, high reliability, high sensitivity, and high integration, have gradually taken over the sensor market and attracted increasing attention in the field of human-computer interaction.
Summary of the invention
The objects of the present invention are achieved through the following technical solutions.
In response to the problems above, the present invention proposes an augmented reality keyboard-and-mouse system based on gesture interaction. The system effectively fuses MEMS and EMG gesture-interaction information, assisted by infrared light marker points, and uses tactile and auditory feedback to provide a sense of interactive immersion, thereby improving interaction speed and user experience in an augmented reality environment. The present invention aims to realise natural and operable control of augmented reality, and constitutes an entirely new keyboard-and-mouse control approach.
Specifically, the present invention provides an augmented reality keyboard-and-mouse system based on gesture interaction, for use together with augmented reality glasses, comprising: a MEMS module, an EMG module, an infrared light module, an electrotactile stimulation module, an auditory feedback module, a depth image acquisition module, a vision enhancement module, and an integrated processing module. The integrated processing module is separately connected to the MEMS module, EMG module, infrared light module, electrotactile stimulation module, auditory feedback module, depth image acquisition module, and vision enhancement module; the auditory feedback module, depth image acquisition module, and vision enhancement module are mounted on the augmented reality glasses.
Preferably, the MEMS module consists of six 9-axis acceleration sensors for each hand, placed on the first or second phalanx of each of the five fingers and at the center of the back of the hand. The MEMS module acquires the user's hand-motion information and sends the 9-axis accelerometer signals to the integrated processing module.
Preferably, the EMG module consists of multiple pairs of differential electromyography electrodes placed around the middle of the user's forearm. The EMG module acquires the user's muscle-activity information and sends the electromyography signals to the integrated processing module.
Preferably, the infrared light module consists of LED lamps located at the center of the back of the hand and on the index finger, is powered by the integrated processing module, and is used to generate a virtual laser beam.
Preferably, the electrotactile stimulation module consists of electrotactile stimulation electrode pads located at the fingertips of both hands, is controlled by the integrated processing module, and is used to generate tactile feedback for keyboard and mouse key presses.
Preferably, the auditory feedback module consists of loudspeakers located in both temples of the augmented reality glasses, is controlled by the integrated processing module, and provides auditory feedback for keyboard and mouse key presses.
Preferably, the depth image acquisition module consists of a binocular camera located at the nose bridge of the augmented reality glasses, and is used to acquire the depth map of the two light points emitted by the infrared light module and thereby solve for the pointing direction of the virtual laser beam.
Preferably, the vision enhancement module presents the keyboard input content and the 3D mouse manipulation results, providing visual feedback to the user.
Preferably, the integrated processing module receives the multi-channel signals of the MEMS module and the EMG module, performs real-time signal processing and recognition, and feeds the recognition results back to the user through the electrotactile stimulation module, the auditory feedback module, and the vision enhancement module.
Preferably, the integrated processing module is connected to the MEMS module, EMG module, infrared light module, and electrotactile stimulation module with a ductile conductive material, and is connected wirelessly to the auditory feedback module, depth image acquisition module, and vision enhancement module.
Preferably, the processing of the multi-channel signals of the MEMS module and the EMG module in the integrated processing module is as follows:
(1) synchronously acquire the MEMS signals and the EMG signals;
(2) filter the MEMS signals and the EMG signals;
(3) feed the filtered MEMS and EMG signals into a timing convolutional neural network trained on a keyboard-and-mouse manipulation gesture sample database, which outputs the gesture classification result and a recognition score;
(4) judge whether the recognition score exceeds a threshold: if yes, go to step (5); if no, return to step (1);
(5) output the gesture recognition instruction and save the MEMS and EMG data;
(6) label the saved MEMS and EMG data according to the user's keyboard-input and mouse-key control behaviour to generate a user-specific sample set;
(7) import the user-specific sample set into the keyboard-and-mouse manipulation gesture sample database and retrain the timing convolutional neural network.
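As an illustration only, steps (3) to (5) can be sketched as a small temporal convolution with a score gate. Nothing below comes from the patent: the channel counts (twelve 9-axis sensors plus a hypothetical 8 EMG channels), the kernel size, the 0.7 score threshold, and the random stand-in weights are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical channel layout: 12 x 9-axis MEMS sensors = 108 channels,
# plus 8 assumed EMG channels; a window of 64 synchronized samples.
N_CH, T, N_GESTURES = 116, 64, 8
SCORE_THRESHOLD = 0.7  # illustrative confidence gate for step (4)

# Random stand-ins for the trained temporal-CNN parameters.
kernel = rng.standard_normal((N_GESTURES, N_CH, 5)) * 0.1
bias = np.zeros(N_GESTURES)

def classify(window: np.ndarray):
    """Temporal convolution over the channel stack, global average
    pooling, then softmax; returns (gesture class, recognition score)."""
    t_out = window.shape[1] - kernel.shape[2] + 1
    feat = np.zeros((N_GESTURES, t_out))
    for g in range(N_GESTURES):
        for t in range(t_out):
            feat[g, t] = np.sum(kernel[g] * window[:, t:t + 5]) + bias[g]
    logits = feat.mean(axis=1)           # global average pooling over time
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return int(p.argmax()), float(p.max())

window = rng.standard_normal((N_CH, T))
gesture, score = classify(window)
if score > SCORE_THRESHOLD:
    print("emit gesture instruction", gesture)  # step (5)
else:
    print("discard window, keep acquiring")      # back to step (1)
```

Steps (6) and (7) would then append the gated windows, labelled with the observed key or mouse action, to the sample database before retraining.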
Preferably, the working process of the integrated processing module is as follows:
(1) synchronously acquire and recognise the acceleration signals and electromyography signals based on the MEMS module and the EMG module;
(2) detect whether the gesture includes a synchronous lift-and-press action of both hands, and judge whether to enter the system control state: if yes, go to step (3); if no, return to step (1);
(3) apply one tactile feedback pulse to the fingertips of both hands through the electrotactile stimulation module, recognise whether the two-handed gesture includes a one-handed half-fist gesture, and judge whether to enter the two-handed keyboard input state: if yes, go to step (4); if no, go to step (9);
(4) recognise the moving finger and its direction of motion, and output the classification information and recognition score;
(5) judge whether there is a key press: if yes, go to step (6); if no, return to step (4);
(6) apply keyboard-tap tactile and auditory feedback through the electrotactile stimulation module and the auditory feedback module, and display the output text in the vision enhancement module;
(7) judge from the gesture recognition result whether the thumb presses a key: if yes, go to step (8); if no, recommend the content to be entered in order of probability based on the input-method code table, and return to step (4);
(8) confirm and display the output text in the vision enhancement module, and judge whether the input is finished: if yes, end the current operation; if no, return to step (4);
(9) identify the keyboard hand and the mouse hand from the gesture detection result, switch to the keyboard-and-mouse state, and take over the keyboard and mouse functions;
(10) judge whether the current hand is the keyboard hand: if yes, further judge whether there is a key press; if there is a key press, send the control instruction, synchronously apply keyboard-tap tactile and auditory feedback based on the electrotactile stimulation module and the auditory feedback module, and go to step (13); if no, go to step (11);
(11) judge whether the virtual light is selected: if yes, calculate the pointing direction of the line connecting the light points based on the depth image acquisition module, and display the real-time virtual laser beam direction in the vision enhancement module; if no, calculate the three-dimensional displacement change based on the MEMS signals, and display the real-time three-dimensional coordinate position in the vision enhancement module;
(12) judge whether there is a mouse key press: if yes, send the control instruction, apply mouse-key tactile and auditory feedback, and go to step (13);
(13) output the keyboard-and-mouse cooperative control command.
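The candidate recommendation in step (7) can be sketched as a ranked lookup. The code table, its entries, and the probabilities below are hypothetical placeholders; the patent does not specify the input method or its statistics.

```python
# Hypothetical code table mapping a key sequence to (candidate, probability);
# a real input method would use a much larger, user-adapted table.
CODE_TABLE = {
    "ni": [("你", 0.62), ("尼", 0.21), ("泥", 0.09), ("逆", 0.08)],
}

def recommend(keys: str, top_k: int = 3) -> list:
    """Step (7): rank the candidate entries for the typed key sequence
    by descending probability and return the top-k for display."""
    candidates = CODE_TABLE.get(keys, [])
    return [word for word, _ in sorted(candidates, key=lambda wp: -wp[1])[:top_k]]

print(recommend("ni"))  # → ['你', '尼', '泥']
```

A thumb key press (step (8)) would then confirm the highlighted candidate and commit it to the output text.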
The present invention has the following advantages. Compared with a conventional keyboard and mouse, the augmented reality keyboard-and-mouse system based on gesture interaction achieves more natural human-computer interaction, has a stronger sense of reality, a low manufacturing cost, and a short production cycle. The system is simple and light to wear, highly versatile, and suitable for a variety of application scenarios. The interaction method, which combines multimodal MEMS and EMG gesture recognition with tactile and auditory feedback, is more stable than single-modality interaction and gives a better user experience.
Brief description of the drawings
By reading the following detailed description of the preferred embodiments, various other advantages and benefits will become clear to those of ordinary skill in the art. The drawings are only for the purpose of illustrating the preferred embodiments and are not to be considered a limitation of the present invention. Throughout the drawings, the same reference numbers refer to the same parts. In the drawings:
Fig. 1 is the hardware composition diagram of the system of the present invention;
Fig. 2 is the module composition diagram of the system of the present invention;
Fig. 3 is the signal processing and recognition flow chart of the gesture recognition process of the present invention;
Fig. 4 is the flow chart of the system of the present invention in a concrete application process.
Detailed description of the embodiments
Illustrative embodiments of the present disclosure are described more fully below with reference to the accompanying drawings. Although the drawings show illustrative embodiments of the present disclosure, it should be understood that the present disclosure may be realised in various forms and should not be limited by the embodiments set forth here. On the contrary, these embodiments are provided so that the present disclosure will be thoroughly understood and its scope fully conveyed to those skilled in the art.
In view of the technical problems in the prior art, the present invention provides an augmented reality keyboard-and-mouse system based on gesture interaction that is simple in structure and easy to operate, and that can improve manipulation speed, reliability, and accuracy.
Fig. 1 is the hardware composition diagram of the system of the present invention, in which 1 is an electrotactile stimulation electrode, 2 is a MEMS motion sensor, 3 is the integrated processing module, 4 is an electromyography electrode pad, 5 is an LED infrared lamp, 6 is a binocular depth camera, 7 is an augmented reality display lens, and 8 is a loudspeaker.
Fig. 2 is the module composition diagram of the system of the present invention. An augmented reality keyboard-and-mouse system based on gesture interaction includes a MEMS module (the MEMS motion sensors 2 in Fig. 1), an EMG module (the electromyography electrode pads 4 in Fig. 1), an infrared light module (the LED infrared lamps 5 in Fig. 1), an electrotactile stimulation module (the electrotactile stimulation electrodes 1 in Fig. 1), an auditory feedback module (the loudspeakers 8 in Fig. 1), a depth image acquisition module (the binocular depth camera 6 in Fig. 1), a vision enhancement module (the augmented reality display lenses 7 in Fig. 1), and an integrated processing module (the integrated processing module 3 in Fig. 1). The integrated processing module is separately connected to the MEMS module, EMG module, infrared light module, electrotactile stimulation module, auditory feedback module, depth image acquisition module, and vision enhancement module.
The MEMS module consists of six 9-axis acceleration sensors for each hand, placed on the first or second phalanx of each of the five fingers and at the center of the back of the hand, twelve acceleration sensors in total for both hands. The MEMS module acquires the user's hand-motion information and sends the 9-axis accelerometer signals to the integrated processing module.
The EMG module consists of 6-10 differential electromyography electrodes placed around the middle of the user's forearm. The EMG module acquires the user's muscle-activity information and sends the electromyography signals to the integrated processing module.
The infrared light module consists of LED lamps located at the center of the back of the hand and on the index finger, is powered by the integrated processing module, and is used to generate a virtual laser beam.
The electrotactile stimulation module consists of electrotactile stimulation electrode pads located at the fingertips of both hands, is controlled by the integrated processing module, and is used to generate tactile feedback for keyboard and mouse key presses.
The auditory feedback module consists of loudspeakers located in both temple arms of the augmented reality glasses, is controlled by the integrated processing module, and provides auditory feedback for keyboard and mouse key presses.
The depth image acquisition module consists of a binocular camera located at the nose bridge of the augmented reality glasses, and is used to acquire the depth map of the two light points emitted by the infrared light module and thereby solve for the pointing direction of the virtual laser beam.
The vision enhancement module presents the keyboard input content and the 3D mouse manipulation results, providing visual feedback to the user.
The integrated processing module receives the multi-channel signals of the MEMS module and the EMG module, performs real-time signal processing and recognition, and feeds the recognition results back to the user through the electrotactile stimulation module, the auditory feedback module, and the vision enhancement module. The integrated processing module is connected to the MEMS module, EMG module, infrared light module, and electrotactile stimulation module with ductile conductive materials such as liquid metal, and is connected to the auditory feedback module, depth image acquisition module, and vision enhancement module by wireless means such as WiFi/Bluetooth.
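Solving the beam direction from the two light points can be sketched as rectified-stereo triangulation followed by a difference of the two reconstructed 3D points. The camera intrinsics below (focal length, baseline, principal point) are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Assumed stereo parameters: focal length in pixels, baseline in metres,
# principal point in pixels. None of these come from the patent.
FX, BASELINE, CX, CY = 700.0, 0.06, 320.0, 240.0

def triangulate(xl: float, yl: float, xr: float) -> np.ndarray:
    """Rectified stereo: disparity gives depth, then back-project the
    left-image pixel into camera coordinates (x, y, z)."""
    disparity = xl - xr
    z = FX * BASELINE / disparity
    x = (xl - CX) * z / FX
    y = (yl - CY) * z / FX
    return np.array([x, y, z])

def beam_direction(back_of_hand_px, index_px) -> np.ndarray:
    """Unit vector along the virtual laser beam, from the back-of-hand
    LED point towards the index-finger LED point."""
    p0 = triangulate(*back_of_hand_px)
    p1 = triangulate(*index_px)
    d = p1 - p0
    return d / np.linalg.norm(d)

# Usage: each detected IR blob given as (x_left, y_left, x_right).
direction = beam_direction((340.0, 250.0, 300.0), (360.0, 230.0, 310.0))
print(direction)
```

In the system this direction vector would be intersected with the virtual interface plane to place the 3D mouse cursor.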
As shown in Fig. 3, the processing steps for the multi-channel signals of the MEMS module and the EMG module in the integrated processing module are as follows:
(1) the system starts running, and the MEMS signals and EMG signals are acquired synchronously;
(2) the MEMS and EMG multi-channel signals are passed through a 50 Hz Chebyshev type I IIR notch filter; the MEMS signals are then band-pass filtered with a 0.1-30 Hz Chebyshev type I IIR filter, and the EMG signals with a 0.1-70 Hz Chebyshev type I IIR filter;
(3) the filtered MEMS and EMG signals are fed into the timing convolutional neural network trained on the keyboard-and-mouse manipulation gesture sample database, which outputs the gesture classification result and a recognition score;
(4) whether the recognition score exceeds a threshold is judged: if yes, go to step (5); if no, return to step (1);
(5) the gesture recognition instruction is output, and the data are saved to extend the sample database;
(6) the saved MEMS and EMG data are labelled according to the user's keyboard-input and mouse-key control behaviour to generate a new user-specific sample set;
(7) the new sample set is imported into the keyboard-and-mouse manipulation gesture sample database, and the timing convolutional neural network is retrained.
As shown in Fig. 4, the control logic of the integrated processing module proceeds as follows:
(1) the system starts running, and synchronous acquisition and gesture recognition of the acceleration signals and electromyography signals are carried out based on the MEMS module and the EMG module;
(2) whether the gesture includes a synchronous lift-and-press action of both hands is detected, to judge whether to enter the system control state: if yes, go to step (3); if no, return to step (1);
(3) one tactile feedback pulse is applied to the fingertips of both hands through the electrotactile stimulation module, whether the two-handed gesture includes a one-handed half-fist gesture is recognised, and whether to enter the two-handed keyboard input state is judged: if yes, go to step (4); if no, go to step (9);
(4) using the timing convolutional neural network method, the moving finger and its direction of motion are recognised, and the classification information and recognition score are output;
(5) whether there is a key press is judged: if yes, go to step (6); if no, return to step (4);
(6) the system applies keyboard-tap tactile and auditory feedback through the electrotactile stimulation module and the auditory feedback module, and displays the output text in the vision enhancement module;
(7) whether the thumb presses a key is judged from the gesture recognition result: if yes, go to step (8); if no, the content to be entered is recommended in order of probability based on the input-method code table, and the process returns to step (4);
(8) the output text is confirmed and displayed in the vision enhancement module, and whether the input is finished is judged: if yes, the current operation ends; if no, return to step (4);
(9) the keyboard hand and the mouse hand are identified from the gesture detection result, the system switches to the keyboard-and-mouse state and takes over the keyboard and mouse functions;
(10) whether the current hand is the keyboard hand is judged: if yes, whether there is a key press is further judged, and if there is a key press, the system sends the control instruction and synchronously applies keyboard-tap tactile and auditory feedback based on the electrotactile stimulation module and the auditory feedback module; if no, go to step (11);
(11) whether the virtual light is selected is judged: if yes, the pointing direction of the line connecting the light points is calculated based on the binocular camera, and the real-time virtual laser beam direction is displayed in the vision enhancement module; if no, the three-dimensional displacement change is calculated based on the MEMS signals, and the real-time three-dimensional coordinate position is displayed in the vision enhancement module;
(12) whether there is a mouse key press is judged: if yes, the control instruction is sent and mouse-key tactile and auditory feedback is applied;
(13) the keyboard-and-mouse cooperative control command is output;
(14) end.
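The mode switching in the Fig. 4 control logic can be sketched as a small state machine. The state names, event labels, and transition table below are illustrative labels for the gesture detections, not terms defined in the patent.

```python
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()       # step (1): synchronous MEMS/EMG acquisition
    ARMED = auto()      # step (2) passed: system control state entered
    KEYBOARD = auto()   # steps (4)-(8): two-handed keyboard input
    KEY_MOUSE = auto()  # steps (9)-(13): keyboard hand plus 3D mouse hand

# Transition table for the Fig. 4 flow; event names are hypothetical.
TRANSITIONS = {
    (Mode.IDLE, "both_hands_sync_press"): Mode.ARMED,
    (Mode.ARMED, "keyboard_gesture"): Mode.KEYBOARD,   # step (3): yes branch
    (Mode.ARMED, "key_mouse_gesture"): Mode.KEY_MOUSE, # step (3): no branch
    (Mode.KEYBOARD, "input_finished"): Mode.IDLE,      # step (8): end input
    (Mode.KEY_MOUSE, "input_finished"): Mode.IDLE,     # step (14): end
}

def step(mode: Mode, event: str) -> Mode:
    """Apply one detected event; unknown events leave the mode unchanged,
    matching the 'return to the previous step' branches of Fig. 4."""
    return TRANSITIONS.get((mode, event), mode)

mode = Mode.IDLE
for event in ["both_hands_sync_press", "key_mouse_gesture", "input_finished"]:
    mode = step(mode, event)
    print(mode)
```

Within `KEYBOARD` and `KEY_MOUSE`, the per-step key-press and pointing judgements would run as inner loops, firing the tactile and auditory feedback described above.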
Compared with a conventional keyboard and mouse, the augmented reality keyboard-and-mouse system based on gesture interaction of the present invention achieves more natural human-computer interaction, has a stronger sense of reality, a low manufacturing cost, and a short production cycle. The system is simple and light to wear, highly versatile, and suitable for a variety of application scenarios. The interaction method, which combines multimodal MEMS and EMG gesture recognition with tactile and auditory feedback, is more stable than single-modality interaction and gives a better user experience.
The above are only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any changes or substitutions that can readily be conceived by anyone skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. An augmented reality keyboard-and-mouse system based on gesture interaction, for use together with augmented reality glasses, characterised by comprising:
a MEMS module, an EMG module, an infrared light module, an electrotactile stimulation module, an auditory feedback module, a depth image acquisition module, a vision enhancement module, and an integrated processing module;
wherein the integrated processing module is separately connected to the MEMS module, EMG module, infrared light module, electrotactile stimulation module, auditory feedback module, depth image acquisition module, and vision enhancement module;
and the auditory feedback module, depth image acquisition module, and vision enhancement module are mounted on the augmented reality glasses.
2. a kind of augmented reality key mouse system based on gesture interaction according to claim 1, which is characterized in that
The MEMS module is made of each 69 axle acceleration sensing gauges of right-hand man, is individually positioned in 5 fingers the 1st or 2 finger joints With the back of the hand center, the MEMS module is used to acquire the hand exercise information of user, and by 9 axle acceleration sensing gauge signals It is sent to integrated treatment module.
3. a kind of augmented reality key mouse system based on gesture interaction according to claim 1, which is characterized in that
The EMG module is made of multipair difference electromyographic electrode, is placed in user's forearm middle position, the EMG module is for adopting Collect the muscle activity information of user, and electromyography signal is sent to integrated treatment module.
4. a kind of augmented reality key mouse system based on gesture interaction according to claim 1, which is characterized in that
The infrared optical module is made of the LED light being located on the back of the hand center and index finger, is powered by integrated treatment module, is used In generation virtual laser beam.
5. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 1, characterized in that:
the electrotactile stimulation module consists of electrotactile stimulation electrode pads located at the fingertips of both hands, is controlled by the integrated processing module, and is used to generate tactile feedback for keyboard and mouse button presses.
6. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 1, characterized in that:
the auditory feedback module consists of loudspeakers located in the two temples of the augmented reality glasses, is controlled by the integrated processing module, and is used to provide auditory feedback for keyboard and mouse button presses.
7. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 1, characterized in that:
the depth image acquisition module consists of binocular cameras located at the nose bridge of the augmented reality glasses, and is used to acquire a depth map of the two light points emitted by the infrared optical module, from which the pointing direction of the virtual laser beam is solved.
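Claim 7 recovers the beam direction by depth-mapping the two LED light points and taking the line through them. A minimal sketch under standard assumptions (a rectified pinhole stereo pair with focal length in pixels and a known baseline; all numbers are illustrative and not from the patent):

```python
# Sketch: triangulate each infrared light point from the binocular pair,
# then take the unit vector from the back-of-hand LED through the
# index-finger LED as the virtual laser beam direction.
import math


def triangulate(f_px, baseline_m, xl, xr, y):
    """Rectified pinhole stereo: depth from disparity, then back-project."""
    disparity = xl - xr          # pixel disparity between left/right views
    z = f_px * baseline_m / disparity
    return (xl * z / f_px, y * z / f_px, z)


def beam_direction(hand_pt, finger_pt):
    """Unit vector from the back-of-hand LED toward the index-finger LED."""
    d = [f - h for h, f in zip(hand_pt, finger_pt)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0:
        raise ValueError("light points coincide; direction undefined")
    return [c / norm for c in d]


# Illustrative pixel coordinates: the fingertip LED is slightly farther
# from the glasses (smaller disparity) than the back-of-hand LED.
hand = triangulate(f_px=700.0, baseline_m=0.06, xl=70.0, xr=0.0, y=0.0)
finger = triangulate(f_px=700.0, baseline_m=0.06, xl=30.0, xr=-30.0, y=0.0)
print([round(c, 3) for c in beam_direction(hand, finger)])  # [-0.287, 0.0, 0.958]
```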
8. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 1, characterized in that:
the visual enhancement module is used to render keyboard input content and 3D mouse manipulation results, so as to provide visual feedback to the user.
9. The augmented reality keyboard-and-mouse system based on gesture interaction according to any one of claims 1 to 8, characterized in that:
the integrated processing module is used to receive the multi-channel signals of the MEMS module and the EMG module, to perform real-time signal processing and recognition, and to feed the recognition results back to the user through the electrotactile stimulation module, the auditory feedback module, and the visual enhancement module.
10. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 9, characterized in that:
the integrated processing module is connected to the MEMS module, the EMG module, the infrared optical module, and the electrotactile stimulation module by stretchable conductive material, and is connected wirelessly to the auditory feedback module, the depth image acquisition module, and the visual enhancement module.
11. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 9, characterized in that:
the processing of the multi-channel signals of the MEMS module and the EMG module in the integrated processing module is as follows:
(1) synchronously acquiring the MEMS signals and the EMG signals;
(2) filtering the MEMS signals and the EMG signals;
(3) feeding the filtered MEMS and EMG signals into a temporal convolutional neural network trained on a keyboard-and-mouse manipulation gesture sample database, which outputs a gesture classification result and a recognition score;
(4) judging whether the recognition score exceeds a threshold: if the judgment result is yes, proceeding to step (5); if the judgment result is no, returning to step (1);
(5) outputting the gesture recognition instruction, and saving the MEMS and EMG data;
(6) labeling the saved MEMS and EMG data according to the user's control behavior for keyboard input and mouse buttons, generating a user-specific sample set;
(7) importing the user-specific sample set into the keyboard-and-mouse manipulation gesture sample database, and retraining the temporal convolutional neural network.
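The loop in steps (1) to (4) can be sketched in miniature. In the stand-in below, a moving-average filter replaces the unspecified filtering of step (2), and a nearest-template scorer replaces the temporal convolutional network of step (3); both substitutions, together with all names and the 0.6 threshold, are illustrative assumptions, since the claim fixes only the filter, classify, and threshold-gate structure.

```python
# Stand-in pipeline for claim 11, steps (2)-(4): filter -> score -> gate.
import math


def moving_average(signal, k=3):
    """Simple low-pass stand-in for the claim's filtering step."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out


def classify(window, templates):
    """Return (best_label, confidence) via distance-weighted template match."""
    dists = {
        label: math.sqrt(sum((a - b) ** 2 for a, b in zip(window, tpl)))
        for label, tpl in templates.items()
    }
    # Convert distances to a normalized confidence in (0, 1].
    weights = {label: math.exp(-d) for label, d in dists.items()}
    best = max(weights, key=weights.get)
    return best, weights[best] / sum(weights.values())


def recognize(window, templates, threshold=0.6):
    """Filter, classify, and gate on the recognition score (step 4).

    Returns the gesture label, or None to signal re-acquisition (step 1).
    """
    filtered = moving_average(window)
    label, score = classify(filtered, templates)
    return label if score >= threshold else None


templates = {"key_press": [0.0, 1.0, 0.0, -1.0], "idle": [0.0, 0.0, 0.0, 0.0]}
print(recognize([0.0, 2.0, 0.0, -2.0], templates))  # key_press
```

The user-specific retraining of steps (5) to (7) would correspond to appending gated, user-labeled windows to `templates` (or, in the claimed system, to the gesture sample database) before refitting the model.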
12. The augmented reality keyboard-and-mouse system based on gesture interaction according to claim 9, characterized in that:
the working process of the integrated processing module is as follows:
(1) performing synchronous acquisition of the acceleration signals and electromyographic signals, and gesture recognition, based on the MEMS module and the EMG module;
(2) detecting whether the gesture contains a synchronized lifting or pressing action of both hands, to judge whether to enter the system control state: if the judgment result is yes, proceeding to step (3); if the judgment result is no, returning to step (1);
(3) applying one tactile feedback pulse to the fingertips of both hands through the electrotactile stimulation module, and recognizing whether the two-hand gesture contains a one-hand half-fist action, to judge whether to enter the two-hand keyboard input state: if the judgment result is yes, proceeding to step (4); if the judgment result is no, proceeding to step (9);
(4) identifying the moving finger and its direction of motion, and outputting the recognition classification information and recognition score;
(5) judging whether there is a key press action: if the judgment result is yes, proceeding to step (6); if the judgment result is no, returning to step (4);
(6) applying keystroke tactile and auditory feedback through the electrotactile stimulation module and the auditory feedback module, and displaying the output text content in the visual enhancement module;
(7) judging from the gesture recognition result whether the thumb has pressed a key: if the judgment result is yes, proceeding to step (8); if the judgment result is no, recommending the content to be entered in descending order of probability based on the input method code table, and returning to step (4);
(8) confirming and displaying the output text content in the visual enhancement module, and judging whether the input is finished: if the judgment result is yes, ending the current operation; if the judgment result is no, returning to step (4);
(9) identifying the keyboard hand and the mouse hand from the gesture detection result, switching to the keyboard-and-mouse state, and taking over the keyboard and mouse functions;
(10) judging whether the current hand is the keyboard hand: if the judgment result is yes, further judging whether there is a key press action; if there is a key press action, sending a control instruction, applying keystroke tactile and auditory feedback synchronously through the electrotactile stimulation module and the auditory feedback module, and proceeding to step (13); if the judgment result is no, proceeding to step (11);
(11) judging whether the virtual light beam is selected: if the judgment result is yes, computing the pointing direction of the line connecting the light points based on the depth image acquisition module, and displaying the real-time virtual laser beam direction in the visual enhancement module; if the judgment result is no, computing the three-dimensional spatial displacement change based on the MEMS signals, and displaying the real-time three-dimensional coordinate position in the visual enhancement module;
(12) judging whether there is a mouse button press: if the judgment result is yes, sending a control instruction, applying mouse button tactile and auditory feedback, and proceeding to step (13);
(13) outputting the keyboard-and-mouse cooperative control command.
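The branching in steps (1) to (13) amounts to a small state machine, and the sketch below encodes its main transitions. The state and event names are invented labels for the claim's decision points, not identifiers specified anywhere in the patent.

```python
# Illustrative state machine for the working process of claim 12.
def step(state, event):
    """Map (state, event) to the next state; unknown events keep the state."""
    transitions = {
        # (2): both-hands lift/press enters system control.
        ("idle", "both_hands_lift_or_press"): "system_control",
        # (3): one-hand half-fist selects keyboard input, else split hands.
        ("system_control", "one_hand_half_fist"): "keyboard_input",
        ("system_control", "no_half_fist"): "split_key_mouse",
        # (7)/(8): thumb press confirms text; input may end or continue.
        ("keyboard_input", "thumb_confirm"): "text_confirmed",
        ("text_confirmed", "input_finished"): "idle",
        ("text_confirmed", "more_input"): "keyboard_input",
        # (10)/(12): key or mouse-button press emits a control command (13).
        ("split_key_mouse", "key_press"): "command_output",
        ("split_key_mouse", "mouse_button"): "command_output",
        ("command_output", "done"): "idle",
    }
    return transitions.get((state, event), state)


# Walk the nominal typing path: enter control, type, confirm, finish.
s = "idle"
for e in ("both_hands_lift_or_press", "one_hand_half_fist",
          "thumb_confirm", "input_finished"):
    s = step(s, e)
print(s)  # idle
```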
CN201910524900.9A 2019-06-18 2019-06-18 Augmented reality keyboard and mouse system based on gesture interaction Active CN110442233B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910524900.9A CN110442233B (en) 2019-06-18 2019-06-18 Augmented reality keyboard and mouse system based on gesture interaction


Publications (2)

Publication Number Publication Date
CN110442233A (en) 2019-11-12
CN110442233B CN110442233B (en) 2020-12-04

Family

ID=68429126


Country Status (1)

Country Link
CN (1) CN110442233B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111158476A (en) * 2019-12-25 2020-05-15 中国人民解放军军事科学院国防科技创新研究院 Key identification method, system, equipment and storage medium of virtual keyboard
CN113220117A (en) * 2021-04-16 2021-08-06 邬宗秀 Device for human-computer interaction
CN113419622A (en) * 2021-05-25 2021-09-21 西北工业大学 Submarine operation instruction control system interaction method and device based on gesture operation
CN113918013A (en) * 2021-09-28 2022-01-11 天津大学 Gesture directional interaction system and method based on AR glasses
CN114265498A (en) * 2021-12-16 2022-04-01 中国电子科技集团公司第二十八研究所 Method for combining multi-modal gesture recognition and visual feedback mechanism

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294226A (en) * 2013-05-31 2013-09-11 东南大学 Virtual input device and virtual input method
CN103823551A (en) * 2013-03-17 2014-05-28 浙江大学 System and method for realizing multidimensional perception of virtual interaction
US20140198130A1 (en) * 2013-01-15 2014-07-17 Immersion Corporation Augmented reality user interface with haptic feedback
CN107203272A (en) * 2017-06-23 2017-09-26 山东万腾电子科技有限公司 Wearable augmented reality task instruction system and method based on myoelectricity cognition technology
CN107943282A (en) * 2017-11-06 2018-04-20 上海念通智能科技有限公司 A kind of man-machine interactive system and method based on augmented reality and wearable device
CN108460313A (en) * 2017-02-17 2018-08-28 鸿富锦精密工业(深圳)有限公司 A kind of gesture identifying device and human-computer interaction system
CN108829245A (en) * 2018-05-30 2018-11-16 中国人民解放军军事科学院国防科技创新研究院 A kind of virtual sand table intersection control routine based on multi-modal brain-machine interaction technology
CN109453509A (en) * 2018-11-07 2019-03-12 龚映清 It is a kind of based on myoelectricity and motion-captured virtual upper limb control system and its method





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant