CN107203272A - Wearable augmented reality task instruction system and method based on myoelectricity cognition technology - Google Patents

Wearable augmented reality task instruction system and method based on myoelectricity cognition technology

Info

Publication number
CN107203272A
Authority
CN
China
Prior art keywords
augmented reality
myoelectricity
user
action
wearable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710488276.2A
Other languages
Chinese (zh)
Inventor
张镇
梁波
史云飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Wanteng Electronic Technology Co Ltd
Original Assignee
Shandong Wanteng Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Wanteng Electronic Technology Co Ltd filed Critical Shandong Wanteng Electronic Technology Co Ltd
Priority to CN201710488276.2A priority Critical patent/CN107203272A/en
Publication of CN107203272A publication Critical patent/CN107203272A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Abstract

The invention discloses a wearable augmented reality task instruction system and method based on EMG (myoelectric) sensing technology, comprising a wearable augmented reality device module and a wearable EMG sensing device module. The wearable EMG sensing device module includes: a motion capture unit; an action recognition unit; a configuration unit; a first network transmission unit; and a processing control unit. The invention can capture more action information without being affected by environmental noise, laying a foundation for more convenient and accurate interaction, and has broad application prospects in increasingly complex industrial applications.

Description

Wearable augmented reality task instruction system and method based on myoelectricity cognition technology
Technical field
The present invention relates to the field of augmented reality, and in particular to a wearable augmented reality task instruction system and method based on electromyographic (EMG) sensing technology.
Background technology
In the field of industrial production, with the development of Industry 4.0, new task instruction systems continue to emerge; industrial guidance applications based on augmented reality (AR) technology are one of them. Augmented reality superimposes onto the real world, after simulation, data that would otherwise be difficult to observe within a given spatio-temporal range of the real world, so that it can be perceived by humans, achieving a sensory experience beyond reality. A wearable augmented reality device can follow a worker as the worker moves about the plant site and can superimpose virtual instruction information onto the real world, meeting the demand for real-time guidance.
Existing industrial augmented reality applications mainly use two interaction modes: image-based gesture recognition and voice-based command recognition. Image-based gesture recognition captures gesture information with a camera and identifies it; its drawbacks are sensitivity to ambient light and the limited viewing angle of the camera. Voice-based command recognition performs the corresponding operation by capturing the operator's specific voice commands; its main drawback is susceptibility to environmental noise, which is particularly problematic in industrial applications, where many scenarios run under strong ambient noise.
Summary of the invention
The purpose of the present invention is to solve the above problems by providing a wearable augmented reality task instruction system and method based on EMG sensing technology. EMG acquisition electrodes collect the muscle electrical signals produced during user movement, and a model trained for limb-action recognition identifies the current user's action pattern. The EMG-based action recognition technology of the present invention can distinguish different gestures and arm swing amplitudes, and is unaffected by ambient light and noise.
To achieve the above object, the concrete scheme of the present invention is as follows:
A wearable augmented reality task instruction system based on EMG sensing technology, comprising a wearable augmented reality device module and a wearable EMG sensing device module.
The wearable EMG sensing device module includes:
Motion capture unit: captures the EMG signal of the user's arm in real time, encodes it, and sends it to the action recognition unit;
Action recognition unit: receives the EMG signal, analyzes the current user action, and transmits the recognized action event to the processing control unit;
Configuration unit: configures the parameters required at system run time, including communication address information and the user-specific training model;
First network transmission unit: implements communication with the wearable augmented reality device module;
Processing control unit: reads the configuration unit data for initialization at system start-up, and sends recognized action events to the wearable augmented reality device module.
Further, the wearable augmented reality device module includes:
Control unit: receives the action event recognition results of the wearable EMG sensing device module and updates the system scene;
Recognition unit: captures the user's field of view, analyzes the user's environment through image processing, and finds the anchor point;
Augmented reality unit: displays the model or information corresponding to the control unit's instruction;
Second network transmission unit: implements communication between the wearable augmented reality device module and the wearable EMG sensing device module.
A wearable augmented reality task instruction method based on EMG sensing technology, comprising:
(1) performing limb-action training in a limb-action training scene, and saving the training result as the user-specific training model;
(2) setting an anchor point, the anchor point being the mapping point of a real-environment location into the virtual scene space;
(3) building 3D models of all devices and parts in the virtual scene, deploying the 3D models at their positions in the virtual scene, and entering the process flow and the detailed guidance notes for each step into the virtual scene;
(4) analyzing the user's field-of-view environment in real time and, according to the position of the real-environment anchor point, superimposing the virtual scene content onto the real environment;
(5) recording and analyzing the user's EMG signal data in real time, and judging by means of the user-specific training model whether the user has completed a defined action;
(6) if the defined action is completed, responding to it in light of the current scene state; otherwise, returning to step (5).
Further, following the order of the steps of the process flow, the user is prompted in the virtual scene with the content of the current step, the specific mode of operation, and the location in the real environment to be operated on.
Further, in step (1), the user-specific training model is built as follows:
1) setting the sampling frequency and the number of channels, and collecting the user's EMG signal data for the user's current action as the raw data of the user-specific training model;
2) extracting feature vectors from the raw data;
3) feeding the extracted feature vectors into the input layer of a deep fully-connected network to train action recognition.
Further, in step 1), the collected user EMG signal data includes hand signals, forearm signals, and upper-arm signals.
Further, in step 2), the features extracted from each signal channel include root mean square, crest factor, kurtosis, and wavelet-packet energy.
Further, in step (6), the action response logic specifically includes:
when guiding the user to tighten a screw, if the system receives a screw-tightening action event followed by a no-twisting state, the system concludes that the current operation has been completed and automatically guides the user to the next operation.
Further, in step (6), the action response logic specifically includes:
when displaying a job-instruction picture, if the system receives a five-fingers-spread gesture, the picture is enlarged; if the system receives a five-fingers-closed gesture, the picture is shrunk; if the system receives a fist gesture, the picture follows the fist's movements forward/backward, left/right, and up/down.
Further, in step (6), the action response logic specifically includes: when the index finger points up and the other fingers are clenched into a fist, this represents the preparatory action before a click, and the cursor in the system follows the gesture; when the index finger bends, this represents a click event, and the system handles the click according to the cursor position.
Beneficial effects of the present invention:
The present invention collects the muscle electrical signals produced during user movement via EMG acquisition electrodes and identifies the current user's action pattern with a model trained for limb-action recognition. EMG-based action recognition can distinguish different gestures and arm swing amplitudes and is unaffected by ambient light and noise. The wearable EMG sensing device is worn on the worker's arm and does not interfere with daily operations.
The system and method of the present invention lay a foundation for more convenient and accurate interaction and have broad application prospects in increasingly complex industrial fields.
Brief description of the drawings
Fig. 1 is a connection diagram of the modules of the system of the present invention;
Fig. 2 is a training flow chart of the user-specific training model of the present invention;
Fig. 3 is a structural diagram of the deep fully-connected network of the present invention.
Detailed description of embodiments:
The present invention is described in detail below with reference to the accompanying drawings:
The invention discloses a wearable augmented reality task instruction system based on EMG sensing technology, as shown in Fig. 1, comprising a wearable augmented reality device module and a wearable EMG sensing device module.
The wearable EMG sensing device module includes:
Motion capture unit: worn on the user's body, it captures the EMG signal of the user's arm in real time, encodes it, and sends it to the action recognition unit;
Action recognition unit: receives the EMG signal, analyzes the current user action, and transmits the recognized action event to the processing control unit;
Configuration unit: configures the parameters required at system run time, including communication address information and the user-specific training model;
First network transmission unit: communicates wirelessly with the second network transmission unit to implement communication with the wearable augmented reality device module;
Processing control unit: reads the configuration unit data for initialization at system start-up, and sends recognized action events to the wearable augmented reality device module through the first network transmission unit.
The wearable augmented reality device module includes:
Control unit: communicates with the wearable EMG sensing device module through the second network transmission unit, receives the recognition results of the action recognition unit, and updates the system scene;
Recognition unit: captures the user's field of view with a camera, analyzes the user's environment using image processing techniques, and finds the anchor point;
Augmented reality unit: displays the model or information corresponding to the control unit's instruction;
Second network transmission unit: communicates wirelessly with the first network transmission unit to implement communication between the wearable augmented reality device module and the wearable EMG sensing device module.
On the other hand, the invention discloses a wearable augmented reality task instruction method based on EMG sensing technology, which specifically includes the following steps:
(1) Limb-action training: different users produce different EMG signals for the same action. To improve the accuracy of user action recognition during the work process, the system provides a limb-action training scene and saves the training result as the user-specific training model.
Limb-action training is the basis for finer-grained action recognition. The training process of the user-specific training model of the present invention is shown in Fig. 2; the specific flow is as follows:
The data pre-processing stage mainly includes signal acquisition, framing, and feature extraction. Signal acquisition is performed by the motion capture unit; signal feature extraction is performed by the action recognition unit.
According to actual needs, the signal sampling rate fs is set in the range 50 Hz to 1000 Hz. The signals collected by the wearable device mainly include those of the following limb parts:
1) hand signals: each finger outputs 2 signal channels, 20 channels in total for both hands;
2) forearm signals: 8 channels per forearm, collected with a ring-shaped wearable device, 16 channels in total for both arms;
3) upper-arm signals: 8 channels per upper arm, collected with a ring-shaped wearable device, 16 channels in total for both arms.
One second of data is collected and processed for each recognition pass. Given the sampling rate and the number of channels, each acquisition yields fs × (20 + 16 + 16) samples, forming an fs × 52 matrix that serves as the raw data for the next feature-extraction step.
Each signal channel (fs points) can be represented as a vector x = [x1, x2, …, xn], n = fs. The extracted features are as follows:
1) Root mean square: x_rms = sqrt( (1/n) · Σ_{i=1..n} xi² )
2) Crest (peak) factor: C = max_i |xi| / x_rms
3) Kurtosis: K = E[ ((x − μ)/σ)⁴ ]
where μ is the mean of x, σ is the standard deviation of x, and E(·) denotes the expectation.
4) Wavelet-packet energy
A 3-level wavelet packet decomposition with the Haar wavelet splits the signal into 8 subbands; the ratio of each subband's energy to the total energy is then computed, yielding an 8-dimensional feature.
In total, 11 features are extracted per signal channel. Arranging the feature sequences of the 52 signal channels gives a 572-dimensional feature vector, which serves as the input layer of the deep fully-connected network for action recognition training.
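As a rough, hypothetical sketch (the patent gives no code), the per-channel feature extraction described above can be expressed in Python with NumPy; the 3-level Haar wavelet packet transform is hand-rolled here rather than taken from a signal-processing library:

```python
import numpy as np

def haar_wp_energies(x, levels=3):
    """3-level Haar wavelet packet decomposition: ratio of each of the
    2**levels (= 8) subband energies to the total energy."""
    bands = [np.asarray(x, dtype=float)]
    for _ in range(levels):
        nxt = []
        for b in bands:
            if len(b) % 2:                          # pad odd-length band
                b = np.append(b, b[-1])
            nxt.append((b[0::2] + b[1::2]) / np.sqrt(2))  # approximation
            nxt.append((b[0::2] - b[1::2]) / np.sqrt(2))  # detail
        bands = nxt
    e = np.array([np.sum(b ** 2) for b in bands])
    return e / e.sum()

def channel_features(x):
    """11 features per EMG channel: RMS, crest factor, kurtosis,
    and 8 wavelet-packet energy ratios."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    crest = np.max(np.abs(x)) / rms
    mu, sigma = x.mean(), x.std()
    kurt = np.mean(((x - mu) / sigma) ** 4)
    return np.concatenate([[rms, crest, kurt], haar_wp_energies(x)])

def frame_features(frame):
    """frame: fs x 52 matrix (one second, 52 channels) -> 572-dim vector."""
    return np.concatenate([channel_features(frame[:, c])
                           for c in range(frame.shape[1])])
```

`frame_features` concatenates the 11 per-channel features over all 52 columns of the one-second fs × 52 frame, yielding the 572-dimensional vector fed to the network.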
The deep fully-connected network used in the present invention contains multiple hidden layers, as shown in Fig. 3. Taking the recognition of 8 actions as an example, it is specified as follows:
The 1st layer is the input layer with 572 neurons, using the ReLU activation function;
Layers 2 to 6 have 600, 500, 300, 150, and 50 neurons respectively, with ReLU activation; a Dropout layer with rate 0.25 is added after each fully-connected layer to prevent over-fitting.
The last layer is the output layer, whose dimension is the number of action labels to be recognized plus 1; for example, to recognize 8 actions the output dimension is 9, with the extra label representing "no action recognized". Cross-entropy is used as the loss function.
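A minimal NumPy sketch of the network's inference-time forward pass, under the layer sizes stated above. The weights here are random placeholders; training with the cross-entropy loss and the 0.25 dropout (active only during training) are omitted:

```python
import numpy as np

# Layer widths from the description: 572-dim input, five hidden layers,
# 9-way output (8 actions + 1 "no action recognized" label).
SIZES = [572, 600, 500, 300, 150, 50, 9]

def init_params(rng):
    """Random placeholder weights; real values come from training."""
    return [(rng.normal(scale=0.05, size=(m, n)), np.zeros(n))
            for m, n in zip(SIZES[:-1], SIZES[1:])]

def forward(x, params):
    """ReLU hidden layers, softmax output -> class probabilities."""
    h = x
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)    # fully-connected + ReLU
    W, b = params[-1]
    z = h @ W + b
    z -= z.max()                          # numerical stability
    p = np.exp(z)
    return p / p.sum()

def cross_entropy(p, label):
    """Loss for one sample with integer class label."""
    return -np.log(p[label] + 1e-12)
```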
(2) Anchor point setting: an anchor point is the mapping of a location in the real environment to a point in the virtual scene space.
Using image analysis techniques, an identifying marker is required as the anchor point, such that it can be photographed in full by the camera of the wearable augmented reality device. The marker may be a picture (laid flat at the anchor-point location), a background image (a photograph of the background taken at the anchor point), or a three-dimensional object (which must not be too large, so that the camera can photograph it completely at the anchor point). The anchor point is set by the control unit.
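The patent does not specify the mapping mathematics; one common realization is a rigid transform estimated from the detected marker pose. A hypothetical sketch, where R and t would come from the recognition unit's marker-pose estimate:

```python
import numpy as np

def make_anchor_transform(R, t):
    """Build a 4x4 rigid transform taking real-environment coordinates
    to virtual-scene coordinates (R: 3x3 rotation, t: 3-vector)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def map_point(T, p):
    """Map a 3D real-environment point into the virtual scene space."""
    q = T @ np.append(np.asarray(p, dtype=float), 1.0)
    return q[:3]
```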
(3) Scene construction: 3D models of all devices and parts in the scene are designed in 3D modeling software; the 3D models are deployed at their positions in the scene; and the process flow and the detailed guidance notes for each step are entered into the scene.
(4) Task instruction: the recognition unit of the wearable augmented reality device module analyzes the user's field-of-view environment in real time and, according to the position of the real-environment anchor point, superimposes the virtual scene content in the system onto the real environment. Following the order of the steps of the process flow, the system prompts the user in the virtual scene with the content of the current step, the specific mode of operation, and the location in the real environment to be operated on.
(5) Limb-action recognition: the motion capture unit of the wearable EMG sensing device module records the user's EMG signal in real time; the action recognition unit analyzes the user's EMG signal data against the user-specific training model in the configuration unit, and finally judges whether the user has just completed a defined action. If a defined action is completed, the processing control unit sends the action event in real time to the wearable augmented reality device module through the first network transmission unit.
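A hypothetical sketch of this judgment loop; `classify` stands for inference with the user-specific training model, and the 0.8 confidence threshold is an illustrative assumption, not from the patent:

```python
import numpy as np

UNRECOGNIZED = 8  # index of the extra "no action recognized" label

def recognize_stream(frames, classify, threshold=0.8):
    """Judge each one-second frame with the trained model; emit an action
    event only for confident, recognized actions (these would then be
    forwarded to the wearable AR device module)."""
    events = []
    for frame in frames:
        probs = classify(frame)           # user-specific training model
        label = int(np.argmax(probs))
        if label != UNRECOGNIZED and probs[label] >= threshold:
            events.append(label)
    return events
```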
(6) Action response: the wearable augmented reality device system receives the action event and responds to it in light of the current scene state.
The algorithmic logic of the response actions is as follows:
1) when the wearable augmented reality device system is guiding the user to tighten a screw, if the system receives a screw-tightening action event followed by a no-twisting state, the system concludes that the current operation has been completed and automatically guides the user to the next operation;
2) when the wearable augmented reality device system is displaying a job-instruction picture, if the system receives a five-fingers-spread gesture, the picture is enlarged; if the system receives a five-fingers-closed gesture, the picture is shrunk; if the system receives a fist gesture, the picture follows the fist's movements forward/backward, left/right, and up/down;
3) when the index finger points up and the other fingers are clenched into a fist, this represents the preparatory action before a click, and the cursor in the system follows the gesture; when the index finger bends, this represents a click event, and the system handles the click according to the cursor position.
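The response rules above can be sketched as a simple gesture dispatcher; the gesture names, zoom factor, and class structure are illustrative assumptions, not from the patent:

```python
class InstructionViewer:
    """Toy model of the AR unit's response to recognized gesture events."""

    def __init__(self):
        self.zoom = 1.0
        self.cursor = (0.0, 0.0)
        self.clicks = []

    def on_event(self, gesture, pos=None):
        if gesture == "five_fingers_open":
            self.zoom *= 1.25               # enlarge the picture
        elif gesture == "five_fingers_closed":
            self.zoom /= 1.25               # shrink the picture
        elif gesture == "index_up_fist":    # pre-click: cursor follows gesture
            self.cursor = pos
        elif gesture == "index_bend":       # click event at cursor position
            self.clicks.append(self.cursor)
```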
The scene construction, task instruction, and action response described above are realized by the augmented reality unit.
Although the above embodiments of the present invention are described with reference to the accompanying drawings, they do not limit the scope of protection of the present invention. Those of ordinary skill in the art should understand that, on the basis of the technical scheme of the present invention, various modifications or variations that can be made without creative effort still fall within the scope of protection of the present invention.

Claims (10)

1. A wearable augmented reality task instruction system based on EMG sensing technology, characterized by comprising: a wearable augmented reality device module and a wearable EMG sensing device module;
the wearable EMG sensing device module comprising:
a motion capture unit, which captures the EMG signal of the user's arm in real time, encodes it, and sends it to the action recognition unit;
an action recognition unit, which receives the EMG signal, analyzes the current user action, and transmits the recognized action event to the processing control unit;
a configuration unit, which configures the parameters required at system run time, including communication address information and the user-specific training model;
a first network transmission unit, which implements communication with the wearable augmented reality device module; and
a processing control unit, which reads the configuration unit data for initialization at system start-up and sends recognized action events to the wearable augmented reality device module.
2. The wearable augmented reality task instruction system based on EMG sensing technology of claim 1, characterized in that the wearable augmented reality device module comprises:
a control unit, which receives the action event recognition results of the wearable EMG sensing device module and updates the system scene;
a recognition unit, which captures the user's field of view, analyzes the user's environment through image processing, and finds the anchor point;
an augmented reality unit, which displays the model or information corresponding to the control unit's instruction; and
a second network transmission unit, which implements communication between the wearable augmented reality device module and the wearable EMG sensing device module.
3. A wearable augmented reality task instruction method based on EMG sensing technology, characterized by comprising:
(1) performing limb-action training in a limb-action training scene, and saving the training result as the user-specific training model;
(2) setting an anchor point, the anchor point being the mapping point of a real-environment location into the virtual scene space;
(3) building 3D models of all devices and parts in the virtual scene, deploying the 3D models at their positions in the virtual scene, and entering the process flow and the detailed guidance notes for each step into the virtual scene;
(4) analyzing the user's field-of-view environment in real time and, according to the position of the real-environment anchor point, superimposing the virtual scene content onto the real environment;
(5) recording and analyzing the user's EMG signal data in real time, and judging by means of the user-specific training model whether the user has completed a defined action;
(6) if the defined action is completed, responding to it in light of the current scene state; otherwise, returning to step (5).
4. The wearable augmented reality task instruction method based on EMG sensing technology of claim 3, characterized in that, following the order of the steps of the process flow, the user is prompted in the virtual scene with the content of the current step, the specific mode of operation, and the location in the real environment to be operated on.
5. The wearable augmented reality task instruction method based on EMG sensing technology of claim 3, characterized in that, in step (1), the user-specific training model is built as follows:
1) setting the sampling frequency and the number of channels, and collecting the user's EMG signal data for the user's current action as the raw data of the user-specific training model;
2) extracting feature vectors from the raw data;
3) feeding the extracted feature vectors into the input layer of a deep fully-connected network to train action recognition.
6. The wearable augmented reality task instruction method based on EMG sensing technology of claim 5, characterized in that, in step 1), the collected user EMG signal data comprises hand signals, forearm signals, and upper-arm signals.
7. The wearable augmented reality task instruction method based on EMG sensing technology of claim 5, characterized in that, in step 2), the features extracted from each signal channel comprise root mean square, crest factor, kurtosis, and wavelet-packet energy.
8. The wearable augmented reality task instruction method based on EMG sensing technology of claim 3, characterized in that, in step (6), the action response logic specifically includes:
when guiding the user to tighten a screw, if the system receives a screw-tightening action event followed by a no-twisting state, the system concludes that the current operation has been completed and automatically guides the user to the next operation.
9. The wearable augmented reality task instruction method based on EMG sensing technology of claim 3, characterized in that, in step (6), the action response logic specifically includes:
when displaying a job-instruction picture, if the system receives a five-fingers-spread gesture, the picture is enlarged; if the system receives a five-fingers-closed gesture, the picture is shrunk; if the system receives a fist gesture, the picture follows the fist's movements forward/backward, left/right, and up/down.
10. The wearable augmented reality task instruction method based on EMG sensing technology of claim 3, characterized in that, in step (6), the action response logic specifically includes: when the index finger points up and the other fingers are clenched into a fist, this represents the preparatory action before a click, and the cursor in the system follows the gesture; when the index finger bends, this represents a click event, and the system handles the click according to the cursor position.
CN201710488276.2A 2017-06-23 2017-06-23 Wearable augmented reality task instruction system and method based on myoelectricity cognition technology Pending CN107203272A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710488276.2A CN107203272A (en) 2017-06-23 2017-06-23 Wearable augmented reality task instruction system and method based on myoelectricity cognition technology


Publications (1)

Publication Number Publication Date
CN107203272A true CN107203272A (en) 2017-09-26

Family

ID=59906725


Country Status (1)

Country Link
CN (1) CN107203272A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608524A (en) * 2017-10-13 2018-01-19 中国科学院软件研究所 A kind of multi objective control command generating device based on myoelectricity
CN107943282A (en) * 2017-11-06 2018-04-20 上海念通智能科技有限公司 A kind of man-machine interactive system and method based on augmented reality and wearable device
CN109192007A (en) * 2018-09-21 2019-01-11 杭州电子科技大学 A kind of AR sign Language Recognition Method and teaching method based on myoelectricity motion perception
CN110061755A (en) * 2019-04-30 2019-07-26 徐州重型机械有限公司 Operation householder method and system, wearable device and engineering truck
CN110074760A (en) * 2019-04-28 2019-08-02 太平洋未来科技(深圳)有限公司 A kind of instrument for the sensor values that can read measurement human muscle's exercise intensity
CN110442233A (en) * 2019-06-18 2019-11-12 中国人民解放军军事科学院国防科技创新研究院 A kind of augmented reality key mouse system based on gesture interaction
CN110568929A (en) * 2019-09-06 2019-12-13 诺百爱(杭州)科技有限责任公司 Virtual scene interaction method and device and electronic equipment
CN112789577A (en) * 2018-09-20 2021-05-11 脸谱科技有限责任公司 Neuromuscular text input, writing and drawing in augmented reality systems
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101433491A (en) * 2008-12-05 2009-05-20 华中科技大学 Multiple-freedom degree wearing type rehabilitation training robot for function of hand and control system thereof
CN102622605A (en) * 2012-02-17 2012-08-01 国电科学技术研究院 Surface electromyogram signal feature extraction and action pattern recognition method
CN104107134A (en) * 2013-12-10 2014-10-22 中山大学 Myoelectricity feedback based upper limb training method and system
CN105139038A (en) * 2015-09-14 2015-12-09 李玮琛 Electromyographic signal mode identification method
CN105748068A (en) * 2016-02-26 2016-07-13 宁波原子智能技术有限公司 Bioelectricity management system and method
CN106236336A (en) * 2016-08-15 2016-12-21 中国科学院重庆绿色智能技术研究院 A kind of myoelectric limb gesture and dynamics control method
CN106846237A (en) * 2017-02-28 2017-06-13 山西辰涵影视文化传媒有限公司 A kind of enhancing implementation method based on Unity3D


Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107608524A (en) * 2017-10-13 2018-01-19 中国科学院软件研究所 Multi-objective control command generating device based on myoelectricity
CN107943282A (en) * 2017-11-06 2018-04-20 上海念通智能科技有限公司 Human-machine interaction system and method based on augmented reality and wearable devices
CN112789577A (en) * 2018-09-20 2021-05-11 脸谱科技有限责任公司 Neuromuscular text input, writing and drawing in augmented reality systems
CN112789577B (en) * 2018-09-20 2024-04-05 元平台技术有限公司 Neuromuscular text input, writing and drawing in augmented reality systems
CN109192007A (en) * 2018-09-21 2019-01-11 杭州电子科技大学 AR sign language recognition method and teaching method based on myoelectric motion perception
US11941176B1 (en) 2018-11-27 2024-03-26 Meta Platforms Technologies, Llc Methods and apparatus for autocalibration of a wearable electrode sensor system
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
CN110074760A (en) * 2019-04-28 2019-08-02 太平洋未来科技(深圳)有限公司 Instrument capable of reading sensor values measuring human muscle exercise intensity
CN110061755A (en) * 2019-04-30 2019-07-26 徐州重型机械有限公司 Operation assisting method and system, wearable device and engineering vehicle
CN110061755B (en) * 2019-04-30 2021-04-16 徐州重型机械有限公司 Operation assisting method and system, wearable device and engineering vehicle
CN110442233A (en) * 2019-06-18 2019-11-12 中国人民解放军军事科学院国防科技创新研究院 Augmented reality keyboard-and-mouse system based on gesture interaction
CN110568929A (en) * 2019-09-06 2019-12-13 诺百爱(杭州)科技有限责任公司 Virtual scene interaction method and device and electronic equipment
CN110568929B (en) * 2019-09-06 2023-04-25 诺百爱(杭州)科技有限责任公司 Virtual scene interaction method and device based on virtual keyboard and electronic equipment
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment

Similar Documents

Publication Publication Date Title
CN107203272A (en) Wearable augmented reality task instruction system and method based on myoelectricity cognition technology
CN108983636B (en) Man-machine intelligent symbiotic platform system
CN102789313B (en) User interaction system and method
CN104410883B (en) Mobile wearable contactless interaction system and method
CN111679740A (en) Method for remote intelligent diagnosis of power station equipment using augmented reality (AR) technology
CN107943282A (en) Human-machine interaction system and method based on augmented reality and wearable devices
CN108776773A (en) Three-dimensional gesture recognition method and interactive system based on depth images
CN112198959A (en) Virtual reality interaction method, device and system
CN114372356B (en) Artificial enhancement method, device and medium based on digital twins
CN107357434A (en) Information input device, system and method in a virtual reality environment
CN108616712A (en) Camera-based interface operation method, apparatus, equipment and storage medium
CN106272446A (en) Method and apparatus for robot motion simulation
CN107066081A (en) Interaction control method and apparatus for a virtual reality system, and virtual reality device
CN110688910A (en) Method for wearable recognition of basic human body postures
CN110442233A (en) Augmented reality keyboard-and-mouse system based on gesture interaction
WO2021035071A1 (en) Systems and methods for simulating sense data and creating perceptions
CN114998983A (en) Limb rehabilitation method based on augmented reality technology and posture recognition technology
CN206411612U (en) Interaction control device for a virtual reality system, and virtual reality device
CN107643820B (en) VR passive robot and implementation method thereof
CN107390881A (en) Gesture control method
CN107894834A (en) Method and system for controlling gesture recognition in an augmented reality environment
CN114063784A (en) Simulated virtual XR BOX somatosensory interaction system and method
CN113377193A (en) Vending machine interaction method and system based on reliable gesture recognition
CN115798042A (en) Escalator passenger abnormal behavior data construction method based on digital twins
CN116563499A (en) Intelligent interaction system of transformer substation based on meta-universe technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Shao Peng; Zhang Zhen; Liang Bo; Shi Yunfei
Inventor before: Zhang Zhen; Liang Bo; Shi Yunfei

CB03 Change of inventor or designer information

Inventor after: Zhang Zhen; Shao Peng; Liang Bo; Shi Yunfei
Inventor before: Shao Peng; Zhang Zhen; Liang Bo; Shi Yunfei

RJ01 Rejection of invention patent application after publication

Application publication date: 20170926