CN104199543A - Leading limb identification method and system - Google Patents
- Publication number
- CN104199543A CN104199543A CN201410426333.0A CN201410426333A CN104199543A CN 104199543 A CN104199543 A CN 104199543A CN 201410426333 A CN201410426333 A CN 201410426333A CN 104199543 A CN104199543 A CN 104199543A
- Authority
- CN
- China
- Prior art keywords
- limbs
- information
- leading
- user
- myoelectric information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
Abstract
The invention provides a leading limb identification method and device, relating to the field of wearable devices. The method comprises: obtaining first myoelectric information of a first limb of a user, and identifying whether the first limb is the leading limb according to the first myoelectric information and reference information. The method and device allow a device worn by the user to configure itself automatically according to the identification result, improving the user experience.
Description
Technical field
The present application relates to the field of wearable devices, and in particular to a leading limb identification method and device.
Background technology
In recent years, with the development of wearable devices, smart wristbands, smart watches, smart glasses, and the like have steadily entered people's lives, greatly enriching and facilitating them. Because wearable devices are small, their interaction capabilities are generally limited. Users therefore expect such devices to be largely self-configuring, reducing the setup operations they must perform.
Across the population, roughly 10–13% of people are left-hand dominant, while the remainder are right-hand dominant. If a wearable device can identify the user's leading hand, the result can serve as an input to the device itself or to other devices, reducing the user's setup operations and improving the user experience.
Summary of the invention
An object of the present application is to provide a leading limb identification method and device.
According to one aspect of at least one embodiment of the present application, a leading limb identification method is provided. The method comprises:
obtaining first myoelectric information of a first limb of a user; and
identifying, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
According to another aspect of at least one embodiment of the present application, a leading limb identification device is provided. The device comprises:
a first acquisition module, configured to obtain first myoelectric information of a first limb of a user; and
an identification module, configured to identify, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
The leading limb identification method and device of the embodiments of the present application obtain myoelectric information of a first limb of a user and then identify, according to that myoelectric information and reference information, whether the first limb is the leading limb. This provides a way to identify the leading limb, allows devices worn by the user to configure themselves automatically according to the identification result, and improves the user experience.
Brief description of the drawings
Fig. 1 is a flowchart of the leading limb identification method according to an embodiment of the present application;
Fig. 2 is a flowchart of the leading limb identification method according to an embodiment of the present application;
Fig. 3 is a flowchart of the leading limb identification method according to another embodiment of the present application;
Fig. 4 is a flowchart of the leading limb identification method according to another embodiment of the present application;
Fig. 5 is a flowchart of the leading limb identification method according to another embodiment of the present application;
Fig. 6 is a schematic diagram of the module structure of the leading limb identification device according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the module structure of the leading limb identification device according to an embodiment of the present application;
Fig. 8 is a schematic diagram of the module structure of the leading limb identification device according to another embodiment of the present application;
Fig. 9 is a schematic diagram of the module structure of the leading limb identification device according to another embodiment of the present application;
Fig. 10 is a schematic diagram of the module structure of the leading limb identification device according to another embodiment of the present application;
Fig. 11 is a schematic diagram of the module structure of the leading limb identification device according to another embodiment of the present application;
Fig. 12 is a schematic diagram of the module structure of the leading limb identification device according to another embodiment of the present application;
Fig. 13 is a schematic diagram of the hardware structure of the leading limb identification device according to an embodiment of the present application.
Detailed description of the embodiments
The embodiments of the present application are described in further detail below with reference to the drawings and embodiments. The following embodiments illustrate the present application but do not limit its scope.
Those skilled in the art will understand that, in the embodiments of the present application, the numbering of the steps below does not imply an order of execution; the execution order of the steps is determined by their function and internal logic, and the numbering places no restriction on how the embodiments are implemented.
Fig. 1 is a flowchart of the leading limb identification method according to an embodiment of the present application; the method may be implemented, for example, on a leading limb identification device. As shown in Fig. 1, the method comprises:
S120: obtaining first myoelectric information of a first limb of a user;
S140: identifying, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
The method of this embodiment obtains first myoelectric information of a first limb of a user and then identifies, according to that information and reference information, whether the first limb is the leading limb, thereby providing a way to identify the leading limb, allowing devices worn by the user to configure themselves automatically according to the identification result, and improving the user experience.
The functions of steps S120 and S140 are described in detail below with reference to specific embodiments.
S120: obtaining first myoelectric information of a first limb of a user.
Here, the limb may be one of the user's two hands, or one of the user's two arms. For simplicity, the description below mainly takes the case where the first limb is one of the user's two hands as an example. The myoelectric information may be an electromyographic (EMG) signal recorded over a period of time, obtained through a group of one or more myoelectric sensors in contact with the user's skin.
S140: identifying, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
The inventors found during their research that when a muscle contracts under different loads, the amplitude of the myoelectric signal is proportional to the muscle force: the greater the tension the muscle produces, the larger the amplitude. More specifically, when the muscle contracts at intensities below 40% MVC (maximum voluntary contraction), the relationship between muscle force and EMG amplitude is linear; above 60% MVC the relationship is also linear, but with a steeper slope. Between 40% and 60% MVC the relationship is no longer linear, but muscle force and amplitude remain positively related.
In addition, the inventors found that people use their leading limb noticeably more often than the non-leading limb; in other words, the muscles of the leading limb contract noticeably more frequently. Accordingly, when the inventors sampled myoelectric information from the same muscle or muscle group of a user's leading and non-leading limbs at the same sampling frequency over a period of time (for example, one hour), the average amplitude of the leading limb's myoelectric information was clearly higher than that of the non-leading limb. The present application identifies the leading limb based on this principle.
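The quantity being compared between the limbs is the average amplitude of a window of EMG samples. The following is a minimal sketch of one way to compute it; the patent does not specify a particular amplitude estimator, and all names and sample values here are illustrative assumptions.

```python
def mean_emg_amplitude(samples):
    """Mean rectified amplitude of a window of raw EMG samples.

    EMG is roughly a zero-mean oscillating signal, so the samples are
    rectified (absolute value) before averaging. Illustrative sketch only.
    """
    if not samples:
        raise ValueError("need at least one sample")
    return sum(abs(s) for s in samples) / len(samples)

# Under the principle described above, the frequently used (leading) limb
# tends to yield a larger mean amplitude over the same observation window.
leading = mean_emg_amplitude([0.8, -0.9, 1.1, -1.0])
non_leading = mean_emg_amplitude([0.2, -0.1, 0.3, -0.2])
```

In practice the window would cover many thousands of samples collected over minutes or hours; the short lists above merely stand in for such windows.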
In one embodiment, the reference information is a threshold determined from the average amplitude of the myoelectric information of the user's left limb and the average amplitude of the myoelectric information of the user's right limb.
Step S140 may then comprise:
S141: in response to the average amplitude of the first myoelectric information being greater than the threshold, determining that the first limb is the leading limb;
S142: in response to the average amplitude of the first myoelectric information being less than the threshold, determining that the first limb is not the leading limb.
For example, the reference information may be a threshold determined from the average amplitudes of the myoelectric information of the user's left and right hands. Suppose the average amplitude of the left hand's myoelectric information falls within a first interval (L_min, L_max), the average amplitude of the right hand's myoelectric information falls within a second interval (R_min, R_max), and the user's left hand is the leading hand, so that L_min > R_max. A threshold M can then be chosen such that L_min > M > R_max; that is, M is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first myoelectric information is greater than the threshold M, it is taken to fall within the first interval and the first limb is the leading hand; if it is less than M, it is taken to fall within the second interval and the first limb is not the leading hand.
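A hedged sketch of this threshold scheme (steps S141/S142) follows. The midpoint rule for choosing M and all numeric values are illustrative assumptions — the patent only requires that M lie strictly between the two intervals.

```python
def choose_threshold(l_min, r_max):
    """Pick a threshold M strictly between two non-overlapping amplitude
    intervals, so that L_min > M > R_max (left hand leading).
    The midpoint of the gap is one convenient choice, not the only one."""
    if l_min <= r_max:
        raise ValueError("intervals overlap; no separating threshold exists")
    return (l_min + r_max) / 2

def is_leading_limb(mean_amplitude, threshold_m):
    """S141/S142: the limb is identified as leading iff its average
    myoelectric amplitude exceeds the learned threshold."""
    return mean_amplitude > threshold_m

# Example: left-hand averages fall in (1.0, 1.4), right-hand in (0.2, 0.6),
# so any M with 0.6 < M < 1.0 separates the two hands.
m = choose_threshold(1.0, 0.6)
```

Any M inside the gap classifies both hands correctly; the midpoint simply maximizes the margin on each side.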
Note that in this embodiment the reference information must be determined from the myoelectric information of both the user's left and right limbs, so both must be acquired in advance. For example, before formally using the leading limb identification device, the user may wear it on the left hand for a period of time and then on the right hand for a period of time, to complete a training process.
In another embodiment, the reference information is second myoelectric information of a second limb of the user.
Step S140 may then comprise:
S141': in response to the average amplitude of the first myoelectric information being greater than that of the second myoelectric information, determining that the first limb is the leading limb;
S142': in response to the average amplitude of the first myoelectric information being less than that of the second myoelectric information, determining that the first limb is not the leading limb.
In this embodiment the user does not need to perform any training in advance: the first myoelectric information of the first limb and the second myoelectric information of the second limb are obtained, their average amplitudes are compared, and the leading limb is identified accordingly.
The advantage of this embodiment is that the user need not deliberately complete a training process; the reference information is collected during natural use. However, the method may need to obtain the myoelectric information of, for example, both hands at the same time, so it is mainly suited to scenarios in which both of the user's hands participate simultaneously, such as playing a video game with smart gloves on both hands.
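The training-free variant (S141'/S142') reduces to a direct comparison of the two limbs' average amplitudes over a simultaneous observation window. A minimal sketch, with all names and values being illustrative assumptions:

```python
def mean_amplitude(samples):
    """Mean rectified amplitude of a window of EMG samples."""
    return sum(abs(s) for s in samples) / len(samples)

def identify_leading(first_samples, second_samples):
    """S141'/S142': whichever limb shows the larger average amplitude
    over the simultaneous observation window is taken as leading."""
    if mean_amplitude(first_samples) > mean_amplitude(second_samples):
        return "first"
    return "second"
```

Because both limbs are observed over the same period, no per-user threshold is needed; the second limb itself serves as the reference.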
Referring to Fig. 2, in one embodiment the method further comprises:
S150: performing an operation according to the identification result.
For example, if the identification result shows that the user's leading hand is wearing a smart wristband, then because the leading hand frequently takes part in activity, the device can remind the user to mind the safety of the wristband when the leading arm moves vigorously, to keep it from breaking.
As another example, if the user wears smart gloves on both hands to play a shooting game and the gloves identify the user's left hand as the leading hand, the game can automatically switch to the right-hand-armed mode, better matching the user's habits.
In addition, by combining the identification result with input information from the user, it is possible to further determine which of the user's hands is the left or right hand, whether the user is left-handed, and so on.
Referring to Fig. 3, in one embodiment the method further comprises:
S160: receiving input information from the user.
Referring to Fig. 4, in one embodiment the input information is leading limb information, i.e., whether the user is left-handed or right-handed. In this embodiment the method further comprises:
S170: determining, according to the input information and the identification result, whether the first limb is the left limb or the right limb.
For example, if the leading limb information indicates that the user is left-handed, and the identification result shows that the first limb is the leading hand, it can be determined that the first limb is the left hand.
Once it has been determined whether the first limb is the left or the right limb, settings such as the display interface of the wearable device worn on the first limb can be adapted accordingly, as can settings such as the display interface of a device, for example a smartphone, gripped by the first limb.
For example, if the first limb is determined to be the user's left hand, the unlock gesture of a smart watch worn on that limb can be set to a left-to-right swipe, and a smartphone can arrange consonants on the left half of the screen and vowels on the right half.
Referring to Fig. 5, in one embodiment the input information indicates whether the first limb is the left hand or the right hand, and the method may further comprise:
S180: determining, according to the input information and the identification result, whether the user is left-handed or right-handed.
For example, if the input information indicates that the first limb is the left hand, and the identification result shows that the first limb is the leading hand, it can be determined that the user is left-handed. The input information may be provided by voice input or text input, or by a behavioral input: for instance, when the user puts a left-hand glove on one hand, this is equivalent to inputting that the hand in question is the left hand.
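The two inferences S170 and S180 are symmetric: each combines one user-supplied fact with the identification result. A hedged sketch under the assumption that both facts are known exactly (function names are illustrative):

```python
def limb_side(user_is_left_handed, limb_is_leading):
    """S170: given the user's handedness and whether the monitored limb
    is the leading one, infer which side the limb is on."""
    if limb_is_leading:
        return "left" if user_is_left_handed else "right"
    return "right" if user_is_left_handed else "left"

def handedness(limb_is_left, limb_is_leading):
    """S180: given which side the monitored limb is on and whether it is
    the leading one, infer the user's handedness."""
    if limb_is_leading:
        return "left-handed" if limb_is_left else "right-handed"
    return "right-handed" if limb_is_left else "left-handed"
```

Either function covers all four combinations of its two boolean inputs, mirroring the worked example in the text (left-handed user plus leading limb implies the limb is the left hand, and vice versa).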
In addition, an embodiment of the present application provides a computer-readable medium comprising computer-readable instructions that, when executed, perform the operations of steps S120 and S140 of the method in the embodiment shown in Fig. 1 above.
In summary, the method of the embodiments of the present application can identify whether a first limb is the leading limb according to the limb's myoelectric information and reference information; can further determine, in combination with the user's input information, whether the limb is the left or right limb, or whether the user is left-handed; and can perform corresponding operations according to the identification or determination result, reducing the user's setup operations and improving the user experience.
Fig. 6 is a schematic diagram of the module structure of the leading limb identification device according to an embodiment of the present invention. The device may be integrated as a functional module in a wearable device such as a smart wristband, smart watch, or smart glove, or may itself be an independent wearable device for the user. As shown in Fig. 6, the device 600 may comprise:
a first acquisition module 610, configured to obtain first myoelectric information of a first limb of a user; and
an identification module 620, configured to identify, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
The device of this embodiment obtains first myoelectric information of a first limb of a user and identifies, according to that information and reference information, whether the first limb is the leading limb, thereby providing a leading limb identification device that allows devices worn by the user to configure themselves automatically according to the identification result, improving the user experience.
The functions of the first acquisition module 610 and the identification module 620 are described in detail below with reference to specific embodiments.
The first acquisition module 610 is configured to obtain first myoelectric information of a first limb of a user.
Here, the limb may be one of the user's two hands, or one of the user's two arms. For simplicity, the description below mainly takes the case where the first limb is one of the user's two hands as an example. The myoelectric information may be an EMG signal recorded over a period of time, obtained through a group of one or more myoelectric sensors in contact with the user's skin. A myoelectric sensor may comprise electrodes, an amplifier circuit, and the like.
The identification module 620 is configured to identify, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
Referring to Fig. 7, in one embodiment the device 600 further comprises:
a first determination module 630, configured to determine a threshold as the reference information according to the average amplitude of the myoelectric information of the user's left limb and the average amplitude of the myoelectric information of the user's right limb.
The identification module 620 is configured to determine, in response to the average amplitude of the first myoelectric information being greater than the threshold, that the first limb is the leading limb; and
to determine, in response to the average amplitude of the first myoelectric information being less than the threshold, that the first limb is not the leading limb.
For example, the first determination module 630 may determine the threshold as the reference information according to the average amplitudes of the myoelectric information of the user's left and right hands. Suppose the average amplitude of the left hand's myoelectric information falls within a first interval (L_min, L_max), the average amplitude of the right hand's myoelectric information falls within a second interval (R_min, R_max), and the user's left hand is the leading hand, so that L_min > R_max. A threshold M can then be chosen such that L_min > M > R_max; that is, M is a value lying between the first interval and the second interval.
Therefore, if the average amplitude of the first myoelectric information is greater than the threshold M, the identification module 620 takes it to fall within the first interval and identifies the first limb as the leading hand; if it is less than M, the module takes it to fall within the second interval and identifies the first limb as not the leading hand.
Note that in this embodiment the reference information must be determined from the myoelectric information of both the user's left and right limbs, so the first determination module 630 must acquire both in advance. For example, before formally using the leading limb identification device, the user may wear it on the left hand for a period of time and then on the right hand for a period of time, to complete a training process.
Referring to Fig. 8, in another embodiment the device 600 further comprises:
a second acquisition module 640, configured to obtain second myoelectric information of a second limb of the user as the reference information.
The identification module 620 is configured to determine, in response to the average amplitude of the first myoelectric information being greater than that of the second myoelectric information, that the first limb is the leading limb; and
to determine, in response to the average amplitude of the first myoelectric information being less than that of the second myoelectric information, that the first limb is not the leading limb.
In this embodiment the user does not need to perform any training in advance: the first myoelectric information of the first limb and the second myoelectric information of the second limb are obtained, their average amplitudes are compared, and the leading limb is identified accordingly.
The advantage of this embodiment is that the user need not deliberately complete a training process; the reference information is collected during natural use. However, the device may need to obtain the myoelectric information of, for example, both hands at the same time, so it is mainly suited to scenarios in which both of the user's hands participate simultaneously, such as playing a video game with smart gloves on both hands.
Referring to Fig. 9, in one embodiment the device 600 further comprises:
an execution module 650, configured to perform an operation according to the identification result.
For example, if the identification result shows that the user's leading hand is wearing a smart wristband, then because the leading hand frequently takes part in activity, the execution module 650 can remind the user to mind the safety of the wristband when the leading arm moves vigorously, to keep it from breaking.
As another example, if the user wears smart gloves on both hands to play a shooting game and the gloves identify the user's left hand as the leading hand, the execution module 650 can control the game console to automatically switch to the right-hand-armed mode, better matching the user's habits.
Referring to Fig. 10, in one embodiment the device 600 further comprises:
an input module 660, configured to receive input information from the user.
Referring to Fig. 11, in one embodiment the input information is leading limb information, and the device 600 further comprises:
a second determination module 670, configured to determine, according to the input information and the identification result, whether the first limb is the left limb or the right limb.
For example, if the leading limb information indicates that the user is left-handed, and the identification result shows that the first limb is the leading hand, it can be determined that the first limb is the left hand.
Once it has been determined whether the first limb is the left or the right limb, settings such as the display interface of the wearable device worn on the first limb can be adapted accordingly, as can settings such as the display interface of a device, for example a smartphone, gripped by the first limb.
For example, if the first limb is determined to be the user's left hand, the unlock gesture of a smart watch worn on that limb can be set to a left-to-right swipe, and a smartphone can arrange consonants on the left half of the screen and vowels on the right half.
Referring to Fig. 12, in another embodiment the input information indicates whether the first limb is the left hand or the right hand, and the device 600 further comprises:
a third determination module 680, configured to determine, according to the input information and the identification result, whether the user is left-handed or right-handed.
For example, if the input information indicates that the first limb is the left hand, and the identification result shows that the first limb is the leading hand, it can be determined that the user is left-handed. The input information may be provided by voice input or text input, or by a behavioral input: for instance, when the user puts a left-hand glove on one hand, this is equivalent to inputting that the hand in question is the left hand.
One application scenario of the leading limb identification method and device of the embodiments of the present application may be as follows. A left-handed user wears a smart glove on each hand to play a video game. During an initial period, for example 10 minutes, the left-hand glove samples the myoelectric information of the five fingers of the left hand at a preset frequency, and the right-hand glove samples that of the five fingers of the right hand at the same frequency. The average amplitude of each hand's myoelectric information is then computed; the left hand's average amplitude is found to be clearly greater than the right hand's, so the left hand is identified as the leading hand. The control buttons on the display are then placed on the left-hand side for ease of use, improving the user experience.
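The two-glove scenario can be summarized end to end in a short sketch. All names, durations, and sample values below are illustrative assumptions, not taken from the patent.

```python
def mean_amplitude(samples):
    """Mean rectified amplitude of a window of EMG samples."""
    return sum(abs(s) for s in samples) / len(samples)

def controls_side(left_samples, right_samples):
    """Both hands are sampled at the same rate over an initial period;
    on-screen controls are placed on the side of the leading hand,
    i.e., the hand with the larger average amplitude."""
    if mean_amplitude(left_samples) > mean_amplitude(right_samples):
        return "left"
    return "right"

# A left-handed user: the left hand's EMG shows the larger average amplitude,
# so the control buttons end up on the left-hand side of the display.
side = controls_side([1.1, -0.9, 1.2], [0.2, -0.3, 0.1])
```

A real implementation would accumulate samples over the full initial period (e.g., 10 minutes at the preset frequency) rather than a handful of values, but the decision rule is the same comparison.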
Fig. 13 shows the hardware structure of the leading limb identification device according to another embodiment of the present application. The specific embodiments of the present application do not limit the implementation of the device. Referring to Fig. 13, the device 1300 may comprise:
a processor 1310, a communications interface 1320, a memory 1330, and a communication bus 1340, wherein:
the processor 1310, the communications interface 1320, and the memory 1330 communicate with one another via the communication bus 1340;
the communications interface 1320 is used to communicate with other network elements; and
the processor 1310 is used to execute a program 1332, and may specifically perform the relevant steps of the method embodiment shown in Fig. 1 above.
In particular, the program 1332 may comprise program code, and the program code comprises computer operation instructions.
The processor 1310 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 1330 stores the program 1332. The memory 1330 may comprise high-speed RAM, and may also comprise non-volatile memory, for example at least one disk memory. The program 1332 may specifically perform the following steps:
obtaining first myoelectric information of a first limb of a user; and
identifying, according to the first myoelectric information and reference information, whether the first limb is the leading limb.
For the specific implementation of each step in the program 1332, reference may be made to the corresponding steps or modules in the embodiments above, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may likewise be understood with reference to the corresponding processes in the method embodiments, and are not repeated here.
Those of ordinary skill in the art will recognize that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. Skilled practitioners may implement the described functions differently for each particular application, but such implementations should not be considered beyond the scope of the present application.
If the functions are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part thereof that contributes to the prior art, may be embodied in the form of a software product. This computer software product is stored in a storage medium and comprises several instructions for causing a computer device (which may be a personal computer, a controller, a network device, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are intended only to illustrate the present application and do not limit it; those of ordinary skill in the relevant technical field may make various changes and modifications without departing from the spirit and scope of the present application, so all equivalent technical solutions also fall within the scope of the present application, and the patent protection scope of the present application shall be defined by the claims.
Claims (19)
1. A leading limb identification method, characterized in that the method comprises:
obtaining first myoelectric information of a first limb of a user;
identifying, according to the first myoelectric information and a piece of reference information, whether the first limb is a leading limb.
2. The method of claim 1, characterized in that the reference information is a threshold determined according to the average amplitude of myoelectric information of the user's left limb and the average amplitude of myoelectric information of the user's right limb.
3. The method of claim 2, characterized in that identifying, according to the first myoelectric information and a piece of reference information, whether the first limb is a leading limb comprises:
in response to the average amplitude of the first myoelectric information being greater than the threshold, determining that the first limb is a leading limb;
in response to the average amplitude of the first myoelectric information being less than the threshold, determining that the first limb is not a leading limb.
4. The method of claim 1, characterized in that the reference information is second myoelectric information of a second limb of the user.
5. The method of claim 4, characterized in that identifying, according to the first myoelectric information and a piece of reference information, whether the first limb is a leading limb comprises:
in response to the average amplitude of the first myoelectric information being greater than the average amplitude of the second myoelectric information, determining that the first limb is a leading limb;
in response to the average amplitude of the first myoelectric information being less than the average amplitude of the second myoelectric information, determining that the first limb is not a leading limb.
6. The method of any one of claims 1 to 5, characterized in that the method further comprises:
performing an operation according to the identification result.
7. The method of any one of claims 1 to 6, characterized in that the method further comprises:
receiving input information from the user.
8. The method of claim 7, characterized in that the input information is leading limb information, and the method further comprises:
determining, according to the input information and the identification result, whether the first limb is the left limb or the right limb.
9. The method of claim 7, characterized in that the input information comprises whether the first limb is the left hand or the right hand, and the method further comprises:
determining, according to the input information and the identification result, whether the user is left-handed or right-handed.
10. A leading limb identification device, characterized in that the device comprises:
a first acquisition module, configured to obtain first myoelectric information of a first limb of a user;
an identification module, configured to identify, according to the first myoelectric information and a piece of reference information, whether the first limb is a leading limb.
11. The device of claim 10, characterized in that the device further comprises:
a first determination module, configured to determine, according to the average amplitude of myoelectric information of the user's left limb and the average amplitude of myoelectric information of the user's right limb, a threshold as the reference information.
12. The device of claim 11, characterized in that the identification module is configured to: in response to the average amplitude of the first myoelectric information being greater than the threshold, determine that the first limb is a leading limb; and
in response to the average amplitude of the first myoelectric information being less than the threshold, determine that the first limb is not a leading limb.
13. The device of claim 10, characterized in that the device further comprises:
a second acquisition module, configured to obtain second myoelectric information of a second limb of the user as the reference information.
14. The device of claim 13, characterized in that the identification module is configured to: in response to the average amplitude of the first myoelectric information being greater than the average amplitude of the second myoelectric information, determine that the first limb is a leading limb; and
in response to the average amplitude of the first myoelectric information being less than the average amplitude of the second myoelectric information, determine that the first limb is not a leading limb.
15. The device of any one of claims 10 to 14, characterized in that the device further comprises:
an execution module, configured to perform an operation according to the identification result.
16. The device of any one of claims 10 to 15, characterized in that the device further comprises:
an input module, configured to receive input information from the user.
17. The device of claim 16, characterized in that the input information is leading limb information, and the device further comprises:
a second determination module, configured to determine, according to the input information and the identification result, whether the first limb is the left limb or the right limb.
18. The device of claim 16, characterized in that the input information comprises whether the first limb is the left hand or the right hand, and the device further comprises:
a third determination module, configured to determine, according to the input information and the identification result, whether the user is left-handed or right-handed.
19. The device of any one of claims 10 to 18, characterized in that the device is a wearable device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410426333.0A CN104199543A (en) | 2014-08-26 | 2014-08-26 | Leading limb identification method and system |
CN201410705606.5A CN104407704A (en) | 2014-08-26 | 2014-11-27 | Dominant limb determination method and apparatus |
PCT/CN2015/086310 WO2016019894A1 (en) | 2014-08-07 | 2015-08-07 | Dominant limb identification method and device |
US15/501,766 US20170235366A1 (en) | 2014-08-07 | 2015-08-07 | Dominant limb identification method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410426333.0A CN104199543A (en) | 2014-08-26 | 2014-08-26 | Leading limb identification method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN104199543A true CN104199543A (en) | 2014-12-10 |
Family
ID=52084844
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410426333.0A Withdrawn CN104199543A (en) | 2014-08-07 | 2014-08-26 | Leading limb identification method and system |
CN201410705606.5A Pending CN104407704A (en) | 2014-08-26 | 2014-11-27 | Dominant limb determination method and apparatus |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410705606.5A Pending CN104407704A (en) | 2014-08-26 | 2014-11-27 | Dominant limb determination method and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN104199543A (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104463131A (en) * | 2014-12-17 | 2015-03-25 | 北京智谷技术服务有限公司 | Method and device for determining inner side and outer side of limb |
WO2016019894A1 (en) * | 2014-08-07 | 2016-02-11 | Beijing Zhigu Tech Co., Ltd. | Dominant limb identification method and device |
CN106371561A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Input information determination method and device |
CN106371560A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Blowing and sucking determination method and device |
US11517261B2 (en) | 2014-09-15 | 2022-12-06 | Beijing Zhigu Tech Co., Ltd. | Method and device for determining inner and outer sides of limbs |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100451924C (en) * | 2005-12-30 | 2009-01-14 | 财团法人工业技术研究院 | Emotion perception interdynamic recreational apparatus |
CN102890558B (en) * | 2012-10-26 | 2015-08-19 | 北京金和软件股份有限公司 | The method of mobile hand-held device handheld motion state is detected based on sensor |
CN103941874B (en) * | 2014-04-30 | 2017-03-01 | 北京智谷睿拓技术服务有限公司 | Recognition methodss and equipment |
2014
- 2014-08-26: CN application CN201410426333.0A, publication CN104199543A (en), not active, withdrawn
- 2014-11-27: CN application CN201410705606.5A, publication CN104407704A (en), pending
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016019894A1 (en) * | 2014-08-07 | 2016-02-11 | Beijing Zhigu Tech Co., Ltd. | Dominant limb identification method and device |
US11517261B2 (en) | 2014-09-15 | 2022-12-06 | Beijing Zhigu Tech Co., Ltd. | Method and device for determining inner and outer sides of limbs |
CN104463131A (en) * | 2014-12-17 | 2015-03-25 | 北京智谷技术服务有限公司 | Method and device for determining inner side and outer side of limb |
CN104463131B (en) * | 2014-12-17 | 2018-03-02 | 北京智谷技术服务有限公司 | Outside determines method and apparatus in limbs |
CN106371561A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Input information determination method and device |
CN106371560A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Blowing and sucking determination method and device |
CN106371560B (en) * | 2015-08-19 | 2020-06-02 | 北京智谷睿拓技术服务有限公司 | Method and apparatus for determining blowing and suction air |
Also Published As
Publication number | Publication date |
---|---|
CN104407704A (en) | 2015-03-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104133554A (en) | Method and device for identifying leading limb | |
CN104199543A (en) | Leading limb identification method and system | |
US20150109202A1 (en) | Systems, articles, and methods for gesture identification in wearable electromyography devices | |
US10642820B2 (en) | Method for data processing and related products | |
CN104049752A (en) | Interaction method based on human body and interaction device based on human body | |
CN105597315A (en) | Virtual object throwing control method and device | |
CN104966011A (en) | Method for non-collaborative judgment and operating authorization restriction for mobile terminal child user | |
CN103760970A (en) | Wearable input system and method | |
US11201679B2 (en) | Communications methods and user equipment | |
CN110221684A (en) | Apparatus control method, system, electronic device and computer readable storage medium | |
CN104834580A (en) | Calculation and enthrallment control method for enthrallment value of terminal user | |
CN104407703A (en) | Dominant limb determination method and apparatus | |
CN110123313A (en) | A kind of self-training brain machine interface system and related training method | |
CN114707562A (en) | Electromyographic signal sampling frequency control method and device and storage medium | |
CN105068658A (en) | Man-machine interaction device capable of performing virtual environment control | |
CN105141772B (en) | A kind of the alarm clock method for closing and device of mobile terminal | |
CN105100875B (en) | A kind of control method and device of recording of multimedia information | |
CN108225369A (en) | A kind of information acquisition method and device and related media production | |
KR101724115B1 (en) | Method, device, system and non-transitory computer-readable recording medium for providing feedback | |
CN109002244A (en) | Watchband control method, wearable device and the readable storage medium storing program for executing of wearable device | |
CN104216519A (en) | Dominant limb cognition method and equipment | |
CN104367320A (en) | Method and device for determining dominant eye | |
CN104375649A (en) | Leading limb determination method and device | |
CN103376918A (en) | Gravity sensing method and electronic equipment | |
CN104360750A (en) | Dominant limb determination method and dominant limb determination device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C04 | Withdrawal of patent application after publication (patent law 2001) | ||
WW01 | Invention patent application withdrawn after publication |
Application publication date: 20141210 |