US20010014441A1 - Wireless myoelectric control apparatus and methods - Google Patents
- Publication number: US20010014441A1
- Application number: US09/840,255
- Authority: US (United States)
- Prior art keywords: myoelectric, gesture, wireless, electronic device, portable electronic
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- Referring to FIG. 5, a flowchart describing a method of learning gestures is shown. This method may be carried out by wireless myoelectric apparatus 102 of FIG. 1, which in this context may be referred to as a wireless adaptive myoelectric interface (“WAMI”).
- First, a “gesture learning mode” is enabled for apparatus 102. This mode may be activated at initial powerup (prior to any operation), in response to a particular gesture or activation, or in response to a detection of some error between incoming gestures and prestored patterns.
- Next, an indication is provided as to what gesture is to be learned (step 504). This indication may be provided at and by apparatus 102, but preferably it is provided at and by portable electronic device 104 (e.g., an audio indication at earpiece 134).
- If there are no more gestures to learn at step 514, the flowchart ends at an end block 516. If there are more gestures to learn at step 514, the next gesture is selected (step 518) and the method repeats at step 504.
- The number of gestures to learn for portable electronic device 104 may be relatively large, in some implementations on the order of 10-20 gestures.
- The number and the positioning of myoelectric sensors 110 and the discernibility of digital processor 220 are collectively configured so that detection is possible with a relatively large number of hand gestures, such as those indicated in FIGS. 6 and 7. As one example, one to three myoelectric sensors 110 are positioned on band 108 so that twenty or more finger or hand gestures can be recognized.
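The learning loop above can be sketched as follows. This is an illustrative reconstruction, not code from the patent: the function names and the simple averaging of repeated digitized traces into a stored template are assumptions.

```python
# Sketch of the FIG. 5 gesture-learning idea: the user repeats a gesture a few
# times, and an averaged myoelectric trace is stored as the prestored pattern
# used later for correlation. Names and the averaging scheme are illustrative.

def learn_gesture(samples):
    """Average several equal-length digitized sensor traces into one template."""
    n = len(samples)
    length = len(samples[0])
    return [sum(trace[i] for trace in samples) / n for i in range(length)]

def learn_gesture_set(training_data):
    """Map each gesture name to its averaged template."""
    return {name: learn_gesture(traces) for name, traces in training_data.items()}
```

For example, two repetitions `[0, 2]` and `[2, 4]` of a hypothetical "T-i" gesture would be stored as the template `[1.0, 3.0]`.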
- The inventive wireless myoelectric control apparatus is suitable for use in operating a portable electronic device.
- The wireless myoelectric control apparatus may include a material which forms a forearm or wrist band; one or more myoelectric sensors carried on the band; a digital processor coupled to the myoelectric sensors; and a wireless transmitter coupled to the digital processor.
- A battery-operable portable electronic device has a wireless myoelectric user interface. This device may have a housing and electronic circuitry carried in the housing. The electronic circuitry includes at least part of the wireless myoelectric user interface (e.g., the wireless receiver).
- An apparatus includes one or more myoelectric sensors; a gesture detector coupled to said myoelectric sensors, an output of said gesture detector providing a portable electronic device command corresponding to a detected gesture; a wireless transmitter coupled to the output of said gesture detector; and an antenna coupled to said wireless transmitter.
- An inventive method of processing myoelectric sensor signals comprises the steps of: receiving myoelectric sensor signals indicative of one of a plurality of hand or finger gestures, where the plurality of gestures includes a first gesture which reflects contact between a thumb and an index finger of a human hand, a second gesture which reflects contact between the thumb and a middle finger of the human hand, and a third gesture which reflects contact between the thumb and a ring finger of the human hand; detecting one of the plurality of gestures based on the myoelectric sensor signals; and selectively issuing one of a plurality of commands based on the detected gesture.
Abstract
Myoelectric and wireless technologies are used for the control of a portable electronic device, such as a cellular telephone or a personal digital assistant (PDA). That is, a portable electronic device has a wireless myoelectric user interface. An apparatus includes a material which forms a forearm or wrist band, myoelectric sensors attached to the band, a digital processor coupled to the myoelectric sensors, and a wireless transmitter coupled to the digital processor. The apparatus is operative to sense and detect particular hand and/or finger gestures, and to broadcast control signals corresponding to the gestures for operative control of the portable electronic device.
Description
- This application claims the benefit of priority of U.S. Provisional Application Ser. No. 60/104,534, filed Oct. 16, 1998.
- 1. Field of the Invention
- The present invention relates generally to the field of user interfaces for portable electronic devices, as well as to the field of wireless myoelectric control.
- 2. Description of the Related Art
- Portable electronic devices have become increasingly popular. These devices can operate with the use of a small battery pack or battery cells. Examples of these devices include wireless or cellular telephones, personal digital assistants (PDAs), and audio or music delivery devices. Some devices have become increasingly small such that they are now deemed “pocketable” and/or “wearable.”
- A portable electronic device typically has a user interface for operative control. Most if not all conventional user interfaces for such portable electronic devices employ physical buttons, stylus, or voice control. In some devices, a large number of operations or functions are possible with the user interface.
- One major shortcoming of a button- or stylus-based user interface is that the user must physically retrieve and position the portable electronic device appropriately for physical contact therewith. In addition, as the size of a device becomes smaller, the interface becomes increasingly inappropriate from an ergonomic standpoint. The major shortcoming of a voice-controlled interface is that the user must speak openly in such a way that other nearby people may hear.
- Myoelectric technologies are known and have been used for control in some applications. In U.S. Pat. No. 4,149,716, a head band with electrodes is used to generate signals for wired control of visual displays in a television display game. In U.S. Pat. No. 5,482,051, electromyographic sensors are placed on the back of a user's hand to provide for the realistic “squeezing of objects” in a virtual reality application. U.S. Pat. No. 2,252,102 discloses orthotic and prosthetic devices which are controlled with myoswitches in connection with wireless communication. As described in U.S. Pat. No. 5,679,004, a body suit has myoelectric sensors for contact with an arm, where the signals therefrom are broadcasted to a remote station for comparison of teacher and student motions. U.S. Pat. No. 5,692,417 describes the combined use of EEG and EMG signals to produce control signals for electronic musical instruments and video games. None of these patents describes a wireless myoelectric apparatus for operative control of a portable electronic device with a relatively large number of clearly discernible human commands.
- Accordingly, there is an existing need to provide a convenient and easy-to-use user interface for small portable electronic devices.
- As described herein, the inventive methods and apparatus involve the use of myoelectric and wireless technology for the control of a portable electronic device. That is, the inventive portable electronic device has a wireless myoelectric user interface.
- In one aspect of the invention, the apparatus comprises a material which forms a band, myoelectric sensors attached to the band, a digital processor coupled to the myoelectric sensors, and a wireless transmitter coupled to the digital processor. The band is sized and configured to fit around a human forearm or wrist. In general, the apparatus is operative to sense and detect particular hand and/or finger gestures, and to broadcast control signals corresponding to the gestures for operative control of the portable electronic device.
- The apparatus may employ a method of receiving and digitizing signals from the myoelectric sensors, determining whether a correlation exists between the digitized signal data and one of a plurality of prestored gesture patterns, selecting and generating a control signal or message associated with the prestored gesture pattern when a correlation exists, and modulating and transmitting a radio frequency (RF) signal with the control signal for control of the portable electronic device.
- FIG. 1 is a diagram of a wireless communication system having a wireless myoelectric apparatus and a portable electronic device.
- FIG. 2 is a schematic block diagram of electronic circuitry of the wireless myoelectric apparatus of FIG. 1.
- FIG. 3 is a flowchart describing a method of processing myoelectric sensor signals which may be carried out by the wireless myoelectric apparatus of FIG. 1.
- FIG. 4 is a flowchart describing a method of processing control signals which may be carried out by the portable electronic device of FIG. 1.
- FIG. 5 is a flowchart describing a method of learning gestures which may be carried out by the wireless myoelectric apparatus of FIG. 1.
- FIG. 6 is a table showing exemplary detectable gestures of the wireless myoelectric apparatus of FIG. 1, exemplary controlled functions of the portable electronic device of FIG. 1, and exemplary relationships between such gestures and functions.
- FIG. 7 is another table showing exemplary detectable gestures of the wireless myoelectric apparatus of FIG. 1, exemplary controlled functions of the portable electronic device of FIG. 1, and exemplary relationships between such gestures and functions.
- FIG. 1 is a diagram of a wireless communication system 100 including a wireless myoelectric apparatus 102 and a portable electronic device 104. Apparatus 102 is operative to communicate to portable electronic device 104 via radio frequency (RF) signals 106 or other suitable over-the-air signals.
- Apparatus 102 includes a material which forms a band 108, myoelectric sensors 110, electronic circuitry 112, and an antenna 114. Band 108 has a hole 116, and is sized and configured to fit around a human forearm in a “snug-fit” fashion. Band 108 may be referred to as a “sleeve.” In the preferred embodiment, the material of band 108 is elastic, and may be made of well-known spandex or spandex-like material. Thus, band 108 may be stretched for attachment to and detachment from the forearm. In an alternate embodiment, the material of band 108 is a sheet of cloth having snaps or Velcro™ for attachment and detachment. In another embodiment, two bands with myoelectric sensors are used: one for the left forearm and a similar one for the right forearm.
- The band may be alternatively sized and configured to cover a part of the forearm that is smaller than that shown as being covered by band 108. As an example, FIG. 1 shows a band 150, which may be referred to as a “wristband”. Band 150 has a dimension along the length of the forearm that is smaller than that shown by band 108. This dimension may be about three inches (7.62 cm), for example, but may vary; for example, the dimension may vary from around one inch (2.54 cm) to five inches (12.7 cm). Preferably, band 150 is not suited to cover most of the forearm, but rather to cover only an upper part of the forearm (e.g., where the muscle is the thickest). Left and right wristbands may be used in combination as well.
- Myoelectric sensors 110, such as a myoelectric sensor 118, are coupled to electronic circuitry 112 via conductors 120, such as a conductor 122. Myoelectric sensors 110 may be adhesively attached to band 108, and conductors 120 may be embedded within and/or carried on band 108. In any case, each of myoelectric sensors 110 is attached to and carried on band 108 such that, when the forearm carries band 108, the myoelectric sensor makes contact with the outer surface of the forearm.
- Electronic circuitry 112, which is described below in relation to FIG. 2, may be carried on band 108 as indicated in FIG. 1. In one embodiment, electronic circuitry 112 is carried in a pocket (not shown) of band 108. The pocket may be formed to completely enclose electronic circuitry 112, or remain partially open. In an alternate embodiment, electronic circuitry 112 is not carried on band 108 but rather contained outside of band 108, carried in a small housing (not shown), and coupled to myoelectric sensors 110 and/or conductors 120 via wires (not shown). Electronic circuitry 112 is coupled to antenna 114, which is configured to send commands via RF signals 106 to portable electronic device 104.
- FIG. 2 is a schematic block diagram of electronic circuitry 112 of wireless myoelectric apparatus 102 of FIG. 1. Electronic circuitry 112 includes myoelectric sensors 110, a signal amplifier 202, a gesture detector 204, a training module 206, memory 208, an encryption module 216, a transmitter 218, and antenna 114, coupled as indicated in FIG. 2. Memory 208 may include a read-only memory (ROM) 210, a random access memory (RAM) 212, and an electrically-erasable/programmable ROM (EEPROM) 214. Preferably, gesture detector 204, training module 206, and at least portions of memory 208 are included in a digital processor 220. Myoelectric sensors 110 of FIG. 2 refers to the sensors from the one or more bands in use. To provide energy for electronic circuitry 112, a portable energy source 222 is included. Portable energy source 222 may include a battery pack or one or more battery cells.
- Referring back to FIG. 1, portable electronic device 104 generally includes electronic circuitry 138 carried in a housing (see outline of 104). Electronic circuitry 138 shown in FIG. 1 allows for communication and control of portable electronic device 104 by apparatus 102. Electronic circuitry 138 includes a processor 124, a receiver 126, an antenna 128, a user interface 132 (which may include an earpiece or headset 134), and a decryption module 136, coupled as indicated.
- Examples of portable electronic device 104 include a portable audio player (having an AM/FM radio, tape cassette, and/or compact disc (CD) player), a wireless or cellular telephone, a personal digital assistant (PDA), a computer, and a calculator, to name a few. As apparent, the electronic circuitry of portable electronic device 104 may include circuitry and devices in addition to electronic circuitry 138 shown in FIG. 1, and will depend on the particular chosen application.
- Portable electronic device 104 may be freely carried by a user since it can operate without externally fixed wired connections. To make this possible and convenient, electronic circuitry 138 is powered by a portable energy source 130 which is carried by the housing. Portable energy source 130 may include a battery pack or one or more battery cells. Portable electronic device 104 also has a small size and a light weight. In one embodiment, portable electronic device 104 is sized to fit within a hand of a user; it may be referred to as a “hand-held” device. Portable electronic device 104 may be small enough to be considered “wearable” or “pocketable”. For example, portable electronic device 104 may have dimensions no greater than 3″×4″×½″, and a weight no greater than eight ounces. Thus, a user may carry or wear portable electronic device 104 outside of his/her view and accessibility, while listening to audio from portable electronic device 104 via earpiece 134.
- Generally, apparatus 102 is a wireless myoelectric interface for portable electronic device 104. Put another way, portable electronic device 104 has a wireless myoelectric user interface. In one embodiment, the conventional user interface 132 of portable electronic device 104 does not exist (except for perhaps earpiece 134) and the wireless myoelectric interface is the primary user interface for portable electronic device 104. That is, the buttons or stylus that would otherwise be present are missing from portable electronic device 104. In another embodiment, conventional user interface 132 is present but may be overridden by apparatus 102.
- FIG. 3 is a flowchart describing a method of processing myoelectric sensor signals. This method may be carried out by wireless myoelectric apparatus 102. As apparent, at least some steps of the methods described herein are embodied and implemented in connection with software and a digital processor. In the following description, FIGS. 2 and 3 will be referred to in combination with FIG. 1.
- A user has band 108 attached to his/her forearm or wrist. As the user moves hands and fingers, myoelectric sensors 110 pick up electrical signals from muscles that control the hands and fingers. Beginning at a start block 300 of FIG. 3, signals from myoelectric sensors 110 are received (step 302), amplified by signal amplifier 202 (step 304), and digitized by digital processor 220 (step 306). Gesture detector 204 detects whether a correlation exists between the digitized data and one of several prestored data patterns in memory 208 (step 308). Each one of the prestored data patterns corresponds to a particular gesture that is uniquely detectable by gesture detector 204. Preferably, these patterns are stored in EEPROM 214, and may be transferred to RAM 212 during operation.
- If no correlation exists at step 308, the flowchart ends at an end block 316. If a correlation exists at step 308, gesture detector 204 selects and generates a control signal or message that corresponds to the prestored data pattern (step 310). The control signal or message may be referred to or regarded as a “command” for portable electronic device 104. Next, gesture detector 204 enables encryption module 216 to encrypt the control signal or message (step 312). Gesture detector 204 also enables transmitter 218 so that an RF signal is modulated by the control signal or message and transmitted from antenna 114 (step 314) (see RF signals 106). Assuming portable electronic device 104 is within close proximity to apparatus 102, portable electronic device 104 will receive RF signals 106. Other electronic devices within close proximity will receive RF signals 106 as well. The flowchart ends at the end block 316, but the method may be repeated for continuous operation.
- FIG. 4 is a flowchart describing a method of processing control signals or commands. This method may be carried out by portable electronic device 104 of FIG. 1. This method may follow or occur at substantially the same time as the method of FIG. 3.
- Beginning at a start block 400 of FIG. 4, antenna 128 and receiver 126 receive and demodulate RF signals 106 broadcasted by apparatus 102 (step 402). Decryption module 136 attempts to decrypt the received signals (step 404). If the signals cannot be decrypted at step 404, then the flowchart ends at an end block 410. If the signals are decrypted at step 404, processor 124 obtains the control signal or message corresponding to the particular gesture (step 406). Advantageously, the signals can only be decrypted by portable electronic device 104 and not by other nearby devices. Processor 124 uniquely associates the control signal or message with a particular function. Once the association is made, processor 124 performs that function (step 408). The flowchart ends at an end block 410, but may be repeated for continuous operation.
- Preferably, the user receives confirmation for whether a particular gesture has been detected and/or acted upon by portable electronic device 104. The confirmation may be provided at and by apparatus 102 (given an appropriate user interface), or at and by portable electronic device 104 (e.g., appropriate audio provided at earpiece 134). This confirmation may take place after step 308 of FIG. 3, or around steps 406 and 408 of FIG. 4.
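The core of the FIG. 3 and FIG. 4 methods — correlating a digitized trace against prestored gesture patterns (step 308), issuing the associated command (step 310), and having the device perform the uniquely associated function (steps 406 and 408) — can be sketched as follows. This is an illustrative sketch only: the normalized-correlation measure, the threshold value, and all names are assumptions, and the encryption, modulation, and RF stages are omitted.

```python
# Transmit side: correlate a digitized sensor trace against prestored gesture
# templates and return the best match (or None when no correlation exists).
# Receive side: map a received command to a function and perform it.

def correlate(a, b):
    """Normalized correlation between two equal-length digitized traces."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def detect_gesture(trace, templates, threshold=0.9):
    """Return the best-matching gesture name above the threshold, or None."""
    best, score = None, threshold
    for name, pattern in templates.items():
        c = correlate(trace, pattern)
        if c >= score:
            best, score = name, c
    return best

def dispatch(command, functions):
    """Receiver side: perform the function uniquely associated with a command."""
    return functions[command]() if command in functions else None
```

A trace that is merely a scaled copy of a template correlates perfectly and is detected; an unrelated trace falls below the threshold and produces no command.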
myoelectric apparatus 102 of FIG. 1, exemplary controlled functions of the portableelectronic device 104 of FIG. 2, and exemplary relationships between such gestures and functions. The table in FIG. 6 shows four finger gestures detectable bygesture detector 204, where each gesture is uniquely associated with a function of portableelectronic device 104. In the table, symbols are used to denote the finger gestures. Here, “T” signifies the thumb, “i” signifies the index finger, “r” signifies the ring finger, and “p” signifies the pinky finger. A dash “-” is used between these signifiers to indicate physical contact therebetween. For example, a gesture “T-r” indicates the thumb in contact with the ring finger of the same hand. - As indicated in FIG. 6, the thumb in contact with the index finger corresponds to a “forward” command or function; the thumb in contact with the middle finger corresponds to a “play” command or function; the thumb in contact with the ring finger corresponds to a “stop” command or function; and the thumb in contact with the pinky finger corresponds to a “rewind” command or function. As apparent, these or similar functional associations may be used in, for example, audio/video player applications, voice mail retrieval applications, etc.
- As alternatively indicated in FIG. 6, the thumb in contact with the index finger corresponds to a “scroll up” command or function; the thumb in contact with the middle finger corresponds to a “s elect” command or function; and the thumb in contact with the ring finger corresponds to a “scroll down” command or function. As apparent, these or similar functional associations may be used in, for example, scrolling through long data lists, selecting prestored telephone numbers for dialing, voice mail applications, etc.
- FIG. 7 is another table showing other exemplary detectable gestures of
apparatus 102 of FIG. 1, exemplary controlled functions of portableelectronic device 104 of FIG. 1, and exemplary relationships between such gestures and functions. The table in FIG. 7 shows thirteen finger gestures detectable bygesture detector 204, where each gesture is uniquely associated with a function of portableelectronic device 104. In FIG. 7, the same symbols are used as in the table of FIG. 6. Here, however, the control is provided by gestures from both hands, the left hand (“LH”) and the right hand (“RH”). - As indicated in FIG. 7, combinations of simultaneous gestures from both the left and right hands may be associated with a particular function. For example, the number “1” is selected in response to the left thumb being in contact with the left index finger while the right thumb is in contact with the right index finger; the number “2” is selected in response to the left thumb being in contact with the left index finger while the right thumb is in contact with the right middle finger. Similarly, the number “4” is selected in response to the left thumb being in contact with the left middle finger while the right thumb is in contact with the right index finger; the number “5” is selected in response to the left thumb being in contact with the left middle finger while the right thumb is in contact with the right middle finger, etc. As apparent, these or similar functional associations may be used in, for example, selecting from one of many retrievable items, manually selecting telephone numbers for dialing, etc.
- Also as indicated in FIG. 7, a right hand “snap” between the right thumb and the right middle finger provides a “SEND” command or an “END” command, depending on what state portable electronic device 104 is in. For example, if portable electronic device 104 provides cellular telephone functionality, the right hand snap provides a “SEND” command or function when portable electronic device 104 is in an idle or standby mode, and an “END” command or function when portable electronic device 104 is in a talk or communication mode. As another example, if portable electronic device 104 provides telephone functionality, the right hand snap provides an “OFF-HOOK” command or function when portable electronic device 104 is on-hook and an “ON-HOOK” command or function when portable electronic device 104 is off-hook. As another example, the right hand snap provides a “POWER-UP” command or function when portable electronic device 104 is powered down, and a “POWER-DOWN” command or function when portable electronic device 104 is powered up. - Preferably,
apparatus 102 employs what are referred to as “gesture-activated transmission” and “gesture-activated interface enabling.” “Gesture-activated transmission” allows for the enabling and disabling of the RF broadcast based on whether any gestures are detected. That is, when a gesture is detected, the transmitter is enabled and RF signals are broadcast; when no gesture is detected, the transmitter is disabled and no RF signals are broadcast. More precisely, when a gesture is detected, the transmitter is enabled for a first predetermined time period before the command is sent. After the command is sent, the transmitter remains enabled for a second predetermined time period. If no other gesture is detected during the second predetermined time period, the transmitter is disabled and the RF signal broadcast ceases. If another gesture is detected during the second predetermined time period, however, the next command is immediately sent and the second predetermined time period is reset so that the RF broadcast is maintained for that time. In addition to prolonging battery life, this feature allows for sufficient signal hysteresis and receiver “pre-time” in order to adequately receive signals and commands. - On the other hand, “gesture-activated interface enabling” allows for the enabling and disabling of gesture commands based on a predetermined gesture. As an example, a right hand snap may activate the gesture interface and a left hand snap may deactivate the gesture interface. Given a right hand snap, commands and functions are issued and performed “as normal” for some set of predetermined gestures (e.g., those in FIG. 6). That is, the right hand snap “enables” the commands. However, no commands or functions are issued or performed for any predetermined gestures following a left hand snap, until the right hand snap is again given. This feature allows for the convenient activation and deactivation of the interface when necessary.
Any suitable gestures may be applied, and the same gesture may provide for both activation and deactivation using concepts similar to those described in relation to the “SEND” and “END” functions in FIG. 7.
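The gesture-activated transmission timing described above can be sketched as a small state machine. The period values and method names below are assumptions chosen for illustration, not figures from the specification:

```python
# Sketch of "gesture-activated transmission": the transmitter is enabled
# for a short "pre-time" before the first command is sent, stays enabled
# for a hold period after each command, and is disabled once that period
# lapses with no further gesture.
class GestureActivatedTransmitter:
    def __init__(self, pre_time=0.05, hold_time=2.0):
        self.pre_time = pre_time    # first predetermined time period (s)
        self.hold_time = hold_time  # second predetermined time period (s)
        self.enabled = False
        self.disable_at = None

    def on_gesture(self, now):
        """Schedule a command; returns the time at which it is sent."""
        if not self.enabled:
            self.enabled = True
            send_at = now + self.pre_time  # enable transmitter first
        else:
            send_at = now                  # already enabled: send immediately
        self.disable_at = send_at + self.hold_time  # reset the hold window
        return send_at

    def tick(self, now):
        """Disable the transmitter once the hold window has lapsed."""
        if self.enabled and now >= self.disable_at:
            self.enabled = False
        return self.enabled
```

Keeping the transmitter off between hold windows is what prolongs battery life, while the pre-time gives the receiver a settled carrier before the command arrives.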
- Referring now to FIG. 5, a flowchart describing a method of learning gestures is shown. This method may be carried out by wireless
myoelectric apparatus 102 of FIG. 1, which may be referred to as a wireless adaptive myoelectric interface (“WAMI”) in this context. Beginning at a start block 500, a “gesture learning mode” is enabled for apparatus 102. This mode may be activated at initial powerup (prior to any operation), in response to a particular gesture or activation, or in response to a detection of some error between incoming gestures and prestored patterns. Once the gesture learning mode is enabled at step 502, an indication is provided as to what gesture is to be learned (step 504). This indication may be provided at and by apparatus 102, but preferably it is provided at and by portable electronic device 104 (e.g., an audio indication at earpiece 134). - In response, the user performs the indicated gesture and signals are generated at
myoelectric sensors 110. Digital processor 220 and training module 206 receive and process the signals (step 506), and store the results in memory 208. As indicated at step 508, the same gesture and processing may be repeated for a predetermined number of repetitions or until training module 206 is “satisfied” with the data already received. After any repetition of the gesture, training module 206 determines a gesture pattern associated with the gesture (step 510) and stores it in memory 208 (step 512). Preferably, the gesture pattern is stored in EEPROM 214. - If there are no more gestures to learn at
step 514, the flowchart ends at an end block 516. If there are more gestures to learn at step 514, the next gesture is selected (step 518) and the method repeats at step 504. The number of gestures to learn for portable electronic device 104 may be relatively large, and in some implementations on the order of 10-20 gestures. The number and the positioning of myoelectric sensors 110 and the discernibility of digital processor 220 are collectively configured so that detection is possible with a relatively large number of hand gestures, such as those indicated in FIGS. 6 and 7. As one example, one to three myoelectric sensors 110 are positioned on band 108 so that twenty or more finger or hand gestures can be recognized. - Thus,
training module 206 as described in relation to FIG. 5 allows for accurate learning for a given individual. Preferably, the determining of the gesture patterns is performed using a discriminative algorithm. The learning algorithm first maps the high dimensional myoelectric recordings into an internal compact representation. Next, a machine learning technique called “boosting” is used to find a set of discriminative features. Finally, the features are combined into a single highly accurate, yet compact, gesture classifier. For best performance, special attention is given to potential crosstalk from synchronous muscle activations as described in “Surface Myoelectric Crosstalk Among Muscles Of The Leg,” C. DeLuca, R. Merletti, EEG & Clin. Neurophysiol., vol. 69, pp. 568-575, 1988, and “Detection Of Motor Unit Action Potentials With Surface Electrodes: Influence Of Electrode Size And Spacing,” A. Fuglevand et al., Biol. Cybern., vol. 67, pp. 143-153, 1992. - As described, the inventive wireless myoelectric control apparatus is suitable for use in operating a portable electronic device. The wireless myoelectric control apparatus may include a material which forms a forearm or wrist band, one or more myoelectric sensors carried on the band; a digital processor coupled to the myoelectric sensors; and a wireless transmitter coupled to the digital processor.
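The classifier construction described for training module 206 (compact features, boosting, a single combined classifier) can be illustrated with a toy AdaBoost over threshold “stumps.” This is a generic sketch of the boosting idea on plain Python lists, not the patented algorithm; feature extraction from raw myoelectric windows is omitted, and labels are assumed to be +1/-1:

```python
import math

def train_boosted_stumps(X, y, n_rounds=5):
    """Toy AdaBoost: each round picks the most discriminative
    feature/threshold test on the weighted data, then reweights the
    examples that test misclassified."""
    n, d = len(X), len(X[0])
    w = [1.0 / n] * n
    ensemble = []  # list of (alpha, feature, threshold, sign)
    for _ in range(n_rounds):
        best = None
        for f in range(d):
            for t in sorted({row[f] for row in X}):
                for sign in (1, -1):
                    err = sum(
                        wi for wi, row, yi in zip(w, X, y)
                        if (1 if sign * (row[f] - t) >= 0 else -1) != yi
                    )
                    if best is None or err < best[0]:
                        best = (err, f, t, sign)
        err, f, t, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)  # weight of this stump
        ensemble.append((alpha, f, t, sign))
        # Increase the weight of misclassified examples for the next round.
        w = [
            wi * math.exp(-alpha * yi * (1 if sign * (row[f] - t) >= 0 else -1))
            for wi, row, yi in zip(w, X, y)
        ]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def classify(ensemble, x):
    """Combine the weighted stump votes into a single decision."""
    score = sum(
        alpha * (1 if sign * (x[f] - t) >= 0 else -1)
        for alpha, f, t, sign in ensemble
    )
    return 1 if score >= 0 else -1
```

Each round concentrates on the examples earlier stumps got wrong, so the final weighted vote is a compact yet accurate classifier, mirroring the three stages described above.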
- In other inventive aspects, a battery-operable portable electronic device has a wireless myoelectric user interface. This device may have a housing and electronic circuitry carried in the housing. The electronic circuitry includes at least part of the wireless myoelectric user interface (e.g., the wireless receiver).
- In yet other aspects, an apparatus includes one or more myoelectric sensors; a gesture detector coupled to said myoelectric sensors; an output of said gesture detector providing a portable electronic device command corresponding to a detected gesture; a wireless transmitter coupled to the output of said gesture detector; and an antenna coupled to said wireless transmitter.
- The apparatus may employ a method of receiving and digitizing signals from the myoelectric sensors, determining whether a correlation exists between the digitized signal data and one of a plurality of prestored gesture patterns, selecting and generating a control signal or message associated with the prestored gesture pattern when a correlation exists, and modulating and transmitting a radio frequency (RF) signal with the control signal for control of the portable electronic device.
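The correlation step of that method can be sketched as normalized (Pearson) template matching between the digitized window and each prestored pattern. The function name and the 0.8 threshold are assumptions for illustration:

```python
# Sketch of matching digitized sensor data against prestored gesture
# patterns via Pearson correlation; returns the best match above a
# threshold, or None when no pattern correlates strongly enough.
def best_match(samples, patterns, threshold=0.8):
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        da = sum((x - ma) ** 2 for x in a) ** 0.5
        db = sum((y - mb) ** 2 for y in b) ** 0.5
        return num / (da * db) if da and db else 0.0

    scores = {name: corr(samples, p) for name, p in patterns.items()}
    name = max(scores, key=scores.get)
    return name if scores[name] >= threshold else None
```

Only when a pattern clears the threshold would the corresponding control message be generated and handed to the RF modulator; otherwise no command is issued.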
- Also, an inventive method of processing myoelectric sensor signals comprises the steps of receiving myoelectric sensor signals indicative of one of a plurality of hand or finger gestures, where the plurality of gestures includes a first gesture which reflects contact between a thumb and an index finger of a human hand, a second gesture which reflects contact between the thumb and a middle finger of the human hand, and a third gesture which reflects contact between the thumb and a ring finger of the human hand; detecting one of the plurality of gestures based on the myoelectric sensor signals; and selectively issuing one of a plurality of commands based on the detected gesture.
- The methods described have been used in connection with
apparatus 102 and portable electronic device 104. However, the methods may be employed between apparatus 102 and a number of wireless-controlled devices that are owned and operated by the user (including portable electronic device 104). - As readily apparent, the inventive aspects described herein provide several advantages in the field of myoelectric control. The present invention is particularly advantageous in connection with user interfaces for portable electronic devices. Thus, the scope of the invention should be understood to be quite broad and warrant a broad range of equivalent structures and functionalities.
Claims (20)
1. A wireless myoelectric control apparatus suitable for use in operating a portable electronic device, comprising:
a material which forms a forearm or wrist band;
one or more myoelectric sensors carried on said band;
a digital processor;
said digital processor coupled to said myoelectric sensors;
a wireless transmitter; and
said wireless transmitter coupled to said digital processor.
2. The apparatus according to claim 1, wherein said digital processor further comprises a gesture detector.
3. The apparatus according to claim 1, further comprising:
said digital processor comprising a gesture detector; and
memory coupled to said digital processor.
4. The apparatus according to claim 1, further comprising:
said digital processor comprising a gesture detector;
memory coupled to said digital processor; and
said memory for storing a plurality of gesture patterns.
5. The apparatus according to claim 1, wherein said digital processor further comprises a training module.
6. The apparatus according to claim 1, wherein said material is elastic.
7. The apparatus according to claim 1, further comprising:
an antenna; and
said antenna coupled to said wireless transmitter.
8. An apparatus, comprising:
one or more myoelectric sensors;
a gesture detector;
said gesture detector coupled to said myoelectric sensors;
an output of said gesture detector providing a portable electronic device command corresponding to a detected gesture;
a wireless transmitter coupled to the output of said gesture detector; and
an antenna coupled to said wireless transmitter.
9. The apparatus according to claim 8, further comprising:
a training module; and
said training module coupled to said myoelectric sensors.
10. The apparatus according to claim 8, further comprising:
memory;
said memory coupled to said gesture detector; and
said memory for storing a plurality of gesture patterns.
11. A battery-operable portable electronic device, comprising:
a wireless myoelectric user interface.
12. The device according to claim 11, wherein said wireless myoelectric user interface comprises:
a wireless receiver.
13. The device according to claim 11, wherein said wireless myoelectric user interface comprises:
electronic circuitry, including:
a wireless receiver;
said wireless receiver operative to receive user interface commands;
a processor; and
said processor operative to receive and process the received user interface commands.
14. The device according to claim 12, further comprising:
a housing; and
said electronic circuitry carried in said housing.
15. The device according to claim 11, wherein the device comprises one of a portable computer, a portable wireless or cellular telephone, a portable audio player, a portable calculator, and a personal digital assistant (PDA).
16. A method of processing myoelectric sensor signals, comprising:
receiving myoelectric sensor signals indicative of one of a plurality of hand or finger gestures;
detecting one of the plurality of gestures based on the myoelectric sensor signals; and
selectively issuing one of a plurality of commands associated with the detected gesture.
17. The method according to claim 16, further comprising:
broadcasting the command with a wireless transmitter.
18. The method according to claim 17, further comprising:
amplifying the myoelectric sensor signals; and
digitizing the amplified myoelectric sensor signals.
19. The method according to claim 16, further comprising:
in response to detecting, activating a wireless transmitter to broadcast the command.
20. The method according to claim 16, wherein the plurality of hand or finger gestures includes a first gesture which reflects contact between a thumb and an index finger of a human hand, a second gesture which reflects contact between the thumb and a middle finger of the human hand, and a third gesture which reflects contact between the thumb and a ring finger of the human hand.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/840,255 US20010014441A1 (en) | 1998-10-16 | 2001-04-23 | Wireless myoelectric control apparatus and methods |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10453498P | 1998-10-16 | 1998-10-16 | |
US09/419,376 US6244873B1 (en) | 1998-10-16 | 1999-10-15 | Wireless myoelectric control apparatus and methods |
US09/840,255 US20010014441A1 (en) | 1998-10-16 | 2001-04-23 | Wireless myoelectric control apparatus and methods |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/419,376 Continuation US6244873B1 (en) | 1998-10-16 | 1999-10-15 | Wireless myoelectric control apparatus and methods |
Publications (1)
Publication Number | Publication Date |
---|---|
US20010014441A1 | 2001-08-16 |
Family
ID=26801658
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/419,376 Expired - Lifetime US6244873B1 (en) | 1998-10-16 | 1999-10-15 | Wireless myoelectric control apparatus and methods |
US09/840,255 Abandoned US20010014441A1 (en) | 1998-10-16 | 2001-04-23 | Wireless myoelectric control apparatus and methods |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/419,376 Expired - Lifetime US6244873B1 (en) | 1998-10-16 | 1999-10-15 | Wireless myoelectric control apparatus and methods |
Country Status (1)
Country | Link |
---|---|
US (2) | US6244873B1 (en) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6564144B1 (en) * | 2002-01-10 | 2003-05-13 | Navigation Technologies Corporation | Method and system using a hand-gesture responsive device for collecting data for a geographic database |
US20030184498A1 (en) * | 2002-03-29 | 2003-10-02 | Massachusetts Institute Of Technology | Socializing remote communication |
US20070178950A1 (en) * | 2006-01-19 | 2007-08-02 | International Business Machines Corporation | Wearable multimodal computing device with hands-free push to talk |
US20070225034A1 (en) * | 2002-05-24 | 2007-09-27 | Schmidt Dominik J | Dynamically configured antenna for multiple frequencies and bandwidths |
US20080058668A1 (en) * | 2006-08-21 | 2008-03-06 | Kaveh Seyed Momen | Method, system and apparatus for real-time classification of muscle signals from self-selected intentional movements |
US20090237266A1 (en) * | 2008-03-20 | 2009-09-24 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
WO2010030822A1 (en) * | 2008-09-10 | 2010-03-18 | Oblong Industries, Inc. | Gestural control of autonomous and semi-autonomous systems |
US20110071417A1 (en) * | 2009-08-21 | 2011-03-24 | The Chinese University Of Hong Kong | Systems and methods for reproducing body motions via networks |
WO2016026100A1 (en) * | 2014-08-20 | 2016-02-25 | 华为技术有限公司 | Myoelectric signal acquisition device |
CN106371560A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Blowing and sucking determination method and device |
CN106371561A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Input information determination method and device |
US10796599B2 (en) * | 2017-04-14 | 2020-10-06 | Rehabilitation Institute Of Chicago | Prosthetic virtual reality training interface and related methods |
US10959863B2 (en) * | 2017-06-20 | 2021-03-30 | Southeast University | Multi-dimensional surface electromyogram signal prosthetic hand control method based on principal component analysis |
US11032137B2 (en) | 2014-06-09 | 2021-06-08 | Samsung Electronics Co., Ltd. | Wearable electronic device, main electronic device, system and control method thereof |
CN116486683A (en) * | 2023-06-20 | 2023-07-25 | 浙江强脑科技有限公司 | Intelligent bionic hand teaching aid |
Families Citing this family (154)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8352400B2 (en) | 1991-12-23 | 2013-01-08 | Hoffberg Steven M | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
US7904187B2 (en) | 1999-02-01 | 2011-03-08 | Hoffberg Steven M | Internet appliance system and method |
US7760905B2 (en) * | 1999-06-29 | 2010-07-20 | Digimarc Corporation | Wireless mobile phone with content processing |
US20020032734A1 (en) | 2000-07-26 | 2002-03-14 | Rhoads Geoffrey B. | Collateral data combined with user characteristics to select web site |
US7565294B2 (en) * | 1999-05-19 | 2009-07-21 | Digimarc Corporation | Methods and systems employing digital content |
US7406214B2 (en) | 1999-05-19 | 2008-07-29 | Digimarc Corporation | Methods and devices employing optical sensors and/or steganography |
US8391851B2 (en) | 1999-11-03 | 2013-03-05 | Digimarc Corporation | Gestural techniques with wireless mobile phone devices |
US6440067B1 (en) * | 2000-02-28 | 2002-08-27 | Altec, Inc. | System and method for remotely monitoring functional activities |
US6720984B1 (en) * | 2000-06-13 | 2004-04-13 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Characterization of bioelectric potentials |
US7148879B2 (en) * | 2000-07-06 | 2006-12-12 | At&T Corp. | Bioacoustic control system, method and apparatus |
US20020071277A1 (en) * | 2000-08-12 | 2002-06-13 | Starner Thad E. | System and method for capturing an image |
US7044911B2 (en) * | 2001-06-29 | 2006-05-16 | Philometron, Inc. | Gateway platform for biological monitoring and delivery of therapeutic compounds |
US6990639B2 (en) | 2002-02-07 | 2006-01-24 | Microsoft Corporation | System and process for controlling electronic components in a ubiquitous computing environment using multimodal integration |
SE0201434L (en) * | 2002-05-10 | 2003-10-14 | Henrik Dryselius | Device for input control signals to an electronic device |
AU2003251983A1 (en) * | 2002-07-08 | 2004-01-23 | Ossur Engineering, Inc. | Socket liner incorporating sensors to monitor amputee progress |
DE60215504T2 (en) * | 2002-10-07 | 2007-09-06 | Sony France S.A. | Method and apparatus for analyzing gestures of a human, e.g. for controlling a machine by gestures |
US7204814B2 (en) | 2003-05-29 | 2007-04-17 | Muscle Tech Ltd. | Orthodynamic rehabilitator |
CN100374986C (en) * | 2003-06-12 | 2008-03-12 | 控制仿生学公司 | Method, system, and software for interactive communication and analysis |
US20050211068A1 (en) * | 2003-11-18 | 2005-09-29 | Zar Jonathan D | Method and apparatus for making music and article of manufacture thereof |
US7707039B2 (en) | 2004-02-15 | 2010-04-27 | Exbiblio B.V. | Automatic modification of web pages |
US8442331B2 (en) | 2004-02-15 | 2013-05-14 | Google Inc. | Capturing text from rendered documents using supplemental information |
US7812860B2 (en) | 2004-04-01 | 2010-10-12 | Exbiblio B.V. | Handheld device for capturing text from both a document printed on paper and a document displayed on a dynamic display device |
US10635723B2 (en) | 2004-02-15 | 2020-04-28 | Google Llc | Search engines and systems with handheld document data capture devices |
US9143638B2 (en) | 2004-04-01 | 2015-09-22 | Google Inc. | Data capture from rendered documents using handheld device |
US8146156B2 (en) | 2004-04-01 | 2012-03-27 | Google Inc. | Archive of text captures from rendered documents |
US20060098900A1 (en) | 2004-09-27 | 2006-05-11 | King Martin T | Secure data gathering from rendered documents |
US7894670B2 (en) | 2004-04-01 | 2011-02-22 | Exbiblio B.V. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8081849B2 (en) | 2004-12-03 | 2011-12-20 | Google Inc. | Portable scanning and memory device |
US20060081714A1 (en) | 2004-08-23 | 2006-04-20 | King Martin T | Portable scanning device |
US7990556B2 (en) | 2004-12-03 | 2011-08-02 | Google Inc. | Association of a portable scanner with input/output and storage devices |
US9008447B2 (en) | 2004-04-01 | 2015-04-14 | Google Inc. | Method and system for character recognition |
US9116890B2 (en) | 2004-04-01 | 2015-08-25 | Google Inc. | Triggering actions in response to optically or acoustically capturing keywords from a rendered document |
US8713418B2 (en) | 2004-04-12 | 2014-04-29 | Google Inc. | Adding value to a rendered document |
US8620083B2 (en) | 2004-12-03 | 2013-12-31 | Google Inc. | Method and system for character recognition |
US8489624B2 (en) | 2004-05-17 | 2013-07-16 | Google, Inc. | Processing techniques for text capture from a rendered document |
US8874504B2 (en) | 2004-12-03 | 2014-10-28 | Google Inc. | Processing techniques for visual capture data from a rendered document |
US8346620B2 (en) | 2004-07-19 | 2013-01-01 | Google Inc. | Automatic modification of web pages |
JP2006060584A (en) * | 2004-08-20 | 2006-03-02 | Fuji Photo Film Co Ltd | Digital camera |
US8702629B2 (en) | 2005-03-17 | 2014-04-22 | Great Lakes Neuro Technologies Inc. | Movement disorder recovery system and method for continuous monitoring |
US8187209B1 (en) | 2005-03-17 | 2012-05-29 | Great Lakes Neurotechnologies Inc | Movement disorder monitoring system and method |
DE102005021412A1 (en) * | 2005-05-04 | 2006-11-09 | Otto Bock Healthcare Products Gmbh | System of a liner with a myoelectric electrode unit |
US7697827B2 (en) | 2005-10-17 | 2010-04-13 | Konicek Jeffrey C | User-friendlier interfaces for a camera |
US20070158911A1 (en) * | 2005-11-07 | 2007-07-12 | Torre Gabriel D L | Interactive role-play toy apparatus |
US10022545B1 (en) * | 2006-05-11 | 2018-07-17 | Great Lakes Neurotechnologies Inc | Movement disorder recovery system and method |
EP2067119A2 (en) | 2006-09-08 | 2009-06-10 | Exbiblio B.V. | Optical scanners, such as hand-held optical scanners |
JP2008305198A (en) * | 2007-06-07 | 2008-12-18 | Fujitsu Component Ltd | Input system and input device |
JP2008305199A (en) * | 2007-06-07 | 2008-12-18 | Fujitsu Component Ltd | Input system and program |
US20090125824A1 (en) * | 2007-11-12 | 2009-05-14 | Microsoft Corporation | User interface with physics engine for natural gestural control |
US20090216339A1 (en) * | 2008-01-02 | 2009-08-27 | Hanson William J | Through-Liner Electrode System for Prosthetics and the Like |
US8344998B2 (en) * | 2008-02-01 | 2013-01-01 | Wimm Labs, Inc. | Gesture-based power management of a wearable portable electronic device with display |
US20090251407A1 (en) * | 2008-04-03 | 2009-10-08 | Microsoft Corporation | Device interaction with combination of rings |
US8250454B2 (en) * | 2008-04-03 | 2012-08-21 | Microsoft Corporation | Client-side composing/weighting of ads |
WO2009131664A2 (en) * | 2008-04-21 | 2009-10-29 | Carl Frederick Edman | Metabolic energy monitoring system |
WO2009130573A1 (en) * | 2008-04-22 | 2009-10-29 | Sergei Startchik | Method and device for determining gestures made by a user with his hand |
US20090289937A1 (en) * | 2008-05-22 | 2009-11-26 | Microsoft Corporation | Multi-scale navigational visualtization |
US8682736B2 (en) | 2008-06-24 | 2014-03-25 | Microsoft Corporation | Collection represents combined intent |
US9037530B2 (en) | 2008-06-26 | 2015-05-19 | Microsoft Technology Licensing, Llc | Wearable electromyography-based human-computer interface |
US8447704B2 (en) | 2008-06-26 | 2013-05-21 | Microsoft Corporation | Recognizing gestures from forearm EMG signals |
US8170656B2 (en) * | 2008-06-26 | 2012-05-01 | Microsoft Corporation | Wearable electromyography-based controllers for human-computer interface |
CN105930311B (en) | 2009-02-18 | 2018-10-09 | 谷歌有限责任公司 | Execute method, mobile device and the readable medium with the associated action of rendered document |
US8447066B2 (en) | 2009-03-12 | 2013-05-21 | Google Inc. | Performing actions based on capturing information from rendered documents, such as documents under copyright |
WO2010105245A2 (en) | 2009-03-12 | 2010-09-16 | Exbiblio B.V. | Automatically providing content associated with captured information, such as information captured in real-time |
US9081799B2 (en) | 2009-12-04 | 2015-07-14 | Google Inc. | Using gestalt information to identify locations in printed information |
US9323784B2 (en) | 2009-12-09 | 2016-04-26 | Google Inc. | Image search using text-based elements within the contents of images |
US8428664B1 (en) * | 2010-02-27 | 2013-04-23 | Philip W. Wyers | Case assembly for handheld electronic devices and method of use therefor |
WO2011112972A2 (en) * | 2010-03-11 | 2011-09-15 | Philometron, Inc. | Physiological monitor system for determining medication delivery and outcome |
EP2494928B1 (en) * | 2011-03-02 | 2018-01-17 | Siemens Aktiengesellschaft | Operating device for a technical device, in particular a medical device |
GB201114264D0 (en) | 2011-08-18 | 2011-10-05 | Touch Emas Ltd | Improvements in or relating to prosthetics and orthotics |
US9170674B2 (en) * | 2012-04-09 | 2015-10-27 | Qualcomm Incorporated | Gesture-based device control using pressure-sensitive sensors |
GB2502785B (en) * | 2012-06-06 | 2016-01-06 | Kuldeep Mahi | Muscle sensor switch |
WO2014089331A1 (en) | 2012-12-06 | 2014-06-12 | Ossur Hf | Electrical stimulation for orthopedic devices |
US20140198034A1 (en) | 2013-01-14 | 2014-07-17 | Thalmic Labs Inc. | Muscle interface device and method for interacting with content displayed on wearable head mounted displays |
US9223459B2 (en) | 2013-01-25 | 2015-12-29 | University Of Washington Through Its Center For Commercialization | Using neural signals to drive touch screen devices |
GB201302025D0 (en) * | 2013-02-05 | 2013-03-20 | Touch Emas Ltd | Improvements in or relating to prosthetics |
EP2959394B1 (en) | 2013-02-22 | 2021-05-12 | Facebook Technologies, LLC. | Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control |
US9234742B2 (en) * | 2013-05-01 | 2016-01-12 | Faro Technologies, Inc. | Method and apparatus for using gestures to control a laser tracker |
WO2014186370A1 (en) | 2013-05-13 | 2014-11-20 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices that accommodate different user forms |
EP2813194B8 (en) * | 2013-06-12 | 2018-07-11 | Otto Bock HealthCare GmbH | Control of limb device |
US9408316B2 (en) | 2013-07-22 | 2016-08-02 | Thalmic Labs Inc. | Systems, articles and methods for strain mitigation in wearable electronic devices |
CN105612475B (en) | 2013-08-07 | 2020-02-11 | 耐克创新有限合伙公司 | Wrist-worn sports apparatus with gesture recognition and power management |
US11426123B2 (en) | 2013-08-16 | 2022-08-30 | Meta Platforms Technologies, Llc | Systems, articles and methods for signal routing in wearable electronic devices that detect muscle activity of a user using a set of discrete and separately enclosed pod structures |
US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
US20150124566A1 (en) | 2013-10-04 | 2015-05-07 | Thalmic Labs Inc. | Systems, articles and methods for wearable electronic devices employing contact sensors |
US10042422B2 (en) | 2013-11-12 | 2018-08-07 | Thalmic Labs Inc. | Systems, articles, and methods for capacitive electromyography sensors |
US9788789B2 (en) | 2013-08-30 | 2017-10-17 | Thalmic Labs Inc. | Systems, articles, and methods for stretchable printed circuit boards |
US9372535B2 (en) | 2013-09-06 | 2016-06-21 | Thalmic Labs Inc. | Systems, articles, and methods for electromyography-based human-electronics interfaces |
US9483123B2 (en) | 2013-09-23 | 2016-11-01 | Thalmic Labs Inc. | Systems, articles, and methods for gesture identification in wearable electromyography devices |
US9313646B2 (en) | 2013-10-17 | 2016-04-12 | At&T Intellectual Property I, Lp | Method and apparatus for adjusting device persona |
US10311482B2 (en) | 2013-11-11 | 2019-06-04 | At&T Intellectual Property I, Lp | Method and apparatus for adjusting a digital assistant persona |
KR102215442B1 (en) | 2013-11-26 | 2021-02-15 | 삼성전자주식회사 | Wearable mobile devices, and method for using selective biological signals by wearable mobile devices |
WO2015081113A1 (en) | 2013-11-27 | 2015-06-04 | Cezar Morun | Systems, articles, and methods for electromyography sensors |
KR102135586B1 (en) * | 2014-01-24 | 2020-07-20 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9579218B2 (en) | 2014-02-04 | 2017-02-28 | Rehabilitation Institute Of Chicago | Modular and lightweight myoelectric prosthesis components and related methods |
CA2939644A1 (en) | 2014-02-14 | 2015-08-20 | Thalmic Labs Inc. | Systems, articles, and methods for elastic electrical cables and wearable electronic devices employing same |
GB201403265D0 (en) | 2014-02-25 | 2014-04-09 | Touch Emas Ltd | Prosthetic digit for use with touchscreen devices |
DE102014204889A1 (en) * | 2014-03-17 | 2015-09-17 | Zumtobel Lighting Gmbh | System for controlling consumers of a household control technology by means of muscle impulses of at least one user and corresponding method |
US20150261306A1 (en) * | 2014-03-17 | 2015-09-17 | Thalmic Labs Inc. | Systems, devices, and methods for selecting between multiple wireless connections |
US10199008B2 (en) | 2014-03-27 | 2019-02-05 | North Inc. | Systems, devices, and methods for wearable electronic devices as state machines |
GB201408253D0 (en) | 2014-05-09 | 2014-06-25 | Touch Emas Ltd | Systems and methods for controlling a prosthetic hand |
CN105205436B (en) * | 2014-06-03 | 2019-09-06 | 北京创思博德科技有限公司 | A kind of gesture recognition system based on forearm bioelectricity multisensor |
US9880632B2 (en) | 2014-06-19 | 2018-01-30 | Thalmic Labs Inc. | Systems, devices, and methods for gesture identification |
JP5991498B2 (en) * | 2014-07-08 | 2016-09-14 | パナソニックIpマネジメント株式会社 | Myoelectric potential measuring device and myoelectric potential measuring method |
US10488936B2 (en) * | 2014-09-30 | 2019-11-26 | Apple Inc. | Motion and gesture input from a wearable device |
GB201417541D0 (en) | 2014-10-03 | 2014-11-19 | Touch Bionics Ltd | Wrist device for a prosthetic limb |
US9807221B2 (en) | 2014-11-28 | 2017-10-31 | Thalmic Labs Inc. | Systems, devices, and methods effected in response to establishing and/or terminating a physical communications link |
US9294802B1 (en) * | 2015-01-30 | 2016-03-22 | Rovi Guides, Inc. | Gesture control based on prosthetic nerve signal detection |
US9946395B2 (en) * | 2015-02-16 | 2018-04-17 | Samsung Electronics Co., Ltd. | User interface method and apparatus |
KR102377001B1 (en) * | 2015-03-16 | 2022-03-22 | 삼성전자주식회사 | Method for providing motion detection service and electronic device thereof |
US10078435B2 (en) | 2015-04-24 | 2018-09-18 | Thalmic Labs Inc. | Systems, methods, and computer program products for interacting with electronically displayed presentation materials |
CN107850935B (en) * | 2015-07-17 | 2021-08-24 | 电子部品研究院 | Wearable device and method for inputting data by using same |
US9939899B2 (en) | 2015-09-25 | 2018-04-10 | Apple Inc. | Motion and gesture input from a wearable device |
KR102570068B1 (en) * | 2015-11-20 | 2023-08-23 | 삼성전자주식회사 | Gesture recognition method, gesture recognition apparatus, wearable device |
US10324494B2 (en) * | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
JP6334588B2 (en) | 2016-03-10 | 2018-05-30 | H2L株式会社 | Electrical stimulation system |
US11232136B2 (en) * | 2016-06-27 | 2022-01-25 | Google Llc | Contextual voice search suggestions |
US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
EP3487595A4 (en) | 2016-07-25 | 2019-12-25 | CTRL-Labs Corporation | System and method for measuring the movements of articulated rigid bodies |
WO2018022597A1 (en) | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | Methods and apparatus for inferring user intent based on neuromuscular signals |
US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
US11331045B1 (en) | 2018-01-25 | 2022-05-17 | Facebook Technologies, Llc | Systems and methods for mitigating neuromuscular signal artifacts |
US20190121306A1 (en) | 2017-10-19 | 2019-04-25 | Ctrl-Labs Corporation | Systems and methods for identifying biological structures associated with neuromuscular source signals |
US10772519B2 (en) | 2018-05-25 | 2020-09-15 | Facebook Technologies, Llc | Methods and apparatus for providing sub-muscular control |
WO2018022658A1 (en) | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | Adaptive system for deriving control signals from measurements of neuromuscular activity |
US11185426B2 (en) | 2016-09-02 | 2021-11-30 | Touch Bionics Limited | Systems and methods for prosthetic wrist rotation |
US10478099B2 (en) | 2016-09-22 | 2019-11-19 | Apple Inc. | Systems and methods for determining axial orientation and location of a user's wrist |
KR102038120B1 (en) * | 2016-12-02 | 2019-10-30 | 피손 테크놀로지, 인크. | Detection and Use of Body Tissue Electrical Signals |
US10481699B2 (en) | 2017-07-27 | 2019-11-19 | Facebook Technologies, Llc | Armband for tracking hand motion using electrical impedance measurement |
US10973660B2 (en) | 2017-12-15 | 2021-04-13 | Touch Bionics Limited | Powered prosthetic thumb |
US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
EP3743901A4 (en) | 2018-01-25 | 2021-03-31 | Facebook Technologies, Inc. | Real-time processing of handstate representation model estimates |
US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
WO2019147958A1 (en) | 2018-01-25 | 2019-08-01 | Ctrl-Labs Corporation | User-controlled tuning of handstate representation model parameters |
US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
US10504286B2 (en) | 2018-01-25 | 2019-12-10 | Ctrl-Labs Corporation | Techniques for anonymizing neuromuscular signal data |
EP3742961A4 (en) | 2018-01-25 | 2021-03-31 | Facebook Technologies, Inc. | Calibration techniques for handstate representation modeling using neuromuscular signals |
CN112005198A (en) | 2018-01-25 | 2020-11-27 | 脸谱科技有限责任公司 | Hand state reconstruction based on multiple inputs |
US11150730B1 (en) | 2019-04-30 | 2021-10-19 | Facebook Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
US11069148B2 (en) | 2018-01-25 | 2021-07-20 | Facebook Technologies, Llc | Visualization of reconstructed handstate information |
US20220365556A1 (en) * | 2018-03-26 | 2022-11-17 | Sports Retronic | Sports retronic wearable terminal |
US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
CN112261907A (en) | 2018-05-29 | 2021-01-22 | 脸谱科技有限责任公司 | Noise reduction shielding technology in surface electromyogram signal measurement and related system and method |
EP3807795A4 (en) | 2018-06-14 | 2021-08-11 | Facebook Technologies, LLC. | User identification and authentication with neuromuscular signatures |
US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
CN112566553A (en) | 2018-08-13 | 2021-03-26 | 脸谱科技有限责任公司 | Real-time spike detection and identification |
WO2020047429A1 (en) | 2018-08-31 | 2020-03-05 | Ctrl-Labs Corporation | Camera-guided interpretation of neuromuscular signals |
EP3853698A4 (en) | 2018-09-20 | 2021-11-17 | Facebook Technologies, LLC | Neuromuscular text entry, writing and drawing in augmented reality systems |
EP3857342A4 (en) | 2018-09-26 | 2021-12-01 | Facebook Technologies, LLC. | Neuromuscular control of physical objects in an environment |
WO2020072915A1 (en) | 2018-10-05 | 2020-04-09 | Ctrl-Labs Corporation | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
US11547581B2 (en) | 2018-12-20 | 2023-01-10 | Touch Bionics Limited | Energy conservation of a motor-driven digit |
US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
US11931270B2 (en) | 2019-11-15 | 2024-03-19 | Touch Bionics Limited | Prosthetic digit actuator |
US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4149716A (en) | 1977-06-24 | 1979-04-17 | Scudder James D | Bionic apparatus for controlling television games |
US4516939A (en) * | 1979-05-24 | 1985-05-14 | Quill Licensing | Finger control system |
GB2163570B (en) * | 1984-08-24 | 1988-05-25 | Hanger & Co Ltd J E | Artificial hand |
US4771344A (en) * | 1986-11-13 | 1988-09-13 | James Fallacaro | System for enhancing audio and/or visual presentation |
JPH01256934A (en) * | 1988-03-31 | 1989-10-13 | Physical Health Devices Inc | Raising operation observation and athletic training system |
US4961423A (en) * | 1988-07-15 | 1990-10-09 | Medtronic, Inc. | Rate adaptive myoelectric pacer |
US5252102A (en) | 1989-01-24 | 1993-10-12 | Electrobionics Corporation | Electronic range of motion apparatus, for orthosis, prosthesis, and CPM machine |
US5221088A (en) * | 1991-01-22 | 1993-06-22 | Mcteigue Michael H | Sports training system and method |
US5692517A (en) | 1993-01-06 | 1997-12-02 | Junker; Andrew | Brain-body actuated system |
US5482051A (en) | 1994-03-10 | 1996-01-09 | The University Of Akron | Electromyographic virtual reality system |
US5679004A (en) | 1995-12-07 | 1997-10-21 | Movit, Inc. | Myoelectric feedback system |
US5888213A (en) * | 1997-06-06 | 1999-03-30 | Motion Control, Inc. | Method and apparatus for controlling an externally powered prosthesis |
JPH11113866A (en) * | 1997-10-13 | 1999-04-27 | Nabco Ltd | Myoelectric sensor |
- 1999
- 1999-10-15 US US09/419,376 patent/US6244873B1/en not_active Expired - Lifetime
- 2001
- 2001-04-23 US US09/840,255 patent/US20010014441A1/en not_active Abandoned
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6687612B2 (en) | 2002-01-10 | 2004-02-03 | Navigation Technologies Corp. | Method and system using a hand-gesture responsive device for collecting data for a geographic database |
US6564144B1 (en) * | 2002-01-10 | 2003-05-13 | Navigation Technologies Corporation | Method and system using a hand-gesture responsive device for collecting data for a geographic database |
US20030184498A1 (en) * | 2002-03-29 | 2003-10-02 | Massachusetts Institute Of Technology | Socializing remote communication |
US6940493B2 (en) * | 2002-03-29 | 2005-09-06 | Massachusetts Institute Of Technology | Socializing remote communication |
US20070225034A1 (en) * | 2002-05-24 | 2007-09-27 | Schmidt Dominik J | Dynamically configured antenna for multiple frequencies and bandwidths |
US20070178950A1 (en) * | 2006-01-19 | 2007-08-02 | International Business Machines Corporation | Wearable multimodal computing device with hands-free push to talk |
US20100293115A1 (en) * | 2006-08-21 | 2010-11-18 | Kaveh Seyed Momen | Method, system and apparatus for real-time classification of muscle signals from self -selected intentional movements |
US20080058668A1 (en) * | 2006-08-21 | 2008-03-06 | Kaveh Seyed Momen | Method, system and apparatus for real-time classification of muscle signals from self-selected intentional movements |
US8437844B2 (en) * | 2006-08-21 | 2013-05-07 | Holland Bloorview Kids Rehabilitation Hospital | Method, system and apparatus for real-time classification of muscle signals from self-selected intentional movements |
US8242879B2 (en) | 2008-03-20 | 2012-08-14 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
WO2009151710A3 (en) * | 2008-03-20 | 2010-02-25 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
US20120299696A1 (en) * | 2008-03-20 | 2012-11-29 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
US20090237266A1 (en) * | 2008-03-20 | 2009-09-24 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
US8653937B2 (en) * | 2008-03-20 | 2014-02-18 | The Ohio Willow Wood Company | System and method for prosthetic/orthotic device communication |
WO2010030822A1 (en) * | 2008-09-10 | 2010-03-18 | Oblong Industries, Inc. | Gestural control of autonomous and semi-autonomous systems |
US20110071417A1 (en) * | 2009-08-21 | 2011-03-24 | The Chinese University Of Hong Kong | Systems and methods for reproducing body motions via networks |
US8738122B2 (en) * | 2009-08-21 | 2014-05-27 | The Chinese University Of Hong Kong | Systems and methods for reproducing body motions via networks |
US11032137B2 (en) | 2014-06-09 | 2021-06-08 | Samsung Electronics Co., Ltd. | Wearable electronic device, main electronic device, system and control method thereof |
US11637747B2 (en) | 2014-06-09 | 2023-04-25 | Samsung Electronics Co., Ltd. | Wearable electronic device, main electronic device, system and control method thereof |
CN106102575A (en) * | 2014-08-20 | 2016-11-09 | 华为技术有限公司 | Myoelectric signal acquisition apparatus |
WO2016026100A1 (en) * | 2014-08-20 | 2016-02-25 | 华为技术有限公司 | Myoelectric signal acquisition device |
CN106371561A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Input information determination method and device |
CN106371560A (en) * | 2015-08-19 | 2017-02-01 | 北京智谷睿拓技术服务有限公司 | Blowing/sucking action determination method and device |
US10796599B2 (en) * | 2017-04-14 | 2020-10-06 | Rehabilitation Institute Of Chicago | Prosthetic virtual reality training interface and related methods |
US10959863B2 (en) * | 2017-06-20 | 2021-03-30 | Southeast University | Multi-dimensional surface electromyogram signal prosthetic hand control method based on principal component analysis |
CN116486683A (en) * | 2023-06-20 | 2023-07-25 | 浙江强脑科技有限公司 | Intelligent bionic hand teaching aid |
Also Published As
Publication number | Publication date |
---|---|
US6244873B1 (en) | 2001-06-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6244873B1 (en) | Wireless myoelectric control apparatus and methods | |
US10126828B2 (en) | Bioacoustic control system, method and apparatus | |
KR20150144668A (en) | Mobile terminal and method for controlling the same | |
CN108549527A (en) | Display control method, terminal and computer readable storage medium | |
CN110012130A (en) | Control method for an electronic device with a folding screen, and electronic device |
WO2007146681A2 (en) | Wireless media player device and system, and method for operating the same | |
CN102770831A (en) | Apparatus and method for a virtual keypad using phalanges in the finger | |
CN107743178A (en) | Message playback method and mobile terminal |
CN109009141A (en) | Sleep monitoring method, wearable device and computer readable storage medium |
CN108541080A (en) | Method for reconnection between a first electronic device and a second electronic device, and related product |
CN109067965A (en) | Translation method, translation device, wearable device and storage medium |
US20050233707A1 (en) | Mobile phone with health care functionality | |
JP2001125722A (en) | Remote control device | |
CN114077414A (en) | Audio playing control method and device, electronic equipment and storage medium | |
CN110543231B (en) | Electronic device control method and related equipment | |
JP2007006420A (en) | Data communication system | |
CN110225282A (en) | Video recording control method, device and computer readable storage medium |
JPH10214157A (en) | Portable electronic device resettable adaptively to right-handed or left-handed person | |
CN108683997A (en) | Terminal protection method, terminal and computer readable storage medium |
CN108632718A (en) | Audio sharing method and system |
CN207354555U (en) | Wireless headset |
US20190025914A1 (en) | Finger control device combinable with bluetooth functions | |
CN108170355A (en) | Key control method, terminal and computer readable storage medium |
CN110150804B (en) | Watchband fixing structure and wearable device |
CN109980750B (en) | Wearable equipment and charging circuit thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |