WO2011055326A1 - Universal input/output human user interface - Google Patents

Universal input/output human user interface

Info

Publication number: WO2011055326A1
Authority: WIPO (PCT)
Prior art keywords: sensor, data set, wrist, digital data, user
Application number: PCT/IB2010/055007
Other languages: French (fr)
Inventor: Igal Firsov
Original Assignee: Igal Firsov
Priority date: 2009-11-04 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2010-11-04
Publication date: 2011-05-12
Application filed by Igal Firsov
Publication of WO2011055326A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 - Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014 - Hand-worn input/output arrangements, e.g. data gloves
    • G06F 3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

A human user interface device and method for interpreting gestures of the hand and wrist, including sensors for sensing the gestures and a data processor for converting the sensed gestures into digital data and outputting the data to a destination device.

Description

APPLICATION FOR PATENT
Inventor(s): Igal Firsov
Title: UNIVERSAL INPUT/OUTPUT HUMAN USER INTERFACE
This is a continuation-in-part of U.S. Provisional Patent Application No. 61/257,859, filed November 4, 2009.
FIELD AND BACKGROUND OF THE INVENTION
The present invention relates to a universal input/output (I/O) interface and, more particularly, to a wrist-worn device for detecting movement of the hand and fingers of a user and translating those movements into input data such as typed characters and mouse movements.
Almost every standard computer is equipped with a keyboard and some kind of pointing device, usually a mouse. To date, the innovations in the field have yet to provide an input/output interface that is not a hindrance to the user. The standard keyboard and mouse combination has well-publicized ergonomic drawbacks and has been linked to several medical syndromes, such as Carpal Tunnel Syndrome and the commonly named BlackBerry Thumb Syndrome associated with use of the miniature keypads on mobile devices. Numerous other ailments have been traced to incorrect posture and extended periods of sitting in front of a computer while typing on a standard keyboard and clicking on a mouse. Other technologies for virtually inputting data using hands and fingers necessitate the donning of outfitted gloves or similar accessories. Touch-screen technology necessitates physical contact of the user with the interface. Various attempts have been made to provide sensing instruments for measuring the movements of the hand and fingers for use as data input. U.S. Pat. No. 6,128,004, which is incorporated in its entirety as if fully set forth herein, teaches a data input glove for sensing the movements of the hand and fingers for use at least in a virtual-reality system.
It would be highly advantageous to have a device for sensing the movements of the hand and fingers which does not necessitate the donning of special accessories on the fingers or hand that interfere with regular day-to-day tasks.
DEFINITIONS
The term 'hand' is used interchangeably herein to refer to the hand including the fingers and the hand excluding the fingers. The context in which the term is used will determine which definition is applicable. In both cases, the term 'hand' does not include the wrist.
The term 'wrist' is used to refer to the anatomical region surrounding the carpus.
While the majority of the following discussion refers to a 'wrist device', it is hereby made clear that while some embodiments of the invention are intended for positioning on or near the wrist area, other embodiments of the invention can be positioned elsewhere along the forearm of the user.
SUMMARY OF THE INVENTION
The current innovation provides a wrist device that is donned in a fashion similar to a wristwatch and that is substantially as unobtrusive as a wristwatch. The wrist device eliminates the need for a keyboard, mouse and numerous other accessories mentioned below.
The current invention is an innovative means for replacing at least the keyboard while still using finger and hand motions to input the data. The device is a bracelet-like contraption enclosing the wrist at the base of the palm. The wrist-device 'reads' the movements of the fingers by tracking the movement of the tendons and/or muscles as well as the relative position of the wrist. The combination of simultaneous movements of the tendons and/or muscles and the wrist supplies the relevant data for the algorithms to distinguish distinct movements. Movements of the tendons without movement of the wrist are special cases and are understood to relate to those keys that do not require wrist movement for execution (e.g. the ASDF keys of a standard keyboard). The user of the device is able to 'type' - that is to say, move the fingers as if typing on a keyboard - and these movements are translated into the corresponding keys on the virtual keyboard. The user can 'type' on any surface, or even in the air, without having any physical interface aside from the wrist-device on the wrist. The typing movements themselves are translated into the corresponding keys for input. In fact, the wrist-device offers the further advantage of not necessitating the entire finger movement usually required for key-pushing. The movement toward the virtual key position provides sufficient data to extrapolate the intended key. This innovation affords a saving of up to 50% of the time currently expended on typing. The wrist-device can be configured for the standard 104-key keyboard, or for specialized keyboards as will be discussed below. According to the present invention there is provided
According to further features in preferred embodiments of the invention described below
According to still further features in the described preferred embodiments. According to another embodiment
The present invention successfully addresses the shortcomings of the presently known configurations by providing a
The present invention discloses an innovative
More specifically, the of the present invention,
BRIEF DESCRIPTION OF THE DRAWINGS
Various embodiments are herein described, by way of example only, with reference to the accompanying drawings, wherein:
FIG. 1 is an illustration of an embodiment of the invention;
FIG. 2 is a partially exploded view of an embodiment of the invention;
FIG. 3 is a pictorial illustration of a left and right hand, each donning a device of the current invention;
FIG. 4 is a block diagram of an exemplary embodiment of sensors and a computing device of the current invention;
FIG. 5 is a flow chart of the process of converting a gesture into a digitally representable data set.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
The principles and operation of an I/O user interface according to the present invention may be better understood with reference to the drawings and the accompanying description.
Referring now to the drawings, Figures 1 and 2 illustrate a bracelet-like device 100 intended, in the current embodiment, to be worn on the wrist of a user. The device has a dorsal element 102 and a ventral element 104. The dorsal element 102 is slightly convex, with the inner, concave surface resting on the posterior or dorsal side of the wrist. The ventral element 104 is relatively larger, with its concave surface resting on the ventral, anterior or underside of the wrist. It is to be understood that other embodiments are envisioned whereby all the components are included in different arrangements but following the same principles of motion detection described herein.
Figure 3 is a pictorial depiction of a pair of devices of the current invention, worn on the left and right hands of a user.
The dorsal element is populated with several sensors 120, which monitor the movements of the tendons on the anterior surface of the wrist at the base of the hand. G-sensors, residing exemplarily in the cylindrical connecting elements 122, monitor the relative movement of the wrist. Alternatively and/or additionally, a mechanical sensor 124 senses the movement of the hand and wrist relative to the surface upon which the hand is resting, in the case of a hand resting on a surface. When the fingers move, the data collected from the sensors identify the movements corresponding to the virtual positions of the keys of a keyboard or keypad, as predefined and stored in the data processor of the device. Various sensor technologies can be employed separately or in combination. Currently, the exemplary technologies discussed herein are tension sensors, electromyography, ultrasonic sensors, mechanical sensors and G-sensors. The mechanical sensor(s) and G-sensors are auxiliary sensors, each of which provides additional data regarding the location of the hand and/or wrist at the time of the finger gesture. Furthermore, the G-sensors give indispensable location information in the case of gestures made in midair.
Electromyography (EMG) is a technique for evaluating and recording the electrical activity produced by skeletal muscles. Skeletal muscle is a form of striated muscle tissue existing under control of the somatic nervous system. As its name suggests, most skeletal muscle is attached to bones by bundles of collagen fibers known as tendons. EMG is performed using an instrument called an electromyograph to produce a record called an electromyogram. An electromyograph detects the electrical potential generated by muscle cells when these cells are electrically or neurologically activated. The signals can be analyzed to detect medical abnormalities, activation level or recruitment order, or to analyze the biomechanics of human or animal movement. In one embodiment of the current invention, electromyographic functions are integrated into the device, and the sensors 120 are EMG surface sensors. The activation levels of the muscles change with movement. Each movement and the corresponding relative electrical change is potentially predefined at the time of manufacture and calibrated (redefined) by each individual user. The method for converting the sensed data into a useful form is discussed below.
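As an editorial illustration only (not taken from the patent), the following minimal sketch shows one way EMG activation levels could be turned into per-user calibrated activity flags: a smoothed RMS envelope is computed for each surface-EMG channel and compared against thresholds learned from that user's resting and active recordings. The function names, window length and data layout are hypothetical assumptions.

```python
import numpy as np

def rms_envelope(samples: np.ndarray, window: int = 50) -> np.ndarray:
    """Smoothed RMS envelope of one raw surface-EMG channel (1-D sample array)."""
    squared = samples.astype(float) ** 2
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

def calibrate_threshold(rest: np.ndarray, active: np.ndarray) -> float:
    """Per-user threshold for one channel: midpoint between the mean resting
    and mean active envelope levels recorded during calibration."""
    return (rms_envelope(rest).mean() + rms_envelope(active).mean()) / 2.0

def active_channels(frame: np.ndarray, thresholds: np.ndarray) -> list[int]:
    """Indices of channels whose current envelope exceeds their calibrated threshold;
    `frame` is a (channels x samples) window of recent EMG data."""
    return [i for i, channel in enumerate(frame)
            if rms_envelope(channel)[-1] > thresholds[i]]
```

The per-user calibration step mirrors the patent's notion that each movement's electrical signature can be redefined by the individual wearer rather than fixed at manufacture.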
A tension resistor can be employed in addition to the EMG functionality or, in an alternative embodiment of the invention, in place of it. The tension sensor is configured to detect the changing angles of the tendons when moving the fingers, in a manner similar to the registration of touch on a touchscreen. A resistive touchscreen panel is composed of several layers, the most important of which are two thin, metallic, electrically conductive layers separated by a narrow gap. When an object, such as a finger, presses down on a point on the outer surface of the panel, the two metallic layers become connected at that point: the panel then behaves as a pair of voltage dividers with connected outputs. This causes a change in the electrical current, which is registered as a touch event and sent to the controller for processing. When used in the current invention, the tension resistor senses the change in angle of the tendon and registers the related movement of the finger. The sensed movement is interpreted by the data processor to reflect a predefined output.
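To make the voltage-divider analogy concrete, here is a minimal sketch (an editorial illustration, assuming an idealized 4-wire resistive panel and hypothetical ADC voltages) of recovering a touch coordinate from the divider ratio:

```python
def touch_coordinate(v_measured: float, v_drive: float, axis_length_mm: float) -> float:
    """Idealized 4-wire resistive panel: with the drive voltage applied across one
    conductive layer, the voltage picked up by the other layer divides in proportion
    to the touch position, so position = (V_measured / V_drive) * axis_length."""
    if not 0.0 <= v_measured <= v_drive:
        raise ValueError("measured voltage must lie between 0 V and the drive voltage")
    return (v_measured / v_drive) * axis_length_mm

# e.g. 1.2 V sensed against a 3.3 V drive on an 80 mm axis -> roughly 29 mm
# from the grounded edge of the panel
x_mm = touch_coordinate(1.2, 3.3, 80.0)
```

The same proportional idea is what lets a tension resistor report a tendon's change of angle as a continuous electrical quantity.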
A further sensing technology that can be used separately or in tandem with one or both of the aforementioned technologies is ultrasonic sensing. At least one ultrasonic transceiver, potentially using piezoelectric crystals, is employed in the dorsal element 102 as a sensor 120. Here too, the sensing principle is borrowed from touchscreens. Surface acoustic wave (SAW) technology uses ultrasonic waves that pass over the touchscreen panel. When the panel is touched, a portion of the wave is absorbed. This change in the ultrasonic waves registers the position of the touch event and sends this information to the controller for processing. In a similar fashion, SAW technology can register the movement of the tendons and the change of surface topography of the posterior surface of the wrist and hand. The sensed changes are registered as specific gestures, according to pre-configured parameters.
Although the current embodiment of the invention depicts the sensors in the dorsal element and the haptic actuators in the ventral element, any applicable placement of both the sensors and the actuators is included within the scope of this invention. Furthermore, all applicable sensing technologies known to those skilled in the art are included in the scope of the invention. A practical example of the system in use is described below in the Example section. Figure 4 is a block diagram of an exemplary embodiment of the computing device 400 of the current invention, which receives the sensor data, processes the sensed data into digital data and transmits the data to a receiving device. Sensors 120a - 120n (a-n representing an undetermined plurality) transmit sensor data (electrical signals) to the computing unit 400 via the Data Input Port(s) 412. CPU 402 compares the pattern of the electrical signals with a database of reference patterns, stored on the mass storage unit 408, and returns the related values (digital data). The code executed by CPU 402 is stored in ROM 406. The digital data related to the received sensor data is transmitted to a destination device (not shown) via a wired/wireless transceiver 414 (using the wireless antenna 416 where relevant). Potentially, a plurality of integrated wireless antennas 416 can be employed for use with various wireless technologies. Power source 410 powers the computing device 400, the sensors 120 and all attached peripherals (e.g. view screen, speakers, etc., not shown here). The CPU 402, ROM 406, RAM 404, storage unit 408, transceiver 414 and data input ports 412 all communicate with each other on a common bus 418.
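The signal path of Figure 4 can be illustrated with a minimal sketch in which the database of reference patterns is assumed to be a dictionary of stored signal templates and matching is done by normalized correlation; the template values, threshold and function names are hypothetical, and the actual device could use any comparison scheme.

```python
import numpy as np

# Hypothetical reference database: stored electrical-signal templates -> related values.
REFERENCE_PATTERNS = {
    "A": np.array([0.1, 0.8, 0.9, 0.3]),
    "S": np.array([0.2, 0.4, 0.9, 0.7]),
}

def _normalized(x: np.ndarray) -> np.ndarray:
    x = x - x.mean()
    return x / (np.linalg.norm(x) + 1e-9)

def match_pattern(signal: np.ndarray, min_score: float = 0.8) -> str | None:
    """Cross-reference a sensed signal pattern with the stored reference patterns
    and return the related value, or None if no template matches well enough."""
    scores = {value: float(_normalized(signal) @ _normalized(template))
              for value, template in REFERENCE_PATTERNS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= min_score else None

def handle_frame(signal: np.ndarray, transmit) -> None:
    """One pass of the CPU loop: decode a sensor frame and, if it maps to a value,
    hand the digital data to the transceiver (the `transmit` callable)."""
    value = match_pattern(signal)
    if value is not None:
        transmit(value.encode("utf-8"))
```

In this sketch the transceiver 414 is abstracted as the `transmit` callable, so the same decoding logic can feed a wired port, Bluetooth or any other link.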
The convex surface of the dorsal element 102 can have, among other things, a view screen display 106 to monitor the inputted data and make corrections as necessary. The display can be a touch screen display. Various other touch screen buttons/features 108 can be added to the surface, such as, but not limited to, a 'start' button, an 'end' button, 'transfer', 'connect' and the like. The buttons can also toggle between various functions such as keyboard, numeric pad, virtual mouse, cellular phone, Personal Digital Assistant (PDA), etc.
The ventral element 104 encompasses the anterior surface or underside of the wrist at the base of the palm. Here various haptic actuators 126 are arranged to provide tactile feedback to the user. Haptic technology, or haptics, is a tactile feedback technology that takes advantage of the sense of touch of a user by applying forces, vibrations, and/or motions to the user. Actuators that apply forces to the skin for touch feedback enable the tactile feedback. The actuator provides mechanical motion in response to an electrical stimulus. Most early designs of haptic feedback use electromagnetic technologies such as vibratory motors with an offset mass (such as the pager motor found in most cell phones) or voice coils, where a central mass or output is moved by a magnetic field. The electromagnetic motors typically operate at resonance and provide strong feedback, but have a limited range of sensations. Next-generation actuator technologies are beginning to emerge, offering a wider range of effects thanks to more rapid response times. Next-generation haptic actuator technologies include electroactive polymers, piezoelectric actuation, and electrostatic surface actuation.
A non-limiting example of the use of haptics is key depression feedback. For many users, the lack of the tactile feedback that people have become accustomed to expect from key depression can be disorienting. Usually, when depressing a key on a physical keyboard, there is some tactile confirmation feedback from the finger depressing the key. There is sometimes even an audio confirmation that the key was indeed sufficiently depressed and the data inputted (older keyboards 'clicked' when a key was depressed; touchscreen buttons emit an audio beep or vibration). In order to alleviate this phenomenon, actuators 126 have been included in the design. These devices can be configured to provide a tactile response to virtual key depression, providing the user with confirmation feedback regarding the entering of the data. In addition, the nodes can be configured to provide varied responses based on the user's needs. For example, in the case where the wrist-device has cellular telecommunication capabilities integrated into its design (see below the discussion on integrated smartphone capabilities), an incoming call could be registered to the user as the vibration of one or more specific nodes. These nodes could be further configured to vibrate in specific patterns depending on the caller, in a fashion similar to the current ability to designate different ringtones for different callers.
The user may employ a single wrist-device, or a pair of wrist devices. The latter scenario is depicted in Figure 3. Preferably the device is worn around the wrist at the base of the hand. Alternatively, the device can be worn on any part of the arm where the movement of the tendons and the wrist can be registered.
The wrist-device can be designed to withstand an acceptable level of physical shock, ranging from nominal knocks to the forces that the G-Shock™ wristwatch range by Casio America, Inc., 570 Mt. Pleasant Avenue, Dover, NJ, USA, is designed to withstand. In addition, the device is hermetically sealed to provide nominal to extreme water resistance.
Due to the numerous distinct movement combinations discernible by the wrist-device, many more keys can be pre-configured than exist on the standard 104-key keyboards. For example, a virtual keyboard containing all the symbols of the Chinese language could be configured. The entire standard keyboard, including the numeric pad, can be accessed by one hand. Personal macros can be defined, as well as personalized function buttons. Separate keypad configurations can be pre-defined for different applications (e.g. when the wrist device interacts with a DVD player, the virtual keypad interface will take on the form of a DVD remote control console). As mentioned above and elsewhere herein, functional buttons 108 can be configured to toggle between different types of keyboards and keypads.
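As a sketch of how the functional buttons 108 might toggle between pre-defined virtual keypads, the following illustrative (non-authoritative) mapping keeps one gesture-to-key dictionary per layout; the layout names, gesture tuples and key values are placeholders introduced for this example.

```python
# Hypothetical layouts: each maps a (finger, wrist state) gesture code to a key value.
LAYOUTS = {
    "qwerty_104": {("little", "no_wrist"): "A", ("ring", "no_wrist"): "S"},
    "numeric_pad": {("index", "no_wrist"): "4", ("middle", "no_wrist"): "5"},
    "dvd_remote": {("index", "wrist_up"): "PLAY", ("index", "wrist_down"): "PAUSE"},
}

class LayoutSelector:
    """Cycles through the configured virtual keypads each time a toggle button fires."""

    def __init__(self) -> None:
        self.names = list(LAYOUTS)
        self.current = 0

    def toggle(self) -> str:
        self.current = (self.current + 1) % len(self.names)
        return self.names[self.current]

    def lookup(self, gesture: tuple[str, str]) -> str | None:
        return LAYOUTS[self.names[self.current]].get(gesture)
```

Keeping the layouts as plain data is one way personal macros or an application-specific keypad (such as the DVD remote example) could be added without changing the decoding logic.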
The device communicates with peripheral accessories through any or all of the standard wireless technologies such as, but not limited to, WiFi, BlueTooth, Near Field Communication (NFC), Radio Frequency Identification (RFID), Ultra-wideband, etc. If only one wrist-device is used then, on receiving the command to search for nearby compatible devices, the wrist-device will seek and connect to the closest or targeted device. If a pair of wrist-devices is used, the pair will sync first and then find the closest or targeted device to couple with.
Before each typing session, the wrist-device has to be calibrated. The calibration process requires executing a preset sequence of movements employing all the fingers. One such example is the drumming of each of the fingers a single time, sequentially, from the little finger to the thumb. The contact movement of each digit calibrates the device relative to the surface upon which the hand is resting. The calibration will also be effective (though to a lesser degree) when used in the air. Aside from replacing the standard keyboard, the wrist-device can replace the standard mouse or other pointing device. By engaging the mouse command (whether configured as one of the preconfigured functional buttons 108, as a button displayed on the touch-screen display or as a particular pre-configured hand movement), the wrist-device will translate the wrist-finger movements into control directives for the cursor on the screen of the device to which the wrist-device is wirelessly coupled.
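A minimal sketch of the drumming calibration described above, assuming (purely for illustration) that each finger tap is reported as a (finger, peak signal) pair and that the taps must arrive in little-finger-to-thumb order:

```python
EXPECTED_ORDER = ["little", "ring", "middle", "index", "thumb"]

def calibrate(taps: list[tuple[str, float]]) -> dict[str, float]:
    """Record a per-finger reference level from one drumming pass.
    Raises if the fingers were not tapped in the expected sequence."""
    fingers = [finger for finger, _ in taps]
    if fingers != EXPECTED_ORDER:
        raise ValueError(f"expected taps in order {EXPECTED_ORDER}, got {fingers}")
    return {finger: peak for finger, peak in taps}

# e.g. baseline = calibrate([("little", 0.41), ("ring", 0.52), ("middle", 0.60),
#                            ("index", 0.55), ("thumb", 0.70)])
```

The recorded per-finger levels would then serve as the reference against which later gestures are compared, whether the hand rests on a surface or moves in the air.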
Figure 5 is a flow chart of an exemplary method for translating a gesture into digitally represented data. Each sensing technology converts the signals received by the sensor into electrical signals. User 500 performs a gesture. A force is exerted on sensor 504, in step 502. The nature of the force exerted on the sensor is determined by the type of sensor employed (e.g. an electrical current is exerted on an EMG sensor whereas pressure or strain is exerted on a tension resistor). Sensor 504 converts the force into an electrical signal. The converted electrical signal is sent to CPU 508, in step 506, for deciphering. In step 510, CPU 508 cross-references the pattern of the electrical signal with stored electrical signal patterns in database 512. Each electrical signal pattern has a related value. Most values are predefined at the time of manufacture, although some values can be defined by the user. That is to say, some gestures can be defined by the user to have unique values. The related value, in the form of a data set, is returned to the CPU in step 514. The data set is relayed (step 516) to output port 518 and then outputted / transmitted to destination device 522, in step 520.
Example
The following exemplary scenario is also described in relation to the flow chart of Figure 5. Exemplarily, when the wrist-device of the current invention is wirelessly coupled to a cellular phone via Bluetooth, the wrist-device remotely controls the functions of the cellular phone, such as dialing, text message compilation and reading, surfing the Internet, etc. When the mobile phone is in dialing mode, the user virtually 'taps out' the number for dialing - using one hand on any surface, or even in midair - according to the keypad configuration of a mobile phone, and then employs a pre-configured hand gesture, which sends the command to dial the typed number. In another exemplary scenario, continuing from the previous scenario, a user accesses the texting screen on the mobile device and then proceeds to virtually type the message - once again using one or two hands on any surface or in midair - instead of using the physical alpha-numeric keypad of the mobile device or the virtual keypad presented on the touchscreen. Both the alpha-numeric keypad and virtual mini- or expanded keyboards are difficult or inconvenient to use with high accuracy and/or have the aforementioned drawbacks of causing damage such as BlackBerry Thumb syndrome. Exemplarily, when using two wrist-devices synced together, the hands lie prone on a surface (e.g. a desktop), with at least the base of the palm and the fingertips in contact with the surface. The hands are slightly arched. When typing out the letters 'ASDF JKL;', the small through index fingers of the left hand sequentially exert downward pressure on the surface, then the index through small fingers of the right hand each exert downward pressure in turn. The tension sensors 504 register the change of angle of the tendons for each gesture of the fingers. Deformation of the sensor from the reference configuration to the deformed configuration gives the pressure reading 502. The sensor converts the measured pressure into an electrical signal 506. This signal is processed by the CPU 508 to extrapolate the motion which caused the pressure. The extrapolated movement is cross-referenced (in step 510) with a database 512 of gestures / movements stored on the storage unit of the computing device. Exemplarily, the pattern of the electrical signal 506 is cross-referenced with stored patterns, each having a related value. In the exemplary case, the pressure of the small finger with no movement of the wrist is extrapolated by the computing device to represent the character 'A' on a standard 104-key keyboard. Value 'A' is returned from the database in the form of digital data, in step 514. Digital data representative of the character 'A' is outputted (in step 520) via an output port 518 to a receiving / destination device 522. The receiving device in the current example is the mobile phone. The digital data is transmitted (outputted) wirelessly (using Bluetooth technology in the current example) to the receiving device (the mobile phone), where the letter 'A' appears in the text message being compiled.
Using more than one sensor technology provides additional data regarding the sensed movement and can, among other things, improve accuracy of gesture identification.
Multitouching
The present invention affords remote multi-touching. That is to say, more than one cursor can be controlled with an additional pointing device. Whereas multi-touching refers to the physical contact of more than one appendage with a specialized multi-touch-enabled screen, remote multi-touching refers to the use of more than one remote pointing device to control more than one on-screen cursor. Only applications that are enabled for multiple remote pointing devices will support remote multi-touching or, more accurately, multi-pointing. With two wrist-devices, at least two remote pointing devices are available. Even with one wrist-device, when more than one sensing technology is employed, sensing data from two or more sources allows real multi-touching. Sensing data relating to a first movement is received from one sensor (e.g. a tension sensor) while data relating to another movement is received from a second sensor (e.g. muscle electricity).
Using a plurality of sensors also enables the identification of a chord movement. A chord refers to entering characters or commands formed by pressing several keys together, like playing a "chord" on a piano. With a single sensor, only a singular movement can be deciphered. With multiple sensors, signals from distinct sources (e.g. tendons, muscles, etc.) can be used to recognize synchronized or chorded gestures.
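One way chorded gestures could be recognized from multiple sensing sources is to group near-simultaneous events into a single chord. The sketch below is an assumption offered for illustration; the event format and the 50 ms grouping window are not specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class SensorEvent:
    source: str       # e.g. "tension" or "emg"
    finger: str       # which digit produced the movement
    timestamp: float  # seconds

def group_chords(events: list[SensorEvent], window: float = 0.05) -> list[list[SensorEvent]]:
    """Group events from distinct sensing sources that fall within `window` seconds
    of the start of the current group into a single chorded gesture."""
    chords: list[list[SensorEvent]] = []
    for event in sorted(events, key=lambda e: e.timestamp):
        if chords and event.timestamp - chords[-1][0].timestamp <= window:
            chords[-1].append(event)
        else:
            chords.append([event])
    return chords
```

Each resulting group can then be looked up as a single multi-finger gesture, which is the behaviour a chord requires that a single-sensor, single-movement decoder cannot provide.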
Perhaps the most significant use of chorded gestures is in sign language. Multi-sensing (collating sensed data from distinct sources) affords more accurate deciphering of sign language gestures and enables the gestures to be interpreted into predefined audio words, as discussed elsewhere herein.
The wrist-device can be used as a '3D mouse', acting as a pointing device or remote touching device. In a 3D environment the entire hand and all fingers can be employed to remotely interact with the various objects in the environment, manipulating them as desired.
A number of exemplary additional applications are envisioned where the wrist-device can augment existing functionality with additional buttons to improve the quality of the experience:
a. Air Mouse - The wrist-device can mimic all the functions of the Mobile Air Mouse™ technology that was developed by the P A Technology company currently based in Boston, MA, USA.
b. Wii Joystick - Where applicable and convenient, the wrist-device can replace the functions of the Wii joystick or that of any other play-station / gaming platform.
c. Sign language translator - The wrist-device can be configured to interpret sign language (standard or country-dependent), rendering the signs into words displayed on a screen or even relaying an audio signal of the words (literally a sign-language translator), allowing a deaf and/or mute individual to communicate audibly with individuals who are unfamiliar with sign language. To facilitate this function, a speaker would be added to the wrist-device (not shown in the current figures). One or more G-sensors provide additional sensing data to better process the hand gestures. Using a plurality of sensing technologies, complex hand gestures can be processed in a manner similar to the real multi-touching discussed above.
d. Integrated SmartPhone capabilities - The wrist-device can be an integrated telecommunications device using cellular or satellite technology. The device can be configured to act as a mobile phone, as well as including the hardware, firmware and software innovations currently known in the art. An exemplary non-limiting list of technologies includes:
i. Digital camera
ii. GPS location finder
iii. Universal remote control for household and vehicle use.
iv. Palm™ Notepad for freehand note taking.
v. PDA functionality.
While the invention has been described with respect to a limited number of embodiments, it will be appreciated that many variations, modifications and other applications of the invention may be made. Therefore, the claimed invention as recited in the claims that follow is not limited to the embodiments described herein.

Claims

WHAT IS CLAIMED IS
1. A human user interface device, for wearing on an upper limb of a user, for interpreting gestures of a wrist and hand of the user, comprising:
(a) a set of at least one sensor; and
(b) a Central Processing Unit (CPU) configured to:
(i) receive an electrical signal from said at least one sensor;
(ii) convert said electrical signal into a digital data set; and
(iii) output said digital data set to a destination device.
2. The device of claim 1, further comprising:
(c) a view screen, for viewing at least said digital data set.
3. The device of claim 2, wherein said view screen is a touchscreen.
4. The device of claim 3, wherein said touchscreen is an Input/Output (I/O) interface.
5. The device of claim 1, further comprising:
(c) a set of at least one haptic actuator, at least for providing tactile feedback indicative of said outputting of said digital data set.
6. The device of claim 5, wherein said at least one haptic actuator further provides tactile feedback indicative of events selected from the group consisting of:
(i) an incoming cellular communication; and (ii) an incoming cellular communication from a predefined source, having a predefined feedback pattern.
7. The device of claim 1, wherein said set of at least one sensor is selected from the group consisting of:
(i) an electromyographic (EMG) sensor;
(ii) a tension sensor;
(iii) an ultrasonic sensor;
(iv) a G-sensor; and
(v) a mechanical sensor, said mechanical sensor operative to be positioned adjacent to a base of a palm of the user.
8. The device of claim 2, wherein said destination device includes said view screen.
9. The device of claim 1, wherein said destination device includes a remote computing device.
10. The device of claim 9, wherein said digital data set controls a cursor on said remote computing device.
11. The device of claim 9, wherein said digital data set controls a plurality of cursors on said remote computing device.
12. The device of claim 1, wherein said destination device includes a remote cellular device.
13. The device of claim 1, further comprising:
(c) a telecommunications component.
14. The device of claim 13, wherein said destination device includes said telecommunications component.
15. The device of claim 1, wherein said digital data set is operative to be converted into analog sound waves.
16. A method of converting a hand and wrist gesture into a digital data set, the method comprising the steps of:
(a) receiving at least one electrical signal from a set of at least one sensor, by a Central Processing Unit (CPU);
(b) cross-referencing a pattern of said at least one electrical signal with a database of stored electrical signal patterns, each said stored electrical signal pattern having a related data set;
(c) receiving said related data set in said CPU; and
(d) outputting said related data set to a destination device.
17. The method of claim 16, wherein said set of at least one sensor is selected from the group consisting of:
(i) an electromyographic (EMG) sensor; (ii) a tension sensor;
(iii) an ultrasonic sensor;
(iv) a G-sensor; and
(v) a mechanical sensor, said mechanical sensor operative to be positioned adjacent to a base of a palm of the user.
PCT/IB2010/055007 2009-11-04 2010-11-04 Universal input/output human user interface WO2011055326A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US25785909P 2009-11-04 2009-11-04
US61/257,859 2009-11-04

Publications (1)

Publication Number Publication Date
WO2011055326A1 (en) 2011-05-12

Family

ID=43969639

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/055007 WO2011055326A1 (en) 2009-11-04 2010-11-04 Universal input/output human user interface

Country Status (1)

Country Link
WO (1) WO2011055326A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060206833A1 (en) * 2003-03-31 2006-09-14 Capper Rebecca A Sensory output devices
US20050132290A1 (en) * 2003-10-17 2005-06-16 Peter Buchner Transmitting information to a user's body
US20060125806A1 (en) * 2004-09-27 2006-06-15 The Regents Of The University Of Minnesota Human-activated displacement control appliance for use with computerized device/mechanism
US20090153365A1 (en) * 2004-11-18 2009-06-18 Fabio Salsedo Portable haptic interface

Cited By (73)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9170674B2 (en) 2012-04-09 2015-10-27 Qualcomm Incorporated Gesture-based device control using pressure-sensitive sensors
WO2013154864A1 (en) * 2012-04-09 2013-10-17 Qualcomm Incorporated Control of remote device based on gestures
JP2015512550A (en) * 2012-04-09 2015-04-27 クアルコム,インコーポレイテッド Gesture-based remote device control
CN104205015A (en) * 2012-04-09 2014-12-10 高通股份有限公司 Control of remote device based on gestures
CN103581428A (en) * 2012-07-27 2014-02-12 Lg电子株式会社 Terminal and control method thereof
US9753543B2 (en) 2012-07-27 2017-09-05 Lg Electronics Inc. Terminal and control method thereof
CN103581428B (en) * 2012-07-27 2016-03-30 Lg电子株式会社 Terminal and control method thereof
EP2698686A3 (en) * 2012-07-27 2015-03-11 LG Electronics Inc. Wrist-wearable terminal and control method thereof
CN103631368B (en) * 2012-08-27 2017-04-19 联想(北京)有限公司 Detection device, detection method and electronic equipment
CN103631368A (en) * 2012-08-27 2014-03-12 联想(北京)有限公司 Detection device, detection method and electronic equipment
CN103677232A (en) * 2012-09-17 2014-03-26 联想(北京)有限公司 Information processing method, action recognition device and electronic equipment
CN103677236A (en) * 2012-09-18 2014-03-26 联想(北京)有限公司 Information processing method and electronic equipment
WO2014093525A1 (en) * 2012-12-12 2014-06-19 Microsoft Corporation Wearable multi-modal input device for augmented reality
US9214043B2 (en) 2013-03-04 2015-12-15 Here Global B.V. Gesture based map annotation
US9625884B1 (en) 2013-06-10 2017-04-18 Timothy Harris Ousley Apparatus for extending control and methods thereof
KR102131358B1 (en) * 2013-06-17 2020-07-07 삼성전자주식회사 User interface device and method of operation of user interface device
KR20140146352A (en) * 2013-06-17 2014-12-26 삼성전자주식회사 User interface device and method of operation of user interface device
EP2816320A1 (en) * 2013-06-17 2014-12-24 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US9529434B2 (en) 2013-06-17 2016-12-27 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US20140368424A1 (en) * 2013-06-17 2014-12-18 Samsung Electronics Co., Ltd. Presentation device and method for operating the device
US9704386B2 (en) 2013-08-29 2017-07-11 Lg Electronics Inc. Mobile terminal and controlling method thereof
EP2843511A1 (en) * 2013-08-29 2015-03-04 LG Electronics, Inc. Mobile terminal and controlling method thereof
KR20150025385A (en) * 2013-08-29 2015-03-10 엘지전자 주식회사 Mobile terminal and controlling method thereof
KR102034587B1 (en) * 2013-08-29 2019-10-21 엘지전자 주식회사 Mobile terminal and controlling method thereof
CN104423581A (en) * 2013-08-29 2015-03-18 Lg电子株式会社 Mobile terminal and controlling method thereof
WO2015060856A1 (en) * 2013-10-24 2015-04-30 Bodhi Technology Ventures Llc Wristband device input using wrist movement
CN103777752A (en) * 2013-11-02 2014-05-07 上海威璞电子科技有限公司 Gesture recognition device based on arm muscle current detection and motion sensor
WO2015067315A1 (en) * 2013-11-08 2015-05-14 Marvel Digital Group Ltd. Method and arrangement for transmitting information from a sender to a receiver following sensory ascertainment of at least one muscle or tendon movement of the sender
CN103558918A (en) * 2013-11-15 2014-02-05 上海威璞电子科技有限公司 Gesture recognition scheme of smart watch based on arm electromyography
CN104679229A (en) * 2013-11-27 2015-06-03 中国移动通信集团公司 Gesture recognition method and apparatus
CN103677265A (en) * 2013-12-09 2014-03-26 中国科学院深圳先进技术研究院 Intelligent sensing glove and intelligent sensing method
CN103677265B (en) * 2013-12-09 2017-02-15 中国科学院深圳先进技术研究院 Intelligent sensing glove and intelligent sensing method
CN103645804A (en) * 2013-12-18 2014-03-19 三星电子(中国)研发中心 Method and device for identifying human body gestures as well as watch using device
WO2015123771A1 (en) * 2014-02-18 2015-08-27 Sulon Technologies Inc. Gesture tracking and control in augmented and virtual reality
CN103853333A (en) * 2014-03-21 2014-06-11 上海威璞电子科技有限公司 Gesture control scheme for toy
CN103941859A (en) * 2014-03-21 2014-07-23 上海威璞电子科技有限公司 Algorithm for differentiating different gestures through signal power
KR20150112240A (en) * 2014-03-27 2015-10-07 엘지전자 주식회사 Mobile terminal and method for controlling the same
KR102208115B1 (en) * 2014-03-27 2021-01-27 엘지전자 주식회사 Mobile terminal and method for controlling the same
EP3139246A4 (en) * 2014-05-16 2018-01-17 ZTE Corporation Control method and apparatus, electronic device, and computer storage medium
US10191543B2 (en) 2014-05-23 2019-01-29 Microsoft Technology Licensing, Llc Wearable device touch detection
US9594427B2 (en) 2014-05-23 2017-03-14 Microsoft Technology Licensing, Llc Finger tracking
WO2015179262A1 (en) * 2014-05-23 2015-11-26 Microsoft Technology Licensing, Llc Finger tracking
CN104238344A (en) * 2014-09-05 2014-12-24 青岛歌尔声学科技有限公司 Intelligent system and intelligent wrist watch
US9880620B2 (en) 2014-09-17 2018-01-30 Microsoft Technology Licensing, Llc Smart ring
US9582076B2 (en) 2014-09-17 2017-02-28 Microsoft Technology Licensing, Llc Smart ring
CN105631305A (en) * 2014-11-06 2016-06-01 联想(北京)有限公司 Information processing method and electronic device
CN104460992B (en) * 2014-11-20 2017-07-21 大连理工大学 Finger movement detection device and method using infrared irradiation of the carpal ligaments
CN104460992A (en) * 2014-11-20 2015-03-25 大连理工大学 Finger movement detection device and method adopting infrared rays for irradiating intercarpal ligament
KR102245419B1 (en) * 2015-01-13 2021-04-29 주식회사 씨케이머티리얼즈랩 Tactile Module
KR101735745B1 (en) * 2015-01-13 2017-05-29 주식회사 씨케이머티리얼즈랩 Wearable device
KR20170055942A (en) * 2015-01-13 2017-05-22 주식회사 씨케이머티리얼즈랩 Wearable device
US10204494B2 (en) 2015-01-13 2019-02-12 Ck Materials Lab Co., Ltd. Haptic information provision device
US11760375B2 (en) 2015-01-13 2023-09-19 Ck Materials Lab Co., Ltd. Haptic information provision device
WO2016114487A1 (en) * 2015-01-13 2016-07-21 주식회사 씨케이머티리얼즈랩 Haptic information provision device
JP2016143139A (en) * 2015-01-30 2016-08-08 アルパイン株式会社 Information processing apparatus, operation control system, and operation control method
US9946395B2 (en) 2015-02-16 2018-04-17 Samsung Electronics Co., Ltd. User interface method and apparatus
EP3073351A1 (en) * 2015-03-26 2016-09-28 Lenovo (Singapore) Pte. Ltd. Controlling a wearable device using gestures
CN106155300A (en) * 2015-04-23 2016-11-23 宁波市美灵思医疗科技有限公司 Human-computer interaction device based on myoelectric current and multi-sensor cooperation, and method of use
WO2016178445A1 (en) * 2015-05-07 2016-11-10 나시스 주식회사 User input control device using band
CN106293382A (en) * 2015-06-12 2017-01-04 联想(北京)有限公司 Control method and device
US20180143697A1 (en) * 2015-07-17 2018-05-24 Korea Electronics Technology Institute Wearable device and method of inputting information using the same
US10884504B2 (en) * 2015-07-17 2021-01-05 Korea Electronics Technology Institute Wearable wrist device and method of detecting a physical change in the epidermis and wirelessly inputting sensor information using the same
CN105814517A (en) * 2015-09-23 2016-07-27 深圳还是威健康科技有限公司 Multi-screen display method and smart bracelet
WO2017049480A1 (en) * 2015-09-23 2017-03-30 深圳还是威健康科技有限公司 Multi-screen display method and smart wristband
US10372216B2 (en) 2016-03-04 2019-08-06 Nxp B.V. Gesture feedback
CN106020442A (en) * 2016-05-05 2016-10-12 云神科技投资股份有限公司 Sensing method for intelligent sensing glove
US10318005B2 (en) 2016-08-09 2019-06-11 Google Llc Haptic feedback mechanism for an interactive garment
GB2552897B (en) * 2016-08-09 2021-01-13 Google Llc Haptic feedback mechanism for an interactive garment
GB2552897A (en) * 2016-08-09 2018-02-14 Google Llc Haptic feedback mechanism for an interactive garment
CN107943285A (en) * 2017-11-10 2018-04-20 上海交通大学 Human-computer interaction wrist ring, system and method based on biological myoelectricity
CN107943285B (en) * 2017-11-10 2021-01-01 上海交通大学 Man-machine interaction wrist ring, system and method based on biological myoelectricity
TWI710925B (en) * 2019-01-24 2020-11-21 宏碁股份有限公司 Multiplex sensing core and input device
CN112286341A (en) * 2020-06-02 2021-01-29 深圳市杰理微电子科技有限公司 Wearable device, audio control method and audio control system

Similar Documents

Publication Publication Date Title
WO2011055326A1 (en) Universal input/output human user interface
JP5243967B2 (en) Information input using sensors attached to fingers
RU2662408C2 (en) Method, apparatus and data processing device
US9612661B2 (en) Closed loop feedback interface for wearable devices
WO2012070682A1 (en) Input device and control method of input device
CN105814511B (en) Offset plane wrist input device, method and computer readable medium
JP2008305199A (en) Input system and program
WO2002027456A1 (en) Wearable data input interface
WO2004114107A1 (en) Human-assistive wearable audio-visual inter-communication apparatus.
JP2008203911A (en) Pointing device and computer
CN102549531A (en) Processor interface
KR100499391B1 (en) Virtual input device sensed finger motion and method thereof
US20090289896A1 (en) Input arrangement for electronic devices
JP6687743B2 (en) Information transmitting / receiving apparatus and information transmitting / receiving method
US20200168121A1 (en) Device for Interpretation of Digital Content for the Visually Impaired
CN103631368A (en) Detection device, detection method and electronic equipment
US20050148870A1 (en) Apparatus for generating command signals to an electronic device
JP2008305198A (en) Input system and input device
KR101491039B1 (en) Input device for smart phone
JP2017037583A (en) Computer input system
JP2001166871A (en) Input device
KR20210051277A (en) Wearable Device for Motion Detecting and Method for Manufacturing Thereof
KR20080082207A (en) Finger tab script input device
JP4497717B2 (en) Data input system and method, and input device
EP2447808B1 (en) Apparatus for operating a computer using thoughts or facial impressions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10828004

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10828004

Country of ref document: EP

Kind code of ref document: A1