WO2011147418A1 - A tongue sensor - Google Patents

A tongue sensor

Info

Publication number: WO2011147418A1
Application number: PCT/DK2011/050171
Authority: WIPO (PCT)
Prior art keywords: tongue, sensor, sensors, function, signal
Other languages: French (fr)
Inventors: Daniel Johansen, Lotte Najanguaq Søvsø Andreasen Struijk, Stig Jensen, Dejan B. Popovic
Original assignee: Aalborg Universitet
Application filed by Aalborg Universitet
Priority application: EP11727110.6A (EP2575699A1)
Publication of WO2011147418A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F: FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 4/00: Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the invention relates to body sensors and in particular to a tongue sensor.
  • Tongue keyboards or input devices with pressure-, optically-, electrically- or inductance-sensitive sensors are known. Such devices may be used by disabled persons to control, for example, wheelchairs, prostheses, neural prostheses, robots and computers.
  • An inductive input device or tongue sensor is known from WO2009/138089, which discloses an input device comprising an inductor unit and a magnetic or metallic activation unit.
  • the inductor unit comprises at least one carrier, the carrier having an interaction surface and carrying at least one coil, the coil being arranged essentially parallel to the interaction surface, wherein during operation the input device provides signals by means of inductive interaction between the at least one coil and the activation unit on the interaction surface, wherein the carrier is a printed circuit board incorporating the at least one coil, the coil comprising a number of turns forming a planar spiral.
  • the sensor disclosed in WO2009/138089 provides the user with possibly a plurality of sensor coils, where the output signal from each coil can be used to control the function of external devices.
  • there is a need for tongue sensors and other body sensors which may be used by disabled persons for improving their possibilities in daily life or in other fields.
  • a tongue sensor system for controlling one or more devices is presented, where the tongue sensor system comprises
  • at least a first sensor being responsive to tongue movements, where the first sensor is capable of generating at least one output level in response to the tongue movement, and
  • a control system for controlling the one or more devices in dependence of the output from the first sensor.
  • the device may be a hand or a hand-and-arm prosthesis and the tongue sensor system may be a hand or a hand-and-arm prosthesis control system for the hand or hand-and-arm prosthesis.
  • the prosthesis system may allow use of five or more grasps, functions or controllable degrees of freedom.
  • the prosthesis system may comprise the one or more tongue sensors which may be connected to the control system, i.e. a prosthesis controller.
  • the tongue sensors may be placed in the upper palatal area of the user's mouth.
  • a tongue sensor which is capable of generating one or more output levels, e.g. voltage levels, in response to movement of the tongue.
  • a control system is provided for controlling a device or functions of a device, e.g. hand or finger movement of a hand prosthesis in dependence of the at least one output level of the tongue sensor.
  • first and second sensors being responsive to tongue movements, where the first sensor is configured to be located in a first activation zone in the mouth cavity where the tongue has a high tongue selectivity and the second sensor is configured to be located in a second activation zone in the mouth cavity where the tongue has a lower tongue selectivity,
  • each of the first and second sensors are capable of generating at least one output level in response to the tongue movement
  • the control system is capable of controlling the one or more devices in dependence of the output from the first and second sensors.
  • the first and second sensors may be any sensors of a sensor system capable of identifying first and second tongue positions.
  • since the sensors are configured to be located in different activation zones in the mouth cavity, and since different sensors may control different functions of the device, it is possible to use a sensor located where the tongue has a high selectivity to control a frequently used function, and a sensor located where the tongue has lower selectivity to control a less frequently used function.
  • the tongue sensor system may be used for controlling prostheses such as a hand prosthesis where each grasp function, other functions and controllable degrees of freedom may be connected to one of the at least first and second sensors and the first and second sensor's corresponding independent output signals.
  • the grasp, function or controllable degree of freedom that is most frequently used may be allocated to the tongue sensor that is easiest to reach and activate with the tongue. If a sensor is connected to a grasp, activating the sensor will make the prosthesis pre-shape the position of fingers and then start the closing of the designated grasp. If a sensor is connected to the control of a degree of freedom, activating the sensor will make the specific joint flex/supinate or extend/pronate.
  • although first and second sensors have been defined as sensors of a tongue sensor system, the sensors could also be sensors of other body sensor systems such as shoe sole sensors, eye gaze sensors, joysticks which are operable by legs, shoulders or other extremities of the body, brain sensors, or muscle sensors.
  • at least one of the first and second sensors is capable of generating at least first and second output levels in response to the tongue movement
  • the control system is capable of controlling a function of the one or more devices in dependence of the at least first and second output levels of one of the first and second sensors.
  • Providing a functionality of the controller to enable control of a device in dependence of the at least first and second output levels may advantageously enable control of, e.g., the velocity of a wheelchair or the velocity or pressure of a finger of a hand prosthesis in dependence of the output level of a sensor. Thereby a user may control, e.g., velocity according to the position of the tongue relative to a sensor or the pressure of the tongue on a sensor.
  • control of the function of the one or more devices in dependence of the at least first and second output levels is configurable.
  • the user may program or set the controller to perform control of a given function in dependence of output levels, e.g. so that a given output level equivalent to a given tongue action (e.g. pressure) generates a control signal from the controller for invoking, e.g., a particular pressure or velocity of a finger of a hand prosthesis.
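The level-dependent, configurable control described above can be sketched in code. The thresholds, the number of levels and the velocity range below are illustrative assumptions, not values from the application:

```python
def level_from_signal(signal, thresholds=(0.2, 0.5, 0.8)):
    """Quantize a raw sensor signal (0.0-1.0) into discrete output levels.

    Level 0 means 'not activated'; the thresholds are illustrative and
    could be made user-configurable, as the text suggests.
    """
    level = 0
    for t in thresholds:
        if signal >= t:
            level += 1
    return level


def finger_velocity(level, max_velocity_mm_s=60.0, n_levels=3):
    """Map a discrete output level to a finger closing velocity.

    A linear level-to-velocity mapping is assumed here; any other
    programmed table would serve equally well.
    """
    return max_velocity_mm_s * level / n_levels
```

With this mapping, a light tongue press (e.g. signal 0.3, level 1) produces a slow finger movement and a firm press (signal 0.9, level 3) a fast one.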
  • the control system is capable of allocating control of at least first and second functions of the one or more devices to one of the first and second sensors.
  • the capability of allocating different functions to at least one of the first and second sensors has the advantage that functions or devices which need to be controlled most frequently or with greatest accuracy can be allocated to those sensors with highest selectivity, and conversely, functions which need to be controlled less frequently or with lower accuracy can be allocated to sensors with lower selectivity. Accordingly, the disabled person is able to utilize his or her selectivity of the tongue or other body extremities optimally, since a function, e.g. of the person's own choice, can be allocated to the sensor which best matches the requirements for that function. Accordingly, in relation to this embodiment, a tongue sensor system for controlling one or more devices may be provided where the tongue sensor system comprises
  • first and second sensors being responsive to tongue movements, where the first sensor is configured to be located in a first activation zone in the mouth cavity where the tongue has a high selectivity and the second sensor is configured to be located in a second activation zone in the mouth cavity where the tongue has a lower selectivity, and
  • a control system capable of allocating control of at least first and second functions of the one or more devices to one of the first and second sensors.
  • the control system may be capable of replacing an allocation of the first or second function to one of the first and second sensors with another function in response to a sensor signal from one of the sensors. Accordingly, the user may be able to change which function is allocated to a given sensor by activating some other sensor. For example, a function of a prosthesis allocated to a given sensor may be replaced with a control function of a wheelchair by activating a different sensor, e.g. a sensor located where the selectivity of the tongue is relatively low.
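The reallocation behaviour just described can be sketched minimally. The sensor ids, function names and the choice of a dedicated "swap" sensor are hypothetical illustrations, not part of the application:

```python
class TongueController:
    """Minimal sketch of allocating device functions to tongue sensors."""

    def __init__(self):
        # sensor id -> currently allocated function (ids are hypothetical)
        self.allocation = {"111a": "prosthesis_grasp", "112d": "swap_mode"}
        # functions cycled through when the swap sensor is activated
        self.alternatives = ["prosthesis_grasp", "wheelchair_forward"]

    def on_activation(self, sensor_id):
        """Return the invoked function, or reallocate on the swap sensor."""
        func = self.allocation.get(sensor_id)
        if func == "swap_mode":
            # replace the function allocated to sensor 111a with the next one
            current = self.allocation["111a"]
            idx = self.alternatives.index(current)
            nxt = self.alternatives[(idx + 1) % len(self.alternatives)]
            self.allocation["111a"] = nxt
            return "reallocated:" + nxt
        return func
```

Activating 111a invokes whatever function is currently bound to it; activating 112d swaps that binding, mirroring the prosthesis-to-wheelchair example above.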
  • the control system may have an input for receiving a secondary sensor signal not originating from tongue sensors, such as a myoelectric (EMG) signal.
  • the control system may have processing means capable of generating a control signal for controlling at least a third or a fourth function of one of the one or more devices in response to the secondary sensor signal.
  • functions and devices may be controlled by e.g. myoelectric signals (EMG), brain signals (EEG), or signals from eye sensors or other bioelectric sensors.
  • specific functions, e.g. specific arm prosthesis functions, may be controlled both by myoelectric signals as well as by sensor signals, or may be controllable only by myoelectric signals.
  • the control system may be operable to select which functions are allocated to sensor signals and myoelectric signals. As an example, depending on the level of amputation, different biological signals could be included to increase the intuitiveness of the tongue control system.
  • EMG signals from biceps and triceps muscles could be used to control closing and opening of grasps of a hand prosthesis or to give a more intuitive means of controlling grasp force or joint movement speed.
  • generation of the control signal (from the secondary sensor signal) for controlling the third or the fourth function may be selectable in response to a sensor signal from one of the first and second sensors.
  • the user may select which function is allocated to a given secondary sensor signal, such as a myoelectric sensor signal, by activation of a given sensor. This may provide the user with great flexibility to quickly replace one function to be controlled by the myoelectric signal with a number of other functions.
  • the control system may comprise a secondary sensor control function for controlling which of the third or fourth function is allocated to the secondary sensor signal, where a change of allocation between the third and fourth functions to the secondary sensor is invoked by a sensor signal from one of the tongue sensors.
  • the secondary sensor control function or control device may comprise a look-up table which links different functions, similar to the third and fourth functions, to different tongue sensors. The look-up table may be modified by the user or by service personnel.
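One way to picture this look-up table is the sketch below. The sensor ids, function names and the default entry are invented for illustration; only the mechanism (tongue sensor selects which function the secondary signal drives) comes from the text:

```python
# Hypothetical look-up table linking tongue sensors to the function that a
# secondary (e.g. myoelectric) signal will control.
EMG_FUNCTION_TABLE = {
    "111a": "elbow_flexion",      # a 'third' function
    "111b": "hand_open_close",    # a 'fourth' function
}


class SecondarySensorControl:
    """Routes a secondary sensor signal to the last tongue-selected function."""

    def __init__(self, table):
        self.table = dict(table)        # user/service-modifiable copy
        self.active = "elbow_flexion"   # assumed default target

    def select(self, sensor_id):
        """A tongue-sensor activation re-targets the secondary signal."""
        self.active = self.table[sensor_id]

    def on_emg(self, amplitude):
        """Dispatch the EMG amplitude to whichever function is active."""
        return (self.active, amplitude)
```

The table being a plain dictionary reflects the text's point that it can be edited after deployment by the user or service personnel.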
  • the tongue sensor system may comprise a plurality of the first sensors configured to be located in the first activation zone and a plurality of the second sensors configured to be located in the second activation zone.
  • a typical maximum number of first and second sensors is within the range from ten to thirty.
  • the control system is capable of combining sensor signals from a plurality of the first or second sensors into a single sensor signal. It may be advantageous to combine output signals from sensors into a single signal since this enables generation of sensors with larger activation areas.
  • the control system is configurable via a user interface.
  • the user interface may be a computer implemented screen interface which is accessible via a normal computer keyboard, or via tongue sensors or other body sensors such as eye gaze sensors.
  • the one or more devices are one or more hand or hand-and-arm prostheses
  • the functions of the one or more prostheses comprise grasp functions for moving individual or groups of fingers
  • the controller is configured or configurable to control force, position, speed or acceleration of one of the finger functions in dependence of the at least first and second output levels so that a particular force, position, speed or acceleration is selectable by the user.
  • a second aspect of the invention relates to a prosthesis system for controlling movement of one or more parts of a hand-prosthesis, the prosthesis system comprises
  • the hand prosthesis may comprise one or more moveable or bendable fingers.
  • the hand prosthesis may be provided with motors to enable fingers to move or bend.
  • the motors may be controllable by control signals from the controller.
  • first and second motors may be driven according to output signals from first and second tongue sensors.
  • speed, position or finger-force may be controlled according to the output level of a tongue sensor. Accordingly, the user may be able to control the actions of the hand prosthesis with a high level of accuracy and flexibility since actions may be controllable in response to signals from first and second tongue sensors, and/or in response to first and second output levels. Possibly, the user may also be able to control actions of the hand prosthesis by use of secondary sensor signals in combination with signals from tongue sensors.
  • a third aspect of the invention relates to a method for controlling at least a first function of one or more devices, the method comprises - generating a first output signal with a first output level from a first tongue sensor in response to tongue movement of a user of the tongue sensor system,
  • An embodiment of the third aspect of the invention relates to a method for controlling at least the first function and a second function of the one or more devices, and the method further comprises
  • a grasp function of a hand-prosthesis may react in dependence of the first and second output levels to generate first and second grasp velocities or strengths in response to the respective first and second output levels.
  • Generating both first and second output signals with respective first and second, and third and fourth output levels, and controlling first and second functions in dependence of the respective signals from the first and second output signals and their different output levels, may enable control of e.g. first and second grasp functions so that each of the first and second grasp functions reacts in dependence of the different output levels, e.g. to generate different grasp velocities in dependence of the output levels.
  • Generating a first output signal with only a first output level and a second output signal with only a third output level enables a user to control first and second functions with first and second tongue sensors which are dedicated to the respective first and second functions, e.g. movement of different fingers or degrees of freedom of a hand-prosthesis.
  • a method for controlling at least first and second functions of one or more devices by use of a tongue sensor system comprising at least first and second sensors which generate corresponding first and second sensor signals in response to tongue movements, where the first sensor is located in a first activation zone in a mouth cavity where the tongue has a high tongue selectivity and the second sensor is located in a second activation zone in the mouth cavity where the tongue has a lower tongue selectivity, where the method comprises
  • a fourth aspect of the invention relates to a computer program enabling a processor to carry out the method according to the third aspect.
  • the invention relates to a tongue sensor system which enables the user to utilise the tongue's selectivity in a more advantageous way.
  • Specific functions or actions to be controlled, e.g. functions of a prosthesis, can be allocated to specific sensors of the array of tongue sensors. Therefore, it is possible to predefine the function of different sensors so that sensors located where the tongue has a high selectivity are predefined to control or activate functions which are often used or require high selection accuracy.
  • the tongue sensors may be used directly to control devices, or the tongue sensors may be used indirectly to affect how other control systems should work.
  • a muscle control system which controls prostheses from myoelectric signals may be affected by tongue sensor outputs so that pre-activation of specific tongue sensors causes the muscle control system to control different functions in dependence of the activated tongue sensor.
  • the various aspects of the invention may be combined and coupled in any way possible within the scope of the invention.
  • Fig. 1 shows a tongue sensor system
  • Fig. 2 shows a specific example of a tongue sensor
  • Fig. 3 illustrates use of a tongue sensor system in combination with an arm prosthesis
  • Fig. 4 illustrates the principle of a prosthesis or myoelectric controller for controlling a prosthesis using both myoelectric and tongue sensor signals
  • Fig. 5 shows experimental data from experiments where sensor signals from tongue sensors and muscle sensors are combined for operating a prosthesis (here a modeled prosthesis)
  • Fig. 6 shows output signals from tongue sensors where the signals comprise at least first and second output levels or signal amplitudes.
  • Fig. 1 shows a tongue sensor system 100 for controlling one or more devices 151,152.
  • the tongue sensor system comprises at least a first sensor 111 and a second sensor 112.
  • the first and second sensors may be mounted on a common structure 101, e.g. a plate, shaped to fit the palate of the mouth cavity.
  • the first and second sensors 111,112 are responsive to the tongue, such as position, movements or pressures of the tongue.
  • the tongue may be provided with a ferrous, metallic or magnetic element on or in the tongue to enable detection of position and movement of the tongue.
  • each sensor 111, 112 may be an electro-magnetic coil which is responsive to tongue movements by generating induction currents or by changing induction properties from interaction with a ferrous or metal item which may have been pierced onto the tongue.
  • Such sensors are known from patent publication WO2009/138089 which is hereby incorporated by reference.
  • each sensor 111,112 may be a pressure sensitive switch which can be activated by applying a pressure from the tongue on the switch.
  • each sensor may be realized by a physical entity such as a coil or a switch.
  • an individual sensor or individual sensors may also be realized by non-physical entities, e.g. a position of the tongue may be sensed by a sensor system by measuring variations in a magnetic field caused by tongue movements, by use of, for example, only three magnetic sensors.
  • the magnetic field may be generated by permanent magnets or electro-magnets.
  • even though only three sensors are used to sense variations in the magnetic field, e.g. ten positions of the tongue may be distinguishable and sensed.
  • Such a system may be configured as a headset provided with magnetic sensors located right outside the mouth close to the tongue.
  • the magnetic sensors, e.g. coils, are able to detect the position of a magnet fixed to the tongue.
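One simple way such a headset system could distinguish many tongue positions from only three sensor channels is a nearest-centroid rule against calibrated readings. The calibration values and position names below are invented for illustration; the application does not specify the classification method:

```python
import math

# Hypothetical calibration: mean field readings from three magnetic
# sensors for each trainable tongue position (values are invented).
CALIBRATION = {
    "front_left":  (0.9, 0.2, 0.1),
    "front_right": (0.2, 0.9, 0.1),
    "back":        (0.1, 0.2, 0.8),
}


def classify_position(reading):
    """Return the calibrated tongue position nearest to a 3-sensor reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CALIBRATION, key=lambda pos: dist(CALIBRATION[pos], reading))
```

With more calibrated entries, the same three channels can separate e.g. ten positions, matching the point made above that a "sensor" need not be a physical entity.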
  • first and second sensors should be understood broadly to also include first and second tongue positions which are identifiable by a sensor system.
  • the one or more first sensors 111 are located in a first activation zone 115 in the mouth cavity, and the one or more second sensors 112 are located in a second activation zone 116.
  • the first activation zone is characteristic in the way that the tongue has a relative high selectivity in the first activation zone, whereas the second activation zone is characteristic in the way that the tongue has a lower selectivity. Accordingly, in the first activation zone the tongue is capable of activating sensors with a higher accuracy and possibly with a higher frequency in comparison with sensors located in the second activation zone. It is understood that the tongue's selectivity varies continuously as a function of position in the mouth cavity or locations on the palate.
  • the first and second activation zones should be understood broadly as any region in the mouth cavity or on the palate characterized by different tongue selectivities.
  • the first activation zone is located in the front part of the mouth cavity whereas the second activation zone is located in the back part of the mouth cavity. More specifically, the selectivity is higher the closer the tongue is to the center line and the front of the mouth.
  • the tongue sensor system 100 further comprises a control system 120 for allocating control of the at least first and second functions 161-164 of the one or more devices 151,152 to one of the first, second or other sensors 111,112.
  • the control system 120 is provided with inputs 121 for receiving output signals from the tongue sensors 111-112 and outputs 123 for supply of control signals to the functions 161-164 and devices 151, 152.
  • Sensor signals may be transmitted wirelessly or by wire from the tongue sensors to the inputs 121.
  • control signals may be transmitted wirelessly or by wire from the outputs 123 to the devices 151, 152.
  • the control system 120 makes it possible to allocate different functions, e.g. at least two of the four functions 161-164, to any one or more of the first sensors 111a-111d.
  • each individual first sensor 111a-111d and second sensor 112a-112d may be assigned any one of a plurality of the functions 161-164.
  • the capability of the control system 120 to allocate different functions to the sensors 111,112 enables certain functions 161-164 to be allocated to sensors 111 located in the first activation zone 115 and other functions to be allocated to sensors 112 located in the second activation zone 116.
  • functions 161-164 which are often used or require high activation accuracy can be associated with sensors located in the first activation zone, and functions which are less often used or do not require high accuracy can be associated with sensors located in the second activation zone.
  • an arm prosthesis having controllable functions to generate hand closing/opening, elbow flexion and wrist rotation could be controlled by the tongue sensors.
  • the functions of hand closing and opening could be associated individually to sensors 111a and 111b, respectively in the first activation zone 115
  • wrist rotation could be associated with sensor 111c
  • the function of elbow flexion could be allocated to sensor 112a in the second activation zone 116.
  • the closing and opening operation may also be assigned to a single sensor, e.g. by distinguishing between the periods of time the sensor is activated, e.g. a short activation time implies closing and a long activation time implies opening.
  • the capability of the control system 120 of enabling allocation of different functions to different sensors may be implemented in software on a computer, or in firmware or hardware on an electronic circuit.
  • the control system is configurable via a user interface, e.g. on a monitor connected to the computer or electronic circuit, so that the user is able to allocate certain functions 161-164 to sensors 111-112 of the user's own selection and preference.
  • the control system 120 may have a function enabling the user to easily change between allocations of different functions 161-164 to a given sensor 111-112 by use of one of the sensors 111-112.
  • each one of a set of functions, e.g. function 161 of device 151 and function 163 of device 152, may, one at a time, be allocated to a given sensor such as sensor 111a.
  • the replacement of an allocation of function 161 with function 163 to a given sensor may be invoked by activation of a sensor, e.g. a sensor 112d located in the second activation zone 116.
  • the activation of a sensor can be performed by applying a force to a pressure sensitive sensor, by moving a ferrous or metal item pierced onto a tongue towards an induction sensitive sensor, or in other ways. Activations of a sensor may be differentiated according to the duration for which a sensor is affected. Thus, a short activation of a sensor may be interpreted by the control system 120 to cause a first action, e.g. transmission of a control signal to a device 151-152, whereas a prolonged activation would cause a second action, e.g. replacement of an allocation of a first function 161 with a second function 162.
  • the user may wish to change the allocation of a function of an arm prosthesis to sensor 111a so that instead a function of a wheelchair is allocated to sensor 111a. This could be done by activating sensor 112d or for example by activating sensor 111a for a certain period of time such as five seconds.
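The duration-based discrimination described above reduces to a single threshold comparison. The five-second threshold follows the example in the text; the returned action names are illustrative:

```python
def interpret_activation(duration_s, long_threshold_s=5.0):
    """Classify a sensor activation by how long it was held.

    A short press sends a control signal to the allocated device; a press
    held beyond the threshold (five seconds in the example above) instead
    triggers reallocation of the function assigned to the sensor.
    """
    if duration_s >= long_threshold_s:
        return "reallocate_function"
    return "send_control_signal"
```

In practice the controller would time each press from sensor-down to sensor-up and dispatch on the result.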
  • the capability of the control system to replace an allocation of a first or a second function 161-164 to one of the first and second sensors 111-112 with another function, in response to a sensor signal from one of the sensors, allows the user to easily select which function should be assigned to a given sensor.
  • the control system may have a user interface enabling the user to select which functions 161-164 should be associated with a given sensor 111-112.
  • the control system enables the user to select and associate two functions to a given sensor 111a. Accordingly, with two functions 161, 163 associated with a sensor 111a, the user may simply toggle between the two functions by activation of one of the sensors 111-112.
  • the control system is capable of combining sensor signals from a plurality of the first or second sensors into a single sensor signal.
  • sensors 111a and 111c may be combined into a single sensor by combining the output sensor signal generated by each sensor into a single sensor signal, e.g. by simply adding the two sensor signals.
  • in this way, a number of sensors may be combined into a single sensor.
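Combining several physical sensors into one logical sensor with a larger activation area, by simple addition as in the example above, can be sketched directly. The sensor ids and the combined-sensor name are illustrative:

```python
def combine_sensors(signals, groups):
    """Merge physical sensor signals into single logical sensor signals.

    `signals` maps sensor id -> current output value; `groups` maps a
    combined-sensor name to the ids it merges. Plain addition is used,
    as suggested in the text.
    """
    return {name: sum(signals[s] for s in ids) for name, ids in groups.items()}
```

Touching either constituent sensor then activates the combined sensor, which is what gives the larger effective activation area.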
  • Fig. 2 shows an example of a tongue sensor 201 comprising six first sensors 211a-211f, four second sensors 212a-212d and eight third sensors 213a-213h.
  • the first sensors 211a-211f, the second sensors 212a-212d and the third sensors 213a- 213h are located at positions with different selectivities so that the selectivity of the first sensors is higher than the selectivity of the second and third sensors, and the selectivity of the second sensors is higher than the selectivity of the third sensors.
  • the total number of eighteen sensors 211-213 may not be needed by the user and therefore some of the sensors have been combined.
  • sensors 211a and 211d have been combined into a single sensor 221, and the sensor pairs 211b and 211e, 211c and 211f, 212a and 212b, and 212c and 212d have been combined into corresponding single sensors.
  • any number of sensors may be combined into a single sensor.
  • Each of the three groups of first sensors 211a-211f, second sensors 212a-212d and third sensors 213a-213h may be located on individual printed circuit boards, or any of the groups may be located on a common printed circuit board.
  • the sensor 201 in Fig. 2 comprises eight sensors 213a-213h which are arranged to form a joystick where each of the sensors is allocated to a given displacement function; for example, forward and backward direction functions of a wheelchair may be allocated to sensors 213h and 213d, respectively, and left and right turns of a wheelchair may be allocated to sensors 213b and 213f, respectively.
  • the joystick function of the tongue sensor 201 may also be used, for example, to control movements of the thumb of a hand prosthesis.
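The joystick arrangement of the eight third sensors can be pictured as a mapping table. The four cardinal allocations follow the wheelchair example above (213h forward, 213d backward, 213b left, 213f right); the diagonal allocations and the "stop" default are assumptions:

```python
# Hypothetical joystick map for the eight sensors 213a-213h; the four
# cardinal entries follow the example in the text, the rest are assumed.
JOYSTICK_MAP = {
    "213h": "forward",
    "213d": "backward",
    "213b": "turn_left",
    "213f": "turn_right",
    "213a": "forward_left",
    "213g": "forward_right",
    "213c": "backward_left",
    "213e": "backward_right",
}


def wheelchair_command(active_sensor):
    """Translate an activated joystick sensor into a wheelchair command."""
    return JOYSTICK_MAP.get(active_sensor, "stop")
```

The same table could instead emit thumb-displacement commands for a hand prosthesis, as the bullet above notes.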
  • Fig. 3 illustrates an advantageous use of the tongue control system in combination with an arm prosthesis.
  • a myo-electric signal 301 may be generated in response to activation of muscles, e.g. the biceps. This signal could be used by an arm prosthesis for flexion of the elbow as illustrated in picture A. However, the myoelectric signal 301 may also be used for controlling other actuators of a prosthesis, e.g. for opening/closing of the hand of an arm prosthesis.
  • since the myo-electric signal 301 cannot readily be decoded to control different functions, the user may have to generate a sequence of muscle contractions, e.g. one contraction to activate the elbow actuator and two sequential contractions to activate the hand closing actuator.
  • since a prosthesis may contain a larger number of actuators arranged to control different degrees of freedom, use of myo-electric signals for activating different actuators or functions may be impractical, in particular if more than two or three functions should be controlled.
  • the tongue control system 100 may be exploited to gain the full benefit of myo-electric control of prostheses and other devices by using a myo-electric signal 301 in combination with output signals from the tongue sensors 111-112, 211-213.
  • not only myoelectrical signals from myoelectrical sensors but also signals from other body sensors or bioelectrical sensors may be used in combination with the tongue sensors. Such other sensors and their sensor signals comprise eye gaze sensors, brain (EEG) sensors, and other sensors described elsewhere in this description.
  • the control system 120 is additionally provided with one or more input terminals 122 for receiving myoelectric signals from the user of the tongue sensor.
  • the control system is provided with processing means for generating a control signal for controlling at least a third or a fourth function, such as different movement functions of a prosthesis, in response to the myoelectric signal.
  • the processing means may filter the noisy myoelectric signal and convert it to a signal useful for driving or controlling actuators of a prosthesis.
  • the myoelectric signals may be used for controlling functions of other devices as well, e.g. a computer or a wheelchair.
  • the generation of control signals for controlling different functions or devices by myoelectric signals is controllable or selectable in response to the sensor signal from one of the first and second sensors. That is, by activating a sensor 111-112 the user can determine which function 161-164 will be controlled in response to a myoelectric signal.
  • Fig . 3 illustrates how the sensor 111-112 can be used to select what function 161- 164 will be controlled by a myoelectric signal .
  • the prosthesis controller may be comprised by the control system 120 or connected thereto
  • generation of a myoelectric signal from the biceps causes the control system 120 to generate a control or driving signal to a prosthesis actuator for flexion of the elbow as illustrated in picture A.
  • activating one or more of the tongue sensors 111-112, e.g. by activating a tongue sensor for an extended period of time (see picture B), e.g.
  • the state of the prosthesis controller is changed so that generation of the same myoelectric signal from the biceps causes the control system 120 to generate a control or driving signal to a prosthesis actuator for opening or closing of the hand as illustrated in picture C.
  • Myoelectric signals from the biceps and triceps may be used for respective closing and opening operations of the hand.
  • co-contractions, i.e. simultaneous contractions of the biceps and triceps muscles.
  • each function could be selected by performing a number of such co-contractions sequentially. That is, a third function of the prosthesis could be selected by performing three co-contractions sequentially, and a fourth function could be selected by performing four co-contractions.
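The sequential co-contraction selection described above can be sketched as a simple event counter: co-contractions arriving within a short time window are counted, and the final count picks the function. This is an illustrative sketch only; the window length and the function names are assumptions, not values from the description.

```python
# Sketch of sequential co-contraction function selection.
# SELECTION_WINDOW and the function names are illustrative assumptions.
SELECTION_WINDOW = 1.0  # max seconds between successive co-contractions

FUNCTIONS = {1: "elbow flexion", 2: "hand open/close",
             3: "wrist rotation", 4: "grasp force"}

def select_function(event_times):
    """Count co-contraction events spaced no more than SELECTION_WINDOW
    apart and return the function selected by the final count."""
    count = 0
    last = None
    for t in sorted(event_times):
        if last is None or t - last <= SELECTION_WINDOW:
            count += 1
        else:
            count = 1  # gap too long: a new selection sequence starts
        last = t
    return FUNCTIONS.get(count)

# Three closely spaced co-contractions select the third function.
print(select_function([0.0, 0.6, 1.2]))  # -> wrist rotation
```

The drawback visible in this sketch, that the selection time grows with the number of co-contractions, corresponds to the roughly linear timing data reported for Fig. 5.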
  • Fig. 4 illustrates the prosthesis controller 401 - which is referred to as a prosthesis controller 401 only for convenience - for controlling prostheses or other devices in response to a myoelectric signal 301, where the actual function or device 402 to be controlled by the myoelectric signal 301 may be determined directly by the user's activation of one or more of the first and/or second tongue sensors 111-112.
  • the myoelectric signal enables the user to quickly select any one or more functions 161-164, 402 from a group of functions being controllable by a myoelectric signal or different bioelectric signals, e.g. by a single activation of a tongue sensor 111-112.
  • Fig. 5 shows experimental data from experiments where five functions 511-515 of a computer-modeled prosthesis are activated.
  • the five functions 511-515 are activated in turn using only myoelectric signals.
  • the first function 511 is activated by a single co-contraction .
  • the second function 512 is selected by performing two muscle co-contractions, the third function 513 is selected by performing three muscle co-contractions, and so forth.
  • the first function 511 requires approx. 0.9 seconds to be invoked
  • the second function requires approx. 1.6 seconds to be selected and invoked
  • the third function requires approx. 2 seconds to be selected and activated
  • the fourth function requires approx. 3 seconds to be selected and activated
  • the fifth function requires approx. 3.5 seconds to be selected and activated.
  • the time required to select a function is seen to increase approximately linearly, with the reaction time offset included in the time for invoking the first function.
  • the first to fifth function 511-515 are selected by use of the tongue sensors 111-112 and activated by generation of a myoelectric signal 301.
  • Five different tongue sensors 111-112 are used to select individual functions 511-515.
  • the time used for selection of a function is approximately constant, and particularly for functions three to five the time required for selection and activation is lower in experiment 502, where tongue sensors are combined with myoelectric signals, than in experiment 501.
  • Fig. 6 shows a graph 601 with output signals 602, 603 from a tongue sensor (111-112,211-213).
  • the curve 602 represents an analogue output signal and curve 603 represents a digital or discrete output signal.
  • Each of the output signals 602, 603 comprises at least a first output level 611, 612 and a second output level 613, 614.
  • the different output levels 611-614 may be generated in response to the tongue movement, tongue pressure, tongue velocity or the period of time a given sensor is activated by the tongue.
  • the multiple output levels may be generated by pressure sensors which are sensitive to different pressures.
  • the multiple output levels may also be generated by a position sensitive tongue sensor or sensors, where a position generates a given output level.
  • the sensors may generate continuous analogue or discrete output signals.
  • a sensor may be allocated to a given function, for example a grasp or a movement.
  • the plurality of output levels 611-614 of the output signal 602, 603 can be used for controlling e.g. force or speed of the closing of this specific grasp or movement. This may be achieved by direct control of the amount of current sent to electric motors driving fingers or joints of the prosthesis, where the amount of current is set in dependence of the amplitude of the detected output level 611-614.
  • the range of motion of a joint is mapped onto the amplitude of output levels of the sensor output to allow for direct control of the position of a joint.
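The two amplitude mappings just described, output level to motor current and output level to joint position, can be sketched as linear scalings. The numeric ranges (current limit, joint range of motion, number of levels) are illustrative assumptions, not values from the description.

```python
def level_to_current(level, n_levels=8, max_current_ma=400.0):
    """Map a discrete sensor output level (1..n_levels) linearly onto a
    motor drive current, i.e. amplitude-proportional grasp control.
    The current range is an illustrative assumption."""
    level = max(1, min(n_levels, level))  # clamp to valid levels
    return max_current_ma * level / n_levels

def level_to_joint_angle(level, n_levels=8, range_deg=(0.0, 120.0)):
    """Map a sensor output level onto a joint position within its range
    of motion (the range is an illustrative assumption)."""
    level = max(1, min(n_levels, level))
    lo, hi = range_deg
    return lo + (hi - lo) * (level - 1) / (n_levels - 1)

print(level_to_current(4))      # -> 200.0 (mA)
print(level_to_joint_angle(8))  # -> 120.0 (degrees)
```

With eight or more levels, as the following bullet suggests, such a mapping can feel to the user like direct, continuous control of the joint.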
  • when the sensor signal comprises at least two detectable output levels, preferably at least five output levels, or more preferably at least eight output levels, the user may experience the control of a function as directly related to e.g. tongue movements.
  • the utilization of a plurality of output levels from a single sensor or the utilization of a plurality of output levels from each of a plurality of sensors may reduce the number of sensors needed to control a joint to one sensor instead of two, since multiple functions or multiple settings of a function can be associated with the multiple output levels.
  • connections between any of the tongue sensors 111-112 and sensors for measuring myoelectric potentials to inputs 121-122 of the control system 120 may be wired and/or wireless.
  • connections between any of the outputs 123 of the control system 120 to the devices and functions 151-152, 161-164, 402 may be wired and/or wireless.
  • first, second, third and fourth functions should not be understood in any limiting way. I.e. the first and second functions may be the same as the third and fourth functions.
  • the first, second, third and fourth functions may be functions of devices 151 and 152, of a single device, of three devices or of four devices.
  • the tongue control system may be used for controlling any number of functions and devices.
  • the control system 120 may be implemented in software as computer program instructions to be carried out by a processor or computer when executed on said processor or computer.
  • the software may be available on a storage medium such as a DVD, from the internet or other broadcasting services or networks. Accordingly, the software may be read from any of these media into a memory or a processor as a full set of program instructions corresponding to aspects of this invention or as an update of program instructions to an existing program.
  • embodiments of the invention may be implemented as firmware.
  • the control system 120, the myoelectric controller 401, and other embodiments or parts of such embodiments, systems and controllers may be implemented in hardware as electronic components and circuits.
  • aspects and embodiments of the invention described with reference to tongue sensors 111-112 may be extended to other body sensors as well.
  • the tongue sensors may be replaced or combined with body sensors which can be placed elsewhere on the body and which are sensitive to other actions of the body.
  • examples of such body sensors are sensors integrated in the sole of a shoe, joystick sensors which are operable by shoulder or head movements, and eye gaze sensors.
  • integrated sole sensors may be assigned different functions, similarly to the tongue sensors, in dependence of the selectivity of the different sole sensors.
  • the same principles may apply to shoulder and eye gaze sensors.
  • while the sensor system is primarily intended for disabled persons, it may also be used in other fields, e.g. a pilot may use body sensors to improve control of an airplane.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Abstract

The invention relates to a tongue sensor system which enables the user to utilise the tongue's selectivity in a more optimal way. Specific functions or actions to be controlled, e.g. functions of a prosthesis, can be allocated to specific sensors of the array of tongue sensors. Therefore, it is possible to predefine the function of different sensors so that sensors located where the tongue has a high selectivity are predefined to control or activate functions which are often used or require high selection accuracy. The tongue sensors may be used directly to control devices, or the tongue sensors may be used indirectly to affect how other control systems should work. For example, a control system for prostheses based on myoelectric signals may be affected by tongue sensor outputs so that pre-activation of specific tongue sensors causes the muscle control system to control different functions in dependence of the activated tongue sensor.

Description

A TONGUE SENSOR
FIELD OF THE INVENTION
The invention relates to body sensors and in particular to a tongue sensor.
BACKGROUND OF THE INVENTION
Tongue keyboards or input devices with pressure-, optically-, electrically- or inductance-sensitive sensors are known. Such devices may be used by disabled persons to control for example wheelchairs, prostheses, neural prostheses, robots and computers.
An inductive input device or tongue sensor is known from WO2009/138089, which discloses an input device comprising an inductor unit and a magnetic or
conductive activation unit, the inductor unit comprising at least one carrier, the carrier having an interaction surface and carrying at least one coil, the coil being arranged essentially parallel to the interaction surface, wherein during operation the input device provides signals by means of inductive interaction between the at least one coil and the activation unit on the interaction surface, wherein the carrier is a printed circuit board incorporating the at least one coil, the coil comprising a number of turns forming a planar spiral.
Thus, the sensor disclosed in WO2009/138089 may provide the user with a plurality of sensor coils, where the output signal from each coil can be used to control the function of external devices.
Even though the sensors in WO2009/138089 enable control of various functions and devices, the inventor of the present invention has appreciated that improvements and better exploitation of tongue sensors would be of benefit, and has in consequence devised the present invention.
SUMMARY OF THE INVENTION
It would be advantageous to achieve improvements of tongue sensors and other body sensors which may be used by disabled persons for improving their possibilities in daily life or in other fields. In general it may be seen as an object of the present invention to provide a method that provides improvements in relation to the above mentioned reference, or other improvements, of the prior art.
To better address one or more of these concerns, in a first aspect of the invention a tongue sensor system for controlling one or more devices is presented, where the tongue sensor system comprises
- at least a first sensor being responsive to tongue movements, where the first sensor is capable of generating at least one output level in response to the tongue movement, and
- a control system for controlling the one or more devices in dependence of the output from the first sensor.
The device may be a hand or a hand-and-arm prosthesis and the tongue sensor system may be a hand or a hand-and-arm prosthesis control system for the hand or hand-and-arm prosthesis. The prosthesis system may allow use of five or more grasps, functions or controllable degrees of freedom. The prosthesis system may comprise the one or more tongue sensors which may be connected to the control system, i.e. a prosthesis controller. The tongue sensors may be placed in the upper palatal area of the user's mouth.
According to the first aspect a tongue sensor is provided which is capable of generating one or more output levels, e.g. voltage levels, in response to movement of the tongue. A control system is provided for controlling a device or functions of a device, e.g. hand or finger movement of a hand prosthesis in dependence of the at least one output level of the tongue sensor.
In an embodiment the tongue sensor system comprises
- at least first and second sensors being responsive to tongue movements, where the first sensor is configured to be located in a first activation zone in the mouth cavity where the tongue has a high tongue selectivity and the second sensor is configured to be located in a second activation zone in the mouth cavity where the tongue has a lower tongue selectivity,
- where each of the first and second sensors is capable of generating at least one output level in response to the tongue movement, and
- where the control system is capable of controlling the one or more devices in dependence of the output from the first and second sensors.
The first and second sensors may be any sensors of a sensor system capable of identifying first and second tongue positions.
Since the sensors are configured to be located in different activation zones in the mouth cavity and since different sensors may control different functions of the device, it is possible to use a sensor located at a location where the tongue has a high selectivity for control of a frequently used function, and to use a sensor located at a location where the tongue has lower selectivity for control of a less frequently used function.
As an example, the tongue sensor system may be used for controlling prostheses such as a hand prosthesis where each grasp function, other functions and controllable degrees of freedom may be connected to one of the at least first and second sensors and the first and second sensor's corresponding independent output signals. Furthermore, based on the selectivity of the tongue, the grasp, function or controllable degree of freedom that is most frequently used, may be allocated to the tongue sensor that is easiest to reach and activate with the tongue. If a sensor is connected to a grasp, activating the sensor will make the prosthesis pre-shape the position of fingers and then start the closing of the designated grasp. If a sensor is connected to the control of a degree of freedom, activating the sensor will make the specific joint flex/supinate or extend/pronate.
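The allocation principle of the paragraph above (the most frequently used grasp goes to the sensor that is easiest to reach with the tongue) can be sketched as a ranked pairing. Sensor names, grasp names and usage frequencies below are illustrative assumptions, not data from the description.

```python
# Sketch of the allocation principle: pair functions, ranked by how often
# they are used, with sensors ranked by tongue selectivity. All names and
# frequencies are illustrative assumptions.
sensors_by_selectivity = ["front-center", "front-left",
                          "mid-palate", "rear-palate"]  # most selective first
functions_by_use = [("lateral grasp", 0.45), ("power grasp", 0.30),
                    ("wrist rotation", 0.15), ("finger point", 0.10)]

def allocate(sensors, functions):
    """Pair functions (sorted by descending use frequency) with sensors
    (already ordered by descending tongue selectivity)."""
    ranked = sorted(functions, key=lambda f: f[1], reverse=True)
    return {sensor: name for sensor, (name, _) in zip(sensors, ranked)}

mapping = allocate(sensors_by_selectivity, functions_by_use)
print(mapping["front-center"])  # -> lateral grasp
```

The same pairing could be recomputed from logged usage statistics, so that the mapping tracks how the user actually operates the prosthesis.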
Whereas the first and second sensors have been defined as sensors of a tongue sensor system, the sensors could also be sensors of other body sensor systems such as shoe sole sensors, eye gaze sensors or joysticks which are operable by legs, shoulders or other extremities of the body, brain sensors, or muscle sensors.
In an embodiment at least one of the first and second sensors is capable of generating at least first and second output levels in response to the tongue movement, and - the control system is capable of controlling a function of the one or more devices in dependence of the at least first and second output levels of one of the first and second sensors. Providing a functionality of the controller to enable control of a device in dependence of the at least first and second output levels may advantageously enable control of e.g. velocity of a wheelchair or velocity or pressure of a finger of a hand-prosthesis in dependence of the output level of a sensor. Thereby a user may control e.g. velocity according to e.g. position of the tongue relative to a sensor or pressure of the tongue on a sensor.
In an embodiment the control of the function of the one or more devices in dependence of the at least first and second output levels is configurable.
According to this embodiment the user may program or set the controller to perform control of a given function in dependence of output levels, e.g. so that a given output level equivalent to a given tongue action (e.g. pressure) generates a control signal from the controller for invoking e.g. a particular pressure or velocity of a finger of a hand prosthesis. In an embodiment the control system is capable of allocating control of at least first and second functions of the one or more devices to one of the first and second sensors.
The capability of allocating different functions to at least one of the first and second sensors has the advantage that functions or devices which need to be controlled most frequently or with greatest accuracy can be allocated to those sensors with highest selectivity, and conversely, functions which need to be controlled less frequently or with lower accuracy can be allocated to sensors with lower selectivity. Accordingly, the disabled person is able to utilize his or her selectivity of the tongue or other body extremities optimally, since a function, e.g. of the person's own choice, can be allocated to the sensor which best matches the requirements for that function. Accordingly, in relation to this embodiment a tongue sensor system for controlling one or more devices may be provided where the tongue sensor system comprises
- at least first and second sensors being responsive to tongue movements, where the first sensor is configured to be located in a first activation zone in the mouth cavity where the tongue has a high selectivity and the second sensor is configured to be located in a second activation zone in the mouth cavity where the tongue has a lower selectivity, and
- a control system capable of allocating control of at least first and second functions of the one or more devices to one of the first and second sensors.
In an embodiment the control system may be capable of replacing an allocation of the first or second function to one of the first and second sensors with another function in response to a sensor signal from one of the sensors. Accordingly, the user may be able to change which function is allocated to a given sensor by activating some other sensor. For example, a function of a prosthesis allocated to a given sensor may be replaced with a control function of a wheelchair by activating a different sensor, e.g. a sensor located where the selectivity of the tongue is relatively low.
In an embodiment the control system may have an input for receiving a secondary sensor signal not originating from tongue sensors, such as a
myoelectric signal, recorded from the user of the tongue sensor, and the control system may have processing means capable of generating a control signal for controlling at least a third or a fourth function of one of the one or more devices in response to the secondary sensor signal.
Thus, in addition to tongue sensor signals, functions and devices may be controlled by e.g. myoelectric signals (EMG), brain signals (EEG), or signals from eye sensors or other bioelectric sensors. The same functions may be controlled both by myoelectric signals as well as sensor signals - or specific functions, e.g. specific arm prosthesis functions, may be controllable only by myoelectric signals. Accordingly, the control system may be operable to select which functions are allocated to sensor signals and myoelectric signals. As an example, depending on the level of amputation, different biological signals could be included to increase the intuitiveness of the tongue control system. In the case of a long upper-arm amputation, EMG signals from the biceps and triceps muscles could be used to control closing and opening of grasps of a hand prosthesis or to give a more intuitive means of controlling grasp force or joint movement speed. In an embodiment generation of the control signal (from the secondary sensor signal) for controlling the third or the fourth function may be selectable in response to a sensor signal from one of the first and second sensors. Thus, the user may select which function is allocated to a given secondary sensor signal, such as a myoelectric sensor signal, by activation of a given sensor. This may provide the user with great flexibility to quickly replace one function to be controlled by the myoelectric signal with a number of other functions.
In an embodiment the control system may comprise a secondary sensor control function for controlling which of the third or fourth function is allocated to the secondary sensor signal, where a change between allocations between the third and fourth function to the secondary sensor is invoked by a sensor signal from one of the tongue sensors. The secondary sensor control function or control device may comprise a look-up table which links different functions similar to the third and fourth functions to different tongue sensors. The look-up table may be modified by the user or by service people.
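The secondary sensor control function with its look-up table can be sketched as a small state machine: activating a tongue sensor re-routes which function subsequent myoelectric activity will drive. The sensor identifiers and function names below are illustrative assumptions.

```python
class MyoFunctionSelector:
    """Sketch of the secondary-sensor control function: a user-modifiable
    look-up table links tongue sensors to the function that a subsequent
    myoelectric signal will drive. Names are illustrative assumptions."""

    def __init__(self):
        # Look-up table; the description notes it may be modified by the
        # user or by service people.
        self.table = {"sensor_A": "elbow flexion",
                      "sensor_B": "hand open/close"}
        self.active = "elbow flexion"  # default routing

    def tongue_event(self, sensor_id):
        # A tongue-sensor activation changes the routing state.
        if sensor_id in self.table:
            self.active = self.table[sensor_id]

    def myo_event(self, amplitude):
        # The same myoelectric signal now drives whichever function
        # is currently selected.
        return (self.active, amplitude)

sel = MyoFunctionSelector()
print(sel.myo_event(0.7))      # -> ('elbow flexion', 0.7)
sel.tongue_event("sensor_B")
print(sel.myo_event(0.7))      # -> ('hand open/close', 0.7)
```

This mirrors the Fig. 3 scenario: the same biceps contraction flexes the elbow before the tongue event and opens or closes the hand after it.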
In an embodiment the tongue sensor system may comprise a plurality of the first sensors configured to be located in the first activation zone and a plurality of the second sensors configured to be located in the second activation zone. A typical maximum number of first and second sensors is within the range from ten to thirty.
In an embodiment the control system is capable of combining sensor signals from a plurality of the first or second sensors into a single sensor signal. It may be advantageous to combine output signals from sensors into a single signal since this enables generation of sensors with larger activations areas. Thus,
combination of two sensors into one may help the user in selecting the sensor with greater accuracy. In an embodiment the control system is configurable via a user interface. The user interface may be a computer implemented screen interface which is accessible via a normal computer keyboard, or via tongue sensors or other body sensors such as eye gaze sensors.
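Combining the output signals of several sensors into a single signal, as described above, can be sketched as a per-group reduction over the raw readings. Taking the maximum of the grouped readings is one plausible combiner, chosen here as an assumption; the sensor names and groupings are likewise illustrative.

```python
def merge_sensors(readings, groups):
    """Combine outputs of grouped sensors into single signals, giving the
    user a larger, easier-to-hit activation area. Using max() as the
    combiner is an illustrative assumption."""
    return {name: max(readings[s] for s in members)
            for name, members in groups.items()}

# Hypothetical raw readings from four palate sensors, merged pairwise.
readings = {"s1": 0.0, "s2": 0.8, "s3": 0.1, "s4": 0.0}
groups = {"left-area": ["s1", "s2"], "right-area": ["s3", "s4"]}
print(merge_sensors(readings, groups))
# -> {'left-area': 0.8, 'right-area': 0.1}
```

A touch anywhere within a merged area then triggers the same combined signal, which is the accuracy benefit the embodiment describes.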
In an embodiment of the tongue sensor system, the one or more devices are one or more hand or hand-and-arm prostheses, the functions of the one or more prostheses comprise grasp functions for moving individual or groups of fingers, and the controller is configured or configurable to control force, position, speed or acceleration of one of the finger functions in dependence of the at least first and second output levels so that a particular force, position, speed or acceleration is selectable by the user.
A second aspect of the invention relates to a prosthesis system for controlling movement of one or more parts of a hand-prosthesis, the prosthesis system comprises
- the tongue sensor according to the embodiment defined in claim 13, and
- a hand prosthesis. The hand prosthesis may comprise one or more moveable or bendable fingers. The hand prosthesis may be provided with motors to enable fingers to move or bend. The motors may be controllable by control signals from the controller. Thus, first and second motors may be driven according to output signals from first and second tongue sensors. Optionally, speed, position or finger-force may be controlled according to the output level of a tongue sensor. Accordingly, the user may be able to control the actions of the hand prosthesis with a high level of accuracy and flexibility since actions may be controllable in response to signals from first and second tongue sensors, and/or in response to first and second output levels. Possibly, the user may also be able to control actions of the hand prosthesis by use of secondary sensor signals in combination with signals from tongue sensors.
A third aspect of the invention relates to a method for controlling at least a first function of one or more devices, the method comprises - generating a first output signal with a first output level from a first tongue sensor in response to tongue movement of a user of the tongue sensor system,
- controlling the first function in dependence of the output from the first sensor. An embodiment of the third aspect of the invention relates to a method for controlling at least the first function and a second function of the one or more devices, and the method further comprises
- generating any one or more of a), b) and c),
- a) a first output signal with a second output level from a first tongue sensor in response to tongue movement of a user of the tongue sensor system,
- b) a second output signal with a third output level from a second tongue sensor in response to tongue movement of a user of the tongue sensor system, and
- c) a second output signal with a fourth output level from a second tongue sensor in response to tongue movement of a user of the tongue sensor system,
- controlling the first function in dependence of the first output signal with the first output level and/or the first output signal with a second output level, and/or
- controlling the second function in dependence of the second output signal with a third output level, and/or the second output signal with a fourth output level. Generating a first output signal with both first and second output levels and controlling any of the first and second functions in dependence of the first and second output levels enables the user to control a function, e.g. a function of a hand and arm prosthesis, so that the function reacts in dependence of the first and second levels. For example, a grasp function of a hand-prosthesis may react in dependence of the first and second output levels to generate first and second grasp velocities or strengths in response to the respective first and second output levels.
Generating both first and second output signals with respective first and second, and third and fourth output levels and controlling first and second functions in dependence of the respective signals from the first and second output signals and their different output levels may enable control of e.g. first and second grasp functions so that each of the first and second grasp functions reacts in dependence of the different output levels, e.g. to generate different grasp velocities in dependence of the output levels. Generating a first output signal with only a first output level and a second output signal with only a third output level enables a user to control first and second functions with first and second tongue sensors which are dedicated to the respective first and second functions, e.g. movement of different fingers or degrees of freedom of a hand-prosthesis.
In relation to the third aspect a method for controlling at least first and second functions of one or more devices by use of a tongue sensor system may be provided where the tongue sensor system comprises at least first and second sensors which generate corresponding first and second sensor signals in response to tongue movements, where the first sensor is located in a first activation zone in a mouth cavity where the tongue has a high tongue selectivity and the second sensor is located in a second activation zone in the mouth cavity where the tongue has a lower tongue selectivity, where the method comprises
- selecting one of at least first and second functions and allocating control of the selected function to one of the first and second sensor signals of the respective first and second sensors. A fourth aspect of the invention relates to a computer program enabling a processor to carry out the method according to the third aspect.
In summary the invention relates to a tongue sensor system which enables the user to utilise the tongue's selectivity in a more advantageous way. Specific functions or actions to be controlled, e.g. functions of a prosthesis, can be allocated to specific sensors of the array of tongue sensors. Therefore, it is possible to predefine the function of different sensors so that sensors located where the tongue has a high selectivity are predefined to control or activate functions which are often used or require high selection accuracy. The tongue sensors may be used directly to control devices, or the tongue sensors may be used indirectly to affect how other control systems should work. For example, a muscle control system which controls prostheses from myoelectric signals may be affected by tongue sensor outputs so that pre-activation of specific tongue sensors causes the muscle control system to control different functions in dependence of the activated tongue sensor. In general the various aspects of the invention may be combined and coupled in any way possible within the scope of the invention. These and other aspects, features and/or advantages of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will be described, by way of example only, with reference to the drawings, in which
Fig. 1 shows a tongue sensor system,
Fig. 2 shows a specific example of a tongue sensor,
Fig. 3 illustrates use of a tongue sensor system in combination with an arm prosthesis,
Fig. 4 illustrates the principle of a prosthesis or myoelectric controller for controlling a prosthesis using both myoelectric and tongue sensor signals,
Fig. 5 shows experimental data from experiments where sensor signals from tongue sensors and muscle sensors are combined for operating a prosthesis (here a modeled prosthesis), and
Fig. 6 shows output signals from tongue sensors, where the signals comprise at least first and second output levels or signal amplitudes.
DESCRIPTION OF EMBODIMENTS
Fig. 1 shows a tongue sensor system 100 for controlling one or more devices 151,152. The tongue sensor system comprises at least a first sensor 111 and a second sensor 112. The first and second sensors may be mounted on a common structure 101, e.g. a plate, shaped to fit the palate of the mouth cavity. The first and second sensors 111,112 are responsive to the tongue, such as position, movements or pressures of the tongue. The tongue may be provided with a ferrous, metallic or magnetic element on or in the tongue to enable detection of position and movement of the tongue.
For example, each sensor 111, 112 may be an electro-magnetic coil which is responsive to tongue movements by generating induction currents or by changing its induction properties through interaction with a ferrous or metal item which may have been pierced onto the tongue. Such sensors are known from patent publication WO2009/138089 which is hereby incorporated by reference. As another example, each sensor 111, 112 may be a pressure sensitive switch which can be activated by applying a pressure from the tongue on the switch.
Accordingly, each sensor may be realized by a physical entity such as a coil or a switch. However, an individual sensor or individual sensors may also be realized by non-physical entities, e.g. a position of the tongue may be sensed by a sensor system by measuring variations in a magnetic field caused by tongue movements by use of for example only three magnetic sensors. The magnetic field may be generated by permanent magnets or electro-magnets. Accordingly, whereas only three sensors are used to sense variations in the magnetic field, e.g. ten positions of the tongue may be distinguishable and sensed. Such a system may be configured as a headset provided with magnetic sensors located right outside the mouth close to the tongue. The magnetic sensors, e.g. coils, are able to detect the position of a magnet fixed to the tongue.
As another example, four electromagnetic coils may be located in the mouth cavity. A ferrous or metal item pierced onto the tongue alters the current in each coil in dependence of the position of the ferrous or metal item relative to the coils. Accordingly, the position of the tongue may be determined by comparing currents of the coils. In this way a number of tongue positions, e.g. fifteen positions, may be identifiable. Each identifiable position could correspond to an individual sensor. Thus, first and second sensors should be understood broadly to also include first and second tongue positions which are identifiable by a sensor system.
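Identifying discrete tongue positions by comparing the currents of a few coils, as in the example above, can be sketched as nearest-template classification: each learned position has a stored current pattern, and a new reading is assigned to the closest pattern. The template values and position names below are illustrative assumptions, not calibration data.

```python
# Sketch of tongue-position identification from four coil currents.
# Each known position has a template current pattern (illustrative values).
TEMPLATES = {
    "pos_front": (0.9, 0.4, 0.2, 0.1),
    "pos_left":  (0.3, 0.9, 0.2, 0.3),
    "pos_back":  (0.1, 0.2, 0.5, 0.9),
}

def classify(currents):
    """Return the stored tongue position whose coil-current template is
    closest (squared Euclidean distance) to the measured currents."""
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(currents, template))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name]))

print(classify((0.85, 0.45, 0.15, 0.1)))  # -> pos_front
```

Each identifiable position then plays the role of an individual "sensor" in the broad sense used by the description, even though only four physical coils exist.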
The one or more first sensors 111 are located in a first activation zone 115 in the mouth cavity, and the one or more second sensors 112 are located in a second activation zone 116. The first activation zone is characterized in that the tongue has a relatively high selectivity there, whereas in the second activation zone the tongue has a lower selectivity. Accordingly, in the first activation zone the tongue is capable of activating sensors with a higher accuracy, and possibly at a higher frequency, in comparison with sensors located in the second activation zone. It is understood that the tongue's selectivity varies continuously as a function of position in the mouth cavity or location on the palate. Accordingly, the first and second activation zones should be understood broadly as any regions in the mouth cavity or on the palate characterized by different tongue selectivities. Generally, the first activation zone is located in the front part of the mouth cavity whereas the second activation zone is located in the back part of the mouth cavity. More specifically, the selectivity is higher the closer the tongue is to the center line and the front of the mouth. The tongue sensor system 100 further comprises a control system 120 for allocating control of the at least first and second functions 161-164 of the one or more devices 151, 152 to one of the first, second or other sensors 111, 112. For that purpose the control system 120 is provided with inputs 121 for receiving output signals from the tongue sensors 111-112 and outputs 123 for supplying control signals to the functions 161-164 and devices 151, 152. Sensor signals may be transmitted wirelessly or by wire from the tongue sensors to the inputs 121. Similarly, control signals may be transmitted wirelessly or by wire from the outputs 123 to the devices 151, 152. Accordingly, the control system 120 makes it possible to allocate different functions, e.g. at least two of the four functions 161-164, to any one or more of the first sensors 111a-111d. Thus, in an embodiment each individual first sensor 111a-111d and second sensor 112a-112d may be assigned any one of a plurality of the functions 161-164.
The capability of the control system 120 to allocate different functions to the sensors 111, 112 enables certain functions 161-164 to be allocated to sensors 111 located in the first activation zone 115 and other functions to be allocated to sensors 112 located in the second activation zone 116.
Accordingly, functions 161-164 which are often used or require high activation accuracy can be allocated to sensors located in the first activation zone, and functions which are less often used or do not require high accuracy can be allocated to sensors located in the second activation zone. For example, an arm prosthesis having controllable functions for hand closing/opening, elbow flexion and wrist rotation could be controlled by the tongue sensors. Here the functions of hand closing and opening could be allocated individually to sensors 111a and 111b, respectively, in the first activation zone 115, wrist rotation could be allocated to sensor 111c, whereas the function of elbow flexion could be allocated to sensor 112a in the second activation zone 116. The closing and opening operations may also be assigned to a single sensor, e.g. by distinguishing between the periods of time the sensor is activated - e.g. a short activation time implies closing and a long activation time implies opening.
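The allocation described in this arm-prosthesis example can be pictured as a simple lookup table. This is a minimal sketch; the dictionary name and function labels are assumptions, while the sensor-to-function pairing follows the example in the text.

```python
# Illustrative allocation of prosthesis functions to tongue sensors,
# following the arm-prosthesis example in the description.
allocation = {
    "111a": "hand_close",    # first activation zone: frequent, high accuracy
    "111b": "hand_open",
    "111c": "wrist_rotate",
    "112a": "elbow_flex",    # second activation zone: less frequently used
}

def function_for(sensor_id):
    """Return the function currently allocated to a sensor, or None."""
    return allocation.get(sensor_id)
```

Reallocation, as described further below, would then amount to updating entries of this table at run time.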
The capability of the control system 120 to allocate different functions to different sensors may be implemented in software on a computer, or in firmware or hardware on an electronic circuit. In an embodiment the control system is configurable via a user interface, e.g. on a monitor connected to the computer or electronic circuit, so that the user is able to allocate certain functions 161-164 to sensors 111-112 of the user's own selection and preference.
The control system 120 may have a function enabling the user to easily change between allocations of different functions 161-164 to a given sensor 111-112 by use of one of the sensors 111-112. Thus, each one of a set of functions, e.g. function 161 of device 151 and function 163 of device 152, may, one at a time, be allocated to a given sensor such as sensor 111a. The replacement of an allocation of function 161 with function 163 to a given sensor may be invoked by activation of a sensor, e.g. a sensor 112d located in the second activation zone 116.
The activation of a sensor can be performed by applying a force to a pressure-sensitive sensor, by moving a ferrous or metal item pierced onto a tongue towards an induction-sensitive sensor, or in other ways. Activations of a sensor may be differentiated according to the duration for which the sensor is affected. Thus, a short activation of a sensor may be interpreted by the control system 120 to cause a first action, e.g. transmission of a control signal to a device 151-152, whereas a prolonged activation would cause a second action, e.g. replacement of an allocation of a first function 161 with a second function 162. Thus, as an example, the user may wish to change the allocation of a function of an arm prosthesis to sensor 111a so that instead a function of a wheelchair is allocated to sensor 111a. This could be done by activating sensor 112d or, for example, by activating sensor 111a for a certain period of time such as five seconds.
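The duration-based differentiation of activations can be sketched as below. The five-second threshold follows the example in the text; the function name and action labels are illustrative assumptions.

```python
def interpret_activation(sensor_id, duration_s, reallocate_threshold_s=5.0):
    """Map a sensor activation to an action: short activations transmit a
    control signal for the currently allocated function, while prolonged
    activations trigger reallocation of the sensor's function.
    Action labels are assumptions for illustration."""
    if duration_s >= reallocate_threshold_s:
        return ("reallocate", sensor_id)
    return ("control", sensor_id)
```

A control system could dispatch on the returned action tag, e.g. sending a device command for `"control"` and opening a function-selection step for `"reallocate"`.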
Thus, the capability of the control system to replace an allocation of a first or a second function 161-164 to one of the first and second sensors 111-112 with another function in response to a sensor signal from one of the sensors allows the user to easily select which function should be assigned to a given sensor.
Furthermore, the control system may have a user interface enabling the user to select which functions 161-164 should be associated with a given sensor 111-112. Advantageously, the control system enables the user to select and associate two functions with a given sensor 111a. Accordingly, with two functions 161, 163 associated with a sensor 111a, the user may simply toggle between the two functions by activation of one of the sensors 111-112.
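The toggling between two user-selected functions can be sketched as a small state holder. The class name and function labels are illustrative assumptions, not part of the patent.

```python
class ToggleSensor:
    """A sensor with two user-selected functions; activating a designated
    toggle sensor swaps which of the two is currently active."""

    def __init__(self, functions):
        self.functions = list(functions)  # exactly two user-chosen functions
        self.active = 0                   # index of the active function

    def toggle(self):
        """Swap the active function (invoked by a sensor activation)."""
        self.active = 1 - self.active

    def current(self):
        return self.functions[self.active]
```

For example, a sensor configured with functions 161 and 163 would alternate between them on each toggle activation.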
In an embodiment the control system is capable of combining sensor signals from a plurality of the first or second sensors into a single sensor signal. For example, sensors 111a and 111c may be combined into a single sensor by combining the output sensor signal generated by each sensor into a single sensor signal, e.g. by simply adding the two sensor signals. Thus, if in a given situation the user does not need all sensors, or if a user needs larger sensor areas than provided by individual sensors in order to increase selection accuracy, a number of sensors may be combined into a single sensor.
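The simple additive combination mentioned above can be sketched as element-wise summation of per-sensor sample streams; the function name is an assumption.

```python
def combine(*signals):
    """Combine several per-sensor sample streams into one virtual sensor
    signal by element-wise addition, as one simple combination scheme."""
    return [sum(samples) for samples in zip(*signals)]
```

For instance, combining the streams of sensors 111a and 111c yields a single stream that responds whenever either sensor area is activated.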
Fig. 2 shows an example of a tongue sensor 201 comprising six first sensors 211a-211f, four second sensors 212a-212d and eight third sensors 213a-213h. The first sensors 211a-211f, the second sensors 212a-212d and the third sensors 213a-213h are located at positions with different selectivities so that the selectivity of the first sensors is higher than the selectivity of the second and third sensors, and the selectivity of the second sensors is higher than the selectivity of the third sensors. The total number of eighteen sensors 211-213 may not be needed by the user, and therefore some of the sensors have been combined. That is, sensors 211a and 211d have been combined into a single sensor 221, and the sensor pairs 211b and 211e, 211c and 211f, 212a and 212b, and 212c and 212d have been combined into corresponding sensors. In general any number of sensors may be combined into a single sensor.
Each of the three groups of first sensors 211a-211f, second sensors 212a-212d and third sensors 213a-213h may be located on individual prints, or any of the groups may be located on a common print. The sensor 201 in Fig. 2 comprises eight sensors 213a-213h which are arranged to form a joystick where each of the sensors is allocated to a given displacement function; for example, forward and backward direction functions of a wheelchair may be allocated to sensors 213h and 213d, respectively, and left and right turns of a wheelchair may be allocated to sensors 213b and 213f, respectively. The joystick function of the tongue sensor 201 may also be used, for example, to control movements of the thumb of a hand prosthesis.
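The joystick arrangement described for the wheelchair example can be pictured as a direction table. The four listed assignments follow the text; the table name and the `"stop"` default are assumptions.

```python
# Hypothetical mapping of joystick sensors 213a-213h to wheelchair commands,
# following the example assignments in the description.
JOYSTICK = {
    "213h": "forward",
    "213d": "backward",
    "213b": "turn_left",
    "213f": "turn_right",
}

def wheelchair_command(sensor_id):
    """Translate an activated joystick sensor into a wheelchair command;
    unassigned sensors yield a safe default."""
    return JOYSTICK.get(sensor_id, "stop")
```

The remaining sensors 213a, 213c, 213e and 213g could similarly be mapped to diagonal movements or left unassigned.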
Fig. 3 illustrates an advantageous use of the tongue control system in combination with myo-electrically controlled prostheses according to another embodiment. A myo-electric signal 301 may be generated in response to activation of muscles, e.g. the biceps. This signal could be used by an arm prosthesis for flexion of the elbow as illustrated in picture A. However, the myo-electric signal 301 may also be used for controlling other actuators of a prosthesis, e.g. for opening/closing of the hand of an arm prosthesis. However, since the myo-electric signal 301 cannot readily be decoded to control different functions, the user may have to generate a sequence of muscle contractions, e.g. one contraction to activate the elbow actuator and two sequential contractions to activate the hand closing actuator. Since a prosthesis may contain a larger number of actuators arranged to control different degrees of freedom, use of myo-electric signals for activating different actuators or functions may be impractical, in particular if more than two or three functions should be controlled. However, according to this embodiment, the tongue control system 100 may be exploited to gain the full benefit of myo-electric control of prostheses and other devices by using a myo-electric signal 301 in combination with output signals from the tongue sensors 111-112, 211-213. In general, not only myoelectric signals from myoelectric sensors but also signals from other body sensors or bioelectric sensors may be used in combination with the tongue control system. Such other sensors and their sensor signals, herein referred to as secondary sensors and secondary sensor signals, comprise eye gaze sensors, brain (EEG) sensors, and other sensors described elsewhere in this description.
In order to combine the myo-electric signal 301 with sensor output signals, the control system 120 is additionally provided with one or more input terminals 122 for receiving myoelectric signals from the user of the tongue sensor. The control system is provided with processing means for generating a control signal for controlling at least a third or a fourth function, such as different movement functions of a prosthesis, in response to the myoelectric signal. I.e. the processing means may filter the noisy myoelectric signal and convert it to a signal useful for driving or controlling actuators of a prosthesis. However, the myoelectric signals may be used for controlling functions of other devices as well, e.g. a computer or a wheelchair. According to an embodiment, the generation of control signals for controlling different functions or devices by myoelectric signals is controllable or selectable in response to the sensor signal from one of the first and second sensors. That is, by activating a sensor 111-112 the user can determine which function 161-164 will be controlled in response to a myoelectric signal.
Fig. 3 illustrates how the sensors 111-112 can be used to select which function 161-164 will be controlled by a myoelectric signal. Thus, in one state, such as a default state of the prosthesis controller (the prosthesis controller may be comprised by the control system 120 or connected thereto), generation of a myoelectric signal from the biceps causes the control system 120 to generate a control or driving signal to a prosthesis actuator for flexion of the elbow as illustrated in picture A. By activating one or more of the tongue sensors 111-112, e.g. by activating a tongue sensor for an extended period of time (see picture B), e.g. 3 seconds, the state of the prosthesis controller is changed so that generation of the same myoelectric signal from the biceps causes the control system 120 to generate a control or driving signal to a prosthesis actuator for opening or closing of the hand as illustrated in picture C. Myoelectric signals from the biceps and triceps may be used for respective closing and opening operations of the hand. Traditionally, such switching from control of one function of a prosthesis to another function of the prosthesis could be performed by the user by performing co-contractions, i.e. simultaneous contractions of the biceps and triceps muscles. In this way, in a prosthesis with five functions, each function could be selected by performing a number of such co-contractions sequentially. That is, a third function of the prosthesis could be selected by performing three co-contractions sequentially, and a fourth function could be selected by performing four co-contractions.
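The state change illustrated in pictures A-C can be sketched as a small controller. The 3-second hold follows the example in the text; the class name, function labels and amplitude threshold are illustrative assumptions.

```python
class ProsthesisController:
    """Sketch of the Fig. 3 behaviour: a tongue-sensor activation selects
    which prosthesis function the same myoelectric signal will drive."""

    def __init__(self):
        self.target = "elbow_flex"  # default state (picture A)

    def on_tongue_activation(self, hold_s):
        """An extended tongue-sensor activation (picture B) switches the
        controller so the myoelectric signal drives the hand (picture C)."""
        if hold_s >= 3.0:
            self.target = "hand_open_close"

    def on_myoelectric(self, amplitude, threshold=0.5):
        """Return the function to actuate when the myoelectric signal
        exceeds a detection threshold, otherwise None."""
        return self.target if amplitude >= threshold else None
```

Compared with sequential co-contractions, the selection here is a single tongue-sensor activation regardless of which function is chosen.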
Fig. 4 illustrates the prosthesis controller 401 - which is referred to as a prosthesis controller 401 only for convenience - for controlling prostheses or other devices in response to a myoelectric signal 301, where the actual function or device 402 to be controlled by the myoelectric signal 301 may be determined directly by the user's activation of one or more of the first and/or second tongue sensors 111-112. Thus, the combination of the tongue sensors 111-112 and the control system's 120 capability of controlling functions in response to a myoelectric signal enables the user to quickly select any one or more functions 161-164, 402 from a group of functions controllable by a myoelectric signal or different bioelectric signals, e.g. by a single activation of a tongue sensor 111-112.
Fig. 5 shows experimental data from experiments where a computer-modeled prosthesis with five functions 511-515 is activated. In the first experiment 501, the five functions 511-515 are activated in turn using only myoelectric signals. The first function 511 is activated by a single co-contraction. The second function 512 is selected by performing two muscle co-contractions, the third function 513 is selected by performing three muscle co-contractions, and so forth. The experiments show that the first function 511 requires approx. 0.9 seconds to be invoked, the second function requires approx. 1.6 seconds to be selected and invoked, the third function requires approx. 2 seconds to be selected and activated, the fourth function requires approx. 3 seconds to be selected and activated, and the fifth function requires approx. 3.5 seconds to be selected and activated. The time required to select a function is seen to increase approximately linearly, with the reaction time offset included in the time for invoking the first function.
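The approximately linear increase reported for experiment 501 can be checked directly from the quoted times; the variable names are assumptions, the numbers come from the text.

```python
# Selection + activation times (seconds) quoted for functions 1-5
# in experiment 501 (co-contractions only).
times = [0.9, 1.6, 2.0, 3.0, 3.5]

# Average increase per additional co-contraction.
steps = [b - a for a, b in zip(times, times[1:])]
avg_step = sum(steps) / len(steps)  # approx. 0.65 s per extra co-contraction
```

So each additional co-contraction adds on the order of two thirds of a second, consistent with the roughly linear trend described in the text.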
In the second experiment 502, the first to fifth functions 511-515 are selected by use of the tongue sensors 111-112 and activated by generation of a myoelectric signal 301. Five different tongue sensors 111-112 are used to select the individual functions 511-515. In comparison with experiment 501, it is seen that the time used for selection of a function is approximately constant; particularly for functions three to five, the time required for selection and activation is lower in experiment 502, where tongue sensors are combined with myoelectric signals, than in experiment 501.

Fig. 6 shows a graph 601 with output signals 602, 603 from a tongue sensor (111-112, 211-213). The curve 602 represents an analogue output signal and curve 603 represents a digital or discrete output signal. Each of the output signals 602, 603 comprises at least a first output level 611, 612 and a second output level 613, 614. Depending on the type of tongue sensor, the different output levels 611-614 may be generated in response to tongue movement, tongue pressure, tongue velocity or the period of time a given sensor is activated by the tongue.
The multiple output levels may be generated by pressure sensors which are sensitive to different pressures. The multiple output levels may also be generated by a position-sensitive tongue sensor or sensors, where a position generates a given output level. The sensors may generate continuous analogue or discrete digital signals.
Thus, if a sensor is allocated to a given function, for example a grasp or a controllable degree of freedom of a prosthesis such as a hand prosthesis, the plurality of output levels 611-614 of the output signal 602, 603 can be used for controlling e.g. the force or speed of the closing of this specific grasp or movement. This may be achieved by direct control of the amount of current sent to electric motors driving fingers or joints of the prosthesis, where the amount of current is set in dependence of the amplitude of the detected output level 611-614. In an embodiment the range of motion of a joint is mapped onto the amplitude of the output levels of the sensor output to allow for direct control of the position of the joint. In particular, if the sensor signal comprises at least two detectable output levels, preferably at least five output levels, or more preferably at least eight output levels, the user may experience the control of a function as directly related to e.g. tongue movements. The utilization of a plurality of output levels from a single sensor, or from each of a plurality of sensors, may reduce the number of sensors needed to control a joint to one sensor instead of two, since multiple functions or multiple settings of a function can be associated with the multiple output levels.
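The mapping of output levels onto a motor drive current can be sketched as a linear scaling. The eight-level choice echoes the preferred embodiment above; the function name and current range are illustrative assumptions.

```python
def level_to_current(level, n_levels=8, max_current_ma=500.0):
    """Map a discrete sensor output level (1..n_levels) linearly onto a
    motor drive current in mA, as one way of grading force or speed from
    output levels. Out-of-range levels are clamped."""
    level = max(1, min(n_levels, level))
    return max_current_ma * level / n_levels
```

An analogous linear mapping from output-level amplitude to joint angle would give the direct position control mentioned in the text.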
As mentioned previously, connections between any of the tongue sensors 111-112 and sensors for measuring myoelectric potentials to inputs 121-122 of the control system 120 may be wired and/or wireless. Wireless connections may be implemented using wireless technology such as Bluetooth transmitters and receivers or infrared transmitters and receivers. Similarly, connections between any of the outputs 123 of the control system 120 to the devices and functions 151-152, 161-164, 402 may be wired and/or wireless.
The above references to first, second, third and fourth functions should not be understood in any limiting way. I.e. the first and second functions may be the same as the third and fourth functions. The first, second, third and fourth functions may be functions of devices 151 and 152, of a single device, of three devices or of four devices. The tongue control system may be used for controlling any number of functions and devices.
The control system 120, as well as the myoelectric controller 401 and other computer-implementable embodiments of the invention, may be implemented in software as computer program instructions to be carried out by a processor or computer when executed on said processor or computer. The software may be available on a storage medium such as a DVD, from the internet or from other broadcasting services or networks. Accordingly, the software may be read from any of these media into a memory or a processor as a full set of program instructions corresponding to aspects of this invention or as an update of program instructions to an existing program. Alternatively, the control system 120, as well as the myoelectric controller 401 and other computer-implementable embodiments of the invention, may be implemented as firmware. As another alternative, the control system 120, the myoelectric controller 401, and other embodiments or parts of such embodiments, systems and controllers may be implemented in hardware as electronic components and circuits.
Whereas the description has focused on tongue sensors 111-112, aspects and embodiments of the invention may be extended to other body sensors as well. Thus, the tongue sensors may be replaced by or combined with body sensors which can be placed elsewhere on the body and which are sensitive to other actions of the body. Examples of body sensors are sensors integrated in the sole of a shoe, joystick sensors which are operable by shoulder or head movements, and eye gaze sensors. Thus, integrated sole sensors may be assigned different functions, similarly to the tongue sensors, in dependence of the selectivity of the different sole sensors. Similar principles may apply to shoulder and eye gaze sensors.
Whereas the sensor system is primarily intended for disabled persons, the sensor system may also be used in other fields. E.g. a pilot may use body sensors to improve control of the airplane.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments. Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

1. A tongue sensor system (100) for controlling one or more devices (151-152), the tongue sensor system comprises
- at least a first sensor (111-112,211-213) being responsive to tongue
movements, where the first sensor is capable of generating at least one output level in response to the tongue movement, and
- a control system for controlling the one or more devices (151-152) in
dependence of the output from the first sensor.
2. A tongue sensor system (100) according to claim 1 which comprises
- at least first and second sensors (111-112,211-213) being responsive to tongue movements, where the first sensor is configured to be located in a first activation zone (115) in the mouth cavity where the tongue has a high tongue selectivity and the second sensor is configured to be located in a second activation zone (116) in the mouth cavity where the tongue has a lower tongue selectivity,
- where each of the first and second sensors are capable of generating at least one output level in response to the tongue movement, and
- where the control system (120) is capable of controlling the one or more devices (151-152) in dependence of the output from the first and second sensors.
3. A tongue sensor system (100) according to claim 2, where
- at least one of the first and second sensors are capable of generating at least first and second output levels in response to the tongue movement, and
- where the control system (120) is capable of controlling a function (161-164) of the one or more devices (151-152) in dependence of the at least first and second output levels of one of the first and second sensors.
4. A tongue sensor system (100) according to claim 3, where the control of the function of the one or more devices in dependence of the at least first and second output levels is configurable.
5. A tongue sensor system (100) according to claim 2 or 3, where the control system (120) is capable of allocating control of at least first and second functions (161-164) of the one or more devices (151-152) to one of the first and second sensors.
6. A tongue sensor system according to claim 5, where the control system is capable of replacing an allocation of the first or second function to one of the first and second sensors with another function in response to a sensor signal from one of the sensors.
7. A tongue sensor system according to any of the preceding claims, where the control system has an input (122) for receiving a secondary sensor signal not originating from tongue sensors, such as a myoelectric signal (301), from the user of the tongue sensor, and where the control system has processing means capable of generating a control signal for controlling at least a third or a fourth function (161-164) of one of the one or more devices (151-152) in response to the secondary sensor signal.
8. A tongue sensor system according to claim 7, where generation of the control signal for controlling the third or the fourth function is selectable in response to a sensor signal from one of the first and second sensors.
9. A tongue sensor system according to claim 7, where the control system (120) comprises a secondary control function (401) for controlling which of the third or fourth function is allocated to the secondary sensor signal, where a change between allocations between the third and fourth function to the secondary sensor signal is invoked by a sensor signal from one of the tongue sensors.
10. A tongue sensor system according to any of the preceding claims, comprising a plurality of the first sensors configured to be located in the first activation zone and a plurality of the second sensors configured to be located in the second activation zone.
11. A tongue sensor according to any of the preceding claims, where the control system (120) is capable of combining sensor signals from a plurality of the first or second sensors into a single sensor signal.
12. A tongue sensor system according to any of the preceding claims, where the control system (120) is configurable via a user interface.
13. A tongue sensor system according to any of the preceding claims, where the one or more devices are one or more hand prostheses, and where functions of the one or more hand prostheses comprises finger functions for moving individual or groups of fingers, and where the controller is configured or configurable to control force, position, speed or acceleration of one of the finger functions in dependence of the at least first and second output levels so that a particular force, position, speed or acceleration is selectable by the user.
14. A prosthesis system for controlling movement of one or more parts of a hand prosthesis, the prosthesis system comprises
- the tongue sensor according to claim 13, and
- a hand or arm-and-hand prosthesis.
15. A method for controlling at least a first function (161-164) of one or more devices (151-152) by use of a tongue sensor system (100), the method comprises
- generating a first output signal with a first output level from a first tongue sensor in response to tongue movement of a user of the tongue sensor system,
- controlling the first function in dependence of the output from the first sensor.
16. A method according to claim 15 relating to controlling at least the first function and a second function of the one or more devices, where the method further comprises
- generating one or more of a), b) and c),
- a) a first output signal with a second output level from a first tongue sensor in response to tongue movement of a user of the tongue sensor system,
- b) a second output signal with a third output level from a second tongue sensor in response to tongue movement of a user of the tongue sensor system, and
- c) a second output signal with a fourth output level from a second tongue sensor in response to tongue movement of a user of the tongue sensor system,
- controlling the first function in dependence of the first output signal with the first output level and/or the first output signal with a second output level, and/or
- controlling the second function in dependence of the second output signal with a third output level, and/or the second output signal with a fourth output level.
17. A computer program enabling a processor to carry out the method of claim
PCT/DK2011/050171 2010-05-25 2011-05-24 A tongue sensor WO2011147418A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP11727110.6A EP2575699A1 (en) 2010-05-25 2011-05-24 A tongue sensor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DKPA201070214 2010-05-25
DKPA201070214 2010-05-25

Publications (1)

Publication Number Publication Date
WO2011147418A1 true WO2011147418A1 (en) 2011-12-01

Family

ID=44584781

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DK2011/050171 WO2011147418A1 (en) 2010-05-25 2011-05-24 A tongue sensor

Country Status (2)

Country Link
EP (1) EP2575699A1 (en)
WO (1) WO2011147418A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130090931A1 (en) * 2011-07-05 2013-04-11 Georgia Tech Research Corporation Multimodal communication system
CN105321519A (en) * 2014-07-28 2016-02-10 刘璟锋 Speech recognition system and unit
CN106648114A (en) * 2017-01-12 2017-05-10 长春大学 Interactive model of tongue machine and device
CN106876979A (en) * 2016-12-30 2017-06-20 努比亚技术有限公司 A kind of mobile terminal and communication means
CN106873769A (en) * 2016-12-30 2017-06-20 努比亚技术有限公司 A kind of method and terminal for realizing application control
CN112043550A (en) * 2020-09-29 2020-12-08 深圳睿瀚医疗科技有限公司 Tongue control hand rehabilitation robot system based on magnetic markers and operation method thereof
US11045366B2 (en) * 2013-12-05 2021-06-29 Now Technologies Zrt. Personal vehicle, and control apparatus and control method therefore
RU2777524C1 (en) * 2021-10-21 2022-08-05 Федеральное государственное бюджетное образовательное учреждение высшего образования Волгоградский государственный медицинский университет Министерства здравоохранения Российской Федерации Two-leaf apparatus for monitoring the activity of the muscles of the tongue

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5212476A (en) * 1990-09-28 1993-05-18 Maloney Sean R Wireless intraoral controller disposed in oral cavity with electrodes to sense E.M.G. signals produced by contraction of the tongue
US6598006B1 (en) * 1999-10-18 2003-07-22 Advanced Telecommunications Research Institute International Data input device using a palatal plate
US20050275620A1 (en) * 2003-04-07 2005-12-15 University Of Delaware Tongue-operated input device for control of electronic systems
WO2006105797A2 (en) * 2005-04-07 2006-10-12 Tks A/S Tongue based control method and system
WO2009138089A1 (en) 2008-05-13 2009-11-19 Aalborg Universitet Inductive input device
US20100007512A1 (en) * 2005-10-31 2010-01-14 Maysam Ghovanloo Tongue Operated Magnetic Sensor Based Wireless Assistive Technology


Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130090931A1 (en) * 2011-07-05 2013-04-11 Georgia Tech Research Corporation Multimodal communication system
US11045366B2 (en) * 2013-12-05 2021-06-29 Now Technologies Zrt. Personal vehicle, and control apparatus and control method therefor
CN105321519A (en) * 2014-07-28 2016-02-10 刘璟锋 Speech recognition system and unit
CN106876979A (en) * 2016-12-30 2017-06-20 努比亚技术有限公司 A mobile terminal and communication method
CN106873769A (en) * 2016-12-30 2017-06-20 努比亚技术有限公司 A method and terminal for implementing application control
CN106648114A (en) * 2017-01-12 2017-05-10 长春大学 Tongue-machine interaction model and device
CN106648114B (en) * 2017-01-12 2023-11-14 长春大学 Tongue machine interaction model and device
CN112043550A (en) * 2020-09-29 2020-12-08 深圳睿瀚医疗科技有限公司 Tongue control hand rehabilitation robot system based on magnetic markers and operation method thereof
CN112043550B (en) * 2020-09-29 2023-08-18 深圳睿瀚医疗科技有限公司 Tongue control hand rehabilitation robot system based on magnetic marks and operation method thereof
RU2777524C1 (en) * 2021-10-21 2022-08-05 Volgograd State Medical University of the Ministry of Health of the Russian Federation Two-leaf apparatus for monitoring the activity of the muscles of the tongue

Also Published As

Publication number Publication date
EP2575699A1 (en) 2013-04-10

Similar Documents

Publication Publication Date Title
EP2575699A1 (en) A tongue sensor
US10278882B2 (en) Assist wear item, control method for controller of assist wear item, and recording medium
US20170119553A1 (en) A haptic feedback device
Witteveen et al. Vibro- and electrotactile user feedback on hand opening for myoelectric forearm prostheses
US7608037B2 (en) Remotely located pleasure devices
Ranasinghe et al. Tongue mounted interface for digitally actuating the sense of taste
JP4758472B2 (en) Tongue control method and system for performing the method
Carrozza et al. A wearable biomechatronic interface for controlling robots with voluntary foot movements
Krishnamurthy et al. Tongue drive: A tongue operated magnetic sensor based wireless assistive technology for people with severe disabilities
JP5526298B1 (en) Electronic device, information processing apparatus, information processing method, and program
JP2017074354A (en) Electromechanical control assembly for a chair
Abbass et al. Embedded electrotactile feedback system for hand prostheses using matrix electrode and electronic skin
Cannan et al. A wearable sensor fusion armband for simple motion control and selection for disabled and non-disabled users
Dideriksen et al. Electrotactile and vibrotactile feedback enable similar performance in psychometric tests and closed-loop control
EP1912044A1 (en) Switch arrangement
JP4704599B2 (en) Haptic presentation method and tactile presentation device
US20230147243A1 (en) Method for controlling a limb of a virtual avatar by means of the myoelectric activities of a limb of an individual and system thereof
Jain et al. Tongue operated wheelchair for physically disabled people
CN112947766A (en) Vibration device, intelligent gloves and interaction method of intelligent gloves
Hasegawa et al. Pseudo-proprioceptive motion feedback by electric stimulation
US20060033705A1 (en) Mouse pointer controlling apparatus and method
Williams The magic touch: Our sense of touch is highly complex, but scientists and technology companies are coming up with clever ways to mimic and enhance it
Struijk A tongue based control for disabled people
CN215181885U (en) Vibrating device and intelligent gloves
JPH0975464A (en) Low-frequency therapeutic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11727110

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2011727110

Country of ref document: EP