WO2022149165A1 - System and method for controlling an assistive device - Google Patents

System and method for controlling an assistive device

Info

Publication number
WO2022149165A1
Authority
WO
WIPO (PCT)
Prior art keywords
signals
assistive device
gesture
controlling
motion
Prior art date
Application number
PCT/IN2022/050014
Other languages
English (en)
Inventor
Subhojit BASU
Pratik Bhalerao
Vishal Patil
Original Assignee
Deedee Labs Private Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deedee Labs Private Limited
Publication of WO2022149165A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/68 Operating or control means
    • A61F2/70 Operating or control means electrical
    • A61F2/72 Bioelectric control, e.g. myoelectric
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/54 Artificial arms or hands or parts thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/60 Artificial legs or feet or parts thereof
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/78 Means for protecting prostheses or for attaching them to the body, e.g. bandages, harnesses, straps, or stockings for the limb stump
    • A61F2/80 Sockets, e.g. of suction type
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015 Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00 Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/50 Prostheses not implantable in the body
    • A61F2/76 Means for assembling, fitting or testing prostheses, e.g. for measuring or balancing, e.g. alignment means
    • A61F2002/7615 Measuring means
    • A61F2002/7625 Measuring means for measuring angular position
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H1/00 Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
    • A61H1/02 Stretching or bending or torsioning apparatus for exercising
    • A61H1/0274 Stretching or bending or torsioning apparatus for exercising for the upper limbs
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2201/00 Characteristics of apparatus not provided for in the preceding codes
    • A61H2201/16 Physical interface with patient
    • A61H2201/1602 Physical interface with patient kind of interface, e.g. head rest, knee support or lumbar support
    • A61H2201/165 Wearable interfaces
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
    • A61H2230/00 Measuring physical parameters of the user
    • A61H2230/60 Muscle strain, i.e. measured on the user, e.g. Electromyography [EMG]
    • A61H2230/605 Muscle strain, i.e. measured on the user, e.g. Electromyography [EMG] used as a control parameter for the apparatus

Definitions

  • The present invention relates to a system and method for controlling an assistive device.
  • The present invention pertains to a system for controlling limb prosthesis devices, upper-limb exoskeletal devices or neuro-rehabilitation devices, and to the method of controlling them.
  • Functional assistive devices play an important role in enhancing the quality of life of people with locomotor disability, partial or full loss of a limb by improving mobility and the ability to manage activities of daily living.
  • Functional or powered assistive devices employ multiple parts to exhibit an intended gesture or action or perform tasks.
  • Most of the commercially available powered prosthetic devices use sensors to capture surface electromyographic (EMG) signals or muscle’s electrical impulses from the residual limb of a wearer. It is critical to place the sensors relative to the active muscle sites of a wearer as the sensors non-invasively measure and amplify electric impulses generated by the active muscle. The functionality of the assistive device is thus dependent upon electric impulse captured by the sensor.
  • Conventional assistive devices comprise one or two sensors and support only one degree of freedom (DoF).
  • Conventional myoelectric prosthetic hand systems with one or two sensors primarily support one DoF: opening and closing the hand.
  • The wearer has to learn to trigger the prosthesis with a definite signal by intentionally making specific types of muscle movements in order to instruct the assistive device to perform a desired action, gesture or task.
  • For example, the user needs to trigger a specific group of muscles to perform a power grip, and a different set of muscles to perform a relaxed hand grip.
  • The wearer thus needs to master the art of triggering muscles separately for different actions, gestures or tasks. This necessitates putting the wearer through an extensive post-fitment training program, not only to adapt to the externally fixated assistive device, but also to learn to control it. This is a cumbersome process, and it takes the wearer a long time to become trained and accustomed to the device.
  • The control systems in existing prosthetic devices are such that if the wearer experiences a mild jerk in the device while moving around or travelling, an undesired gesture can be triggered.
  • A conventional prosthesis requires the user to change gesture modes by generating a specific type of electrical impulse through muscle triggering.
  • A gesture mode consists of a group of a few predefined gestures; e.g. power grip and open together form one gesture mode, while lateral pinch and point form another. All gestures and modes need configuration via an external device such as a mobile app, an external switch-based control, RFID-based tags or voice-control input. This complicates use by making the person remember multiple modes and impulses.
  • The main object of the present invention is to provide a system for controlling the assistive device which eliminates the need for an external/additional device for performing a particular gesture.
  • Another object of the present invention is to provide the user with an intuitive system for controlling the assistive device through a smart machine learning module with real-time surface-pattern recognition, thereby reducing muscle fatigue.
  • Another object of the present invention is to provide a system for controlling assistive devices which can simultaneously control multiple degrees of freedom, thereby minimizing undesired triggering.
  • The present invention relates to a system and method for controlling assistive devices such as limb prostheses, exoskeleton devices or any other electromechanical rehabilitation devices for paraplegic patients.
  • The invention also describes a method of controlling assistive devices.
  • Features of a system for controlling assistive devices which eliminates the need for an external/additional device for performing a particular gesture, overcomes undesired triggering, and is user friendly, simple to operate and intuitive in nature: the system comprises a plurality of electrodes mounted at various positions of a user's residual limb and configured to receive surface electromyography signals from the active muscle sites of the residual limb, at least one motion sensor to capture change in motion, and at least one orientation sensor to capture change in orientation of the residual limb; a sensing module configured to amplify the received electromyography signals, reduce noise, filter the received signals and digitize them to provide digital electromyography signals to a processing unit; a processing unit configured to receive the digital electromyography signal data from said sensing module, store the data, extract various parameters, and classify and analyze them; a supervised machine learning module configured to identify correct gestures and false motions; and a plurality of actuators configured to receive signals from said supervised machine learning module and perform the gesture intended by the user.
  • Features of a method for controlling assistive devices that avoids undesired triggering of the assistive device without the need for an external controlling device, comprising the steps of: mounting a plurality of non-invasive electrodes on the active muscle sites of a wearer; capturing the electromyography signals through said electrodes; capturing motion and orientation of the residual limb and assistive device independently by a motion sensor and an orientation sensor respectively; amplifying the received electromyography signals, reducing noise, filtering the received signals and digitizing them by the sensing module to provide digital electromyography signals to the processing unit; receiving the digital electromyography signal data from said sensing module, then storing, extracting various parameters, classifying and analyzing said data by the processing unit; generating training parameters and storing the parameters in memory; identifying correct gestures and false motions by a machine learning module; and sending control signals to a plurality of actuators configured to receive signals from said supervised machine learning module to perform the gesture intended by the wearer.
  • Figure 1 is a block diagram of a system for controlling an assistive device in accordance with an embodiment of the invention.
  • Figure 2 is a block diagram of a system for controlling an assistive device in accordance with an embodiment of the invention.
  • Figure 3 is a block diagram of a method for controlling an assistive device in accordance with an embodiment of the invention.
  • The present invention is directed towards a system and method for controlling an assistive device.
  • The assistive device is controlled based on pattern recognition of body signals of a wearer wearing the assistive device.
  • The pattern-recognition-based assistive device cuts down the post-fitment training period needed by the user or wearer of the assistive device. Accordingly, the system does not require the wearer to generate definite impulses, obviates the problems associated with the prior art, receives input data from multiple data points, and seamlessly controls the assistive device with multiple DoFs (degrees of freedom).
  • Figure 1 shows a system (100) for controlling an assistive device (10) in accordance with an embodiment of the invention.
  • The assistive device can be an externally powered multi-functional or multiple-DoF assistive device, such as a prosthetic arm having a hand assembly, or an upper-limb rehabilitation device.
  • The assistive devices are configured to perform several actions or tasks, and the system and method of the present invention provide control signals to an actuator or a drive mechanism (160) of the assistive device to perform multiple tasks or actions.
  • Said system comprises a plurality of non-invasive electrodes (110) to be mounted on the external skin surface of a user to capture surface electromyography (EMG) signals, or the muscles' electrical impulses, from the active muscle sites of a residual limb of a wearer.
  • The electrodes are customized medical-grade passive electrodes meant for capturing EMG signals from the surface of the skin as a result of contraction of muscle groups when the user intends to perform a gesture. These electrodes may not have any pre-amplification electronic module.
  • The plurality of electrodes (110) may be placed nearly equidistant from each other such that the electrodes are in contact with the circumference of the residual limb at different locations.
  • The electrodes can be placed in a flexible, elastic, inelastic or adjustable socket made for a wearer, or can be worn around the hand as a wearable band having multiple EMG sensors. Though linear equidistant placement of electrodes is preferred, precise placement of electrodes at a specific location or orientation is not essential. Further, if required for placement of the electrodes, as in trauma cases, site identification can be done to check for the presence of EMG signals.
  • The system for controlling assistive devices further comprises at least one orientation sensor (150) and at least one motion sensor (140) mounted either on the residual limb, on the assistive device, or on both, for independently capturing direction- and orientation-related movements of a limb.
  • The motion and orientation sensors are selected from, but not limited to, accelerometers, gyroscopes, magnetometers, etc.
  • The motion and orientation sensors may be placed on the residual or healthy limb and/or the assistive device, which perform independent motion relative to the joint they are connected with, to estimate any motion being performed by the limb or change in pose of the limb.
  • The motion and orientation sensors reduce incorrect/inaccurate detection for different poses, or when motion of the assistive device or any body part is performed that can directly or indirectly affect the EMG signals at the residual limb.
  • The system for controlling assistive devices, which eliminates the need for an external/additional device for performing a particular gesture, further comprises a sensing module (120) for sensing microvolt EMG signals through the plurality of electrodes and amplifying, filtering and converting the signals into a machine-readable digital format.
  • The sensing module is an analog electronic circuit which senses the EMG signals and then amplifies, filters and digitizes them using an ADC to provide a digital EMG signal.
  • Said sensing module may be configured for lead-off detection of each electrode; in case of detachment of any of the electrodes from the user's skin surface, the user is informed by audio-visual alarms.
  • The system for controlling assistive devices, which eliminates the need for an external/additional device for performing a particular gesture, further comprises a processing unit (130) configured to acquire signals from the electrodes (110) and sensors (140, 150), wherein the digital EMG signals are stored and/or analyzed.
  • The processing unit (130) is configured to determine the energy level of the digital EMG signals.
  • The processing unit (130) is configured to compare the energy level with a pre-determined or threshold energy level. In case the energy level is greater than the pre-determined energy level, the digital signals are classified as a gesture signal until the energy level falls below the pre-determined/threshold energy level. Accordingly, valid digital EMG signals are identified by the system.
  • The digital EMG signal has two phases: a start trigger, when the wearer starts moving his hand, and an end trigger, when the wearer's hand movement ends.
  • Said processing unit (130) is configured to detect the presence of motion. When motion is observed, it generates a start trigger and an end trigger; digital EMG data is captured between the triggers.
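The start/end trigger logic described above can be sketched as a simple windowed-energy detector. This is a minimal illustration, not the patent's implementation: the window size, the threshold value and the mean-squared-amplitude energy measure are all assumptions chosen for clarity.

```python
def window_energy(samples, start, size):
    """Mean squared amplitude over one window of the digital EMG signal."""
    window = samples[start:start + size]
    return sum(x * x for x in window) / len(window)

def segment_gesture(samples, threshold, size=4):
    """Return (start_index, end_index) of the first span whose windowed
    energy stays above the threshold, or None if no gesture is present."""
    start = None
    for i in range(0, len(samples) - size + 1, size):
        energy = window_energy(samples, i, size)
        if start is None and energy > threshold:
            start = i          # start trigger: energy rises above threshold
        elif start is not None and energy <= threshold:
            return (start, i)  # end trigger: energy falls back below threshold
    return (start, len(samples)) if start is not None else None
```

The samples between the returned indices would then be passed on for feature extraction, matching the description that digital EMG data is captured between the two triggers.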
  • Said digital EMG signals may be classified as a gesture signal and further analyzed/processed by the processing unit (130).
  • The processing unit (130) is configured to extract parameters such as mean, variance, correlation, mode, quartiles, median, standard deviation, minimum and maximum, though it need not be limited to these statistical parameters. Other parameters such as slope sign changes, zero crossings, structure and length are determined for every channel of the digital signals. Values from the motion and orientation sensors are also collected and processed into motion and pose parameters.
  • Based on any combination of some or all of the parameters mentioned, a feature vector for the gesture is created.
  • Each feature vector is assigned to a gesture performed by the user, and thereby by the assistive device. Accordingly, a plurality of feature vectors is determined for each gesture or task that the user wishes to perform. Feature vectors are also determined when the person is not performing any gesture, or is performing some kind of motion that might otherwise be falsely detected as a gesture.
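The feature-vector construction above can be sketched as follows. The exact parameter set and its ordering are assumptions for illustration; the patent lists more candidates (correlation, mode, quartiles, structure, length, and the motion/pose values) than this minimal example computes.

```python
import statistics

def slope_sign_changes(channel):
    """Count the points where the slope of the waveform changes sign."""
    return sum(
        1 for i in range(1, len(channel) - 1)
        if (channel[i] - channel[i - 1]) * (channel[i + 1] - channel[i]) < 0
    )

def zero_crossings(channel):
    """Count the sample pairs where the signal crosses the zero baseline."""
    return sum(1 for a, b in zip(channel, channel[1:]) if a * b < 0)

def feature_vector(channels):
    """One feature vector for a detected gesture: per-channel statistics
    plus waveform-shape counts, concatenated across all electrodes."""
    features = []
    for ch in channels:
        features += [
            statistics.mean(ch),
            statistics.variance(ch),
            statistics.median(ch),
            min(ch),
            max(ch),
            slope_sign_changes(ch),
            zero_crossings(ch),
        ]
    return features
```

With four electrodes, `channels` would hold four sample lists and the resulting vector concatenates the per-channel features in a fixed order, so that the same gesture yields comparably shaped vectors across trials.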
  • The system comprises a supervised machine learning module, wherein the plurality of feature vectors and their respectively ascribed gestures and false gestures are inputs for training the supervised machine learning module.
  • The supervised machine learning module may generate training parameters such as, but not limited to, weights and distances after the training process has completed. Training parameters, if any, are stored in a volatile or non-volatile memory and used during the gesture-identification phase as needed. After training has been completed, the system and method of the present invention can identify the gestures/actions that the user intends to perform, as well as motions falsely detected as gestures, and control the actuator or the assistive device accordingly.
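The patent does not name a specific learning algorithm, so the sketch below uses a nearest-centroid classifier purely as a stand-in for the supervised module; the stored centroids play the role of the "training parameters" kept in memory, and a dedicated "no-gesture" class stands in for the false-motion handling.

```python
def train(feature_vectors, labels):
    """Compute one centroid per gesture label; these centroids are the
    stored training parameters used later during gesture identification."""
    sums, counts = {}, {}
    for fv, label in zip(feature_vectors, labels):
        acc = sums.setdefault(label, [0.0] * len(fv))
        for i, x in enumerate(fv):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def classify(centroids, fv):
    """Return the gesture label whose centroid is closest to the feature
    vector; training data should include a no-gesture class so that false
    motions map to no motion of the assistive device."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, fv))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))
```

In practice the module described in the patent could equally be an SVM, a neural network or any other supervised classifier; the point of the sketch is the two phases it names, a training phase that produces stored parameters and an identification phase that maps a fresh feature vector to a gesture or to no motion.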
  • The electrodes obtain EMG signals. These EMG signals are then amplified, filtered and digitized by the ADC to provide a digital EMG signal.
  • The digital EMG signals are thereafter received by the processing unit, wherein they are stored and/or analyzed to determine whether they correspond to a gesture/action intended to be performed by the wearer.
  • The processing unit is configured to determine the energy level of the digital EMG signals. In this regard, the processing unit is configured to compare the energy level with a pre-determined or threshold energy level. In case the energy level is greater than the pre-determined energy level, the digitized signals are classified as the start of a gesture signal until the energy level falls below the pre-determined/threshold energy level. Accordingly, valid digitized signals are identified by the system.
  • The digital EMG signals classified as a gesture signal are further analyzed/processed by the processing unit to determine the gesture associated with the digital EMG signal and generate a control signal.
  • The processing unit is configured to extract parameters such as mean, variance, correlation, mode, quartiles, median, standard deviation, minimum and maximum, though it need not be limited to these statistical parameters. Other parameters such as slope sign changes, zero crossings, structure and length are determined for every channel of EMG samples. Values from the motion and orientation sensors are also collected and processed into motion and pose parameters. Based on any combination of some or all of the parameters mentioned, a feature vector for a detected gesture is created.
  • Each feature vector is an input to the trained supervised machine learning module, which generates a control signal specifying the gesture that has been performed by the wearer of the assistive device, or no motion when a false gesture is detected.
  • The gesture or control signal identified by the trained supervised machine learning system is communicated to the actuator, which controls the assistive device to perform the given task.
  • The present invention provides a method for controlling an assistive device.
  • The method is carried out on a system as discussed hereinbefore.
  • The method starts with capturing surface EMG signals from the active muscle sites of the wearer.
  • The EMG signals are then amplified, filtered and converted to a digital signal.
  • The digital EMG signals are thereafter analyzed to determine the energy level of the digital EMG signals.
  • The energy level is compared with a pre-determined or threshold energy level. In case the energy level is greater than the pre-determined energy level, the digital signals are classified as the start of a gesture signal until the energy level falls below the pre-determined/threshold energy level.
  • The digital signals classified as a gesture signal are further analyzed/processed, whereby parameters such as mean, variance, correlation, mode, quartiles, median, standard deviation, minimum and maximum are extracted, though extraction need not be limited to these statistical parameters. Other parameters such as slope sign changes, zero crossings, structure and length are determined for every channel of the digital signals. Values from the motion and orientation sensors are also collected and processed into motion and pose parameters. Based on any combination of some or all of the parameters mentioned, a feature vector for the gesture is created, and to each feature vector a gesture performed by the user is ascribed. Accordingly, a plurality of feature vectors is determined for each gesture or task to be performed by the assistive device. Feature vectors are also determined when the person is not performing any gesture, or is performing some kind of motion that might otherwise be falsely detected as a gesture.
  • The plurality of feature vectors and their respectively ascribed gestures are inputs for training a supervised machine learning module.
  • The supervised machine learning module may generate training parameters such as, but not limited to, weights and distances after the training process has completed. Training parameters, if any, are stored in a volatile or non-volatile memory and used during the gesture-identification phase as needed. After training has been completed, the system and method of the present invention can identify the gestures/actions that the user intends to perform and control the assistive device accordingly. If motions are falsely detected as gestures, the assistive device performs no motion.
  • Surface EMG signals are captured from the active muscle sites of the wearer.
  • The EMG signals are then amplified, filtered and converted to a digital EMG signal.
  • The digital EMG signals are thereafter analyzed to determine whether they correspond to a gesture/action intended to be performed by the wearer.
  • The method determines the energy level of the digital signals. Upon determining the energy level, the energy level is compared with a pre-determined or threshold energy level. In case the energy level is greater than the pre-determined energy level, the digital signals are classified as a gesture signal until the energy level falls below the threshold.
  • The digital signals classified as a gesture signal are further analyzed/processed, whereby parameters such as mean, variance, correlation, mode, quartiles, median, standard deviation, minimum and maximum are extracted, though extraction need not be limited to these statistical parameters. Other parameters such as slope sign changes, zero crossings, structure and length are determined for every channel of EMG samples. Values from the motion and orientation sensors are also collected and processed into motion, pose and orientation parameters.
  • Each feature vector is an input to the trained supervised machine learning module, which generates a control signal specifying the gesture that has been performed by the wearer of the assistive device.
  • The gesture or control signal identified by the trained supervised machine learning system is communicated to the actuator, which controls the assistive device to perform the given task. If motions are falsely detected as gestures, the assistive device performs no motion.
  • Figure 2 shows a system (200) for controlling an assistive device (10) in accordance with an embodiment of the invention.
  • The system is similar to the system illustrated in Figure 1, and further comprises one or more sensors which provide feedback on parameters of the assistive device.
  • The sensors can include force or pressure feedback sensors, over-current limit sensors, finger-movement-limiting sensors, etc. Based on input from these sensors, the system may modify or alter control signals to protect the assistive device or its actuator from damage.
  • A wearer of the assistive device can perform, or control the assistive device to perform, a task or action by thinking of it, just as the wearer would perform the action with a healthy hand. Further, by adapting to the body signals of a person, the present invention cuts down on the training needed to control the assistive device.
  • Figure 3 shows a method for controlling an assistive device (10) as another embodiment of the present invention.
  • A method for controlling assistive devices that avoids undesired triggering of the assistive device, is intuitive in nature, and requires no external controlling device comprises the steps of: mounting a plurality of non-invasive electrodes (110) on the active muscle sites of a wearer at predetermined positions; capturing the electromyography signals through said electrodes (110); capturing motion and orientation of the residual limb and assistive device independently by the motion sensor (140) and orientation sensor (150) respectively; amplifying the received electromyography signals, reducing noise, filtering the received signals and digitizing them by the sensing module (120) to provide digital electromyography signals to the processing unit (130); receiving the digital electromyography signal data from said sensing module (120), then storing, extracting various parameters, classifying and analyzing said data by the processing unit (130); generating training parameters and storing the parameters in memory; identifying correct gestures and false motions by a machine learning module (170); and sending control signals to a plurality of actuators (160) configured to receive signals from said supervised machine learning module and perform the gesture intended by the wearer.
  • FIG. 1 A. Valid Gesture Detection (For example : Open, Close, Point, Pinch gestures)
  • Figure.1 Surface EMG data plot of 4 electrodes placed on user’s skin at 4 different locations. The Blue line plot represents raw EMG data and the Red line plot represents the energy level of the valid gesture.
  • Figure 1 shows digitized surface electromyography (EMG) signal data from four EMG electrodes placed across forearm muscle sites of a normal-hand user. In the next step, the average energy of all the sampled data is calculated, and the plot of the average energy of the signal is shown as a red line.
  • EMG: surface electromyography
  • The Y-axis represents the amplitude of the EMG signals and their energy.
  • The X-axis represents the number of samples captured from an individual sensor.
  • Figure 2. Combined average energy plot of 4 EMG sensors. For example, when the hand is in a resting state, all four EMG signals have steady values near the zero baseline. When the user performs a gesture, e.g. closing of the hand, all four EMG signals show variations that are a measure of the magnitude of muscle force, as shown in Figure 1. The muscle force generated at different muscle sites is different, and it also varies with different gestures. Hence the pattern of the close-hand gesture will be different from the pattern of the open-hand gesture.
  • Figure 3 shows a magnified view of the energy plot in Figure 2.
  • The red line represents the threshold level, and
  • the blue line plot represents the average energy of the sampled data of a particular gesture.
  • If the average energy value goes above the predetermined threshold value, it is a valid gesture; otherwise the hand is in a steady state.
  • The energy going above the threshold indicates the start of the gesture, which continues until the gesture is completed, after which the energy goes below the threshold, indicating the end of the gesture, as shown in Figure 3.
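The average-energy computation described for Figures 1 and 2 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the window length, sampling parameters, and the use of a squared-signal boxcar average are assumptions, and the synthetic data merely mimics a rest/gesture/rest sequence.

```python
import numpy as np

def channel_energy(emg, window=64):
    # Moving-average energy of one EMG channel: square the signal
    # (rectification) and smooth it with a boxcar window (assumed method).
    return np.convolve(emg.astype(float) ** 2, np.ones(window) / window, mode="same")

def combined_average_energy(channels, window=64):
    # Average the per-channel energies of all electrodes, as in the
    # combined plot of Figure 2.
    energies = np.vstack([channel_energy(ch, window) for ch in channels])
    return energies.mean(axis=0)

# Synthetic example: 4 channels, quiet baseline with a burst of muscle
# activity (a "gesture") in the middle of the recording.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.05, 500)
burst = rng.normal(0.0, 1.0, 300)
channel = np.concatenate([rest, burst, rest])
channels = [channel + rng.normal(0.0, 0.02, channel.size) for _ in range(4)]

energy = combined_average_energy(channels)
print(energy.shape)  # one energy value per sample
```

During the burst the combined energy rises well above its resting baseline, which is exactly the behavior the threshold test in Figure 3 relies on.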
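The start/end segmentation of Figure 3 (energy crossing a predetermined threshold) can be sketched as a simple edge detector on the thresholded energy trace. The threshold value and the function name are illustrative assumptions; the patent only specifies that a predetermined threshold separates valid gestures from the steady hand state.

```python
import numpy as np

THRESHOLD = 0.1  # assumed value; the patent says only "predetermined threshold"

def segment_gestures(energy, threshold=THRESHOLD):
    # Return (start, end) index pairs for spans where the average energy
    # stays above the threshold. Rising edges mark gesture start, falling
    # edges mark gesture end (cf. Figure 3). Assumes the buffer begins at rest.
    above = energy > threshold
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    # A gesture still in progress at the end of the buffer has no falling edge.
    if above[-1] and len(ends) < len(starts):
        ends = np.append(ends, len(energy))
    return list(zip(starts, ends))

energy = np.array([0.0, 0.02, 0.05, 0.4, 0.8, 0.6, 0.3, 0.05, 0.01])
print(segment_gestures(energy))  # [(3, 7)]
```

Each returned span can then be handed to the classifier, so only energy above the threshold triggers gesture recognition and the steady hand state never actuates the device.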

Abstract

The present invention relates to a system and a method for controlling an assistive device. The assistive device is controlled on the basis of feature recognition of body signals of a wearer wearing the assistive device. By adapting to the body signals of a person or wearer, the feature-recognition-based assistive device reduces the post-fitting learning period needed by the user or wearer of the assistive device. Said system does not require the wearer to generate defined pulses, receives input data from multiple data points, and seamlessly controls the assistive device with several degrees of freedom.
PCT/IN2022/050014 2021-01-10 2022-01-07 System and method for controlling an assistive device WO2022149165A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202021049128 2021-01-10
IN202021049128 2021-01-10

Publications (1)

Publication Number Publication Date
WO2022149165A1 true WO2022149165A1 (fr) 2022-07-14

Family

ID=82358801

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2022/050014 WO2022149165A1 (fr) 2021-01-10 2022-01-07 System and method for controlling an assistive device

Country Status (1)

Country Link
WO (1) WO2022149165A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160107309A1 (en) * 2013-05-31 2016-04-21 President And Fellows Of Harvard College Soft Exosuit for Assistance with Human Motion
CN110695959A (zh) * 2019-08-27 2020-01-17 成都锦江电子系统工程有限公司 外骨骼机器人及其控制系统
US20200275895A1 (en) * 2019-02-28 2020-09-03 Facebook Technologies, Llc Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22736726; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22736726; Country of ref document: EP; Kind code of ref document: A1)