WO2024159228A1 - Control system for a prosthetic limb apparatus - Google Patents

Control system for a prosthetic limb apparatus

Info

Publication number
WO2024159228A1
Authority
WO
WIPO (PCT)
Prior art keywords
prosthetic apparatus
signals
prosthetic
user
central controller
Prior art date
Application number
PCT/US2024/013385
Other languages
English (en)
Inventor
Kar-Han Tan
Eric M. MONSEF
Andru LIU
Original Assignee
Atom Limbs Inc.
Priority date
Filing date
Publication date
Application filed by Atom Limbs Inc. filed Critical Atom Limbs Inc.
Publication of WO2024159228A1

Classifications

    • A61B 5/486: Bio-feedback
    • A61B 5/389: Electromyography [EMG]
    • A61B 5/4851: Prosthesis assessment or monitoring
    • A61F 2/72: Bioelectric control, e.g. myoelectric
    • A61N 1/36: Applying alternating or intermittent electric currents by contact electrodes, for stimulation
    • A61B 2562/0219: Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 2562/0247: Pressure sensors
    • A61F 2/54: Artificial arms or hands or parts thereof

Definitions

  • Described herein are methods and apparatuses (devices, systems, etc.) related to artificial limbs, including prosthetics and/or robotic arms. These methods and apparatuses described herein may be used as part of a powered prosthetic apparatus to be worn by a user.
  • Apparatuses (devices and systems, including garments and wearable devices) and methods for use with a powered prosthetic apparatus are described herein.
  • these apparatuses may include any of the features or components described herein in any combination or individually.
  • a wearable prosthetic apparatus may comprise a shoulder joint, a cuff, an upper arm, a forearm, a wrist, a palm, and fingers.
  • the prosthetic apparatus may include a control system.
  • the control system includes a plurality of sensors, controllers, and actuators.
  • the sensors, which may include myoelectric sensors, inertial measurement units, pressure sensors, cameras, and the like, may be coupled directly or indirectly to a central controller.
  • the central controller may form a stimulus vector from sensor signals.
  • the sensor signals may be used to control operations of the prosthetic apparatus.
  • Operation of the prosthetic apparatus may be guided by one or more operating states that define or describe a particular configuration of the prosthetic apparatus.
  • a controller can cause the prosthetic apparatus to operate in various operating states based on sensor signals or, in some cases, a stimulus vector associated with the sensor signals.
  • a camera can be used to program or learn behaviors for the prosthetic apparatus.
  • a camera can capture images of a real appendage of a user, and the prosthetic apparatus can be programmed to copy or mimic the actions of the real appendage.
  • Any of the methods described herein may be used for controlling any prosthetic apparatus and may include determining by a central controller, a current operating state of the prosthetic apparatus, receiving, by the central controller, a plurality of electromyographic (EMG) signals from a plurality of electrodes, forming, by the central controller, a stimulus vector of EMG signal values from the plurality of EMG signals, and transitioning to a next operating state based on the stimulus vector, wherein the transitioning includes activating one or more actuators of the prosthetic apparatus based on the next operating state.
  • a stimulus vector is an n-dimensional vector formed from n signals from n sensors.
  • the stimulus vector may originate at an origin.
  • a stimulus vector may be associated with a particular arrangement of input signals from n sensors.
  • the operating states can describe a position, relative position, actuator position or the like associated with any prosthetic apparatus.
  • each of the current and the next operating states may be associated with predetermined actuator positions of the prosthetic apparatus.
  • the actuator positions described herein may include actuator extension, retraction, rotation, or a combination thereof.
  • the actuator positions may be based on user training.
  • any of the electrodes may be surface electrodes disposed on a patient’s skin.
  • the electrodes may be myoelectric electrodes disposed on a garment worn by the user and configured to receive signals generated by the user.
  • the central controller can include one or more processors.
  • the processors may be one or more suitable processors capable of executing scripts or instructions of one or more software programs.
  • the central controller may be included within the prosthetic apparatus.
  • a processor may include hardware that runs the computer program code.
  • the term ‘processor’ may include and/or may be part of a controller and may encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), signal processing devices and other devices.
  • Any of the methods described herein may also include receiving, by the central controller, one or more signals from at least one inertial monitoring unit (IMU), wherein the stimulus vector is modified by the one or more signals from the at least one IMU.
  • the current operating state describes a prosthetic orientation based on the one or more signals from the at least one IMU.
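  • As a rough illustration of how the signals described above could be combined, the Python sketch below concatenates n rectified EMG samples into a stimulus vector and optionally modifies it with IMU orientation values. The function name, array shapes, and the choice to append normalized orientation terms are illustrative assumptions rather than the patent's prescribed implementation.

```python
import numpy as np

def form_stimulus_vector(emg_samples, imu_orientation=None):
    """Form a stimulus vector from n EMG channel values.

    emg_samples: iterable of n raw EMG readings (one per electrode).
    imu_orientation: optional (roll, pitch, yaw) in degrees used to modify
        the vector; appending normalized orientation terms is only one
        possible modification scheme (an assumption).
    """
    # Rectify so every component is >= 0, matching the rectified EMG values
    # described elsewhere in this disclosure.
    vector = np.abs(np.asarray(emg_samples, dtype=float))
    if imu_orientation is not None:
        vector = np.concatenate([vector, np.asarray(imu_orientation, dtype=float) / 180.0])
    return vector

# Example: eight electrodes (one described variation) plus an IMU reading.
emg = [0.12, -0.05, 0.33, 0.02, -0.21, 0.08, 0.40, 0.01]
stimulus = form_stimulus_vector(emg, imu_orientation=(5.0, -30.0, 90.0))
print(stimulus.shape)  # (11,) -> 8 EMG components + 3 orientation terms
```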
  • controlling a prosthetic apparatus may be based on configuring the prosthetic apparatus based on an operating state.
  • the methods described herein may include receiving, by the central controller, signals from a camera disposed on the prosthetic apparatus, wherein transitioning to the next operating state is based, at least in part, on the signals from the camera.
  • movement and/or motion of the prosthetic apparatus may be provided by one or more actuators.
  • Some actuators may be controlled by joint controllers that may be directly or indirectly coupled to a central controller.
  • the methods may include providing, by the actuators, a constant pressure to an object.
  • the pressure may be provided in response to receiving a particular stimulus vector.
  • the pressure provided by the actuators may be increased in response to receiving a particular stimulus vector.
  • any of the methods described herein may include providing haptic feedback to a user based on the signals associated with pressure.
  • transitioning to any operating state may be triggered by, or in response to, any particular stimulus vector.
  • different stimulus vectors may be used to transition the prosthetic apparatus to different operating states.
  • the same stimulus vectors may be used to transition the prosthetic apparatus to different operating states.
  • Stimulus vectors may be associated with a centroid.
  • the centroid may be a closest matching point that lies near or on the stimulus vector.
  • a centroid may be associated with an actuator position.
  • certain centroids may be associated with certain operating states.
  • the EMG signals may be provided by sensors.
  • the EMG signals may be processed.
  • the processing may include filtering, smoothing, and/or rectifying.
  • the EMG signals are sampled at a rate of 50 times a second.
  • the EMG signals are sampled at a rate of 200 times a second and then averaged over four samples to provide an effective sampling rate of 50 Hz.
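  • A minimal sketch of the sampling scheme described above, assuming a simple block average of four consecutive rectified 200 Hz samples per channel; the buffering layout and function name are assumptions for illustration.

```python
import numpy as np

def downsample_emg(samples_200hz):
    """Average groups of four rectified 200 Hz samples to an effective 50 Hz.

    samples_200hz: array of shape (num_samples, num_channels).
    """
    x = np.abs(np.asarray(samples_200hz, dtype=float))  # rectify
    n, channels = x.shape
    x = x[: (n // 4) * 4]                               # drop any partial group
    return x.reshape(-1, 4, channels).mean(axis=1)      # 4-sample block average

# Example: 1 second of 8-channel EMG sampled at 200 Hz -> 50 averaged frames.
raw = np.random.randn(200, 8)
print(downsample_emg(raw).shape)  # (50, 8)
```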
  • prosthetic apparatuses may include one or more sensors, one or more processors, a memory storing instructions that, when executed by the one or more processors, cause the apparatus to determine, by a central controller, a current operating state of the prosthetic apparatus, receive, by the central controller, a plurality of electromyographic (EMG) signals from a plurality of electrodes, form, by the central controller, a stimulus vector of EMG signal values from the plurality of EMG signals, and transition to a next operating state based on the stimulus vector, wherein the transitioning includes activating one or more actuators of the prosthetic apparatus based on the next operating state.
  • non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a prosthetic apparatus, cause the prosthetic apparatus to determine, by a central controller, a current operating state of the prosthetic apparatus, receive, by the central controller, a plurality of electromyographic (EMG) signals from a plurality of electrodes, form, by the central controller, a stimulus vector of EMG signal values from the plurality of EMG signals, and transition to a next operating state based on the stimulus vector, wherein the transitioning includes activating one or more actuators of the prosthetic apparatus based on the next operating state.
  • the methods may include receiving, from a camera, one or more images of a user's limb, executing, by a processor, a neural network trained to match the one or more images of the user's limb to one or more configurations of the prosthetic apparatus, wherein a configuration includes a relative position of portions of the prosthetic apparatus, and determining a label associated with each of the one or more configurations of the prosthetic apparatus.
  • each label may be associated with a configuration, pose, or positioning of the prosthetic apparatus.
  • a controller may link or associate the determined labels with an operating state.
  • the method may include associating one or more stimulus vectors with each operating state.
  • Training of the neural network may be based on user data.
  • Training operations of the prosthetic apparatus can use captured images as input guidance to control, manipulate, or position the prosthetic. In this manner, a user can easily train a prosthetic apparatus to perform complex motions, tasks, or actions.
  • the camera can capture images of a user’s limb, which may include a hand and fingers and the like. The method can then program the prosthetic apparatus to mimic or mirror the captured images.
  • a prosthetic apparatus that includes one or more sensors, one or more processors, a memory storing instructions that, when executed by the one or more processors, cause the apparatus to receive, from a camera, one or more images of a user's limb, execute, by a processor, a neural network trained to match the one or more images of the user's limb to one or more configurations of the prosthetic apparatus, wherein a configuration includes a relative position of portions of the prosthetic apparatus, and determine a label associated with each of the one or more configurations of the prosthetic apparatus.
  • Also described herein is a non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a prosthetic apparatus, cause the prosthetic apparatus to receive, from a camera, one or more images of a user's limb, execute, by a processor, a neural network trained to match the one or more images of the user's limb to one or more configurations of the prosthetic apparatus, wherein a configuration includes a relative position of portions of the prosthetic apparatus, and determine a label associated with each of the one or more configurations of the prosthetic apparatus.
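  • The sketch below is one hedged way such a trained network could be structured: a small image classifier (written here with PyTorch) that maps a camera frame of the user's limb to a configuration label. The architecture, input size, and label count are assumptions; the disclosure does not specify a particular network.

```python
import torch
from torch import nn

class LimbPoseClassifier(nn.Module):
    """Tiny CNN mapping a limb image to one of num_labels configuration
    labels (e.g. an "open hand" or "pinch" pose). Purely illustrative."""

    def __init__(self, num_labels: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_labels)

    def forward(self, images):                 # images: (batch, 3, H, W)
        x = self.features(images).flatten(1)
        return self.classifier(x)              # logits over configuration labels

# Inference sketch: pick the most likely configuration label for one frame.
model = LimbPoseClassifier(num_labels=5).eval()
frame = torch.rand(1, 3, 128, 128)             # stand-in for a captured image
label = model(frame).argmax(dim=1).item()
print(label)
```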
  • the apparatuses described herein may be configured to perform any of the methods described. All of the methods and apparatuses described herein, in any combination, are herein contemplated and can be used to achieve the benefits as described herein.
  • FIG. 1A illustrates an example of a garment apparatus configured to be used with (or to include) a prosthetic apparatus to be worn by a user.
  • FIG. 1B is an example of a prosthetic apparatus as described herein.
  • FIG. 2A shows an example control system for a prosthetic apparatus.
  • FIG. 2B shows another example of a control system for a prosthetic apparatus.
  • FIG. 3 is an illustration of an example application of a prosthetic apparatus.
  • FIG. 4A shows a simplified diagram of an example prosthetic apparatus.
  • FIG. 4B shows a simplified diagram of another example prosthetic apparatus.
  • FIG. 5 shows a simplified diagram of prosthetic fingers and thumb grasping a target.
  • FIG. 6A shows a simplified diagram of an example prosthetic apparatus arranged to enable a user to drink from a cup.
  • FIG. 6B shows a simplified diagram of another example prosthetic apparatus arranged to enable a user to drink from a cup.
  • FIG. 7A shows another example prosthetic apparatus.
  • FIG. 7B shows a side view of an example arrangement of myoelectric electrodes and an IMU.
  • FIG. 8A shows a three-dimensional graph that includes data groupings for actions or motions associated with a predetermined prosthetic action.
  • FIG. 8B shows a two-dimensional graph that includes data groupings for actions or motions associated with a predetermined prosthetic action.
  • FIG. 9 is an example state diagram for controlling a prosthetic apparatus.
  • FIG. 10A is another example state diagram for controlling a prosthetic apparatus.
  • FIG. 10B is another state diagram for controlling a prosthetic apparatus.
  • FIG. 11 is another example state diagram for controlling a prosthetic apparatus.
  • FIG. 12 shows an example prosthetic apparatus that includes a variety of IMU sensors to hold a cup level.
  • FIG. 13 is a flowchart showing an example method for operating a prosthetic apparatus.
  • FIG. 14 shows an example system for training a prosthetic apparatus.
  • FIG. 15 shows an image of a user playing piano with a prosthetic apparatus.
  • FIGS. 16A and 16B show images of actions that require individual control and/or articulation of finger joints.
  • FIG. 17 is a flowchart showing an example method for training a prosthetic apparatus.
  • FIG. 18 shows a block diagram of a device that may be one example of the control system of FIG. 2A or FIG. 2B.
  • FIG. 19A illustrates an example of a cuff including a spatial haptic feedback array ("spatial haptic feedback cuff").
  • FIG. 19B illustrates one example of a pattern of actuating the spatial haptic feedback array of FIG. 19A.
  • Wearable apparatuses for use with prosthetics, and in particular, powered prosthetics, may be configured to hold and distribute the weight and forces of the artificial limb.
  • Artificial limbs as described herein may refer to prosthetics and/or robotic arms.
  • the methods and apparatuses (garments) described herein may be used as part of a powered prosthetic apparatus to be worn by a user or they may equivalently be a part of a robotic apparatus that may be operated remotely and/or automatically.
  • the apparatuses (devices and/or system, including wearable garments for use with artificial limbs) described herein may include a skin-worn layer, which may be a compression garment, a set of rigid or semi-rigid plates coupled to the skin-worn layer, and one or more internal striction regions for engaging with the wearer’s skin.
  • the apparatuses described herein may include one or more neuromuscular interfaces (e.g., cuffs).
  • the socket attachment may mount to the cuff and/or may be configured to couple to the garment; in some examples the socket may be integrated into the garment.
  • Powered prosthetic apparatuses are becoming increasingly more capable and complex.
  • many prosthetic apparatuses may include a large number of actuators that can move joints, limbs, or appendages.
  • the complexity of controlling the prosthetic's actuators has surpassed controlling the opening and closing of a simple end effector (e.g., hook, claw, or the like) of simpler prosthetic apparatuses.
  • FIG. 1A illustrates an example of a garment apparatus configured to be used with (or to include) a prosthetic apparatus to be worn by a user.
  • the user may be, for example, an amputee.
  • the apparatus (garment) may include a harness or yoke 101 to which a prosthetic apparatus 100 may be coupled.
  • the user may also wear a neuromuscular interface, in this example, configured as a cuff 105 which may be applied over the skin in order to detect input (e.g., signals, including electromyographic, EMG, signals) to control the prosthetic apparatus 100.
  • the prosthetic apparatus 100 may include an upper arm portion 103 that may be coupled to the user’s torso and/or the yoke 101 and may include an elbow joint 107 linked to both the upper arm portion 103 and a forearm portion 109.
  • the elbow joint 107 may be powered, as described herein. That is, the elbow joint 107 may include any number of actuators to move the elbow joint 107 through any number of positions. In some variations, the elbow joint 107 may limit movement to certain axes (thereby limiting degrees of freedom).
  • the forearm 109 may include an outer housing enclosing powered and unpowered components, and may couple to a wrist joint 111, which may be a powered wrist joint, as described herein.
  • the wrist joint 111 may connect to a hand with fingers 115, which may be powered, and a palm 113.
  • FIG. 1B shows another example prosthetic apparatus 120.
  • the prosthetic apparatus 120 may be similar to the prosthetic apparatus 100 of FIG. 1A and also include a camera and microphone module 122. Although depicted here as a shared module (the camera and microphone module 122), in some variations the camera and the microphone may be separate devices. In some examples, the camera may be a wide-angle camera able to capture and/or detect objects, motion, or the like near the position of the camera and microphone module 122. Although only one camera and microphone module 122 is shown here, in other variations the prosthetic apparatus 120 may include any feasible number of cameras and/or microphones.
  • the camera and microphone module 122 may be used by a control system (not shown) to aid the user in controlling the prosthetic apparatus 120.
  • the camera and microphone module 122 may provide input signals for the control system.
  • the control system may receive camera images showing that the hand of the prosthetic apparatus 120 is being brought near a cup, glass, mug, or the like. The control system may then proceed to direct the prosthetic apparatus 120 to grasp the cup, glass, etc.
  • the control system may receive camera images showing that the hand of the prosthetic apparatus 120 is being brought near a doorknob. The control system may then proceed to direct the prosthetic apparatus 120 to grasp and twist the doorknob.
  • the microphone may capture noises and/or voice commands to assist the control system in controlling the prosthetic apparatus 120.
  • FIG. 2A shows an example control system 200 for a prosthetic apparatus, such as the prosthetic apparatus 100 of FIG. 1A or the prosthetic apparatus 120 of FIG. 1B.
  • the control system 200 may be distributed throughout a prosthetic apparatus thereby advantageously distributing component weight and optimizing placement of sensors used to provide input signals to the control system 200.
  • the control system 200 may be divided into sections that coincide with portions of a prosthetic apparatus.
  • the control system 200 includes a cuff section 210, a forearm section 220, a wrist section 230, and a hand section 240.
  • Each section may include sensors, actuators, processors, and the like that can be used to control and actuate the prosthetic apparatus. Moreover, each section may be configured to perform one or more operations that may collectively provide the user control of the prosthetic apparatus.
  • the cuff section 210 may be worn by the user and include a cylindrical sheath that has a plurality of myoelectric sensors that may be placed in contact with the user (often on the user's residual limb or the like). The myoelectric sensors may receive, and in some cases amplify, signals (sometimes referred to as electromyographic (EMG) signals) from the user's skin.
  • the cuff section 210 may also include a power supply (battery, or the like) as well as additional sensors that can provide information regarding motion or position of the cuff section 210.
  • the cuff section 210 may include an inertial measurement unit (IMU).
  • the IMU may include one or more inertial, gyroscopic, and/or compass sensors.
  • the cuff section 210 may collect and/or receive a variety of signals directly and indirectly from the user to control the prosthetic apparatus.
  • the cuff 210 may be coupled (mechanically, electrically, and/or communicatively) to the forearm section 220.
  • the forearm section 220 may include a central controller that receives signals from the cuff section 210 as well as other sensors, and generates control signals for a variety of actuators that may be included within the prosthetic apparatus.
  • the central controller can coordinate control of the actuators of the prosthetic apparatus based on any received signals.
  • the forearm section 220 may include a battery, power button, status LED and a power management unit.
  • the power management unit may receive direct current (DC) power from a DC in port to provide current to charge the battery.
  • the forearm section 220 may distribute power within the prosthetic apparatus through a power bus.
  • the power bus may be coupled to any controllers, processors, sensors, actuators, and the like within the prosthetic apparatus.
  • the central controller may be coupled to other processors and/or controllers.
  • the central controller may be coupled to a kinematics processor included in the forearm section 220 through a controller area network (CAN).
  • the kinematics processor can direct other controllers to move actuators in the prosthetic apparatus.
  • the kinematics processor may be coupled (through the CAN) to a first joint controller in the forearm section 220.
  • the joint controller may control actuators in the forearm section 220 that may cause motion between any feasible sections of the prosthetic apparatus.
  • the forearm section 220 may also include a touch display, an audio amplifier, and a speaker.
  • the touch display, audio amplifier, and speaker may be used for configuration training.
  • the touch display may deliver messages to the user.
  • the messages may also be delivered through audio signals via the audio amplifier and the speaker.
  • a smartphone may wirelessly communicate to the central controller to train and/or configure the prosthetic apparatus. Use of a smartphone and/or an application executed on a smartphone to train and/or configure the prosthetic apparatus is described in more detail with respect to FIGS. 14-17.
  • the wrist section 230 may provide articulation between the forearm section 220 and the hand section 240. Although not shown, in some examples the wrist section 230 may include any feasible number of sensors, actuators, controllers and the like.
  • the hand section 240 may also include actuators and sensors.
  • the hand section 240 may include a number of actuators that are configured to move appendages (fingers, thumb, etc.) attached to the hand section 240.
  • the hand section 240 may include a joint controller coupled to the CAN and configured to control operation of the colocated actuators.
  • the hand section 240 may include an IMU that can provide input signals to the central controller. These IMU signals may be used in conjunction with other input signals to control the prosthetic apparatus.
  • the hand section 240 may include a camera and microphone that also may provide signals to the central controller to control movement and operations of the prosthetic apparatus.
  • the system 200' shown in FIG. 2B is similar to that shown in FIG. 2A, discussed above; however, the forearm portion 220' includes a Bluetooth controller that may interface with the cuff 210 and a mobile communications tool (e.g., a smartphone). In this way, the EMG signals from the cuff 210 can be interpreted with a lower power processor, without having to go through the central controller, potentially resulting in lower power consumption and improved responsiveness.
  • the hand section 240 and wrist portion 230 may be the same as in FIG. 2A.
  • any of these prosthetic apparatuses may include one or more processors for controlling operation of the prosthetic.
  • the one or more processors may be configured as a controller or controllers.
  • These apparatuses may generally include a power supply (rechargeable power supply, wall power adapter, etc.), and may include communication circuitry (e.g., wireless communication circuitry) to communicate between components (e.g., joints, etc.) and/or with a remote processor, user smartphone, tablet, etc.
  • a variety of different sensors may be included.
  • the powered components may include sensors to detect forces, including torque, acting on the powered component, and/or may include sensors for detecting position (absolute position and/or relative position).
  • Sensors may include force sensors, accelerometers, etc., and in some cases proximity sensors (e.g., optical, ultrasound, or electrical sensors such as ultrawideband (UWB) sensors) may be integrated into the apparatus, including on the fingers and/or palm and/or back of the hand, side(s) of the hand, wrist, forearm, etc.
  • the prosthetic apparatus may be operated with a user-control input device, including sensory devices (e.g., EMG, joystick, etc.).
  • the apparatus may include a cuff or band including neuromuscular sensors (e.g., EMG or other equivalent sensor) as control input to control the powered operation of these apparatuses (e.g., artificial, powered limb).
  • the apparatus may be configured to execute pre-programmed (e.g., "macro") movements.
  • the prosthetic apparatuses described herein may include one or more sensors for determining the position and/or torque and/or status of one or more joints formed as part of the apparatus (e.g., finger joint, wrist joint, elbow joint, etc.).
  • FIG. 3 is an illustration 300 of an example application of a prosthetic apparatus.
  • the user uses a prosthetic apparatus to grasp a glass.
  • the prosthetic apparatus may include and use a variety of sensors and actuators to move and control some or all of the prosthetic apparatus.
  • FIG. 4A shows a simplified diagram of an example prosthetic apparatus 400.
  • the prosthetic apparatus 400 may include a cuff 410, an upper arm 420, a forearm 430, and a hand 440.
  • the prosthetic apparatus 400 may include a shoulder joint 411, an elbow joint 421, and a wrist joint 431.
  • the joints provide articulation between one or more sections of the prosthetic apparatus 400.
  • the forearm 430 may include a central controller (not shown) similar to that described with respect to FIGS. 2A and 2B.
  • the prosthetic apparatus 400 includes IMUs that may provide the central controller with absolute and/or relative positioning signals as well as acceleration signals.
  • the cuff 410 may include a cuff IMU, the upper arm 420 may include an elbow IMU, and the hand 440 may include a palm IMU (not shown).
  • the cuff 410 may also include a plurality of myoelectric sensors (not shown). Signals from the IMUs and the myoelectric sensors may be coupled and/or provided to the central controller to enable the central controller to operate actuators within the prosthetic apparatus 400.
  • a user can generate signals (EMG signals) that are captured by the myoelectric sensors and coupled to the central controller.
  • the central controller may cause the prosthetic apparatus 400 to approach a target (shown here as a cup, but any feasible target is possible).
  • FIG. 4B shows a simplified diagram of another example prosthetic apparatus 450.
  • the prosthetic apparatus 450 may include the cuff 410, the upper arm 420, the forearm 430, and the hand 440 as described with respect to FIG. 4A.
  • the prosthetic apparatus 450 may include camera 460.
  • the camera 460 may capture images that may be used by the central controller to control the prosthetic apparatus 450. For example, as the prosthetic apparatus 450 is brought near a target, images from the camera 460 can be provided to the central controller. The central controller can use the images to cause the prosthetic apparatus 450 to move toward and, in some cases, grasp the target.
  • FIG. 5 shows a simplified diagram 500 of prosthetic fingers and thumb grasping a target.
  • the amount of grasping force provided by the fingers and thumb may be context dependent. That is, differing amounts of force may be used to grasp an object or target depending upon the target and/or a particular task involving the target.
  • each finger and/or thumb may include a pressure or force sensor to provide pressure or force information to the central controller (not shown).
  • individual fingers can flex (move) until a touch of the target is detected by any sensor. Then, after detection of the target, operation of the fingers may transition to "impedance control" to constantly apply some amount of pressure (e.g., grasping pressure) to the target. In some cases, the amount of pressure may be controlled by the user. For example, the user may generate EMG signals to increase or decrease grasping pressure. Pressure feedback may be provided by the prosthetic apparatus to the user through a display and/or through haptic feedback.
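  • One hedged way to realize the constant-pressure ("impedance control") behavior described above is a simple proportional loop that nudges a finger actuator command until the measured fingertip pressure matches a target; the gain, command range, and function name are illustrative assumptions rather than the patent's control law.

```python
def grasp_pressure_command(target_pressure, measured_pressure,
                           current_command, gain=0.05):
    """One step of a constant-pressure grasp loop: adjust the finger actuator
    command so the measured fingertip pressure tracks the target."""
    error = target_pressure - measured_pressure
    command = current_command + gain * error
    return max(0.0, min(1.0, command))       # clamp to the actuator's range

# Example: hold roughly 2.0 N on a cup; per the description, the user could
# raise or lower target_pressure with an EMG gesture.
command, target = 0.0, 2.0
for measured in (0.0, 0.8, 1.5, 1.9, 2.1):   # simulated pressure readings
    command = grasp_pressure_command(target, measured, command)
print(round(command, 3))
```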
  • FIG. 6A shows a simplified diagram of an example prosthetic apparatus 600 arranged to enable a user to drink from a cup.
  • EMG signals from the user may be received by the central controller to initiate control of the prosthetic apparatus 600 to move and operate allowing the user to drink from a cup.
  • the central controller may operate the prosthetic apparatus through a series of states. Transitioning from one state to another may be based upon the reception of other EMG signals or reception of signals from one or more IMUs or other sensors included within the prosthetic apparatus 600.
  • the central controller may receive EMG signals causing the prosthetic apparatus to operate in a first state to raise a cup. Within the first state, the palm IMU may provide signals to the central controller to hold the cup level while being raised by the forearm. Next, the central controller may then operate in a second state that tilts the cup for drinking. Entry into the second state may be caused or triggered by new or additional EMG signals received by the central controller.
  • FIG. 6B shows a simplified diagram of another example prosthetic apparatus 650 arranged to enable a user to drink from a cup.
  • the prosthetic apparatus 650 includes a camera.
  • the camera provides signals to a central controller that may also be used to control operations of the prosthetic apparatus 650.
  • the camera can detect and track a user’s face and mouth. The detection and tracking of the user’s face and mouth can cause the prosthetic apparatus 650 to keep the cup level as it is brought up to the user’s mouth.
  • the central controller can cause the prosthetic apparatus 650 to tilt the cup.
  • FIG. 7A shows another example prosthetic apparatus 700.
  • the prosthetic apparatus 700, which may be an arm, may include a cuff 710.
  • the cuff 710 may include a plurality of myoelectric electrodes 711 distributed circumferentially around the cuff 710.
  • the myoelectric electrodes 711 can simultaneously receive EMG signals from the user.
  • the central controller can simultaneously receive eight EMG signals.
  • the central controller can simultaneously receive sixteen EMG signals.
  • the central controller (such as the central controller shown in FIGS. 2A and 2B) can receive the EMG signals from the cuff 710 to control the prosthetic apparatus 700.
  • the EMG signals may be sampled periodically. For example, the EMG signals may be sampled fifty times a second (50 Hz). In another example, the EMG signals may be sampled 200 times a second (200 Hz). In some cases, the EMG signals may be averaged over four samples thereby providing an effective sampling rate of 50 Hz.
  • One or more of the EMG signals may be processed for or by the central controller. For example, one or more of the EMG signals may be rectified (made to substantially have voltages greater than or equal to zero volts).
  • one or more of the EMG signals may be filtered with a low-pass, high-pass, and/or band-pass filter to remove undesirable transitions, frequency components, or the like.
  • one or more of the EMG signals may be averaged. For example, a 200 Hz EMG signal may be averaged over four samples to generate a 50 Hz EMG signal.
  • the EMG signals may be differential signals that fluctuate between positive and negative signals.
  • the central controller may process the differential EMG signals.
  • the differential EMG signals may be integrated to remove spurious transitions (spikes).
  • the central controller may generate square waves from the differential EMG signals.
  • the EMG signals from the myoelectric electrodes 711 may be used to control the prosthetic apparatus.
  • the signals from the myoelectric electrodes 711 may be concatenated into an n-dimensional vector (where n is the number of myoelectric electrodes available to provide signals to the central controller).
  • the value of the n- dimensional vector may be used to control operations of the prosthetic apparatus 700.
  • the value of the n-dimensional vector may be used to trigger state transitions that control operations, positions, and/or movements of the prosthetic apparatus 700.
  • FIG. 7B shows a side view of an example arrangement 750 of myoelectric electrodes 751 and an IMU 752.
  • the myoelectric electrodes 751 and the IMU 752 may be disposed in or on any feasible surface or region of a prosthetic apparatus, such as the cuff 210 of FIGS. 2A and 2B.
  • Information (signals) from the IMU 752 may be used in conjunction with EMG signals from the myoelectric electrodes 751. For example, if the prosthetic apparatus is positioned on a residual appendage that is hanging straight down, then the myoelectric electrodes 751 may show signals indicating that all the muscles are relaxed, while signals from the IMU 752 indicate that the appendage is hanging straight down. If the appendage is at an angle (with respect to the ground), then the myoelectric electrodes 751 may indicate that some muscles are active. However, data from the IMU 752 may be used to provide positional information to the central controller. The signals from the IMU 752 may be used in addition to EMG signals to control the prosthetic apparatus 700. Control of the prosthetic apparatus is described in more detail below with respect to FIGS. 8-11.
  • FIG. 8A shows a three-dimensional graph 800 that includes data groupings for actions or motions associated with a predetermined prosthetic action.
  • the signals from n myoelectric electrodes may be used to form an n-dimensional vector.
  • the three-dimensional graph 800 shows three-dimensional vectors (instead of n-dimensional vectors) that may be associated with particular prosthetic actions.
  • a controller (such as the central controller of FIGS. 2A and 2B) may direct one or more actuators to activate and thereby move or control a prosthetic apparatus.
  • detection of sensor data may be used to generally signal user input.
  • the three-dimensional graph 800 includes an origin 810.
  • the origin 810 may represent little or no signals from any of the myoelectric sensors of the prosthetic apparatus. For example, if the user is exerting little or no force on a worn prosthetic apparatus, then the myoelectric sensors of the prosthetic apparatus would detect little or no myoelectric signals. Thus, the origin 810 may represent an initial, or resting, condition of the user.
  • myoelectric sensors may detect signals and form an n-dimensional vector from the detected signals. In some cases, the controller may determine a stimulus vector starting at the origin 810 and traveling through a point defined by the n-dimensional vector.
  • the central controller may then determine a closest matching point (e.g., a nearest neighbor, sometimes called a centroid) that lies near or on the stimulus vector.
  • the closest matching point may be used to determine user input.
  • the user input may control the prosthetic apparatus.
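  • A hedged sketch of the closest-matching-point step, assuming Euclidean distance over trained centroids and an optional distance threshold below which no input is recognized; the distance metric, threshold behavior, and names are assumptions.

```python
import numpy as np

def match_centroid(stimulus_vector, centroids, max_distance=None):
    """Return the label of the centroid nearest the stimulus vector, or None
    if no centroid is within max_distance (when a threshold is given)."""
    v = np.asarray(stimulus_vector, dtype=float)
    best_label, best_dist = None, float("inf")
    for label, point in centroids.items():
        dist = np.linalg.norm(v - np.asarray(point, dtype=float))
        if dist < best_dist:
            best_label, best_dist = label, dist
    if max_distance is not None and best_dist > max_distance:
        return None
    return best_label

# Three-channel example mirroring the simplified graph of FIG. 8A.
centroids = {"rest": (0.0, 0.0, 0.0), "open": (0.8, 0.1, 0.2), "close": (0.1, 0.9, 0.7)}
print(match_centroid((0.75, 0.15, 0.25), centroids, max_distance=0.5))  # "open"
```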
  • FIG. 8B shows a two-dimensional graph 850 that includes data groupings for actions or motions associated with a predetermined prosthetic action. Similar to the three-dimensional graph 800, the two-dimensional graph 850 may be a simplified representation of the n-dimensional vector formed from the signals from the myoelectric sensors. In the two-dimensional graph 850, different groupings of points may be associated with different n-dimensional vectors.
  • FIG. 9 is an example state diagram 900 for controlling a prosthetic apparatus. Operations associated with state diagram 900 may be performed by one or more controllers (including any controllers described herein, such as any of the controllers described with respect to FIGS. 2A and 2B).
  • One or more EMG signals 910 may be received from one or more sensors.
  • the EMG signals 910 may be surface EMG (S-EMG) signals received from one or more myoelectric and/or surface myoelectric sensors.
  • the EMG signals 910 may be formed into an n-dimensional vector.
  • the EMG signals 910 and a current operating state 920 may be provided to an S-EMG analysis unit 930.
  • An operating state, as described herein, may be associated with a particular mode of operation or with a particular position or orientation of the prosthetic apparatus.
  • the S-EMG analysis unit 930 determines one or more action parameters 940 based on the current state 920 and the EMG signals 910.
  • the action parameters 940 may include parameters to actuate any of the actuators included in the prosthetic apparatus.
  • the action parameters may control haptic feedback devices such as those described with respect to FIG. 5.
  • the S-EMG analysis unit 930 may also receive signals from IMUs, cameras, microphones, and/or pressure sensors included within the prosthetic apparatus.
  • the S-EMG analysis unit 930 can determine the action parameters 940 based on the EMG signals 910, the state 920, and the signals from IMUs, cameras, microphones, pressure sensors, or the like.
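  • A minimal sketch of the S-EMG analysis step, assuming the unit is a lookup over (operating state, recognized gesture) pairs refined by auxiliary sensor signals; the policy table, the contact rule, and the parameter names are assumptions, and the real unit could equally be a trained model.

```python
def analyze_s_emg(current_state, gesture, extra_signals, policy):
    """Combine the current operating state, a gesture label derived from the
    EMG stimulus vector (e.g. by a nearest-centroid match), and auxiliary
    sensor signals into action parameters for the actuators."""
    action = dict(policy.get((current_state, gesture),
                             {"actuators": {}, "haptics": None}))
    # Auxiliary signals may refine the action, e.g. stop closing the fingers
    # once a pressure sensor reports contact (an illustrative rule).
    if extra_signals.get("pressure_contact"):
        action["actuators"] = {}
    return action

# Example: while reaching, an "open" gesture extends two finger actuators.
policy = {("reaching", "open"): {"actuators": {"finger_1": 0.9, "finger_2": 0.9},
                                 "haptics": None}}
print(analyze_s_emg("reaching", "open", {"pressure_contact": False}, policy))
```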
  • FIG. 10A is another example state diagram 1000 for controlling a prosthetic apparatus.
  • State diagram 1000 is an example of a state diagram with a branching factor of N, where N is the number of possible distinct actions, starting from an initial state 1010.
  • the N possible actions are shown here as actions 1020. Although N actions are shown here, any feasible number of actions are possible. Thus, from the initial state 1010, operations may branch to one of N possible actions through N possible branches. Transitions to any of the actions 1020 may be in response to myoelectric sensor signals, IMU signals, pressure transducer signals, camera signals, microphone signals, or any other feasible input.
  • FIG. 10B is another state diagram 1050 for controlling a prosthetic apparatus.
  • the state diagram 1050 is an example of a state diagram with a branching factor of two.
  • the branching factor of two means that from any state, there are two possible outcomes (responses) based on received signals, or other inputs.
  • operations may proceed to a first state 1061 or a second state 1062, based on received signals.
  • progress through the state diagram 1050 may include traversing through a plurality of states 1070 until an action is determined.
  • Although M possible actions 1080 are shown here, any number of actions is possible.
  • FIG. 11 is another example state diagram 1100 for controlling a prosthetic apparatus.
  • the state diagram 1100 shows possible operating states associated with reaching for and drinking from a cup. Similar state diagrams may be used to describe control of the prosthetic apparatus to perform any feasible task or operation.
  • Each state in the state diagram 1100 may represent an operating state or a configuration (e.g., a position and/or arrangement) of the prosthetic apparatus.
  • a configuration may include a position of one or more sections, joints, or limbs of the prosthetic apparatus relative to the rest of the prosthetic apparatus.
  • any operating state may be associated with predetermined actuator positions.
  • Actuator positions may include actuator extension, retraction, rotation, or a combination thereof.
  • Although seven operating states are shown here, in other examples any feasible number of states may be used to control the prosthetic apparatus.
  • entry into any state may be determined by (or in response to) an input signal.
  • the input signal may be a collective or composite signal from any sensors including a plurality of myoelectric sensor signals.
  • the input signal may be an n-dimensional vector formed from the signals from the myoelectric sensors.
  • a first input 1130 may be an n-dimensional vector formed from signals from the myoelectric sensors showing little or no activity (similar to the origin 810 of FIG. 8A). The first input 1130 may return or keep operation of the prosthetic apparatus in the idle state 1101.
  • the prosthetic apparatus may receive a second input 1120 that causes the prosthetic apparatus to operate in a reaching state 1102.
  • the reaching state 1102 may cause one or more actuators of the prosthetic apparatus to extend for (reach toward) a cup.
  • the second input 1120 may be an n-dimensional vector different from the first input 1130.
  • the second input 1120 may be a stimulus vector as described above with respect to FIG. 8 A.
  • operation in the reaching state 1102 may also be controlled by a camera (such as the camera of FIGS. 4B and/or 6B).
  • a controller can use images from the camera to guide actuators to cause the prosthetic apparatus to reach for the cup.
  • the prosthetic apparatus may include a controller configured to execute a machine learning-based procedure (e.g., a trained neural network) to perform operations associated with the reaching state 1102.
  • any of these apparatuses and methods may include one or more trained pattern matching (e.g., machine learning) agents.
  • a trained pattern matching agent may include an artificial intelligence agent, including a machine learning agent.
  • the machine learning agent may be a deep learning agent.
  • the trained pattern matching agent may be a trained neural network. Any appropriate type of neural network may be used, including generative neural networks.
  • the neural network may be one or more of: perceptron, feed forward neural network, multilayer perceptron, convolutional neural network, radial basis functional neural network, recurrent neural network, long short-term memory (LSTM), sequence to sequence model, modular neural network, etc.
  • a trained pattern matching agent may be trained using a training data set.
  • the prosthetic apparatus may receive a third input 1121 that causes the prosthetic apparatus to operate in a grasping state 1104.
  • a controller may cause one or more actuators of the prosthetic apparatus to grasp (close on to or increase a pressure on) a cup.
  • the third input 1121 may be an n-dimensional vector or stimulus vector as described herein.
  • the prosthetic apparatus may receive a fourth input 1122 that causes the prosthetic apparatus to operate in a holding state 1106.
  • a controller may cause one or more actuators of the prosthetic apparatus to hold on to an object such as a cup by providing a continuous predetermined force or pressure (e.g., constant pressure) as determined by force sensors.
  • applying continuous force or pressure is referred to as impedance control.
  • the fourth input 1122 may be an n-dimensional vector or stimulus vector as described herein.
  • the fourth input 1122 may include pressure sensor inputs that indicate a touch with a cup or other object is detected.
  • the prosthetic apparatus may receive a fifth input 1123 that causes the prosthetic apparatus to operate in a raising/lowering state 1108.
  • the raising/lowering state 1108 may cause the prosthetic apparatus to raise or lower a grasped cup.
  • the fifth input 1123 may be an n-dimensional vector or stimulus vector as described herein.
  • input from an IMU sensor may be used to control the prosthetic apparatus while in the raising/lowering state 1108.
  • one or more IMU sensors may be used to hold/carry a cup in a level manner.
  • FIG. 12 shows an example prosthetic apparatus 1200 that includes a variety of IMU sensors to hold a cup level.
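  • A hedged sketch of how palm-IMU readings might be turned into wrist corrections that keep a held cup level; the proportional law, sign convention, and the assumption of independent wrist roll and pitch actuators are illustrative.

```python
def wrist_correction_to_level(palm_roll_deg, palm_pitch_deg, gain=1.0):
    """Drive the palm IMU's roll and pitch (degrees from horizontal) toward
    zero so a held cup stays level."""
    return {"wrist_roll": -gain * palm_roll_deg,
            "wrist_pitch": -gain * palm_pitch_deg}

# Example: the forearm is raised and the palm tilts; the wrist counter-rotates
# so the cup stays level while in the raising/lowering state.
print(wrist_correction_to_level(12.0, -3.5))  # {'wrist_roll': -12.0, 'wrist_pitch': 3.5}
```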
  • the prosthetic apparatus may receive a sixth input 1124 that causes the prosthetic apparatus to operate in a tilting state 1110.
  • the tilting state 1110 may cause the prosthetic apparatus to tilt a cup towards the subject’s (user’s) mouth.
  • the sixth input 1124 may be an n-dimensional vector or stimulus vector as described herein.
  • signals from a camera may be used to guide the prosthetic apparatus in the tilting state 1110.
  • a camera can be used to locate a subject’s mouth and begin tilting the cup only after the camera detects that the cup is within a predetermined distance to the mouth.
  • the prosthetic apparatus may receive a seventh input 1125 that causes the prosthetic apparatus to operate in (return to) the raising/lowering state 1108.
  • the seventh input 1125 may be an n-dimensional vector or stimulus vector as described herein.
  • the prosthetic apparatus may lower a cup away from the subject’s mouth.
  • one or more IMU sensors may be used to hold/carry a cup in a level manner in the raising/lowering state 1108.
  • the prosthetic apparatus may receive an eighth input 1126 that causes the prosthetic apparatus to operate in (return to) the holding state 1106.
  • the eighth input 1126 may be an n-dimensional vector or stimulus vector as described herein.
  • the holding state 1106 may be entered to prepare to release the cup.
  • the prosthetic apparatus may receive a ninth input 1127 that causes the prosthetic apparatus to operate in the release state 1112. In the release state 1112, the prosthetic apparatus releases hold of the cup.
  • the ninth input 1127 may be an n-dimensional vector or stimulus vector as described herein.
  • the prosthetic apparatus may receive a tenth input 1128 that causes the prosthetic apparatus to operate in (return to) the reaching state 1102. For example, the user may want to pick up a glass and drink again.
  • the prosthetic apparatus may receive an eleventh input 1129 that causes the prosthetic apparatus to operate in (return to) the idle state 1101.
  • the tenth input 1128 and the eleventh input 1129 may each be an n-dimensional vector or stimulus vector as described herein. In some variations, the tenth input 1128 and the eleventh input 1129 may be different or the same n-dimensional vectors or stimulus vectors.
  • Any of the inputs described above function as input signals to cause a transition between any of the operating states.
  • the n-dimensional vectors or stimulus vectors may be used to find a closest matching point (nearest neighbor). If the vector is within a predetermined distance to the matching point, then operation of the prosthetic apparatus may proceed to a next operating state.
  • the same n-dimensional or stimulus vector may be used to transition the prosthetic apparatus to different operating states.
  • input 1122 and input 1129 may be the same n-dimensional or stimulus vector but may cause the prosthetic apparatus to operate in different states.
  • transitions between any states may be governed by a controller configured to execute a machine learning-based procedure (e.g., a trained neural network) to perform any feasible operation.
  • transition between any operating states may occur without the need for the user to provide input signals, such as myoelectric signals.
  • pressure sensor, touch sensor, IMU sensor, and/or camera signals may be used, in some cases with machine learning-based procedures, to move between different operating states and control the prosthetic apparatus.
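  • The sketch below captures the state-transition idea with a small lookup table loosely modeled on the reach-and-drink sequence of FIG. 11. The state and event names are assumptions standing in for stimulus vectors and sensor events, and the table shows how the same event can lead to different outcomes depending on the current state.

```python
# Assumed names for the operating states and gesture/sensor events of the
# reach-and-drink sequence; real inputs would be stimulus vectors or
# camera/IMU/pressure events rather than strings.
TRANSITIONS = {
    ("idle", "reach"): "reaching",
    ("reaching", "grasp"): "grasping",
    ("grasping", "contact"): "holding",       # pressure sensors report touch
    ("holding", "lift"): "raising_lowering",
    ("raising_lowering", "tilt"): "tilting",
    ("tilting", "lower"): "raising_lowering",
    ("raising_lowering", "set_down"): "holding",
    ("holding", "release"): "release",
    ("release", "reach"): "reaching",
    ("release", "rest"): "idle",
}

def next_state(current_state, event):
    """Return the next operating state, or stay put if the (state, event)
    pair is not a defined transition. The same event ("reach") leads to
    different outcomes depending on the current state."""
    return TRANSITIONS.get((current_state, event), current_state)

state = "idle"
for event in ("reach", "grasp", "contact", "lift", "tilt"):
    state = next_state(state, event)
print(state)  # "tilting"
```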
  • a prosthetic apparatus may be trained or may learn positions or configurations associated with each possible state.
  • transitions between states may be controlled by a combination of environmental conditions (IMU signals, pressure feedback signal, camera images, and the like) and/or myoelectric signals.
  • configurations associated with states may be reused for a variety of different operations. Different transitions may be used to train the apparatus to perform different operations. Machine learning procedures and training is described in more detail with respect to FIG. 17.
  • FIG. 19A illustrates one example of a cuff 1901 that may be configured as a cup, strap, and/or band that fits over the limb (e.g., over the residual limb portion).
  • the cuff shown in FIG. 19A is configured as a cylindrical strap, and includes a plurality of surface electromyography (S-EMG) electrodes arranged in an array of N electrodes 1905.
  • the electrodes may be arranged in any advantageous configuration, including radially-spaced columns, etc.
  • the cuff may include one or more additional sensors, including movement or position sensors.
  • the cuff includes an inertial measurement unit (IMU) 1903.
  • the cuff may also include control circuitry that may process the signals on/from the electrodes, IMU, etc.
  • the control circuitry may include a memory.
  • the cuff may also be configured to provide tactile feedback, e.g., haptic feedback.
  • the cuff includes an array of haptic feedback outputs that may be driven by the controller, including by the control circuitry on the device.
  • the cuff may also include a power supply (e.g., battery 1909).
  • the apparatus includes a spatial array of haptic actuators 1907 configured to create a richer set of feedback to the user, which may be important for finer-grain control interactions.
  • the haptic feedback (outputs) may vibrate under the direction of the controller and/or control circuitry.
  • FIG. 19B illustrates one example of a successive actuation pattern of a spatial haptic actuator array that can be used to provide a richer set of feedback to the user than possible with just single actuators.
  • the timing pattern of the individual haptic actuators (1-6) is shown.
  • the pattern of haptic feedback (e.g., spatial and timing pattern) may be coordinated by the control circuitry with a particular set of movements and/or feedback (e.g., force sensors, pressure sensors, etc.) on the limb and/or on the hand, finger(s), wrist and/or forearm portion of the limb.
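  • A hedged sketch of a successive actuation pattern like the one in FIG. 19B, assuming each haptic actuator exposes simple on()/off() calls; the driver interface and timing values are illustrative assumptions.

```python
import time

def play_haptic_pattern(actuators, on_time_s=0.08, gap_s=0.04):
    """Drive a spatial haptic array in succession (actuator 1, then 2, ...),
    as in the timing pattern of FIG. 19B."""
    for actuator in actuators:
        actuator.on()
        time.sleep(on_time_s)
        actuator.off()
        time.sleep(gap_s)

class _PrintActuator:
    """Stand-in actuator so the sketch runs without hardware."""
    def __init__(self, index): self.index = index
    def on(self):  print(f"actuator {self.index} on")
    def off(self): print(f"actuator {self.index} off")

play_haptic_pattern([_PrintActuator(i) for i in range(1, 7)])
```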
  • FIG. 13 is a flowchart showing an example method 1300 for operating a prosthetic apparatus. Some examples may perform the operations described herein with additional operations, fewer operations, operations in a different order, operations in parallel, or with some operations performed differently. The method 1300 is described below with respect to the control system 200 of FIGS. 2A and 2B; however, the method 1300 may be performed by any other suitable system or device.
  • the method 1300 begins in block 1302 as a controller determines a current operating state of the prosthetic apparatus.
  • the controller may be any feasible controller in the control system 200, such as a central controller.
  • the current (or any) operating state may describe, among other things, a position, relative position, actuator position, orientation, or the like of the prosthetic apparatus.
  • Actuator positions may include actuator extension, retraction, rotation, or the like. In some variations, the actuator positions may be based on user training of the prosthetic apparatus.
  • operating states may be associated with, or described by, a state diagram such as the state diagram 900 of FIG. 9, the state diagram 1000 of FIG. 10A, the state diagram 1050 of FIG. 10B, the state diagram 1100 of FIG. 11, or any other feasible state diagram.
  • the controller receives EMG signals from electrodes.
  • the EMG signals may be received from any feasible sensors or electrodes.
  • the controller may receive additional signals.
  • the additional signals may include IMU signals, pressure sensor signals, camera signals, or other environmental signals.
  • the controller forms a stimulus vector from the EMG signals.
  • the stimulus vector may be an n-dimensional vector as described with respect to FIGS. 8A and 8B.
  • the stimulus vector may originate at an origin such as the origin 810.
  • the additional signals described with respect to block 1304 may be used to modify the stimulus vector. In some other cases, the additional signals may be used independently or in conjunction with the stimulus vector.
  • the controller transitions the prosthetic apparatus to operate in a next operating state based on the stimulus vector.
  • a closest matching point (e.g., a nearest neighbor or centroid) may be determined for the stimulus vector.
  • the closest matching point may be used to determine the next operating state.
  • the same stimulus vector may be used to transition the prosthetic apparatus to different operating states.
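  • To make the flow of method 1300 concrete, the following hedged Python sketch forms a stimulus vector from EMG channel values and selects the next operating state by nearest-centroid matching. The centroid values, state names, and per-state candidate sets are illustrative assumptions rather than the claimed procedure; keying the candidates on the current state is what lets the same stimulus vector lead to different next states.

```python
import numpy as np

# Hypothetical per-state centroids learned during training: which centroids are
# consulted depends on the current state, so the same stimulus vector can lead
# to different next states from different starting states.
CENTROIDS_BY_STATE = {
    "rest":  {"rest":  np.array([0.05, 0.05, 0.05, 0.05]),
              "reach": np.array([0.60, 0.20, 0.10, 0.05])},
    "reach": {"reach": np.array([0.60, 0.20, 0.10, 0.05]),
              "grasp": np.array([0.30, 0.70, 0.50, 0.20])},
}

def form_stimulus_vector(emg_channels, extra=None):
    """Build an n-dimensional stimulus vector from EMG samples; additional
    signals (e.g., IMU or pressure) may optionally modify it."""
    v = np.asarray(emg_channels, dtype=float)
    if extra is not None:
        v = v + np.asarray(extra, dtype=float)
    return v

def next_operating_state(current_state, stimulus_vector):
    """Pick the candidate state whose centroid is nearest the stimulus vector."""
    candidates = CENTROIDS_BY_STATE[current_state]
    return min(candidates,
               key=lambda name: np.linalg.norm(stimulus_vector - candidates[name]))

vec = form_stimulus_vector([0.55, 0.25, 0.12, 0.06])
print(next_operating_state("rest", vec))   # -> "reach"
print(next_operating_state("reach", vec))  # -> "reach" (closer than "grasp")
```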
  • FIG. 14 shows an example system 1400 for training a prosthetic apparatus.
  • the system may include a smartphone 1410 and a user 1420.
  • the system 1400 may include other connected compute devices (not shown).
  • the system 1400 may include compute nodes, processors, processing devices, and the like that may be coupled (wired or wirelessly) to the smartphone 1410.
  • a camera within the smartphone 1410 may be used to capture a desired motion for a prosthetic apparatus.
  • the user 1420 may show or demonstrate a desired motion that is captured by the smartphone 1410.
  • complex motions and/or gestures may be captured and used to train the user’s prosthetic apparatus.
  • the captured images and/or video data may be sent to a remote processor, processing unit, server, or the like to execute one or more machine learning-based procedures to determine control signals for the prosthetic apparatus and to have the prosthetic apparatus perform the desired motion.
  • the system 1400 may use one or more trained neural networks to recognize actions or gestures (captured through the smartphone 1410) and determine how to control the prosthetic apparatus accordingly.
  • the training of a prosthetic apparatus by the system 1400 may be performed by each user of the prosthetic apparatus.
  • each user may train all aspects and motions to be performed by the prosthetic apparatus.
  • the system 1400 may be used to train a prosthetic apparatus for any user. That is, the system 1400 may provide a generic basis of trained programming for a prosthetic apparatus that may later be customized for any specific user.
  • training of the prosthetic apparatus may use other machine learning-based procedures.
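  • One plausible way to realize the "generic basis later customized for a specific user" idea is transfer learning: a gesture classifier pre-trained across many users is fine-tuned on a handful of one user's camera-captured demonstrations. The PyTorch sketch below is a hypothetical illustration; the feature size, gesture count, and network shape are assumptions, and the document does not specify this particular technique.

```python
import torch
from torch import nn

NUM_POSE_FEATURES = 16   # assumed size of a pose descriptor extracted from camera frames
NUM_GESTURES = 4         # assumed number of trainable gestures

# Hypothetical "generic" gesture classifier; in practice its weights would be
# pre-trained across many users before per-user customization.
model = nn.Sequential(nn.Linear(NUM_POSE_FEATURES, 32), nn.ReLU(),
                      nn.Linear(32, NUM_GESTURES))

def fine_tune(model, user_features, user_labels, epochs=20, lr=1e-3):
    """Adapt the generic model to one user's demonstrated gestures."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(user_features), user_labels)
        loss.backward()
        optimizer.step()
    return model

# A handful of (synthetic) user demonstrations captured via the smartphone camera.
features = torch.randn(12, NUM_POSE_FEATURES)
labels = torch.randint(0, NUM_GESTURES, (12,))
fine_tune(model, features, labels)
```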
  • the cuff 210 of FIGS. 2A and 2B may include the plurality of myoelectric electrodes 751 of FIG. 7B.
  • the cuff 210 may be positioned or oriented in a variety of different positions with respect to the user’s arm. The position of the cuff 210 may affect which myoelectric electrodes 751 receive particular signals from the user.
  • One or more machine learning-based procedures may use information from the IMU 752 along with one or more signals from the myoelectric electrodes 751 to determine the physical relationship (orientation) between the myoelectric electrodes 751 and the user’s arm.
  • the system 1400 can automatically determine how the cuff 210 and/or the myoelectric electrodes 751 are positioned with respect to the user’s arm.
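  • A minimal sketch of automatic cuff-orientation estimation, under simplifying assumptions: it ignores the IMU and only circularly cross-correlates the current EMG activation profile against a reference profile recorded at training time, then remaps the electrode channels by the estimated shift. The electrode count and profiles are hypothetical; the document describes combining IMU information with the myoelectric signals.

```python
import numpy as np

NUM_ELECTRODES = 8  # assumed electrode count around the cuff circumference

def estimate_rotation(reference_profile, current_profile):
    """Estimate how many electrode positions the cuff has rotated since training
    by finding the circular shift that best aligns the two activation profiles."""
    ref = np.asarray(reference_profile, float)
    cur = np.asarray(current_profile, float)
    scores = [np.dot(ref, np.roll(cur, -shift)) for shift in range(NUM_ELECTRODES)]
    return int(np.argmax(scores))

def remap_channels(emg_sample, shift):
    """Undo the estimated rotation so previously trained mappings still apply."""
    return np.roll(np.asarray(emg_sample, float), -shift)

reference = [0.1, 0.8, 0.9, 0.3, 0.1, 0.1, 0.1, 0.2]   # profile at training time
observed  = list(np.roll(reference, 2))                 # cuff rotated by 2 positions
shift = estimate_rotation(reference, observed)
print(shift, remap_channels(observed, shift))           # -> 2, profile re-aligned
```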
  • a machine learning-based procedure (such as a trained neural network) is used to minimize training associated with controlling the prosthetic apparatus. For example, a user may perform extensive initial training exercises with a prosthetic apparatus. At a later time (another day, for example), the user again controls the prosthetic apparatus. However, due to repositioning of the prosthetic apparatus or slightly different user appendage positioning, the previous training of the prosthetic apparatus becomes inapplicable. Thus, the user may need to retrain the prosthetic apparatus.
  • a machine learning-based procedure may advantageously be used to modify or adapt previous training information.
  • the user may perform initial training to grasp a cup with a prosthetic apparatus as described with respect to the state diagram of FIG. 11.
  • the initial training may begin with the user’s arm at rest hanging straight down.
  • the initial training may be based on the myoelectric electrodes 751 sensing little or no signals (because the user’s arm is at rest).
  • the user may want to initiate the trained cup grasping motion, but starting from a position when the arm is not hanging straight down, but instead partially inclined.
  • a machine learning-based procedure may determine myoelectric signal differences between partially inclined and at rest arm positions.
  • the machine learning-based procedure may determine variations between different myoelectric signals intelligently.
  • the machine learning-based procedure can then modify the previous training data to enable the user to grasp a cup beginning from a position where the arm is partially inclined.
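  • As a deliberately simple stand-in for the learned adaptation described above, the sketch below subtracts a posture-dependent baseline (estimated from an IMU inclination angle) from the raw EMG before matching against training data captured with the arm at rest. The sinusoidal posture model and per-channel gains are assumptions, not the machine-learning-based procedure of the document.

```python
import numpy as np

def inclination_baseline(inclination_deg, per_channel_gain):
    """Hypothetical model of the extra myoelectric activity caused by holding
    the arm partially inclined rather than fully at rest."""
    activation = np.sin(np.radians(max(0.0, inclination_deg)))
    return activation * np.asarray(per_channel_gain, float)

def adapt_stimulus(raw_emg, inclination_deg, per_channel_gain):
    """Remove the posture-dependent component so training captured with the
    arm hanging straight down can still be matched."""
    corrected = np.asarray(raw_emg, float) - inclination_baseline(
        inclination_deg, per_channel_gain)
    return np.clip(corrected, 0.0, None)

gains = [0.2, 0.5, 0.1, 0.05]           # assumed per-channel posture sensitivity
raw   = [0.25, 0.60, 0.18, 0.10]        # EMG with the arm inclined ~30 degrees
print(adapt_stimulus(raw, 30.0, gains)) # roughly the at-rest training signature
```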
  • FIG. 15 shows an image 1500 of a user playing piano with a prosthetic apparatus.
  • the prosthetic apparatus may include one or more controllers configured to actuate or control portions of the prosthetic apparatus based on signals received from sensors.
  • control of the prosthetic apparatus may be state based (such as described with respect to FIG. 11) or may be based only on current sensor signals, such as myoelectric signals.
  • the prosthetic apparatus may be trained using the system 1400 of FIG. 14.
  • FIGS. 16A and 16B show images of actions that require individual control and/or articulation of finger joints.
  • Image 1600 shows a person playing a piano and image 1650 shows a person playing a guitar. Both illustrated activities show how each finger may need to be individually articulated and controlled.
  • the system 1400 may be used to train any feasible prosthetic apparatus to perform any action, including actions that require individual control and/or articulation of finger joints.
  • FIG. 17 is a flowchart showing an example method 1700 for training a prosthetic apparatus. Some examples may perform the operations described herein with additional operations, fewer operations, operations in a different order, operations in parallel, and some operations differently. The method 1700 is described below with respect to the system 1400 of FIG. 14, however, the method 1700 may be performed by any other suitable system or device.
  • the method 1700 begins in block 1702 as a camera receives one or more images of a user’s limb.
  • the camera, which may be included with the smartphone 1410, captures poses or motions of the user’s limb.
  • the user may want a prosthetic limb (apparatus) to operate in a manner similar to his/her real limb as captured by the camera.
  • a processing unit matches one or more images captured by the camera to one or more configurations of a prosthetic apparatus. For example, the processing unit may determine a closest match between a known prosthetic configuration (pose, position, arrangement, or the like) and any of the images captured by the camera.
  • Next, in block 1706, the processing unit determines a label for each of the determined prosthetic configurations. In some cases, determining the label may associate an operating state with the configuration of the prosthetic apparatus. Thus, a number of operating states may be associated with each label. The operating states may form a basis for a state diagram, such as the state diagram 1100 of FIG. 11.
  • the processing unit associates stimulus vectors with the one or more labeled prosthetic configurations. In this manner, different stimulus vectors may be used to transition operations to different operating states.
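  • The following hypothetical Python sketch walks through blocks 1704-1708 of method 1700: matching an upstream pose estimate to the closest known prosthetic configuration, labeling each captured pose, and associating stimulus vectors with the labeled configurations. The configuration library, pose features, and helper names are assumptions for illustration.

```python
import numpy as np

# Hypothetical library of known prosthetic configurations (e.g., joint positions).
KNOWN_CONFIGURATIONS = {
    "open_hand":  np.array([0.0, 0.0, 0.0]),
    "pinch":      np.array([0.2, 0.9, 0.1]),
    "power_grip": np.array([0.8, 0.8, 0.8]),
}

def match_configuration(pose_estimate):
    """Block 1704: find the known configuration closest to a pose estimated
    from a camera image (pose estimation itself is assumed to happen upstream)."""
    return min(KNOWN_CONFIGURATIONS,
               key=lambda k: np.linalg.norm(pose_estimate - KNOWN_CONFIGURATIONS[k]))

def label_configurations(pose_estimates):
    """Blocks 1704-1706: label each captured pose with an operating state."""
    return [match_configuration(np.asarray(p, float)) for p in pose_estimates]

def associate_stimuli(labels, stimulus_vectors):
    """Block 1708: map each stimulus vector to its labeled configuration so
    that it can later drive a transition into that operating state."""
    return dict(zip(map(tuple, stimulus_vectors), labels))

poses = [[0.1, 0.1, 0.0], [0.75, 0.85, 0.7]]
labels = label_configurations(poses)                 # -> ["open_hand", "power_grip"]
print(associate_stimuli(labels, [[0.05, 0.1], [0.7, 0.6]]))
```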
  • FIG. 18 shows a block diagram of a device 1800 that may be one example of the control system 200, 200’ of FIGS. 2A and 2B. Although described herein as a device, the functionality of the device 1800 may be performed by any feasible apparatus, system, or method.
  • the device 1800 may include a communication interface 1810, a processor 1830, and a memory 1840.
  • the communication interface 1810 may transmit signals to and receive signals from other wired or wireless devices, including remote (e.g., cloud-based) storage devices, cameras, processors, compute nodes, processing nodes, computers, mobile devices (e.g., cellular phones, tablet computers and the like) and/or displays.
  • the communication interface 1810 may include wired (e.g., serial, ethernet, or the like) and/or wireless (Bluetooth, Wi-Fi, cellular, or the like) transceivers that may communicate with any other feasible device through any feasible network.
  • the communication interface 1810 may receive sensor data from sensors 1820 and/or image data from camera 1825. Although shown here as separate from the device 1800, in some variations, the sensors 1820 and the camera 1825 may be integral with (included within) the device 1800.
  • the processor 1830, which is also coupled to the memory 1840, may be any one or more suitable processors capable of executing scripts or instructions of one or more software programs stored in the device 1800 (such as within memory 1840).
  • the memory 1840 may include prosthetic operating states 1841.
  • an operating state may describe a configuration, position, arrangement, or the like of a prosthetic apparatus.
  • the prosthetic operating states may be arranged or linked together to form a state diagram.
  • a state diagram may be used to control and/or describe operations of the prosthetic apparatus.
  • the memory 1840 may also include a non-transitory computer-readable storage medium (e.g., one or more nonvolatile memory elements, such as EPROM, EEPROM, Flash memory, a hard drive, etc.) that may store a signal processing software (SW) module 1842, an actuator control SW module 1843, a machine learning SW module 1844, and a communication SW module 1845.
  • Each software module includes program instructions that, when executed by the processor 1830, may cause the device 1800 to perform the corresponding function(s).
  • the non-transitory computer-readable storage medium of memory 1840 may include instructions for performing all or a portion of the operations described herein.
  • the processor 1830 may execute the signal processing SW module 1842 to receive and process any feasible signals from the sensors 1820 and/or the camera 1825.
  • the signal processing SW module 1842 may sample, average, and/or filter signals from myoelectric sensors, IMUs, cameras, etc.
  • the signal processing SW module 1842 may form one or more stimulus vectors as described with respect to FIGS. 8A and 8B. The stimulus vectors may be used to transition the device 1800 to operate within one or more of the states described in the prosthetic operating states 1841.
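  • As one simple, commonly used example of the kind of processing the signal processing SW module 1842 might perform (an assumption, not the specific implementation), the sketch below rectifies each EMG channel, smooths it with a moving average to obtain an envelope, and collapses each channel to a mean activation that could serve as one element of a stimulus vector.

```python
import numpy as np

def emg_envelope(raw_window, fs_hz=1000, window_ms=150):
    """Rectify a window of raw EMG and smooth it with a moving average to
    obtain a per-channel activation level (one common, simple envelope method)."""
    x = np.abs(np.asarray(raw_window, dtype=float))       # full-wave rectification
    n = max(1, int(fs_hz * window_ms / 1000))
    kernel = np.ones(n) / n
    return np.convolve(x, kernel, mode="valid")

def channel_features(multichannel_window):
    """Collapse each channel's envelope to a single mean activation value,
    yielding one element of the stimulus vector per electrode."""
    return np.array([emg_envelope(ch).mean() for ch in multichannel_window])

# Synthetic example: 4 channels x 1 second of noise-like EMG at 1 kHz.
rng = np.random.default_rng(0)
window = rng.normal(0.0, [[0.05], [0.6], [0.3], [0.1]], size=(4, 1000))
print(channel_features(window))   # larger values on the more active channels
```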
  • the processor 1830 may execute the actuator control SW module 1843 to actuate, control, move, and/or position any of the actuators 1821.
  • the processor 1830 may execute the actuator control SW module 1843 in response to an operating state of the prosthetic device.
  • the processor 1830 may execute the machine learning SW module 1844 to learn configurations of the prosthetic device.
  • the machine learning SW module 1844 may include a neural network that is trained to match images with configurations of the prosthetic apparatus.
  • the processor 1830 may execute the communication SW module 1845 to transmit and/or receive data with other devices, processors, or the like. Execution of the communication SW module 1845 may cause the processor 1830 to transmit and/or receive data through the communication interface 1810.
  • any of these apparatuses may also or alternatively refer to robotic apparatuses.
  • these apparatuses may be part of a robotic manipulator that is automatically or semi-automatically manipulated.
  • any of the methods (including user interfaces) described herein may be implemented as software, hardware or firmware, and may be described as a non-transitory computer-readable storage medium storing a set of instructions capable of being executed by a processor (e.g., computer, tablet, smartphone, etc.), that when executed by the processor causes the processor to perform any of the steps, including but not limited to: displaying, communicating with the user, analyzing, modifying parameters (including timing, frequency, intensity, etc.), determining, alerting, or the like.
  • any of the methods described herein may be performed, at least in part, by an apparatus including one or more processors having a memory storing a non-transitory computer-readable storage medium storing a set of instructions for the process(es) of the method.
  • computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein.
  • these computing device(s) may each comprise at least one memory device and at least one physical processor.
  • “memory” or “memory device,” as used herein, generally represents any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions.
  • a memory device may store, load, and/or maintain one or more of the modules described herein.
  • Examples of memory devices comprise, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
  • “processor” or “physical processor,” as used herein, generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions.
  • a physical processor may access and/or modify one or more modules stored in the above-described memory device.
  • Examples of physical processors comprise, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
  • the method steps described and/or illustrated herein may represent portions of a single application.
  • one or more of these steps may represent or correspond to one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks, such as the method step.
  • one or more of the devices described herein may transform data, physical devices, and/or representations of physical devices from one form to another. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form of computing device to another form of computing device by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
  • “computer-readable medium” generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions.
  • Examples of computer-readable media comprise, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
  • the processor as described herein can be configured to perform one or more steps of any method disclosed herein. Alternatively or in combination, the processor can be configured to combine one or more steps of one or more methods as disclosed herein.
  • the device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
  • the terms “upwardly”, “downwardly”, “vertical”, “horizontal” and the like are used herein for the purpose of explanation only unless specifically indicated otherwise.
  • Although the terms “first” and “second” may be used herein to describe various features/elements (including steps), these features/elements should not be limited by these terms, unless the context indicates otherwise. These terms may be used to distinguish one feature/element from another feature/element. Thus, a first feature/element discussed below could be termed a second feature/element, and similarly, a second feature/element discussed below could be termed a first feature/element without departing from the teachings of the present invention.
  • any of the apparatuses and methods described herein should be understood to be inclusive, but all or a sub-set of the components and/or steps may alternatively be exclusive and may be expressed as “consisting of” or alternatively “consisting essentially of” the various components, steps, sub-components or sub-steps.
  • As used herein in the specification and claims, including as used in the examples and unless otherwise expressly specified, all numbers may be read as if prefaced by the word “about” or “approximately,” even if the term does not expressly appear.
  • a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/- 1% of the stated value (or range of values), +/- 2% of the stated value (or range of values), +/- 5% of the stated value (or range of values), +/- 10% of the stated value (or range of values), etc.
  • Any numerical values given herein should also be understood to include about or approximately that value, unless the context indicates otherwise. For example, if the value “10” is disclosed, then “about 10” is also disclosed.
  • any numerical range recited herein is intended to include all sub-ranges subsumed therein. It is also understood that when a value is disclosed, “less than or equal to” the value, “greater than or equal to” the value, and possible ranges between values are also disclosed, as appropriately understood by the skilled artisan. For example, if the value “X” is disclosed, then “less than or equal to X” as well as “greater than or equal to X” (e.g., where X is a numerical value) is also disclosed. It is also understood that throughout the application, data is provided in a number of different formats, and that this data represents endpoints and starting points, and ranges for any combination of the data points.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Transplantation (AREA)
  • Vascular Medicine (AREA)
  • Cardiology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Prostheses (AREA)

Abstract

Methods and apparatuses (devices, systems, etc.) for controlling a prosthetic limb apparatus are described. The apparatuses may include a plurality of controllers, actuators, and sensors for receiving electromyographic input. The sensor input may be formed into an n-dimensional stimulus vector that is used to transition the prosthetic apparatus through a plurality of operating states to more precisely and consistently control the operation of the prosthetic limb apparatus.
PCT/US2024/013385 2023-01-28 2024-01-29 Système de commande pour appareil à membre prothétique WO2024159228A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363482033P 2023-01-28 2023-01-28
US63/482,033 2023-01-28

Publications (1)

Publication Number Publication Date
WO2024159228A1 true WO2024159228A1 (fr) 2024-08-02

Family

ID=91971224

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/013385 WO2024159228A1 (fr) 2023-01-28 2024-01-29 Système de commande pour appareil à membre prothétique

Country Status (1)

Country Link
WO (1) WO2024159228A1 (fr)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030173076A1 (en) * 2002-03-13 2003-09-18 Sheiretov Todor K. Constant force actuator
US20150343218A1 (en) * 2007-06-20 2015-12-03 Michael Goorevich Optimizing Operational Control of a Hearing Prosthesis
WO2011026086A1 (fr) * 2009-08-31 2011-03-03 Iwalk, Inc. Prothèse ou orthèse des membres inférieurs utilisée pour se mettre debout
US20160051382A1 (en) * 2010-11-22 2016-02-25 Vanderbilt University Control system for a grasping device
US20190344075A1 (en) * 2015-12-22 2019-11-14 Ecole Polytechnique Federale De Lausanne (Epfl) System for selective spatiotemporal stimulation of the spinal cord
WO2018111138A1 (fr) * 2016-12-14 2018-06-21 Общество с ограниченной ответственностью "Бионик Натали" Procédé et système de commande d'extrémité bionique intelligente
US20210365114A1 (en) * 2017-11-13 2021-11-25 Bios Health Ltd Neural interface
CN108814778A (zh) * 2018-07-19 2018-11-16 郭伟超 一种肌电仿人灵巧假肢手级联控制方法和系统
WO2022223466A1 (fr) * 2021-04-19 2022-10-27 Luca Miller Prothèse bionique percevant l'environnement

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AARON FLEMING; NICOLE STAFFORD; STEPHANIE HUANG; XIAOGANG HU; DANIEL P FERRIS; HE (HELEN) HUANG: "Myoelectric control of robotic lower limb prostheses: a review of electromyography interfaces, control paradigms, challenges and future directions", JOURNAL OF NEURAL ENGINEERING, INSTITUTE OF PHYSICS PUBLISHING, BRISTOL, GB, vol. 18, no. 4, 27 July 2021 (2021-07-27), GB , pages 041004, XP020368243, ISSN: 1741-2552, DOI: 10.1088/1741-2552/ac1176 *

Similar Documents

Publication Publication Date Title
US10990174B2 (en) Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors
US20230023282A1 (en) Systems and methods for postural control of a multi-function prosthesis
US10076425B2 (en) Control of limb device
Zhang et al. Design and functional evaluation of a dexterous myoelectric hand prosthesis with biomimetic tactile sensor
WO2019147949A1 (fr) Traitement en temps réel d'estimations de modèle de représentation d'état de main
EP3742961A1 (fr) Techniques d'étalonnage pour modélisation de représentation d'état de main à l'aide de signaux neuromusculaires
US10543111B2 (en) Biomimetic controller for increased dexterity prosthesis
Laghi et al. Shared-autonomy control for intuitive bimanual tele-manipulation
Yap et al. Design of a wearable FMG sensing system for user intent detection during hand rehabilitation with a soft robotic glove
Cipriani et al. Influence of the weight actions of the hand prosthesis on the performance of pattern recognition based myoelectric control: preliminary study
Romeo et al. Method for automatic slippage detection with tactile sensors embedded in prosthetic hands
CN109498375B (zh) 一种人体运动意图识别控制装置及控制方法
Kirchner et al. Intuitive interaction with robots–technical approaches and challenges
Penaloza et al. Towards intelligent brain-controlled body augmentation robotic limbs
He et al. Vision-based assistance for myoelectric hand control
Starke et al. Semi-autonomous control of prosthetic hands based on multimodal sensing, human grasp demonstration and user intention
Fajardo et al. User-prosthesis interface for upper limb prosthesis based on object classification
Schabron et al. Integration of forearm sEMG signals with IMU sensors for trajectory planning and control of assistive robotic arm
Murillo et al. Individual robotic arms manipulator control employing electromyographic signals acquired by myo armbands
Gardner et al. An unobtrusive vision system to reduce the cognitive burden of hand prosthesis control
Gupta MAC-MAN
Sharma et al. Design and implementation of robotic hand control using gesture recognition
Woodward et al. Integrated grip switching and grasp control for prosthetic hands using fused inertial and mechanomyography measurement
Hioki et al. Estimation of finger joint angles from sEMG using a recurrent neural network with time-delayed input vectors
WO2024159228A1 (fr) Système de commande pour appareil à membre prothétique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24747953

Country of ref document: EP

Kind code of ref document: A1