CN117784941A - Gesture control method of bionic hand, storage medium, control device and bionic hand

Gesture control method of bionic hand, storage medium, control device and bionic hand

Info

Publication number
CN117784941A
CN117784941A
Authority
CN
China
Prior art keywords
action
bionic hand
hand
intention
target
Prior art date
Legal status
Pending
Application number
CN202410202068.1A
Other languages
Chinese (zh)
Inventor
韩璧丞
汪文广
阿迪斯
Current Assignee
Zhejiang Qiangnao Technology Co ltd
Original Assignee
Zhejiang Qiangnao Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Qiangnao Technology Co ltd filed Critical Zhejiang Qiangnao Technology Co ltd
Priority to CN202410202068.1A priority Critical patent/CN117784941A/en
Publication of CN117784941A publication Critical patent/CN117784941A/en
Legal status: Pending (Current)

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture control method of a bionic hand, which comprises the steps of: obtaining an action intention set of a wearer of the bionic hand for a predetermined object; detecting an electromyographic signal through an electromyographic sensor of the bionic hand, and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention; and controlling the bionic hand to execute the corresponding gesture action according to the target action intention. The invention also discloses a computer-readable storage medium on which a computer program is stored, the computer program implementing the steps of the gesture control method of the bionic hand when executed by a processor. The invention also discloses a control device comprising a memory and a processor. The invention also discloses a bionic hand comprising the control device and a myoelectric sensor. According to this technical solution, the sensitivity of the bionic hand is improved.

Description

Gesture control method of bionic hand, storage medium, control device and bionic hand
Technical Field
The invention relates to the technical field of bionic hands, and in particular to a gesture control method of a bionic hand, a storage medium, a control device and a bionic hand.
Background
An intelligent bionic hand is an intelligent product in which brain-computer interface technology is highly integrated with artificial intelligence algorithms. By extracting the electromyographic signals of the wearer's arm, the intelligent bionic hand can identify the wearer's movement intention and convert it into actions of the bionic hand, so that upper-limb amputees can improve their life experience with the bionic hand.
At present, methods for recognizing the motion intention of an intelligent bionic hand based on electromyographic signals generally match the currently collected electromyographic signal one by one against all gesture templates to obtain the gesture the wearer intends to make. This approach takes a long time to recognize and is inefficient, so the reaction speed of the bionic hand is slow and its sensitivity is affected.
Disclosure of Invention
The invention provides a gesture control method of a bionic hand, a storage medium, a control device and a bionic hand, aiming to improve the sensitivity of the bionic hand.
In order to achieve the above purpose, the present invention adopts the following technical scheme:
the invention provides a gesture control method of a bionic hand, which comprises the following steps:
acquiring an action intention set of a bionic hand wearer for a predetermined object;
detecting an electromyographic signal through the electromyographic sensor of the bionic hand, and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention;
and controlling the bionic hand to execute a corresponding gesture action according to the target action intention.
In some embodiments, the obtaining an action intention set of the bionic hand wearer for a predetermined object comprises:
acquiring an image of a predetermined object;
analyzing features of the image according to a pattern recognition algorithm to recognize the predetermined object;
and determining an action intention set corresponding to the predetermined object.
In some embodiments, the controlling the bionic hand to perform the corresponding gesture action according to the target action intention includes:
analyzing the predetermined force required by the bionic hand to execute the gesture action according to the predetermined object and the target action corresponding to the target action intention;
and controlling the bionic hand to execute the gesture action according to the predetermined force and the target action.
In some embodiments, after the controlling the bionic hand to perform the gesture action according to the predetermined force and the target action, the method further comprises:
monitoring the contact force between the bionic hand and the predetermined object in real time;
and when the contact force increases to the predetermined force, controlling the bionic hand to stop acting.
In some embodiments, the detecting the electromyographic signal by the electromyographic sensor of the bionic hand and matching the electromyographic signal with the actions corresponding to the action intention set to determine the target action intention includes:
determining a first action template and a second action template among the actions corresponding to the action intention set according to history matching data, wherein the history matching data includes a history matching success rate;
matching the electromyographic signal with the first action template and/or the second action template according to the history matching success rate to obtain matching information;
and determining the target action intention according to the matching information.
In some embodiments, the determining the target action intent from the matching information comprises:
obtaining the maximum similarity from the matching information;
and determining the action corresponding to the maximum similarity to determine the target action intention corresponding to the action.
In some embodiments, the detecting the electromyographic signal by the electromyographic sensor of the bionic hand and matching the electromyographic signal with the actions corresponding to the action intention set to determine the target action intention further includes:
determining a third action template and a fourth action template among the actions corresponding to the action intention set according to the degree of association between the predetermined object and the actions corresponding to the action intention set;
matching the electromyographic signal with the third action template and/or the fourth action template according to the degree of association to obtain matching information;
and determining the target action intention according to the matching information.
The present invention further proposes a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the gesture control method of the bionic hand described in the above embodiments.
The invention further proposes a control device comprising a memory and a processor, the memory being configured to store a computer program; the processor is configured to implement the steps of the gesture control method of the bionic hand according to the above embodiment when executing the computer program.
The invention further provides a bionic hand, which comprises a control device and a myoelectric sensor, wherein the control device is the control device of the above embodiment, and the myoelectric sensor is used for detecting a myoelectric signal and outputting it to the control device.
According to the technical solution of the gesture control method of the bionic hand, the action intention set of the bionic hand wearer for a predetermined object is first acquired; an electromyographic signal is then detected by the electromyographic sensor of the bionic hand and matched with the actions corresponding to the action intention set, so that the target action intention of the wearer for the predetermined object can be determined; the bionic hand is then controlled to execute the corresponding gesture action according to the target action intention. In this way, the efficiency with which the bionic hand recognizes the wearer's action intention can be improved, and the sensitivity of the bionic hand is further improved.
Drawings
FIG. 1 is a schematic flowchart of a gesture control method of a bionic hand according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of obtaining an action intention set according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of controlling a bionic hand to perform a gesture action according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of controlling a bionic hand to stop acting according to an embodiment of the invention;
FIG. 5 is a flowchart illustrating determining a target action intention according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating determining a target action intention according to another embodiment of the present invention;
FIG. 7 is a flowchart illustrating determining a target action intention according to another embodiment of the present invention;
fig. 8 is a schematic structural diagram of a control device according to an embodiment of the invention.
The objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings; it is evident that the described embodiments are only some, and not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
It should be noted that all directional indicators (such as up, down, left, right, front and rear) in the embodiments of the present invention are merely used to explain the relative positional relationship, movement, etc. between the components in a particular posture (as shown in the drawings); if the particular posture changes, the directional indicator changes accordingly.
It will also be understood that when an element is referred to as being "mounted" or "disposed" on another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present.
Furthermore, the descriptions "first", "second", etc. in this disclosure are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, provided that the combination can be realized by those skilled in the art; when technical solutions are contradictory or cannot be realized, their combination should be considered non-existent and outside the scope of protection claimed in the present invention.
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The invention provides a gesture control method for a bionic hand.
Referring to fig. 1, fig. 1 is a schematic flowchart of a gesture control method of a bionic hand according to an embodiment of the invention.
The gesture control method of the bionic hand in the embodiment of the application comprises the following steps:
step S10, acquiring an action intention set of a bionic hand wearer for a predetermined object;
step S20, detecting an electromyographic signal through the electromyographic sensor of the bionic hand, and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention;
step S30, controlling the bionic hand to execute a corresponding gesture action according to the target action intention.
In step S10, by acquiring the action intention set of the bionic hand wearer for a predetermined object, the bionic hand can predict the wearer's action intention. The predetermined object may be an object on which the bionic hand wearer wants to perform an action through the bionic hand; for example, if the predetermined object is a cup, the wearer may perform a picking-up action on the cup through the bionic hand. In this embodiment, the action intention set of the bionic hand wearer for the predetermined object may be the set of all action intentions that the wearer could carry out on the predetermined object; for example, when the predetermined object is a book, the action intention set may include grabbing and page turning.
In step S20, after the action intention set of the bionic hand wearer for the predetermined object is obtained, the electromyographic signal detected by the electromyographic sensor can be matched with the actions corresponding to the action intention set, so that the target action intention of the wearer can be determined. The target action intention is the action that the wearer ultimately wants to perform on the predetermined object. In this embodiment, when matching the electromyographic signal with the actions corresponding to the action intention set, the electromyographic signal may be matched with the actions in the set one by one or with all of them simultaneously. It should be noted that the myoelectric sensor in this embodiment may be disposed in the receiving cavity of the bionic hand.
In step S30, the control device of the bionic hand can control the bionic hand to execute the corresponding gesture action according to the obtained target action intention; the gesture action may be, for example, a grabbing action. In this embodiment, different gesture actions may be performed for different predetermined objects and target action intentions; for example, when the predetermined object is a piano and the target action intention is playing the piano, the bionic hand may be controlled to perform the gesture action of pressing a key.
According to the gesture control method of the bionic hand described above, the action intention set of the bionic hand wearer for the predetermined object is first acquired, the electromyographic signal is then detected by the electromyographic sensor of the bionic hand and matched with the actions corresponding to the action intention set so that the target action intention of the wearer for the predetermined object can be determined, and the bionic hand is finally controlled to execute the corresponding gesture action according to the target action intention. In this way, the efficiency with which the bionic hand recognizes the wearer's action intention can be improved, and the sensitivity of the bionic hand is improved.
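As a rough illustration of how steps S10 to S30 fit together, the Python sketch below strings the three steps into a single control cycle. It is a minimal sketch only: the helper callables (acquire_intention_set, read_emg, execute_gesture), the ActionTemplate structure and the cosine-similarity matcher are assumptions introduced for illustration, not details taken from the disclosure.

```python
# Minimal sketch of the S10 -> S20 -> S30 pipeline described above.
# All helper names, data layouts and the similarity metric are hypothetical placeholders.
from dataclasses import dataclass
from typing import Callable, List

import numpy as np


@dataclass
class ActionTemplate:
    name: str                 # e.g. "grab", "turn_page"
    emg_profile: np.ndarray   # reference EMG feature vector for this action


def similarity(emg: np.ndarray, template: ActionTemplate) -> float:
    """Cosine similarity between measured EMG features and a template (assumed metric)."""
    a, b = emg, template.emg_profile
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def control_cycle(
    acquire_intention_set: Callable[[], List[ActionTemplate]],  # step S10
    read_emg: Callable[[], np.ndarray],                         # EMG sensor readout
    execute_gesture: Callable[[str], None],                     # step S30 actuator call
) -> str:
    # Step S10: restrict matching to actions plausible for the recognized object.
    intention_set = acquire_intention_set()
    # Step S20: match the measured EMG signal only against this restricted set.
    emg = read_emg()
    target = max(intention_set, key=lambda t: similarity(emg, t))
    # Step S30: drive the bionic hand according to the target action intention.
    execute_gesture(target.name)
    return target.name
```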
Referring to fig. 2, fig. 2 is a flowchart illustrating a process of obtaining an action intention set according to an embodiment of the invention.
In some embodiments, the acquiring the action intention set of the bionic hand wearer for a predetermined object in step S10 comprises:
step S11, acquiring an image of the predetermined object;
step S12, analyzing the features of the image according to a pattern recognition algorithm to recognize the predetermined object;
step S13, determining an action intention set corresponding to the predetermined object.
In step S11, an image of the predetermined object is acquired so that the object can be identified and the action intention set of the bionic hand wearer for it can be determined. In this embodiment, the image of the predetermined object may be acquired by a camera mounted on the bionic hand, or by another intelligent wearable device; the image may be a picture or a video.
In step S12, based on the image of the predetermined object acquired in step S11, the features of the image may be analyzed using a pattern recognition algorithm, thereby recognizing the predetermined object. The features of the image may include the shape, color distribution and texture of the object.
In step S13, once the predetermined object has been confirmed in step S12, its corresponding action intention set may be determined; that is, by identifying the predetermined object, the set of all actions the bionic hand wearer may perform on it can be determined.
In this way, in this embodiment, an image of the predetermined object is acquired, the features of the image are analyzed by a pattern recognition algorithm to identify the predetermined object, and the action intention set corresponding to the predetermined object is determined. This narrows the range of action matching and thus improves the recognition efficiency of the bionic hand.
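A minimal sketch of steps S11 to S13, assuming that object recognition is delegated to some classifier and that the mapping from object labels to intention sets is a simple lookup table; the table contents and the helper names (capture_image, recognize_object) are hypothetical, not part of the disclosure.

```python
# Hypothetical mapping from recognized object labels to action intention sets (step S13).
OBJECT_TO_INTENTIONS = {
    "cup": ["grab", "lift"],
    "book": ["grab", "turn_page"],
    "piano": ["press_key"],
}


def acquire_intention_set(capture_image, recognize_object):
    """Steps S11-S13: capture an image, recognize the object, look up its intention set."""
    image = capture_image()                     # S11: camera on the hand or another wearable device
    label = recognize_object(image)             # S12: any pattern recognition / classification model
    return OBJECT_TO_INTENTIONS.get(label, [])  # S13: empty set if the object is not recognized
```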
Referring to fig. 3, fig. 3 is a flowchart illustrating a control procedure for performing a gesture by a bionic hand according to an embodiment of the invention.
In some embodiments, controlling the bionic hand to perform the corresponding gesture action according to the target action intention in step S30 includes:
step S31, analyzing the predetermined force required by the bionic hand to execute the gesture action according to the predetermined object and the target action corresponding to the target action intention;
step S32, controlling the bionic hand to execute the gesture action according to the predetermined force and the target action.
In step S31, the predetermined force required by the bionic hand to perform the gesture action is determined from the predetermined object and the target action corresponding to the target action intention, so that an appropriate force can be applied to the predetermined object when the bionic hand is controlled to perform the gesture action, thereby completing the target action intention of the wearer. The predetermined force can be obtained from a preset force template for the predetermined object and the target action stored in a database. For example, in this embodiment, if the predetermined object is a cup and the target action is gripping, the predetermined force required by the bionic hand to grip the cup can be obtained from the preset force template for gripping a cup in the database.
In step S32, by using the target action and the predetermined force obtained in step S31, the gesture action performed by the bionic hand can be controlled more accurately, improving the wearer's experience. For example, in this embodiment, if the predetermined object is a piano and the target action intention is playing the piano, the bionic hand needs to apply the predetermined force required for playing so that pressing a key produces a sound.
In this way, in this embodiment, the predetermined force required by the bionic hand to perform the gesture action is determined from the predetermined object and the target action corresponding to the target action intention, and the bionic hand is then controlled to perform the gesture action according to the predetermined force and the target action, so that the actions performed by the bionic hand are more accurate.
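The force-template lookup in step S31 could be organized as a table keyed by the object/action pair, as in the sketch below; the key scheme, the force values and the fallback are illustrative assumptions rather than values given in the disclosure.

```python
# Hypothetical preset force templates keyed by (predetermined object, target action), in newtons.
FORCE_TEMPLATES = {
    ("cup", "grab"): 8.0,
    ("book", "turn_page"): 1.5,
    ("piano", "press_key"): 3.0,
}

DEFAULT_FORCE_N = 5.0  # assumed fallback when no template exists for the pair


def predetermined_force(obj: str, action: str) -> float:
    """Step S31: look up the predetermined force for the object/action pair."""
    return FORCE_TEMPLATES.get((obj, action), DEFAULT_FORCE_N)
```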
Referring to fig. 4, fig. 4 is a flow chart illustrating a control of stopping motion of a bionic hand according to an embodiment of the invention.
In some embodiments, after controlling the bionic hand to perform the gesture action according to the predetermined force and the target action in step S32, the method further includes:
step S40, monitoring the contact force between the bionic hand and the predetermined object in real time;
step S50, when the contact force increases to the predetermined force, controlling the bionic hand to stop acting.
In step S40, the contact force between the bionic hand and the predetermined object is monitored in real time to determine whether it has reached the predetermined force for the gesture action being performed. The contact force in this embodiment is the force applied by the bionic hand to the predetermined object while performing the gesture action. In this embodiment, the contact force may be estimated from the load current of the drive motor of the bionic hand, or measured by a force sensor provided on a bionic finger.
In step S50, while the contact force between the bionic hand and the predetermined object is monitored in real time, the bionic hand can be controlled to stop moving once the contact force increases to the predetermined force, so that damage to the predetermined object caused by excessive contact force can be avoided. For example, in this embodiment, if the predetermined object is a cup, when the contact force between the bionic hand and the cup is detected to reach the predetermined force required for grasping the cup, the control device can stop the bending of the bionic fingers.
In this way, in this embodiment, by monitoring the contact force between the bionic hand and the predetermined object in real time and controlling the bionic hand to stop moving when the contact force increases to the predetermined force, damage to the predetermined object caused by excessive contact force can be avoided.
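Steps S40 and S50 amount to a closed-loop stop condition around the gesture action. The sketch below assumes hypothetical callables read_contact_force (derived from motor load current or a fingertip force sensor), advance_motion and stop_motion, plus a polling interval; none of these details are specified in the disclosure.

```python
import time


def close_until_force(read_contact_force, advance_motion, stop_motion,
                      predetermined_force_n: float, poll_s: float = 0.01) -> None:
    """Steps S40-S50: keep executing the gesture until the contact force reaches the preset force."""
    while True:
        force = read_contact_force()          # S40: real-time contact force estimate
        if force >= predetermined_force_n:    # S50: stop once the predetermined force is reached
            stop_motion()
            return
        advance_motion()                      # otherwise continue the gesture action
        time.sleep(poll_s)                    # assumed polling interval
```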
Referring to fig. 5, fig. 5 is a flowchart illustrating a method for determining a target action intention according to an embodiment of the invention.
In some embodiments, detecting the electromyographic signal by the electromyographic sensor of the bionic hand in step S20 and matching the electromyographic signal with the actions corresponding to the action intention set to determine the target action intention comprises:
step S21, determining a first action template and a second action template among the actions corresponding to the action intention set according to history matching data, wherein the history matching data includes a history matching success rate;
step S22, matching the electromyographic signal with the first action template and/or the second action template according to the history matching success rate to obtain matching information;
step S23, determining the target action intention according to the matching information.
In step S21, the first action template and the second action template among the actions corresponding to the action intention set are determined according to the history matching data, so that the corresponding actions in the action intention set can be divided to improve matching efficiency. In this embodiment, the first action template may be the action with the higher history matching success rate in the history matching data and the second action template the action with the lower success rate, or vice versa.
In step S22, the electromyographic signal may be matched with the first action template and/or the second action template according to the history matching success rate to obtain matching information, so that the matching time can be shortened and the response speed of the bionic hand improved. For example, in this embodiment, when the first action template is the action with the higher history matching success rate and the second action template the action with the lower one, the electromyographic signal may be matched against the first action template and then the second action template, in descending order of history matching success rate, to obtain the matching information.
In step S23, the target action intention may be determined according to the matching information, where the matching information may be the similarity data obtained by matching the electromyographic signal against the action templates. For example, in this embodiment, the target action intention may be determined according to the magnitude of the similarity values in the matching information.
In this way, in this embodiment, the electromyographic signal is matched with the first action template and/or the second action template according to the history matching success rate to obtain matching information, and the target action intention is determined from the matching information, so that the matching recognition efficiency and the reaction speed of the bionic hand can be improved.
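One way to realize steps S21 and S22 is to sort the candidate action templates by their history matching success rate and match them in that order, optionally stopping early once a sufficiently high similarity is found. The sketch below reuses the hypothetical ActionTemplate and similarity helpers from the earlier sketch; the ordering key, the early-exit threshold and the function name are assumptions, not details given in the disclosure.

```python
from typing import Dict, List, Optional, Tuple


def match_by_priority(
    emg_features,
    templates: List[ActionTemplate],     # actions corresponding to the action intention set
    priority: Dict[str, float],          # e.g. history matching success rate per action
    early_exit_threshold: float = 0.9,   # assumed: stop once the similarity is high enough
) -> Tuple[Dict[str, float], Optional[str]]:
    """Steps S21-S22: match templates in descending priority order and collect similarities."""
    ordered = sorted(templates, key=lambda t: priority.get(t.name, 0.0), reverse=True)
    matching_info: Dict[str, float] = {}
    for template in ordered:
        score = similarity(emg_features, template)
        matching_info[template.name] = score
        if score >= early_exit_threshold:   # a high-confidence match was found early
            return matching_info, template.name
    return matching_info, None              # no early exit; step S23 picks the maximum
```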
Referring to fig. 6, fig. 6 is a flowchart illustrating a method for determining a target action intention according to another embodiment of the present invention.
In some embodiments, determining the target action intention according to the matching information in step S23 includes:
step S231, obtaining the maximum similarity from the matching information;
step S232, determining the action corresponding to the maximum similarity to determine the target action intention corresponding to the action.
In step S231, the target action intention may be determined by obtaining the maximum similarity from the matching information. For example, if the similarity data in the matching information are 0.2 for action 1, 0.4 for action 2, 0.7 for action 3 and 0.9 for action 4, then the maximum similarity in the matching information is the similarity between action 4 and the electromyographic signal.
In step S232, by determining the action corresponding to the maximum similarity, the target action intention corresponding to that action can be determined. For example, in this embodiment, when the predetermined object is a book, if the similarity of the grabbing action in the matching information is 0.5 and the similarity of the page-turning action is 0.9, the action corresponding to the maximum similarity is page turning, so the target action intention of the bionic hand wearer can be determined to be page turning.
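Steps S231 and S232 reduce to picking the action with the largest similarity in the matching information; a short sketch using the hypothetical numbers from the example above:

```python
# Similarity per action from the example above (hypothetical values).
matching_info = {"action_1": 0.2, "action_2": 0.4, "action_3": 0.7, "action_4": 0.9}

# Steps S231-S232: the action with the maximum similarity gives the target action intention.
target_action = max(matching_info, key=matching_info.get)
assert target_action == "action_4"
```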
Referring to fig. 7, fig. 7 is a flowchart illustrating a method for determining a target action intention according to another embodiment of the present invention.
In some embodiments, detecting the electromyographic signal by the electromyographic sensor of the bionic hand in step S20 and matching the electromyographic signal with the actions corresponding to the action intention set to determine the target action intention further comprises:
step S24, determining a third action template and a fourth action template among the actions corresponding to the action intention set according to the degree of association between the predetermined object and the actions corresponding to the action intention set;
step S25, matching the electromyographic signal with the third action template and/or the fourth action template according to the degree of association to obtain matching information;
and step S26, determining the target action intention according to the matching information.
In step S24, according to the degree of association between the predetermined object and the actions corresponding to the action intention set, the third action template and the fourth action template among those actions may be determined, so that the corresponding actions in the action intention set can be divided to improve matching efficiency. In this embodiment, the third action template may be the action with the higher degree of association and the fourth action template the action with the lower degree of association, or vice versa.
In step S25, the electromyographic signal may be matched with the third action template and/or the fourth action template according to the degree of association to obtain matching information, so that the matching time can be shortened and the response speed of the bionic hand improved. For example, in this embodiment, when the third action template is the action with the higher degree of association and the fourth action template the action with the lower one, the electromyographic signal may be matched against the third action template and then the fourth action template, in descending order of association, to obtain the matching information.
In step S26, the target action intention may be determined according to the matching information, where the matching information may be the similarity data obtained by matching the electromyographic signal against the action templates. For example, in this embodiment, the target action intention may be determined according to the magnitude of the similarity values in the matching information.
In this way, in this embodiment, the electromyographic signal is matched with the third action template and/or the fourth action template according to the degree of association between the predetermined object and the actions corresponding to the action intention set, so as to obtain matching information, and the target action intention is determined from the matching information, so that the matching recognition efficiency and the reaction speed of the bionic hand can be improved.
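Since steps S24 to S26 differ from steps S21 to S23 only in the key used to prioritize the templates, the match_by_priority sketch above can be reused with the degree of association in place of the history matching success rate. The association values are illustrative assumptions, and emg_features and templates are assumed to be the EMG feature vector and template list from the earlier sketches.

```python
# Hypothetical degrees of association between the predetermined object "book" and its actions.
association_degree = {"turn_page": 0.8, "grab": 0.3}

# Steps S24-S26: reuse the priority-ordered matcher with the degree of association as the key,
# then take the action with the maximum similarity if no early match was found.
matching_info, early_hit = match_by_priority(emg_features, templates, association_degree)
target_action = early_hit or max(matching_info, key=matching_info.get)
```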
The present invention further proposes a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the steps of the gesture control method of the bionic hand according to the above embodiments.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a control device according to an embodiment of the invention.
The invention further proposes a control device 100, the control device 100 comprising a memory 10 and a processor 20, the memory 10 being configured to store a computer program; the processor 20 is configured to implement the steps of the gesture control method of the bionic hand according to the above embodiment when executing the computer program.
The invention further provides a bionic hand, which comprises a control device and a myoelectric sensor, wherein the control device is the control device of the above embodiment, and the myoelectric sensor is used for detecting a myoelectric signal and outputting it to the control device.
The foregoing is merely an exemplary embodiment of the application; it should be noted that modifications and adaptations may be made by those skilled in the art without departing from the principles of the application, and such modifications are also intended to fall within the scope of the application.

Claims (10)

1. A gesture control method of a bionic hand, characterized by comprising the following steps:
acquiring an action intention set of a bionic hand wearer for a predetermined object;
detecting an electromyographic signal through the electromyographic sensor of the bionic hand, and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention;
and controlling the bionic hand to execute a corresponding gesture action according to the target action intention.
2. The method of claim 1, wherein the obtaining an action intention set of the bionic hand wearer for the predetermined object comprises:
acquiring an image of a predetermined object;
analyzing features of the image according to a pattern recognition algorithm to recognize the predetermined object;
and determining an action intention set corresponding to the predetermined object.
3. The method according to claim 1, wherein the controlling the bionic hand to perform the corresponding gesture action according to the target action intention comprises:
analyzing the predetermined force required by the bionic hand to execute the gesture action according to the predetermined object and the target action corresponding to the target action intention;
and controlling the bionic hand to execute the gesture action according to the predetermined force and the target action.
4. The method according to claim 3, wherein after the controlling the bionic hand to perform the gesture action according to the predetermined force and the target action, the method further comprises:
monitoring the contact force between the bionic hand and the predetermined object in real time;
and when the contact force increases to the predetermined force, controlling the bionic hand to stop acting.
5. The method of any one of claims 1 to 4, wherein the detecting an electromyographic signal by the electromyographic sensor of the bionic hand and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention comprises:
determining a first action template and a second action template among the actions corresponding to the action intention set according to history matching data, wherein the history matching data includes a history matching success rate;
matching the electromyographic signal with the first action template and/or the second action template according to the history matching success rate to obtain matching information;
and determining the target action intention according to the matching information.
6. The method of claim 5, wherein the determining the target action intention according to the matching information comprises:
obtaining the maximum similarity from the matching information;
and determining the action corresponding to the maximum similarity to determine the target action intention corresponding to the action.
7. The method of any one of claims 1 to 4, wherein the detecting an electromyographic signal by the electromyographic sensor of the bionic hand and matching the electromyographic signal with the actions corresponding to the action intention set to determine a target action intention further comprises:
determining a third action template and a fourth action template among the actions corresponding to the action intention set according to the degree of association between the predetermined object and the actions corresponding to the action intention set;
matching the electromyographic signal with the third action template and/or the fourth action template according to the degree of association to obtain matching information;
and determining the target action intention according to the matching information.
8. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, implements the steps of the gesture control method of the bionic hand according to any one of claims 1 to 7.
9. A control device, characterized by comprising a memory and a processor, the memory being for storing a computer program; the processor is configured to implement the steps of the gesture control method of the bionic hand according to any one of claims 1 to 6 when executing the computer program.
10. A bionic hand, comprising a control device and a myoelectric sensor, wherein the control device is the control device of claim 9, and the myoelectric sensor is used for detecting a myoelectric signal and outputting the myoelectric signal to the control device.
CN202410202068.1A 2024-02-23 2024-02-23 Gesture control method of bionic hand, storage medium, control device and bionic hand Pending CN117784941A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410202068.1A CN117784941A (en) 2024-02-23 2024-02-23 Gesture control method of bionic hand, storage medium, control device and bionic hand

Publications (1)

Publication Number Publication Date
CN117784941A 2024-03-29

Family

ID=90387480

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410202068.1A Pending CN117784941A (en) 2024-02-23 2024-02-23 Gesture control method of bionic hand, storage medium, control device and bionic hand

Country Status (1)

Country Link
CN (1) CN117784941A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018111138A1 (en) * 2016-12-14 2018-06-21 Общество с ограниченной ответственностью "Бионик Натали" Method and system for controlling an intelligent bionic limb
CN108983973A (en) * 2018-07-03 2018-12-11 东南大学 A kind of humanoid dexterous myoelectric prosthetic hand control method based on gesture identification
CN113970968A (en) * 2021-12-22 2022-01-25 深圳市心流科技有限公司 Intelligent bionic hand action pre-judging method
CN114167995A (en) * 2022-02-14 2022-03-11 浙江强脑科技有限公司 Gesture locking method and device for bionic hand, terminal and storage medium
CN114201052A (en) * 2022-02-16 2022-03-18 浙江强脑科技有限公司 Motion force control method and device of bionic hand and storage medium
CN116449967A (en) * 2023-06-20 2023-07-18 浙江强脑科技有限公司 Bionic hand teaching aid, control method thereof and main control equipment
CN116486683A (en) * 2023-06-20 2023-07-25 浙江强脑科技有限公司 Intelligent bionic hand teaching aid
CN116766213A (en) * 2023-08-24 2023-09-19 烟台大学 Bionic hand control method, system and equipment based on image processing



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination