CN114625246A - Gesture combination triggering method and device, intelligent bionic hand and storage medium - Google Patents

Gesture combination triggering method and device, intelligent bionic hand and storage medium

Info

Publication number
CN114625246A
CN114625246A
Authority
CN
China
Prior art keywords
gesture
combination
action
gesture combination
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210133702.1A
Other languages
Chinese (zh)
Inventor
韩璧丞
黄琦
阿迪斯
王俊霖
古月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Mental Flow Technology Co Ltd
Original Assignee
Shenzhen Mental Flow Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Mental Flow Technology Co Ltd filed Critical Shenzhen Mental Flow Technology Co Ltd
Priority to CN202210133702.1A
Publication of CN114625246A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Neurosurgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Health & Medical Sciences (AREA)
  • Dermatology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture combination triggering method and device, an intelligent bionic hand and a storage medium, wherein the method comprises the following steps: acquiring electromyographic signal data, and determining a gesture action according to the electromyographic signal data; determining a target gesture combination corresponding to the gesture action according to the gesture action; and triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination. The invention can determine the gesture action according to the collected electromyographic signal data, determine the target gesture combination according to the gesture action in consideration of the relevance between gesture actions, and control the motion unit to complete the target gesture combination, which helps to execute gesture combinations more efficiently and thereby improves the convenience and comfort of the user using the bionic hand.

Description

Gesture combination triggering method and device, intelligent bionic hand and storage medium
Technical Field
The invention relates to the technical field of bionic hands, in particular to a gesture combination triggering method and device, an intelligent bionic hand and a storage medium.
Background
The intelligent bionic hand is an intelligent product that highly integrates brain-computer interface technology and artificial intelligence algorithms. By extracting the arm neuromuscular signals of the wearer, the intelligent bionic hand can identify the wearer's movement intention and convert that intention into movements of the bionic hand, so that the hand moves dexterously and as the wearer intends.
When an intelligent bionic hand in the prior art recognizes electromyographic signals, it generally recognizes each received electromyographic signal in isolation, identifying one action for each signal received. In practical applications, however, many gesture actions occur in combination, yet the prior art does not recognize multiple interrelated gesture actions as a combination.
Thus, there is a need for improvements and enhancements in the art.
Disclosure of Invention
The present invention provides a gesture combination triggering method and apparatus, an intelligent bionic hand, and a storage medium, aiming to solve the problem that, in the prior art, gesture recognition handles only one action at a time and does not recognize multiple interrelated gesture actions as a combination.
In order to solve the above technical problems, the technical solution adopted by the invention is as follows:
in a first aspect, the present invention provides a gesture combination triggering method, wherein the method includes:
acquiring electromyographic signal data, and determining gesture actions according to the electromyographic signal data;
determining a target gesture combination corresponding to the gesture action according to the gesture action;
and triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination.
In one implementation method, the acquiring electromyographic signal data and determining a gesture action according to the electromyographic signal data includes:
acquiring the gesture electromyographic signal data, and analyzing the gesture electromyographic signal data to obtain action potential information corresponding to the gesture electromyographic signal data;
and determining the gesture action corresponding to the action potential information according to the action potential information.
In one implementation, the determining, according to the gesture motion, a target gesture combination corresponding to the gesture motion includes:
determining a behavior category corresponding to the gesture action according to the gesture action;
and determining a target gesture combination corresponding to the gesture action according to the behavior category.
In one implementation method, the determining, according to the behavior category, a target gesture combination corresponding to the gesture action includes:
matching the behavior categories with a preset gesture combination template library, wherein gesture combination information of a plurality of behavior categories is stored in the gesture combination template library, and each gesture combination information is provided with a plurality of gesture actions;
determining gesture combination information corresponding to the behavior category in the gesture combination template library;
and screening the gesture combination information to determine the target gesture combination.
In an implementation method, the screening the gesture combination information to determine the target gesture combination includes:
matching the gesture action with each piece of gesture combination information;
and taking the gesture combination information that contains the gesture action as the target gesture combination.
In one implementation method, the controlling the motion unit to execute the action corresponding to the target gesture combination includes:
acquiring a matching action in the target gesture combination, wherein the matching action is combined with the gesture action to form the target gesture combination;
and controlling the motion unit to continue to execute the matching action after executing the gesture action.
In one implementation, the gesture combination triggering method further includes:
and when the motion unit executes the target gesture combination, stopping acquiring the electromyographic signal data.
In a second aspect, an embodiment of the present invention further provides a gesture combination triggering apparatus, where the apparatus includes:
the gesture action determining module is used for acquiring electromyographic signal data and determining a gesture action according to the electromyographic signal data;
the gesture combination determining module is used for determining a target gesture combination corresponding to the gesture action according to the gesture action;
and the gesture combination execution module is used for triggering a motion unit of the intelligent bionic hand according to the target gesture combination and controlling the motion unit to execute the action corresponding to the target gesture combination.
In a third aspect, an embodiment of the present invention further provides an intelligent bionic hand, where the intelligent bionic hand includes a memory, a processor, and a program of a gesture combination triggering method stored in the memory and executable on the processor, and when the processor executes the program of the gesture combination triggering method, the steps of the gesture combination triggering method in any one of the above-mentioned schemes are implemented.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium stores a program of a gesture combination triggering method, and when the program of the gesture combination triggering method is executed by a processor, the steps of the gesture combination triggering method in any one of the above-mentioned schemes are implemented.
Advantageous effects: compared with the prior art, the invention provides a gesture combination triggering method: firstly, electromyographic signal data are acquired and a gesture action is determined according to the electromyographic signal data; then a target gesture combination corresponding to the gesture action is determined according to the gesture action; and a motion unit of the intelligent bionic hand is triggered according to the target gesture combination and controlled to complete the target gesture combination. The invention considers the relevance between gesture actions, determines the target gesture combination according to the gesture action, and controls the motion unit to complete the target gesture combination, so that a series of gesture actions can be triggered by a single gesture action, the gesture combination can be judged and executed more efficiently, and the convenience and comfort of the user using the bionic hand are improved.
Drawings
Fig. 1 is a flowchart of a specific implementation of a gesture combination triggering method according to an embodiment of the present invention.
Fig. 2 is a schematic block diagram of a gesture combination triggering apparatus provided in an embodiment of the present invention.
Fig. 3 is a schematic block diagram of an internal structure of an intelligent bionic hand provided by an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and effects of the present invention clearer and more explicit, the present invention is described in further detail below with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
When an intelligent bionic hand in the prior art recognizes electromyographic signals, it generally recognizes each received electromyographic signal in isolation, identifying one action for each signal received. In practical applications, however, many gesture actions occur in combination, yet the prior art does not recognize multiple interrelated gesture actions as a combination, which affects recognition efficiency. To solve this technical problem, this embodiment provides a gesture combination triggering method: firstly, electromyographic signal data are acquired and a gesture action is determined according to the electromyographic signal data; then a target gesture combination corresponding to the gesture action is determined according to the gesture action; and a motion unit of the intelligent bionic hand is triggered according to the target gesture combination and controlled to complete the target gesture combination. The gesture action can thus be determined from the collected electromyographic signal data, the target gesture combination can be determined from the gesture action in consideration of the relevance between gesture actions, and the motion unit can be controlled to complete the target gesture combination, so that gesture combinations are executed more efficiently and the convenience and comfort of the user using the bionic hand are improved.
For example, when the target gesture combination of a user needs to be determined, the intelligent bionic hand obtains the user's electromyographic signal data A, analyzes the data A, and determines that the corresponding gesture action is B. After obtaining the gesture action B, it determines, by screening, a target gesture combination B' corresponding to the gesture action B, triggers the motion unit of the intelligent bionic hand, and controls the motion unit to complete the target gesture combination B'. In this way, the gesture combination B' formed by several gesture actions can be obtained directly from the single gesture action B, the intelligent bionic hand can recognize and execute gesture actions more efficiently, and the user's needs are better met.
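To make the flow above concrete, here is a minimal end-to-end sketch in Python; the lookup tables and the names GESTURE_BY_EMG, COMBINATION_BY_GESTURE and trigger are assumptions made for illustration and are not taken from the patent.

```python
# End-to-end sketch of the example above (EMG data A -> gesture action B ->
# target combination B'). All data and names are illustrative assumptions.

GESTURE_BY_EMG = {"A": "B"}                        # recognized gesture action per EMG data
COMBINATION_BY_GESTURE = {"B": ("B", "B2", "B3")}  # B' = combination opened by gesture B

def trigger(emg_data):
    gesture = GESTURE_BY_EMG[emg_data]             # determine the gesture action
    combination = COMBINATION_BY_GESTURE[gesture]  # determine the target combination B'
    for action in combination:                     # the motion unit completes it in order
        print("motion unit executes:", action)
    return combination

trigger("A")
```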
Exemplary method
The gesture combination triggering method can be applied to a terminal device, which may be a computer, a mobile phone or another intelligent terminal product. Moreover, since the gesture template generated in this embodiment is applied to a bionic hand, the terminal device of this embodiment may establish a communication connection with the bionic hand, or may be arranged directly on the bionic hand to form an intelligent bionic hand. The gesture combination triggering method of this embodiment is explained below with the intelligent bionic hand as the executing entity.
In specific implementation, as shown in fig. 1, the gesture combination triggering method in this embodiment includes the following steps:
and S100, acquiring electromyographic signal data, and determining gesture actions according to the electromyographic signal data.
Electromyographic signal data is the superposition in time and space of the action potentials of the motor units in many muscle fibers of a moving part (such as an arm). Each piece of electromyographic signal data carries different action potential information, and this information reflects the fact that the surface electromyographic signal is the combined effect of the electrical activity of superficial muscles and nerve trunks at the skin surface; the gesture features corresponding to the gesture action made by the user can therefore be determined from the action potential information. In this embodiment, the corresponding action potential information can be obtained from valid electromyographic signal data, and the corresponding gesture features can then be analyzed.
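As one possible illustration of turning raw electromyographic samples into a feature that stands in for the action potential information described above (the patent does not specify the actual signal processing), a minimal RMS-amplitude sketch might look as follows.

```python
# A minimal sketch of reducing a window of raw surface-EMG samples to a simple
# feature (RMS amplitude) as a stand-in for the "action potential information"
# mentioned above; the real feature extraction is an assumption here.
import math

def rms_feature(emg_window):
    """Root-mean-square amplitude of one EMG window."""
    return math.sqrt(sum(x * x for x in emg_window) / len(emg_window))

window = [0.01, -0.12, 0.30, -0.25, 0.18]   # toy surface-EMG samples
print(round(rms_feature(window), 4))        # -> 0.1997
```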
For example, when electromyographic signal data A, B and C are obtained from a bionic hand user, it can be determined from these signals that the corresponding gesture actions are, respectively, A: holding, B: picking up a key, and C: shaking. It can be seen that different pieces of electromyographic signal data may represent the same gesture action or different gesture actions.
In one implementation, acquiring the electromyographic signal data and determining the gesture action according to the electromyographic signal data includes the following steps:
Step S101: acquiring the gesture electromyographic signal data, and analyzing the gesture electromyographic signal data to obtain action potential information corresponding to the gesture electromyographic signal data;
Step S102: determining the gesture action corresponding to the action potential information according to the action potential information.
In this embodiment, the action potential information corresponding to different electromyographic signal data differs, and the gesture actions corresponding to different electromyographic signal data may differ as well. Therefore, after obtaining the electromyographic signal data, this embodiment can obtain the action potential information corresponding to each piece of electromyographic signal data; since this information reflects the gesture action corresponding to each piece of electromyographic signal data, the embodiment can determine which known action the electromyographic signal data corresponds to by matching its action potential information against the action potential information of preset known actions.
For example, after acquiring electromyographic signal data numbered a1 to a10, the intelligent bionic hand further obtains the corresponding action potential information A1 to A10 and determines the gesture action corresponding to each piece of action potential information. Suppose it determines that the gesture actions corresponding to the action potential information A1, A2 and A3 are handshake motions, those corresponding to A4, A5, A6, A7, A8 and A9 are pen-holding motions, and that corresponding to A10 is a key-holding motion; then the gesture actions corresponding to the electromyographic signal data a1, a2 and a3 are handshake motions, those corresponding to a4, a5, a6, a7, a8 and a9 are pen-holding motions, and that corresponding to a10 is a key-holding motion.
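A minimal sketch of steps S101-S102, matching action potential information against preset known actions; the feature vectors, action names and nearest-template rule are assumptions for illustration only.

```python
# Sketch of matching action potential features against preset known actions.
# Templates, feature dimensions and the nearest-template rule are assumed.
import math

KNOWN_ACTIONS = {                       # preset action potential templates
    "handshake": [0.82, 0.10, 0.31],
    "hold_pen":  [0.15, 0.77, 0.22],
    "hold_key":  [0.05, 0.20, 0.90],
}

def classify_action(potential_features):
    """Return the known gesture action whose template is closest to the features."""
    def dist(template):
        return math.sqrt(sum((t - f) ** 2 for t, f in zip(template, potential_features)))
    return min(KNOWN_ACTIONS, key=lambda name: dist(KNOWN_ACTIONS[name]))

print(classify_action([0.80, 0.12, 0.30]))   # -> handshake
```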
Step S200: determining a target gesture combination corresponding to the gesture action according to the gesture action.
Considering that, in practical applications, there is a certain relevance between gesture actions and many gesture actions occur in combination, the target gesture combination corresponding to a gesture action can be determined from that gesture action, which improves the efficiency of recognizing gesture combinations.
In one implementation, determining the target gesture combination corresponding to the gesture action according to the gesture action includes the following steps:
Step S201: determining a behavior category corresponding to the gesture action according to the gesture action;
Step S202: determining a target gesture combination corresponding to the gesture action according to the behavior category.
In this embodiment, the intelligent bionic hand analyzes the acquired gesture action, determines the behavior category corresponding to it, and matches that behavior category against a preset gesture combination template library, where gesture combination information of a plurality of behavior categories is stored in the gesture combination template library and each piece of gesture combination information contains a plurality of gesture actions. It then determines the gesture combination information corresponding to the behavior category in the gesture combination template library and screens that gesture combination information to determine the target gesture combination. Screening the gesture combination information means matching the gesture action against each piece of gesture combination information and taking the gesture combination information that contains the gesture action as the target gesture combination.
In a specific application, the intelligent bionic hand determines the corresponding behavior category from the acquired gesture action. For example, when the intelligent bionic hand recognizes that the user's gesture action is a hand-stretching motion of opening five fingers for 3 seconds, it can determine that the behavior category corresponding to this motion is 'hand movement'. The behavior category 'hand movement' is then matched against the preset gesture combination template library. The gesture combination template library stores several pieces of gesture combination information belonging to the 'hand movement' behavior category, for example (five fingers open for 3 seconds, five fingers close together for 3 seconds, fist making for 3 seconds), (five fingers close together for 3 seconds, five fingers open for 3 seconds, palm striking), (thumb open for 3 seconds, index finger open, middle finger open) and (palm open for 3 seconds, fist making, five fingers open). The gesture combination information is then screened: the gesture action 'five fingers open for 3 seconds' is matched against each piece of gesture combination information and, taking the specific order of the actions in each combination into account, only the combination whose first action is 'five fingers open for 3 seconds', namely (five fingers open for 3 seconds, five fingers close together for 3 seconds, fist making for 3 seconds), matches. That gesture combination information (five fingers open for 3 seconds, five fingers close together for 3 seconds, fist making for 3 seconds) is therefore taken as the target gesture combination.
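The screening just described can be illustrated with a small sketch; the data layout (a dictionary mapping each behavior category to ordered tuples of gesture actions) and the rule of matching on the opening action are assumptions based on the example above.

```python
# Sketch of the behavior-category lookup and screening; the library contents
# mirror the "hand movement" example, and the layout is an assumption.

GESTURE_COMBINATION_LIBRARY = {
    "hand movement": [
        ("five fingers open 3s", "five fingers together 3s", "make a fist 3s"),
        ("five fingers together 3s", "five fingers open 3s", "palm strike"),
        ("thumb open 3s", "index finger open", "middle finger open"),
        ("palm open 3s", "make a fist", "five fingers open"),
    ],
}

def find_target_combination(gesture_action, behavior_category):
    """Keep the combination in the category whose first (opening) action matches."""
    for combination in GESTURE_COMBINATION_LIBRARY.get(behavior_category, []):
        if combination[0] == gesture_action:   # order matters: only the opening action triggers
            return combination
    return None

print(find_target_combination("five fingers open 3s", "hand movement"))
```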
Step S300: triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination.
In this embodiment, after the target gesture combination is determined, the motion unit of the intelligent bionic hand can be triggered and controlled to complete the target gesture combination, so that the target gesture combination can be determined and executed from a single gesture action, improving the recognition efficiency of the intelligent bionic hand. The motion unit is composed of movable joints and fingers and is used to control the movable joints and fingers to perform independent finger movements and coordinated fingertip operation. After the target gesture combination is determined, the intelligent bionic hand sends a control instruction to the motion unit, the control instruction including control information such as the target gesture action combination, the execution order, the execution time, and the motion strength and angle of each component of the motion unit. After receiving the control instruction, the motion unit controls each component to execute the corresponding instruction, so as to complete the target gesture combination.
For example, continuing the example above, after determining that the target gesture combination is (five fingers open for 3 seconds, five fingers close together for 3 seconds, fist making for 3 seconds), the intelligent bionic hand sends a control instruction to the motion unit. The control instruction includes the gesture action combination (five fingers open for 3 seconds, five fingers close together for 3 seconds, fist making for 3 seconds), the execution order (the five fingers open first, then the five fingers close together, and finally the fist is made), the execution time of 3 seconds per action, and control information such as the motion strength and angle required of each finger and movable joint of the motion unit to complete each gesture action. After receiving the control instruction, the motion unit controls each component to execute the corresponding instruction, so that the motion unit completes the target gesture combination.
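A hedged sketch of what such a control instruction could look like in code; the field names, data types and numeric values are illustrative assumptions, since the patent only lists the kinds of control information the instruction carries.

```python
# Sketch of a control instruction carrying an ordered action sequence with
# execution time, motion strength and joint angles; all values are assumed.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ActionStep:
    name: str                     # e.g. "five fingers open"
    duration_s: float             # execution time for this action
    force: float                  # motion strength (arbitrary units)
    joint_angles: Dict[str, int]  # target angle per finger/joint, degrees

@dataclass
class ControlInstruction:
    steps: List[ActionStep] = field(default_factory=list)  # executed in order

instruction = ControlInstruction(steps=[
    ActionStep("five fingers open",     3.0, 0.4, {"thumb": 60, "index": 80}),
    ActionStep("five fingers together", 3.0, 0.4, {"thumb": 10, "index": 10}),
    ActionStep("make a fist",           3.0, 0.7, {"thumb": 0,  "index": 0}),
])

for step in instruction.steps:    # the motion unit executes each step in turn
    print(f"execute '{step.name}' for {step.duration_s}s at force {step.force}")
```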
In one implementation, controlling the motion unit to execute the action corresponding to the target gesture combination includes the following steps:
Step S301: acquiring a matching action in the target gesture combination, wherein the matching action is combined with the gesture action to form the target gesture combination;
Step S302: controlling the motion unit to continue executing the matching action after executing the gesture action.
In a specific application, the intelligent bionic hand first combines the gesture action with the matching actions, and then controls the motion unit to execute the resulting gesture action combination. For example, as in the example above, the intelligent bionic hand first recognizes that the user's gesture action is the hand-stretching motion 'five fingers open for 3 seconds' and determines that the actions matching this gesture action are 'five fingers close together for 3 seconds' and 'fist making for 3 seconds'. The gesture action 'five fingers open for 3 seconds' is combined with the matching actions 'five fingers close together for 3 seconds' and 'fist making for 3 seconds'. The intelligent bionic hand then controls the motion unit to execute the gesture action 'five fingers open for 3 seconds' first and to continue with the matching actions 'five fingers close together for 3 seconds' and 'fist making for 3 seconds', so that the motion unit executes the target gesture combination in sequence.
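The division into the recognized gesture action and its matching actions (steps S301-S302) might be realised roughly as follows; the function names execute_combination and perform are placeholders, not part of the patent.

```python
# Sketch of steps S301-S302: the recognized gesture action is executed first and
# the matching actions then complete the combination.

def execute_combination(recognized_action, target_combination):
    # the matching actions are the remaining actions of the combination,
    # i.e. everything after the recognized opening action
    matching_actions = list(target_combination[1:])
    perform(recognized_action)             # execute the recognized gesture action first
    for action in matching_actions:        # then continue with the matching actions
        perform(action)

def perform(action):
    print("motion unit performs:", action)

execute_combination("five fingers open 3s",
                    ("five fingers open 3s", "five fingers together 3s", "make a fist 3s"))
```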
In one implementation, the acquiring of the electromyographic signal data is stopped when the motion unit is performing the target gesture combination.
While the intelligent bionic hand is controlling the motion unit to execute the target gesture combination, it stops acquiring electromyographic signal data. That is, once the target gesture combination has been determined and is being executed, the intelligent bionic hand does not acquire further electromyographic signal data or determine other gesture actions until execution of the target gesture combination is complete.
For example, as described above, the intelligent bionic hand first controls the motion unit to execute the gesture action 'five fingers open for 3 seconds' and then continues with the matching actions 'five fingers close together for 3 seconds' and 'fist making for 3 seconds'. If the user generates electromyographic signal data A1 and A2 while this gesture combination is being executed, and generates electromyographic signal data A3 and A4 after the execution has finished, the bionic hand will only acquire the electromyographic signal data A3 and A4.
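One possible way to realise this gating is a simple flag that suspends acquisition while the combination runs; the threading-based sketch below is an assumption about the implementation, not the patent's mechanism.

```python
# Sketch of pausing EMG acquisition while the target gesture combination runs.
import threading

executing_combination = threading.Event()

def acquire_emg(sample):
    """Return the sample only when no combination is being executed."""
    if executing_combination.is_set():
        return None                        # samples produced during execution are dropped
    return sample                          # samples produced afterwards are kept

def run_combination(actions):
    executing_combination.set()            # stop acquiring electromyographic signal data
    try:
        for action in actions:
            pass                           # drive the motion unit here
    finally:
        executing_combination.clear()      # resume acquisition once the combination is done

run_combination(["five fingers open", "five fingers together", "make a fist"])
print(acquire_emg("A3"))                   # -> "A3": acquired because execution has finished
```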
In summary, this embodiment first acquires electromyographic signal data and determines a gesture action according to the electromyographic signal data; then determines a target gesture combination corresponding to the gesture action according to the gesture action; and finally triggers a motion unit of the intelligent bionic hand according to the target gesture combination and controls the motion unit to complete the target gesture combination. The gesture action can be determined from the collected electromyographic signal data, the target gesture combination can be determined from the gesture action in consideration of the relevance between gesture actions, and the motion unit can be controlled to complete the target gesture combination, so that gesture combinations are executed more efficiently and the convenience and comfort of the user using the bionic hand are improved.
Exemplary devices
Based on the above embodiment, the present invention also discloses a gesture combination triggering device, as shown in fig. 2. The device includes: a gesture action determining module 10, a gesture combination determining module 20, and a gesture combination executing module 30. Specifically, the gesture action determining module 10 in this embodiment is configured to acquire electromyographic signal data and determine a gesture action according to the electromyographic signal data. The gesture combination determining module 20 is configured to determine, according to the gesture action, a target gesture combination corresponding to the gesture action. The gesture combination executing module 30 is configured to trigger a motion unit of the intelligent bionic hand according to the target gesture combination and control the motion unit to execute the action corresponding to the target gesture combination.
In one implementation, the gesture motion determination module 10 in this embodiment includes:
the electromyographic signal data acquisition unit is used for acquiring historical electromyographic signal data;
and the gesture action determining unit is used for determining the gesture action corresponding to the electromyographic signal data according to a preset time period.
In one implementation manner, the electromyographic signal data acquiring unit in the present embodiment includes:
the action potential information acquisition subunit is used for acquiring a plurality of electromyographic signal data and determining action potential information corresponding to each electromyographic signal data.
In one implementation, the gesture combination determination module 20 in this embodiment includes:
the behavior type determining unit is used for determining a behavior type corresponding to the gesture action according to the gesture action;
and the target gesture combination determining unit is used for determining a target gesture combination corresponding to the gesture action according to the behavior category.
In one implementation manner, the target gesture combination determination unit in this embodiment includes:
the gesture combination template library matching subunit is used for matching the behavior classes with a preset gesture combination template library, the gesture combination template library stores gesture combination information of a plurality of behavior classes, and each gesture combination information is provided with a plurality of gesture actions;
the gesture combination information determining subunit is used for determining gesture combination information corresponding to the behavior category in the gesture combination template library;
and the target gesture combination determining subunit is used for screening the gesture combination information to determine the target gesture combination.
In one implementation, the target gesture combination determination subunit in this embodiment includes:
the gesture combination information matching subunit is used for matching the gesture action with each piece of gesture combination information;
and the target gesture combination subunit is used for taking the gesture combination information that contains the gesture action as the target gesture combination.
In one implementation, the gesture combination execution module 30 in this embodiment includes:
a matching action obtaining unit, configured to obtain a matching action in the target gesture combination, where the matching action is combined with the gesture action to form the target gesture combination;
the matching action execution unit is used for controlling the motion unit to continuously execute the matching action after executing the gesture action;
and the electromyographic signal data stop acquiring unit is used for stopping acquiring the electromyographic signal data when the motion unit executes the target gesture combination.
The working principle of the gesture combination triggering device in this embodiment is the same as that described in the above method embodiment, and is not described herein again.
Based on the above embodiments, the present invention further provides an intelligent bionic hand, a schematic block diagram of which is shown in fig. 3. The intelligent bionic hand comprises a processor and a memory connected through a system bus. The processor of the intelligent bionic hand is used to provide computing and control capability. The memory of the intelligent bionic hand comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the intelligent bionic hand is used for connecting and communicating with an external terminal through a network. The computer program, when executed by the processor, implements the gesture combination triggering method.
It will be understood by those skilled in the art that the block diagram shown in fig. 3 shows only part of the structure relevant to the inventive arrangements and does not limit the intelligent bionic hand to which the inventive arrangements may be applied; a particular intelligent bionic hand may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, an intelligent bionic hand is provided, which includes a memory, a processor and a gesture combination triggering program stored in the memory and executable on the processor, and when the processor executes the gesture combination triggering program, the following operations are implemented:
acquiring electromyographic signal data, and determining gesture actions according to the electromyographic signal data;
determining a target gesture combination corresponding to the gesture action according to the gesture action;
and triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, databases, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The invention discloses a gesture combination triggering method, a gesture combination triggering device, a terminal device and a storage medium, wherein the method comprises the following steps: acquiring electromyographic signal data, and determining a gesture action according to the electromyographic signal data; determining a target gesture combination corresponding to the gesture action according to the gesture action; and triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination. The invention can determine the gesture action according to the collected electromyographic signal data, determine the target gesture combination according to the gesture action in consideration of the relevance between gesture actions, and control the motion unit to complete the target gesture combination, which helps to execute gesture combinations more efficiently and thereby improves the convenience and comfort of the user using the bionic hand.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, and not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A gesture combination triggering method, characterized in that the method comprises:
acquiring electromyographic signal data, and determining gesture actions according to the electromyographic signal data;
determining a target gesture combination corresponding to the gesture action according to the gesture action;
and triggering a motion unit of the intelligent bionic hand according to the target gesture combination, and controlling the motion unit to complete the target gesture combination.
2. The gesture combination triggering method according to claim 1, wherein the acquiring electromyographic signal data and determining the gesture action according to the electromyographic signal data comprises:
acquiring the gesture electromyographic signal data, and analyzing the gesture electromyographic signal data to obtain action potential information corresponding to the gesture electromyographic signal data;
and determining the gesture action corresponding to the action potential information according to the action potential information.
3. The method for triggering the gesture combination according to claim 1, wherein the determining, according to the gesture action, the target gesture combination corresponding to the gesture action comprises:
determining a behavior category corresponding to the gesture action according to the gesture action;
and determining a target gesture combination corresponding to the gesture action according to the behavior category.
4. The gesture combination triggering method according to claim 3, wherein the determining, according to the behavior category, the target gesture combination corresponding to the gesture action includes:
matching the behavior categories with a preset gesture combination template library, wherein gesture combination information of a plurality of behavior categories is stored in the gesture combination template library, and each gesture combination information is provided with a plurality of gesture actions;
determining gesture combination information corresponding to the behavior category in the gesture combination template library;
and screening the gesture combination information to determine the target gesture combination.
5. The gesture combination triggering method according to claim 4, wherein the filtering the gesture combination information to determine the target gesture combination comprises:
matching the gesture action with each piece of gesture combination information;
and taking the gesture combination information that contains the gesture action as the target gesture combination.
6. The gesture combination triggering method according to claim 1, wherein the controlling the motion unit to execute the action corresponding to the target gesture combination comprises:
acquiring a matching action in the target gesture combination, wherein the matching action is combined with the gesture action to form the target gesture combination;
and controlling the motion unit to continue to execute the matching action after executing the gesture action.
7. The gesture combination triggering method according to claim 1, characterized in that the method further comprises:
and when the motion unit executes the target gesture combination, stopping acquiring the electromyographic signal data.
8. A gesture combination triggering device, the device comprising:
the gesture action determining module is used for acquiring electromyographic signal data and determining a gesture action according to the electromyographic signal data;
the gesture combination determining module is used for determining a target gesture combination corresponding to the gesture action according to the gesture action;
and the gesture combination execution module is used for triggering a motion unit of the intelligent bionic hand according to the target gesture combination and controlling the motion unit to execute the action corresponding to the target gesture combination.
9. An intelligent bionic hand, characterized in that the intelligent bionic hand comprises a memory, a processor and a gesture combination triggering program stored in the memory and capable of running on the processor, and the processor executes the gesture combination triggering program to realize the steps of the gesture combination triggering method according to any one of claims 1-7.
10. A computer-readable storage medium, having stored thereon a gesture combination trigger program, which when executed by a processor, performs the steps of the gesture combination trigger method according to any one of claims 1-7.
CN202210133702.1A 2022-02-14 2022-02-14 Gesture combination triggering method and device, intelligent bionic hand and storage medium Pending CN114625246A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210133702.1A CN114625246A (en) 2022-02-14 2022-02-14 Gesture combination triggering method and device, intelligent bionic hand and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210133702.1A CN114625246A (en) 2022-02-14 2022-02-14 Gesture combination triggering method and device, intelligent bionic hand and storage medium

Publications (1)

Publication Number Publication Date
CN114625246A 2022-06-14

Family

ID=81898029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210133702.1A Pending CN114625246A (en) 2022-02-14 2022-02-14 Gesture combination triggering method and device, intelligent bionic hand and storage medium

Country Status (1)

Country Link
CN (1) CN114625246A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104665962A (en) * 2015-02-05 2015-06-03 华南理工大学 Wearable function-enhanced manipulator system as well as assisting fingers and control method thereof
CN106598234A (en) * 2016-11-28 2017-04-26 电子科技大学 Gesture recognition method based on inertial sensing
CN109765823A (en) * 2019-01-21 2019-05-17 吉林大学 Ground crawler-type unmanned vehicle control method based on arm electromyography signal
CN113183150A (en) * 2021-04-09 2021-07-30 周先军 Bionic hand control optimization method and system and electronic equipment
CN113946225A (en) * 2021-12-20 2022-01-18 深圳市心流科技有限公司 Gesture locking method, intelligent bionic hand, terminal and storage medium
CN113946224A (en) * 2021-12-20 2022-01-18 深圳市心流科技有限公司 Control method and device for myoelectric gesture recognition of intelligent bionic hand and storage medium
CN113970968A (en) * 2021-12-22 2022-01-25 深圳市心流科技有限公司 Intelligent bionic hand action pre-judging method
CN113977588A (en) * 2021-12-23 2022-01-28 深圳市心流科技有限公司 Gesture recognition method and device for intelligent bionic hand, terminal and storage medium

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117752477A (en) * 2024-02-22 2024-03-26 浙江强脑科技有限公司 Method, device, terminal and medium for controlling gesture locking of bionic hand

Similar Documents

Publication Publication Date Title
CN113946224B (en) Control method and device for myoelectric gesture recognition of intelligent bionic hand and storage medium
CN111209885B (en) Gesture information processing method and device, electronic equipment and storage medium
Karnam et al. EMGHandNet: A hybrid CNN and Bi-LSTM architecture for hand activity classification using surface EMG signals
Wang et al. An accurate eegnet-based motor-imagery brain–computer interface for low-power edge computing
CN113970968B (en) Intelligent bionic hand action pre-judging method
CN113986017B (en) Myoelectric gesture template generation method and device and storage medium
CN113977589A (en) Gesture recognition threshold adjusting method and device and storage medium
Khushaba Correlation analysis of electromyogram signals for multiuser myoelectric interfaces
CN113946225B (en) Gesture locking method, intelligent bionic hand, terminal and storage medium
WO2021052045A1 (en) Body movement recognition method and apparatus, computer device and storage medium
Samadani et al. Hand gesture recognition based on surface electromyography
CN114167995B (en) Gesture locking method and device for bionic hand, terminal and storage medium
Zhu et al. Cascaded adaptation framework for fast calibration of myoelectric control
Malešević et al. Vector autoregressive hierarchical hidden Markov models for extracting finger movements using multichannel surface EMG signals
CN114201052A (en) Motion force control method and device of bionic hand and storage medium
CN114625246A (en) Gesture combination triggering method and device, intelligent bionic hand and storage medium
US11886559B2 (en) Biometric identification and control systems and methods for providing customizable security through authentication of biosignal representations of one or more user-specific and user-selected gesture-intentions
Malešević et al. Decoding of individual finger movements from surface EMG signals using vector autoregressive hierarchical hidden Markov models (VARHHMM)
CN114756136B (en) Training standard reaching prompting method and device for electromyographic signals and electroencephalographic signals
CN114217694A (en) Bionic hand and gesture control method thereof, server and storage medium
CN114167996B (en) Sensor-based action pre-judging method and device and storage medium
Chang et al. A hierarchical hand motions recognition method based on IMU and sEMG sensors
Fang et al. Modelling EMG driven wrist movements using a bio-inspired neural network
CN114668564B (en) Method for dynamically adjusting sampling frequency based on electromyographic signal data
CN114683292B (en) Sampling frequency control method of electromyographic equipment, intelligent bionic hand and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20220614