CN111061368B - Gesture detection method and wearable device - Google Patents


Info

Publication number
CN111061368B
CN111061368B
Authority
CN
China
Prior art keywords: air pressure, gesture, arm, angle, target
Legal status
Active
Application number
CN201911247266.5A
Other languages
Chinese (zh)
Other versions
CN111061368A (en)
Inventor
黄剑
段涛
曹瑜
熊蔡华
汪雷
Current Assignee
Huazhong University of Science and Technology
Ezhou Institute of Industrial Technology Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Ezhou Institute of Industrial Technology Huazhong University of Science and Technology
Application filed by Huazhong University of Science and Technology and Ezhou Institute of Industrial Technology, Huazhong University of Science and Technology
Priority claimed from CN201911247266.5A
Publication of CN111061368A
Application granted
Publication of CN111061368B


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The invention provides a gesture detection method and a wearable device. The wearable device comprises: a first wrist structure, which includes a first wrist strap, a first inertial navigation sensor arranged on the first wrist strap, at least one air bag arranged on the inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence; and a second wrist structure, which includes a second wrist strap, a second inertial navigation sensor arranged on the second wrist strap, and a main control module arranged on the second wrist strap. The at least one air pressure sensor detects the air pressure information of its corresponding air bag, yielding at least one piece of air pressure information corresponding one-to-one to the at least one air bag; the first inertial navigation sensor detects the forearm posture angle; the second inertial navigation sensor detects the upper-arm posture angle; and the main control module determines the target intention gesture corresponding to the target arm based on the at least one piece of air pressure information, the forearm posture angle, and the upper-arm posture angle.

Description

Gesture detection method and wearable device
Technical Field
The invention relates to the technical field of human-machine interaction, and in particular to a gesture detection method and a wearable device.
Background
In daily life, people rely on their hands to perform the various grasping movements that everyday activities require. However, many people lose their hands due to accidents or disease, which is a great misfortune for patients with hand amputations. It is therefore necessary to replace the hand with a device, and prosthetic hand design becomes particularly important. One of the most important indexes for evaluating the control quality of a prosthetic hand is its human-machine interaction capability and control effectiveness, both of which require accurate gesture recognition.
In existing research, electromyographic (EMG) signals reflect the motion control intention of the human body in real time and can be used for gesture recognition. EMG signals are weak electrical signals generated during muscle activity and detected by electromyographic electrodes attached closely to the skin. However, because the EMG signal is a weak electrical signal, it must be amplified and filtered before it can be applied to actual prosthetic hand control or human-machine interaction, and this process is easily disturbed by electromagnetic noise in the environment. Moreover, the EMG electrodes must fit tightly against the skin while worn, and the contact surface must be wiped with alcohol to keep the impedance low, which is inconvenient for the user. In addition, over time, EMG signals are affected by fatigue and sweating of the human body, making gesture recognition insufficiently accurate.
Disclosure of Invention
The embodiments of the invention provide a gesture detection method and a wearable device, which detect the muscle activity state and arm posture of a user in real time and accurately identify the user's intended gesture in real time.
In a first aspect, the present invention provides a gesture detection method applied to a wearable device. The wearable device includes a first wrist structure and a second wrist structure. The first wrist structure includes a first wrist strap, a first inertial navigation sensor disposed on the first wrist strap, at least one air bag disposed on an inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence. The second wrist structure includes a second wrist strap, a second inertial navigation sensor disposed on the second wrist strap, and a main control module disposed on the second wrist strap; the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor. With the first wrist structure worn on the forearm of a target arm through the first wrist strap and the second wrist structure worn on the upper arm of the target arm through the second wrist strap, the method includes:
detecting, by the at least one air pressure sensor, at least one piece of air pressure information corresponding one-to-one to the at least one air bag, and sending the at least one piece of air pressure information to the main control module;
detecting, by the first inertial navigation sensor, the forearm posture angle, and sending the forearm posture angle to the main control module;
detecting, by the second inertial navigation sensor, the upper-arm posture angle, and sending the upper-arm posture angle to the main control module;
and determining, by the main control module, the target intention gesture corresponding to the target arm.
Optionally, the main control module includes a gesture recognition module, and the determining, by the main control module, the target intention gesture corresponding to the target arm includes:
determining, by the gesture recognition module, the air pressure feature corresponding to the at least one piece of air pressure information;
determining, by the gesture recognition module, the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle;
and inputting, by the gesture recognition module, the air pressure feature and the rotation angle features into a gesture recognition model to recognize the target intention gesture corresponding to the target arm, where the gesture recognition model is trained in advance on different air pressure features and different rotation angle features.
Optionally, the determining, by the gesture recognition module, the air pressure characteristic corresponding to the at least one air pressure information includes:
normalizing, by the gesture recognition module, the at least one piece of air pressure information to obtain the air pressure feature, where each piece of air pressure information is the change in air pressure of the corresponding air bag relative to its initial air pressure.
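By way of illustration, this normalization step might look like the following sketch. The patent does not specify the normalization scheme, so the min-max scaling against the sensor's 20 kPa full scale, as well as the function name, are assumptions:

```python
def pressure_features(pressures_kpa, baseline_kpa):
    """Turn raw per-bag gauge pressures into a normalized feature vector.

    Each feature is the pressure change of one air bag relative to its
    initial (relaxed-muscle) pressure, clipped to the sensor's measurable
    range and scaled into [0, 1]. The 20 kPa full scale matches the sensor
    range given in the embodiment; the min-max scheme itself is assumed.
    """
    FULL_SCALE_KPA = 20.0
    deltas = [p - b for p, b in zip(pressures_kpa, baseline_kpa)]
    # Clamp to the measurable range, then scale into [0, 1].
    return [min(max(d, 0.0), FULL_SCALE_KPA) / FULL_SCALE_KPA for d in deltas]
```

With six air bags this yields the six-dimensional air pressure feature that is later combined with the four rotation angle features.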
Optionally, determining, by the gesture recognition module, the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle includes:
inputting, by the gesture recognition module, the forearm posture angle and the upper-arm posture angle into an arm joint model to obtain the rotation angle features of four degrees of freedom output by the arm joint model, where the rotation angle features include a forearm rotation angle, an elbow angle, an upper-arm forward-lift angle, and an upper-arm lateral-lift angle, and the arm joint model is trained in advance on different forearm posture angles and different upper-arm posture angles.
Optionally, the main control module includes a sending module, and the determining, by the main control module, the target intention gesture corresponding to the target arm includes:
sending, by the sending module, the at least one piece of air pressure information, the forearm posture angle, and the upper-arm posture angle to a target device, so that the target device recognizes the target intention gesture corresponding to the target arm based on them. The target device determines the air pressure feature corresponding to the at least one piece of air pressure information, determines the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle, and inputs the air pressure feature and the rotation angle features into a gesture recognition model to recognize the target intention gesture corresponding to the target arm, where the gesture recognition model is trained in advance on different air pressure features and different rotation angle features.
In a second aspect, an embodiment of the present invention provides a wearable device, including:
a first wrist structure, comprising a first wrist strap, a first inertial navigation sensor arranged on the first wrist strap, at least one air bag arranged on the inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence;
a second wrist structure, comprising a second wrist strap, a second inertial navigation sensor arranged on the second wrist strap, and a main control module arranged on the second wrist strap, where the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor;
the first wrist structure is worn on a forearm of a target arm through the first wrist strap, the second wrist structure is worn on a forearm of the target arm through the second wrist strap, the at least one air pressure sensor is used for detecting air pressure information of corresponding air bags, at least one air pressure information corresponding to the at least one air bags one by one is obtained, the at least one air pressure information is sent to the main control module, the first inertial navigation sensor is used for detecting and obtaining a forearm gesture angle, the forearm gesture angle is sent to the main control module, the second inertial navigation sensor is used for detecting and obtaining a forearm gesture angle, the forearm gesture angle is sent to the main control module, and the main control module is used for determining a target intention gesture corresponding to the target arm based on the at least one air pressure information, the forearm gesture angle and the forearm gesture angle.
Optionally, the main control module includes a sending module configured to send the at least one piece of air pressure information, the forearm posture angle, and the upper-arm posture angle to a target device, so that the target device identifies the target intention gesture corresponding to the target arm based on them. The target device is configured to determine the air pressure feature corresponding to the at least one piece of air pressure information, determine the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle, and input the air pressure feature and the rotation angle features into a gesture recognition model to identify the target intention gesture corresponding to the target arm, where the gesture recognition model is trained in advance on different air pressure features and different rotation angle features.
Optionally, the main control module includes a gesture recognition module, where the gesture recognition module is specifically configured to:
determining the air pressure feature corresponding to the at least one piece of air pressure information;
determining the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle;
and inputting the air pressure feature and the rotation angle features into a gesture recognition model to recognize the target intention gesture corresponding to the target arm, where the gesture recognition model is trained in advance on different air pressure features and different rotation angle features.
Optionally, the gesture recognition module is specifically configured to:
normalizing the at least one piece of air pressure information to obtain the air pressure feature, where each piece of air pressure information is the change in air pressure of the corresponding air bag relative to its initial air pressure.
Optionally, the gesture recognition module is specifically configured to:
the gesture recognition module is used for inputting the arm gesture angle and the big arm gesture angle into an arm joint model to obtain rotation angle characteristics of four degrees of freedom output by the arm joint model, wherein the rotation angle characteristics comprise an arm rotation angle, a crank angle, a big arm forward lifting angle and a lateral lifting angle, and the arm joint model is a model which is obtained by training in advance based on different arm gesture angles and different arm gesture angles.
The above-mentioned one or more technical solutions in the embodiments of the present application at least have one or more of the following technical effects:
In the technical solution of the embodiments of the invention, the wearable device includes a first wrist structure and a second wrist structure. The first wrist structure includes a first wrist strap, a first inertial navigation sensor arranged on the first wrist strap, at least one air bag arranged on the inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence. The second wrist structure includes a second wrist strap, a second inertial navigation sensor arranged on the second wrist strap, and a main control module arranged on the second wrist strap; the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor. With the first wrist structure worn on the forearm of a target arm through the first wrist strap and the second wrist structure worn on the upper arm of the target arm through the second wrist strap, the air pressure information of each air bag is detected by its air pressure sensor, the forearm posture angle is detected by the first inertial navigation sensor, and the upper-arm posture angle is detected by the second inertial navigation sensor, so the main control module can accurately determine the target intention gesture corresponding to the target arm from the air pressure information and the two posture angles. The air pressure characteristics of the air bags accurately reflect the muscle activity state of the forearm and are not affected by factors such as muscle fatigue. Furthermore, combining the forearm posture angle and the upper-arm posture angle makes gesture recognition more accurate.
In addition, the wearable device is compact and easy to wear, and it works without contacting the skin, preventing allergy; it therefore adapts better to interaction scenarios and improves the user's interaction experience.
Drawings
Fig. 1 is a schematic wearing view of a wearable device according to a first embodiment of the present application;
Fig. 2 is a schematic view of a first wrist structure according to the first embodiment of the present application;
Fig. 3 is an expanded view of the first wrist structure according to the first embodiment of the present application;
Fig. 4 is a schematic view of an air-bag-related structure in the first wrist structure according to the first embodiment of the present application;
Fig. 5 is a schematic diagram of the working principle of detecting forearm muscle activity according to the first embodiment of the present application.
Detailed Description
The embodiments of the invention provide a gesture detection method and a wearable device, which detect the muscle activity state and arm posture of a user in real time and accurately identify the user's intended gesture in real time. The gesture detection method in this embodiment is applied to a wearable device that includes a first wrist structure and a second wrist structure. The first wrist structure includes a first wrist strap, a first inertial navigation sensor disposed on the first wrist strap, at least one air bag disposed on an inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence. The second wrist structure includes a second wrist strap, a second inertial navigation sensor disposed on the second wrist strap, and a main control module disposed on the second wrist strap; the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor. With the first wrist structure worn on the forearm of a target arm through the first wrist strap and the second wrist structure worn on the upper arm of the target arm through the second wrist strap, the method includes: detecting, by the at least one air pressure sensor, at least one piece of air pressure information corresponding one-to-one to the at least one air bag, and sending it to the main control module; detecting, by the first inertial navigation sensor, the forearm posture angle, and sending it to the main control module; detecting, by the second inertial navigation sensor, the upper-arm posture angle, and sending it to the main control module; determining, by the main control module, the air pressure feature corresponding to the at least one piece of air pressure information; determining, by the main control module, the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle; and determining the target intention gesture corresponding to the target arm based on the air pressure feature and the rotation angle features.
The technical solutions of the present invention are described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features of the embodiments of the present application are detailed descriptions of its technical solutions, not limitations of them, and that the technical features of the embodiments may be combined with each other without conflict.
The term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates that the associated objects before and after it are in an "or" relationship.
In the prior art, the human intention that a single sensor can capture is limited, which limits its applications. Multi-sensor fusion detection can combine information from multiple sensors so that different sensors complement one another in capturing human intention, from which an accurate human intention can finally be extracted. Therefore, the wearable device in this embodiment adopts multi-sensor fusion to recognize the user's gestures more accurately.
Referring to fig. 1 to 4, a wearable device according to a first embodiment of the present invention includes:
a first wrist structure 10, comprising a first wrist strap 11, a first inertial navigation sensor 12 arranged on the first wrist strap 11, at least one air bag 13 arranged on the inner wall of the first wrist strap 11, and at least one air pressure sensor 14 connected to the at least one air bag 13 in one-to-one correspondence.
The second wrist structure 20 comprises a second wrist strap 21, a second inertial navigation sensor 22 arranged on the second wrist strap 21, and a main control module 23 arranged on the second wrist strap 21, wherein the main control module 23 is connected with at least one air pressure sensor 14, the first inertial navigation sensor 12 and the second inertial navigation sensor 22.
With the first wrist structure 10 worn on the forearm of the target arm through the first wrist strap 11 and the second wrist structure 20 worn on the upper arm of the target arm through the second wrist strap 21: the at least one air pressure sensor 14 detects the air pressure information of its corresponding air bag 13, obtains at least one piece of air pressure information corresponding one-to-one to the at least one air bag, and sends it to the main control module 23; the first inertial navigation sensor 12 detects the forearm posture angle and sends it to the main control module 23; the second inertial navigation sensor 22 detects the upper-arm posture angle and sends it to the main control module 23; and the main control module 23 determines the air pressure feature corresponding to the at least one piece of air pressure information, determines the rotation angle features of four degrees of freedom corresponding to the forearm posture angle and the upper-arm posture angle, and determines the target intention gesture corresponding to the target arm based on the air pressure feature and the rotation angle features.
Specifically, fig. 1 shows how the wearable device of this embodiment is worn: the first wrist structure 10 is worn on the forearm of the target arm and the second wrist structure 20 is worn on the upper arm of the target arm. Referring to fig. 2, which shows the first wrist structure 10 of this embodiment, six air bags 13 are disposed on the inner wall of the first wrist strap 11. The first inertial navigation sensor 12, which detects the arm posture in real time, is also mounted on the first wrist strap 11. All data from the first wrist structure 10 are transferred to the main control module 23 over the data bus 15. Referring to fig. 3, the six air bags in the first wrist structure 10 are bound together by the first wrist strap 11, whose outer layer is non-deformable; the tightness of the air-bag armlet when worn on the arm can be adjusted with the buckle 16.
Referring to fig. 4, each air bag 13 is press-formed from PVC plastic and has a one-way film valve 17 at its tail for inflation. Each air bag communicates, through an air duct 18, with an air pressure sensor 14. The detection range of the air pressure sensor is 0–20 kPa above atmospheric pressure, with a corresponding output voltage of 0.5–4.5 V; the output voltage is proportional to the air pressure in the air bag.
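Given the stated linear characteristic (0.5 V at 0 kPa gauge up to 4.5 V at 20 kPa gauge), the receiving software could recover the bag pressure from the measured output voltage as in the sketch below; the function name is illustrative:

```python
def voltage_to_gauge_pressure_kpa(v_out):
    """Convert the sensor's output voltage to gauge pressure (kPa above
    atmospheric), assuming the linear 0.5-4.5 V / 0-20 kPa characteristic
    stated in the embodiment."""
    V_MIN, V_MAX = 0.5, 4.5   # output voltage range, volts
    P_MAX = 20.0              # full-scale gauge pressure, kPa
    return (v_out - V_MIN) / (V_MAX - V_MIN) * P_MAX
```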
Referring to fig. 5, fig. 5 illustrates the working principle by which the six-channel air bags in the air-bag armlet detect forearm muscle activity. When the air-bag armlet is worn with the forearm naturally relaxed, the six air bags rest on six forearm muscles, including the flexor carpi ulnaris, extensor carpi ulnaris, extensor digitorum, and flexor digitorum superficialis. As shown in the figure, the outside of each air bag is covered by the non-deforming first wrist strap 11, so when the arm muscles contract and bulge, the air bag 13 is compressed, its internal volume decreases, and its internal pressure rises. Because the air duct communicates with the air pressure sensor, this change in air pressure can be detected, and the force of the corresponding muscle can be measured through the proportional linear relationship between pressure and output voltage.
In a specific implementation, the number of air bags arranged on the inner wall of the first wrist strap may be set according to actual needs, for example 2, 4, and so on; this embodiment is not limited in this respect.
Specifically, in this embodiment, the main control module 23 in the second wrist structure may use an STM32F104 embedded microprocessor to receive, in real time, the data detected by the air pressure sensors and the data from the first inertial navigation sensor 12 and the second inertial navigation sensor 22, that is, the at least one piece of air pressure information, the forearm posture angle, and the upper-arm posture angle. The main control module 23 further includes a sending module, which may be a wireless transmitting module. The main control module 23 packages the data at 200 Hz and transfers it over an SPI interface to an nRF24L01 wireless transmitting module, which transmits the data to the wireless receiving module of the target device over a 2.4 GHz radio-frequency link.
The working flow of the main control module 23 provided in this embodiment is described as follows:
(1) After the main control module 23 is powered on via the power switch, the system starts to operate and is first initialized; initialization includes pin definition, clock initialization, interrupt initialization, and so on.
(2) After system initialization, the DMA mode of AD conversion is started. In this mode, the processor cyclically measures the voltage values of the air pressure sensors in the background; each time the air pressure information of the six-channel air bags is sampled, it is stored in a register for convenient reading.
(3) After system initialization, the serial-port interrupt is enabled. The first and second inertial navigation sensors send their respective posture angles to the main controller through the serial port at 200 Hz; after the posture information is received in the interrupt, the main controller disables the interrupt and stores the data in the posture register. The first and second inertial navigation sensors are GY953 inertial navigation sensors; in a specific implementation, their models may be chosen according to actual needs, and this embodiment is not limited in this respect.
(4) After system initialization, another thread starts a timer with a period of 5 ms. When the timer expires, the posture information (comprising the forearm posture angle and the upper-arm posture angle) and the air pressure information (comprising the voltages corresponding to the six air bags) are encapsulated into a frame and sent to the target device through the sending module, while the acquired information is simultaneously shown on a display screen.
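Step (4)'s 5 ms framing might be sketched as follows. The text fixes only the 5 ms (200 Hz) period, not the wire format, so the one-byte header and the little-endian float32 layout below are assumptions:

```python
import struct

# Assumed frame layout: 1-byte header, then six posture angles (two IMUs x
# roll/pitch/yaw) and six air-bag voltages, all little-endian float32.
FRAME_FMT = "<B6f6f"

def pack_frame(forearm_rpy, upper_arm_rpy, bag_voltages, header=0xA5):
    """Encapsulate one 5 ms sample into a frame for the wireless link."""
    return struct.pack(FRAME_FMT, header, *forearm_rpy, *upper_arm_rpy,
                       *bag_voltages)

def unpack_frame(frame):
    """Inverse of pack_frame, as the target device's receiver would run."""
    fields = struct.unpack(FRAME_FMT, frame)
    return {
        "header": fields[0],
        "forearm_rpy": fields[1:4],
        "upper_arm_rpy": fields[4:7],
        "bag_voltages": fields[7:13],
    }
```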
In this embodiment, the flow of receiving data and recognizing gestures by the target device is as follows:
(1) After the system starts, the wireless receiving module receives the data frame transmitted by the main control module 23.
(2) The data frame is split into two parts: the air pressure values of the six channels, and the posture angles measured by the two inertial navigation sensors.
(3) The initial value is subtracted from each of the six channels' air pressure values to obtain the air pressure change values, which are then normalized to obtain the air pressure features.
(4) The arm joints are modeled to obtain an arm joint model. The posture angles from the two inertial navigation sensors are filtered and converted, through the arm joint model, into rotation angles of the four degrees of freedom of the upper arm and forearm: the forearm rotation angle, the elbow angle, the upper-arm forward-lift angle, and the upper-arm lateral-lift angle.
(5) The arm posture information and the six-channel air-bag pressure information of the forearm are fed into a trained gesture recognition model, which converts the posture features and air pressure features into the corresponding gesture. The gesture recognition model is trained in advance on different air pressure features and different rotation angle features collected over many trials.
(6) The recognized gesture can be displayed in real time on the display interface.
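The model of step (5) can be illustrated end to end with a toy classifier. The patent does not disclose the model type, so the nearest-centroid classifier, the 10-dimensional feature layout (six normalized pressures plus four joint angles), and the gesture labels below are all hypothetical:

```python
import numpy as np

class GestureModel:
    """Minimal nearest-centroid stand-in for the trained gesture
    recognition model (the real model type is not specified)."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.labels_ = sorted(set(y))
        self.centroids_ = np.stack(
            [X[y == label].mean(axis=0) for label in self.labels_])
        return self

    def predict(self, x):
        dists = np.linalg.norm(self.centroids_ - np.asarray(x, dtype=float),
                               axis=1)
        return self.labels_[int(np.argmin(dists))]

# Hypothetical training data: 6 pressure features + 4 joint angles per row.
X_train = [[1.0, 0, 0, 0, 0, 0, 10, 60, 5, 0],    # "fist"
           [0, 0, 0, 0, 0, 1.0, 0, 90, 0, 10]]    # "open"
model = GestureModel().fit(X_train, ["fist", "open"])
pred = model.predict([0.9, 0, 0, 0, 0, 0.1, 12, 62, 4, 1])
```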
Further, in this embodiment, the gesture recognition work may be completed by a gesture recognition module in the main control module. Specifically, the gesture recognition module determines the air pressure features corresponding to the at least one piece of air pressure information by normalizing it, where each piece of air pressure information is the change in the corresponding air bag's pressure relative to its initial pressure. The gesture recognition module also determines the rotation angle features of the four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle: it inputs the two attitude angles into an arm joint model and obtains the four rotation angle features output by the model, namely the forearm rotation angle, the elbow angle, the upper-arm forward-lift angle, and the upper-arm lateral-lift angle, where the arm joint model is trained in advance based on different forearm attitude angles and different upper-arm attitude angles. Finally, the gesture recognition module inputs the air pressure features and the rotation angle features into a gesture recognition model and recognizes the target intention gesture corresponding to the target arm, where the gesture recognition model is trained in advance based on different air pressure features and different rotation angle features.
The wearable device provided in this embodiment can measure forearm muscle activity and arm posture in real time. The device is compact and easy to wear, and it does not contact the skin during use, which prevents allergic reactions; it therefore adapts better to interaction scenarios and improves the user's interactive experience. Moreover, detecting muscle activity through air bags is a novel sensing approach that is not affected by factors such as muscle fatigue, and combining the forearm attitude angle with the upper-arm attitude angle makes the recognized gesture intention more accurate.
A second embodiment of the present invention provides a gesture detection method applied to the wearable device of the foregoing first embodiment. The wearable device includes a first wrist structure and a second wrist structure. The first wrist structure includes a first wrist strap, a first inertial navigation sensor disposed on the first wrist strap, at least one air bag disposed on an inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence. The second wrist structure includes a second wrist strap, a second inertial navigation sensor disposed on the second wrist strap, and a main control module disposed on the second wrist strap; the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor. The first wrist structure is worn on the forearm of a target arm through the first wrist strap, and the second wrist structure is worn on the upper arm of the target arm through the second wrist strap. The method includes:
detecting, by the at least one air pressure sensor, at least one piece of air pressure information corresponding to the at least one air bag, and sending the at least one piece of air pressure information to the main control module;
detecting, by the first inertial navigation sensor, a forearm attitude angle, and sending the forearm attitude angle to the main control module;
detecting, by the second inertial navigation sensor, an upper-arm attitude angle, and sending the upper-arm attitude angle to the main control module; and
determining, by the main control module, a target intention gesture corresponding to the target arm.
Further, in this embodiment, the main control module includes a gesture recognition module, and the determining, by the main control module, the target intention gesture corresponding to the target arm includes:
determining, by the gesture recognition module, air pressure features corresponding to the at least one piece of air pressure information;
determining, by the gesture recognition module, rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle; and
inputting, by the gesture recognition module, the air pressure features and the rotation angle features into a gesture recognition model, and then recognizing a target intention gesture corresponding to the target arm, wherein the gesture recognition model is trained in advance based on different air pressure features and different rotation angle features.
Further, in this embodiment, determining, by the gesture recognition module, the air pressure features corresponding to the at least one piece of air pressure information includes:
normalizing, by the gesture recognition module, the at least one piece of air pressure information to obtain the air pressure features, wherein each piece of air pressure information is the change in the corresponding air bag's pressure relative to its initial pressure.
Further, in this embodiment, determining, by the gesture recognition module, the rotation angle features of the four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle includes:
inputting, by the gesture recognition module, the forearm attitude angle and the upper-arm attitude angle into an arm joint model to obtain the rotation angle features of the four degrees of freedom output by the arm joint model, wherein the rotation angle features include the forearm rotation angle, the elbow angle, the upper-arm forward-lift angle, and the upper-arm lateral-lift angle, and the arm joint model is trained in advance based on different forearm attitude angles and different upper-arm attitude angles.
Further, in this embodiment, the main control module includes a sending module, and determining, by the main control module, the target intention gesture corresponding to the target arm includes:
sending, by the sending module, the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle to a target device, so that the target device recognizes the target intention gesture corresponding to the target arm based on the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle, wherein the target device is configured to determine air pressure features corresponding to the at least one piece of air pressure information, determine rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle, input the air pressure features and the rotation angle features into a gesture recognition model, and then recognize the target intention gesture corresponding to the target arm, the gesture recognition model being trained in advance based on different air pressure features and different rotation angle features.
The specific implementation flow of the gesture detection method in this embodiment is described in detail in the first embodiment, and reference may be made to the corresponding content in the first embodiment, which is not described herein.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (6)

1. A gesture detection method, applied to a wearable device, wherein the wearable device comprises a first wrist structure and a second wrist structure; the first wrist structure comprises a first wrist strap, a first inertial navigation sensor disposed on the first wrist strap, at least one air bag disposed on an inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence; the second wrist structure comprises a second wrist strap, a second inertial navigation sensor disposed on the second wrist strap, and a main control module disposed on the second wrist strap; and the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor; the method comprising:
detecting, by the at least one air pressure sensor, at least one piece of air pressure information corresponding to the at least one air bag, and sending the at least one piece of air pressure information to the main control module;
detecting, by the first inertial navigation sensor, a forearm attitude angle, and sending the forearm attitude angle to the main control module;
detecting, by the second inertial navigation sensor, an upper-arm attitude angle, and sending the upper-arm attitude angle to the main control module; and
determining, by the main control module, a target intention gesture corresponding to the target arm;
wherein the main control module comprises a gesture recognition module, and determining, by the main control module, the target intention gesture corresponding to the target arm comprises:
determining, by the gesture recognition module, air pressure features corresponding to the at least one piece of air pressure information;
determining, by the gesture recognition module, rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle; and
inputting, by the gesture recognition module, the air pressure features and the rotation angle features into a gesture recognition model, and then recognizing the target intention gesture corresponding to the target arm, wherein the gesture recognition model is trained in advance based on different air pressure features and different rotation angle features;
wherein determining, by the gesture recognition module, the rotation angle features of the four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle comprises:
inputting, by the gesture recognition module, the forearm attitude angle and the upper-arm attitude angle into an arm joint model to obtain the rotation angle features of the four degrees of freedom output by the arm joint model, wherein the rotation angle features comprise a forearm rotation angle, an elbow angle, an upper-arm forward-lift angle, and an upper-arm lateral-lift angle, and the arm joint model is trained in advance based on different forearm attitude angles and different upper-arm attitude angles.
2. The method of claim 1, wherein determining, by the gesture recognition module, the air pressure features corresponding to the at least one piece of air pressure information comprises:
normalizing, by the gesture recognition module, the at least one piece of air pressure information to obtain the air pressure features, wherein each piece of air pressure information is the change in the corresponding air bag's pressure relative to its initial pressure.
3. The method of claim 1, wherein the main control module comprises a sending module, and determining, by the main control module, the target intention gesture corresponding to the target arm comprises:
sending, by the sending module, the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle to a target device, so that the target device recognizes the target intention gesture corresponding to the target arm based on the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle, wherein the target device is configured to determine air pressure features corresponding to the at least one piece of air pressure information, determine rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle, input the air pressure features and the rotation angle features into a gesture recognition model, and then recognize the target intention gesture corresponding to the target arm, the gesture recognition model being trained in advance based on different air pressure features and different rotation angle features.
4. A wearable device, comprising:
a first wrist structure, comprising a first wrist strap, a first inertial navigation sensor disposed on the first wrist strap, at least one air bag disposed on an inner wall of the first wrist strap, and at least one air pressure sensor connected to the at least one air bag in one-to-one correspondence; and
a second wrist structure, comprising a second wrist strap, a second inertial navigation sensor disposed on the second wrist strap, and a main control module disposed on the second wrist strap, wherein the main control module is connected to the at least one air pressure sensor, the first inertial navigation sensor, and the second inertial navigation sensor;
wherein the first wrist structure is worn on a forearm of a target arm through the first wrist strap, and the second wrist structure is worn on an upper arm of the target arm through the second wrist strap; the at least one air pressure sensor is configured to detect air pressure information of the corresponding air bag, obtain at least one piece of air pressure information in one-to-one correspondence with the at least one air bag, and send the at least one piece of air pressure information to the main control module; the first inertial navigation sensor is configured to detect a forearm attitude angle and send it to the main control module; the second inertial navigation sensor is configured to detect an upper-arm attitude angle and send it to the main control module; and the main control module is configured to determine a target intention gesture corresponding to the target arm based on the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle;
wherein the main control module comprises a gesture recognition module, and the gesture recognition module is specifically configured to:
determine air pressure features corresponding to the at least one piece of air pressure information;
determine rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle; and
input the air pressure features and the rotation angle features into a gesture recognition model, and then recognize the target intention gesture corresponding to the target arm, wherein the gesture recognition model is trained in advance based on different air pressure features and different rotation angle features;
wherein, to determine the rotation angle features, the gesture recognition module is specifically configured to:
input the forearm attitude angle and the upper-arm attitude angle into an arm joint model to obtain the rotation angle features of the four degrees of freedom output by the arm joint model, wherein the rotation angle features comprise a forearm rotation angle, an elbow angle, an upper-arm forward-lift angle, and an upper-arm lateral-lift angle, and the arm joint model is trained in advance based on different forearm attitude angles and different upper-arm attitude angles.
5. The wearable device of claim 4, wherein the main control module comprises a sending module configured to send the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle to a target device, so that the target device recognizes the target intention gesture corresponding to the target arm based on the at least one piece of air pressure information, the forearm attitude angle, and the upper-arm attitude angle, wherein the target device is configured to determine air pressure features corresponding to the at least one piece of air pressure information, determine rotation angle features of four degrees of freedom corresponding to the forearm attitude angle and the upper-arm attitude angle, input the air pressure features and the rotation angle features into a gesture recognition model, and then recognize the target intention gesture corresponding to the target arm, the gesture recognition model being trained in advance based on different air pressure features and different rotation angle features.
6. The wearable device of claim 4, wherein the gesture recognition module is specifically configured to:
normalize the at least one piece of air pressure information to obtain the air pressure features, wherein each piece of air pressure information is the change in the corresponding air bag's pressure relative to its initial pressure.
CN201911247266.5A 2019-12-09 2019-12-09 Gesture detection method and wearable device Active CN111061368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911247266.5A CN111061368B (en) 2019-12-09 2019-12-09 Gesture detection method and wearable device


Publications (2)

Publication Number Publication Date
CN111061368A CN111061368A (en) 2020-04-24
CN111061368B true CN111061368B (en) 2023-06-27

Family

ID=70299934

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911247266.5A Active CN111061368B (en) 2019-12-09 2019-12-09 Gesture detection method and wearable device

Country Status (1)

Country Link
CN (1) CN111061368B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023220775A1 (en) * 2022-05-16 2023-11-23 University Of Wollongong PRESSURE-BASED FORCE MYOGRAPHY (pFMG) SYSTEM FOR DETERMINING BODY MOVEMENT

Citations (8)

Publication number Priority date Publication date Assignee Title
WO2015033327A1 (en) * 2013-09-09 2015-03-12 Belfiori Alfredo Wearable controller for wrist
CN105190578A (en) * 2013-02-22 2015-12-23 赛尔米克实验室公司 Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control
TW201606573A (en) * 2014-04-18 2016-02-16 英特爾股份有限公司 Techniques for improved wearable computing device gesture based interactions
KR101696593B1 (en) * 2015-07-09 2017-01-16 현대자동차주식회사 Input apparatus, vehicle comprising the same and control method for the vehicle
CN108042142A (en) * 2017-12-14 2018-05-18 华中科技大学 A kind of wearable human body attitude detection and myodynamia measuring system
CN108509024A (en) * 2018-01-25 2018-09-07 北京奇艺世纪科技有限公司 A kind of data processing method and device based on virtual reality device
CN109771905A (en) * 2019-01-25 2019-05-21 北京航空航天大学 Virtual reality interactive training restoring gloves based on touch driving
WO2019108880A1 (en) * 2017-11-30 2019-06-06 Ctrl-Labs Corporation Methods and apparatus for simultaneous detection of discrete and continuous gestures

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US10528135B2 (en) * 2013-01-14 2020-01-07 Ctrl-Labs Corporation Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
US10782790B2 (en) * 2015-12-22 2020-09-22 Intel Corporation System and method to collect gesture input through wrist tendon and muscle sensing
US10449672B2 (en) * 2016-03-14 2019-10-22 California Institute Of Technology Wearable electromyography sensor array using conductive cloth electrodes for human-robot interactions
CN105824414A (en) * 2016-03-14 2016-08-03 北京诺亦腾科技有限公司 Motion capturing glove for virtual reality system and virtual reality system
CN110072678A (en) * 2016-09-14 2019-07-30 奥尔堡大学 The mankind for moving auxiliary are intended to detection system
CN109521784B (en) * 2018-12-13 2021-05-11 华南农业大学 Touch sensing type wearable upper limb exoskeleton unmanned aerial vehicle control system and method

Patent Citations (8)

Publication number Priority date Publication date Assignee Title
CN105190578A (en) * 2013-02-22 2015-12-23 赛尔米克实验室公司 Methods and devices that combine muscle activity sensor signals and inertial sensor signals for gesture-based control
WO2015033327A1 (en) * 2013-09-09 2015-03-12 Belfiori Alfredo Wearable controller for wrist
TW201606573A (en) * 2014-04-18 2016-02-16 英特爾股份有限公司 Techniques for improved wearable computing device gesture based interactions
KR101696593B1 (en) * 2015-07-09 2017-01-16 현대자동차주식회사 Input apparatus, vehicle comprising the same and control method for the vehicle
WO2019108880A1 (en) * 2017-11-30 2019-06-06 Ctrl-Labs Corporation Methods and apparatus for simultaneous detection of discrete and continuous gestures
CN108042142A (en) * 2017-12-14 2018-05-18 华中科技大学 A kind of wearable human body attitude detection and myodynamia measuring system
CN108509024A (en) * 2018-01-25 2018-09-07 北京奇艺世纪科技有限公司 A kind of data processing method and device based on virtual reality device
CN109771905A (en) * 2019-01-25 2019-05-21 北京航空航天大学 Virtual reality interactive training restoring gloves based on touch driving

Non-Patent Citations (2)

Title
"Hand/arm gesture segmentation by motion using IMU and EMG sensing"; Joao Lopes; Procedia Manufacturing; full text *
"Research on Robust Control of a Hand-Function Rehabilitation Robot" (手功能康复机器人的鲁棒控制研究); Xing Kexin; Computing Technology and Automation; full text *

Also Published As

Publication number Publication date
CN111061368A (en) 2020-04-24

Similar Documents

Publication Publication Date Title
EP3843617B1 (en) Camera-guided interpretation of neuromuscular signals
EP2875778B1 (en) Wearable mobile device and method of measuring biological signal with the same
JP2021072136A (en) Methods and devices for combining muscle activity sensor signals and inertial sensor signals for gesture-based control
US9278453B2 (en) Biosleeve human-machine interface
US20170296363A1 (en) Systems, apparatuses and methods for controlling prosthetic devices by gestures and other modalities
CN104317403B (en) A kind of wearable device for Sign Language Recognition
CN111158494B (en) Posture correction device and posture correction method
CN105575219A (en) Intelligent glove
CN203417440U (en) Composite sensing system for wearable pneumatic lower limb rehabilitation robot
CN103654774A (en) Wearable movable bracelet
WO2004114107A1 (en) Human-assistive wearable audio-visual inter-communication apparatus.
CN110618754B (en) Surface electromyogram signal-based gesture recognition method and gesture recognition armband
CN107943285B (en) Man-machine interaction wrist ring, system and method based on biological myoelectricity
Yang et al. Experimental study of an EMG-controlled 5-DOF anthropomorphic prosthetic hand for motion restoration
CN110537921A (en) Portable gait multi-sensing data acquisition system
CN106020490B (en) Multi-contact data glove system based on three axle Gravity accelerometers
CN104571837A (en) Method and system for realizing human-computer interaction
CN105014676A (en) Robot motion control method
CN109498375B (en) Human motion intention recognition control device and control method
CN110908515A (en) Gesture recognition method and device based on wrist muscle pressure
CN111061368B (en) Gesture detection method and wearable device
CN108042142A (en) A kind of wearable human body attitude detection and myodynamia measuring system
CN203552178U (en) Wrist strip type hand motion identification device
CN218793768U (en) Body-building action detection system combining body-building equipment and auxiliary wearable device
CN106020442A (en) Sensing method for intelligent sensing glove

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant