CN111399640A - Multi-mode man-machine interaction control method for flexible arm - Google Patents
- Publication number
- CN111399640A (application CN202010148713.8A)
- Authority
- CN
- China
- Prior art keywords
- flexible arm
- imu
- control method
- data
- interaction control
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/087—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices for sensing other physical parameters, e.g. electrical or chemical properties
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0081—Programme-controlled manipulators with master teach-in means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1635—Programme controls characterised by the control loop flexible-arm control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
Abstract
The invention relates to the technical fields of man-machine interaction, artificial intelligence and pattern recognition, and in particular to a multi-mode man-machine interaction control method for a flexible arm. The control system comprises a sensor, an upper computer and a flexible arm, and control proceeds through the following steps: step one, collecting electromyographic signal data of the controller and preprocessing the electromyographic signals; step two, extracting features from the preprocessed electromyographic signals to train a classifier model; step three, the controller wears the sensors and calibrates them; step four, the upper computer loads the classifier model trained in step two and dynamically parses the electromyographic signals and IMU data with a detection algorithm to obtain joint angles; and step five, inputting the joint angles into a proportional-derivative controller to complete the control of the flexible arm.
Description
Technical Field
The invention relates to the technical fields of man-machine interaction, artificial intelligence and pattern recognition, and in particular to a multi-mode man-machine interaction control method for a flexible arm.
Background
With the continuous improvement of industrial automation, industrial robotic arms are used ever more widely. At the same time, because production and lifestyles are changing rapidly, traditional industrial robotic arms can no longer meet production demands for high flexibility, intelligence and compliance. A new generation of flexible robotic arms has therefore emerged on the market, and with it, man-machine interaction interfaces for flexible arms that recognize the motion intention of the human upper limb have been proposed in succession.
A surface electromyography (sEMG) signal is a non-invasive electrical signal collected from the surface of the skin; as the superposition of the electrical signals generated by many motor units, it reflects human movement intention well. An inertial measurement unit (IMU) can estimate the attitude, position and other state information of a system from nine-axis data (acceleration, angular velocity and magnetic field strength along the x, y and z axes) without depending on external reference information. Because sEMG and IMU signals are easy to acquire and non-invasive, control schemes mixing the two signals have gradually developed into a new man-machine interaction control interface.
In recent years, research groups at home and abroad have worked on parsing sEMG and IMU signals into joint angles for robotic-arm control. The Keehon research group of the Korean academy of sciences, using 4 sEMG sensors and 3 IMU sensors and after steps of sensor calibration, model training and the like, achieved good interaction with a 6-degree-of-freedom K-arm. However, the complicated sensor placement, long training time and low computational efficiency of such methods make them difficult to apply in real industrial production environments.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a multi-mode man-machine interaction control method for a flexible arm that addresses two problems of existing methods: the large number of sensors required and the long training time.
In order to achieve the purpose, the invention adopts the following technical scheme:
A multi-mode man-machine interaction control method for a flexible arm uses a sensor, an upper computer and the flexible arm, and comprises the following control steps:
step one, collecting electromyographic signal data of the controller and preprocessing the electromyographic signals;
step two, extracting features from the preprocessed electromyographic signals to train a classifier model;
step three, the controller wears the sensor and calibrates the sensor;
step four, loading the classifier model trained in the step two by the upper computer, and dynamically analyzing the electromyographic signals and the IMU data by adopting a detection algorithm to obtain joint angles;
and step five, inputting the joint angles into a proportional-derivative controller to complete the control of the flexible arm.
In a further refinement of the technical scheme, step one collects electromyographic signal data of the controller's gestures and measures the maximum electromyographic signal amplitude.
In a further refinement, the electromyographic signal preprocessing in step one comprises extracting active-segment data and low-pass filtering with a Butterworth filter.
In a further refinement, step two comprises extracting and normalizing feature vectors of the electromyographic signals, inputting them into a neural network, and training the model with the Levenberg optimization algorithm.
In a further refinement, three features are extracted and normalized: the mean absolute value, the number of zero crossings and the number of slope sign changes.
In a further refinement, step four extracts the electromyographic active segment with a double-threshold detection algorithm, preprocesses the active-segment data and extracts its features, and inputs them into the classifier model for recognition.
In a further refinement, the IMU data in step four are interpreted with an IMU solution model.
In a further refinement, the quaternion data [x, y, z, w] transmitted from the IMU to the upper computer are converted into a roll angle (Roll), a pitch angle (Pitch) and a yaw angle (Yaw); the yaw angle, for example, is obtained as

yaw = arcsin(2xy + 2zw)

The angles are then unwrapped according to rules expressed in terms of IMU_n, ΔIMU_n and Δsgn_n, wherein IMU_n denotes the Euler-angle triple (roll_n, pitch_n, yaw_n) at time n, ΔIMU_n = IMU_n − IMU_{n−1}, and Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1}).
In a further refinement, the control of the flexible arm in step five is as follows: the measured joint angle position vector Θ is solved, and its difference from the desired joint angle vector Θ* obtained in step four forms the feedback error e = Θ* − Θ. This error is input to a proportional-derivative controller, which outputs the torque control signal τ = Kp·e + Kd·de/dt + g(Θ) + f(dΘ/dt), wherein Kp and Kd are the proportional and derivative parameters of the controller, g(Θ) is the gravity compensation term, and f(dΘ/dt) denotes the friction compensation term.
Compared with the prior art, the technical scheme has the following advantages: only a small number of sensors are needed to control the 8 degrees of freedom of the flexible arm (3 in the shoulder, 2 in the elbow and 2 in the wrist, plus 1 for the end gripper), and the sensors are simple to wear and convenient to use; the model is simple, the training time short, the computation efficient and the response fast. The invention can be applied effectively in production environments built around flexible-arm operation and has an excellent application prospect.
Drawings
FIG. 1 is a flow chart of a multimodal human-machine interaction control method;
FIG. 2 is a schematic diagram of sensor wear;
FIG. 3 is a schematic diagram of the MAV dual threshold detection algorithm;
FIG. 4 is a schematic view of the flexible arm joints.
Detailed Description
To explain in detail the technical content, structural features, objects and effects of the technical solution, a detailed description is given below with reference to the accompanying drawings and embodiments.
Referring to FIG. 1, in a preferred embodiment of the invention, the complete control system of the multi-mode man-machine interaction control method for a flexible arm comprises 1 myoelectric bracelet, 1 nine-axis IMU sensor, 1 upper computer and a flexible arm.
The control method comprises the following steps:
the method comprises the following steps: selecting an 8-channel electromyography bracelet with the sampling frequency of 200HZ and a nine-axis IMU sensor, selecting 5 gestures of fist making, opening, inversion, eversion and splayed hands, respectively collecting 120 seconds of sEMG data, and having 5 seconds of rest time between every 5 seconds of actions in the collection process. And respectively counting the maximum myoelectric amplitude in the process of inversion and eversion, and recording as EMVC。
Step two: extract the data segments of the action periods according to the acquisition paradigm's rule of 5 seconds of action followed by 5 seconds of rest, and low-pass filter the extracted action segments with a Butterworth filter with a 45 Hz cutoff.
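As an illustration of the filtering in step two, here is a minimal sketch of 45 Hz Butterworth low-pass filtering, assuming SciPy is available and the 200 Hz sampling rate given above; the filter order is an assumption, since the patent does not state one.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_semg(x, fs=200.0, cutoff=45.0, order=4):
    """Zero-phase Butterworth low-pass filter for one sEMG channel.

    fs and cutoff follow the text (200 Hz sampling, 45 Hz cutoff);
    the 4th-order choice is an assumption.
    """
    b, a = butter(order, cutoff / (0.5 * fs), btype="low")
    return filtfilt(b, a, x)  # filtfilt filters forward and backward, avoiding phase lag
```

Zero-phase filtering (`filtfilt`) keeps the filtered activity aligned in time with the raw signal, which matters when active segments are later cut out by timestamp.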
Step three: set the sliding-window size to 250 ms (L = 50 samples at 200 Hz) and the increment to 50 ms. Within each sliding window, extract three features per channel and normalize them: the mean absolute value (MAV), the number of zero crossings (ZC) and the number of slope sign changes (SSC). In the formulas below, i denotes the ith channel, L the window length, k the kth sample in the window, x_{i,k} the kth sample of channel i, and sgn() the sign function.

The MAV is the mean of the absolute sEMG values within the window:

MAV_i = (1/L) Σ_{k=1}^{L} |x_{i,k}|, where i = 1, 2, ..., 8 and L = 50.

The ZC count within the window is incremented by 1 whenever two adjacent sEMG samples have opposite signs and the absolute value of their difference is at least a threshold ε:

sgn(x_{i,k}) ≠ sgn(x_{i,k+1}) and |x_{i,k} − x_{i,k+1}| ≥ ε.

The SSC count within the window is incremented by 1 whenever three consecutive sEMG samples satisfy

(x_{i,k} − x_{i,k−1}) · (x_{i,k} − x_{i,k+1}) ≥ ε.
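The three window features above can be sketched as follows; this is a minimal illustration, and the threshold value `eps` for ZC and SSC is a free parameter not specified in the text.

```python
import numpy as np

def window_features(x, eps=0.01):
    """MAV, ZC, SSC for one window of one channel (1-D array x).

    With the text's parameters, a window is 50 samples (250 ms at 200 Hz).
    eps is the noise threshold for ZC/SSC (value assumed, not given in the text).
    """
    mav = float(np.mean(np.abs(x)))
    # ZC: adjacent samples with opposite signs and a jump of at least eps
    zc = int(np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) >= eps)))
    # SSC: (x_k - x_{k-1}) * (x_k - x_{k+1}) >= eps for interior samples
    ssc = int(np.sum((x[1:-1] - x[:-2]) * (x[1:-1] - x[2:]) >= eps))
    return mav, zc, ssc
```

Running this over every channel of a window and concatenating the results yields the 24-dimensional feature vector used later.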
Since 3 features are extracted for each of the 8 channels, the feature vector input to the classifier has 24 dimensions, i.e. the neural network has 24 input-layer nodes. Before being input to the network, each feature dimension is also normalized as

x̂ = (x − μ) / σ

where μ is the mean of the feature and σ its standard deviation.
Using grid search, 5-fold cross-validation is performed over the 12 parameter combinations of learning rate [0.1, 0.01, 0.001, 0.0001] and hidden-layer node count [50, 100, 150]; the final parameters are a learning rate of 0.1 and 100 hidden nodes. The neural network is trained with the Levenberg optimization algorithm and the model is saved; experimental statistics show a training time of less than 1 second.
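The normalization and classifier training described above can be sketched with scikit-learn's `MLPClassifier` as a stand-in; note that scikit-learn does not provide the Levenberg optimizer, so L-BFGS is substituted here, and only the 24-input, 100-hidden-node shape follows the text.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def zscore_fit(F):
    """Per-dimension normalization parameters for an (n_samples, 24) feature matrix."""
    mu, sigma = F.mean(axis=0), F.std(axis=0)
    sigma[sigma == 0] = 1.0               # guard against constant features
    return mu, sigma

def zscore_apply(F, mu, sigma):
    return (F - mu) / sigma               # x_hat = (x - mu) / sigma

def train_gesture_classifier(F, labels):
    """Train a 24-input, 100-hidden-node MLP on z-scored features.

    L-BFGS replaces the Levenberg optimizer, which scikit-learn lacks.
    """
    mu, sigma = zscore_fit(F)
    clf = MLPClassifier(hidden_layer_sizes=(100,), solver="lbfgs", max_iter=500)
    clf.fit(zscore_apply(F, mu, sigma), labels)
    return clf, mu, sigma
```

The saved `mu` and `sigma` must be reused at prediction time so that online windows are normalized exactly as the training data were.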
Step four: referring to FIG. 2, the myoelectric bracelet is worn on the forearm over the muscle group of the finger and wrist extensors, brachioradialis, and elbow and finger flexors (about 5 cm from the elbow joint), and the nine-axis IMU sensor is worn on the acromion. After the sensors are worn correctly, guided by the angle data output on the screen, the forearm and upper arm are moved so that the IMU sensors (the IMU built into the myoelectric bracelet and the nine-axis IMU on the shoulder) simultaneously reach their zero positions.
Step five: monitor the sEMG and IMU data streams and extract the sEMG signal active segments with the designed MAV double-threshold detection algorithm. Specifically, take the channel-averaged MAV of the last 100 ms of the 8-channel sEMG,

MAV(x) = (1/N) Σ_{i=1}^{N} MAV_i, with N = 8 and L = 20,

and set an onset threshold TH_onset and an offset threshold TH_offset. If MAV(x) > TH_onset, the signal is considered to enter an sEMG active segment; if MAV(x) < TH_offset, it is considered to leave the active segment. The detection algorithm is shown schematically in FIG. 3.
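The double-threshold (hysteresis) segmentation can be sketched as follows, operating on a stream of channel-averaged MAV values; the threshold values used in the example are illustrative assumptions.

```python
import numpy as np

def detect_active(mav_stream, th_on, th_off):
    """Hysteresis (double-threshold) segmentation of a channel-averaged MAV stream.

    Returns a boolean array: True while inside an sEMG active segment.
    th_on > th_off gives hysteresis, so brief dips do not end a segment.
    """
    active = np.zeros(len(mav_stream), dtype=bool)
    state = False
    for k, m in enumerate(mav_stream):
        if not state and m > th_on:
            state = True          # entered an active segment
        elif state and m < th_off:
            state = False         # left the active segment
        active[k] = state
    return active
```

Keeping TH_onset above TH_offset is what prevents chattering around a single threshold, which is the point of the double-threshold design.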
The active-segment data are preprocessed as in step two and their features extracted as in step three, then input to the classifier with saved parameters, which outputs a prediction. The predicted gestures map to instructions as follows:
Gesture | Instruction meaning
---|---
Fist | Close the end gripper
Open hand | Open the end gripper
Inversion | Rotate the 6th joint clockwise by an angle
Eversion | Rotate the 6th joint counterclockwise by an angle
Splayed hand | Return the end gripper and the 6th joint to the zero state
The rotation angle θ of the 6th joint is linear in the sEMG amplitude E_sEMG, θ = θ_max · E_sEMG / E_MVC, wherein θ_max is the limit angle of the 6th joint and E_MVC is the maximum amplitude recorded in step one.
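Assuming the linear relationship normalizes by the E_MVC amplitude recorded in step one (a plausible reading; the source formula itself is not reproduced in the text), the mapping can be sketched as:

```python
def joint6_angle(e_semg, e_mvc, theta_max):
    """Map sEMG amplitude linearly to a joint-6 angle.

    theta = theta_max * E_sEMG / E_MVC, clipped to [0, theta_max];
    the clipping guards against amplitudes above the recorded maximum.
    """
    return max(0.0, min(theta_max, theta_max * e_semg / e_mvc))
```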
The quaternion data [x, y, z, w] transmitted from the IMU to the upper computer are converted into a roll angle (Roll), a pitch angle (Pitch) and a yaw angle (Yaw); the yaw angle, for example, is obtained as

yaw = arcsin(2xy + 2zw)

Because the inverse trigonometric functions return only principal values, the angles are further unwrapped to avoid the discontinuity between −180° and 180°, according to rules expressed in terms of the following quantities: IMU_n denotes the Euler-angle triple (roll_n, pitch_n, yaw_n) at time n, ΔIMU_n = IMU_n − IMU_{n−1}, and Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1}).
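A sketch of the conversion and unwrapping follows. Only the yaw formula appears in the text; the roll and pitch expressions below use one common quaternion-to-Euler convention and are assumptions, as is the ±180° unwrapping rule reconstructed from the description.

```python
import numpy as np

def quat_to_euler(x, y, z, w):
    """Quaternion [x, y, z, w] to (roll, pitch, yaw) in degrees.

    yaw = arcsin(2xy + 2zw) follows the text; roll and pitch use a
    common aerospace convention and are assumptions.
    """
    roll = np.degrees(np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    pitch = np.degrees(np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0)))
    yaw = np.degrees(np.arcsin(np.clip(2 * (x * y + z * w), -1.0, 1.0)))
    return roll, pitch, yaw

def unwrap_deg(prev, cur):
    """Keep successive angles continuous across the +/-180 degree seam."""
    d = cur - prev
    if d > 180.0:
        cur -= 360.0
    elif d < -180.0:
        cur += 360.0
    return cur
```

The unwrapping plays the role of the Δsgn-based data-warping rule: a sign flip combined with a large jump indicates a seam crossing rather than a real 340° rotation.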
The pitch angle resolved from the nine-axis IMU sensor data corresponds to the 1st joint angle, the yaw angle to the 2nd joint angle, and the roll angle to the 3rd joint angle; the pitch angle resolved from the myoelectric bracelet's IMU data corresponds to the 4th joint angle, the roll angle to the 5th joint angle, and the yaw angle to the 7th joint angle. The ith joint angle is denoted θ_i, i = 1, 2, ..., 7, and θ_1 through θ_7 form the 7-dimensional desired joint vector Θ*. The positions of the joints of the flexible arm in this embodiment are shown in FIG. 4.
Step six: obtain the current motor rotation angles from the encoders and compute the joint angle position vector Θ from the motor-to-joint conversion matrix (determined by the mechanical structure of the flexible arm). The difference between the joint vector Θ* obtained in step five and Θ forms the feedback error e = Θ* − Θ, which is input to the following PD controller to output the final torque control signal:

τ = Kp·e + Kd·de/dt + g(Θ) + f(dΘ/dt)

wherein Kp and Kd are the proportional and derivative parameters of the controller, g(Θ) is the gravity compensation term, and f(dΘ/dt) denotes the friction compensation term.
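The torque computation in step six can be sketched as follows, assuming a control law of the form τ = Kp·e + Kd·ė + g(Θ) + f(Θ̇) with zero desired joint velocity; the gravity and friction models are arm-specific, so they are passed in as callables whose names are illustrative.

```python
import numpy as np

def pd_torque(theta_des, theta, dtheta, kp, kd, gravity, friction):
    """PD control law with gravity and friction compensation.

    Assumes tau = Kp*e + Kd*de/dt + g(theta) + f(dtheta), with the
    desired joint velocity taken as zero, so de/dt = -dtheta.
    gravity and friction are arm-specific callables (illustrative).
    """
    e = theta_des - theta
    de = -dtheta
    return kp * e + kd * de + gravity(theta) + friction(dtheta)
```

In practice Kp and Kd could also be per-joint vectors, letting heavier proximal joints be tuned independently of the wrist.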
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between those entities or actions. The terms "comprises", "comprising" and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article or terminal that comprises it. Further, herein, "greater than", "less than", "more than" and the like are understood to exclude the stated number, while "above", "below", "within" and the like are understood to include it.
Although embodiments have been described, those skilled in the art can, once they grasp the basic inventive concept, make other variations and modifications of these embodiments. The above embodiments are therefore only examples of the invention and do not limit its scope; all equivalent structures or equivalent processes derived from the contents of this specification and drawings, whether applied directly or indirectly in other related technical fields, are likewise included within the scope of the invention.
Claims (9)
1. A multi-mode man-machine interaction control method for a flexible arm, characterized by comprising a sensor, an upper computer and the flexible arm, the control steps being as follows:
step one, collecting electromyographic signal data of the controller and preprocessing the electromyographic signals;
step two, extracting features from the preprocessed electromyographic signals to train a classifier model;
step three, the controller wears the sensor and calibrates the sensor;
step four, loading the classifier model trained in the step two by the upper computer, and dynamically analyzing the electromyographic signals and the IMU data by adopting a detection algorithm to obtain joint angles;
and step five, inputting the joint angles into a proportional-derivative controller to complete the control of the flexible arm.
2. The multi-mode man-machine interaction control method for the flexible arm according to claim 1, wherein step one collects electromyographic signal data of the controller's gestures and determines the maximum electromyographic signal amplitude.
3. The multi-modal human-computer interaction control method facing the flexible arm as recited in claim 1, wherein the electromyographic signal preprocessing in the first step comprises extracting data of active segments and performing low-pass filtering by using a Butterworth filter.
4. The multi-modal human-computer interaction control method facing the flexible arm as recited in claim 1, wherein the second step comprises performing feature vector extraction and normalization processing on the electromyographic signals, inputting the processed electromyographic signals into a neural network, and training a model by using a Levenberg optimization algorithm.
5. The flexible arm-oriented multimodal human-computer interaction control method as claimed in claim 4, wherein three features of an average absolute value, a zero-crossing point number and a slope change number are extracted and normalized.
6. The multi-modal human-computer interaction control method facing the flexible arm as recited in claim 1, wherein the fourth step comprises extracting an electromyographic signal active segment by using a double-threshold detection algorithm, preprocessing and extracting characteristics of the active segment data, and inputting the active segment data into a classifier model for recognition.
7. The flexible arm-oriented multimodal human-computer interaction control method as claimed in claim 1, wherein the IMU data in the fourth step is identified by adopting an IMU solution model.
8. The multi-mode man-machine interaction control method for the flexible arm according to claim 7, wherein the quaternion data [x, y, z, w] transmitted from the IMU to the upper computer are converted into a roll angle (Roll), a pitch angle (Pitch) and a yaw angle (Yaw); the yaw angle, for example, is obtained as

yaw = arcsin(2xy + 2zw)

and the angles are unwrapped according to rules expressed in terms of IMU_n, ΔIMU_n and Δsgn_n, wherein IMU_n denotes the Euler-angle triple (roll_n, pitch_n, yaw_n) at time n, ΔIMU_n = IMU_n − IMU_{n−1}, and Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1}).
9. The multi-mode man-machine interaction control method for the flexible arm according to claim 1, wherein the control of the flexible arm in step five comprises solving the joint angle position vector Θ, forming the feedback error e = Θ* − Θ as its difference from the desired joint angle vector Θ* obtained in step four, and inputting this error to a proportional-derivative controller that outputs the torque control signal τ = Kp·e + Kd·de/dt + g(Θ) + f(dΘ/dt), wherein Kp and Kd are the proportional and derivative parameters of the controller, g(Θ) is the gravity compensation term, and f(dΘ/dt) denotes the friction compensation term.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010148713.8A CN111399640A (en) | 2020-03-05 | 2020-03-05 | Multi-mode man-machine interaction control method for flexible arm |
NL2027179A NL2027179B1 (en) | 2020-03-05 | 2020-12-21 | Flexible-arm-oriented multi-modal human-machine interaction control method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010148713.8A CN111399640A (en) | 2020-03-05 | 2020-03-05 | Multi-mode man-machine interaction control method for flexible arm |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111399640A true CN111399640A (en) | 2020-07-10 |
Family
ID=71436279
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010148713.8A Pending CN111399640A (en) | 2020-03-05 | 2020-03-05 | Multi-mode man-machine interaction control method for flexible arm |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111399640A (en) |
NL (1) | NL2027179B1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113805696A (en) * | 2021-09-01 | 2021-12-17 | 肩并肩智能技术(北京)有限公司 | Machine learning method based on surface electromyographic signals and dynamic capture technology |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107553499A (en) * | 2017-10-23 | 2018-01-09 | 上海交通大学 | Natural the gesture motion control system and method for a kind of Multi-shaft mechanical arm |
US20180164899A1 (en) * | 2016-12-09 | 2018-06-14 | Sick Ag | Control for the safe controlling of at least one machine |
CN108388114A (en) * | 2018-02-07 | 2018-08-10 | 中国航空工业集团公司西安飞机设计研究所 | A kind of flexible mechanical arm composite control method based on Output Redefinition |
CN110068326A (en) * | 2019-04-29 | 2019-07-30 | 京东方科技集团股份有限公司 | Computation method for attitude, device, electronic equipment and storage medium |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111700718A (en) * | 2020-07-13 | 2020-09-25 | 北京海益同展信息科技有限公司 | Holding posture identifying method, holding posture identifying device, artificial limb and readable storage medium |
CN112123332A (en) * | 2020-08-10 | 2020-12-25 | 北京海益同展信息科技有限公司 | Construction method of gesture classifier, exoskeleton robot control method and device |
CN113059570A (en) * | 2021-04-09 | 2021-07-02 | 华南理工大学 | Human-robot cooperative control method based on human body dynamic arm strength estimation model |
CN113305879A (en) * | 2021-04-09 | 2021-08-27 | 南开大学 | Robot control system and method based on joint angle and muscle length measurement |
CN113059570B (en) * | 2021-04-09 | 2022-05-24 | 华南理工大学 | Human-robot cooperative control method based on human body dynamic arm strength estimation model |
WO2023138784A1 (en) * | 2022-01-21 | 2023-07-27 | Universität St. Gallen | System and method for configuring a robot |
Also Published As
Publication number | Publication date |
---|---|
NL2027179B1 (en) | 2022-07-26 |
NL2027179A (en) | 2021-04-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111399640A (en) | Multi-mode man-machine interaction control method for flexible arm | |
Xiong et al. | Deep learning for EMG-based human-machine interaction: A review | |
Allard et al. | A convolutional neural network for robotic arm guidance using sEMG based frequency-features | |
CN107378944B (en) | Multidimensional surface electromyographic signal artificial hand control method based on principal component analysis method | |
Xu et al. | Advances and disturbances in sEMG-based intentions and movements recognition: A review | |
CN110413107B (en) | Bionic manipulator interaction control method based on electromyographic signal pattern recognition and particle swarm optimization | |
Xiong et al. | A novel HCI based on EMG and IMU | |
CN113849068B (en) | Understanding and interaction method and system for multi-modal information fusion of gestures | |
Fang et al. | Attribute-driven granular model for EMG-based pinch and fingertip force grand recognition | |
Wu et al. | A Visual-Based Gesture Prediction Framework Applied in Social Robots. | |
Wu et al. | sEMG measurement position and feature optimization strategy for gesture recognition based on ANOVA and neural networks | |
Shin et al. | Korean sign language recognition using EMG and IMU sensors based on group-dependent NN models | |
Qi et al. | An adaptive reinforcement learning-based multimodal data fusion framework for human–robot confrontation gaming | |
CN112041784A (en) | Method and apparatus for simultaneous detection of discrete and continuous gestures | |
Calado et al. | A geometric model-based approach to hand gesture recognition | |
Cui et al. | Recognition of upper limb action intention based on IMU | |
Chang et al. | A hierarchical hand motions recognition method based on IMU and sEMG sensors | |
Lin et al. | A normalisation approach improves the performance of inter-subject sEMG-based hand gesture recognition with a ConvNet | |
Liu et al. | Human hand motion analysis with multisensory information | |
Chen et al. | SEMG-based gesture recognition using GRU with strong robustness against forearm posture | |
Ge et al. | Gesture recognition and master–slave control of a manipulator based on sEMG and convolutional neural network–gated recurrent unit | |
Lv et al. | A Novel Interval Type-2 Fuzzy Classifier Based on Explainable Neural Network for Surface Electromyogram Gesture Recognition | |
Tong et al. | sEMG based Gesture Recognition Method for Coal Mine Inspection Manipulator Using Multi-Stream CNN | |
Liu et al. | A wearable system for sign language recognition enabled by a convolutional neural network | |
CN111367399B (en) | Surface electromyographic signal gesture recognition method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||