CN115533939A - Anthropomorphic grasping control method and system for bionic hand - Google Patents

Anthropomorphic grasping control method and system for bionic hand

Info

Publication number
CN115533939A
Authority
CN
China
Prior art keywords
control, bionic hand, hand, bionic, fingers
Legal status
Granted
Application number
CN202211273495.6A
Other languages
Chinese (zh)
Other versions
CN115533939B (en)
Inventor
李可
胡元栋
李光林
魏娜
Current Assignee
Shandong University
Original Assignee
Shandong University
Priority date
2022-10-18
Filing date
2022-10-18
Publication date
2022-12-30
Application filed by Shandong University
Priority to CN202211273495.6A
Publication of CN115533939A
Application granted
Publication of CN115533939B
Current legal status: Active

Classifications

    • B25J11/008 Manipulators for service tasks
    • B25J15/08 Gripping heads and other end effectors having finger members
    • B25J9/08 Programme-controlled manipulators characterised by modular constructions
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention relates to the technical field of bionic hands and provides an anthropomorphic grasping control method and system for a bionic hand, comprising the following steps: acquiring the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force; finding the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude to generate a bionic hand data set; performing dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set; generating a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law; and, after the bionic hand contacts the object, obtaining the contact force, controlling the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjusting the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object. More human-like grasping skill learning is thereby achieved.

Description

Anthropomorphic grasping control method and system for bionic hand
Technical Field
The invention belongs to the technical field of bionic hands, and particularly relates to an anthropomorphic grasping control method and system for a bionic hand.
Background
The statements in this section merely provide background information related to the present disclosure and may not necessarily constitute prior art.
Service robots are increasingly integrated into people's lives and help people carry out daily production and living tasks. The dexterous bionic hand is an important end effector of such robots; as a highly integrated, intelligent mechatronic system with multiple sensing functions, it is a key component through which a robot system perceives environmental information and interacts with its surroundings. The dexterity of the human hand is of great value for exploring and interacting with the world, and reproducing this dexterity offers many new ideas for developing bionic hands for healthcare, service, industrial and space applications. Given the wide variety of dexterous bionic hands currently on the market, how to improve their human-like control performance and how to realize anthropomorphic learning and accurate control of robotic grasping skills are important problems that urgently need to be solved.
To make a dexterous hand anthropomorphic, it must first be able to imitate the way a human hand grasps an object. Many approaches are currently used to capture the grasping posture of the human hand: optical detection systems with passive markers, active markers or no markers, data gloves, inertial sensors, magnetic sensors, and so on. Once the grasping posture of the human hand has been obtained, the key step is to map it to the bionic hand so that the bionic hand achieves similar grasping performance. The electric dexterous hands on the market differ in drive mode (fully actuated or underactuated), in number of fingers (three fingers, five fingers, and so on), and greatly in their driven degrees of freedom. In essence, however, all of them use drive motors to move the finger joints, so it is desirable to find a method based on this principle that simplifies the mapping of hand postures onto bionic hands of different specifications and makes the learning process more convenient.
For a dexterous bionic hand, imitating the posture of the human hand alone is not enough; more importantly, a suitable control strategy must be found so that the bionic hand can cope with different application scenarios. Since the 1980s the multi-fingered grasping problem has been studied extensively, with some success in modeling and control. Previous work has proposed, among others, a method that combines proportional-derivative position control of the fingers with a compliant contact model of the grasped object, a method that divides force control into direct force control for precise grasping and indirect force control for stable grasping, and methods based on data-driven neural-network strategies. These studies share two key points: first, planning kinematically constrained finger-joint trajectories to achieve the desired motion; second, completing a force-closure grasp and preventing slip between the fingers and the object. It is therefore desirable to find a suitable control strategy that satisfies the requirements on control accuracy and robustness while also giving the bionic hand a smooth and stable grasping behavior.
Disclosure of Invention
In order to solve the technical problems described in the background art, the invention provides an anthropomorphic grasping control method and system for a bionic hand, which simplify the control dimension using the idea of hand synergies and smooth the grasping process through strategy switching and force control, so that the bionic hand achieves smooth and stable grasping performance and more human-like grasping skill learning is realized.
In order to achieve the purpose, the invention adopts the following technical scheme:
a first aspect of the present invention provides an anthropomorphic grip control method for a bionic hand, comprising:
acquiring the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force;
finding the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude, and generating a bionic hand data set;
performing dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set;
generating a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law, wherein the first control law switches between a nonsingular terminal sliding mode control law and a linear sliding mode control law according to the error;
and, after the bionic hand contacts the object, obtaining the contact force, controlling the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjusting the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object.
Further, performing dimension reduction processing on the bionic hand data set through principal component analysis;
the control dimension represents the number of the principal components to be selected.
Further, the second control law is:
μ = μ_p·ω + μ_f·(1 − ω)

where ω = r·e_n, r is a positive constant, e_n is the difference between the expected five-finger gripping force and the contact force at time n, μ_p denotes the first control law, and μ_f denotes the force control.
Further, the force control is:
μ_f = K_p·e_n + K_i·T·Σ_{k=1}^{n} e_k

where K_p and K_i are the proportional and integral terms respectively, T is the sampling time, and e_k is the difference between the expected five-finger gripping force and the contact force at time k.
Further, the corresponding relationship between the human hand and the bionic hand posture is as follows:
η_i = β_imax / α_imax

where α_imax is the maximum human finger bending degree and β_imax is the maximum motor motion amplitude.
Further, if the contact force cannot reach the expected gripping force of the five fingers, the object type and the expected gripping force of the five fingers are adjusted.
A second aspect of the present invention provides an anthropomorphic grip control system for a bionic hand, comprising:
a data acquisition module configured to acquire the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force;
a relationship correspondence module configured to find the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude and to generate a bionic hand data set;
a dimension reduction module configured to perform dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set;
a pre-control module configured to generate a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law, wherein the first control law switches between a nonsingular terminal sliding mode control law and a linear sliding mode control law according to the error;
a pose adjustment module configured to obtain the contact force after the bionic hand contacts the object, control the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjust the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object.
Further, a parameter adjustment module is included that is configured to: if the contact force can not reach the expected gripping force of the five fingers, the type of the object and the expected gripping force of the five fingers are adjusted.
A third aspect of the present invention provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps in an anthropomorphic grip control method for a bionic hand as described above.
A fourth aspect of the invention provides a computer apparatus comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing the steps in an anthropomorphic grip control method for a bionic hand as described above.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides an anthropomorphic gripping control method for a bionic hand, which simplifies control dimension by using a collaborative idea, smoothes a gripping process by a strategy switching and force control method, enables the bionic hand to have smooth and stable gripping performance, realizes more anthropomorphic gripping skill learning, and provides a good suggestion for the development of a robot learning strategy in the future.
The invention provides an anthropomorphic grasping control method for a bionic hand, which enables the bionic hand to obtain better human-like grasping performance only by a user through a plurality of inputs in an interface, and enables the user to master the human-like grasping skills realized on different dexterous bionic hands through simple learning.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it.
FIG. 1 is a partial flow diagram of an anthropomorphic grip control method for a bionic hand in accordance with one embodiment of the invention;
FIG. 2 is an overall flow chart of an anthropomorphic grip control method for a bionic hand according to one embodiment of the invention;
FIG. 3 is an overall block diagram of an anthropomorphic grip control method for a bionic hand according to a first embodiment of the invention;
FIG. 4 is a system interface diagram of the host computer operated by the user according to the first embodiment of the present invention;
FIG. 5 is a motor control flow diagram according to the first embodiment of the present invention.
Detailed Description
The invention is further described with reference to the following figures and examples.
It is to be understood that the following detailed description is exemplary and is intended to provide further explanation of the invention as claimed. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
Example one
The embodiment provides an anthropomorphic grasp control method for a bionic hand, which specifically comprises the following steps as shown in fig. 1 and fig. 2:
step 1, clicking a 'connection establishment' button on a system interface of a user operation upper computer by a user, establishing communication between the system and a bionic hand, and importing a hand grasping data set.
The hand grasping data set refers to the common gesture of a hand grasping an object, and the set of the angles of the joints of each finger is acquired by using a data glove.
The "establish connection" button is responsible for establishing communication with the bionic hand.
Step 2: as shown in fig. 3, the user enters parameters on the system interface of the host computer.
As shown in fig. 4, the input parameters include the motor motion amplitude, the human finger bending degree, the control dimension, the object type, and the expected five-finger gripping force.
The user enters the motor motion amplitude corresponding to the specification of the particular bionic hand.
The invention can also combine visual information to improve the environmental adaptability and the gripping flexibility of the system. With the aid of an imaging device, target object information (object type) is obtained by using a 3D point cloud matching technique, and is used to assist a user in adjusting gripping parameters.
Step 3: find the correspondence between the bending degree of the human fingers and that of the bionic fingers from the human finger bending range and the motor motion amplitude, complete the mapping of the grasping postures, and generate a bionic hand data set Ψ.
Specifically, a suitable correspondence is established to realize posture mapping from the human hand to the bionic hand. First, the common grasping postures of the human hand are recorded with the most widely used data gloves. Defining a finger in the vertical state as 0 and a fully bent finger as α_imax, the bending state of human finger i is α_i ∈ [0, α_imax], i ∈ [1, 5]. For the finger joints of the bionic hand, in position-control mode the motor driving each joint has a certain rotation range that allows the finger to move from vertical to fully bent; the motor motion amplitude is 0 when the finger is vertical and β_imax when it is fully bent, so the bending state of bionic finger i is β_i ∈ [0, β_imax], i ∈ [1, 5]. From the above, the correspondence η_i between the human hand posture and the bionic hand posture is as follows:

η_i = β_imax / α_imax    (1)

where α_imax is the maximum bending degree of human finger i and β_imax is the maximum motion amplitude of the corresponding motor.
After the correspondence η_i between the human hand and bionic hand postures is found, the collected human-hand data set can be mapped to the bionic hand to obtain the bionic hand data set (grasping data set). Bionic hands of different specifications have different correspondences, but processing the same hand postures in this way yields similar grasping performance.
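For illustration only, the mapping of step 3 can be sketched as follows; the array layout, the function and variable names, and the per-finger linear scaling β_i = η_i·α_i are assumptions consistent with equation (1), not a prescription of the patent:

```python
import numpy as np

def map_hand_to_bionic(hand_postures, alpha_max, beta_max):
    """Map human finger bending angles onto bionic-hand motor amplitudes.

    hand_postures: (n_samples, 5) array of finger bending angles alpha_i,
                   each in [0, alpha_max[i]], recorded with a data glove.
    alpha_max:     (5,) maximum bending degree of each human finger.
    beta_max:      (5,) maximum motor motion amplitude of each bionic finger.
    Returns the bionic hand data set Psi with beta_i = eta_i * alpha_i.
    """
    eta = np.asarray(beta_max, float) / np.asarray(alpha_max, float)   # eq. (1)
    psi = np.asarray(hand_postures, float) * eta                       # per-finger scaling
    return np.clip(psi, 0.0, beta_max)                                 # stay inside motor range

# Example: one recorded posture mapped onto one particular bionic hand
alpha_max = np.array([90.0, 100.0, 100.0, 100.0, 95.0])   # deg, assumed glove calibration
beta_max = np.array([2000, 2200, 2200, 2200, 2100])        # assumed motor amplitude units
posture = np.array([[45.0, 80.0, 78.0, 75.0, 60.0]])
print(map_hand_to_bionic(posture, alpha_max, beta_max))
```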
Step 4: the system performs dimension reduction on the grasping data set by principal component analysis, with the user-input control dimension n_c determining the number of principal components, and generates a collaborative data set, i.e. the synergy space λ.
Specifically, the obtained bionic hand data set is processed using the idea of hand synergies. Principal component analysis (PCA) is used to extract the weight of each finger joint in the grasping actions of the data set; the system selects the number of principal components according to the user's needs, i.e. the input control dimension n_c, and generates a low-dimensional synergy space (collaborative data set) λ. The control dimension n_c represents the number of principal components to be retained.
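For illustration only, the dimension reduction of step 4 might be sketched as follows; the use of scikit-learn's PCA, the matrix layout and the helper names are assumptions, since the patent does not prescribe a particular implementation:

```python
import numpy as np
from sklearn.decomposition import PCA

def build_synergy_space(psi, n_c):
    """Reduce the bionic hand data set Psi to an n_c-dimensional synergy space.

    psi: (n_samples, n_joints) matrix of motor amplitudes from the mapping step.
    n_c: user-selected control dimension, i.e. number of principal components kept.
    Returns the fitted PCA model and the low-dimensional coordinates lambda.
    """
    pca = PCA(n_components=n_c)
    lam = pca.fit_transform(psi)              # synergy coordinates of every grasp sample
    return pca, lam

def posture_from_synergy(pca, coords):
    """Map a point of the synergy space back to full motor amplitudes (pre-grasp posture)."""
    return pca.inverse_transform(np.atleast_2d(coords))

# Example: keep n_c = 2 synergies and reconstruct one pre-grasp posture
psi = np.random.rand(100, 5) * 2000           # placeholder for a real grasping data set
pca, lam = build_synergy_space(psi, n_c=2)
print(posture_from_synergy(pca, lam[0]))
```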
Step 5: based on the collaborative data set, a grasped object, i.e. the selected object type τ, is chosen, a pre-grasp posture λ_p is generated, and the rotation δ of each motor is obtained. The selected object type τ determines the grasping posture λ_p that the system selects along a certain direction of the synergy space, and the motors are made to produce this posture under position control. The collaborative data set is an n_c-dimensional vector space, where n_c is determined by the number of principal components selected by the user; selecting the grasped object amounts to selecting a coordinate position in this vector space, and having a coordinate position means that an exact grasp datum has been found. As shown in fig. 5, the method specifically includes:
establishing a new motor motion model:
The model relates the motor rotation angle δ to the joint displacement θ and its velocity and acceleration through a gain term K, a time constant T and a nonlinear constant α (equation (2)). Defining the state variables x_1 = θ (joint displacement), x_2 (joint velocity) and the control input μ = δ gives:

ẋ_1 = x_2    (3)

ẋ_2 = r + b·μ + c    (4)

where r represents the unknown model quantity, b is the control gain, and c represents unknown dynamics and environmental disturbances satisfying |c| < l_d, l_d > 0; l_d is an artificially set upper bound on the unknown disturbance, which ensures that the disturbance is not too large.
The sliding mode function is defined as:

s = a_1·e_1 + e_2    (5)

where e_1 = x_1 − x_d, e_2 = x_2 − ẋ_d, and a_1 > 0 is a constant; ẋ_d denotes the desired joint velocity, and x_1 and x_d denote the current finger joint angle obtained by the host computer and the desired finger joint angle, i.e. the current state and the desired state, respectively. Differentiating gives:

ṡ = a_1·e_2 + ẋ_2 − ẍ_d    (6)

where ẍ_d denotes the desired finger joint acceleration.
Then the linear sliding mode (LSM) control law is obtained as:

μ_l = (1/b)·(ẍ_d − a_1·e_2 − r − k_l·sgn(s))    (7)

where k_l is a positive constant set from experience.
In order to reduce the chattering phenomenon in the control process and make the motion more continuous, combination with nonsingular terminal sliding mode (NTSM) control is considered. The sliding mode surface is defined as follows:

s = e_1 + (1/λ*)·e_2^(p/q)    (8)

where λ* is a chosen positive constant and p and q are positive odd numbers satisfying q < p < 2q.
the nonsingular tail end sliding mode control law is obtained as follows:
Figure BDA0003896121350000094
wherein,
Figure BDA0003896121350000095
is a normal number and is set by human experience.
When the system is close to the equilibrium point, the convergence speed of the NTSM is better than that of the LSM, and it is worse than that of the LSM when the system is far from the equilibrium point; neither law alone therefore converges quickly over the whole state space. A mode-switching method is thus used to combine the two kinds of control so that the advantages of each are fully exploited.
The switching function is:

σ = 1 if |e_1| > ε, σ = 0 if |e_1| ≤ ε    (10)

Since the LSM and the NTSM are independent, their control laws can be designed independently. Substituting expressions (7) and (9) into (10) gives the first control law:

μ_p = σ·μ_l + (1 − σ)·μ_n    (11)

where ε is the threshold value. The first control law thus switches between the nonsingular terminal sliding mode control law and the linear sliding mode control law according to the error.
According to the object type τ selected by the user, the system generates a grasping posture λ_p by moving along a certain direction of the synergy space, and the motors are controlled to produce the motion according to the first control law.
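For illustration only, the switched position control just described might be sketched as follows, assuming the error dynamics of equation (4) and the LSM and NTSM laws (7) and (9) as written above; the function name, the estimate r_hat of the unknown term r, and all gain values are illustrative assumptions:

```python
import numpy as np

def first_control_law(e1, e2, xdd_d, r_hat, b, a1, lam_star, p, q, k_l, k_n, eps):
    """Switched position control mu_p: LSM far from the target, NTSM close to it.

    e1, e2   : joint angle error and its derivative
    xdd_d    : desired joint acceleration
    r_hat    : estimate of the unknown model term r (assumed available)
    b        : control gain of the joint model, eq. (4)
    a1, k_l  : LSM surface slope and switching gain, eqs. (5) and (7)
    lam_star, p, q, k_n : NTSM surface and gain parameters (p, q odd, q < p < 2q)
    eps      : error threshold of the switching function, eq. (10)
    """
    if abs(e1) > eps:                                   # far from equilibrium: LSM, eq. (7)
        s = a1 * e1 + e2
        return (xdd_d - a1 * e2 - r_hat - k_l * np.sign(s)) / b
    # close to equilibrium: nonsingular terminal sliding mode, eq. (9)
    s = e1 + np.sign(e2) * abs(e2) ** (p / q) / lam_star
    return (xdd_d - r_hat
            - lam_star * (q / p) * np.sign(e2) * abs(e2) ** (2 - p / q)
            - k_n * np.sign(s)) / b

# Example call with purely illustrative numbers
print(first_control_law(e1=0.4, e2=-0.1, xdd_d=0.0, r_hat=0.0, b=1.0,
                        a1=5.0, lam_star=2.0, p=5, q=3, k_l=3.0, k_n=3.0, eps=0.05))
```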
Step 6: the motors move under position control; after the bionic hand contacts the object, force control is combined, and, using the force information entered by the user (the expected contact force F_i of each of the five fingers), the motors continue to move according to the second control law, the pose of the bionic hand is adjusted, and the human-like grasp of the object is finally completed with the dexterous bionic hand.
The magnitude of the gripping force is set as required and the motors are controlled to adjust their output torque; if the contact force never reaches the expected value, the bionic hand can ultimately meet the requirement after the object type and the expected force are adjusted.
Text is added in the display area to indicate how each step has been carried out. When contact with an object occurs, the actual-contact-force area on the system interface of the host computer displays the real-time force of the five fingers; the display area visually shows the completion of the system functions and provides feedback for the user's operation.
Specifically, the motor motion switches between the two strategies μ_l and μ_n according to the error e_1. After contact with the object a force signal is generated and force control μ_f is combined: the difference e_n between the contact force F_c and the desired force F_i modulates the proportional weights of μ_f and μ_p in the control law, and the control signal μ is finally output.
When contact with an object occurs, the adjustment is made in conjunction with force control. The force controller is designed by means of the proportional-integral-derivative control concept as follows:
μ_f = K_p·e_n + K_i·T·Σ_{k=1}^{n} e_k    (12)

where K_p and K_i are the proportional and integral terms respectively, T is the sampling time, e_k is the difference between the target force (the set expected contact force F_i of each of the five fingers) and the measured force at time k, e_n is this difference at the n-th time instant, and the summation is the accumulated difference from the start to the n-th time instant.
Finally, the position control and the force control are combined to obtain the overall system control law, i.e. the second control law:

μ = μ_p·ω + μ_f·(1 − ω)    (13)

where ω = r·e_n, r is a positive constant, e_n is the difference between the expected five-finger gripping force and the contact force at time n, μ_p denotes the first control law, and μ_f denotes the force control.
Through this strategy the system can control the motors of the bionic hand and ensure that the grasping action is carried out well.
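For illustration only, the force-position blending of equations (12) and (13) might be sketched as follows for a single finger; the gain values, the clamping of ω to [0, 1] and the class interface are assumptions, not part of the patent:

```python
import numpy as np

class GripForceBlender:
    """Blend the position control mu_p with PI force control mu_f, eqs. (12)-(13)."""

    def __init__(self, k_p=0.8, k_i=0.4, T=0.01, r=0.3):
        self.k_p, self.k_i, self.T, self.r = k_p, k_i, T, r
        self.err_sum = 0.0                                  # running sum of force errors

    def step(self, mu_p, expected_force, contact_force):
        e_n = expected_force - contact_force                # force error at this instant
        self.err_sum += e_n
        mu_f = self.k_p * e_n + self.k_i * self.T * self.err_sum     # eq. (12)
        omega = float(np.clip(self.r * e_n, 0.0, 1.0))               # omega = r*e_n, clamped
        return mu_p * omega + mu_f * (1.0 - omega)                   # eq. (13)

# Example for one finger: a large force error keeps position control dominant,
# a small error hands over to the force controller
blender = GripForceBlender()
print(blender.step(mu_p=1.2, expected_force=5.0, contact_force=0.5))
print(blender.step(mu_p=1.2, expected_force=5.0, contact_force=4.9))
```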
Step 7: if the user clicks "stop action", the motion of the bionic hand is stopped.
The "stop action" button is responsible for interrupting the motion of the bionic hand when an accident occurs during grasping and thus serves a protective function.
The invention aims to let a user, with simple learning, master the skill of realizing human-like grasping on different dexterous bionic hands. The control dimension is simplified using the idea of hand synergies, the grasping process is smoothed through strategy switching and force control, more human-like grasping skill learning is realized, and a useful direction is offered for the future development of robot learning strategies.
With this method a user can conveniently and quickly achieve accurate control of the bionic hand, and by adjusting a few input items the system can be connected to different bionic hands while achieving the same grasping control effect. The invention can provide important assistance for industrial production, medical manufacturing and the like, offers technical support for replacing humans in simple and repetitive production and daily tasks, and has broad application value.
The invention simplifies the learning of grasping postures from the human hand to the bionic hand. The grasping postures of the human hand are mapped to bionic hands of different specifications so that the same grasping performance is obtained. By entering the parameters that need to be adjusted in the system, the correspondence between the bending degree of the human fingers and the bionic hand joint angles is found and established, and both are scaled to a uniform scale to complete the mapping of the data set.
The invention finds a suitable control strategy for the bionic hand so that it can grasp objects well. First, the obtained grasping data set is processed with the synergy idea to obtain a collaborative data set, which is used to generate a pre-grasp posture for position control. Then strategy switching is carried out with sliding mode control, and force control is applied in combination with the contact force information, so that the bionic hand exhibits smooth and stable grasping performance and finally completes the grasp.
The invention integrates the whole system and gives the user an interface-based operating experience. Because the system is integrated, a user can control different bionic hands through interface operations; by simply entering a few items, the system adjusts accordingly so that the bionic hand obtains the desired grasping skill performance.
Example two
The embodiment provides an anthropomorphic grasping control system for a bionic hand, which specifically comprises the following modules:
a data acquisition module configured to acquire the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force;
a relationship correspondence module configured to find the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude and to generate a bionic hand data set;
a dimension reduction module configured to perform dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set;
a pre-control module configured to generate a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law, wherein the first control law switches between a nonsingular terminal sliding mode control law and a linear sliding mode control law according to the error;
a pose adjustment module configured to obtain the contact force after the bionic hand contacts the object, control the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjust the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object;
a parameter adjustment module configured to adjust the object type and the expected five-finger gripping force if the contact force cannot reach the expected five-finger gripping force.
It should be noted that, each module in the present embodiment corresponds to each step in the first embodiment one to one, and the specific implementation process is the same, which is not described herein again.
EXAMPLE III
The present embodiment provides a computer-readable storage medium on which a computer program is stored which, when being executed by a processor, carries out the steps of a method for anthropomorphic grip control of a bionic hand as described in the first of the above embodiments.
Example four
The present embodiment provides a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in an anthropomorphic grip control method for a bionic hand as described in the first embodiment above when executing the program.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An anthropomorphic grip control method for a bionic hand, comprising:
acquiring the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force;
finding the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude, and generating a bionic hand data set;
performing dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set;
generating a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law, wherein the first control law switches between a nonsingular terminal sliding mode control law and a linear sliding mode control law according to the error;
and, after the bionic hand contacts the object, obtaining the contact force, controlling the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjusting the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object.
2. The anthropomorphic grip control method for a bionic hand according to claim 1, characterized in that dimension reduction processing is performed on a bionic hand data set by principal component analysis;
the control dimension represents the number of the principal components to be selected.
3. The anthropomorphic grip control method for a bionic hand as set forth in claim 1, characterized in that said second control law is:
μ = μ_p·ω + μ_f·(1 − ω)

where ω = r·e_n, r is a positive constant, e_n is the difference between the expected five-finger gripping force and the contact force at time n, μ_p denotes the first control law, and μ_f denotes the force control.
4. An anthropomorphic grip control method for a bionic hand as claimed in claim 3, characterized in that said force control is:
μ_f = K_p·e_n + K_i·T·Σ_{k=1}^{n} e_k

where K_p and K_i are the proportional and integral terms respectively, T is the sampling time, and e_k is the difference between the expected five-finger gripping force and the contact force at time k.
5. The anthropomorphic grip control method for a bionic hand as recited in claim 1, wherein the correspondence between the human hand and the gesture of the bionic hand is:
η_i = β_imax / α_imax

where α_imax is the maximum human finger bending degree and β_imax is the maximum motor motion amplitude.
6. The method of claim 1, wherein the object type and the expected five-finger grip are adjusted if the contact force does not achieve the expected five-finger grip.
7. An anthropomorphic grip control system for a bionic hand, comprising:
a data acquisition module configured to acquire the motor motion amplitude, the human finger bending degree, the control dimension, the object type and the expected five-finger gripping force;
a relationship correspondence module configured to find the correspondence between human hand postures and bionic hand postures from the human finger bending range and the motor motion amplitude and to generate a bionic hand data set;
a dimension reduction module configured to perform dimension reduction on the bionic hand data set according to the control dimension to generate a collaborative data set;
a pre-control module configured to generate a pre-grasp posture from the collaborative data set and the object type so as to control the motors to move according to a first control law, wherein the first control law switches between a nonsingular terminal sliding mode control law and a linear sliding mode control law according to the error;
a pose adjustment module configured to obtain the contact force after the bionic hand contacts the object, control the motors to move according to a second control law in combination with the expected five-finger gripping force, and adjust the pose of the bionic hand so that the bionic hand completes a human-like grasp of the object.
8. The anthropomorphic grip control system for a bionic hand of claim 7, further comprising a parameter adjustment module configured to: if the contact force can not reach the expected gripping force of the five fingers, the type of the object and the expected gripping force of the five fingers are adjusted.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of a method for anthropomorphic grip control of a bionic hand as claimed in any one of claims 1 to 6.
10. A computer arrangement comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor, when executing the program, carries out the steps in a method for anthropomorphic grip control of a bionic hand as claimed in any one of claims 1 to 6.
CN202211273495.6A 2022-10-18 2022-10-18 Anthropomorphic gripping control method and system for bionic hand Active CN115533939B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211273495.6A CN115533939B (en) 2022-10-18 2022-10-18 Anthropomorphic gripping control method and system for bionic hand

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211273495.6A CN115533939B (en) 2022-10-18 2022-10-18 Anthropomorphic gripping control method and system for bionic hand

Publications (2)

Publication Number Publication Date
CN115533939A 2022-12-30
CN115533939B CN115533939B (en) 2024-08-13

Family

ID=84734778

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211273495.6A Active CN115533939B (en) 2022-10-18 2022-10-18 Anthropomorphic gripping control method and system for bionic hand

Country Status (1)

Country Link
CN (1) CN115533939B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955295A (en) * 2014-04-17 2014-07-30 北京航空航天大学 Real-time grabbing method of virtual hand based on data glove and physical engine
US20180359448A1 (en) * 2017-06-07 2018-12-13 Digital Myths Studio, Inc. Multiparty collaborative interaction in a virtual reality environment
CN109150029A (en) * 2018-10-11 2019-01-04 哈尔滨工业大学 Permanent magnet synchronous motor method for controlling position-less sensor based on smooth non-singular terminal sliding mode observer
CN112180719A (en) * 2020-09-01 2021-01-05 上海大学 Novel robust finite time trajectory control method based on man-machine cooperation system
CN112223275A (en) * 2020-09-01 2021-01-15 上海大学 Cooperative robot control method based on finite time tracking control
CN112947523A (en) * 2021-03-02 2021-06-11 中国人民解放军火箭军工程大学 Angle constraint guidance method and system based on nonsingular rapid terminal sliding mode control
CN114020026A (en) * 2021-11-05 2022-02-08 西北工业大学 Fixed-time multi-spacecraft formation capture method and system based on extended state observer
US20220088786A1 (en) * 2020-07-24 2022-03-24 Yanshan University Fractional Order Sliding Mode Synchronous Control Method For Teleoperation System Based On Event Trigger Mechanism
CN114407007A (en) * 2022-01-17 2022-04-29 山东新一代信息产业技术研究院有限公司 Self-adaptive nonsingular terminal sliding mode control method and device for mechanical arm and medium
WO2022088593A1 (en) * 2020-10-26 2022-05-05 北京理工大学 Robotic arm control method and device, and human-machine cooperation model training method
CN114516047A (en) * 2022-02-14 2022-05-20 安徽大学 Method and system for controlling track of mechanical arm based on radial basis function neural network terminal sliding mode

Also Published As

Publication number Publication date
CN115533939B (en) 2024-08-13

Similar Documents

Publication Publication Date Title
Yang et al. Haptics electromyography perception and learning enhanced intelligence for teleoperated robot
Li et al. Asymmetric bimanual control of dual-arm exoskeletons for human-cooperative manipulations
Ekvall et al. Interactive grasp learning based on human demonstration
CN106945043B (en) Multi-arm cooperative control system of master-slave teleoperation surgical robot
Ekvall et al. Learning and evaluation of the approach vector for automatic grasp generation and planning
Adachi et al. Imitation learning for object manipulation based on position/force information using bilateral control
Fang et al. Skill learning for human-robot interaction using wearable device
Zeng et al. Encoding multiple sensor data for robotic learning skills from multimodal demonstration
CN115351780A (en) Method for controlling a robotic device
Jamone et al. Incremental learning of context-dependent dynamic internal models for robot control
CN111702767A (en) Manipulator impedance control method based on inversion fuzzy self-adaptation
Skoglund et al. Programming-by-Demonstration of reaching motions—A next-state-planner approach
Luo et al. A vision-based virtual fixture with robot learning for teleoperation
Fujimoto et al. Time series motion generation considering long short-term motion
Li et al. Neural learning and kalman filtering enhanced teaching by demonstration for a baxter robot
Nemec et al. A virtual mechanism approach for exploiting functional redundancy in finishing operations
Liu et al. Multi-fingered tactile servoing for grasping adjustment under partial observation
Campbell et al. Superpositioning of behaviors learned through teleoperation
Koeppe et al. Learning compliant motions by task-demonstration in virtual environments
Iodice et al. Learning cooperative dynamic manipulation skills from human demonstration videos
Palm et al. Learning of grasp behaviors for an artificial hand by time clustering and Takagi-Sugeno modeling
CN115533939B (en) Anthropomorphic gripping control method and system for bionic hand
Wei et al. Multisensory visual servoing by a neural network
Salvietti et al. A static intrinsically passive controller to enhance grasp stability of object-based mapping between human and robotic hands
Garg et al. Handaid: A seven dof semi-autonomous robotic manipulator

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant