CN111376269B - Object grabbing method and device, storage medium and electronic equipment - Google Patents

Object grabbing method and device, storage medium and electronic equipment

Info

Publication number
CN111376269B
CN111376269B (application CN202010144401.XA)
Authority
CN
China
Prior art keywords
controlled manipulator
hand
manipulator
controlled
included angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010144401.XA
Other languages
Chinese (zh)
Other versions
CN111376269A (en)
Inventor
李红红
韩久琦
姚秀军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jingdong Shuke Haiyi Information Technology Co Ltd
Jingdong Technology Information Technology Co Ltd
Original Assignee
Beijing Haiyi Tongzhan Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Haiyi Tongzhan Information Technology Co Ltd filed Critical Beijing Haiyi Tongzhan Information Technology Co Ltd
Priority to CN202010144401.XA
Publication of CN111376269A
Application granted
Publication of CN111376269B
Legal status: Active


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1612: Programme controls characterised by the hand, wrist, grip control
    • B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661: Programme controls characterised by task planning, object-oriented languages
    • B25J15/00: Gripping heads and other end effectors
    • B25J15/08: Gripping heads and other end effectors having finger members
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manipulator (AREA)

Abstract

The disclosure provides an object grabbing method and device, electronic equipment, and a computer-readable storage medium, and relates to the technical field of artificial intelligence. The object grabbing method comprises the following steps: acquiring an expected contact point preset in a controlled manipulator, and acquiring motion data of a human hand in real time; calculating the included angle between each of the five fingers of the human hand and the back of the hand based on the motion data; mapping the included angle to a target angle for the controlled manipulator to grab an object, and controlling the controlled manipulator to reach the target angle; and adjusting the controlled manipulator until the expected contact points are all detected to be in contact with the object. The method achieves highly dexterous user control with accurate positioning through the data glove, and also realizes partial automation of grasping while preventing the object from being dropped.

Description

Object grabbing method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of artificial intelligence technologies, and in particular, to an object grasping method, an object grasping apparatus, an electronic device, and a computer-readable storage medium.
Background
Driven by the demands of everyday life and production, manipulators often need to work in high-risk environments in place of humans, performing tasks such as transport, and they have good application prospects in deep-sea exploration, battlefield mine clearance, nuclear material handling, aerospace equipment maintenance, and similar areas.
Because the working environments that manipulators face are highly complex and unpredictable, handling complex tasks accurately and fully autonomously is extremely challenging. The currently feasible approach is master-slave control, in which the manipulator is directly controlled by a human hand: the intelligence of the human brain makes the decisions about the various complex environments the manipulator encounters, and the manipulator is remotely controlled to complete the target operation task.
However, existing master-slave control methods cannot achieve local automation of the manipulator in a changing environment. For example, when the manipulator is controlled through a data glove, the sensed angle of each finger alone is not enough to adapt the manipulator to the shapes of different objects, and the manipulator is also prevented from adjusting its grasping capability according to different task requirements. This single master-slave relationship lowers the grasping success rate, lengthens the grasping time, and carries the risk of dropping the object.
Therefore, it is desirable to provide an object grasping method that achieves highly dexterous user control while preventing the object from being dropped during grasping.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide an object grasping method, an object grasping apparatus, an electronic device, and a computer-readable storage medium, which achieve both highly dexterous user control with accurate positioning and partial automation of grasping while preventing the object from falling.
According to a first aspect of the present disclosure, there is provided an object grasping method including:
acquiring an expected contact point preset in a controlled manipulator, and acquiring motion data of a human hand in real time;
calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
mapping the included angle to a target angle when the controlled manipulator grabs an object, and controlling the controlled manipulator to reach the target angle;
and controlling and adjusting the controlled manipulator until the expected contact point is detected to be fully contacted with the object.
In an exemplary embodiment of the present disclosure, a data glove is worn on the human hand, and sensors are provided at the tips of the five fingers and on the palm, respectively;
the collecting motion data of the human hand in real time includes:
periodically reading each sensor to acquire the motion data, wherein the motion data includes the three-axis acceleration, three-axis angular velocity, and quaternion of the hand motion.
In an exemplary embodiment of the present disclosure, the calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data includes:
normalizing the collected quaternions of the five fingers and the back of the hand to obtain corresponding attitude quaternions;
and calculating the included angle between each finger and the back of the hand by using the attitude quaternions.
In an exemplary embodiment of the present disclosure, a proportional-derivative controller is disposed on the controlled manipulator; the mapping the included angle to a target angle when the controlled manipulator grabs the object includes:
acquiring the included angles between the five fingers of the hand and the back of the hand, and mapping them by the proportional-derivative controller to obtain the corresponding target angles.
In an exemplary embodiment of the present disclosure, the desired contact point is set according to the shape, size, and number of the objects.
In an exemplary embodiment of the present disclosure, a tactile sensor is disposed at each finger of the controlled manipulator;
the adjusting the controlled manipulator until the object contacts all the desired contact points of the controlled manipulator comprises:
adjusting the joint moment of the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator.
In an exemplary embodiment of the present disclosure, the adjusting the joint torque of the controlled manipulator includes:
adjusting the joint torque of the controlled manipulator so that each finger of the controlled manipulator moves in the direction that increases its contact area with the object.
According to a second aspect of the present disclosure, there is provided an object grasping apparatus comprising:
the acquisition module is used for acquiring an expected contact point preset in the controlled manipulator and acquiring motion data of the human hand in real time;
the calculation module is used for calculating the included angle between the five fingers of the human hand and the back of the hand based on the motion data;
the mapping module is used for mapping the included angle to a target angle when the controlled manipulator grabs an object and controlling the controlled manipulator to reach the target angle;
and the grabbing module is used for adjusting the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator.
According to a third aspect of the present disclosure, there is provided an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to a fourth aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following benefits:
In the object grabbing method provided by the embodiments of the present disclosure, an expected contact point preset in a controlled manipulator is first acquired, and motion data of a human hand is collected in real time; the included angle between each of the five fingers and the back of the hand is then calculated from the collected motion data, the calculated included angle is mapped to a target angle for the controlled manipulator to grab the object, the controlled manipulator is controlled to reach the target angle, and the controlled manipulator is adjusted until all of the preset expected contact points are detected to be in contact with the object to be grabbed. On the one hand, the included angles between the five fingers and the back of the hand are calculated from the collected hand-motion data, the target angle required for the controlled manipulator to grab the object is derived from those angles, and the controlled manipulator is then driven to the target angle, so that highly dexterous user control with accurate positioning is achieved. On the other hand, after the manipulator contacts the object at the target angle, it is further adjusted until every expected contact point preset in the manipulator touches the object to be grabbed; setting expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and number of the expected contact points, the manipulator's grasping capability can be tuned to different task requirements.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
Fig. 1 is a schematic diagram illustrating an exemplary system architecture to which an object grasping method and apparatus according to an embodiment of the present disclosure may be applied;
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use with the electronic device used to implement embodiments of the present disclosure;
FIG. 3 schematically illustrates an operational schematic of an object grasping method according to one embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart of an object grasping method according to a specific embodiment of the present disclosure;
fig. 5 schematically illustrates a block diagram of an object grasping apparatus according to one embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an object grasping method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having a display screen, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The object grabbing method provided by the embodiments of the present disclosure may be executed by the terminal devices 101, 102, and 103, and correspondingly, the object grabbing apparatus may also be disposed in the terminal devices 101, 102, and 103. The object grabbing method may also be executed jointly by the terminal devices 101, 102, and 103 and the server 105, in which case the object grabbing apparatus may be disposed in both the terminal devices and the server 105. In addition, the object grabbing method may also be executed by the server 105 alone, with the object grabbing apparatus disposed in the server 105, which is not particularly limited in this exemplary embodiment.
For example, in the present exemplary embodiment, the above object grabbing method may be performed by the terminal devices 101, 102, and 103. First, an expected contact point is set in the controlled manipulator according to the grabbing task; the terminal devices 101, 102, and 103 acquire the expected contact point preset in the controlled manipulator and collect motion data of the human hand in real time. The terminal device then calculates the included angle between each of the five fingers and the back of the hand from the collected motion data and sends it to a proportional-derivative controller arranged on the controlled manipulator, which maps the included angle to a target angle for grabbing the object and drives the controlled manipulator to reach that angle. Finally, the controlled manipulator is adjusted until all desired contact points contact the object to be grasped.
FIG. 2 illustrates a schematic structural diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 200 of the electronic device shown in fig. 2 is only an example, and should not bring any limitation to the functions and the scope of the application of the embodiments of the present disclosure.
As shown in fig. 2, the computer system 200 includes a Central Processing Unit (CPU) 201 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 202 or a program loaded from a storage section 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for system operation are also stored. The CPU 201, ROM 202, and RAM 203 are connected to each other via a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
The following components are connected to the I/O interface 205: an input portion 206 including a keyboard, a mouse, and the like; an output section 207 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 208 including a hard disk and the like; and a communication section 209 including a network interface card such as a LAN card, a modem, or the like. The communication section 209 performs communication processing via a network such as the internet. A drive 210 is also connected to the I/O interface 205 as needed. A removable medium 211 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 210 as necessary, so that a computer program read out therefrom is mounted into the storage section 208 as necessary.
Driven by the demands of everyday life and production, manipulators often need to work in high-risk environments in place of humans, performing tasks such as transport, and they have good application prospects in deep-sea exploration, battlefield mine clearance, nuclear material handling, aerospace equipment maintenance, and similar areas.
In order to control the manipulator to work in a highly complex and unpredictable environment, the inventor tentatively proposed a master-slave control method for the manipulator. Its main idea is to let human-brain intelligence directly make decisions about the various complex environments the manipulator encounters, remotely controlling the manipulator to complete the target operation task. The implementation process is as follows: motion data of the finger-joint movements of the controlling human hand is collected, and based on this data the motion angles of the human fingers are mapped onto the controlled manipulator, thereby controlling the manipulator to complete the target task. In this process, an exoskeleton mechanism can be used to measure the motion angles of the human finger joints, the joint motion can be derived from surface electromyographic signals measured while the hand moves, or, more accurately, the motion angles can be obtained with a data glove.
However, practice has shown that while this method can complete the target operation task by remotely controlling the manipulator with human-brain intelligence, it has the following problems. First, because of the structural difference between the human hand and the manipulator, the acquired hand motion, whether measured by an exoskeleton mechanism, surface electromyographic signals, or a data glove, must be mapped onto the manipulator in a way that preserves the precision, real-time performance, and stability of the manipulator's motion. Yet in this master-slave control method, taking data-glove-based master-slave control of the finger-joint angles as an example, the sensed angle of each finger alone is not enough to adapt the manipulator to the shapes of different objects when the data glove controls the manipulator to grip; the manipulator is also prevented from adjusting its grasping capability according to different task requirements, and local automation of the controlled manipulator in a changing environment cannot be realized. This single master-slave relationship therefore lowers the grasping success rate, lengthens the grasping time, and carries the risk of dropping the object.
In order to solve the problems in the above method, in the present exemplary embodiment the inventor further proposes a new technical solution for completing the target operation task by remotely controlling the manipulator with human-brain intelligence. The technical solution of the embodiments of the present disclosure is elaborated below:
the present exemplary embodiment first provides an object grasping method. Referring to fig. 3, the object grabbing method specifically includes the following steps:
step S310: acquiring an expected contact point preset in a controlled manipulator, and acquiring motion data of a human hand in real time;
step S320: calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
step S330: mapping the included angle to a target angle when the controlled manipulator grabs an object, and controlling the controlled manipulator to reach the target angle;
step S340: and controlling and adjusting the controlled manipulator until the expected contact point is detected to be fully contacted with the object.
In the object grabbing method provided by this exemplary embodiment, on the one hand, the included angles between the five fingers of the human hand and the back of the hand are calculated from the collected hand-motion data, the target angle required for the controlled manipulator to grab the object is derived from those angles, and the controlled manipulator is then driven to the target angle, achieving highly dexterous user control. On the other hand, after the manipulator contacts the object at the target angle, it is adjusted until every expected contact point preset in the manipulator touches the object to be grabbed; setting expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and number of the expected contact points, the manipulator's grasping capability can be tuned to different task requirements.
Next, in another embodiment, the above steps are explained in more detail.
In step S310, a desired contact point preset in the controlled manipulator is acquired, and the motion data of the human hand is collected in real time.
The object grabbing method provided by this embodiment lets human-brain intelligence make the decisions about the working environment the manipulator faces, with the manipulator remotely controlled to complete the target operation task. In the embodiments of the present disclosure, the controlled manipulator is the medium that performs the target operation in the task environment under human control: the human hand controls the manipulator remotely so that it completes the target operation in a complex, high-risk environment, thereby ensuring the safety of the operation. The controlled manipulator may take different forms depending on the target operation task; for example, it may be a five-finger manipulator, a three-finger manipulator, or another form, which is not particularly limited in this exemplary embodiment.
In the present exemplary embodiment, the desired contact point is a contact point that is set in advance in the controlled robot arm in order to increase the contact area between the controlled robot arm and the object to be grasped and improve grasping stability in the grasping task. The desired contact point may be set and adjusted depending on the grasping task. For example, the number and the position of the desired contact points may be determined according to one or more of the shape, the size and the number of the objects to be grasped, and may also be determined together with the shape of the controlled manipulator, which is not particularly limited in this exemplary embodiment.
In the object grabbing method provided by this exemplary embodiment, because the human hand and the controlled manipulator differ structurally, the motion of the human hand needs to be mapped onto the controlled manipulator in a way that preserves the precision, real-time performance, and stability of the dexterous hand's motion. Preferably, the motion data of the human hand is acquired through a data glove, which captures the motion angles of the human finger joints more accurately than the alternatives. Alternatively, an exoskeleton mechanism may be used to measure the motion angles of the finger joints, or the joint motion may be obtained from surface electromyographic signals measured while the hand moves.
Taking acquisition through the data glove as the example, the motion data comprises the three-axis acceleration, three-axis angular velocity, and quaternion of the hand motion. Collecting the motion data in real time may proceed as follows: an inertial data glove is worn on the human hand, with sensors arranged at the tips of the five fingers and on the palm; each sensor is then read periodically to acquire the motion data.
It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
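As an illustration only, a minimal polling loop for such a glove might look as follows; the sensor objects and their read() method are hypothetical stand-ins for whatever interface a glove vendor actually provides.

```python
import time
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ImuSample:
    accel: Tuple[float, float, float]  # three-axis acceleration
    gyro: Tuple[float, float, float]   # three-axis angular velocity
    quat: Tuple[float, float, float, float]  # orientation quaternion [q0, q1, q2, q3]

def poll_glove(sensors, period_s: float = 0.01):
    """Periodically read all six IMUs (five fingertips plus the palm)."""
    while True:
        samples: List[ImuSample] = [s.read() for s in sensors]  # hypothetical read()
        yield samples  # hand the batch of samples to the attitude-solving step
        time.sleep(period_s)
```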
In step S320, an included angle between the five fingers of the human hand and the back of the hand is calculated based on the motion data.
In the present exemplary embodiment, after the motion data is acquired in step S310, an included angle between the five fingers of the human hand and the back of the hand can be calculated by using the acquired motion data, so that the motion of the human hand can be mapped to the controlled manipulator according to the included angle.
The included angle may be calculated, for example, as follows: the quaternions of the five fingers and the back of the hand collected by the data glove in step S310 are normalized to obtain the corresponding attitude quaternions, and the included angle between each finger and the back of the hand is then calculated from the attitude quaternions.
In the following, the calculation process is described in detail by taking the example of calculating the angle between the index finger and the back of the hand at time T:
First, the quaternion q_finger at the index fingertip and the quaternion q_palm at the palm are each normalized, yielding nq_finger and nq_palm respectively.
Taking q_finger = [q0, q1, q2, q3] as an example, the normalization process is as follows:
norm(T) = sqrt(q0(T)^2 + q1(T)^2 + q2(T)^2 + q3(T)^2)
q0(T)_n=q0(T)/norm(T)
q1(T)_n=q1(T)/norm(T)
q2(T)_n=q2(T)/norm(T)
q3(T)_n=q3(T)/norm(T)
where norm(T) is the modulus of the quaternion at time T; q0(T)_n, q1(T)_n, q2(T)_n, and q3(T)_n are the components of the attitude quaternion at time T obtained after normalization; and the normalized index-finger quaternion is written as nq_finger = [q0_n, q1_n, q2_n, q3_n];
finally, the angular difference between the quaternions can be computed from their dot product; that is, the angle between the index finger and the palm is obtained by calculating the angle between the quaternions nq_palm and nq_finger:
θ′ = |nq_palm · nq_finger| = |q0_palm·q0_finger + q1_palm·q1_finger + q2_palm·q2_finger + q3_palm·q3_finger|
angleA = 2 · arccos(θ′)
In this exemplary embodiment, the included angles between the other four fingers and the palm can be calculated following the same process used for the index finger, which is not repeated here. It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
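As a concrete sketch, the normalization and dot-product steps above can be written in a few lines of Python with NumPy; only the formulas given in this description are assumed.

```python
import numpy as np

def quat_normalize(q):
    """Normalize a quaternion [q0, q1, q2, q3] to unit length: q_n = q / norm(q)."""
    q = np.asarray(q, dtype=float)
    return q / np.linalg.norm(q)

def finger_back_angle(q_finger, q_palm):
    """Included angle via angleA = 2 * arccos(|<nq_palm, nq_finger>|), in radians."""
    nq_f, nq_p = quat_normalize(q_finger), quat_normalize(q_palm)
    theta = min(abs(float(np.dot(nq_p, nq_f))), 1.0)  # clamp rounding error above 1
    return 2.0 * np.arccos(theta)

# Identical orientations give a zero included angle:
q = [0.7071, 0.0, 0.7071, 0.0]
assert finger_back_angle(q, q) < 1e-6
```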
In step S330, the included angle is mapped to a target angle when the controlled manipulator grabs the object, and the controlled manipulator is controlled to reach the target angle.
In the present exemplary embodiment, the object is the target that the human-controlled manipulator is to grasp. Because the human hand and the controlled manipulator differ structurally, the calculated included angle must be mapped to a target angle used to control the controlled manipulator, so that the controlled manipulator is effectively steered by human-brain intelligence.
In the present exemplary embodiment, the target angle may be obtained, for example, as follows: a proportional-derivative controller is arranged on the controlled manipulator; the calculated included angles between the five fingers and the back of the hand are acquired and mapped by the proportional-derivative controller to the corresponding target angles.
The proportional-derivative controller is configured to map each calculated finger-to-back-of-hand included angle to a target angle. If a finger is not yet in contact with the object, it is still driven by the proportional-derivative controller arranged in the controlled manipulator, which converts the calculated included angle into a target angle and drives the controlled manipulator to reach it, so that each finger comes into contact with the object to be grabbed. It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
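A minimal sketch of such a per-joint proportional-derivative law is given below; the gains and the linear hand-to-robot angle mapping are illustrative assumptions, not values from this disclosure.

```python
class PDController:
    """Torque law u = Kp * e + Kd * de/dt driving a joint toward its target angle."""
    def __init__(self, kp: float = 5.0, kd: float = 0.2):  # assumed gains
        self.kp, self.kd = kp, kd
        self.prev_error = 0.0

    def torque(self, target: float, current: float, dt: float) -> float:
        error = target - current
        d_error = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.kd * d_error

def map_to_target(hand_angle: float, scale: float = 1.0, offset: float = 0.0) -> float:
    """Illustrative linear mapping from a finger/back-of-hand angle to a joint target."""
    return scale * hand_angle + offset
```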
In step S340, the controlled manipulator is adjusted under control until all of the desired contact points are detected to be in contact with the object.
In the object grabbing method provided by the present exemplary embodiment, after the controlled manipulator contacts the object in step S330, the idea of shared control enables the controlled manipulator to grasp the object stably, filling a gap in the prior art and building a bridge between human intention and the effective execution of the desired task.
The above process may be implemented, for example, as follows: a tactile sensor is arranged at each finger of the controlled manipulator, and the joint torque of the controlled manipulator is adjusted until the object contacts all of its expected contact points.
In the above method, the joint torque adjustment may be implemented as follows: after the controlled manipulator contacts the object in step S330, the shared controller processes the readings of the tactile sensors placed on the fingers of the controlled manipulator to fine-tune the fingers, applying a predefined force that adjusts the magnitude and direction of the joint torques. That is, each finger may be made to slide along the surface of the object, continuing to seek contact at the desired contact points that have not yet been reached. Meanwhile, the proportional-derivative controller continues to compute the joint torque required by the joint angles commanded through the data glove. Under this shared control, when the controlled manipulator contacts the object to be grabbed, the tactile sensors on the manipulator, combined with the direction of finger motion, allow the bending angle of each finger to increase automatically, maximizing the contact area between the dexterous hand and the object and grasping it stably. Note that the tactile sensors may be replaced by other force-feedback devices providing the same function, which is not limited in this exemplary embodiment.
It should be noted that the above scenario is only an exemplary illustration, and the scope of protection of the exemplary embodiment is not limited thereto.
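The fine-tuning step could be sketched as follows, assuming a hypothetical finger object that exposes its tactile readings, desired-contact flags, and per-joint torque commands; the closing bias stands in for the predefined force mentioned above.

```python
def slide_toward_contacts(fingers, closing_bias: float = 0.05):
    """One fine-tuning pass: any finger already touching the object but missing some
    of its desired contact points gets a small extra closing torque on each joint,
    so it slides along the surface toward the unreached contact points."""
    all_reached = True
    for finger in fingers:
        touching = any(s.in_contact() for s in finger.tactile_sensors)  # hypothetical API
        reached = all(finger.desired_contact_flags())                   # hypothetical API
        if touching and not reached:
            for joint in finger.joints:
                joint.torque += closing_bias  # predefined force, magnitude assumed
            all_reached = False
    return all_reached  # True once every desired contact point touches the object
```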
Next, taking a specific example of the present exemplary embodiment as an example, the object grabbing method is further described, and as shown in fig. 4, the specific example includes the following steps:
step S410: and collecting data.
In this step, an inertial data glove is worn on the controlling human hand, with an inertial sensor arranged at each of the five fingertips and on the palm. Each inertial sensor is read periodically at time T; its output comprises the three-axis acceleration, three-axis angular velocity, and quaternion at time T.
Step S420: Attitude solving.
In the step, the included angle between each finger of the human hand and the back of the hand at the time T is calculated by applying quaternion operation. The calculation process is already described in detail in step S320, and therefore will not be described herein again.
Step S430: Shared control.
In this step, desired contact points are first defined in the controlled manipulator, and the controlled manipulator is adjusted until it touches all the preset desired contact points, which increases the contact area between the controlled manipulator and the object. The type and number of desired contact points can be customized for each grasping task. The manipulator used in this embodiment has five fingers, with tactile sensors provided on the palmar side of each finger and on the side and tip surfaces that come into contact with objects. Each finger has three phalanges, and the joint between each pair of phalanges can be torque-controlled independently. In the controlled manipulator, one desired contact point is defined for each phalanx of each finger except the thumb, for which two desired contact points are defined.
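For concreteness, the layout just described can be encoded as a simple table; the point names below are illustrative only.

```python
# One desired contact point per phalanx for four fingers, two for the thumb (14 total).
DESIRED_CONTACT_POINTS = {
    "thumb":  ["distal_pad", "proximal_pad"],
    "index":  ["proximal", "middle", "distal"],
    "middle": ["proximal", "middle", "distal"],
    "ring":   ["proximal", "middle", "distal"],
    "little": ["proximal", "middle", "distal"],
}
assert sum(len(points) for points in DESIRED_CONTACT_POINTS.values()) == 14
```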
The controlled manipulator is provided with a proportional-derivative controller and a shared controller, which together realize shared control. Shared control comprises the following two parts (a combined sketch follows the list):
(1) When the controlled manipulator is not in contact with any object, the proportional-derivative controller of the controlled manipulator maps the angle between each finger of the human hand and the back of the hand, calculated in step S420, to the target angle of the controlled manipulator, and adjusts the joint torque so that the finger joints of the controlled manipulator reach the target angle.
(2) Under shared control, once a tactile sensor at a fingertip of the controlled manipulator touches the object, the shared controller applies joint torque in the direction of the desired contact points, bringing more of the manipulator's desired contact points into contact with the object and increasing the contact area, until all the desired contact points arranged in the controlled manipulator touch the object, thereby stabilizing it.
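Putting the two branches together, one cycle of the shared controller might be sketched as below; the helper names reuse the earlier sketches (map_to_target, the PD law, the tactile flags) and remain assumptions rather than the patented implementation.

```python
def shared_control_step(manipulator, hand_angles, dt: float, closing_bias: float = 0.05):
    """One control cycle: PD tracking of the glove before contact (part 1),
    torque biasing toward the remaining desired contact points after contact (part 2)."""
    for finger, hand_angle in zip(manipulator.fingers, hand_angles):
        if not any(s.in_contact() for s in finger.tactile_sensors):
            # Part (1): no contact yet, track the mapped target angle on each joint.
            target = map_to_target(hand_angle)
            for joint in finger.joints:
                joint.torque = finger.pd.torque(target, joint.angle, dt)
        elif not all(finger.desired_contact_flags()):
            # Part (2): in contact, close further toward the unreached contact points.
            for joint in finger.joints:
                joint.torque += closing_bias
    return all(all(f.desired_contact_flags()) for f in manipulator.fingers)
```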
In this embodiment, on the one hand, the included angles between the five fingers of the hand and the back of the hand are calculated from the collected hand-motion data, the target angle required for the controlled manipulator to grab the object is derived from those angles, and the controlled manipulator is then driven to the target angle to grab the object. On the other hand, after the manipulator contacts the object at the target angle, it is adjusted until every expected contact point preset in the manipulator touches the object to be grabbed; setting expected contact points increases the contact area between the manipulator and the object and improves grasping stability. Meanwhile, by adjusting the positions and number of the expected contact points, the manipulator's grasping capability can be tuned to different task requirements.
It should be noted that although the various steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that these steps must be performed in this particular order, or that all of the depicted steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in the present exemplary embodiment, an object grabbing apparatus is also provided, and as shown in fig. 5, the object grabbing apparatus 500 may include an acquisition module 510, a calculation module 520, a mapping module 530, and a grabbing module 540. Wherein:
the acquisition module 510 may be configured to acquire a desired contact point preset in the controlled manipulator and acquire motion data of the human hand in real time;
the calculation module 520 may be configured to calculate an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
the mapping module 530 may be configured to map the included angle to a target angle when the controlled manipulator grabs an object, and control the controlled manipulator to reach the target angle;
the gripping module 540 may be used to adjust the controlled robot until the object contacts all of the desired contact points of the controlled robot.
The specific details of each module or unit in the object capture device have been described in detail in the corresponding object capture method, and therefore are not described herein again.
It should be noted that although in the above detailed description several modules or units of the device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functionality of two or more modules or units described above may be embodied in one module or unit, according to embodiments of the present disclosure. Conversely, the features and functions of one module or unit described above may be further divided into embodiments by a plurality of modules or units.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in the above embodiments. For example, the electronic device may implement the steps shown in fig. 3 to 4, and the like.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. An object grasping method, characterized by comprising:
acquiring an expected contact point preset in a controlled manipulator, and arranging sensors at the ends of five fingers and a palm of a hand to acquire motion data of the hand in real time; the motion data comprises three-axis acceleration, three-axis angular velocity and quaternion of the motion of the human hand;
calculating an included angle between the five fingers of the human hand and the back of the hand based on the motion data;
mapping the included angle to a target angle when the controlled manipulator grabs an object, and controlling the controlled manipulator to reach the target angle;
the controlled manipulator is provided with a proportional-derivative controller; the mapping the included angle to a target angle when the controlled manipulator grabs the object includes: acquiring an included angle between the five fingers of the hand and the back of the hand, and mapping by the proportional-derivative controller to obtain the corresponding target angle;
controlling and adjusting the controlled manipulator until the expected contact point is detected to be in full contact with the object;
and stabilizing the grasping of the object by the controlled manipulator through a shared control theory.
2. The object grasping method according to claim 1, wherein the human hand is worn with a data glove;
the collecting motion data of the human hand in real time comprises:
and periodically reading each sensor to acquire the motion data.
3. The object grabbing method according to claim 2, wherein the calculating of the included angle between the five fingers of the human hand and the back of the hand based on the motion data includes:
normalizing the collected quaternions of the five fingers and the back of the hand to obtain corresponding attitude quaternions;
and calculating the included angle between each finger and the back of the hand by using the attitude quaternions.
4. The object grasping method according to claim 1, wherein the desired contact point is set in accordance with a shape, a size, and a number of the object.
5. The object grabbing method according to claim 1, wherein a touch sensor is arranged at each finger of the controlled manipulator;
the adjusting the controlled manipulator until the object contacts all the desired contact points of the controlled manipulator comprises:
adjusting the joint moment of the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator.
6. The object grasping method according to claim 5, wherein the adjusting the joint moment of the controlled robot includes:
and adjusting the joint moment of the controlled manipulator to move each finger of the controlled manipulator to the direction of increasing the contact area with the object.
7. An object grasping apparatus, comprising:
the acquisition module is used for acquiring an expected contact point preset in the controlled manipulator, with sensors respectively arranged at the tips of the five fingers and the palm of the hand to acquire motion data of the hand in real time; the motion data comprises the three-axis acceleration, three-axis angular velocity, and quaternion of the motion of the human hand;
the calculation module is used for calculating the included angle between the five fingers of the human hand and the back of the hand based on the motion data;
the mapping module is used for mapping the included angle to a target angle when the controlled manipulator grabs an object and controlling the controlled manipulator to reach the target angle;
the controlled manipulator is provided with a proportional-derivative controller; the mapping the included angle to a target angle when the controlled manipulator grabs the object includes: acquiring an included angle between the five fingers of the hand and the back of the hand, and mapping by the proportional-derivative controller to obtain the corresponding target angle;
the grabbing module is used for adjusting the controlled manipulator until the object contacts all the expected contact points of the controlled manipulator; and stabilizing the grasping of the object by the controlled manipulator through a shared control theory.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1-6.
9. An electronic device, comprising:
a processor;
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-6 via execution of the executable instructions.
CN202010144401.XA 2020-03-04 2020-03-04 Object grabbing method and device, storage medium and electronic equipment Active CN111376269B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010144401.XA CN111376269B (en) 2020-03-04 2020-03-04 Object grabbing method and device, storage medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN111376269A CN111376269A (en) 2020-07-07
CN111376269B (en) 2021-11-09

Family

ID=71213529

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010144401.XA Active CN111376269B (en) 2020-03-04 2020-03-04 Object grabbing method and device, storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN111376269B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117901097B * 2024-01-12 2024-07-02 Shenzhen Polytechnic University Hand-foot bionic manipulator grabbing control method and device based on joint pose change

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120071891A1 (en) * 2010-09-21 2012-03-22 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US20170071679A1 (en) * 2011-02-15 2017-03-16 Intuitive Surgical Operations, Inc. Methods and systems for indicating a clamping prediction
CN109514521A (en) * 2018-12-18 2019-03-26 合肥工业大学 The servo operation and its method of manpower collaboration Dextrous Hand based on multi-information fusion
CN109732610A (en) * 2019-03-01 2019-05-10 北京航空航天大学 Man-machine collaboration robot grasping system and its working method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1011122A (en) * 1996-06-24 1998-01-16 Nippon Telegr & Teleph Corp <Ntt> Information providing device


Also Published As

Publication number Publication date
CN111376269A (en) 2020-07-07

Similar Documents

Publication Publication Date Title
Okamura et al. An overview of dexterous manipulation
Nguyen Constructing force-closure grasps
Fang et al. A novel data glove using inertial and magnetic sensors for motion capture and robotic arm-hand teleoperation
Fang et al. A robotic hand-arm teleoperation system using human arm/hand with a novel data glove
Yuan et al. Design and control of roller grasper v2 for in-hand manipulation
WO2015106278A2 (en) Wearable robot assisting manual tasks
CN102814814A (en) Kinect-based man-machine interaction method for two-arm robot
Fang et al. Robotic teleoperation systems using a wearable multimodal fusion device
Liang et al. An Augmented Discrete‐Time Approach for Human‐Robot Collaboration
Zheng et al. Dexterous robotic grasping of delicate fruits aided with a multi-sensory e-glove and manual grasping analysis for damage-free manipulation
CN116113523A (en) Information processing device, information processing method, and program
CN111376269B (en) Object grabbing method and device, storage medium and electronic equipment
Çoban et al. Wireless teleoperation of an industrial robot by using myo arm band
Fan et al. Improved teleoperation of an industrial robot arm system using leap motion and myo armband
Falck et al. DE VITO: A dual-arm, high degree-of-freedom, lightweight, inexpensive, passive upper-limb exoskeleton for robot teleoperation
Boru et al. Novel technique for control of industrial robots with wearable and contactless technologies
Parga et al. Tele-manipulation of robot arm with smartphone
Sharma et al. Design and implementation of robotic hand control using gesture recognition
Deng et al. Human-like posture correction for seven-degree-of-freedom robotic arm
CN212352006U (en) Teaching glove and teaching system of two-finger grabbing robot
Liu et al. A novel upper limb training system based on ur5 using semg and imu sensors
Wang et al. A sensor glove based on inertial measurement unit for robot teleoperetion
Li et al. Simulation results for manipulation of unknown objects in hand
CN111002295A (en) Teaching glove and teaching system of two-finger grabbing robot
Scharfe et al. Hybrid physics simulation of multi-fingered hands for dexterous in-hand manipulation

Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant
CP01: Change in the name or title of a patent holder

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Patentee after: Jingdong Technology Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Patentee before: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address after: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Patentee after: Jingdong Shuke Haiyi Information Technology Co.,Ltd.

Address before: 601, 6 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Patentee before: BEIJING HAIYI TONGZHAN INFORMATION TECHNOLOGY Co.,Ltd.
