CN113858238A - Robot bionic hand and grabbing method and system - Google Patents

Robot bionic hand and grabbing method and system

Info

Publication number
CN113858238A
Authority
CN
China
Prior art keywords
finger
grabbing
paw
hand
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111070052.2A
Other languages
Chinese (zh)
Other versions
CN113858238B (en)
Inventor
丁梓豪
陈国栋
王振华
Current Assignee
Suzhou University
Original Assignee
Suzhou University
Priority date
Filing date
Publication date
Application filed by Suzhou University filed Critical Suzhou University
Priority to CN202111070052.2A
Publication of CN113858238A
Application granted
Publication of CN113858238B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00 Gripping heads and other end effectors
    • B25J15/0009 Gripping heads and other end effectors comprising multi-articulated fingers, e.g. resembling a human hand
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1612 Programme controls characterised by the hand, wrist, grip control

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure provides a robot bionic hand, comprising: a three-finger paw body, which comprises a palm and three fingers, each finger comprising a finger root, a finger middle, a finger front and a finger tip; the palm is hinged to the finger root of each of the three fingers, the finger root of each finger is hinged to its finger middle, the finger middle of each finger is hinged to its finger front, the finger front and the finger tip are formed as one piece, and the positions of the three fingers can be adjusted freely; a paw driving motor, arranged on the three-finger paw body and used to drive and control its operation; tactile sensors, arranged on the three fingers of the three-finger paw body to form an array tactile sensor, which senses the contact force and contact force distribution of an object and converts them into electric signals matched to the pressure; and a flexible film pressure conversion module, used to convert the analog signals of the tactile sensors into digital signals. The disclosure also provides a robot bionic hand grabbing method and system.

Description

Robot bionic hand and grabbing method and system
Technical Field
The disclosure relates to a robot hand, a grabbing system and a grabbing method, and belongs to the technical field of robots.
Background
With the continuous development of robot technology, robots have been applied in many fields, replacing humans in complex work.
Because the work content differs from field to field, different paws must be customized for the robot. Traditional robot paws are based on manual teaching or visual guidance and cannot adapt to objects made of different materials, so objects made of soft materials are easily damaged.
How to plan the correct manipulation according to the hardness of an object, so that the object neither slips nor is damaged, is a current difficulty in the robotics field.
Disclosure of Invention
In order to solve one of the above technical problems, the present disclosure provides a robot bionic hand and a grabbing method and system.
According to an aspect of the present disclosure, there is provided a robotic bionic hand, comprising:
a three-finger paw body, wherein the three-finger paw body comprises a palm, a first finger, a second finger and a third finger, each finger comprising a finger root, a finger middle, a finger front and a finger tip; the palm is hinged to the finger roots of the first finger, the second finger and the third finger respectively, the finger root of each finger is hinged to its finger middle, the finger middle of each finger is hinged to its finger front, the finger front and the finger tip of each finger are formed as one piece, and the positions of the first finger, the second finger and the third finger can be adjusted at will;
the paw driving motor is arranged on the three-finger paw body and used for driving and controlling the three-finger paw body to operate;
tactile sensors, arranged on the three fingers of the three-finger paw body to form an array tactile sensor, used for sensing the contact force and contact force distribution of an object and converting them into electric signals matched to the pressure; and
a flexible film pressure conversion module, used for converting the analog signals of the tactile sensors into digital signals.
The robot bionic hand according to at least one embodiment of the present disclosure, the composition of the tactile sensor comprises:
a polyimide insulating layer, wherein epoxy resin is used as the filler of the piezoresistive sheet; and
a high molecular polymer thick film device located between the polyimide insulating layers.
The robotic bionic hand according to at least one embodiment of the present disclosure, the tactile sensor includes:
a sensing area, used for sensing the contact force of a contacted object; and
a pin, used for outputting signals.
The robot bionic hand according to at least one embodiment of the present disclosure, wherein the shape of the sensing area comprises a circle and/or a long strip.
According to the robot bionic hand of at least one embodiment of the disclosure, the diameter of the circle is 15 mm, the length of the strip is 50 mm, and the width of the strip is 15 mm.
According to still another aspect of the present disclosure, there is provided a robotic bionic hand grasping method including:
the robot bionic hand starts grabbing;
the gripper driving motor drives and controls the three-finger gripper body of the bionic hand of the robot to grip an object based on gripping parameters, wherein the gripping parameters comprise gripping force, gripping posture and gripping position;
judging whether the grabbing is successful; if so, the method ends; otherwise, the tactile sensors of the robot bionic hand acquire the contact force of the contacted object and its distribution as electric signals, which are converted into digital signals by the flexible film pressure conversion module of the robot bionic hand;
analyzing the digital signals to obtain the soft and hard attribute data of the object; and
based on the acquired soft and hard attribute data of the object, adjusting the grabbing parameters of the paw driving motor, including the grabbing force, grabbing posture and grabbing position of the three-finger paw body, and driving and controlling the three-finger paw body of the robot bionic hand to grab the object again, repeating until the grabbing succeeds.
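The closed-loop procedure above (drive, check success, sense, analyze, retune, retry) can be sketched as a short simulation. Everything below is an illustrative assumption: the class and function names, the stiffness-based softness estimate, and the force-adjustment rule are not disclosed in the patent.

```python
from dataclasses import dataclass, replace

@dataclass
class GraspParams:
    force: float     # grabbing force in newtons
    position: float  # finger closing position in mm

def estimate_softness(tactile_values, displacement_mm):
    # Stiffness proxy: total contact force per unit finger displacement.
    # Mapped to [0, 1], where 1 = hard; 10 N/mm treated as fully hard (assumed).
    stiffness = sum(tactile_values) / max(displacement_mm, 1e-6)
    return min(stiffness / 10.0, 1.0)

def adjust_params(params, softness):
    # Softer objects (low softness value) get a gentler grip so they are
    # not crushed; this scaling rule is an assumption.
    return replace(params, force=params.force * (0.5 + 0.5 * softness))

def grasp(params, read_tactile, succeeded, max_attempts=5):
    """Drive the paw, check success, sense softness, retune, and retry."""
    for attempt in range(max_attempts):
        if succeeded(params):                  # grabbing succeeded: stop
            return params, attempt + 1
        forces = read_tactile()                # digitized array tactile data
        softness = estimate_softness(forces, displacement_mm=2.0)
        params = adjust_params(params, softness)
    return params, max_attempts

# Simulated run: a soft object held safely once the force drops below 5 N.
final, attempts = grasp(
    GraspParams(force=8.0, position=30.0),
    read_tactile=lambda: [0.6, 0.5, 0.7],      # one reading per finger (simulated)
    succeeded=lambda p: p.force < 5.0,
)
```

In this simulated run the initial 8 N grip fails, the low stiffness reading scales the force down to about 4.4 N, and the second attempt succeeds.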
According to yet another aspect of the present disclosure, there is provided a robotic biomimetic hand grasping system comprising:
the robotic bionic hand of any one of the above;
a tactile analysis module, in communication connection with the flexible film pressure conversion module of the robot bionic hand, used for receiving and analyzing the data of the tactile sensors converted by the flexible film pressure conversion module to obtain the soft and hard attribute data of the object material; and
a grabbing control module, in communication connection with the paw driving motor of the robot bionic hand, which controls the three-finger paw body to grab the object based on the soft and hard attribute data of the object material.
In the robot bionic hand grabbing system according to at least one embodiment of the present disclosure, the tactile analysis module is in communication connection with the flexible film pressure conversion module of the robot bionic hand, and receiving and analyzing the data of the tactile sensors converted by the flexible film pressure conversion module to obtain the soft and hard attribute data of the object material comprises:
the data of the touch sensor is analyzed in real time through a neural network algorithm to obtain soft and hard attribute data of the material of the object, and the data of the touch sensor is a digital signal corresponding to the contact force and the contact force distribution of the object when the touch sensor contacts the object.
According to the robot bionic hand grabbing system of at least one embodiment of the present disclosure, controlling the three-finger paw body to grab the object based on the soft and hard attribute data of the object material comprises:
adjusting the grabbing force based on the soft and hard attribute data of the material;
adjusting the grabbing position based on the soft and hard attribute data of the material; and
adjusting the grabbing posture of the three-finger paw body based on the soft and hard attribute data of the material.
According to the robot bionic hand grabbing system of at least one embodiment of the disclosure, the tactile analysis module and the grabbing control module are deployed on a remote computer.
Drawings
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the disclosure and together with the description serve to explain the principles of the disclosure.
Fig. 1 is a schematic structural diagram of a robotic bionic hand according to one embodiment of the present disclosure.
Fig. 2 is a schematic structural view of a three-finger gripper body of a robotic bionic hand according to one embodiment of the present disclosure.
FIG. 3 is a schematic diagram of a circular sensor configuration according to one embodiment of the present disclosure.
Fig. 4 is a schematic view of an elongated sensor structure according to one embodiment of the present disclosure.
Fig. 5 is a flowchart of a robot bionic hand grabbing method according to an embodiment of the disclosure.
Fig. 6 is a schematic structural diagram of a robotic biomimetic hand grasping system according to one embodiment of the present disclosure.
Description of the reference numerals
1000 robot bionic hand
1002 three-finger paw body
1004 gripper driving motor
1006 tactile sensor
1008 flexible film pressure conversion module
1021 palm
1022 first finger
1023 second finger
1024 third finger
1025 finger root
1026 finger middle
1027 finger front
1028 finger tip
1029 sensing region
1030 Pin
2000 robot bionic hand grasping system
2002 robot bionic hand
2004 tactile analysis Module
2006 grabbing control module
2100 bus
2200 processor
2300 memory
2400 other circuits.
Detailed Description
The present disclosure will be described in further detail with reference to the drawings and embodiments. It is to be understood that the specific embodiments described herein are for purposes of illustration only and are not to be construed as limitations of the present disclosure. It should be further noted that, for the convenience of description, only the portions relevant to the present disclosure are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict. Technical solutions of the present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Unless otherwise indicated, the illustrated exemplary embodiments/examples are to be understood as providing exemplary features of various details of some ways in which the technical concepts of the present disclosure may be practiced. Accordingly, unless otherwise indicated, features of the various embodiments may be additionally combined, separated, interchanged, and/or rearranged without departing from the technical concept of the present disclosure.
The use of cross-hatching and/or shading in the drawings is generally used to clarify the boundaries between adjacent components. As such, unless otherwise noted, the presence or absence of cross-hatching or shading does not convey or indicate any preference or requirement for a particular material, material property, size, proportion, commonality between the illustrated components and/or any other characteristic, attribute, property, etc., of a component. Further, in the drawings, the size and relative sizes of components may be exaggerated for clarity and/or descriptive purposes. While example embodiments may be practiced differently, the specific process sequence may be performed in a different order than that described. For example, two processes described consecutively may be performed substantially simultaneously or in reverse order to that described. In addition, like reference numerals denote like parts.
When an element is referred to as being "on," "connected to" or "coupled to" another element, it can be directly on, connected or coupled to the other element, or intervening elements may be present. However, when an element is referred to as being "directly on," "directly connected to" or "directly coupled to" another element, there are no intervening elements present. For purposes of this disclosure, the term "connected" may refer to a physical connection, an electrical connection, etc., with or without intermediate components.
For descriptive purposes, the present disclosure may use spatially relative terms such as "beneath," "below," "under," "lower," "above," "over," "higher," and "side" (e.g., as in "sidewall") to describe the relationship of one component to another component as illustrated in the figures. Spatially relative terms are intended to encompass different orientations of the device in use, operation, and/or manufacture in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as "below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the exemplary term "below" can encompass both an orientation of above and below. Further, the device may be otherwise positioned (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
The terminology used herein is for the purpose of describing particular embodiments and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, when the terms "comprises" and/or "comprising" and variations thereof are used in this specification, the presence of stated features, integers, steps, operations, elements, components and/or groups thereof are stated but does not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It is also noted that, as used herein, the terms "substantially," "about," and other similar terms are used as approximate terms and not as degree terms, and as such, are used to interpret inherent deviations in measured values, calculated values, and/or provided values that would be recognized by one of ordinary skill in the art.
Fig. 1 and 2 are schematic structural views of a robotic bionic hand according to one embodiment of the present disclosure.
As shown in fig. 1 and 2, a robotic bionic hand 1000 includes:
the three-finger paw body 1002, which comprises a palm 1021, a first finger 1022, a second finger 1023 and a third finger 1024, each finger comprising a finger root 1025, a finger middle 1026, a finger front 1027 and a finger tip 1028; the palm 1021 is hinged to the finger roots of the first finger 1022, the second finger 1023 and the third finger 1024 respectively, the finger root of each finger is hinged to its finger middle, the finger middle of each finger is hinged to its finger front, the finger front 1027 and the finger tip 1028 of each finger are formed as one piece, and the positions of the first finger 1022, the second finger 1023 and the third finger 1024 can be adjusted at will;
the paw driving motor 1004 is arranged on the three-finger paw body 1002 and is used for driving and controlling the three-finger paw body 1002 to run;
the tactile sensors 1006, arranged on the three fingers of the three-finger paw body to form an array tactile sensor, used for sensing the contact force and contact force distribution of an object and converting them into electric signals matched to the pressure; and
and the flexible film pressure conversion module 1008 is used for converting the analog signal of the touch sensor into a digital signal.
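The conversion performed by the flexible film pressure conversion module 1008 can be pictured as reading each piezoresistive taxel through a voltage divider and quantizing the result with an ADC. The component values below (3.3 V supply, 10 kOhm reference resistor, 12-bit ADC) and the resistance-versus-force model are illustrative assumptions, not values from the patent.

```python
V_SUPPLY = 3.3      # supply voltage in volts (assumed)
R_REF = 10_000.0    # fixed divider resistor in ohms (assumed)
ADC_BITS = 12       # ADC resolution (assumed)

def taxel_resistance(force_n):
    # Piezoresistive behaviour: resistance drops as contact force rises.
    # Simple inverse model (assumed): ~100 kOhm unloaded, ~1 kOhm near 10 N.
    return 100_000.0 / (1.0 + 9.9 * force_n)

def to_digital(force_n):
    """Contact force (N) on one taxel -> ADC count (0..4095)."""
    v_out = V_SUPPLY * R_REF / (R_REF + taxel_resistance(force_n))
    return round(v_out / V_SUPPLY * (2 ** ADC_BITS - 1))

# Digitize one frame of simulated contact forces (N) from the array sensor.
frame = [to_digital(f) for f in [0.0, 0.5, 2.0, 5.0]]
```

Larger contact forces lower the taxel resistance, raise the divider output, and produce a larger ADC count, so the digital frame preserves the force ordering.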
The tactile sensor 1006 is composed of:
a polyimide insulating layer, wherein epoxy resin is used as the filler of the piezoresistive sheet; and
a high molecular polymer thick film device located between the polyimide insulating layers.
The tactile sensor 1006 includes:
a sensing region 1029, used for sensing the contact force of a contacted object; and
a pin 1030, used for outputting signals.
The shape of the sensing region comprises a circle and/or a long strip.
In the robot bionic hand according to at least one embodiment of the present disclosure, the diameter of the circle is 15 mm, and the strip is 50 mm long and 15 mm wide.
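Using the sensing-area dimensions stated above (a 15 mm diameter circle and a 50 mm by 15 mm strip), the same contact force corresponds to different average contact pressures on the two sensor shapes. The uniform-pressure assumption below is a simplification for illustration.

```python
import math

CIRCLE_AREA_MM2 = math.pi * (15.0 / 2.0) ** 2  # circular sensing area, ~176.7 mm^2
STRIP_AREA_MM2 = 50.0 * 15.0                   # strip sensing area, 750 mm^2

def avg_pressure_kpa(force_n, area_mm2):
    # P = F / A, with 1 N/mm^2 = 1 MPa = 1000 kPa, assuming uniform loading.
    return force_n / area_mm2 * 1000.0

# The same 2 N contact force spread over each shape:
p_circle = avg_pressure_kpa(2.0, CIRCLE_AREA_MM2)  # fingertip sensor
p_strip = avg_pressure_kpa(2.0, STRIP_AREA_MM2)    # finger-body sensor
```

The smaller circular pad sees roughly four times the average pressure of the strip for the same force, which is one reason contact force distribution, not just total force, matters when grabbing soft objects.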
FIG. 3 is a schematic diagram of a circular sensor configuration according to one embodiment of the present disclosure.
As shown in fig. 3, the circular tactile sensor includes: a sensing region 1029 and a pin 1030.
Fig. 4 is a schematic view of an elongated sensor structure according to one embodiment of the present disclosure.
As shown in fig. 4, the elongated tactile sensor includes: a sensing region 1029 and a pin 1030.
Fig. 5 is a schematic flow diagram of a robot bionic hand grabbing method according to one embodiment of the disclosure.
As shown in fig. 5, a robot bionic hand grabbing method S100 includes:
S102: the robot bionic hand starts grabbing;
S104: the paw driving motor drives and controls the three-finger paw body of the robot bionic hand to grab an object based on grabbing parameters, wherein the grabbing parameters include the grabbing force, grabbing posture and grabbing position;
S106: judging whether the grabbing is successful; if so, go to step S114, otherwise go to step S108;
S108: the tactile sensors of the robot bionic hand acquire the contact force of the contacted object and its distribution as electric signals, which are converted into digital signals by the flexible film pressure conversion module of the robot bionic hand;
S110: analyzing the digital signals to obtain the soft and hard attribute data of the object;
S112: based on the acquired soft and hard attribute data of the object, adjusting the grabbing parameters of the paw driving motor, including the grabbing force, grabbing posture and grabbing position of the three-finger paw body, and going to step S104; and
S114: the grabbing is finished.
Fig. 6 is a schematic structural diagram of a robotic biomimetic hand grasping system according to one embodiment of the present disclosure.
The means in the system may comprise respective modules for performing each or several of the steps of the above-described flow charts. Thus, each step or several steps in the above-described flow charts may be performed by a respective module, and the apparatus may comprise one or more of these modules. The modules may be one or more hardware modules specifically configured to perform the respective steps, or implemented by a processor configured to perform the respective steps, or stored within a computer-readable medium for implementation by a processor, or by some combination.
The hardware architecture may be implemented using a bus architecture. The bus architecture may include any number of interconnecting buses and bridges depending on the specific application of the hardware and the overall design constraints. The bus connects together various circuits including one or more processors, memories, and/or hardware modules. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, external antennas, and the like.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only a single connection line is shown, but this does not mean that there is only one bus or one type of bus.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present disclosure includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the implementations of the present disclosure. The processor performs the various methods and processes described above. For example, method embodiments in the present disclosure may be implemented as a software program tangibly embodied in a machine-readable medium, such as a memory. In some embodiments, some or all of the software program may be loaded and/or installed via memory and/or a communication interface. When the software program is loaded into memory and executed by a processor, one or more steps of the method described above may be performed. Alternatively, in other embodiments, the processor may be configured to perform one of the methods described above by any other suitable means (e.g., by means of firmware).
The logic and/or steps represented in the flowcharts or otherwise described herein may be embodied in any readable storage medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
For the purposes of this description, a "readable storage medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable read-only memory (CDROM). In addition, the readable storage medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in the memory.
It should be understood that portions of the present disclosure may be implemented in hardware, software, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the method implementing the above embodiments may be implemented by hardware that is instructed to implement by a program, which may be stored in a readable storage medium, and when executed, includes one or a combination of the steps of the method embodiments.
In addition, each functional unit in the embodiments of the present disclosure may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a readable storage medium. The storage medium may be a read-only memory, a magnetic or optical disk, or the like.
As shown in fig. 6, a robotic biomimetic hand grasping system 2000 comprises:
the robot bionic hand 2002 according to any of the above;
the tactile analysis module 2004, in communication connection with the flexible film pressure conversion module of the robot bionic hand, used for receiving and analyzing the data of the tactile sensors converted by the flexible film pressure conversion module to obtain the soft and hard attribute data of the object material; and
the grabbing control module 2006, in communication connection with the paw driving motor of the robot bionic hand, which controls the three-finger paw body to grab the object based on the soft and hard attribute data of the object material.
In the robot bionic hand grabbing system according to at least one embodiment of the present disclosure, the tactile analysis module is in communication connection with the flexible film pressure conversion module of the robot bionic hand, and receiving and analyzing the data of the tactile sensors converted by the flexible film pressure conversion module to obtain the soft and hard attribute data of the object material comprises:
the data of the touch sensor is analyzed in real time through a neural network algorithm to obtain the soft and hard attribute data of the material of the object, and the data of the touch sensor is the digital signal corresponding to the contact force and the contact force distribution of the object when the touch sensor contacts the object.
According to the robot bionic hand grabbing system of at least one embodiment of the present disclosure, controlling the three-finger paw body to grab the object based on the soft and hard attribute data of the object material comprises:
adjusting the grabbing force based on the soft and hard attribute data of the material;
adjusting the grabbing position based on the soft and hard attribute data of the material; and
adjusting the grabbing posture of the three-finger paw body based on the soft and hard attribute data of the material.
According to the robot bionic hand grabbing system of at least one embodiment of the disclosure, the tactile analysis module and the grabbing control module are deployed on a remote computer.
In the description herein, reference to the terms "one embodiment/mode," "some embodiments/modes," "example," "specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment/mode or example is included in at least one embodiment/mode or example of the application. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment/mode or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments/modes or examples. In addition, the various embodiments/modes or examples described in this specification, and the features thereof, can be combined by those skilled in the art provided they do not conflict.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
It will be understood by those skilled in the art that the foregoing embodiments are merely for clarity of illustration of the disclosure and are not intended to limit the scope of the disclosure. Other variations or modifications may occur to those skilled in the art, based on the foregoing disclosure, and are still within the scope of the present disclosure.

Claims (10)

1. A robotic bionic hand, comprising:
a three-finger paw body, wherein the three-finger paw body comprises a palm, a first finger, a second finger and a third finger; each finger comprises a finger root, a finger middle part, a finger front part and a fingertip; the palm is hinged to the finger root of each of the first finger, the second finger and the third finger; for each finger, the finger root is hinged to the finger middle part, the finger middle part is hinged to the finger front part, and the finger front part and the fingertip are formed in one piece; and the positions of the first finger, the second finger and the third finger are arbitrarily adjustable;
the paw driving motor, arranged on the three-finger paw body, is used for driving and controlling the operation of the three-finger paw body;
the touch sensors, arranged on the three fingers of the three-finger paw body to form an array tactile sensor, are used for sensing the contact force and the contact force distribution of an object and converting them into electrical signals corresponding to the pressure; and
the flexible film pressure conversion module is used for converting the analog signals of the touch sensors into digital signals.
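The claim does not specify how the flexible film pressure conversion module digitizes the analog signals. A minimal sketch of that step, assuming a 12-bit ADC with a 3.3 V reference (both values and all names are illustrative assumptions, not disclosed by the patent):

```python
def quantize_taxel(voltage, v_ref=3.3, bits=12):
    """Map one analog taxel voltage to an integer ADC code (assumed ranges)."""
    full_scale = (1 << bits) - 1                 # 4095 for a 12-bit converter
    code = round(voltage / v_ref * full_scale)
    return max(0, min(code, full_scale))         # clamp to the ADC range

def convert_array(analog_frame):
    """Convert a 2-D frame of analog taxel voltages into digital pressure codes."""
    return [[quantize_taxel(v) for v in row] for row in analog_frame]
```

For example, `convert_array([[0.0, 3.3]])` yields one row spanning the full digital range.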
2. The robotic bionic hand of claim 1, wherein the tactile sensor comprises:
the polyimide insulating layers, wherein epoxy resin is used as a filler of the piezoresistive sheet; and
the high-molecular-polymer thick-film device positioned between the polyimide insulating layers.
3. The robotic bionic hand of claim 1, wherein the tactile sensor comprises:
the sensing area, used for sensing the contact force of a contacted object; and
the pin, used for outputting signals.
4. The robotic bionic hand of claim 3, wherein the shape of the sensing area comprises a circle and/or an elongated bar.
5. The robotic bionic hand of claim 4, wherein the diameter of the circular sensing area is 15 mm, and the bar-shaped sensing area is 50 mm in length and 15 mm in width.
6. A robot bionic hand grabbing method, comprising the following steps:
the robot bionic hand starts grabbing;
the paw driving motor drives and controls the three-finger paw body of the robot bionic hand to grab an object based on grabbing parameters, wherein the grabbing parameters comprise grabbing force, grabbing posture and grabbing position;
judging whether the grabbing is successful; if so, ending; otherwise, acquiring, by the touch sensor of the robot bionic hand, electrical signals of the contact force and the contact force distribution of the contacted object, and converting the signals into digital signals through the flexible film pressure conversion module of the robot bionic hand;
analyzing the digital signals to obtain soft and hard attribute data of the object; and
adjusting, based on the acquired soft and hard attribute data of the object, the grabbing parameters of the paw driving motor, including the grabbing force, grabbing posture and grabbing position of the three-finger paw body, and driving and controlling, by the paw driving motor, the three-finger paw body of the robot bionic hand to grab the object again, repeating until the grabbing is successful.
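The steps of claim 6 form a closed feedback loop: grab, check, sense, analyze, adjust, retry. A minimal control-loop sketch, with the hardware interfaces (`motor`, `sensor`) and the hardness analysis stubbed out; all names and the force-adjustment rule are illustrative assumptions, not the patented implementation:

```python
def grab_with_feedback(motor, sensor, analyze_hardness, params, max_tries=5):
    """Repeatedly grab, adjusting grabbing force from tactile feedback."""
    for _ in range(max_tries):
        motor.grab(params)                     # drive the three-finger paw body
        if motor.grab_succeeded():
            return True
        frame = sensor.read_digital_frame()    # digitized contact-force distribution
        hardness = analyze_hardness(frame)     # soft/hard attribute data, in [0, 1]
        # Assumed rule: soften the grip for soft objects, tighten it for hard ones.
        params["force"] *= 0.8 if hardness < 0.5 else 1.2
    return False
```

In the same spirit, the grabbing posture and position entries of `params` could be updated inside the loop; only the force adjustment is shown here for brevity.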
7. A robotic biomimetic hand grasping system, comprising:
the robotic bionic hand of any one of claims 1 to 5;
the tactile analysis module, in communication connection with the flexible film pressure conversion module of the robot bionic hand, is used for receiving and analyzing the data of the touch sensor converted by the flexible film pressure conversion module of the robot bionic hand to obtain soft and hard attribute data of the material of the object; and
the grabbing control module, in communication connection with the paw driving motor of the robot bionic hand, is used for controlling the three-finger paw body to grab the object based on the soft and hard attribute data of the material of the object.
8. The robot bionic hand grabbing system of claim 7, wherein the receiving and analyzing, by the tactile analysis module, of the data of the touch sensor converted by the flexible film pressure conversion module of the robot bionic hand to obtain soft and hard attribute data of the material of the object comprises:
the data of the touch sensor is analyzed in real time through a neural network algorithm to obtain soft and hard attribute data of the material of the object, and the data of the touch sensor is a digital signal corresponding to the contact force and the contact force distribution of the object when the touch sensor contacts the object.
9. The robotic biomimetic hand-grasping system according to claim 7, wherein the controlling the three-finger gripper body to grasp an object based on soft and hard attribute data of the material of the object comprises:
adjusting the grabbing force based on the soft and hard attribute data of the material;
adjusting the grabbing position based on the soft and hard attribute data of the material; and
and adjusting the grabbing posture of the three-finger paw body based on the soft and hard attribute data of the material.
10. The robot bionic hand grabbing system of claim 7, wherein the tactile analysis module and the grabbing control module are distributed on a remote computer.
CN202111070052.2A 2021-09-13 2021-09-13 Robot bionic hand and grabbing method and system Active CN113858238B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111070052.2A CN113858238B (en) 2021-09-13 2021-09-13 Robot bionic hand and grabbing method and system


Publications (2)

Publication Number Publication Date
CN113858238A true CN113858238A (en) 2021-12-31
CN113858238B CN113858238B (en) 2023-07-21

Family

ID=78995673

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111070052.2A Active CN113858238B (en) 2021-09-13 2021-09-13 Robot bionic hand and grabbing method and system

Country Status (1)

Country Link
CN (1) CN113858238B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100999077A (en) * 2006-12-28 2007-07-18 中国科学院合肥物质科学研究院 Multipurpose shape self-adaptive robot paw and working method
US20160025615A1 (en) * 2014-07-22 2016-01-28 SynTouch, LLC Method and applications for measurement of object tactile properties based on how they likely feel to humans
CN106346510A (en) * 2016-10-11 2017-01-25 佛山科学技术学院 Flexible three-finger clamp holder having touch sensing function
US20220239296A1 (en) * 2019-06-19 2022-07-28 Mitsui Chemicals, Inc. Tactile sensor formed on polyimide thin film having high total light transmittance, and switching device using same


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GONG Xiaoguang: "Design of the Structure and Control System of a Three-Finger Manipulator", China Master's Theses Full-text Database (中国优秀硕士学位论文全文数据库) *


Similar Documents

Publication Publication Date Title
Goger et al. Tactile sensing for an anthropomorphic robotic hand: Hardware and signal processing
Homberg et al. Robust proprioceptive grasping with a soft robot hand
Gorges et al. Haptic object recognition using passive joints and haptic key features
US9914212B2 (en) Systems and methods for sensing objects
US7707001B2 (en) Control of object operating force, object gripping force and robot hands
Drimus et al. Classification of rigid and deformable objects using a novel tactile sensor
Petković et al. Adaptive control algorithm of flexible robotic gripper by extreme learning machine
WO2009144767A1 (en) Complex sensor and robot hand
CN107127735A (en) People's demonstration formula has the robot learning of power and position purpose task
Cannata et al. An embedded tactile and force sensor for robotic manipulation and grasping
Kappassov et al. Semi-anthropomorphic 3D printed multigrasp hand for industrial and service robots
Büscher et al. Tactile dataglove with fabric-based sensors
Nassour et al. Design of new sensory soft hand: Combining air-pump actuation with superimposed curvature and pressure sensors
CN113858238A (en) Robot bionic hand and grabbing method and system
Cirillo et al. Design and evaluation of tactile sensors for the estimation of grasped wire shape
Wiranata et al. A DIY fabrication approach of stretchable sensors using carbon nano tube powder for wearable device
Tajima et al. Robust bin-picking system using tactile sensor
CN113942009B (en) Robot bionic hand grabbing method
JP2006136983A (en) Detection method in distribution type tactile sensor for robot hand and robot hand
CN112659162A (en) Touch sensing fingertip device and robot
Fiedler et al. A low-cost modular system of customizable, versatile, and flexible tactile sensor arrays
Lu et al. 3-D tactile-based object recognition for robot hands using force-sensitive and bend sensor arrays
Li Touching is believing: sensing and analyzing touch information with GelSight
Dong et al. Design and tactile classification of flexible tactile sensor for soft gripper
Nawrocki et al. Structured computational polymers for a soft robot: Actuation and cognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant