CN110091331A - Object grasping method, apparatus, device and storage medium based on a manipulator - Google Patents


Info

Publication number
CN110091331A
Authority
CN
China
Prior art keywords
manipulator
soft
grabbed
hard grade
grade
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910371561.5A
Other languages
Chinese (zh)
Inventor
刘文印
莫秀云
陈俊洪
梁达勇
朱展模
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Guangdong University of Technology filed Critical Guangdong University of Technology
Priority to CN201910371561.5A priority Critical patent/CN110091331A/en
Publication of CN110091331A publication Critical patent/CN110091331A/en
Pending legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Fuzzy Systems (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses an object grasping method, apparatus and device based on a manipulator, and a computer-readable storage medium, comprising: inputting an image of the object to be grasped into a pre-trained convolutional neural network and outputting a first hardness grade of the object to be grasped; controlling the manipulator to close until it touches the object to be grasped, and recording the initial grasp angle of the manipulator at that moment; controlling the manipulator to close by a further preset percentage of the initial grasp angle, and recording the manipulator's current steady-state pressure value and electrode impedance decrease; inputting the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and outputting a second hardness grade of the object to be grasped; determining a target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade; and looking up a target pressure value corresponding to the target hardness grade and controlling the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.

Description

Object grasping method, apparatus, device and storage medium based on a manipulator
Technical field
The present invention relates to the field of machine control technologies, and more particularly to an object grasping method, apparatus, device and computer-readable storage medium based on a manipulator.
Background technique
The hardness of an object is one of its important physical attributes and, to a certain extent, affects manipulator grasp control. The human hand can accurately identify the hardness of an object through its complex tactile perception system and then apply a suitable force to grasp it. For a manipulator, however, this seemingly simple task is not easy: most manipulators have only basic pressure feedback and can hardly distinguish object hardness directly, which affects grasp control for objects of different hardness.
In the prior art, to grasp objects of different hardness, four experimental objects identical in every external parameter except hardness were chosen as grasp targets, each 5 cm x 5 cm x 5 cm in shape, and the four objects were labeled with four different hardness grades. The manipulator is controlled to open fully, and the four test objects of different hardness are placed in turn inside the manipulator so that the left and right edges of the test object are equidistant from the finger sensor contact surfaces on both sides. The mechanical fingers are controlled to close at a constant speed; once the position feedback reaches a preset contact position, the sensors are considered to have contacted the object. After contact, the fingers close a further fixed distance and then stop, and sensor data are collected. After the manipulator has grasped repeatedly, the numerical feedback of the manipulator is used to train a network that outputs the hardness grade. A designed formula then calculates the grasp force and closing distance corresponding to the hardness grade, realizing the grasping of objects of different hardness.
In manipulator grasping experiments, the prior art achieves good results when grasping rigid objects, but when the method for grasping rigid objects is applied to very soft objects, the grasping effect is unsatisfactory. When training the network, the prior art uses grasp objects of a fixed size, which does not meet the standard of accurately grasping everyday objects; as a result, the manipulator shows large errors when grasping common objects of different sizes in daily life. Moreover, the prior art recognizes object hardness using only manipulator tactile data, and this single data source leads to poor performance.
In summary, how to accurately identify the hardness grade of the object to be grasped, while ensuring that the object is held firmly without deformation or breakage, is a problem to be solved at present.
Summary of the invention
The object of the present invention is to provide an object grasping method, apparatus, device and computer-readable storage medium based on a manipulator, so as to solve the problem in the prior art that, because the hardness grade of the object to be grasped cannot be accurately identified, the object cannot be grasped stably while ensuring that no deformation or breakage occurs.
To solve the above technical problem, the present invention provides an object grasping method based on a manipulator, comprising: inputting an image of the object to be grasped into a pre-trained convolutional neural network and outputting a first hardness grade of the object to be grasped; controlling the manipulator to close until it touches the object to be grasped, and recording the initial grasp angle when the manipulator touches the object; controlling the manipulator to close by a further preset percentage of the initial grasp angle, and recording the current steady-state pressure value and electrode impedance decrease of the manipulator; inputting the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and outputting a second hardness grade of the object to be grasped; determining the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade; and looking up the target pressure value corresponding to the target hardness grade and controlling the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
Preferably, inputting the image of the object to be grasped into the pre-trained convolutional neural network and outputting the first hardness grade of the object to be grasped comprises:
obtaining a Kinect image in a Kinect observation scene calibrated in advance with a Baxter collaborative robot;
searching the Kinect image with a YOLOv3 object detection network for the target region containing the object to be grasped;
inputting the image of the target region into a pre-trained target VGG16 network and outputting the first hardness grade of the object to be grasped.
Preferably, the method further comprises:
selecting a preset number of training objects from the ImageNet dataset, and manually annotating the hardness grades of the training objects;
inputting the images of the training objects into a VGG16 network and training the VGG16 network to obtain the trained target VGG16 network.
Preferably, inputting the steady-state pressure value and the electrode impedance decrease into the pre-trained classification algorithm and outputting the second hardness grade of the object to be grasped comprises:
inputting the steady-state pressure value and the electrode impedance decrease into a pre-trained k-nearest-neighbour classification algorithm and outputting the second hardness grade of the object to be grasped.
Preferably, determining the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade comprises:
using a weighted-average method to set a first weighting factor for the first hardness grade and a second weighting factor for the second hardness grade, wherein the first weighting factor and the second weighting factor sum to 1;
fusing the first hardness grade and the second hardness grade according to the first weighting factor and the second weighting factor to determine the target hardness grade of the object to be grasped.
Preferably, controlling the manipulator to close until it touches the object to be grasped, and recording the initial grasp angle when the manipulator touches the object, comprises:
controlling the manipulator to close, and performing contact detection between the manipulator and the object to be grasped using the PAC value of the BioTac sensor;
when the PAC value oscillates, determining that the manipulator has contacted the object to be grasped, and recording the initial grasp angle when the manipulator touches the object.
The present invention also provides an object grasping apparatus based on a manipulator, comprising:
a first hardness grade output module, configured to input the image of the object to be grasped into a pre-trained convolutional neural network and output the first hardness grade of the object to be grasped;
a control module, configured to control the manipulator to close until it touches the object to be grasped, and to record the initial grasp angle when the manipulator touches the object;
a recording module, configured to control the manipulator to close by a further preset percentage of the initial grasp angle, and to record the current steady-state pressure value and electrode impedance decrease of the manipulator;
a second hardness grade output module, configured to input the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and output the second hardness grade of the object to be grasped;
a determining module, configured to determine the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade;
a handling module, configured to look up the target pressure value corresponding to the target hardness grade, and to control the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
Preferably, the control module is specifically configured to:
control the manipulator to close, and perform contact detection between the manipulator and the object to be grasped using the PAC value of the BioTac sensor; when the PAC value oscillates, determine that the manipulator has contacted the object to be grasped, and record the initial grasp angle when the manipulator touches the object.
The present invention also provides an object grasping device based on a manipulator, comprising:
a memory for storing a computer program; and a processor which, when executing the computer program, implements the steps of the above object grasping method based on a manipulator.
The present invention also provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the above object grasping method based on a manipulator.
In the object grasping method based on a manipulator provided by the present invention, the image of the object to be grasped is input into a pre-trained convolutional neural network, which outputs the first hardness grade of the object to be grasped. The manipulator is controlled to close until it touches the object to be grasped, and its initial grasp angle at that moment is recorded. After the manipulator is controlled to close by a further preset percentage of the initial grasp angle, its current steady-state pressure value and electrode impedance decrease are recorded. The steady-state pressure value and the electrode impedance decrease are input into a pre-trained classification algorithm, which outputs the second hardness grade of the object to be grasped. The first hardness grade and the second hardness grade are fused to determine the target hardness grade of the object to be grasped. The present invention combines the visual image and the tactile data to identify the hardness grade of the object to be grasped, greatly improving the identification accuracy of the hardness grade. The actual grasp force of the manipulator is then set according to the target hardness grade, ensuring that the object to be grasped is held firmly without deformation or breakage.
Detailed description of the invention
To illustrate the technical solutions of the embodiments of the present invention or of the prior art more clearly, the accompanying drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a flowchart of the first specific embodiment of the object grasping method based on a manipulator provided by the present invention;
Fig. 2 is a flowchart of the second specific embodiment of the object grasping method based on a manipulator provided by the present invention;
Fig. 3 is a flowchart of the third specific embodiment of the object grasping method based on a manipulator provided by the present invention;
Fig. 4 is a structural block diagram of an object grasping apparatus based on a manipulator provided by an embodiment of the present invention.
Specific embodiment
The core of the present invention is to provide an object grasping method, apparatus, device and computer-readable storage medium based on a manipulator, which improve the success rate of grasping the object to be grasped by accurately predicting its hardness grade.
To enable those skilled in the art to better understand the solution of the present invention, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
Referring to Fig. 1, Fig. 1 is a flowchart of the first specific embodiment of the object grasping method based on a manipulator provided by the present invention. The specific steps are as follows:
Step S101: input the image of the object to be grasped into a pre-trained convolutional neural network and output the first hardness grade of the object to be grasped;
In this embodiment, the hardness of the object to be grasped can be divided into five grades: grade A, grade B, grade C, grade D and grade E. Grade A indicates that the object to be grasped is very soft, grade B that it is fairly soft, grade C that it is slightly soft, grade D that it is fairly hard, and grade E that it is very hard. It should be noted that, in other embodiments of the present invention, the hardness of the object to be grasped may also be divided into a different number of grades.
Step S102: control the manipulator to close until it touches the object to be grasped, and record the initial grasp angle when the manipulator touches the object;
Step S103: control the manipulator to close by a further preset percentage of the initial grasp angle, and record the current steady-state pressure value and electrode impedance decrease of the manipulator;
Step S104: input the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and output the second hardness grade of the object to be grasped;
Step S105: determine the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade;
Step S106: look up the target pressure value corresponding to the target hardness grade, and control the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
In this embodiment, the image of the object to be grasped is input into a pre-trained convolutional neural network to predict the hardness grade of the object to be grasped. The initial grasp angle at which the manipulator touches the object to be grasped is recorded; the manipulator is then closed by a further preset angle on the basis of the initial grasp angle, and the steady-state pressure value and electrode impedance decrease of the manipulator at that moment are recorded. The steady-state pressure value and the electrode impedance decrease are input into a pre-trained classification algorithm, which outputs the hardness grade of the object to be grasped. Fusing the hardness grades output by the convolutional neural network and by the classification algorithm provides a solid guarantee for the stable grasping of the object to be grasped.
Based on the above embodiment, in this embodiment a YOLOv3 object detection network can be used to search the Kinect image acquired by the Baxter collaborative robot for the target region of the object to be grasped; a pre-trained target VGG16 network is then selected to process the image of the target region, and the first hardness grade of the object to be grasped is obtained from its output. Referring to Fig. 2, Fig. 2 is a flowchart of the second specific embodiment of the object grasping method based on a manipulator provided by the present invention. The specific steps are as follows:
Step S201: obtain a Kinect image in a Kinect observation scene calibrated in advance with the Baxter collaborative robot;
Step S202: search the Kinect image with the YOLOv3 object detection network for the target region containing the object to be grasped;
After the Kinect observation scene has been calibrated with the Baxter collaborative robot, the Kinect image is obtained, and the YOLOv3 object detection network is used to find the target region containing the object to be grasped in the Kinect image.
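As an illustrative sketch only (not part of the claimed method), this detection step could be implemented with OpenCV's DNN module and public Darknet YOLOv3 files; the file paths, the 416x416 input size and the confidence threshold below are assumptions:

```python
import cv2
import numpy as np

# Minimal sketch of the ROI-extraction step, assuming Darknet YOLOv3
# config/weight files and a BGR Kinect frame; paths are placeholders.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")

def detect_target_region(frame, conf_threshold=0.5):
    """Return the bounding box (x, y, w, h) of the most confident detection."""
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(net.getUnconnectedOutLayersNames())
    h, w = frame.shape[:2]
    best, best_conf = None, conf_threshold
    for out in outputs:
        for det in out:  # det = [cx, cy, bw, bh, objectness, class scores...]
            conf = det[4] * det[5:].max()
            if conf > best_conf:
                cx, cy, bw, bh = det[:4] * np.array([w, h, w, h])
                best = (int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh))
                best_conf = conf
    return best  # None if nothing exceeds the threshold

# Usage: crop the target region before feeding it to the VGG16 classifier.
# frame = kinect.get_frame()                 # hypothetical Kinect capture call
# x, y, bw, bh = detect_target_region(frame)
# roi = frame[y:y + bh, x:x + bw]
```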
Step S203: input the image of the target region into the pre-trained target VGG16 network and output the first hardness grade of the object to be grasped;
A preset number of objects is chosen from the ImageNet dataset; in this embodiment the images of 80 groups of objects are taken as training data. The hardness grades of the images of the 80 groups of objects are annotated manually, and the annotated images are used as the dataset to train the VGG16 network until it predicts the hardness grades of the 80 groups of objects, completing the training of the VGG16 network and yielding the target VGG16 network.
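A minimal fine-tuning sketch of this training step, assuming PyTorch/torchvision, the five grade labels A-E mapped to 0-4, and an ImageFolder-style directory of the annotated images; the hyperparameters are likewise assumptions rather than values given in the patent:

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_GRADES = 5  # grades A-E; the dataset layout below is assumed
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("hardness_dataset/train", transform=tfm)
loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)

# Start from ImageNet weights and replace the 1000-way head with 5 grades.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, NUM_GRADES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

model.train()
for epoch in range(10):  # epoch count is an assumption
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```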
In other embodiments of the present invention, another trained convolutional neural network may also be chosen to predict the hardness grade of the object to be grasped.
Step S204: control the manipulator of the Baxter collaborative robot to close until it touches the object to be grasped, and record the initial grasp angle when the manipulator touches the object;
Step S205: control the manipulator to close by a further preset percentage of the initial grasp angle, and record the current steady-state pressure value and electrode impedance decrease of the manipulator;
Step S206: input the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and output the second hardness grade of the object to be grasped;
Step S207: determine the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade;
Step S208: set weighting factors for the first hardness grade and the second hardness grade respectively, obtaining a first weighting factor for the first hardness grade and a second weighting factor for the second hardness grade, wherein the first weighting factor and the second weighting factor sum to 1;
Step S209: fuse the first hardness grade and the second hardness grade according to the first weighting factor and the second weighting factor to determine the target hardness grade of the object to be grasped;
Step S210: look up the target pressure value corresponding to the target hardness grade, and control the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
This embodiment provides a two-branch framework for predicting the hardness grade of the object to be grasped; the actual grasp force of the manipulator is decided by predicting the target hardness grade of the object to be grasped, so that when the object is grasped it undergoes essentially no visible deformation.
Based on the above embodiments, in this embodiment the steady-state pressure value and the electrode impedance decrease of the object to be grasped are obtained and input into a pre-trained k-nearest-neighbour classification algorithm, which outputs the second hardness grade of the object to be grasped. After the first hardness grade and the second hardness grade of the object to be grasped are obtained, they are fused using a weighted-average method to obtain the target hardness grade of the object to be grasped. Referring to Fig. 3, Fig. 3 is a flowchart of the third specific embodiment of the object grasping method based on a manipulator provided by the present invention. The specific steps are as follows:
Step S301: obtain a Kinect image in a Kinect observation scene calibrated in advance with the Baxter collaborative robot;
Step S302: search the Kinect image with the YOLOv3 object detection network for the target region containing the object to be grasped;
Step S303: input the image of the target region into the pre-trained target VGG16 network and output the first hardness grade of the object to be grasped;
Step S304: control the manipulator of the Baxter collaborative robot to close, and perform contact detection between the manipulator and the object to be grasped using the PAC value of the BioTac sensor;
Step S305: when the PAC value oscillates, the manipulator has contacted the object to be grasped; record the initial grasp angle when the manipulator touches the object;
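A sketch of how this PAC-based contact check might look in code; the window size, the oscillation threshold and the sensor/gripper calls are assumptions, since the patent states only that the PAC value oscillates visibly at the moment of contact:

```python
import numpy as np

def pac_contact_detected(pac_window, threshold=30.0):
    """Flag contact when the BioTac PAC (dynamic pressure) signal oscillates.

    pac_window: the most recent PAC samples (e.g. a deque of ~50 values).
    threshold: oscillation amplitude above the no-contact noise floor (assumed).
    """
    return np.std(pac_window) > threshold

# Hypothetical closing loop: close until contact, then record the angle.
# while not pac_contact_detected(sensor.recent_pac()):  # assumed sensor API
#     gripper.close_by(step_deg)                        # assumed gripper API
# initial_grasp_angle = gripper.angle()
```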
Step S306: control the manipulator to close by a further 20 percent of the initial grasp angle, and record the current steady-state pressure value and electrode impedance decrease of the manipulator;
Step S307: input the steady-state pressure value and the electrode impedance decrease into a pre-trained k-nearest-neighbour classification algorithm and output the second hardness grade of the object to be grasped;
The tactile feedback data of the manipulator are used as the training set of the k-nearest-neighbour classification algorithm (KNN algorithm). The manipulator of the Baxter collaborative robot grasps each of the 80 groups of objects and collects data. For each object, the manipulator is controlled to close until it meets the object, and its opening angle at that moment is recorded as the initial grasp angle. The manipulator is then closed by a further 20% of the initial grasp angle, and the steady-state pressure value and electrode impedance decrease of the manipulator at that moment are obtained; the hardness grades of these data are annotated manually. The steady-state pressure values and electrode impedance decreases of the 80 groups of objects are obtained according to the above steps and used as training samples of the KNN algorithm, which classifies and identifies the hardness grades of the 80 groups of objects, completing the training of the KNN algorithm.
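A minimal sketch of this tactile branch using scikit-learn's k-nearest-neighbour classifier, assuming a two-dimensional feature vector (steady-state pressure, electrode impedance decrease); the feature values and k below are illustrative placeholders:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each training sample: [steady_state_pressure, electrode_impedance_decrease],
# labeled 0-4 for hardness grades A-E. Values below are made-up placeholders;
# the real training set covers the 80 groups of objects described above.
X_train = np.array([[12.1, 3.4], [25.7, 1.2], [40.3, 0.5]])
y_train = np.array([0, 2, 4])  # manual hardness-grade labels

knn = KNeighborsClassifier(n_neighbors=3)  # k is an assumption
knn.fit(X_train, y_train)

def second_hardness_grade(pressure, impedance_drop):
    """Classify one grasp's tactile reading into a hardness grade."""
    return int(knn.predict([[pressure, impedance_drop]])[0])
```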
It should be noted that, in other embodiments of the present invention, other classification algorithms may also be used to obtain the second hardness grade of the object to be grasped.
Step S308: use a weighted-average method to set weighting factors for the first hardness grade and the second hardness grade respectively, obtaining a first weighting factor for the first hardness grade and a second weighting factor for the second hardness grade, wherein the first weighting factor and the second weighting factor sum to 1;
Step S309: fuse the first hardness grade and the second hardness grade according to the first weighting factor and the second weighting factor to determine the target hardness grade of the object to be grasped;
Since the target VGG16 network and the KNN algorithm differ in their ability to estimate the hardness grade, weighting factors are set to balance the two algorithms. The choice of weighting factors is decisive for the final fusion of the first hardness grade and the second hardness grade. The final judgment of the hardness grade of the object to be grasped is completed using the weighted-average method, as in the sketch below.
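The fusion itself reduces to a single weighted average; in this sketch the 0.6/0.4 split is an assumption (the patent requires only that the two weighting factors sum to 1), and the result is rounded to the nearest integer grade:

```python
def fuse_grades(visual_grade, tactile_grade, w_visual=0.6):
    """Weighted-average fusion of the two hardness grades (0-4 for A-E).

    w_visual + w_tactile must equal 1; the 0.6/0.4 split is assumed.
    """
    w_tactile = 1.0 - w_visual
    return round(w_visual * visual_grade + w_tactile * tactile_grade)

# Example: visual branch says grade 1 (fairly soft), tactile branch grade 2.
# fuse_grades(1, 2) -> 1 under the assumed 0.6/0.4 weighting
```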
Step S310: look up the target pressure value corresponding to the target hardness grade, and control the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
For each hardness grade, the manipulator attempts to grasp the current object starting with a force of size a, increasing the force by an increment of a on each attempt up to an upper limit b. The minimum force that successfully grasps the current object and the maximum force that leaves the current object undeformed are recorded; the object of this hardness grade is grasped repeatedly, and the mean of the minimum successful force and the maximum non-deforming force is taken as the grasp force corresponding to objects of this grade (a < b; a and b are experimentally determined constants).
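This calibration loop might be sketched as follows, where try_grasp and object_deformed stand in for the physical grasp trial and the deformation check, neither of which the patent specifies as code:

```python
def calibrate_grasp_force(a, b, try_grasp, object_deformed):
    """Sweep force from a to b in steps of a; return the grade's grasp force.

    try_grasp(f)       -> True if the object is lifted at force f (assumed hook)
    object_deformed(f) -> True if force f visibly deforms it (assumed hook)
    """
    min_success, max_no_deform = None, None
    f = a
    while f <= b:
        if min_success is None and try_grasp(f):
            min_success = f          # smallest force that lifts the object
        if not object_deformed(f):
            max_no_deform = f        # largest force seen so far without damage
        f += a
    if min_success is None or max_no_deform is None:
        return None                  # no workable force found in [a, b]
    return (min_success + max_no_deform) / 2.0
```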
In this embodiment, the first identification branch obtains the first hardness grade of the object to be grasped using the target VGG16 network. So that the manipulator can grasp objects of different hardness while ensuring that the object is held without visible deformation or breakage, the grasping process of the second identification branch is divided into two stages: before contact and after contact. In the first stage, before contact with the object to be grasped, the manipulator receives a control command and its fingers begin to close towards the object; during this process contact detection is performed using the PAC value of the BioTac sensor, which oscillates visibly at the moment the object to be grasped is touched. In the second stage, after contact is detected, this angle is recorded as the initial grasp angle. Then the manipulator is closed by a further 20% of the initial grasp angle, and the steady-state pressure value and electrode impedance decrease of the manipulator at that moment are obtained. Finally, the two networks above are fused to give the target hardness grade of the object to be grasped; the target hardness grade is mapped to the corresponding manipulator pressure value, and the manipulator is closed until its pressure value reaches this threshold, completing the grasp control of the object to be grasped.
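Putting this last step together, a sketch of the grade-to-pressure lookup and pressure-controlled closure; the pressure table values and the gripper/sensor hooks are assumptions layered on the patent's description:

```python
# Assumed calibration output: one target pressure per hardness grade (A-E).
TARGET_PRESSURE = {0: 2.0, 1: 4.0, 2: 6.5, 3: 9.0, 4: 12.0}  # placeholder values

def grasp_with_target_pressure(grade, gripper, read_pressure, step_deg=0.5):
    """Close the gripper until fingertip pressure reaches the grade's target.

    gripper.close_by and read_pressure are hypothetical hardware hooks.
    """
    target = TARGET_PRESSURE[grade]
    while read_pressure() < target:
        gripper.close_by(step_deg)
```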
When training the network, the prior art fixes the size of the grasp objects, which does not meet the standard of accurately grasping everyday objects. In this embodiment, the objects chosen to train the convolutional neural network and the KNN classifier are common objects of different sizes, so the trained networks are better suited to practical applications. The prior art recognizes object hardness using only manipulator tactile data, and this single data source leads to poor performance; this embodiment combines the grasp tactile data with a network trained on visual images, so the accuracy of hardness identification is greatly improved.
Referring to Fig. 4, Fig. 4 is a structural block diagram of an object grasping apparatus based on a manipulator provided by an embodiment of the present invention. The specific apparatus may comprise:
a first hardness grade output module 100, configured to input the image of the object to be grasped into a pre-trained convolutional neural network and output the first hardness grade of the object to be grasped;
a control module 200, configured to control the manipulator to close until it touches the object to be grasped, and to record the initial grasp angle when the manipulator touches the object;
a recording module 300, configured to control the manipulator to close by a further preset percentage of the initial grasp angle, and to record the current steady-state pressure value and electrode impedance decrease of the manipulator;
a second hardness grade output module 400, configured to input the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and output the second hardness grade of the object to be grasped;
a determining module 500, configured to determine the target hardness grade of the object to be grasped from the first hardness grade and the second hardness grade;
a handling module 600, configured to look up the target pressure value corresponding to the target hardness grade, and to control the manipulator to close until its pressure value reaches the target pressure value, thereby grasping the object to be grasped.
The object grasping apparatus based on a manipulator of this embodiment is used to implement the foregoing object grasping method based on a manipulator; therefore, for its specific implementation, see the embodiment parts of the object grasping method based on a manipulator above. For example, the first hardness grade output module 100, the control module 200, the recording module 300, the second hardness grade output module 400, the determining module 500 and the handling module 600 are respectively used to implement steps S101, S102, S103, S104, S105 and S106 of the above object grasping method based on a manipulator, so their specific implementations can be found in the descriptions of the corresponding embodiments and are not repeated here.
A specific embodiment of the present invention further provides an object grasping device based on a manipulator, comprising: a memory for storing a computer program; and a processor which, when executing the computer program, implements the steps of the above object grasping method based on a manipulator.
A specific embodiment of the present invention further provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the steps of the above object grasping method based on a manipulator.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the others, and the same or similar parts of the embodiments may be referred to each other. Since the apparatus disclosed in an embodiment corresponds to the method disclosed in an embodiment, its description is relatively simple, and the relevant points can be found in the description of the method.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described above generally in terms of function. Whether these functions are implemented in hardware or software depends on the specific application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
The steps of the method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium well known in the art.
The object grasping method, apparatus and device based on a manipulator and the computer-readable storage medium provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and embodiments of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. It should be pointed out that, for those of ordinary skill in the art, several improvements and modifications can also be made to the present invention without departing from the principle of the present invention, and these improvements and modifications also fall within the protection scope of the claims of the present invention.

Claims (10)

1. An object grasping method based on a manipulator, characterized by comprising:
inputting an image of the object to be grasped into a pre-trained convolutional neural network and outputting a first hardness grade of the object to be grasped;
controlling the manipulator to close until the manipulator touches the object to be grasped, and recording the initial grasp angle when the manipulator touches the object to be grasped;
controlling the manipulator to close by a further preset percentage of the initial grasp angle, and recording the current steady-state pressure value and electrode impedance decrease of the manipulator;
inputting the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and outputting a second hardness grade of the object to be grasped;
determining a target hardness grade of the object to be grasped according to the first hardness grade and the second hardness grade;
looking up a target pressure value corresponding to the target hardness grade, and controlling the manipulator to close until the pressure value of the manipulator reaches the target pressure value, thereby grasping the object to be grasped.
2. The object grasping method according to claim 1, characterized in that inputting the image of the object to be grasped into the pre-trained convolutional neural network and outputting the first hardness grade of the object to be grasped comprises:
obtaining a Kinect image in a Kinect observation scene calibrated in advance with a Baxter collaborative robot;
searching the Kinect image with a YOLOv3 object detection network for the target region containing the object to be grasped;
inputting the image of the target region into a pre-trained target VGG16 network and outputting the first hardness grade of the object to be grasped.
3. The object grasping method according to claim 2, characterized by further comprising:
selecting a preset number of training objects from the ImageNet dataset, and manually annotating the hardness grades of the training objects;
inputting the images of the training objects into a VGG16 network, and training the VGG16 network to obtain the trained target VGG16 network.
4. The object grasping method according to claim 1, characterized in that inputting the steady-state pressure value and the electrode impedance decrease into the pre-trained classification algorithm and outputting the second hardness grade of the object to be grasped comprises:
inputting the steady-state pressure value and the electrode impedance decrease into a pre-trained k-nearest-neighbour classification algorithm and outputting the second hardness grade of the object to be grasped.
5. The object grasping method according to claim 1, characterized in that determining the target hardness grade of the object to be grasped according to the first hardness grade and the second hardness grade comprises:
using a weighted-average method to set a first weighting factor for the first hardness grade and a second weighting factor for the second hardness grade, wherein the first weighting factor and the second weighting factor sum to 1;
fusing the first hardness grade and the second hardness grade according to the first weighting factor and the second weighting factor to determine the target hardness grade of the object to be grasped.
6. The object grasping method according to any one of claims 1 to 5, characterized in that controlling the manipulator to close until the manipulator touches the object to be grasped, and recording the initial grasp angle when the manipulator touches the object to be grasped, comprises:
controlling the manipulator to close, and performing contact detection between the manipulator and the object to be grasped using the PAC value of the BioTac sensor;
when the PAC value oscillates, determining that the manipulator has contacted the object to be grasped, and recording the initial grasp angle when the manipulator touches the object to be grasped.
7. An object grasping apparatus based on a manipulator, characterized by comprising:
a first hardness grade output module, configured to input an image of the object to be grasped into a pre-trained convolutional neural network and output a first hardness grade of the object to be grasped;
a control module, configured to control the manipulator to close until the manipulator touches the object to be grasped, and to record the initial grasp angle when the manipulator touches the object to be grasped;
a recording module, configured to control the manipulator to close by a further preset percentage of the initial grasp angle, and to record the current steady-state pressure value and electrode impedance decrease of the manipulator;
a second hardness grade output module, configured to input the steady-state pressure value and the electrode impedance decrease into a pre-trained classification algorithm and output a second hardness grade of the object to be grasped;
a determining module, configured to determine a target hardness grade of the object to be grasped according to the first hardness grade and the second hardness grade;
a handling module, configured to look up a target pressure value corresponding to the target hardness grade, and to control the manipulator to close until the pressure value of the manipulator reaches the target pressure value, thereby grasping the object to be grasped.
8. The object grasping apparatus according to claim 7, characterized in that the control module is specifically configured to:
control the manipulator to close, and perform contact detection between the manipulator and the object to be grasped using the PAC value of the BioTac sensor; when the PAC value oscillates, determine that the manipulator has contacted the object to be grasped, and record the initial grasp angle when the manipulator touches the object to be grasped.
9. An object grasping device based on a manipulator, characterized by comprising:
a memory for storing a computer program;
a processor which, when executing the computer program, implements the steps of the object grasping method based on a manipulator according to any one of claims 1 to 6.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the object grasping method based on a manipulator according to any one of claims 1 to 6 are implemented.
CN201910371561.5A 2019-05-06 2019-05-06 Object grasping method, apparatus, device and storage medium based on a manipulator Pending CN110091331A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910371561.5A CN110091331A (en) 2019-05-06 2019-05-06 Object grasping method, apparatus, device and storage medium based on a manipulator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910371561.5A CN110091331A (en) 2019-05-06 2019-05-06 Object grasping method, apparatus, device and storage medium based on a manipulator

Publications (1)

Publication Number Publication Date
CN110091331A true CN110091331A (en) 2019-08-06

Family

ID=67446951

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910371561.5A Pending CN110091331A (en) 2019-05-06 2019-05-06 Grasping body method, apparatus, equipment and storage medium based on manipulator

Country Status (1)

Country Link
CN (1) CN110091331A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111055279A (en) * 2019-12-17 2020-04-24 清华大学深圳国际研究生院 Multi-mode object grabbing method and system based on combination of touch sense and vision
CN111582186A (en) * 2020-05-11 2020-08-25 深圳阿米嘎嘎科技有限公司 Object edge identification method, device, system and medium based on vision and touch
CN112388655A (en) * 2020-12-04 2021-02-23 齐鲁工业大学 Grabbed object identification method based on fusion of touch vibration signals and visual images
CN113942009A (en) * 2021-09-13 2022-01-18 苏州大学 Robot bionic hand grabbing method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008055584A (en) * 2006-09-04 2008-03-13 Toyota Motor Corp Robot for holding object and holding method of object by robot
US9501724B1 (en) * 2015-06-09 2016-11-22 Adobe Systems Incorporated Font recognition and font similarity learning using a deep neural network
CN107972069A (en) * 2017-11-27 2018-05-01 胡明建 The design method that a kind of computer vision and Mechanical Touch are mutually mapped with the time
CN108145712A (en) * 2017-12-29 2018-06-12 深圳市越疆科技有限公司 A kind of method, apparatus and robot of robot segregating articles
CN109176521A (en) * 2018-09-19 2019-01-11 北京因时机器人科技有限公司 A kind of mechanical arm and its crawl control method and system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008055584A (en) * 2006-09-04 2008-03-13 Toyota Motor Corp Robot for holding object and holding method of object by robot
US9501724B1 (en) * 2015-06-09 2016-11-22 Adobe Systems Incorporated Font recognition and font similarity learning using a deep neural network
CN107972069A (en) * 2017-11-27 2018-05-01 胡明建 The design method that a kind of computer vision and Mechanical Touch are mutually mapped with the time
CN108145712A (en) * 2017-12-29 2018-06-12 深圳市越疆科技有限公司 A kind of method, apparatus and robot of robot segregating articles
CN109176521A (en) * 2018-09-19 2019-01-11 北京因时机器人科技有限公司 A kind of mechanical arm and its crawl control method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
汪礼超: "基于机械手触觉信息的物体软硬属性识别" *
陈慧岩: "《智能车辆理论与应用》", 31 July 2018 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111055279A (en) * 2019-12-17 2020-04-24 清华大学深圳国际研究生院 Multi-mode object grabbing method and system based on combination of touch sense and vision
CN111055279B (en) * 2019-12-17 2022-02-15 清华大学深圳国际研究生院 Multi-mode object grabbing method and system based on combination of touch sense and vision
CN111582186A (en) * 2020-05-11 2020-08-25 深圳阿米嘎嘎科技有限公司 Object edge identification method, device, system and medium based on vision and touch
CN111582186B (en) * 2020-05-11 2023-12-15 深圳阿米嘎嘎科技有限公司 Object edge recognition method, device, system and medium based on vision and touch
CN112388655A (en) * 2020-12-04 2021-02-23 齐鲁工业大学 Grabbed object identification method based on fusion of touch vibration signals and visual images
CN112388655B (en) * 2020-12-04 2021-06-04 齐鲁工业大学 Grabbed object identification method based on fusion of touch vibration signals and visual images
CN113942009A (en) * 2021-09-13 2022-01-18 苏州大学 Robot bionic hand grabbing method and system

Similar Documents

Publication Publication Date Title
CN110091331A (en) Object grasping method, apparatus, device and storage medium based on a manipulator
JP6855098B2 (en) Face detection training methods, equipment and electronics
JP6810087B2 (en) Machine learning device, robot control device and robot vision system using machine learning device, and machine learning method
CN112809679B (en) Method and device for grabbing deformable object and computer readable storage medium
JP6857332B2 (en) Arithmetic logic unit, arithmetic method, and its program
CN109784391A (en) Sample mask method and device based on multi-model
CN109176532B (en) Method, system and device for planning path of mechanical arm
Schill et al. Learning continuous grasp stability for a humanoid robot hand based on tactile sensing
CN107081774B (en) Robot shakes hands control method and system
CN105196290B (en) Real-time robot Grasp Planning
WO2014188177A1 (en) Grasp modelling
Jiang et al. Learning hardware agnostic grasps for a universal jamming gripper
CN114643586B (en) Multi-finger dexterous hand grabbing gesture planning method based on deep neural network
Ruppel et al. Simulation of the SynTouch BioTac sensor
CN116494247A (en) Mechanical arm path planning method and system based on depth deterministic strategy gradient
CN107193498A (en) A kind of method and device that data are carried out with deduplication processing
CN113986561A (en) Artificial intelligence task processing method and device, electronic equipment and readable storage medium
CN109048915A (en) Mechanical arm grabs control method, device, storage medium and electronic equipment
CN116766212A (en) Bionic hand control method, bionic hand control device, bionic hand control equipment and storage medium
Averta Learning to prevent grasp failure with soft hands: From on-line prediction to dual-arm grasp recovery
Bhardwaj et al. Data-driven haptic modeling of normal interactions on viscoelastic deformable objects using a random forest
Sui et al. Transfer of robot perception module with adversarial learning
CN107807993A (en) A kind of implementation method and device of web-page histories writing function
CN111590575B (en) Robot control system and method
JP4023614B2 (en) Gripping device, gripping control method, gripping control program, and recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20190806