CN109333536A - Robot and object grasping method and apparatus - Google Patents

Robot and object grasping method and apparatus

Info

Publication number
CN109333536A
CN109333536A (application CN201811256263.3A)
Authority
CN
China
Prior art keywords
grasping
target object
sensor data
grasp
gripper component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811256263.3A
Other languages
Chinese (zh)
Inventor
蔡颖鹏
陈希
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Time Robot Technology Co Ltd
Original Assignee
Beijing Time Robot Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Time Robot Technology Co Ltd filed Critical Beijing Time Robot Technology Co Ltd
Priority to CN201811256263.3A priority Critical patent/CN109333536A/en
Publication of CN109333536A publication Critical patent/CN109333536A/en
Pending legal-status Critical Current

Classifications

    • B Performing operations; transporting
    • B25 Hand tools; portable power-driven tools; manipulators
    • B25J Manipulators; chambers provided with manipulation devices
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J9/1697 Vision controlled systems
    • G Physics
    • G06 Computing; calculating or counting
    • G06V Image or video recognition or understanding
    • G06V20/00 Scenes; scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Manipulator (AREA)

Abstract

This application provides a robot and an object grasping method and apparatus for it. The method and apparatus detect the type, position and three-dimensional pose of a target object; input the type, position and/or three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and control a gripper component according to the grasping strategy to grasp the target object; judge from the sensor data detected by the gripper component whether the grasp succeeded, and perform subsequent operations if it did; if the grasp is judged unsuccessful, adjust the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, control the gripper component according to the adjusted strategy to grasp again, and again judge from the sensor data detected by the gripper component whether the grasp succeeded. By cycling through adjustment and repeated grasping in this way, grasp failures caused by position deviation, slipping, insufficient force and the like can be avoided.

Description

Robot and object grasping method and apparatus
Technical field
This application relates to the field of artificial intelligence, and more specifically to a robot and an object grasping method and apparatus for it.
Background art
With the development of technology, robots are increasingly applied in every field of production and daily life. For example, industrial robots on automobile production lines have largely replaced human operation. Although these robots take many forms, and industrial robots in particular mostly appear in non-humanoid shapes, their basic motion, grasping, is a trait common to virtually all robots: whether directly grabbing a part or moving it to a corresponding position, a gripper must perform a grasping operation.
When the target object to be grasped is subject to interference, is small, or has a complex contour, the inherent errors of the system can lead to deviations in the grasping position, slipping of the object, insufficient grip force and the like, causing the grasp of the target object to fail.
Summary of the invention
In view of this, the application provides a robot and an object grasping method and apparatus for it, for avoiding grasp failures.
To achieve the above goal, the following scheme is proposed:
An object grasping method, applied to a robot, the object grasping method comprising the steps of:
detecting the type, position and three-dimensional pose of a target object;
inputting the type, the position and/or the three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and controlling a gripper component according to the grasping strategy to grasp the target object;
judging from the sensor data detected by the gripper component whether the grasp succeeded, and performing subsequent operations if it did;
if the grasp is judged unsuccessful, adjusting the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, controlling the gripper component according to the adjusted grasping strategy to grasp again, and returning to the step of judging from the sensor data detected by the gripper component whether the grasp succeeded.
Optionally, detecting the type, position and three-dimensional pose of the target object comprises:
acquiring an image of the target object using a visual detection method;
inputting the image into a target detection model to obtain the type, the position and the three-dimensional pose.
Optionally, the method further comprises:
at each grasp, recording the sensor data to form a first data set, the first data set containing sensor data from successful grasps and sensor data from failed grasps;
training a deep neural network with the first data set to obtain a grasp judgment model, the grasp judgment model being used to judge whether a grasp succeeded.
Optionally, judging from the sensor data detected by the gripper component whether the grasp succeeded comprises:
inputting the sensor data into the grasp judgment model to obtain a judgment result of whether the grasp succeeded.
Optionally, the method further comprises:
at each grasp, recording the sensor data, the grasping strategy before adjustment and each adjusted grasping strategy to form a second data set;
training a deep neural network with the second data set to obtain a grasp strategy adjustment model.
Optionally, adjusting the grasping strategy according to the sensor data to obtain the adjusted grasping strategy comprises:
inputting the sensor data into the grasp strategy adjustment model to obtain the adjusted grasping strategy.
Optionally, the method further comprises:
when the number of grasp attempts on a single target object reaches a preset limit, controlling the gripper component to stop grasping.
An object grasping apparatus, applied to a robot, the object grasping apparatus comprising:
a parameter detection module for detecting the type, position and three-dimensional pose of a target object;
a first execution module for inputting the type, the position and/or the three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and for controlling a gripper component according to the grasping strategy to grasp the target object;
a grasp judgment module for judging from the sensor data detected by the gripper component whether the grasp succeeded, subsequent operations being performed if it did;
a second execution module for, when the grasp is judged unsuccessful, adjusting the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, controlling the gripper component according to the adjusted grasping strategy to grasp again, and returning to the step of judging from the sensor data detected by the gripper component whether the grasp succeeded.
Optionally, the parameter detection module comprises:
an image acquisition unit for acquiring an image of the target object using a visual detection method;
a detection execution unit for inputting the image into a target detection model to obtain the type, the position and the three-dimensional pose.
Optionally, the apparatus further comprises:
a first recording module for recording, at each grasp, the sensor data to form a first data set, the first data set containing sensor data from successful grasps and sensor data from failed grasps;
a first training module for training a deep neural network with the first data set to obtain a grasp judgment model, the grasp judgment model being used to judge whether a grasp succeeded.
Optionally, the grasp judgment module is used to input the sensor data into the grasp judgment model to obtain a judgment result of whether the grasp succeeded.
Optionally, the apparatus further comprises:
a second recording module for recording, at each grasp, the sensor data, the grasping strategy before adjustment and each adjusted grasping strategy to form a second data set;
a second training module for training a deep neural network with the second data set to obtain a grasp strategy adjustment model.
Optionally, the second execution module is used to input the sensor data into the grasp strategy adjustment model to obtain the adjusted grasping strategy.
Optionally, the second execution module is further used to control the gripper component to stop grasping when the number of grasp attempts on a single target object reaches a preset limit.
A robot, provided with an object grasping apparatus as described above.
As can be seen from the above technical scheme, this application discloses a robot and an object grasping method and apparatus for it. The method and apparatus detect the type, position and three-dimensional pose of a target object; input the type, position and/or three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and control a gripper component according to the grasping strategy to grasp the target object; judge from the sensor data detected by the gripper component whether the grasp succeeded, performing subsequent operations if it did; and, if the grasp is judged unsuccessful, adjust the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, control the gripper component according to the adjusted strategy to grasp again, and again judge from the sensor data whether the grasp succeeded. By cycling through adjustment and repeated grasping in this way, when a single grasp cannot succeed, fine-tuning eventually achieves the grasp of the target object; thus even under interference, small size or a complex contour, grasp failures caused by position deviation, slipping, insufficient force and the like can be avoided.
Brief description of the drawings
To illustrate the technical solutions in the embodiments of this application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Apparently, the drawings described below show only some embodiments of this application; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow chart of an object grasping method provided by an embodiment of this application;
Fig. 2 is a flow chart of another object grasping method provided by an embodiment of this application;
Fig. 3 is a flow chart of another object grasping method provided by an embodiment of this application;
Fig. 4 is a block diagram of an object grasping apparatus provided by an embodiment of this application;
Fig. 5 is a block diagram of another object grasping apparatus provided by an embodiment of this application;
Fig. 6 is a block diagram of another object grasping apparatus provided by an embodiment of this application;
Fig. 7 is a block diagram of another object grasping apparatus provided by an embodiment of this application.
Detailed description of the embodiments
The technical solutions in the embodiments of this application are described clearly and completely below with reference to the drawings in the embodiments. Apparently, the described embodiments are only some, not all, of the embodiments of this application. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of this application.
Embodiment one
Fig. 1 is a flow chart of an object grasping method provided by an embodiment of this application.
As shown in Fig. 1, the object grasping method provided in this embodiment is applied to a robot and is used to control the robot's gripper components, such as mechanical hands and mechanical arms, so that they grasp a target object. The object grasping method comprises the following steps:
S1. Detect the type, position and three-dimensional pose of the target object.
In this application, the type, position and three-dimensional pose of the target object to be grasped are specifically detected with a vision method. Here, type refers to the nature of the target object, for example a book, a cup or a machine part; position refers to the spatial coordinates of the target object relative to the gripper component; and three-dimensional pose refers to a spatial vector of the target object relative to the gripper component, for example the orientation of one of its reference positions.
Specifically, this application detects the type, position and three-dimensional pose as follows:
First, an image of the target object is acquired using a visual detection method, for example by shooting the target object with a device such as a camera or video camera. The image is then input into a pre-trained target detection model to obtain the type, position and three-dimensional pose of the target object.
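The detection step just described can be sketched as follows. The `Detection` structure, the `detect_target` wrapper and the stand-in model are illustrative assumptions about the interface, not part of the patent:

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class Detection:
    obj_type: str                         # nature of the object, e.g. "cup"
    position: Tuple[float, float, float]  # coordinates relative to the gripper
    pose: Tuple[float, float, float]      # orientation vector relative to the gripper

def detect_target(image, model: Callable) -> Detection:
    """Run a pre-trained target detection model on a camera image and
    unpack type, position and 3-D pose (hypothetical model interface)."""
    obj_type, position, pose = model(image)
    return Detection(obj_type, position, pose)

# Stand-in for a trained deep network, for illustration only.
fake_model = lambda img: ("cup", (0.10, 0.02, 0.30), (0.0, 0.0, 1.0))
result = detect_target(None, fake_model)
```

In a real system `model` would be the trained deep neural network mentioned below, and `image` the frame captured by the camera.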
The target detection model is a deep neural network model obtained by training on a large number of data samples, and has high accuracy.
S2. Determine a grasping strategy from the type, position and three-dimensional pose, and grasp the target object.
After the type, position and three-dimensional pose of the target object are obtained, they are input into a corresponding grasping-action decision model to obtain a corresponding grasping strategy. The robot's gripper component is then controlled according to the grasping strategy to approach the target object and make a motion matching the shape of the object, thereby grasping it.
Grasping in the broad sense includes approaching the target object, opening the gripper component, gripping the target object, and performing further motions after the grip is complete, such as moving the target object to another position. The grasp in this embodiment specifically refers to whether the gripper component has gripped the target object, on the basis of which the next motion, such as moving the object elsewhere, can be carried out.
S3. Judge whether the gripper component grasped the target object successfully.
The gripper component is provided with multiple sensors, for example pressure sensors at multiple positions of the gripper component for detecting the pressure applied to the target object during the grasp. The judgment here means judging from the sensor data detected by the multiple pressure sensors whether the grasp succeeded.
For example, for a hand-shaped gripper component, when the grasp succeeds the pressure applied to the target object at each position should generally be similar, that is, within a certain pressure range, so whether the grasp succeeded can be determined from the differences between the multiple sensor readings. This is of course only one example; other methods can also be used to judge from the sensor data whether the grasp succeeded.
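The pressure-similarity heuristic just given can be sketched as below. The thresholds are illustrative assumptions, not values from the patent:

```python
def grasp_succeeded(pressures, lo=2.0, hi=8.0, max_spread=1.5):
    """Heuristic from the text: for a hand-shaped gripper, a successful
    grasp gives similar contact pressures at every finger, all within a
    working range. `lo`, `hi` and `max_spread` are illustrative."""
    if not pressures:
        return False
    in_range = all(lo <= p <= hi for p in pressures)            # every finger in range
    similar = (max(pressures) - min(pressures)) <= max_spread   # readings close together
    return in_range and similar
```

A grasp with balanced readings such as `[4.0, 4.5, 4.2]` passes, while one finger reading near zero (a slipped or missed contact) fails the check.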
If the judgment determines that the target object has been grasped successfully, this grasping action ends and subsequent motions are performed, such as moving the target object to another position.
S4. If the grasp did not succeed, grasp the target object again.
If the above judgment determines that the grasp did not succeed, the grasping strategy is adjusted according to the sensor data. For convenience, we call the adjusted grasping strategy the grasp adjustment strategy, to distinguish it from the initial grasping strategy obtained directly from the type, position and three-dimensional pose of the target object.
After the grasp adjustment strategy is determined, the gripper component is controlled to grasp the target object again. Grasping again here means releasing some or all elements of the gripper component and grasping the target object after changing the corresponding position, angle and grip force.
After the renewed grasp, the method returns to the previous step to judge again whether the grasp succeeded, and either ends the grasp after it is judged successful, or grasps again after the grasp fails.
In addition, to avoid wasting time on endless attempts when grasps keep failing, a limit on the number of attempts can be set: for example, when the number of grasp attempts on a single target object exceeds a preset limit, such as 1000, 10000 or some other number, grasping is finally terminated.
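The judge-adjust-retry loop with a preset attempt limit can be sketched as follows; `try_grasp` and `adjust` are placeholders for the robot's real gripper controller and adjustment model:

```python
def grasp_with_retries(initial_strategy, try_grasp, adjust, max_attempts=5):
    """Control loop described above: grasp, judge success from sensor
    data, adjust the strategy and retry, giving up once a preset limit
    of attempts is reached. `try_grasp` returns (success, sensor_data);
    `adjust` maps the old strategy plus sensor data to a new strategy."""
    strategy = initial_strategy
    for attempt in range(1, max_attempts + 1):
        success, sensor_data = try_grasp(strategy)
        if success:
            return True, attempt           # grasp succeeded on this attempt
        strategy = adjust(strategy, sensor_data)
    return False, max_attempts             # give up after the preset limit

# Toy simulation: the grasp only succeeds once grip force reaches 3.
ok, attempts = grasp_with_retries(
    1,
    lambda force: (force >= 3, [force]),
    lambda force, sensors: force + 1,
)
```

The patent mentions limits like 1000 or 10000 attempts; the small default here is only for illustration.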
As can be seen from the above technical proposal, this embodiment provides an object grasping method applied to a robot. The method detects the type, position and three-dimensional pose of a target object; inputs the type, position and/or three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and controls a gripper component according to the grasping strategy to grasp the target object; judges from the sensor data detected by the gripper component whether the grasp succeeded, performing subsequent operations if it did; and, if the grasp is judged unsuccessful, adjusts the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, controls the gripper component according to the adjusted strategy to grasp again, and again judges from the sensor data whether the grasp succeeded. By cycling through adjustment and repeated grasping in this way, when a single grasp cannot succeed, fine-tuning eventually achieves the grasp of the target object; thus even under interference, small size or a complex contour, grasp failures caused by position deviation, slipping, insufficient force and the like can be avoided.
Embodiment two
Fig. 2 is a flow chart of another object grasping method provided by an embodiment of this application.
As shown in Fig. 2, the object grasping method provided in this embodiment is applied to a robot and is used to control the robot's gripper components, such as mechanical hands and mechanical arms, so that they grasp a target object. The object grasping method comprises the following steps:
S1. Detect the type, position and three-dimensional pose of the target object.
In this application, the type, position and three-dimensional pose of the target object to be grasped are specifically detected with a vision method. Here, type refers to the nature of the target object, for example a book, a cup or a machine part; position refers to the spatial coordinates of the target object relative to the gripper component; and three-dimensional pose refers to a spatial vector of the target object relative to the gripper component, for example the orientation of one of its reference positions.
Specifically, this application detects the type, position and three-dimensional pose as follows:
First, an image of the target object is acquired using a visual detection method, for example by shooting the target object with a device such as a camera or video camera. The image is then input into a pre-trained target detection model to obtain the type, position and three-dimensional pose of the target object.
The target detection model is a deep neural network model obtained by training on a large number of data samples, and has high accuracy.
S2. Determine a grasping strategy from the type, position and three-dimensional pose, and grasp the target object.
After the type, position and three-dimensional pose of the target object are obtained, they are input into a corresponding grasping-action decision model to obtain a corresponding grasping strategy. The robot's gripper component is then controlled according to the grasping strategy to approach the target object and make a motion matching the shape of the object, thereby grasping it.
Grasping in the broad sense includes approaching the target object, opening the gripper component, gripping the target object, and performing further motions after the grip is complete, such as moving the target object to another position. The grasp in this embodiment specifically refers to whether the gripper component has gripped the target object, on the basis of which the next motion, such as moving the object elsewhere, can be carried out.
S3. Judge whether the gripper component grasped the target object successfully.
The gripper component is provided with multiple sensors, for example pressure sensors at multiple positions of the gripper component for detecting the pressure applied to the target object during the grasp. The judgment here means judging from the sensor data detected by the multiple pressure sensors whether the grasp succeeded.
For example, for a hand-shaped gripper component, when the grasp succeeds the pressure applied to the target object at each position should generally be similar, that is, within a certain pressure range, so whether the grasp succeeded can be determined from the differences between the multiple sensor readings. This is of course only one example; other methods can also be used to judge from the sensor data whether the grasp succeeded.
If the judgment determines that the target object has been grasped successfully, this grasping action ends and subsequent motions are performed, such as moving the target object to another position.
S4. If the grasp did not succeed, grasp the target object again.
If the above judgment determines that the grasp did not succeed, the grasping strategy is adjusted according to the sensor data. For convenience, we call the adjusted grasping strategy the grasp adjustment strategy, to distinguish it from the initial grasping strategy obtained directly from the type, position and three-dimensional pose of the target object.
After the grasp adjustment strategy is determined, the gripper component is controlled to grasp the target object again. Grasping again here means releasing some or all elements of the gripper component and grasping the target object after changing the corresponding position, angle and grip force.
After the renewed grasp, the method returns to the previous step to judge again whether the grasp succeeded, and either ends the grasp after it is judged successful, or grasps again after the grasp fails.
In addition, to avoid wasting time on endless attempts when grasps keep failing, a limit on the number of attempts can be set: for example, when the number of grasp attempts on a single target object exceeds a preset limit, such as 1000, 10000 or some other number, grasping is finally terminated.
S5. Record the sensor data at each grasp to form a first data set.
That is, the sensor data detected by the sensors on the gripper component at each grasp are recorded, ultimately forming a first data set; it contains not only sensor data from successful grasps but also sensor data from failed grasps, together with the type, position and three-dimensional pose of the corresponding target object.
S6. Train a model with the first data set to obtain a grasp judgment model.
After the first data set is obtained, a pre-built deep neural network is trained with it, yielding a grasp judgment model able to detect whether a grasp succeeded.
On the basis of this successfully trained model, when judging whether a grasp succeeded, the sensor data can be input into the grasp judgment model and the corresponding judgment result is output.
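The data flow of steps S5 and S6 can be sketched as below. The patent trains a deep neural network; this pure-Python logistic regression is only a minimal stand-in showing how recorded (sensor data, success) pairs become a judgment model, and all sample values are invented:

```python
import math

def train_grasp_judge(first_data_set, epochs=500, lr=0.1):
    """Train a minimal classifier on the first data set, where each
    sample is (sensor_readings, grasp_succeeded). Stochastic gradient
    descent on the logistic loss; a stand-in for the deep network."""
    n = len(first_data_set[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in first_data_set:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))        # predicted success probability
            g = p - (1.0 if y else 0.0)           # gradient of the log loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    # The returned judge maps sensor readings to a success verdict.
    return lambda x: sum(wi * xi for wi, xi in zip(w, x)) + b > 0.0

# Invented training records: balanced pressures succeed, unbalanced fail.
records = [([4.0, 4.1, 4.0], True), ([3.9, 4.0, 4.2], True),
           ([4.0, 0.2, 4.1], False), ([0.1, 0.0, 0.3], False)]
judge = train_grasp_judge(records)
```

Calling `judge` with fresh sensor readings then plays the role of inputting sensor data into the grasp judgment model.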
As can be seen from the above technical proposal, this embodiment provides an object grasping method applied to a robot. The method detects the type, position and three-dimensional pose of a target object; inputs the type, position and/or three-dimensional pose into a grasping-action decision model to obtain a grasping strategy, and controls a gripper component according to the grasping strategy to grasp the target object; judges from the sensor data detected by the gripper component whether the grasp succeeded, performing subsequent operations if it did; and, if the grasp is judged unsuccessful, adjusts the grasping strategy according to the sensor data to obtain an adjusted grasping strategy, controls the gripper component according to the adjusted strategy to grasp again, and again judges from the sensor data whether the grasp succeeded. By cycling through adjustment and repeated grasping in this way, when a single grasp cannot succeed, fine-tuning eventually achieves the grasp of the target object; thus even under interference, small size or a complex contour, grasp failures caused by position deviation, slipping, insufficient force and the like can be avoided.
By training and using the grasp judgment model, the judgment result can be made more accurate.
Embodiment three
Fig. 3 is the flow chart of another grasping body method provided by the embodiments of the present application.
As shown in figure 3, grasping body method provided in this embodiment is applied to robot, for the machine to robot The gripper components such as hand, mechanical arm are controlled, so that the gripper components grab target object.The grasping body method packet Include following steps:
S1, type, position and the 3 d pose for detecting target object.
It is specially to be carried out using visible sensation method to the type of target object to be captured, position and 3 d pose in the application Detection.Wherein, type refers to the property of target object, such as books, cup or machine components;Position refers to target object phase For the space coordinate of the gripper components;3 d pose refers to space vector of the target object relative to gripper components, such as it The direction of one corresponding position.
Specifically, the application by the following method detects type, position and 3 d pose:
Firstly, the image of target object is obtained using visible detection method, for example, by using equipment pair such as camera, video cameras Target object is shot, to obtain the image of the target object.Then, which target trained in advance is input to examine Model is surveyed, to obtain the type of the target object, position and 3 d pose.
The target detection model is deep neural network model, is obtained, is had higher by the training of a large amount of data sample Accuracy.
S2, crawl strategy is determined according to type, position and 3 d pose and target object is grabbed.
After obtaining type, position and the 3 d pose of above-mentioned target object, the type, position and 3 d pose are inputted To corresponding grasping movement decision model, obtain grabbing strategy accordingly, then, according to grabbing for the crawl policy control robot It takes component to target object into close and make the movement to match with the shape of the target object, which is grabbed It takes.
Broad sense crawl include to target object into it is close, open gripper components, target object grasped, and execute grasping Other movements after the completion, such as shift to other positions for target object.And the crawl in the present embodiment specifically refers to gripper components Whether target object is grasped, the movement of next step can be carried out on this basis, target object is such as shifted into other positions.
S3, judge whether gripper components grab successfully target object.
Multiple sensors are provided in gripper components, such as detecting in gripper components multiple positions in crawl to mesh Mark the pressure sensor of the pressure of object.Here judgement be detected according to multiple pressure sensors sensing data to whether It grabs and is successfully judged.
For example, each position should phase to the pressure of target object when grabbing successfully for the gripper components of in general hand shape Seemingly, i.e., within certain pressure limit, thus can by the difference between multiple sensing datas to whether grabbing Successfully determined, this is an example certainly, can also by other methods according to the sensing data to whether grabbing into Function is judged.
Pass through judgement, if it is determined that successfully grab to target object, then terminate this grasping movement, turn to and execute subsequent move Make, the target object is such as shifted into other positions.
S4, target object is grabbed again if no grab successfully.
If the above judgment determines that the grasp did not succeed, the grasp strategy is adjusted according to the sensor data. For convenience, the adjusted grasp strategy is called the grasp adjustment strategy, to distinguish it from the initial grasp strategy obtained directly from the type, position and three-dimensional pose of the target object.
After the grasp adjustment strategy is determined, the gripper components are controlled to grasp the target object again. Grasping again here means partially or completely releasing the gripper components and then grasping the target object again with the corresponding position, angle and grip strength changed.
After grasping again, the method returns to the previous step to judge once more whether the grasp has succeeded, and either ends the grasping after success is determined, or grasps again after another failure.
In addition, to avoid wasting time by grasping endlessly after repeated failures, a limit on the number of grasp attempts can be set; for example, grasping can be terminated when the number of attempts on a single target object exceeds a preset limit, such as 1000 attempts, 10000 attempts or some other number.
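The grasp-judge-adjust cycle with a preset attempt limit can be sketched as below. The callback names (`execute_grasp`, `read_sensors`, `judge`, `adjust`) stand in for the robot-specific routines and are assumptions for illustration only.

```python
# Hedged sketch of the retry loop: grasp, judge, adjust the strategy from
# sensor data, and terminate after a preset limit on attempts.
def grasp_with_retries(strategy, execute_grasp, read_sensors, judge, adjust,
                       max_attempts=10000):
    for attempt in range(1, max_attempts + 1):
        execute_grasp(strategy)
        data = read_sensors()
        if judge(data):
            return True, attempt            # grasp succeeded
        strategy = adjust(strategy, data)   # derive the grasp adjustment strategy
    return False, max_attempts              # preset limit reached, terminate

# Toy usage: a "gripper" that only succeeds once the force reaches 3 units.
state = {"force": 0}
ok, tries = grasp_with_retries(
    strategy={"force": 0},
    execute_grasp=lambda s: state.update(force=s["force"]),
    read_sensors=lambda: state["force"],
    judge=lambda f: f >= 3,
    adjust=lambda s, f: {"force": s["force"] + 1},
    max_attempts=1000,
)
print(ok, tries)  # True 4
```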
S7, record the sensor data at each grasp to form the second data set.
That is, the sensor data detected by the sensors on the gripper components at each grasp and each adjusted grasp strategy are recorded, together with the type, position and three-dimensional pose of the corresponding target object, ultimately forming the second data set.
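One possible record layout for the second data set is sketched below. The field names are assumptions for illustration; the patent specifies only what is recorded, not how it is stored.

```python
# Hypothetical record structure for the second data set: each grasp attempt
# stores the sensor data, the strategy before and after adjustment, and the
# target object's type, position and three-dimensional pose.
from dataclasses import dataclass

@dataclass
class GraspRecord:
    sensor_data: list       # readings from the gripper's pressure sensors
    strategy_before: dict   # grasp strategy used for this attempt
    strategy_after: dict    # grasp adjustment strategy derived afterwards
    obj_type: str
    position: tuple         # spatial coordinates relative to the gripper
    pose: tuple             # three-dimensional pose (orientation vector)

second_data_set = []
second_data_set.append(GraspRecord(
    sensor_data=[2.1, 1.9, 2.0],
    strategy_before={"force": 3.0, "angle": 10.0},
    strategy_after={"force": 3.5, "angle": 9.0},
    obj_type="cup",
    position=(0.42, -0.10, 0.05),
    pose=(0.0, 0.0, 1.0),
))
print(len(second_data_set))  # 1
```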
S8, perform model training using the second data set to obtain the grasp strategy adjustment model.
After the second data set is obtained, a pre-constructed deep neural network is trained with the second data set, yielding a grasp strategy adjustment model that can adjust the grasp strategy.
On the basis of this successfully trained model, each time the grasp strategy is adjusted, the corresponding sensor data are input into the model to obtain the adjusted grasp strategy, i.e. the grasp adjustment strategy.
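Once such a model is trained, adjustment reduces to a single forward pass over the sensor data and the current strategy. In the sketch below a tiny hand-written linear "model" stands in for the deep neural network; its weights and the two-component strategy (force, angle) are assumptions for illustration.

```python
# Illustrative only: one forward pass of a stand-in "model" that maps
# (sensor data + current strategy) to an adjusted grasp strategy.
def adjust_strategy(model_weights, sensor_data, current_strategy):
    """Concatenate sensor data and the current strategy as the model input,
    and return the adjusted strategy components (here: force and angle)."""
    features = sensor_data + current_strategy
    return [sum(w * x for w, x in zip(row, features)) for row in model_weights]

weights = [
    [0.0, 0.0, 1.0, 0.0],    # keep the current force component
    [0.5, -0.5, 0.0, 1.0],   # nudge the angle by the pressure imbalance
]
adjusted = adjust_strategy(weights,
                           sensor_data=[2.0, 1.0],       # two pressure readings
                           current_strategy=[3.0, 10.0]) # force, angle
print(adjusted)  # [3.0, 10.5]
```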
It can be seen from the above technical solution that this embodiment provides an object grasping method applied to a robot. Specifically, the type, position and three-dimensional pose of the target object are detected; the type, position and/or three-dimensional pose are input into a grasping action decision model to obtain a grasp strategy, and the gripper components are controlled with the grasp strategy to grasp the target object; whether the grasp has succeeded is judged according to the sensor data detected by the gripper components, and subsequent operations are executed if it has; if it is determined that the grasp has not succeeded, the grasp strategy is adjusted according to the sensor data to obtain an adjusted grasp strategy, the gripper components are controlled with the adjusted grasp strategy to grasp again, and whether the grasp has succeeded is judged again according to the sensor data newly detected by the gripper components. By cycling through adjustment and repeated grasping in this way, the target object can finally be grasped through fine adjustment even when a single grasp cannot succeed, so that grasp failures caused by positional deviation, slipping or insufficient force are avoided even in the presence of interference, small object size or complex contours.
By training and using the grasp strategy adjustment model, each adjustment of the grasp strategy can be made more precise, enabling the gripper components to achieve an accurate grasp of the target object more quickly.
Embodiment four
Fig. 4 is a block diagram of an object grasping device provided by an embodiment of the present application.
As shown in Fig. 4, the object grasping device provided in this embodiment is applied to a robot and is used to control the robot's gripper components, such as a mechanical hand or a mechanical arm, so that the gripper components grasp a target object. The object grasping device includes a parameter detection module 10, a first execution module 20, a grasp judgment module 30 and a second execution module 40.
The parameter detection module is used to detect the type, position and three-dimensional pose of the target object.
In the present application, the type, position and three-dimensional pose of the target object to be grasped are detected by a vision method. The type refers to the nature of the target object, such as a book, a cup or a machine part; the position refers to the spatial coordinates of the target object relative to the gripper components; the three-dimensional pose refers to a space vector of the target object relative to the gripper components, such as the orientation of one of its corresponding positions.
Specifically, the parameter detection module includes an image capturing unit and a detection execution unit.
The image capturing unit is used to obtain an image of the target object by a visual detection method, for example by photographing the target object with equipment such as a camera or video camera. The detection execution unit is used to input the image into a pre-trained target detection model to obtain the type, position and three-dimensional pose of the target object.
The target detection model is a deep neural network model obtained by training on a large number of data samples, and has high accuracy.
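The detection step then reduces to one model call. In the sketch below the model is a stub returning canned values; a real implementation would load a trained deep network, which is an assumption beyond what the patent specifies.

```python
# Hedged sketch of the detection execution unit: pass the captured image to a
# pre-trained target detection model, which returns type, position and pose.
def detect_target(image, model):
    out = model(image)
    return out["type"], out["position"], out["pose"]

# Stub model for illustration; the values below are invented examples.
stub_model = lambda img: {"type": "cup",
                          "position": (0.42, -0.10, 0.05),  # metres, gripper frame
                          "pose": (0.0, 0.0, 1.0)}          # orientation vector

obj_type, position, pose = detect_target(image=None, model=stub_model)
print(obj_type, position)
```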
The first execution module is used to determine a grasp strategy according to the type, position and three-dimensional pose, and to grasp the target object.
After the type, position and three-dimensional pose of the target object are obtained, they are input into the corresponding grasping action decision model to obtain the corresponding grasp strategy. Then, according to the grasp strategy, the robot's gripper components are controlled to approach the target object, make a motion matching the shape of the target object, and grasp the target object.
In the broad sense, grasping includes approaching the target object, opening the gripper components, grasping the target object, and performing further actions after the grasp is complete, such as moving the target object to another position. The grasping in this embodiment refers specifically to whether the gripper components have grasped the target object; on that basis the next action, such as moving the target object to another position, can be carried out.
The grasp judgment module is used to judge whether the gripper components have successfully grasped the target object.
Multiple sensors are provided on the gripper components, for example pressure sensors at multiple positions of the gripper components for detecting the pressure exerted on the target object during grasping. The judgment here determines whether the grasp has succeeded according to the sensor data detected by the multiple pressure sensors.
For example, for a typical hand-shaped gripper, when the grasp succeeds the pressure each position exerts on the target object should be similar, i.e., within a certain pressure range; whether the grasp has succeeded can therefore be determined from the differences among the multiple sensor readings. This is of course only one example, and other methods may also be used to judge from the sensor data whether the grasp has succeeded.
If the judgment determines that the target object has been grasped successfully, this grasping action ends and subsequent actions are executed, such as moving the target object to another position.
The second execution module is used to grasp the target object again if the grasp did not succeed.
If the above judgment determines that the grasp did not succeed, the grasp strategy is adjusted according to the sensor data. For convenience, the adjusted grasp strategy is called the grasp adjustment strategy, to distinguish it from the initial grasp strategy obtained directly from the type, position and three-dimensional pose of the target object.
After the grasp adjustment strategy is determined, the gripper components are controlled to grasp the target object again. Grasping again here means partially or completely releasing the gripper components and then grasping the target object again with the corresponding position, angle and grip strength changed.
Moreover, after grasping again, the grasp judgment module is controlled to judge once more whether the grasp has succeeded, and the grasping either ends after success is determined, or is repeated after another failure.
In addition, to avoid wasting time by grasping endlessly after repeated failures, a limit on the number of grasp attempts can be set; for example, grasping can be terminated when the number of attempts on a single target object exceeds a preset limit, such as 1000 attempts, 10000 attempts or some other number.
It can be seen from the above technical solution that this embodiment provides an object grasping device applied to a robot. The device specifically detects the type, position and three-dimensional pose of the target object; inputs the type, position and/or three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and controls the gripper components with the grasp strategy to grasp the target object; judges whether the grasp has succeeded according to the sensor data detected by the gripper components, and executes subsequent operations if it has; if it is determined that the grasp has not succeeded, adjusts the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, controls the gripper components with the adjusted grasp strategy to grasp again, and judges again whether the grasp has succeeded according to the sensor data newly detected by the gripper components. By cycling through adjustment and repeated grasping in this way, the target object can finally be grasped through fine adjustment even when a single grasp cannot succeed, so that grasp failures caused by positional deviation, slipping or insufficient force are avoided even in the presence of interference, small object size or complex contours.
Embodiment five
Fig. 5 is a block diagram of another object grasping device provided by an embodiment of the present application.
As shown in Fig. 5, the object grasping device provided in this embodiment is applied to a robot. Relative to the preceding embodiment, this embodiment additionally provides a first recording module 50 and a first training module 60.
The first recording module is used to record the sensor data at each grasp to form the first data set.
That is, the sensor data detected by the sensors on the gripper components at each grasp are recorded, ultimately forming the first data set, which includes not only the sensor data for successful grasps but also the sensor data for failed grasps, together with the type, position and three-dimensional pose of the corresponding target object.
The first training module is used to perform model training using the first data set to obtain the grasp judgment model.
After the first data set is obtained, a pre-constructed deep neural network is trained with the first data set, yielding a grasp judgment model that can detect whether a grasp has succeeded.
On the basis of this successfully trained model, when the grasp judgment module judges whether a grasp has succeeded, the sensor data can be input into the grasp judgment model and the corresponding judgment result output.
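Training such a judgment model can be sketched in miniature as follows. A simple learned threshold stands in for the deep neural network here, purely as an assumption for illustration; the data pairs are invented.

```python
# Minimal sketch of training a grasp judgment model on the first data set,
# which pairs sensor readings with success/failure labels.
def train_judgment_model(dataset):
    """dataset: list of (mean_pressure, succeeded) pairs.
    Learn the midpoint between failed and successful mean pressures,
    and return a callable judgment model."""
    success = [p for p, ok in dataset if ok]
    failure = [p for p, ok in dataset if not ok]
    threshold = (min(success) + max(failure)) / 2
    return lambda pressure: pressure >= threshold

# Invented example data: low mean pressure correlates with failed grasps.
first_data_set = [(0.2, False), (0.8, False), (2.6, True), (3.1, True)]
judge = train_judgment_model(first_data_set)
print(judge(2.9), judge(0.5))  # True False
```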
It can be seen from the above technical solution that this embodiment provides an object grasping device applied to a robot. The device specifically detects the type, position and three-dimensional pose of the target object; inputs the type, position and/or three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and controls the gripper components with the grasp strategy to grasp the target object; judges whether the grasp has succeeded according to the sensor data detected by the gripper components, and executes subsequent operations if it has; if it is determined that the grasp has not succeeded, adjusts the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, controls the gripper components with the adjusted grasp strategy to grasp again, and judges again whether the grasp has succeeded according to the sensor data newly detected by the gripper components. By cycling through adjustment and repeated grasping in this way, the target object can finally be grasped through fine adjustment even when a single grasp cannot succeed, so that grasp failures caused by positional deviation, slipping or insufficient force are avoided even in the presence of interference, small object size or complex contours.
By training and using the grasp judgment model, the judgment result can be made more accurate.
Embodiment six
Fig. 6 is a block diagram of yet another object grasping device provided by an embodiment of the present application.
As shown in Fig. 6, the object grasping device provided in this embodiment is applied to a robot. Relative to Embodiment four, this embodiment additionally provides a second recording module 70 and a second training module 80.
The second recording module is used to record the sensor data at each grasp to form the second data set.
That is, the sensor data detected by the sensors on the gripper components at each grasp and each adjusted grasp strategy are recorded, together with the type, position and three-dimensional pose of the corresponding target object, ultimately forming the second data set.
The second training module is used to perform model training using the second data set to obtain the grasp strategy adjustment model.
After the second data set is obtained, a pre-constructed deep neural network is trained with the second data set, yielding a grasp strategy adjustment model that can adjust the grasp strategy.
On the basis of this successfully trained model, each time the second execution module adjusts the grasp strategy, the corresponding sensor data are input into the model to obtain the adjusted grasp strategy, i.e. the grasp adjustment strategy.
It can be seen from the above technical solution that this embodiment provides an object grasping device applied to a robot. The device specifically detects the type, position and three-dimensional pose of the target object; inputs the type, position and/or three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and controls the gripper components with the grasp strategy to grasp the target object; judges whether the grasp has succeeded according to the sensor data detected by the gripper components, and executes subsequent operations if it has; if it is determined that the grasp has not succeeded, adjusts the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, controls the gripper components with the adjusted grasp strategy to grasp again, and judges again whether the grasp has succeeded according to the sensor data newly detected by the gripper components. By cycling through adjustment and repeated grasping in this way, the target object can finally be grasped through fine adjustment even when a single grasp cannot succeed, so that grasp failures caused by positional deviation, slipping or insufficient force are avoided even in the presence of interference, small object size or complex contours.
By training and using the grasp strategy adjustment model, each adjustment of the grasp strategy can be made more precise, enabling the gripper components to achieve an accurate grasp of the target object more quickly.
In addition, the present application also provides another specific embodiment. As shown in Fig. 7, it specifically includes the parameter detection module 10, the first execution module 20, the grasp judgment module 30, the second execution module 40, the first recording module 50, the first training module 60, the second recording module 70 and the second training module 80.
The specific functions of the above modules have been introduced in detail above and are not repeated here.
Embodiment seven
This embodiment provides a robot equipped with the object grasping device described in the above embodiments.
The object grasping device is specifically used to detect the type, position and three-dimensional pose of a target object; input the type, position and/or three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and control the gripper components with the grasp strategy to grasp the target object; judge whether the grasp has succeeded according to the sensor data detected by the gripper components, and execute subsequent operations if it has; if it is determined that the grasp has not succeeded, adjust the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, control the gripper components with the adjusted grasp strategy to grasp again, and judge again whether the grasp has succeeded according to the sensor data newly detected by the gripper components. By cycling through adjustment and repeated grasping in this way, the target object can finally be grasped through fine adjustment even when a single grasp cannot succeed, so that grasp failures caused by positional deviation, slipping or insufficient force are avoided even in the presence of interference, small object size or complex contours.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments can be referred to one another.
Those skilled in the art should understand that the embodiments of the present application may be provided as a method, an apparatus or a computer program product. Therefore, the embodiments of the present application may take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware aspects. Moreover, the embodiments of the present application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical memory, etc.) containing computer-usable program code.
The embodiments of the present application are described with reference to flowcharts and/or block diagrams of methods, terminal devices (systems) and computer program products according to the embodiments of the present application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be realized by computer program instructions. These computer program instructions may be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing terminal device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing terminal device produce a device for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be stored in a computer-readable memory capable of guiding a computer or another programmable data processing terminal device to operate in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, which realizes the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions may also be loaded onto a computer or another programmable data processing terminal device, so that a series of operation steps are executed on the computer or other programmable terminal device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable terminal device provide steps for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the embodiments of the present application have been described, those skilled in the art can make additional changes and modifications to these embodiments once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all changes and modifications falling within the scope of the embodiments of the present application.
Finally, it should also be noted that, herein, relational terms such as first and second are used merely to distinguish one entity or operation from another, and do not necessarily require or imply any actual relationship or order between these entities or operations. Moreover, the terms "include", "comprise" or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article or terminal device including a series of elements includes not only those elements but also other elements not explicitly listed, or further includes elements inherent to such a process, method, article or terminal device. In the absence of further restrictions, an element defined by the sentence "including a ..." does not exclude the existence of other identical elements in the process, method, article or terminal device including that element.
The technical solution provided by the present application has been described in detail above. Specific examples are used herein to explain the principles and embodiments of the present application, and the description of the above embodiments is only intended to help understand the method of the present application and its core idea. At the same time, for those of ordinary skill in the art, there will be changes in the specific embodiments and the scope of application according to the idea of the present application. In conclusion, the contents of this specification should not be construed as limiting the present application.

Claims (15)

1. An object grasping method, applied to a robot, characterized in that the object grasping method comprises the steps of:
detecting the type, position and three-dimensional pose of a target object;
inputting the type, the position and/or the three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and controlling gripper components with the grasp strategy to grasp the target object;
judging whether the grasp has succeeded according to sensor data detected by the gripper components, and executing subsequent operations if the grasp has succeeded;
if it is determined that the grasp has not succeeded, adjusting the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, controlling the gripper components with the adjusted grasp strategy to grasp again, and returning to the step of judging whether the grasp has succeeded according to the sensor data detected by the gripper components.
2. The object grasping method according to claim 1, characterized in that the detecting the type, position and three-dimensional pose of a target object comprises:
obtaining an image of the target object by a visual detection method;
inputting the image into a target detection model to obtain the type, the position and the three-dimensional pose.
3. The object grasping method according to claim 1, characterized by further comprising:
at each grasp, recording the sensor data to form a first data set, the first data set including sensor data for successful grasps and sensor data for failed grasps;
performing deep neural network training using the first data set to obtain a grasp judgment model, the grasp judgment model being used to judge whether a grasp has succeeded.
4. The object grasping method according to claim 3, characterized in that the judging whether the grasp has succeeded according to the sensor data detected by the gripper components comprises:
inputting the sensor data into the grasp judgment model to obtain a judgment result of whether the grasp has succeeded.
5. The object grasping method according to claim 1, characterized by further comprising:
at each grasp, recording the sensor data, the grasp strategy before each adjustment and the grasp strategy after each adjustment to form a second data set;
performing deep neural network training according to the second data set to obtain a grasp strategy adjustment model.
6. The object grasping method according to claim 5, characterized in that the adjusting the grasp strategy according to the sensor data to obtain the adjusted grasp strategy comprises:
inputting the sensor data into the grasp strategy adjustment model to obtain the adjusted grasp strategy.
7. The object grasping method according to any one of claims 1 to 6, characterized by further comprising:
when the number of grasp attempts on a single target object reaches a preset limit, controlling the gripper components to terminate grasping.
8. An object grasping device, applied to a robot, characterized in that the object grasping device comprises:
a parameter detection module for detecting the type, position and three-dimensional pose of a target object;
a first execution module for inputting the type, the position and/or the three-dimensional pose into a grasping action decision model to obtain a grasp strategy, and controlling gripper components with the grasp strategy to grasp the target object;
a grasp judgment module for judging whether the grasp has succeeded according to sensor data detected by the gripper components, subsequent operations being executed if the grasp has succeeded;
a second execution module for, if it is determined that the grasp has not succeeded, adjusting the grasp strategy according to the sensor data to obtain an adjusted grasp strategy, controlling the gripper components with the adjusted grasp strategy to grasp again, and returning to the step of judging whether the grasp has succeeded according to the sensor data detected by the gripper components.
9. The object grasping device according to claim 8, characterized in that the parameter detection module comprises:
an image capturing unit for obtaining an image of the target object by a visual detection method;
a detection execution unit for inputting the image into a target detection model to obtain the type, the position and the three-dimensional pose.
10. The object grasping device according to claim 8, characterized by further comprising:
a first recording module for recording the sensor data at each grasp to form a first data set, the first data set including sensor data for successful grasps and sensor data for failed grasps;
a first training module for performing deep neural network training using the first data set to obtain a grasp judgment model, the grasp judgment model being used to judge whether a grasp has succeeded.
11. The object grasping device according to claim 10, characterized in that the grasp judgment module is used to input the sensor data into the grasp judgment model to obtain a judgment result of whether the grasp has succeeded.
12. The object grasping device according to claim 8, characterized by further comprising:
a second recording module for, at each grasp, recording the sensor data, the grasp strategy before each adjustment and the grasp strategy after each adjustment to form a second data set;
a second training module for performing deep neural network training according to the second data set to obtain a grasp strategy adjustment model.
13. The object grasping device according to claim 12, characterized in that the second execution module is used to input the sensor data into the grasp strategy adjustment model to obtain the adjusted grasp strategy.
14. The object grasping device according to any one of claims 8 to 13, characterized in that the second execution module is further used to, when the number of grasp attempts on a single target object reaches a preset limit, control the gripper components to terminate grasping.
15. A robot, characterized by being provided with the object grasping device according to any one of claims 8 to 14.
CN201811256263.3A 2018-10-26 2018-10-26 A kind of robot and its grasping body method and apparatus Pending CN109333536A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811256263.3A CN109333536A (en) 2018-10-26 2018-10-26 A kind of robot and its grasping body method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811256263.3A CN109333536A (en) 2018-10-26 2018-10-26 A kind of robot and its grasping body method and apparatus

Publications (1)

Publication Number Publication Date
CN109333536A true CN109333536A (en) 2019-02-15

Family

ID=65311903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811256263.3A Pending CN109333536A (en) 2018-10-26 2018-10-26 A kind of robot and its grasping body method and apparatus

Country Status (1)

Country Link
CN (1) CN109333536A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110271007A (en) * 2019-07-24 2019-09-24 广东工业大学 A kind of the grasping body method and relevant apparatus of mechanical arm
CN110329710A (en) * 2019-05-31 2019-10-15 牧今科技 Robot system and its operating method with robots arm's absorption and control mechanism
CN110605714A (en) * 2019-08-06 2019-12-24 华中科技大学 Hand-eye coordination grabbing method based on human eye fixation point
CN110782038A (en) * 2019-09-27 2020-02-11 深圳蓝胖子机器人有限公司 Method and system for automatically marking training sample and method and system for supervised learning
CN111230878A (en) * 2020-02-14 2020-06-05 珠海格力智能装备有限公司 Stacking robot control method, device and equipment and stacking robot system
CN111730606A (en) * 2020-08-13 2020-10-02 深圳国信泰富科技有限公司 Grabbing action control method and system of high-intelligence robot
CN112040124A (en) * 2020-08-28 2020-12-04 深圳市商汤科技有限公司 Data acquisition method, device, equipment, system and computer storage medium
CN113232019A (en) * 2021-05-13 2021-08-10 中国联合网络通信集团有限公司 Mechanical arm control method and device, electronic equipment and storage medium
CN113753562A (en) * 2021-08-24 2021-12-07 深圳市长荣科机电设备有限公司 Carrying method, system and device based on linear motor and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106530297A (en) * 2016-11-11 2017-03-22 北京睿思奥图智能科技有限公司 Object grabbing region positioning method based on point cloud registering
CN106530276A (en) * 2016-10-13 2017-03-22 中科金睛视觉科技(北京)有限公司 Manipulator positioning method and system for grabbing of non-standard component
CA3029834A1 (en) * 2016-07-18 2018-01-25 Lael Odhner Assessing robotic grasping
CN108058172A (en) * 2017-11-30 2018-05-22 深圳市唯特视科技有限公司 A kind of manipulator grasping means based on autoregression model
CN108340367A (en) * 2017-12-13 2018-07-31 深圳市鸿益达供应链科技有限公司 Machine learning method for mechanical arm crawl
CN108399639A (en) * 2018-02-12 2018-08-14 杭州蓝芯科技有限公司 Fast automatic crawl based on deep learning and arrangement method
CN108537841A (en) * 2017-03-03 2018-09-14 株式会社理光 A kind of implementation method, device and the electronic equipment of robot pickup


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
北京机床研究所 (Beijing Machine Tool Research Institute): "Reference Materials on Foreign Industrial Manipulators" (《国外工业机械手参考资料》), 31 December 1974 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110329710A (en) * 2019-05-31 2019-10-15 牧今科技 Robot system with robotic arm suction and control mechanism, and operating method thereof
CN110271007A (en) * 2019-07-24 2019-09-24 广东工业大学 Object grasping method and related apparatus for a robotic arm
CN110605714A (en) * 2019-08-06 2019-12-24 华中科技大学 Hand-eye coordination grasping method based on human eye fixation point
CN110605714B (en) * 2019-08-06 2021-08-03 华中科技大学 Hand-eye coordination grasping method based on human eye fixation point
CN110782038A (en) * 2019-09-27 2020-02-11 深圳蓝胖子机器人有限公司 Method and system for automatically labeling training samples, and method and system for supervised learning
CN111230878A (en) * 2020-02-14 2020-06-05 珠海格力智能装备有限公司 Stacking robot control method, device and equipment and stacking robot system
CN111230878B (en) * 2020-02-14 2021-10-26 珠海格力智能装备有限公司 Stacking robot control method, device and equipment and stacking robot system
CN111730606A (en) * 2020-08-13 2020-10-02 深圳国信泰富科技有限公司 Grasping action control method and system for a highly intelligent robot
CN111730606B (en) * 2020-08-13 2022-03-04 深圳国信泰富科技有限公司 Grasping action control method and system for a highly intelligent robot
CN112040124A (en) * 2020-08-28 2020-12-04 深圳市商汤科技有限公司 Data acquisition method, device, equipment, system and computer storage medium
CN113232019A (en) * 2021-05-13 2021-08-10 中国联合网络通信集团有限公司 Mechanical arm control method and device, electronic equipment and storage medium
CN113753562A (en) * 2021-08-24 2021-12-07 深圳市长荣科机电设备有限公司 Carrying method, system and device based on linear motor and storage medium

Similar Documents

Publication Publication Date Title
CN109333536A (en) Robot and object grasping method and apparatus thereof
US11338435B2 (en) Gripping system with machine learning
CN111055279B (en) Multi-mode object grabbing method and system based on combination of touch sense and vision
US10981272B1 (en) Robot grasp learning
CN109176521A (en) Robotic arm and grasp control method and system thereof
CN110000785B (en) Calibration-free motion-vision cooperative servo control method and device for robots in agricultural scenes
CN110125930B (en) Robotic arm grasping control method based on machine vision and deep learning
CN105598965B (en) Stereoscopic-vision-based autonomous grasping method for an underactuated robotic hand
US11813749B2 (en) Robot teaching by human demonstration
CN112297013B (en) Robot intelligent grabbing method based on digital twin and deep neural network
JP6671694B1 (en) Machine learning device, machine learning system, data processing system, and machine learning method
CN109421071A (en) Article stacking apparatus and machine learning device
CN110238855B (en) Robot grasping method for disordered workpieces based on deep inverse reinforcement learning
CN110271007A (en) Object grasping method and related apparatus for a robotic arm
Ji et al. Learning-based automation of robotic assembly for smart manufacturing
CN110539299B (en) Robot working method, controller and robot system
CN109955244A (en) Grasping control method and device based on visual servoing, and robot
CN110428464A (en) Deep-learning-based grasp pose estimation method for multi-class disordered workpieces
JP2019057250A (en) Workpiece information processing system and workpiece recognition method
CN109693234A (en) Robot fall prediction method and device, terminal equipment and computer storage medium
CN115989117A (en) System and method for object detection
US20230330858A1 (en) Fine-grained industrial robotic assemblies
CN112633187B (en) Automatic robot carrying method, system and storage medium based on image analysis
TW201914782A (en) Holding position and posture instruction apparatus, holding position and posture instruction method, and robot system
Hafiane et al. 3D hand recognition for telerobotics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2019-02-15