CN106886165A - Simulator-based robotic hand grasping and manipulation method - Google Patents

Simulator-based robotic hand grasping and manipulation method Download PDF

Info

Publication number
CN106886165A
CN106886165A (application CN201710141949.7A)
Authority
CN
China
Prior art keywords
grasp
candidate
image
manipulator
simulator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710141949.7A
Other languages
Chinese (zh)
Inventor
夏春秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Vision Technology Co Ltd
Original Assignee
Shenzhen Vision Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Vision Technology Co Ltd filed Critical Shenzhen Vision Technology Co Ltd
Priority to CN201710141949.7A priority Critical patent/CN106886165A/en
Publication of CN106886165A publication Critical patent/CN106886165A/en
Pending legal-status Critical Current

Links

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00 — Systems involving the use of models or simulators of said systems
    • G05B17/02 — Systems involving the use of models or simulators of said systems, electric

Abstract

The present invention proposes a simulator-based robotic hand grasping and manipulation method. Its main content includes: simulation initialization; execution of simulation tasks and data collection; and post-processing. The process first performs simulation initialization from the initial object and hand configurations, the object attributes, and a grasp-candidate database; it then maps grasps to different images and creates the dataset within a feasible amount of time through some form of parallelization, executing the simulation tasks and collecting data; finally, based on the aggregated per-object data and preprocessing, it builds an object-grasp database and carries out post-processing. The simulator-based grasping and manipulation method of the present invention reduces noise and greatly reduces sensor limitations; it improves the flexibility and stability of the robotic hand and also improves measurement accuracy.

Description

Simulator-based robotic hand grasping and manipulation method
Technical field
The present invention relates to the field of intelligent robotics, and in particular to a simulator-based robotic hand grasping and manipulation method.
Background technology
A dexterous robot hand can perform flexible, fine grasping operations and, as an effective extension of human limbs, has increasingly become one of the popular research directions in robotics. Compared with a simple end-effector, a dexterous robot hand offers high versatility and rich perception, and can achieve accurate, firm grasps that satisfy both form closure and force closure. It is a highly integrated, intelligent electromechanical system with multiple perceptual functions, an interdisciplinary subject involving mechanism theory, bionics, automatic control, sensor technology, computer technology, artificial intelligence, communication theory, microelectronics, and materials science.
Manipulators can be widely applied to operations in hazardous environments, marine resource exploration, and space exploration, and will gradually spread into our daily lives in the future. However, because current manipulators are limited by their sensors (including sensor noise), their measurement results are generally not accurate enough.
The present invention proposes a simulator-based robotic hand grasping and manipulation method. It first performs simulation initialization from the initial object and hand configurations, the object attributes, and a grasp-candidate database; it then maps grasps to different images and creates a dataset within a feasible amount of time through some form of parallelization, executing the simulation tasks and collecting data; finally, based on the aggregated per-object data and preprocessing, it builds an object-grasp database and performs post-processing. The present invention reduces noise and greatly reduces sensor limitations; it improves the flexibility and stability of the manipulator and also improves measurement accuracy.
Summary of the invention
To address the problem of inaccurate measurement results, it is an object of the present invention to provide a simulator-based robotic hand grasping and manipulation method that first performs simulation initialization from the initial object and hand configurations, the object attributes, and a grasp-candidate database; then maps grasps to different images, with parallelization, to execute simulation tasks and collect data; and finally builds an object-grasp database based on the aggregated per-object data and preprocessing.
To solve the above problems, the present invention provides a simulator-based robotic hand grasping and manipulation method whose main content includes:
(1) simulation initialization;
(2) execution of simulation tasks and data collection;
(3) post-processing.
In the simulation initialization, each simulation requires an initial object configuration and a hand configuration; the object attributes need to be defined, and a list of possible grasp candidates needs to be generated. This stage covers the initial object and hand configurations, the object attributes, and the grasp-candidate database.
Further, the initial object and hand configurations and attributes are obtained by first preprocessing all object meshes. Each object mesh is loaded into a Python script, and estimates of the object's mass and center of inertia are obtained.
Using these preprocessed values, each mesh is loaded into the robot simulation software (V-REP) to determine the object's resting pose and the gripper's initial pose. A bounding box is first assigned to the object; by reorienting the object relative to the world frame {W}, the box center is taken as the object's geometric center to estimate the object's pose. The object is then placed 0.3 m along the positive Z-direction of {T}; relative to {T}, the object is centered at (x, y) = (0, 0) using a pure translational component and held in a resting pose.
Given this resting pose, the grasp is then placed at an initial position along the positive Z-direction in the object frame {O}, at a chosen distance from the object's center, from the local frame toward the bounding-box edge directions along x, y, and z. All object attributes (including object pose, object bounding box, and material) and the grasp pose are recorded, and this process is repeated for each object in the dataset.
Further, regarding the grasp-candidate database: the method for covering the space of possible grasp candidates in simulation is based on pre- and post-multiplication of the object configuration, represented as transformation matrices.
Given the object's bounding box and the gripper pose, offline grasp candidates are computed by rotating the grasp globally around the object (pre-multiplication) and locally (post-multiplication). 3×3 rotation matrices R_X(α), R_Y(β), R_Z(γ) are applied about the X, Y, and Z axes respectively; omitting α, β, γ, the transformation matrix is computed according to the following formula:
Q = R_X R_Y R_Z T_G^O R_X R_Y R_Z (1)
where Q denotes the final transform of the grasp coordinate frame. The grasp-candidate computation is executed offline in a Python script using the object's estimated bounding box and the transformation matrices. The chosen constraints generate 8 rotations about the Z-axis (i.e., every 45°), and the local rotations occur at a somewhat finer scale than the global rotations.
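As an illustration, the global (pre-multiplied) / local (post-multiplied) rotation scheme of formula (1) can be sketched with NumPy. The function names and the restriction to Z-axis sweeps are assumptions made for brevity; the actual script would also compose R_X and R_Y and apply the suitability checks described below.

```python
import numpy as np

def rot_z(a):
    """3x3 rotation matrix about the Z axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def homogeneous(R):
    """Embed a 3x3 rotation in a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    return T

def grasp_candidates(T_g_o, n_global=8, n_local=24):
    """Candidates Q = R_global @ T_g_o @ R_local, per formula (1):
    n_global=8 gives the 45-degree global Z rotations; n_local is finer."""
    candidates = []
    for g in np.linspace(0.0, 2 * np.pi, n_global, endpoint=False):
        R_glob = homogeneous(rot_z(g))
        for l in np.linspace(0.0, 2 * np.pi, n_local, endpoint=False):
            R_loc = homogeneous(rot_z(l))
            candidates.append(R_glob @ T_g_o @ R_loc)
    return candidates
```

With the default arguments this enumerates 8 × 24 = 192 candidate transforms around the initial grasp pose T_g_o.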
Further, for each grasp candidate: after evaluating formula (1), the new gripper position is checked for suitability (unsuitable candidates are discarded) by solving a system of linear equations to check whether the normal vector from the gripper palm intersects the object's bounding box. If it intersects, the grasp candidate is added to the grasp-candidate database, and the process is repeated until the rotation list is exhausted. Of all the possible candidates in the database, at most 10,000 are selected for verification in the simulator.
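The patent describes this suitability check as solving a linear system to decide whether the palm's normal ray intersects the object's axis-aligned bounding box. One standard way to implement such a ray/box test is the slab method; the sketch below is an assumed implementation for illustration, not the patent's exact equations.

```python
import numpy as np

def normal_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does the ray from the palm along its normal hit the box?"""
    direction = direction / np.linalg.norm(direction)
    t_near, t_far = -np.inf, np.inf
    for i in range(3):
        if abs(direction[i]) < 1e-12:
            # Ray parallel to this slab: must already lie between its planes.
            if origin[i] < box_min[i] or origin[i] > box_max[i]:
                return False
        else:
            t1 = (box_min[i] - origin[i]) / direction[i]
            t2 = (box_max[i] - origin[i]) / direction[i]
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
    # Intersection exists if the slabs overlap in front of the palm (t >= 0).
    return t_far >= max(t_near, 0.0)
```

A candidate whose palm normal fails this test points away from the object and would be discarded before simulation.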
In the execution of simulation tasks and data collection, which includes grasp-to-image mapping and parallelization: the process starts by loading the object into the simulation and initializing its mass, inertia, and pose with the values recorded during the initialization phase. The object is initially placed in a static step so that it does not move when the fingertips contact it. After loading the object, the simulator samples a subset of potential candidates in the initialization phase for testing. Owing to the grasp configuration and potential collisions with the workbench or the object, if a grasp is infeasible, the current attempt is stopped and the process moves on to the next candidate.
Further, in the testing, a proximity sensor is used to check that each feasible grasp candidate has the palm facing the object. If, at that position, the proximity sensor attached to the grasp detects the object, the detected surface point is recorded, and three grasps are attempted at distances of 0.06, 0.09, and 0.12 m from the detected surface point (using the same grasp orientation), along the original palm normal. These distances are chosen to lie within the distance between the palm and the fingertips of the Barrett Hand (0.145 m), and allow the object's geometry to be detected at slightly different scales.
During each attempt, the camera is positioned at a distance of 0.25 m from the palm along the local negative Z-direction, and an image of the object is recorded before the grasp is attempted. Once the grasp is placed and the image recorded, the manipulator closes around the object; if all fingertips are in contact with the object, the object is switched to dynamic simulation and the lifting procedure begins.
A lift target position of (0.0, 0.0, 0.60 m) relative to {T} is selected, and the manipulator is forced to maintain its current grip posture while moving. Once the grasp has reached the target position, the grip is considered stable and successful if all fingertips are still in contact with the object. This process is repeated until the grasp-candidate list is exhausted. In V-REP, a motion-planning wrapper is used to compute the trajectory, and incremental steps are executed along the generated path.
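Putting the steps above together, the per-candidate attempt loop can be sketched as follows. Here `sim` stands for a hypothetical wrapper around the simulator's remote API; every method name on it is an assumption made for illustration, not an actual V-REP call.

```python
# Sketch of the per-candidate grasp-attempt loop described above.
STANDOFFS = (0.06, 0.09, 0.12)   # metres along the original palm normal
LIFT_TARGET = (0.0, 0.0, 0.60)   # metres, relative to the table frame {T}

def attempt_candidates(sim, candidates):
    """Try each grasp candidate at three standoff distances; record results."""
    results = []
    for q in candidates:
        if not sim.palm_faces_object(q):           # proximity-sensor check
            continue
        surface_pt = sim.detected_surface_point(q)
        for d in STANDOFFS:
            pose = sim.offset_along_palm_normal(q, surface_pt, d)
            image = sim.record_camera_image(pose, camera_offset=0.25)
            sim.close_gripper(pose)
            if not sim.all_fingertips_in_contact():
                continue                           # grasp failed to close
            sim.set_object_dynamic()               # static -> dynamic
            sim.lift_to(LIFT_TARGET)               # keep grip posture en route
            success = sim.all_fingertips_in_contact()
            results.append((pose, image, success))
    return results
```

A grasp counts as stable and successful only when all fingertips remain in contact after the lift, matching the criterion in the text.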
Further, regarding the different images and grasp mappings: because the grasp procedure always closes around the object in a similar way, two different views of the object are collected during the grasping process:
(1) the camera direction is always upward (a one-to-many mapping);
(2) the camera direction always matches the grasp direction (a one-to-one mapping).
The one-to-many mapping between images and grasps introduces ambiguity into the grasp space: in this case, the gripper orientation is not directly related to the camera orientation, which means a single image may correspond to many different gripper poses. The one-to-one mapping, however, introduces a more direct relation between image and grasp: similar orientations of the object captured in the image reflect similar orientations in the grasp.
Further, regarding parallelization: because this large number of grasp candidates is sampled and the number of objects to be evaluated is relatively high, some form of parallelization is necessary in order to create the dataset within a feasible amount of time.
Because the vision sensor requires only a small amount of graphics-card memory, each scene is run on a server in a mode without any graphical interface.
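A minimal sketch of this kind of parallelization, under the assumption that each worker launches and blocks on its own headless simulator instance (V-REP scenes can be run without a GUI from the command line); the body of `simulate_object` is a stub, and both function names are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_object(mesh_name):
    # In the real pipeline each worker would start a headless simulator
    # process for this mesh via subprocess and wait for it to finish;
    # here the simulator call is stubbed out.
    return (mesh_name, "done")

def run_parallel(meshes, workers=4):
    # Threads suffice here because each job would block on an external
    # simulator process rather than doing CPU work in Python.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_object, meshes))
```

Results come back in input order, so per-object records can be matched to the mesh list directly.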
In the post-processing, the information captured by the depth buffer in the simulation is encoded into the range [0, 1] and can be decoded into actual values by the following operation:
I = X_near + I · (X_far − X_near) (2)
where I is the collected image, and X_near and X_far are the near and far clipping planes, respectively. All object–grasp example pairs whose image variance is below 1e-3 are deleted, and all grasps whose collected image has the camera height matching the table height are deleted.
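The decoding of equation (2) and the variance filter can be sketched directly in NumPy; the variance threshold follows the text, while the clip-plane values and function names are assumptions for illustration.

```python
import numpy as np

def decode_depth(encoded, x_near, x_far):
    """Invert the [0, 1] depth-buffer encoding per equation (2)."""
    return x_near + encoded * (x_far - x_near)

def keep_example(encoded_image, var_threshold=1e-3):
    """Drop object-grasp pairs whose image is near-constant (variance < 1e-3),
    e.g. views that saw only empty space or a flat surface."""
    return float(np.var(encoded_image)) >= var_threshold
```

For example, a pixel encoded as 0.5 with clip planes at 0.1 m and 2.0 m decodes to 0.1 + 0.5 · 1.9 = 1.05 m.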
Brief description of the drawings
Fig. 1 is the system flow chart of the simulator-based robotic hand grasping and manipulation method of the present invention.
Fig. 2 illustrates the grasp-candidate database of the simulator-based robotic hand grasping and manipulation method of the present invention.
Fig. 3 is a flow chart of the execution of simulation tasks in the simulator-based robotic hand grasping and manipulation method of the present invention.
Fig. 4 illustrates the data collection of the simulator-based robotic hand grasping and manipulation method of the present invention.
Specific embodiments
It should be noted that, where no conflict arises, the embodiments in this application and the features in the embodiments may be combined with one another. The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is the system flow chart of the simulator-based robotic hand grasping and manipulation method of the present invention. It mainly comprises simulation initialization, execution of simulation tasks and data collection, and post-processing.
In the simulation initialization, each simulation requires an initial object configuration and a hand configuration; the object attributes need to be defined, and a list of possible grasp candidates needs to be generated. This stage covers the initial object and hand configurations, the object attributes, and the grasp-candidate database.
The initial object and hand configurations and attributes are obtained by first preprocessing all object meshes. Each object mesh is loaded into a Python script, and estimates of the object's mass and center of inertia are obtained.
Using these preprocessed values, each mesh is loaded into the robot simulation software (V-REP) to determine the object's resting pose and the gripper's initial pose. A bounding box is first assigned to the object; by reorienting the object relative to the world frame {W}, the box center is taken as the object's geometric center to estimate the object's pose. The object is then placed 0.3 m along the positive Z-direction of {T}; relative to {T}, the object is centered at (x, y) = (0, 0) using a pure translational component and held in a resting pose.
Given this resting pose, the grasp is then placed at an initial position along the positive Z-direction in the object frame {O}, at a chosen distance from the object's center, from the local frame toward the bounding-box edge directions along x, y, and z. All object attributes (including object pose, object bounding box, and material) and the grasp pose are recorded, and this process is repeated for each object in the dataset.
In the post-processing, the information captured by the depth buffer in the simulation is encoded into the range [0, 1] and can be decoded into actual values by the following operation:
I = X_near + I · (X_far − X_near) (2)
where I is the collected image, and X_near and X_far are the near and far clipping planes, respectively. All object–grasp example pairs whose image variance is below 1e-3 are deleted, and all grasps whose collected image has the camera height matching the table height are deleted.
Fig. 2 illustrates the grasp-candidate database of the simulator-based robotic hand grasping and manipulation method of the present invention. The method for covering the space of possible grasp candidates in simulation is based on pre- and post-multiplication of the object configuration, represented as transformation matrices.
Given the object's bounding box and the gripper pose, offline grasp candidates are computed by rotating the grasp globally around the object (pre-multiplication) and locally (post-multiplication). 3×3 rotation matrices R_X(α), R_Y(β), R_Z(γ) are applied about the X, Y, and Z axes respectively; omitting α, β, γ, the transformation matrix is computed according to formula (1), where Q denotes the final transform of the grasp coordinate frame. The grasp-candidate computation is executed offline in a Python script using the object's estimated bounding box and the transformation matrices. The chosen constraints generate 8 rotations about the Z-axis (i.e., every 45°), and the local rotations occur at a somewhat finer scale than the global rotations.
After evaluating formula (1), the new gripper position is checked for suitability (unsuitable candidates are discarded) by solving a system of linear equations to check whether the normal vector from the gripper palm intersects the object's bounding box. If it intersects, the grasp candidate is added to the grasp-candidate database, and the process is repeated until the rotation list is exhausted. Of all the possible candidates in the database, at most 10,000 are selected for verification in the simulator.
Fig. 3 is a flow chart of the execution of simulation tasks in the simulator-based robotic hand grasping and manipulation method of the present invention. Execution of simulation tasks and data collection includes grasp-to-image mapping and parallelization. The process starts by loading the object into the simulation and initializing its mass, inertia, and pose with the values recorded during the initialization phase. The object is initially placed in a static step so that it does not move when the fingertips contact it. After loading the object, the simulator samples a subset of potential candidates in the initialization phase for testing. Owing to the grasp configuration and potential collisions with the workbench or the object, if a grasp is infeasible, the current attempt is stopped and the process moves on to the next candidate.
Fig. 4 illustrates the data collection of the simulator-based robotic hand grasping and manipulation method of the present invention. A proximity sensor is used to check that each feasible grasp candidate has the palm facing the object. If, at that position, the proximity sensor attached to the grasp detects the object, the detected surface point is recorded, and three grasps are attempted at distances of 0.06, 0.09, and 0.12 m from the detected surface point (using the same grasp orientation), along the original palm normal. These distances are chosen to lie within the distance between the palm and the fingertips of the Barrett Hand (0.145 m), and allow the object's geometry to be detected at slightly different scales.
During each attempt, the camera is positioned at a distance of 0.25 m from the palm along the local negative Z-direction, and an image of the object is recorded before the grasp is attempted. Once the grasp is placed and the image recorded, the manipulator closes around the object; if all fingertips are in contact with the object, the object is switched to dynamic simulation and the lifting procedure begins.
A lift target position of (0.0, 0.0, 0.60 m) relative to {T} is selected, and the manipulator is forced to maintain its current grip posture while moving. Once the grasp has reached the target position, the grip is considered stable and successful if all fingertips are still in contact with the object. This process is repeated until the grasp-candidate list is exhausted. In V-REP, a motion-planning wrapper is used to compute the trajectory, and incremental steps are executed along the generated path.
Because the grasp procedure always closes around the object in a similar way, two different views of the object are collected during the grasping process:
(1) the camera direction is always upward (a one-to-many mapping);
(2) the camera direction always matches the grasp direction (a one-to-one mapping).
The one-to-many mapping between images and grasps introduces ambiguity into the grasp space: in this case, the gripper orientation is not directly related to the camera orientation, which means a single image may correspond to many different gripper poses. The one-to-one mapping, however, introduces a more direct relation between image and grasp: similar orientations of the object captured in the image reflect similar orientations in the grasp.
Because this large number of grasp candidates is sampled and the number of objects to be evaluated is relatively high, some form of parallelization is necessary in order to create the dataset within a feasible amount of time.
Because the vision sensor requires only a small amount of graphics-card memory, each scene is run on a server in a mode without any graphical interface.
For those skilled in the art, the present invention is not restricted to the details of the above embodiments, and the present invention may be realized in other concrete forms without departing from its spirit and scope. Moreover, those skilled in the art may make various changes and modifications to the present invention without departing from its spirit and scope, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention. The appended claims are therefore intended to be construed to include the preferred embodiments and all changes and modifications falling within the scope of the invention.

Claims (10)

1. A simulator-based robotic hand grasping and manipulation method, characterized in that it mainly comprises simulation initialization (1); execution of simulation tasks and data collection (2); and post-processing (3).
2. The simulation initialization (1) according to claim 1, characterized in that each simulation requires an initial object configuration and a hand configuration; the object attributes need to be defined, and a list of possible grasp candidates needs to be generated; it comprises the initial object and hand configurations, the attributes, and the grasp-candidate database.
3. The initial object and hand configurations and attributes according to claim 2, characterized in that all object meshes are first preprocessed; each object mesh is loaded into a Python script, and estimates of the object's mass and center of inertia are obtained;
using these preprocessed values, each mesh is loaded into the robot simulation software (V-REP) to determine the object's initial resting pose and the gripper's initial pose; a bounding box is first assigned to the object, and by reorienting the object relative to {W}, the box center is taken as the object's geometric center to estimate the object's pose; the object is then placed 0.3 m along the positive Z-direction of {T}; relative to {T}, the object is centered at (x, y) = (0, 0) using a pure translational component and held in a resting pose;
given this resting pose, the grasp is then placed at an initial position along the positive Z-direction in {O}, at a chosen distance from the object's center, from the local frame toward the bounding-box edge directions along x, y, and z; all object attributes (including object pose, object bounding box, and material) and the grasp pose are recorded, and this process is repeated for each object in the dataset.
4. The grasp-candidate database according to claim 2, characterized in that the method for covering the space of possible grasp candidates in simulation is based on pre- and post-multiplication of the object configuration, represented as transformation matrices;
given the object's bounding box and the gripper pose, offline grasp candidates are computed by rotating the grasp globally around the object (pre-multiplication) and locally (post-multiplication); 3×3 rotation matrices R_X(α), R_Y(β), R_Z(γ) are applied about the X, Y, and Z axes respectively; omitting α, β, γ, the transformation matrix is computed according to the following formula:
Q = R_X R_Y R_Z T_G^O R_X R_Y R_Z (1)
where Q denotes the final transform of the grasp coordinate frame; the grasp-candidate computation is executed offline in a Python script using the object's estimated bounding box and the transformation matrices; the chosen constraints generate 8 rotations about the Z-axis (i.e., every 45°), and the local rotations occur at a somewhat finer scale than the global rotations.
5. The grasp candidates according to claim 4, characterized in that after evaluating formula (1), the new gripper position is checked for suitability (unsuitable candidates are discarded) by solving a system of linear equations to check whether the normal vector from the gripper palm intersects the object's bounding box; if it intersects, the grasp candidate is added to the grasp-candidate database, and the process is repeated until the rotation list is exhausted; of all the possible candidates in the database, at most 10,000 are selected for verification in the simulator.
6. The execution of simulation tasks and data collection (2) according to claim 1, characterized in that it includes grasp-to-image mapping and parallelization; it starts by loading the object into the simulation and initializing its mass, inertia, and pose with the values recorded during the initialization phase; the object is initially placed in a static step so that it does not move when the fingertips contact it; after loading the object, the simulator samples a subset of potential candidates in the initialization phase for testing; owing to the grasp configuration and potential collisions with the workbench or the object, if a grasp is infeasible, the current attempt is stopped and the process moves on to the next candidate.
7. The testing according to claim 6, characterized in that a proximity sensor is used to check that each feasible grasp candidate has the palm facing the object; if, at that position, the proximity sensor attached to the grasp detects the object, the detected surface point is recorded, and three grasps are attempted at distances of 0.06, 0.09, and 0.12 m from the detected surface point (using the same grasp orientation), along the original palm normal; these distances are chosen to lie within the distance between the palm and the fingertips of the Barrett Hand (0.145 m), and allow the object's geometry to be detected at slightly different scales;
during each attempt, the camera is positioned at a distance of 0.25 m from the palm along the local negative Z-direction, and an image of the object is recorded before the grasp is attempted; once the grasp is placed and the image recorded, the manipulator closes around the object; if all fingertips are in contact with the object, the object is switched to dynamic simulation and the lifting procedure begins;
a lift target position of (0.0, 0.0, 0.60 m) relative to {T} is selected, and the manipulator is forced to maintain its current grip posture while moving; once the grasp has reached the target position, the grip is considered stable and successful if all fingertips are still in contact with the object; this process is repeated until the grasp-candidate list is exhausted; in V-REP, a motion-planning wrapper is used to compute the trajectory, and incremental steps are executed along the generated path.
8. The different image and grasp mappings according to claim 6, characterized in that, because the grasp procedure always closes around the object in a similar way, two different views of the object are collected during the grasping process:
(1) the camera direction is always upward (a one-to-many mapping);
(2) the camera direction always matches the grasp direction (a one-to-one mapping);
the one-to-many mapping between images and grasps introduces ambiguity into the grasp space: in this case, the gripper orientation is not directly related to the camera orientation, which means a single image may correspond to many different gripper poses; the one-to-one mapping, however, introduces a more direct relation between image and grasp: similar orientations of the object captured in the image reflect similar orientations in the grasp.
9. The parallelization according to claim 6, characterized in that, because this large number of grasp candidates is sampled and the number of objects to be evaluated is relatively high, some form of parallelization is necessary in order to create the dataset within a feasible amount of time;
because the vision sensor requires only a small amount of graphics-card memory, each scene is run on a server in a mode without any graphical interface.
10. The post-processing (3) according to claim 1, characterized in that, in the simulation, the information captured by the depth buffer is encoded into the range [0, 1] and can be decoded into actual values by the following operation:
I = X_near + I · (X_far − X_near) (2)
where I is the collected image, and X_near and X_far are the near and far clipping planes, respectively; all object–grasp example pairs whose image variance is below 1e-3 are deleted; when the camera height matches the table height, all grasps of the collected image are deleted.
CN201710141949.7A 2017-03-10 2017-03-10 Simulator-based robotic hand grasping and manipulation method Pending CN106886165A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710141949.7A CN106886165A (en) 2017-03-10 2017-03-10 Simulator-based robotic hand grasping and manipulation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710141949.7A CN106886165A (en) 2017-03-10 2017-03-10 Simulator-based robotic hand grasping and manipulation method

Publications (1)

Publication Number Publication Date
CN106886165A 2017-06-23

Family

ID=59179603

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710141949.7A Pending CN106886165A (en) Simulator-based robotic hand grasping and manipulation method

Country Status (1)

Country Link
CN (1) CN106886165A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109144036A (en) * 2018-10-22 2019-01-04 江苏艾科半导体有限公司 A kind of manipulator simulated testing system and test method based on fpga chip
CN110115849A (en) * 2019-04-30 2019-08-13 厦门大学 A kind of small-sized marionette robot control method, system, terminal device
WO2020016717A1 (en) * 2018-07-19 2020-01-23 International Business Machines Corporation Perform peg-in-hole task with unknown tilt
CN111861305A (en) * 2018-10-30 2020-10-30 牧今科技 Robotic system with automated package registration mechanism and minimum feasible area detection
CN113043325A (en) * 2019-12-27 2021-06-29 沈阳新松机器人自动化股份有限公司 Method and device for detecting motion state of robot joint
US11780101B2 (en) 2018-10-30 2023-10-10 Mujin, Inc. Automated package registration systems, devices, and methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MATTHEW VERES et al.: "An Integrated Simulator and Dataset that Combines Grasping and Vision for Deep Learning", arXiv:1702.02103v1 *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2588873B (en) * 2018-07-19 2021-10-13 Ibm Perform peg-in-hole task with unknown tilt
CN112424703A (en) * 2018-07-19 2021-02-26 International Business Machines Corporation Perform peg-in-hole task with unknown tilt
WO2020016717A1 (en) * 2018-07-19 2020-01-23 International Business Machines Corporation Perform peg-in-hole task with unknown tilt
GB2588873A (en) * 2018-07-19 2021-05-12 Ibm Perform peg-in-hole task with unknown tilt
US10953548B2 (en) 2018-07-19 2021-03-23 International Business Machines Corporation Perform peg-in-hole task with unknown tilt
CN109144036A (en) * 2018-10-22 2019-01-04 Jiangsu Aike Semiconductor Co., Ltd. Manipulator simulation test system and test method based on FPGA chip
CN109144036B (en) * 2018-10-22 2023-11-21 Jiangsu Aike Semiconductor Co., Ltd. Manipulator simulation test system and test method based on FPGA chip
US11288810B2 (en) 2018-10-30 2022-03-29 Mujin, Inc. Robotic system with automated package registration mechanism and methods of operating the same
CN111861305A (en) * 2018-10-30 2020-10-30 Mujin, Inc. Robotic system with automated package registration mechanism and minimum viable region detection
US11501445B2 (en) 2018-10-30 2022-11-15 Mujin, Inc. Robotic system with automated package scan and registration mechanism and methods of operating the same
US11636605B2 (en) 2018-10-30 2023-04-25 Mujin, Inc. Robotic system with automated package registration mechanism and minimum viable region detection
US11780101B2 (en) 2018-10-30 2023-10-10 Mujin, Inc. Automated package registration systems, devices, and methods
US11797926B2 (en) 2018-10-30 2023-10-24 Mujin, Inc. Robotic system with automated object detection mechanism and methods of operating the same
US11961042B2 (en) 2018-10-30 2024-04-16 Mujin, Inc. Robotic system with automated package registration mechanism and auto-detection pipeline
CN110115849A (en) * 2019-04-30 2019-08-13 Xiamen University Control method, system and terminal device for a small puppet robot
CN113043325B (en) * 2019-12-27 2022-08-16 Shenyang SIASUN Robot & Automation Co., Ltd. Method and device for detecting motion state of robot joint
CN113043325A (en) * 2019-12-27 2021-06-29 Shenyang SIASUN Robot & Automation Co., Ltd. Method and device for detecting motion state of robot joint

Similar Documents

Publication Publication Date Title
CN106886165A (en) A simulator-based manipulator grasping and operation method
Newbury et al. Deep learning approaches to grasp synthesis: A review
Popović et al. A strategy for grasping unknown objects based on co-planarity and colour information
JP5743499B2 (en) Image generating apparatus, image generating method, and program
Kang et al. Toward automatic robot instruction from perception-mapping human grasps to manipulator grasps
Ekvall et al. Learning and evaluation of the approach vector for automatic grasp generation and planning
JP7022076B2 (en) Image recognition processors and controllers for industrial equipment
CN109015640B (en) Grabbing method, grabbing system, computer device and readable storage medium
JP5458885B2 (en) Object detection method, object detection apparatus, and robot system
CN110355754A (en) Robot eye system, control method, equipment and storage medium
Eizicovits et al. Efficient sensory-grounded grasp pose quality mapping for gripper design and online grasp planning
CN108712946A (en) Cargo arrangement method, device, system, electronic equipment and readable storage medium
Jiang et al. Learning hardware agnostic grasps for a universal jamming gripper
CN105196290B (en) Real-time robot Grasp Planning
WO2020190166A1 (en) Method and system for grasping an object by means of a robotic device
Moisio et al. Model of tactile sensors using soft contacts and its application in robot grasping simulation
CN109213202A (en) Cargo arrangement method, device, equipment and storage medium based on optical servo
Lepora et al. Pose-based tactile servoing: Controlled soft touch using deep learning
Rydén et al. A method for constraint-based six degree-of-freedom haptic interaction with streaming point clouds
Cao et al. Fuzzy-depth objects grasping based on fsg algorithm and a soft robotic hand
Song et al. Learning optimal grasping posture of multi-fingered dexterous hands for unknown objects
Caselli et al. Haptic object recognition with a dextrous hand based on volumetric shape representations
CN117103277A (en) Mechanical arm sensing method based on multi-modal data fusion
Marchionne et al. GNC architecture solutions for robust operations of a free-floating space manipulator via image based visual servoing
Kawasaki et al. Virtual robot teaching for humanoid hand robot using multi-fingered haptic interface

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20170623