US20230105746A1 - Systems and methods for robotic control under contact - Google Patents
- Publication number
- US20230105746A1 (U.S. application Ser. No. 18/077,135)
- Authority
- US
- United States
- Prior art keywords
- grasp
- robot
- determining
- behavior
- physical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- B25J9/023—Cartesian coordinate type manipulators
- B25J9/026—Gantry-type
- B25J9/046—Revolute coordinate type
- B25J9/0012—Constructional details making use of synthetic construction materials, e.g. plastics, composites
- B25J9/0087—Dual arms
- B25J9/12—Positioning means for manipulator elements, electric
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
- B25J9/163—Control loop learning, adaptive, model based, rule based expert control
- B25J9/1633—Control loop compliant, force, torque control, e.g. combined with position control
- B25J9/1653—Control loop parameters identification, estimation, stiffness, accuracy, error analysis
- B25J9/1664—Planning systems characterised by motion, path, trajectory planning
- B25J9/1666—Avoiding collision or forbidden zones
- B25J9/1669—Planning systems characterised by special application, e.g. multi-arm co-operation, assembly, grasping
- B25J9/1676—Safety, monitoring, diagnostic: avoiding collision or forbidden zones
- B25J9/1682—Dual arm manipulator; Coordination of several manipulators
- B25J9/1694—Use of sensors other than normal servo-feedback, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
- B25J13/082—Grasping-force detectors
- B25J13/084—Tactile sensors
- B25J13/085—Force or torque sensors
- B25J13/087—Sensing other physical parameters, e.g. electrical or chemical properties
- B25J13/088—Position, velocity or acceleration sensors
- B25J15/0038—Cylindrical gripping surfaces
- B25J15/0233—Articulated grippers actuated by chains, cables or ribbons
- B25J15/08—Gripping heads and other end effectors having finger members
- B65G67/02—Loading or unloading land vehicles
- G05B19/4155—Numerical control [NC] characterised by programme execution
- G05B2219/39343—Force based impedance control
- G05B2219/39346—Workspace impedance control
- G05B2219/39347—Joint space impedance control
- G05B2219/39348—Generalized impedance control
- G05B2219/39528—Measuring, gripping force sensor built into hand
- G05B2219/39532—Gripping force sensor built into finger
- G05B2219/39536—Planning of hand motion, grasping
- G05B2219/40272—Manipulator on slide, track
- G05B2219/40293—Gantry, portal
- G05B2219/40627—Tactile image sensor, matrix, array of tactile elements, tixels
- G05B2219/50391—Robot
Definitions
- the embodiments described herein are related to robotic control systems, and more specifically to a robotic software system for accurate control of robots that physically interact with various objects in their environments, while simultaneously incorporating the force feedback from these physical interactions into a “control policy”.
- robots perform many kinds of manipulation on a daily basis. They lift massive objects, move with blurring speed, and repeat complex performances with unerring precision. Yet outside of these carefully controlled robot realms, even the most sophisticated robot would be unable to perform many tasks that involve contact with other objects. Everyday manipulation tasks would stump conventionally controlled robots. As such, outside of controlled environments, robots have only performed sophisticated manipulation tasks when operated by a human.
- robots have performed sophisticated manipulation tasks such as grasping multifaceted objects, tying knots, carrying objects around complex obstacles, and extracting objects from piles of entangled objects.
- the control algorithms for these demonstrations often employ search algorithms to find satisfactory solutions, such as a path to a goal state, or a configuration of a gripper that maximizes a measure of grasp quality against an object.
- robots have only performed sophisticated manipulation tasks when operated by a human. Through teleoperation, even highly complex humanoid robots have performed a variety of challenging everyday manipulation tasks, such as grasping everyday objects, using a power drill, throwing away trash, and retrieving a drink from a refrigerator.
- a system comprises a database; at least one hardware processor coupled with the database; and one or more software modules that, when executed by the at least one hardware processor: receive at least one of sensory data from a robot and images from a camera; identify and build models of objects in an environment, wherein a model encompasses immutable properties of identified objects, including mass and geometry, and wherein the geometry is assumed not to change; estimate the state, including position, orientation, and velocity, of the identified objects; determine, based on the state and model, potential configurations, or pre-grasp poses, for grasping the identified objects, returning multiple grasping configurations per identified object; determine an object to be picked based on a quality metric; translate the pre-grasp poses into behaviors that define motor forces and torques; and communicate the motor forces and torques to the robot in order to allow the robot to perform a complex behavior generated from the behaviors.
- FIG. 1 is a diagram illustrating an example environment in which a robot can be controlled in accordance with one embodiment.
- FIG. 2 is a diagram illustrating an example robot that can be used in the environment of FIG. 1 and controlled in accordance with one embodiment.
- FIG. 3 is a diagram illustrating the software modules for controlling a robot within an environment such as depicted in FIG. 1 , in accordance with one example embodiment.
- FIG. 3 illustrates a process for controlling machines to accurately manipulate objects that can be modeled by rigid body kinematics and dynamics, e.g., bottles, silverware, chairs, etc., and the software modules configured to implement the process in accordance with one embodiment.
- Rigid body dynamics can be used to accurately model even non-stiff objects, like a rubber ball, due to the accuracy obtainable by a robot’s sensors during real-time operation.
- robots can be made to, e.g., move furniture, pick up a pen, or use a wrench to tighten a bolt.
- a behavior consists of a control policy that maps estimates of the state of the world to motor signals and, optionally, a deliberative component. For example, controlling a robot to reach toward an object, a “reach” behavior, in order to grasp it, requires planning a path through space such that the robot will not inadvertently collide with itself or its environment. After the plan has been computed, it can be input to an impedance controller, i.e., the control policy, sequentially until the planned motion is complete, which means either that the plan has been successfully executed or that execution failed.
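The control-policy half of a behavior can be sketched in a few lines. The following illustrative Python shows a joint-space impedance controller fed the planned setpoints sequentially; the gains, the unit-mass plant, and the example plan are assumptions for the sketch, not values from the patent.

```python
# Sketch of a behavior's control policy: an impedance law mapping state
# estimates (q, qd) and planned setpoints to motor torques. Gains and the
# unit-mass plant are illustrative assumptions.

def impedance_control(q, qd, q_des, qd_des, kp=50.0, kd=10.0):
    """tau = Kp*(q_des - q) + Kd*(qd_des - qd), per joint."""
    return [kp * (qi_des - qi) + kd * (vi_des - vi)
            for qi, vi, qi_des, vi_des in zip(q, qd, q_des, qd_des)]

def execute_plan(plan, q, qd, dt=0.01, steps_per_point=200):
    """Feed planned setpoints to the controller sequentially, simulating a
    unit-mass joint with semi-implicit Euler integration."""
    for q_des in plan:
        for _ in range(steps_per_point):
            tau = impedance_control(q, qd, q_des, [0.0] * len(q))
            qd = [v + t * dt for v, t in zip(qd, tau)]   # unit mass: a = tau
            q = [p + v * dt for p, v in zip(q, qd)]
    return q

final = execute_plan(plan=[[0.5], [1.0]], q=[0.0], qd=[0.0])
```

With the gains above the simulated joint settles close to the last planned setpoint, illustrating "sequentially until the planned motion is complete."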
- Grasping is a key component of the system 300 illustrated in FIG. 3 . Grasping is required before performing any other manipulation tasks, e.g., it is necessary to pick up the wrench to tighten the bolt.
- the grasping system i.e., software and hardware configured to control the robot uses a database (120 in FIG. 1 ), the “grasp database”, to determine how various objects should be grasped.
- for every possible object that the robot needs to grasp, the grasp database 120 provides the desired relative poses (or other proxies for this information, such as key points for the robot 104 or the manipuland) between the manipuland, i.e., the object to be grasped, and each of the robot links that is to contact the manipuland, for the following stages of grasping: pre-grasp, the configuration prior to the grasp; grasping, the configuration during the grasp; and release, the configuration subsequent to releasing the manipuland.
- the grasp database 120 informs the reach behavior as to how the robot should move toward the object in order to grasp it.
- the reach behavior queries the grasp database 120 for the pre-grasp configuration.
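The lookup the reach behavior performs can be sketched as a keyed table of stage poses per object. The object name, the (x, y, z, yaw) pose encoding, and all numeric values below are hypothetical placeholders, not data from the patent.

```python
# Sketch of the grasp database: per object, desired relative gripper poses for
# the pre-grasp, grasp, and release stages. Pose encoding and values are
# hypothetical.
from dataclasses import dataclass

@dataclass
class GraspEntry:
    pre_grasp: tuple   # gripper pose relative to the manipuland, before grasping
    grasp: tuple       # pose during the grasp
    release: tuple     # pose after releasing the manipuland

GRASP_DB = {
    "bottle": GraspEntry(pre_grasp=(0.0, 0.0, 0.15, 0.0),
                         grasp=(0.0, 0.0, 0.05, 0.0),
                         release=(0.0, 0.0, 0.20, 0.0)),
}

def query_pre_grasp(object_id):
    """What the reach behavior asks the database for."""
    return GRASP_DB[object_id].pre_grasp
```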
- the robot plans a motion free of contact with the environment, excepting the manipuland, using any one of a number of motion planning algorithms, like RRT. It should be noted that the choice of algorithm will affect only the time it takes to find a solution.
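The patent names RRT only as one example planner. The following is a minimal 2-D RRT sketch; the workspace bounds, step size, goal bias, and the always-free collision check are illustrative placeholders for a real configuration space and a real collision checker.

```python
# Minimal RRT sketch: grow a tree from the start by steering toward random
# samples (with goal bias), then read the path back through parent links.
# The trivial is_free stands in for a real collision check.
import math, random

def rrt(start, goal, is_free, step=0.5, goal_bias=0.2, iters=5000, seed=0):
    rng = random.Random(seed)
    nodes, parent = [start], {start: None}
    for _ in range(iters):
        sample = goal if rng.random() < goal_bias else \
            (rng.uniform(0.0, 10.0), rng.uniform(0.0, 10.0))
        near = min(nodes, key=lambda n: math.dist(n, sample))
        d = math.dist(near, sample)
        if d == 0.0:
            continue
        new = tuple(a + min(step, d) / d * (b - a)
                    for a, b in zip(near, sample))
        if not is_free(new):
            continue
        nodes.append(new)
        parent[new] = near
        if math.dist(new, goal) <= step:      # close enough: extract the path
            path = [new]
            while parent[path[-1]] is not None:
                path.append(parent[path[-1]])
            return path[::-1]
    return None

path = rrt(start=(1.0, 1.0), goal=(9.0, 9.0), is_free=lambda p: True)
```

As the surrounding text notes, a different sampling-based planner would change only how long the search takes, not the downstream trajectory-generation steps.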
- once a contact-free path, represented as a set of points in the robot's configuration space, has been obtained, polynomial splines are fit to the points, yielding a trajectory, i.e., a time-dependent path.
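The path-to-trajectory step can be sketched with piecewise cubic polynomials. This sketch fits C1-continuous cubic Hermite segments with finite-difference tangents for a single joint; the one-second-per-segment timing and the waypoint values are illustrative assumptions (the patent does not prescribe a spline family).

```python
# Turn a list of configuration-space waypoints into a time-dependent
# trajectory q(t) using cubic Hermite segments. Single-joint (scalar) case;
# timing is an illustrative assumption.

def hermite_trajectory(points, seg_time=1.0):
    """Return q(t) interpolating the waypoints with C1-continuous cubics."""
    n = len(points)
    # finite-difference tangents, zero at the endpoints for a gentle start/stop
    tang = [0.0] + [(points[i + 1] - points[i - 1]) / (2 * seg_time)
                    for i in range(1, n - 1)] + [0.0]
    def q(t):
        i = min(int(t / seg_time), n - 2)   # which segment
        s = t / seg_time - i                # normalized time in [0, 1]
        h00 = 2*s**3 - 3*s**2 + 1
        h10 = s**3 - 2*s**2 + s
        h01 = -2*s**3 + 3*s**2
        h11 = s**3 - s**2
        return (h00 * points[i] + h10 * seg_time * tang[i]
                + h01 * points[i + 1] + h11 * seg_time * tang[i + 1])
    return q

traj = hermite_trajectory([0.0, 0.5, 0.2, 0.8])
```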
- the reach behavior converts the trajectory in the robot’s joint-space into operational or task space.
- a learning approach can be used to map a model or partial model, i.e., only the parts of the model that are observable, of the object's geometry and apparent surface properties, e.g., friction, to an appropriate way to grasp it, as opposed to or in addition to using the grasp database 120.
- Behaviors can be executed sequentially, one immediately following the other; in parallel; or both. When executed in parallel, the outputs from all behaviors or motor signals are summed or combined.
- a state machine acts to switch between combinations of active behaviors, inactive behaviors output zero motor signals, at regular intervals, e.g., at 100 Hz, as a function of behaviors' conditions, which the programmer defines on a per-behavior basis.
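The switching scheme can be sketched as a per-state active set evaluated every control tick, with inactive behaviors contributing zero. The state names, the two-motor signals, and the behaviors' outputs below are hypothetical stand-ins for real policies.

```python
# Sketch of behavior switching: each control tick (e.g., at 100 Hz) evaluates
# all behaviors; inactive ones output zero motor signal; outputs are summed.
# States, behaviors, and signals are hypothetical.

N_MOTORS = 2

class Behavior:
    def __init__(self, name, signal):
        self.name, self.signal = name, signal
    def output(self, world_state, active):
        return self.signal(world_state) if active else [0.0] * N_MOTORS

reach = Behavior("reach", lambda s: [1.0, 0.0])
grasp = Behavior("grasp", lambda s: [0.0, 2.0])

ACTIVE = {"idle": set(), "reach": {"reach"}, "grasp": {"grasp"},
          "transport": {"reach", "grasp"}}     # transport = reach + grasp

def tick(state_name, world_state):
    """One control tick: sum the outputs of all behaviors."""
    u = [0.0] * N_MOTORS
    for b in (reach, grasp):
        out = b.output(world_state, b.name in ACTIVE[state_name])
        u = [a + c for a, c in zip(u, out)]
    return u
```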
- a state machine for picking objects with the robot might consist of reaching and grasping behaviors.
- the state machine would be initialized to an idle state. After receiving a signal with a target object to pick from an operator, which could be human or a different software program, the state machine would transition to the reach node, activating the reach behavior.
- the reach behavior would generate a trajectory to pick the object, using the grasp database 120 described above, and then execute it.
- when the execution of the reach trajectory has completed, the state machine would transition to the pre-grasp state, keeping the reach behavior active.
- the reach behavior would then generate a trajectory to the pre-grasp configuration, using the grasp database 120 described above, and execute it.
- when the execution of that trajectory has completed, the state machine would transition to the grasp state, activating a grasp behavior and deactivating the reach behavior.
- the grasp behavior uses a fixed control policy and the grasp database 120 to move the hand or gripper from the pre-grasp configuration to the grasping configuration.
- the state machine would transition to the transport state, activating the reach behavior, in addition to the grasp behavior that is already activated.
- the reach behavior again generates and executes a trajectory to effect this goal, at which point the state machine will have successfully executed the pick action.
- this sequence of executed behaviors represents a successful pick.
- the state machine provides contingent operations for when one of the behaviors fails to effect its goal due to, e.g., errors in estimating the state of the manipuland or environment or imprecision in controlling the robot. For example, if the object slips from the grasp while the final sequence of the pick is executing, the state machine transitions back to the pre-grasp state, beginning the sequence anew.
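The contingent operations amount to a transition table keyed on (state, event), with failure events routing back to an earlier state. The state names follow the example sequence above; the event names and the table itself are illustrative.

```python
# Sketch of the contingent state machine for picking: failure events (e.g.,
# the object slipping from the grasp during transport) route back to the
# pre-grasp state to begin the sequence anew. Event names are illustrative.

TRANSITIONS = {
    ("idle", "pick_requested"): "reach",
    ("reach", "trajectory_done"): "pre_grasp",
    ("pre_grasp", "trajectory_done"): "grasp",
    ("grasp", "grasp_secure"): "transport",
    ("transport", "trajectory_done"): "idle",   # pick completed
    ("transport", "grasp_lost"): "pre_grasp",   # contingency: slip, retry
}

def step(state, event):
    """Advance the state machine; unrecognized events leave the state alone."""
    return TRANSITIONS.get((state, event), state)
```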
- FIG. 3 is a diagram of example software modules that can be configured to effect the process described above. As noted with respect to FIG. 3 , these modules can be implemented on, e.g., the processor system 550 illustrated in FIG. 1 , in order to create a specialized robot control system that provides accurate control of robots that can now, autonomously or semi-autonomously, physically interact with various objects in their environments.
- a “model” encompasses immutable properties of an object, like mass and geometry. Since the objects are typically stiff, geometries do not generally change, but the systems and methods described herein do not rely upon this assumption.
- the outputs of these processes can then be fed into the other software modules, which consist of a grasp generator 306 that can determine potential configurations, or pre-grasp poses, for grasping the objects that have been identified in the environment.
- This process returns multiple grasping configurations, grasp data options, per identified object.
- the grasp selector 308 can be configured to choose among the various grasp data options.
- a quality metric, e.g., which grasp requires the robot to move the least, can be computed for each option, and the grasp, and associated object, with the highest quality is selected; absent such a metric, which object should be picked is arbitrary.
- a human operator can select a target object from a user interface 310 , and the highest quality grasp associated with that target will be used.
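The selection step can be sketched as scoring every (object, grasp) option and taking the best. The least-motion metric matches the example in the text; the candidate objects and configurations are hypothetical.

```python
# Sketch of the grasp selector: score each (object, grasp configuration)
# option with a quality metric (here, smaller joint-space displacement from
# the current configuration is better) and pick the best. Data are
# hypothetical.
import math

def least_motion_quality(current_q, grasp_q):
    """Higher quality for grasps requiring the robot to move less."""
    return -math.dist(current_q, grasp_q)

def select_grasp(current_q, options):
    """options: list of (object_id, grasp_configuration) pairs."""
    return max(options, key=lambda o: least_motion_quality(current_q, o[1]))

best = select_grasp([0.0, 0.0],
                    [("mug", [1.0, 1.0]), ("bottle", [0.2, 0.1])])
```

A human operator overriding the choice, as described above, simply replaces the `max` over all options with a `max` over the grasps of the operator's target object.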
- behaviors 312a, 312b, and 312c interact to perform the pick and place task.
- the behavior outputs, labeled "u" in the diagram, represent motor forces/torques, and are summed together ("fused") and sent to the robot. Combinations of these behaviors permit complex behavior to emerge. For example, transporting, as described above, emerges from the interactions between the grasp and reach behaviors: the grasp behavior maintains the grasp on the object while the reach behavior is responsible for moving the robot's end effector to a pose where the object will be placed.
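The fusion itself is just element-wise summation of the motor force/torque vectors. In this sketch the two "u" vectors are illustrative values, not outputs of real policies.

```python
# Fusing behavior outputs: element-wise sum of motor force/torque vectors, so
# transport emerges as u = u_grasp + u_reach. Values are illustrative.

def fuse(*outputs):
    return [sum(parts) for parts in zip(*outputs)]

u_grasp = [0.0, 0.0, 3.0]    # squeeze to maintain the grasp
u_reach = [1.5, -0.5, 0.0]   # move the end effector toward the place pose
u = fuse(u_grasp, u_reach)
```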
- the system can be applied to any robot for which inertial (dynamics), shape, and appearance models of the robot are available.
- the system’s model can be built using a combination of CAD/CAE and system identification to determine best-fit parameters through physical experimentation.
- Dynamics studies the movement of systems of interconnected bodies under the action of external forces.
- the dynamics of a multi-body system are described by the laws of kinematics and by the application of Newton’s second law (kinetics) or their derivative form Lagrangian mechanics.
- the solution of these equations of motion provides a description of the position, velocity, and acceleration of the individual components of the system, and of the system as a whole, as a function of time.
- the model consists of the following information, at minimum: the object mass, inertia matrix, i.e., set of six non-negative values that predict how an object rotates as a function of torques applied to the object; center-of-mass location; “undeformed” geometry, i.e., the shape of the object when it is not subject to any forces from loading; material stiffness, dry friction coefficient(s); visual appearance through, e.g., a bidirectional reflectance distribution function; and, if the object is articulated, then location; type, e.g., universal, prismatic, hinge; and parameters, e.g., directional axis of any joints.
- This information can be gathered from direct measurement, estimation, or both.
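A minimal container for the model fields listed above might look like the following; the field names, types, and defaults are illustrative assumptions, not the patent's data layout.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectModel:
    """Immutable object properties, per the list above."""
    mass: float                       # kg, e.g., from weighing the object
    inertia: tuple                    # six non-negative inertia values
    center_of_mass: tuple             # (x, y, z) in the object frame
    geometry: object = None           # "undeformed" shape, e.g., a mesh
    stiffness: float = 0.0            # material stiffness (Young's modulus)
    friction: float = 0.5             # dry friction coefficient
    joints: list = field(default_factory=list)  # articulation, if any

# A hypothetical sealed can, modeled as a rigid, unarticulated object.
can = ObjectModel(mass=0.4, inertia=(1e-3, 1e-3, 5e-4, 0.0, 0.0, 0.0),
                  center_of_mass=(0.0, 0.0, 0.05))
```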
- a user can create a geometric description of the object manually using 3D modeling or computer-aided engineering software or automatically using a 3D scanner.
- the object mass can be determined, e.g., by weighing the object; density information, known from material composition, and a geometric model can be input to an existing algorithm, such as described in B. Mirtich, Fast and accurate computation of polyhedral mass properties, J. Graphics Tools, Vol. 1, 1996, which will return the center-of-mass and inertia matrix.
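For the special case of a uniform-density box, the center of mass and inertia that the cited polyhedral algorithm would return reduce to closed-form expressions; the density and dimensions below are hypothetical.

```python
def box_mass_properties(density, lx, ly, lz):
    """Closed-form mass properties of a uniform-density box with side
    lengths lx, ly, lz (a simple special case of the general polyhedral
    computation)."""
    m = density * lx * ly * lz
    # Standard diagonal inertia terms about the center of mass.
    ixx = m * (ly ** 2 + lz ** 2) / 12.0
    iyy = m * (lx ** 2 + lz ** 2) / 12.0
    izz = m * (lx ** 2 + ly ** 2) / 12.0
    com = (lx / 2.0, ly / 2.0, lz / 2.0)  # measured from one corner
    return m, (ixx, iyy, izz), com

m, inertia, com = box_mass_properties(density=1000.0, lx=0.2, ly=0.1, lz=0.1)
```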
- the material stiffness can be estimated using ubiquitous tables, provided in engineering reference books, listing Young’s Modulus for various materials.
- FIG. 1 is a diagram illustrating an example environment 100 in which the systems and methods described herein can be implemented.
- the system can use at least one RGB-D (color + depth) camera 105, or a similar sensor using electromagnetic radiation, e.g., LIDAR, aimed into the workspace 102 in which the robot 104 will be operating.
- the camera can be used for example to determine pose for various objects 108 .
- poses can be determined or estimated using radio/electromagnetic wave triangulation, motion capture, etc.
- every joint 106 of the robot 104 is instrumented with a position sensor (not shown), such as an optical encoder.
- force/torque sensors (not shown), such as a 6-axis force/torque sensor, can be used to sense forces acting on the links of the robot 104 .
- the sensors can be placed inline between two rigid links affixed together.
- tactile skins over the surface of the robot’s rigid links can be used to precisely localize pressures from contact arising between the robot and objects (or people) in the environment.
- the camera(s) 105 and sensors 106 can be wired and/or wirelessly communicatively coupled to a back end server or servers, comprising one or more processors running software as described above and below, which in turn can be wired and/or wirelessly communicatively coupled with one or more processors included in robot 104 .
- the server processors run various programs and algorithms 112 that identify, within images of the workspace 102 , objects 108 that the system has been trained to identify.
- a camera image may contain a corrugated box, a wrench, and a sealed can of vegetables, all of which can be identified and added to a model containing the objects in the camera's sightline and the robot's vicinity.
- Server(s) 110 can be local or remote from workspace 102 .
- the one or more programs/algorithms can be included in robot 104 and can be run by the one or more processors included in robot 104 .
- the programs/algorithms 112 can include deep neural networks that perform bounding box identification from camera 105 (RGB) images to identify and demarcate, with overlaid boxes, every object in a particular image that the system has been trained to manipulate and observe.
- This can also encompass software specialized at identifying certain objects, like corrugated cardboard boxes, using algorithms like edge detectors, and using multiple camera views (e.g., calibrated stereo cameras) in order to get the 3D position of points in the 2D camera image.
- In some instances, the 2D bounding box from a single camera is sufficient to estimate the object's 3D pose. In other instances, objects instead belong to a class, e.g., corrugated cardboard boxes, such that object sizes can vary, and multiple camera views, e.g., from a calibrated stereo camera setup, are needed to establish correspondences between points in the 2D camera images and points in 3D.
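A textbook sketch of that 2D-to-3D step for a calibrated, rectified stereo pair is depth from disparity; the focal length, baseline, and pixel coordinates below are hypothetical, and this is not necessarily the specific correspondence method the system uses.

```python
def triangulate(fx, baseline, x_left, x_right, y, cx, cy):
    """Back-project a matched pixel pair from a rectified stereo rig to a
    3D point in the left camera frame (square pixels assumed)."""
    disparity = x_left - x_right          # pixels; larger means closer
    z = fx * baseline / disparity         # depth along the optical axis
    x = (x_left - cx) * z / fx
    y3 = (y - cy) * z / fx
    return x, y3, z

# Hypothetical rig: 500 px focal length, 0.1 m baseline, centered optics.
point = triangulate(fx=500.0, baseline=0.1, x_left=320.0, x_right=300.0,
                    y=240.0, cx=320.0, cy=240.0)
```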
- State-of-the-art techniques for training these neural networks use domain randomization to allow objects to be recognized under various lighting conditions, backgrounds, and even object appearances.
- Function approximation e.g., deep neural networks trained on synthetic images, or a combination of function approximation and state estimation algorithms, can be used to estimate objects’ 3D poses, or to estimate the value of a different representation, like keypoints, that uniquely determines the location and orientation of essentially rigid objects from RGB-D data.
- a Bayesian filter like a “particle filter” can fuse the signals from force sensors with the pose estimates output from a neural network in order to track the object’s state position and velocity.
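A minimal one-dimensional version of such a particle filter, tracking position only, might look like this; the noise levels and measurement model are illustrative assumptions, and a force-sensor signal would enter the update step as an additional likelihood factor.

```python
import math
import random

def particle_filter_step(particles, measurement, motion=0.0,
                         process_noise=0.05, meas_std=0.1):
    """One predict/update/resample cycle of a 1D particle filter."""
    # Predict: propagate each particle through a simple motion model.
    particles = [p + motion + random.gauss(0.0, process_noise)
                 for p in particles]
    # Update: weight each particle by the likelihood of the measurement,
    # e.g., a neural-network pose estimate.
    weights = [math.exp(-0.5 * ((p - measurement) / meas_std) ** 2)
               for p in particles]
    # Resample particles in proportion to their weights.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)  # repeatable run
particles = [random.uniform(-1.0, 1.0) for _ in range(500)]
for _ in range(10):
    particles = particle_filter_step(particles, measurement=0.3)
estimate = sum(particles) / len(particles)  # concentrates near 0.3
```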
- Function approximation, e.g., deep neural networks trained on camera images, can be used to estimate a dynamic model, e.g., inertia, friction, etc., and a geometric (shape) model of all novel objects that are tracked by the system, e.g., using the bounding boxes.
- the coffee can example used above might not require this process, because it is reasonable to expect that every coffee can is identical to within the limitations of the accuracy brought to bear on the problem, i.e., the accuracy provided by the sensors and required by the control system.
- boxes will exhibit different surface friction depending on the material of the box, e.g., corrugated, plastic, etc., and the location on the box. For example, if there is a shipping label placed on part of the box, then this can affect surface friction.
- a neural network can infer the geometry of the obscured part of a box from a single image showing part of the box.
- a kinematic model of the object can be estimated as well. Examples include doors, cabinets, drawers, ratchets, steering wheels, bike pumps, etc.
- the ground and any other stationary parts of the environment are modeled as having infinite inertia, making them immobile.
- Function approximation e.g., deep neural networks trained on pressure fields, can be used to estimate the 3D poses of objects that the robot is contacting and thereby possibly obscuring the RGB-D sensor normally used for this purpose.
- Kinematic commands (“desireds”) for the robot can be accepted for each object that the robot attempts to manipulate.
- the desireds can come from behaviors.
- a behavior can be either a fast-to-compute reactive policy, such as a look up table that maps, e.g., the joint estimated state of the robot and manipuland to a vector of motor commands, or can include deliberative components, or planners, e.g., a motion planner that determines paths for the robot that do not result in contact with the environment.
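The fast reactive case might be sketched as a lookup table over a discretized joint state; the discretization, table contents, and two-joint example are illustrative assumptions.

```python
def make_reactive_policy(table, resolution=0.1, n_joints=2):
    """Build a reactive policy: map an estimated state to motor commands by
    rounding the state onto a grid and looking up a stored command vector."""
    def policy(state):
        key = tuple(round(s / resolution) for s in state)
        return table.get(key, [0.0] * n_joints)  # zero command if unseen
    return policy

# Hypothetical table: near state (0.0, 0.5), push the second joint toward
# the manipuland with 1.5 N-m of torque.
table = {(0, 5): [0.0, 1.5]}
policy = make_reactive_policy(table)
u = policy([0.04, 0.52])  # rounds onto grid cell (0, 5)
```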
- the output will be a time-indexed trajectory that specifies position and derivatives for the robot and any objects that the robot wants to manipulate.
- the planner can use high level specifications, e.g., put the box on the table, to compute the output trajectories. This process is where motion planning comes into play.
- the forces necessary to apply to the robot's actuators can be computed in order to produce forces on contacting objects, and thereby move both them and the robot as commanded.
- the sensed forces on the robot can be compared against the forces predicted by the dynamics model. If, after applying some filtering as necessary, the forces are largely different, the robot can halt its current activity and act to re-sense its environment, i.e., reconcile its understanding of the state of its surroundings with the data it is perceiving. For example, a grasped 1 kg box might slip from the grasp of the robot's end effector while the robot is picking the object.
- the robot's end effector would accelerate upward, since less force would be pulling the end effector downward, while the dynamics model, which assumes the object is still within the robot's grasp, might predict that the end effector would remain at a constant vertical position.
- When the disparity between the actual end-effector acceleration and the predicted end-effector acceleration becomes greater than the model's bounds of accuracy, it becomes clear that the estimated model state is direly incorrect.
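The resulting halt-and-re-sense trigger can be sketched as a threshold check on the difference between (filtered) sensed forces and the dynamics model's predictions; the tolerance and the force values in the slipped-box example are illustrative.

```python
def check_model_consistency(sensed, predicted, tolerance=2.0):
    """Compare sensed against model-predicted forces (newtons); when they
    differ by more than the tolerance, the robot should halt and re-sense
    its environment rather than continue acting on a stale state estimate."""
    worst = max(abs(s - p) for s, p in zip(sensed, predicted))
    return "halt_and_resense" if worst > tolerance else "continue"

# The grasped 1 kg box slips: the model still predicts ~9.8 N of load on
# the end effector, but the wrist sensor suddenly reads near zero.
action = check_model_consistency(sensed=[0.2], predicted=[9.8])
```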
- controllers use gains to determine how quickly errors should be corrected; stiff (large) gains correct errors quickly at the expense of possible damage to the robot or environment if the error is due to inadvertent contact between the robot and environment.
- All such open parameters, which are state-dependent, i.e., they generally should change dynamically in response to the conditions of the robot and environment, are optimally computed to maximize the robot's task performance by solving an optimal control problem. Since the optimal control problem generally requires too much computation to solve, even offline, approximations are computed instead.
- Approximations include using dynamic programming along with discretizing the state and action spaces and reinforcement learning algorithms, e.g., the policy gradient algorithm.
- Our system uses simulations, given the detailed physical models previously described, to perform these optimizations and compute performant parameters offline. Further optimization can be performed online: parameters can be adjusted based on actual task performance, measured using sensory data. Such transfer learning can even use performance of similar, but not identical, robots on similar, but not identical, tasks in order to adjust parameters.
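A coarse offline approximation of this parameter search can be sketched as scoring candidate controller gains in simulation and keeping the best; the quadratic stand-in below merely mimics a simulated task-performance score and is not a real physics rollout.

```python
def simulated_task_score(gain):
    """Stand-in for a simulated rollout: returns task performance for a
    candidate gain. The score here is an arbitrary quadratic peaking at
    gain = 40; a real score would come from the physics simulation."""
    return -(gain - 40.0) ** 2

def tune_offline(candidate_gains, score=simulated_task_score):
    """Pick the best-scoring parameter; online adjustment from measured
    task performance could then refine it further."""
    return max(candidate_gains, key=score)

best_gain = tune_offline([10.0, 20.0, 40.0, 80.0, 160.0])
```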
- FIG. 2 is a block diagram illustrating an example wired or wireless system 550 that can be used in connection with various embodiments described herein.
- the system 550 can be used to implement the robot control system described above and can comprise part of the robot 104 or backend servers 110 .
- the system 550 can be a server or any conventional personal computer, or any other processor-enabled device that is capable of wired or wireless data communication.
- Other computer systems and/or architectures may be also used, as will be clear to those skilled in the art.
- the system 550 preferably includes one or more processors, such as processor 560 .
- Additional processors may be provided, such as an auxiliary processor to manage input/output, an auxiliary processor to perform floating point mathematical operations, a special-purpose microprocessor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processing system (e.g., back-end processor), an additional microprocessor or controller for dual or multiple processor systems, or a coprocessor.
- System 550 can also include a tensor processing unit as well as motion planning processors or systems.
- auxiliary processors may be discrete processors or may be integrated with the processor 560 .
- processors which may be used with system 550 include, without limitation, the Pentium® processor, Core i7® processor, and Xeon® processor, all of which are available from Intel Corporation of Santa Clara, California.
- the processor 560 is preferably connected to a communication bus 555 .
- the communication bus 555 may include a data channel for facilitating information transfer between storage and other peripheral components of the system 550 .
- the communication bus 555 further may provide a set of signals used for communication with the processor 560 , including a data bus, address bus, and control bus (not shown).
- the communication bus 555 may comprise any standard or non-standard bus architecture such as, for example, bus architectures compliant with industry standard architecture (ISA), extended industry standard architecture (EISA), Micro Channel Architecture (MCA), peripheral component interconnect (PCI) local bus, or standards promulgated by the Institute of Electrical and Electronics Engineers (IEEE) including IEEE 488 general-purpose interface bus (GPIB), IEEE 696/5-100, and the like.
- System 550 preferably includes a main memory 565 and may also include a secondary memory 570 .
- the main memory 565 provides storage of instructions and data for programs executing on the processor 560, such as one or more of the functions and/or modules discussed above. It should be understood that programs stored in the memory and executed by processor 560 may be written and/or compiled according to any suitable language, including without limitation C/C++, Java, JavaScript, Perl, Visual Basic, .NET, and the like.
- the main memory 565 is typically semiconductor-based memory such as dynamic random access memory (DRAM) and/or static random access memory (SRAM). Other semiconductor-based memory types include, for example, synchronous dynamic random access memory (SDRAM), Rambus dynamic random access memory (RDRAM), ferroelectric random access memory (FRAM), and the like, including read only memory (ROM).
- the secondary memory 570 may optionally include an internal memory 575 and/or a removable medium 580, for example a floppy disk drive, a magnetic tape drive, a compact disc (CD) drive, a digital versatile disc (DVD) drive, other optical drive, a flash memory drive, etc.
- the removable medium 580 is read from and/or written to in a well-known manner.
- Removable storage medium 580 may be, for example, a floppy disk, magnetic tape, CD, DVD, SD card, etc.
- the removable storage medium 580 is a non-transitory computer-readable medium having stored thereon computer executable code (i.e., software) and/or data.
- the computer software or data stored on the removable storage medium 580 is read into the system 550 for execution by the processor 560 .
- secondary memory 570 may include other similar means for allowing computer programs or other data or instructions to be loaded into the system 550 .
- Such means may include, for example, an external storage medium 595 and an interface 590 .
- external storage medium 595 may include an external hard disk drive, an external optical drive, or an external magneto-optical drive.
- secondary memory 570 may include semiconductor-based memory such as programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable read-only memory (EEPROM), or flash memory (block oriented memory similar to EEPROM). Also included are any other removable storage media 580 and communication interface 590 , which allow software and data to be transferred from an external medium 595 to the system 550 .
- System 550 may include a communication interface 590 .
- the communication interface 590 allows software and data to be transferred between system 550 and, e.g., robot 104, camera 105, or other sensors, as well as external devices (e.g., printers), networks, or information sources.
- computer software or executable code may be transferred to system 550 from a network server via communication interface 590 .
- Examples of communication interface 590 include a built-in network adapter, a network interface card (NIC), a Personal Computer Memory Card International Association (PCMCIA) network card, a card bus network adapter, a wireless network adapter, a Universal Serial Bus (USB) network adapter, a modem, a wireless data card, a communications port, an infrared interface, an IEEE 1394 fire-wire interface, or any other device capable of interfacing system 550 with a network or another computing device.
- Communication interface 590 preferably implements industry promulgated protocol standards, such as Ethernet IEEE 802 standards, Fiber Channel, digital subscriber line (DSL), asynchronous digital subscriber line (ADSL), frame relay, asynchronous transfer mode (ATM), integrated digital services network (ISDN), personal communications services (PCS), transmission control protocol/Internet protocol (TCP/IP), serial line Internet protocol/point to point protocol (SLIP/PPP), and so on, but may also implement customized or non-standard interface protocols as well.
- Software and data transferred via communication interface 590 are generally in the form of electrical communication signals 605 . These signals 605 are preferably provided to communication interface 590 via a communication channel 600 .
- the communication channel 600 may be a wired or wireless network, or any variety of other communication links.
- Communication channel 600 carries signals 605 and can be implemented using a variety of wired or wireless communication means including wire or cable, fiber optics, conventional phone line, cellular phone link, wireless data communication link, radio frequency (“RF”) link, or infrared link, just to name a few.
- Computer executable code, i.e., computer programs or software, is stored in the main memory 565 and/or the secondary memory 570. Computer programs can also be received via communication interface 590 and stored in the main memory 565 and/or the secondary memory 570.
- Such computer programs when executed, enable the system 550 to perform the various functions of the present invention as previously described.
- computer readable medium is used to refer to any non- transitory computer readable storage media used to provide computer executable code (e.g., software and computer programs) to the system 550 .
- Examples of these media include main memory 565 , secondary memory 570 (including internal memory 575, removable medium 580 , and external storage medium 595 ), and any peripheral device communicatively coupled with communication interface 590 (including a network information server or other network device).
- These non-transitory computer readable mediums are means for providing executable code, programming instructions, and software to the system 550 .
- the software may be stored on a computer readable medium and loaded into the system 550 by way of removable medium 580 , I/O interface 585 , or communication interface 590 .
- the software is loaded into the system 550 in the form of electrical communication signals 605 .
- the software when executed by the processor 560 , preferably causes the processor 560 to perform the inventive features and functions previously described herein.
- I/O interface 585 provides an interface between one or more components of system 550 and one or more input and/or output devices.
- Example input devices include, without limitation, keyboards, touch screens or other touch-sensitive devices, biometric sensing devices, computer mice, trackballs, pen-based pointing devices, and the like.
- the input device can also be the camera 105 or other sensors within environment 102 as well as robot 104 .
- Examples of output devices include, without limitation, cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), printers, vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), and the like.
- the system 550 also includes optional wireless communication components that facilitate wireless communication over voice and data networks.
- the wireless communication components comprise an antenna system 610 , a radio system 615 and a baseband system 620 .
- the antenna system 610 may comprise one or more antennae and one or more multiplexors (not shown) that perform a switching function to provide the antenna system 610 with transmit and receive signal paths.
- received RF signals can be coupled from a multiplexor to a low noise amplifier (not shown) that amplifies the received RF signal and sends the amplified signal to the radio system 615 .
- the radio system 615 may comprise one or more radios that are configured to communicate over various frequencies.
- the radio system 615 may combine a demodulator (not shown) and modulator (not shown) in one integrated circuit (IC).
- the demodulator and modulator can also be separate components. In the incoming path, the demodulator strips away the RF carrier signal leaving a baseband receive audio signal, which is sent from the radio system 615 to the baseband system 620 .
- baseband system 620 decodes the signal and converts it to an analog signal. Then the signal is amplified and sent to a speaker.
- the baseband system 620 also receives analog audio signals from a microphone. These analog audio signals are converted to digital signals and encoded by the baseband system 620 .
- the baseband system 620 also codes the digital signals for transmission and generates a baseband transmit audio signal that is routed to the modulator portion of the radio system 615 .
- the modulator mixes the baseband transmit audio signal with an RF carrier signal generating an RF transmit signal that is routed to the antenna system and may pass through a power amplifier (not shown).
- the power amplifier amplifies the RF transmit signal and routes it to the antenna system 610 where the signal is switched to the antenna port for transmission.
- the baseband system 620 can also be communicatively coupled with the processor 560 .
- Radio system 615 can for example be used to communicate with robot 104 , camera 105 , as well as other sensors.
- the central processing unit 560 has access to data storage areas 565 and 570 .
- the central processing unit 560 is preferably configured to execute instructions (i.e., computer programs or software) that can be stored in the memory 565 or the secondary memory 570 .
- Computer programs can also be received from the baseband system 620 and stored in the data storage area 565 or in secondary memory 570 , or executed upon receipt. Such computer programs, when executed, enable the system 550 to perform the various functions of the present invention as previously described.
- data storage areas 565 may include various software modules (not shown).
- Various embodiments may also be implemented primarily in hardware using, for example, components such as application specific integrated circuits (ASICs), or field programmable gate arrays (FPGAs).
- a general-purpose processor can be a microprocessor, but in the alternative, the processor can be any processor, controller, microcontroller, or state machine.
- a processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- a software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium including a network storage medium.
- An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor.
- the processor and the storage medium can also reside in an ASIC.
- a component may be a stand-alone software package, or it may be a software package incorporated as a “tool” in a larger software product. It may be downloadable from a network, for example, a website, as a stand-alone product or as an add-in package for installation in an existing software application. It may also be available as a client-server software application, as a web-enabled software application, and/or as a mobile application.
Abstract
In variants, a method for robot control can include: receiving sensor data of a scene, modeling the physical objects within the scene, determining a set of potential grasp configurations for grasping a physical object within the scene, determining a reach behavior based on the potential grasp configuration, determining a trajectory for the reach behavior, and grasping the object using the trajectory.
Description
- This application is a continuation of U.S. Pat. Application No. 16/943,884, filed 30-JUL-2020, which claims priority under 35 U.S.C. 119(e) to U.S. Provisional Pat. Application No. 62/882,395, filed 02-AUG-2019, U.S. Provisional Pat. Application No. 62/882,396, filed 02-AUG-2019, and U.S. Provisional Pat. Application No. 62/882,397, filed 02-AUG-2019, the disclosures of which are hereby incorporated herein by reference in their entirety as if set forth in full.
- This application is also related to U.S. Pat. Application Serial No. 16/943,915, filed on 30-JUL-2020, entitled ROBOTIC MANIPULATORS, and is also related to U.S. Pat. Application Serial No. 16/944,020, filed on 30-JUL-2020, entitled ROBOTIC SYSTEM FOR PICKING AND PLACING OBJECTS FROM AND INTO A CONSTRAINED SPACE, all of which are incorporated herein by reference in their entirety as if set forth in full.
- The embodiments described herein are related to robotic control systems, and more specifically to a robotic software system for accurate control of robots that physically interact with various objects in their environments, while simultaneously incorporating the force feedback from these physical interactions into a “control policy”.
- It is currently very hard to build automated machines for manipulating objects of various shapes, sizes, inertias, and materials. Within factories robots perform many kinds of manipulation on a daily basis. They lift massive objects, move with blurring speed, and repeat complex performances with unerring precision. Yet outside of these carefully controlled robot realms, even the most sophisticated robot would be unable to perform many tasks that involve contact with other objects. Everyday manipulation tasks would stump conventionally controlled robots. As such, outside of controlled environments, robots have only performed sophisticated manipulation tasks when operated by a human.
- Within simulation, robots have performed sophisticated manipulation tasks such as grasping multifaceted objects, tying knots, carrying objects around complex obstacles, and extracting objects from piles of entangled objects. The control algorithms for these demonstrations often employ search algorithms to find satisfactory solutions, such as a path to a goal state, or a configuration of a gripper that maximizes a measure of grasp quality against an object.
- For example, many virtual robots use algorithms for motion planning that rapidly search for paths through a state space that describes the kinematics and dynamics of the world. Almost all of these simulations ignore the robot’s sensory systems and assume that the state of the world is known with certainty. As examples, they might be provided with greater accuracy of the objects’ states, e.g., positions and velocities, than is obtainable using state-of-the-art sensors, they might be provided with states for objects that, due to occlusions, are not visible to sensors, or both.
- In a carefully controlled environment, these assumptions can be met. For example, within a traditional factory setting, engineers can ensure that a robot knows the state of relevant objects in the world to accuracy sufficient to perform necessary tasks. The robot typically needs to perform a few tasks using a few known objects, and people are usually banned from the area while the robot is working. Mechanical feeders can enforce constraints on the pose of the objects to be manipulated. And in the event that a robot needs to sense the world, engineers can make the environment favorable to sensing by controlling factors such as the lighting and the placement of objects relative to the sensor. Moreover, since the objects and tasks are known in advance, perception can be specialized to the environment and task. Whether by automated planning or direct programming, robots perform exceptionally well in factories or other controlled environments. Within research labs, successful demonstrations of robots autonomously performing complicated manipulation tasks have relied on some combination of known objects, easily identified and tracked objects (e.g., a bright red ball), uncluttered environments, fiducial markers, or narrowly defined, task specific controllers.
- Outside of controlled settings, however, robots have only performed sophisticated manipulation tasks when operated by a human. Through teleoperation, even highly complex humanoid robots have performed a variety of challenging everyday manipulation tasks, such as grasping everyday objects, using a power drill, throwing away trash, and retrieving a drink from a refrigerator.
- But accurate control of robots that autonomously, physically interact with various objects in their environments has proved elusive.
- Systems and methods for controlling machines to accurately manipulate objects that can be effectively modeled by rigid body kinematics and dynamics are described herein.
- A system comprises a database; at least one hardware processor coupled with the database; and one or more software modules that, when executed by the at least one hardware processor: receive at least one of sensory data from a robot and images from a camera; identify and build models of objects in an environment, wherein the model encompasses immutable properties of identified objects including mass and geometry, and wherein the geometry is assumed not to change; estimate the state, including position, orientation, and velocity, of the identified objects; determine, based on the state and model, potential configurations, or pre-grasp poses, for grasping the identified objects and return multiple grasping configurations per identified object; determine an object to be picked based on a quality metric; translate the pre-grasp poses into behaviors that define motor forces and torques; and communicate the motor forces and torques to the robot in order to allow the robot to perform a complex behavior generated from the behaviors.
- These and other features, aspects, and embodiments are described below in the section entitled "Detailed Description."
-
FIG. 1 is a diagram illustrating an example environment in which a robot can be controlled in accordance with one embodiment; -
FIG. 2 is a diagram illustrating an example robot that can be used in the environment of FIG. 1 and controlled in accordance with one embodiment; and -
FIG. 3 is a diagram illustrating the software modules for controlling a robot within an environment such as depicted in FIG. 1, in accordance with one example embodiment. - Systems, methods, and non-transitory computer-readable media are disclosed for robot control. The disclosure and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and examples that are described and/or illustrated in the accompanying drawings and detailed in the following description. It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment can be employed with other embodiments, even if not explicitly stated herein. Descriptions of well-known components and processing techniques may be omitted so as to not unnecessarily obscure the embodiments of the disclosure. The examples used herein are intended merely to facilitate an understanding of ways in which the disclosure can be practiced and to further enable those of skill in the art to practice the embodiments of the disclosure. Accordingly, the examples and embodiments herein should not be construed as limiting the scope of the disclosure. Moreover, it is noted that like reference numerals represent similar parts throughout the several views of the drawings.
-
FIG. 3 illustrates a process for controlling machines to accurately manipulate objects that can be modeled by rigid body kinematics and dynamics, e.g., bottles, silverware, chairs, etc., and the software modules configured to implement the process in accordance with one embodiment. Rigid body dynamics can be used to accurately model even non-stiff objects, like a rubber ball, given the accuracy obtainable by a robot's sensors during real-time operation. With such software, robots can be made to, e.g., move furniture, pick up a pen, or use a wrench to tighten a bolt. - The highest-level construct of the software system can be termed a behavior. A behavior consists of a control policy that maps estimates of the state of the world to motor signals and, optionally, a deliberative component. For example, controlling a robot to reach toward an object in order to grasp it, a "reach" behavior, requires planning a path through space such that the robot will not inadvertently collide with itself or its environment. After the plan has been computed, it can be input to an impedance controller, i.e., the control policy, sequentially until the planned motion is complete, meaning either that the plan has been successfully executed or that execution failed. Various other kinds of control schemes will work as well, like admittance control, operational space control, etc.
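The behavior construct described above can be sketched as a small class pairing a deliberative step (planning) with a control policy (here, a joint-space impedance law). The class name, gain layout, and planner signature are illustrative assumptions for this sketch, not the actual system's interfaces.

```python
# Minimal sketch of a "behavior": a deliberative component that produces a
# plan, plus a control policy that maps state estimates to motor signals.

class ReachBehavior:
    """Impedance-style reach behavior tracking a planned joint trajectory."""

    def __init__(self, stiffness, damping):
        self.kp = stiffness   # per-joint stiffness gains (illustrative values)
        self.kd = damping     # per-joint damping gains
        self.plan = None      # time-indexed joint trajectory, set by deliberate()

    def deliberate(self, q_start, q_pregrasp, planner):
        # Deliberative component: plan a collision-free path (e.g., via RRT).
        # `planner` is assumed to return a callable t -> desired configuration.
        self.plan = planner(q_start, q_pregrasp)

    def policy(self, t, q, qd):
        # Control policy: impedance law tau = Kp * (q_des - q) - Kd * qd,
        # evaluated sequentially along the plan until the motion completes.
        q_des = self.plan(t)
        return [kp * (qs - qi) - kd * qdi
                for kp, kd, qs, qi, qdi in zip(self.kp, self.kd, q_des, q, qd)]
```

A trivial "planner" that always returns the goal configuration suffices to exercise the policy; a real deliberative component would return a collision-free, time-parameterized trajectory instead.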
- Grasping is a key component of the system 300 illustrated in
FIG. 3. Grasping is required before performing any other manipulation tasks, e.g., it is necessary to pick up the wrench to tighten the bolt. The grasping system, i.e., the software and hardware configured to control the robot, uses a database (120 in FIG. 1), the "grasp database", to determine how various objects should be grasped. For every possible object that the robot needs to grasp, the grasp database 120 provides the desired relative poses between the manipuland, i.e., the object to be grasped, and each of the robot links that is to contact the manipuland, or other proxies for this information such as key points for the robot 104 or manipuland, for the following stages of grasping: pre-grasp, the configuration prior to the grasp; grasping, the configuration during the grasp; and release, the configuration subsequent to releasing the manipuland. - The grasp database 120 informs the reach behavior as to how the robot should move toward the object in order to grasp it. Given a target manipuland as input, the reach behavior queries the grasp database 120 for the pre-grasp configuration. The robot then plans a motion free of contact with the environment, excepting the manipuland, using any one of a number of motion planning algorithms, like RRT. It should be noted that the choice of algorithm will affect only the time it takes to find a solution. Once a contact-free path, represented as a set of points in the robot's configuration space, has been obtained, polynomial splines are fit to the points, yielding a trajectory, i.e., a time-dependent path. The reach behavior converts the trajectory in the robot's joint space into operational or task space.
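The path-to-trajectory step above can be illustrated with piecewise-cubic (Hermite) interpolation through timed configuration-space waypoints, with zero velocity at both ends so the robot starts and stops at rest. This is a pure-Python sketch standing in for a production spline library; the knot times and tangent rule are assumptions.

```python
def fit_trajectory(times, points):
    """Return q(t) interpolating (times[i], points[i]) for a single joint;
    apply per joint to turn a contact-free path into a full trajectory."""
    n = len(points)
    # Central-difference tangents at interior knots; at rest at the ends.
    v = ([0.0]
         + [(points[i + 1] - points[i - 1]) / (times[i + 1] - times[i - 1])
            for i in range(1, n - 1)]
         + [0.0])

    def q(t):
        i = 0
        while i < n - 2 and t > times[i + 1]:
            i += 1                        # find the segment containing t
        h = times[i + 1] - times[i]
        s = (t - times[i]) / h            # normalized time within the segment
        h00 = 2 * s**3 - 3 * s**2 + 1     # cubic Hermite basis functions
        h10 = s**3 - 2 * s**2 + s
        h01 = -2 * s**3 + 3 * s**2
        h11 = s**3 - s**2
        return (h00 * points[i] + h10 * h * v[i]
                + h01 * points[i + 1] + h11 * h * v[i + 1])

    return q
```

By construction the resulting time-dependent path passes through every waypoint of the contact-free path, so interpolation cannot introduce new collisions at the knots (though, as with any spline fit, segments between knots should still be checked).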
- In certain embodiments, it is also possible to use a learning approach to map a model or partial model, i.e., only the parts of the model that are observable, of the object's geometry and apparent surface properties, e.g., friction, to an appropriate way to grasp it, as opposed to or in addition to using the grasp database 120.
- Behaviors can be executed sequentially, one immediately following the other; in parallel; or both. When executed in parallel, the outputs, or motor signals, from all behaviors are summed or combined. A state machine acts to switch between combinations of active behaviors (inactive behaviors output zero motor signals) at regular intervals, e.g., at 100 Hz, as a function of the behaviors' conditions, which the programmer defines on a per-behavior basis.
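The parallel-execution rule above reduces to a simple summation at each tick: active behaviors contribute their motor signals, inactive behaviors contribute zeros. The function name and signatures here are illustrative assumptions.

```python
# Sketch of parallel behavior combination: at each tick (e.g., at 100 Hz)
# the motor signals of the active behaviors are summed.

def combine_motor_signals(behaviors, active, state, n_motors):
    """behaviors: name -> policy(state) returning a motor-signal list.
    active: the set of behavior names the state machine has switched on."""
    total = [0.0] * n_motors
    for name, policy in behaviors.items():
        if name in active:                # inactive behaviors output zeros
            total = [t + s for t, s in zip(total, policy(state))]
    return total
```

In operation, a loop running at the tick rate would recompute `active` from the state machine and send the combined signal to the motors.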
- As an example of the entire system in operation, a state machine for picking objects with the robot might consist of reaching and grasping behaviors. The state machine would be initialized to an idle state. After receiving a signal with a target object to pick from an operator, which could be a human or a different software program, the state machine would transition to the reach state, activating the reach behavior. The reach behavior would generate a trajectory toward the object, using the grasp database 120 described above, and then execute it. When the execution of the reach trajectory has completed, the state machine would transition to the pre-grasp state, in which the reach behavior generates and executes a trajectory that brings the robot to the pre-grasp configuration from the grasp database 120. When that trajectory has completed, the state machine would transition to the grasp state, activating a grasp behavior and deactivating the reach behavior.
- The grasp behavior uses a fixed control policy and the grasp database 120 to move the hand or gripper from the pre-grasp configuration to the grasping configuration. When the grasp behavior has completed successfully, indicated by detecting sufficient contact forces at the hand/gripper and absence of slip between the robot and the manipuland, the state machine would transition to the transport state, activating the reach behavior, in addition to the grasp behavior that is already activated.
- Given a target pose in 3D space, the reach behavior again generates and executes a trajectory to effect this goal, at which point the state machine will have successfully executed the pick action. While this sequence of executed behaviors represents a successful pick, the state machine provides contingent operations for when one of the behaviors fails to effect its goal due to, e.g., errors in estimating the state of the manipuland or environment or imprecision in controlling the robot. For example, if the object slips from the grasp while the final sequence of the pick is executing, the state machine transitions back to the pre-grasp state, beginning the sequence anew.
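The pick sequence above, including the slip contingency that restarts from the pre-grasp state, can be sketched as a transition table. The state and event names are assumptions made for this illustration, not the system's actual identifiers.

```python
# Illustrative transition table for the pick state machine described above.

TRANSITIONS = {
    ("idle", "target_received"): "reach",
    ("reach", "trajectory_done"): "pre_grasp",
    ("pre_grasp", "trajectory_done"): "grasp",
    ("grasp", "grasp_secure"): "transport",       # contact forces OK, no slip
    ("transport", "at_target_pose"): "idle",      # pick completed successfully
    ("transport", "slip_detected"): "pre_grasp",  # contingency: begin anew
}

def step(state, event):
    """Advance the state machine; unrecognized events leave it in place."""
    return TRANSITIONS.get((state, event), state)
```

Running the happy path drives the machine from idle to transport; a slip event during transport sends it back to pre-grasp, exactly the contingency described above.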
-
FIG. 3 is a diagram of example software modules that can be configured to effect the process described above. As noted with respect to FIG. 3, these modules can be implemented on, e.g., the processor system 550 illustrated in FIG. 2, in order to create a specialized robot control system that provides accurate control of robots that can now autonomously or semi-autonomously physically interact with various objects in their environments. - Sensory data flows from the
robot 104 to the system identification process 302, which builds and refines models of objects (108 in FIG. 1) in the environment (102 in FIG. 1), and to the state estimation process 304, which estimates the state (position, orientation, and velocity) of the identified objects 108. A "model" encompasses immutable properties of an object, like mass and geometry. Since the objects are typically stiff, geometries do not generally change, but the systems and methods described herein do not rely upon this assumption. - The outputs of these processes, i.e., object models and states, can then be fed into the other software modules, which consist of a
grasp generator 306 that can determine potential configurations, or pre-grasp poses, for grasping the objects that have been identified in the environment. This process returns multiple grasping configurations, or grasp data options, per identified object. - Since the grasp generator generates many potential grasps among the various identified objects, a mechanism is necessary to determine which object should be picked when the choice of object is arbitrary, as is the case when, e.g., physically sorting a collection of objects. The
grasp selector 308 can be configured to choose among the various grasp data options. A quality metric, e.g., which grasp requires the robot to move the least, can be computed for each option, and the grasp, and associated object, with the highest quality is selected. Alternatively, a human operator can select a target object from a user interface 310, and the highest quality grasp associated with that target will be used. - Given grasp data, and reach, grasp, and release behaviors, the selected pre-grasp pose can be translated into motor forces and torques that are communicated to the robot 104, allowing it to perform the complex behavior, e.g., the pick sequence described above. - The system can be applied to any robot for which inertial (dynamics), shape, and appearance models of the robot are available. The system's model can be built using a combination of CAD/CAE and system identification to determine best-fit parameters through physical experimentation. Dynamics studies the movement of systems of interconnected bodies under the action of external forces. The dynamics of a multi-body system are described by the laws of kinematics and by the application of Newton's second law (kinetics) or their derivative form, Lagrangian mechanics. The solution of these equations of motion provides a description of the position, the motion, and the acceleration of the individual components of the system, and of the overall system itself, as a function of time.
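Returning to the grasp selector 308 above, one minimal form of the "least motion" quality metric is the joint-space distance from the robot's current configuration to each candidate pre-grasp configuration. This is an illustrative sketch with assumed data shapes, not the system's actual metric.

```python
def select_grasp(q_current, grasp_options):
    """Choose among grasp data options by a 'least motion' quality metric.

    grasp_options: list of (object_id, pregrasp_configuration) pairs.
    Returns the option whose pre-grasp configuration is closest (in squared
    joint-space distance) to the robot's current configuration q_current."""
    def cost(option):
        _, q = option
        return sum((qi - ci) ** 2 for qi, ci in zip(q, q_current))
    return min(grasp_options, key=cost)
```

When an operator designates a target object instead, the same cost would simply be minimized over that object's grasp options alone.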
- The model consists of the following information, at minimum: the object mass; the inertia matrix, i.e., the set of six non-negative values that predict how an object rotates as a function of torques applied to the object; the center-of-mass location; the "undeformed" geometry, i.e., the shape of the object when it is not subject to any forces from loading; the material stiffness; the dry friction coefficient(s); the visual appearance, through, e.g., a bidirectional reflectance distribution function; and, if the object is articulated, the location, type (e.g., universal, prismatic, hinge), and parameters (e.g., directional axis) of any joints.
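The minimum model contents listed above can be collected in a simple record type. The field names and types below are illustrative assumptions about how such a model might be organized, not the system's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectModel:
    """Immutable object properties, per the minimum model contents above."""
    mass: float          # kg
    inertia: tuple       # six values (Ixx, Iyy, Izz, Ixy, Ixz, Iyz)
    com: tuple           # center-of-mass location (x, y, z)
    geometry: object     # undeformed geometry, e.g., a mesh handle
    stiffness: float     # material stiffness (e.g., Young's modulus, Pa)
    friction: float      # dry friction coefficient
    brdf: object = None  # visual appearance (e.g., a BRDF), if available
    joints: list = field(default_factory=list)  # (location, type, parameters) per joint
```

An articulated object such as a cabinet would populate `joints`; a rigid, unarticulated object leaves it empty.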
- This information can be gathered from direct measurement, estimation, or both. As one example, a user can create a geometric description of the object manually using 3D modeling or computer-aided engineering software, or automatically using a 3D scanner. The object mass (e.g., from weighing the object), density information (known from material composition), and a geometric model can be input to an existing algorithm, such as that described in B. Mirtich, "Fast and accurate computation of polyhedral mass properties," J. Graphics Tools, vol. 1, 1996, which will return the center-of-mass and inertia matrix. As another example, the material stiffness can be estimated using ubiquitous tables, provided in engineering reference books, listing Young's modulus for various materials.
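For simple shapes, the relationship between mass, geometry, and inertia is closed-form and needs no general polyhedral algorithm. The standard textbook formula for a uniform solid box illustrates the kind of quantity the Mirtich algorithm computes for arbitrary polyhedra:

```python
def box_inertia(mass, x, y, z):
    """Principal moments of inertia (Ixx, Iyy, Izz) of a uniform solid box
    with side lengths x, y, z, taken about its center of mass."""
    return (mass * (y**2 + z**2) / 12.0,
            mass * (x**2 + z**2) / 12.0,
            mass * (x**2 + y**2) / 12.0)
```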
-
FIG. 1 is a diagram illustrating an example environment 100 in which the systems and methods described herein can be implemented. As can be seen, the system can use at least one RGB-D (color + depth) camera 105, or a similar sensor using electromagnetic radiation, e.g., LIDAR, aimed into the workspace 102 that the robot 104 will be operating in. The camera can be used, for example, to determine poses for various objects 108. Alternatively, poses can be determined or estimated using radio/electromagnetic wave triangulation, motion capture, etc. - In certain embodiments, every joint 106 of the
robot 104 is instrumented with a position sensor (not shown), such as an optical encoder. Further, force/torque sensors (not shown), such as a 6-axis force/torque sensor, can be used to sense forces acting on the robot's links. The sensors can be placed inline between two rigid links affixed together. Alternatively, tactile skins over the surface of the robot's rigid links can be used to precisely localize pressures from contact arising between the robot and objects (or people) in the environment. - The camera(s) 105 and
sensors 106 can be wired and/or wirelessly communicatively coupled to a back end server or servers, comprising one or more processors running software as described above and below, which in turn can be wired and/or wirelessly communicatively coupled with one or more processors included in robot 104. The server processors run various programs and algorithms 112 that identify objects 108 within the images of workspace 102 that the system has been trained to identify. For example, a camera image may contain a corrugated box, a wrench, and a sealed can of vegetables, all of which can be identified and added to a model containing the objects in the camera's sightline and the robot's vicinity. Server(s) 110 can be local or remote from workspace 102. Alternatively, the one or more programs/algorithms can be included in robot 104 and can be run by the one or more processors included in robot 104. - The programs/
algorithms 112 can include deep neural networks that perform bounding box identification from camera 105 (RGB) images to identify and demarcate, with overlaid boxes, every object in a particular image that the system has been trained to manipulate and observe. This software can also encompass components specialized at identifying certain objects, like corrugated cardboard boxes, using algorithms like edge detectors, and using multiple camera views (e.g., calibrated stereo cameras) in order to get the 3D position of points in the 2D camera image. - When objects are unique, e.g., a 12 oz can of Maxwell House coffee, the 2D bounding box from a single camera is sufficient to estimate the object's 3D pose. When objects instead belong to a class, e.g., corrugated cardboard box, such that object sizes can vary, multiple camera views, e.g., from a calibrated stereo camera setup, are needed to establish correspondences between points in the 2D camera images and points in 3D. State-of-the-art techniques for training these neural networks use domain randomization to allow objects to be recognized under various lighting conditions, backgrounds, and even object appearances. Function approximation, e.g., deep neural networks trained on synthetic images, or a combination of function approximation and state estimation algorithms, can be used to estimate objects' 3D poses, or to estimate the value of a different representation, like keypoints, that uniquely determines the location and orientation of essentially rigid objects from RGB-D data. For example, a Bayesian filter (like a "particle filter") can fuse the signals from force sensors with the pose estimates output from a neural network in order to track the object's state, i.e., its position and velocity.
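The fusion step a Bayesian filter performs can be illustrated, in a deliberately simplified one-dimensional Gaussian form, as a single Kalman-style update; a particle filter generalizes the same idea to full 6-DoF pose and non-Gaussian uncertainty. The variance values in the example are illustrative assumptions.

```python
def fuse(prior_mean, prior_var, meas_mean, meas_var):
    """Fuse a predicted object coordinate with a new measurement, e.g., a
    neural-network pose estimate or a force-sensor-derived contact location,
    weighting each by its confidence (inverse variance)."""
    gain = prior_var / (prior_var + meas_var)     # Kalman gain
    mean = prior_mean + gain * (meas_mean - prior_mean)
    var = (1.0 - gain) * prior_var                # fused uncertainty shrinks
    return mean, var
```

As expected, an equally confident measurement splits the difference, while a much more precise measurement dominates the fused estimate.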
- Function approximation, e.g., deep neural networks trained on camera images, can be used to estimate a dynamic (e.g., inertia, friction, etc.) and geometric (shape) model of all novel objects that are tracked by the system, e.g., using the bounding boxes. The coffee can example used above might not require this process, because it is reasonable to expect that every coffee can is identical to within the limits of the accuracy brought to bear on the problem, i.e., the accuracy provided by the sensors and required by the control system. By way of contrasting example, boxes will exhibit different surface friction depending on the material of the box, e.g., corrugated, plastic, etc., and the location on the box. For example, if there is a shipping label placed on part of the box, then this can affect surface friction. Similarly, a neural network can infer the geometry of the obscured part of a box from a single image showing part of the box.
- If an object is articulated, a kinematic model of the object can be estimated as well. Examples include doors, cabinets, drawers, ratchets, steering wheels, bike pumps, etc. The ground and any other stationary parts of the environment are modeled as having infinite inertia, making them immobile. Function approximation, e.g., deep neural networks trained on pressure fields, can be used to estimate the 3D poses of objects that the robot is contacting and thereby possibly obscuring the RGB-D sensor normally used for this purpose.
- Kinematic commands ("desireds") for the robot can be accepted for each object that the robot attempts to manipulate. The desireds can come from behaviors. A behavior can be either a fast-to-compute reactive policy, such as a lookup table that maps, e.g., the joint estimated state of the robot and manipuland to a vector of motor commands, or can include deliberative components, or planners, e.g., a motion planner that determines paths for the robot that do not result in contact with the environment. In the case of a planner, the output will be a time-indexed trajectory that specifies position and derivatives for the robot and any objects that the robot wants to manipulate.
- In turn, the planner can use high level specifications, e.g., put the box on the table, to compute the output trajectories. This process is where motion planning comes into play.
- By inverting the dynamics model of the robot (from a=F/m to F=ma) and modeling contact interactions as mass-spring-damper systems, the forces necessary to apply to the robot's actuators can be computed in order to produce forces on contacting objects, and thereby move both them and the robot as commanded.
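A one-degree-of-freedom sketch of this inversion, with a spring-damper contact model, might look as follows. The single-axis simplification and the specific gains are assumptions made for illustration; a full implementation would invert the multi-body dynamics.

```python
def contact_force(penetration, penetration_rate, k, b):
    """Mass-spring-damper contact model: zero force out of contact,
    otherwise a stiffness term plus a damping term, clamped non-negative
    so the contact can push but never pull."""
    if penetration <= 0.0:
        return 0.0
    return max(0.0, k * penetration + b * penetration_rate)

def actuator_force(mass, accel_desired, gravity, f_contact):
    """Inverse dynamics for a vertical 1-DoF joint (F = ma rearranged):
    the actuator supplies the desired inertial force plus gravity
    compensation, minus whatever the contact already provides."""
    return mass * accel_desired + mass * gravity - f_contact
```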
- If the force/torque data is available, then the sensed forces on the robot can be compared against the forces predicted by the dynamics model. If, after applying some filtering as necessary, the forces are largely different, the robot can halt its current activity and act to re-sense its environment, i.e., reconcile its understanding of the state of its surroundings with the data it is perceiving. For example, a grasped 1 kg box might slip from the robot's end effector's grasp while the robot is picking the object. At the time that the object slips from the robot's grasp, the robot's end effector would accelerate upward, since less force would be pulling the end effector downward, while the dynamics model, which assumes the object is still within the robot's grasp, might predict that the end effector would remain at a constant vertical position. When the disparity between the actual end-effector acceleration and the predicted end-effector acceleration becomes greater than the model's bounds of accuracy, it becomes clear that the estimated model state is direly incorrect. For a picking operation, we expect this mismatch to occur due to a small number of incidents: an object has been inadvertently dropped; multiple objects have been inadvertently grasped, e.g., the robot intended to grab one object but grabbed two; a human entered the workspace and was struck by the robot; or the robot inaccurately sensed the workspace, causing it to inadvertently collide with the environment, i.e., the robot failed to sense an object's existence or it improperly parameterized a sensed object, e.g., estimating a box was small when it was really large.
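The compare-filter-halt logic above can be sketched as a small monitor that low-pass filters the sensed-versus-predicted discrepancy and flags a halt-and-re-sense when it exceeds the model's accuracy bound. The filter form, coefficient, and threshold are illustrative assumptions.

```python
class ForceMonitor:
    """Flag a halt-and-re-sense when the filtered discrepancy between
    sensed and model-predicted force exceeds the model's accuracy bound."""

    def __init__(self, threshold, alpha=0.2):
        self.threshold = threshold   # accuracy bound, in newtons (assumed)
        self.alpha = alpha           # exponential-moving-average coefficient
        self.filtered = 0.0

    def update(self, sensed, predicted):
        err = abs(sensed - predicted)
        # The moving average rejects momentary spikes, so only a persistent
        # mismatch (e.g., a dropped object) triggers re-sensing.
        self.filtered += self.alpha * (err - self.filtered)
        return self.filtered > self.threshold   # True -> halt and re-sense
```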
- The behaviors in this system, as well as the controllers, the perception system, and the conditions for transitioning between states in the state machine, all use robot- and environment-specific numbers (parameters). For example, controllers use gains to determine how quickly errors should be corrected; stiff (large) gains correct errors quickly at the expense of possible damage to the robot or environment if the error is due to inadvertent contact between the robot and environment. All such open parameters, which are state-dependent, i.e., they generally should change dynamically in response to the conditions of the robot and environment, are optimally computed to maximize the robot's task performance by solving an optimal control problem. Since the optimal control problem generally requires too much computation to solve, even offline, approximations are computed instead. Approximations include dynamic programming with discretized state and action spaces, and reinforcement learning algorithms, e.g., the policy gradient algorithm. Our system uses simulations, given the detailed physical models previously described, to perform these optimizations and compute performant parameters offline. Further optimization can be performed online: parameters can be adjusted based on actual task performance, measured using sensory data. Such transfer learning can even use the performance of similar, but not identical, robots on similar, but not identical, tasks in order to adjust parameters.
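The offline parameter computation above can be illustrated at toy scale: simulate a task under candidate gains and keep the best performer. The unit point mass, explicit Euler integration, and candidate gain pairs are all assumptions; the real system would use the detailed physical models previously described.

```python
def simulate(kp, kd, steps=200, dt=0.01):
    """Track x = 1 with a PD controller on a unit point mass (explicit
    Euler integration); return accumulated tracking error, so lower is
    better task performance."""
    x, v, err = 0.0, 0.0, 0.0
    for _ in range(steps):
        f = kp * (1.0 - x) - kd * v   # PD control law
        v += f * dt                   # unit mass: a = f
        x += v * dt
        err += abs(1.0 - x) * dt
    return err

def tune(candidate_gains):
    """Coarse offline search: keep the gain pair with the best simulated
    performance, a crude stand-in for solving the optimal control problem."""
    return min(candidate_gains, key=lambda g: simulate(*g))
```

Online, the same performance measure could be re-evaluated from sensory data to adjust the offline result.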
-
FIG. 2 is a block diagram illustrating an example wired or wireless system 550 that can be used in connection with various embodiments described herein. For example, the system 550 can be used to implement the robot control system described above and can comprise part of the robot 104 or backend servers 110. The system 550 can be a server or any conventional personal computer, or any other processor-enabled device that is capable of wired or wireless data communication. Other computer systems and/or architectures may also be used, as will be clear to those skilled in the art. - The
system 550 preferably includes one or more processors, such as processor 560. Additional processors may be provided, such as an auxiliary processor to manage input/output, an auxiliary processor to perform floating point mathematical operations, a special-purpose microprocessor having an architecture suitable for fast execution of signal processing algorithms (e.g., digital signal processor), a slave processor subordinate to the main processing system (e.g., back-end processor), an additional microprocessor or controller for dual or multiple processor systems, or a coprocessor. System 550 can also include a tensor processing unit as well as motion planning processors or systems. - Such auxiliary processors may be discrete processors or may be integrated with the
processor 560. Examples of processors which may be used with system 550 include, without limitation, the Pentium® processor, Core i7® processor, and Xeon® processor, all of which are available from Intel Corporation of Santa Clara, California. - The
processor 560 is preferably connected to a communication bus 555. The communication bus 555 may include a data channel for facilitating information transfer between storage and other peripheral components of the system 550. The communication bus 555 further may provide a set of signals used for communication with the processor 560, including a data bus, address bus, and control bus (not shown). The communication bus 555 may comprise any standard or non-standard bus architecture such as, for example, bus architectures compliant with industry standard architecture (ISA), extended industry standard architecture (EISA), Micro Channel Architecture (MCA), peripheral component interconnect (PCI) local bus, or standards promulgated by the Institute of Electrical and Electronics Engineers (IEEE) including IEEE 488 general-purpose interface bus (GPIB), IEEE 696/S-100, and the like. -
System 550 preferably includes a main memory 565 and may also include a secondary memory 570. The main memory 565 provides storage of instructions and data for programs executing on the processor 560, such as one or more of the functions and/or modules discussed above. It should be understood that programs stored in the memory and executed by processor 560 may be written and/or compiled according to any suitable language, including without limitation C/C++, Java, JavaScript, Perl, Visual Basic, .NET, and the like. The main memory 565 is typically semiconductor-based memory such as dynamic random access memory (DRAM) and/or static random access memory (SRAM). Other semiconductor-based memory types include, for example, synchronous dynamic random access memory (SDRAM), Rambus dynamic random access memory (RDRAM), ferroelectric random access memory (FRAM), and the like, including read only memory (ROM). - The
secondary memory 570 may optionally include an internal memory 575 and/or a removable medium 580, for example a floppy disk drive, a magnetic tape drive, a compact disc (CD) drive, a digital versatile disc (DVD) drive, other optical drive, a flash memory drive, etc. The removable medium 580 is read from and/or written to in a well-known manner. Removable storage medium 580 may be, for example, a floppy disk, magnetic tape, CD, DVD, SD card, etc. - The
removable storage medium 580 is a non-transitory computer-readable medium having stored thereon computer executable code (i.e., software) and/or data. The computer software or data stored on the removable storage medium 580 is read into the system 550 for execution by the processor 560. - In alternative embodiments,
secondary memory 570 may include other similar means for allowing computer programs or other data or instructions to be loaded into the system 550. Such means may include, for example, an external storage medium 595 and an interface 590. Examples of external storage medium 595 may include an external hard disk drive, an external optical drive, or an external magneto-optical drive. - Other examples of
secondary memory 570 may include semiconductor-based memory such as programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable read-only memory (EEPROM), or flash memory (block oriented memory similar to EEPROM). Also included are any other removable storage media 580 and communication interface 590, which allow software and data to be transferred from an external medium 595 to the system 550. -
System 550 may include a communication interface 590. The communication interface 590 allows software and data to be transferred between system 550 and external devices (e.g., printers), networks, or information sources, such as possibly robot 104, camera 105, or other sensors. For example, computer software or executable code may be transferred to system 550 from a network server via communication interface 590. Examples of communication interface 590 include a built-in network adapter, network interface card (NIC), Personal Computer Memory Card International Association (PCMCIA) network card, card bus network adapter, wireless network adapter, Universal Serial Bus (USB) network adapter, modem, a wireless data card, a communications port, an infrared interface, an IEEE 1394 fire-wire, or any other device capable of interfacing system 550 with a network or another computing device. -
Communication interface 590 preferably implements industry-promulgated protocol standards, such as Ethernet IEEE 802 standards, Fiber Channel, digital subscriber line (DSL), asynchronous digital subscriber line (ADSL), frame relay, asynchronous transfer mode (ATM), integrated digital services network (ISDN), personal communications services (PCS), transmission control protocol/Internet protocol (TCP/IP), serial line Internet protocol/point to point protocol (SLIP/PPP), and so on, but may also implement customized or non-standard interface protocols as well. - Software and data transferred via
communication interface 590 are generally in the form of electrical communication signals 605. These signals 605 are preferably provided to communication interface 590 via a communication channel 600. In one embodiment, the communication channel 600 may be a wired or wireless network, or any variety of other communication links. Communication channel 600 carries signals 605 and can be implemented using a variety of wired or wireless communication means including wire or cable, fiber optics, conventional phone line, cellular phone link, wireless data communication link, radio frequency ("RF") link, or infrared link, just to name a few. - Computer executable code (i.e., computer programs or software) is stored in the
main memory 565 and/or the secondary memory 570. Computer programs can also be received via communication interface 590 and stored in the main memory 565 and/or the secondary memory 570. Such computer programs, when executed, enable the system 550 to perform the various functions of the present invention as previously described. - In this description, the term "computer readable medium" is used to refer to any non-transitory computer readable storage media used to provide computer executable code (e.g., software and computer programs) to the
system 550. Examples of these media includemain memory 565, secondary memory 570 (includinginternal memory 575,removable medium 580, and external storage medium 595), and any peripheral device communicatively coupled with communication interface 590 (including a network information server or other network device). These non-transitory computer readable mediums are means for providing executable code, programming instructions, and software to thesystem 550. - In an embodiment that is implemented using software, the software may be stored on a computer readable medium and loaded into the
system 550 by way of removable medium 580, I/O interface 585, or communication interface 590. In such an embodiment, the software is loaded into the system 550 in the form of electrical communication signals 605. The software, when executed by the processor 560, preferably causes the processor 560 to perform the inventive features and functions previously described herein. - In an embodiment, I/O interface 585 provides an interface between one or more components of system 550 and one or more input and/or output devices. Example input devices include, without limitation, keyboards, touch screens or other touch-sensitive devices, biometric sensing devices, computer mice, trackballs, pen-based pointing devices, and the like. The input device can also be the camera 105 or other sensors within environment 102, as well as robot 104. Examples of output devices include, without limitation, cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), printers, vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), and the like. - The
system 550 also includes optional wireless communication components that facilitate wireless communication over a voice network and over a data network. The wireless communication components comprise an antenna system 610, a radio system 615, and a baseband system 620. In the system 550, radio frequency (RF) signals are transmitted and received over the air by the antenna system 610 under the management of the radio system 615. - In one embodiment, the
antenna system 610 may comprise one or more antennae and one or more multiplexors (not shown) that perform a switching function to provide the antenna system 610 with transmit and receive signal paths. In the receive path, received RF signals can be coupled from a multiplexor to a low noise amplifier (not shown) that amplifies the received RF signal and sends the amplified signal to the radio system 615. - In alternative embodiments, the
radio system 615 may comprise one or more radios that are configured to communicate over various frequencies. In one embodiment, the radio system 615 may combine a demodulator (not shown) and modulator (not shown) in one integrated circuit (IC). The demodulator and modulator can also be separate components. In the incoming path, the demodulator strips away the RF carrier signal, leaving a baseband receive audio signal, which is sent from the radio system 615 to the baseband system 620. - If the received signal contains audio information, then baseband
system 620 decodes the signal and converts it to an analog signal. Then the signal is amplified and sent to a speaker. The baseband system 620 also receives analog audio signals from a microphone. These analog audio signals are converted to digital signals and encoded by the baseband system 620. The baseband system 620 also codes the digital signals for transmission and generates a baseband transmit audio signal that is routed to the modulator portion of the radio system 615. The modulator mixes the baseband transmit audio signal with an RF carrier signal, generating an RF transmit signal that is routed to the antenna system and may pass through a power amplifier (not shown). The power amplifier amplifies the RF transmit signal and routes it to the antenna system 610, where the signal is switched to the antenna port for transmission. The baseband system 620 can also be communicatively coupled with the processor 560. -
Radio system 615 can, for example, be used to communicate with robot 104, camera 105, and other sensors. - The
central processing unit 560 has access to data storage areas 565 and 570. The central processing unit 560 is preferably configured to execute instructions (i.e., computer programs or software) that can be stored in the memory 565 or the secondary memory 570. Computer programs can also be received from the baseband processor 610 and stored in the data storage area 565 or in secondary memory 570, or executed upon receipt. Such computer programs, when executed, enable the system 550 to perform the various functions of the present invention as previously described. For example, data storage area 565 may include various software modules (not shown). - Various embodiments may also be implemented primarily in hardware using, for example, components such as application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs). Implementation of a hardware state machine capable of performing the functions described herein will also be apparent to those skilled in the relevant art. Various embodiments may also be implemented using a combination of both hardware and software.
- Furthermore, those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and method steps described in connection with the above-described figures and the embodiments disclosed herein can often be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled persons can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the invention. In addition, the grouping of functions within a module, block, circuit, or step is for ease of description. Specific functions or steps can be moved from one module, block, or circuit to another without departing from the invention.
- Moreover, the various illustrative logical blocks, modules, functions, and methods described in connection with the embodiments disclosed herein can be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a microprocessor, but in the alternative, the processor can be any processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- Additionally, the steps of a method or algorithm described in connection with the embodiments disclosed herein can be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module can reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium including a network storage medium. An exemplary storage medium can be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium can be integral to the processor. The processor and the storage medium can also reside in an ASIC.
- Any of the software components described herein may take a variety of forms. For example, a component may be a stand-alone software package, or it may be a software package incorporated as a “tool” in a larger software product. It may be downloadable from a network, for example, a website, as a stand-alone product or as an add-in package for installation in an existing software application. It may also be available as a client-server software application, as a web-enabled software application, and/or as a mobile application.
- While certain embodiments have been described above, it will be understood that the embodiments described are by way of example only. Accordingly, the systems and methods described herein should not be limited based on the described embodiments. Rather, the systems and methods described herein should only be limited in light of the claims that follow when taken in conjunction with the above description and accompanying drawings.
Claims (20)
1. A method comprising:
receiving a set of sensor data;
with the sensor data, determining a set of physical objects in an environment;
determining a virtual model of a physical object of the set, wherein the virtual model comprises immutable properties of the physical object;
determining a state estimate for the physical object;
based on the state estimate and virtual model, determining a set of potential grasp configurations for grasping the physical object;
based on the set of potential grasp configurations, determining a reach behavior associated with a collision-free path towards the physical object relative to a remainder of the set of physical objects in the environment; and
determining a trajectory for a robot based on the reach behavior.
2. The method of claim 1, wherein the virtual model comprises an object mass and an object geometry.
3. The method of claim 2, wherein the set of physical objects in the environment are determined with an object detector, wherein the object mass is estimated based on an object class associated with the object detector.
4. The method of claim 1, wherein determining the trajectory comprises: fitting a set of splines to points of the collision-free path in a configuration space of the robot.
5. The method of claim 1, further comprising: selecting the physical object as a grasp target from the set of physical objects based on a set of heuristics; and, based on the selection of the physical object as the grasp target, controlling the robot based on the trajectory.
6. The method of claim 5, wherein the set of heuristics comprises: a grasp quality heuristic or a path length optimization.
7. The method of claim 1, wherein the collision-free path is determined based on an environmental collision evaluation based on pre-computed parameter values for simulated robot behaviors.
8. The method of claim 7, wherein the environmental collision evaluation is based on a lookup table.
9. The method of claim 7, wherein the pre-computed parameter values are determined using reinforcement learning for discretized state and action spaces.
10. The method of claim 1, wherein the virtual model comprises a rigid-body kinematic model for the physical object.
11. The method of claim 1, further comprising: determining a set of robot behaviors based on the potential grasp configurations, the set of robot behaviors comprising at least one of: a pre-grasp behavior, a grasp behavior, a transport behavior, or a release behavior.
12. The method of claim 11, wherein the reach behavior defines, based on the pre-grasp poses, how the robot should move toward the object in order to grasp it.
13. The method of claim 1, wherein the reach behavior defines a set of motor forces and torques for joints of the robot.
14. A method comprising:
receiving a set of sensor data;
with the sensor data, identifying a physical object within a surrounding environment;
determining a virtual model and a state estimate for the physical object, the virtual model comprising an object mass and an object geometry;
based on the state estimate and virtual model, determining a set of potential grasp configurations for grasping the physical object; and
determining a pre-grasp trajectory for a robot to achieve at least one potential grasp configuration of the set, wherein determining the pre-grasp trajectory comprises enforcing a non-collision constraint between the robot and the surrounding environment.
15. The method of claim 14, further comprising: identifying a plurality of physical objects within the surrounding environment based on the sensor data; and determining the surrounding environment for the physical object, wherein the surrounding environment is associated with a virtual model and a state estimate for each physical object of the plurality.
16. The method of claim 14, wherein the virtual model comprises estimated values for a set of immutable object properties, wherein the object geometry and object mass are modeled as immutable object properties.
17. The method of claim 14, wherein determining the pre-grasp trajectory comprises: fitting a set of splines to points of a collision-free path in a configuration space of the robot.
18. The method of claim 14, wherein determining the pre-grasp trajectory comprises an environmental collision evaluation based on pre-computed parameter values for simulated robot behaviors.
19. The method of claim 18, wherein the environmental collision evaluation is based on a lookup table.
20. The method of claim 18, wherein the pre-computed parameter values are determined using reinforcement learning for discretized state and action spaces.
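Claims 4 and 17 recite fitting a set of splines to points of a collision-free path in the robot's configuration space. The sketch below is purely illustrative and is not the claimed implementation: the joint-space waypoints, the 3-DoF arm, and the choice of clamped Catmull-Rom cubic Hermite segments (one polynomial per joint per segment) are all assumptions made for this example.

```python
# Hypothetical joint-space waypoints (radians) for a 3-DoF arm,
# e.g. the output of a collision-free path planner.
WAYPOINTS = [
    (0.0, 0.0, 0.0),
    (0.3, -0.2, 0.5),
    (0.7, -0.5, 0.9),
    (1.0, -0.4, 1.2),
]

def _tangents(pts):
    """Catmull-Rom tangents, clamped to zero at both endpoints
    so the sampled trajectory starts and stops at rest."""
    n, dim = len(pts), len(pts[0])
    m = [tuple(0.0 for _ in range(dim))]
    for i in range(1, n - 1):
        m.append(tuple(0.5 * (pts[i + 1][j] - pts[i - 1][j]) for j in range(dim)))
    m.append(tuple(0.0 for _ in range(dim)))
    return m

def sample_spline(pts, samples_per_segment=10):
    """Fit one cubic Hermite spline per joint ("a set of splines")
    and sample it densely to obtain a joint-space trajectory."""
    m = _tangents(pts)
    dim = len(pts[0])
    traj = []
    for i in range(len(pts) - 1):
        for k in range(samples_per_segment):
            s = k / samples_per_segment
            # Cubic Hermite basis functions on [0, 1].
            h00 = 2 * s**3 - 3 * s**2 + 1
            h10 = s**3 - 2 * s**2 + s
            h01 = -2 * s**3 + 3 * s**2
            h11 = s**3 - s**2
            traj.append(tuple(
                h00 * pts[i][j] + h10 * m[i][j]
                + h01 * pts[i + 1][j] + h11 * m[i + 1][j]
                for j in range(dim)))
    traj.append(pts[-1])  # close the trajectory at the final waypoint
    return traj

trajectory = sample_spline(WAYPOINTS)
print(len(trajectory))  # 3 segments x 10 samples + endpoint = 31
```

The sampled trajectory interpolates every planner waypoint at segment boundaries, which is the property that lets a downstream controller track the collision-free path smoothly.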
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/077,135 US20230105746A1 (en) | 2019-08-02 | 2022-12-07 | Systems and methods for robotic control under contact |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201962882396P | 2019-08-02 | 2019-08-02 | |
US201962882397P | 2019-08-02 | 2019-08-02 | |
US201962882395P | 2019-08-02 | 2019-08-02 | |
US16/943,884 US11548152B2 (en) | 2019-08-02 | 2020-07-30 | Systems and methods for robotic control under contact |
US18/077,135 US20230105746A1 (en) | 2019-08-02 | 2022-12-07 | Systems and methods for robotic control under contact |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/943,884 Continuation US11548152B2 (en) | 2019-08-02 | 2020-07-30 | Systems and methods for robotic control under contact |
Publications (1)
Publication Number | Publication Date |
---|---|
US20230105746A1 true US20230105746A1 (en) | 2023-04-06 |
Family
ID=74259872
Family Applications (7)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/630,795 Pending US20220281120A1 (en) | 2019-08-02 | 2020-07-30 | Robotic manipulators |
US16/944,020 Abandoned US20210031368A1 (en) | 2019-08-02 | 2020-07-30 | Robotic system for picking and placing objects from and into a constrained space |
US17/630,720 Pending US20220258355A1 (en) | 2019-08-02 | 2020-07-30 | Systems and method for robotics control under contact |
US16/943,884 Active 2041-06-08 US11548152B2 (en) | 2019-08-02 | 2020-07-30 | Systems and methods for robotic control under contact |
US17/630,804 Pending US20220274256A1 (en) | 2019-08-02 | 2020-07-30 | A robotic system for picking and placing objects from and into a constrained space |
US16/943,915 Abandoned US20210031373A1 (en) | 2019-08-02 | 2020-07-30 | Robotic manipulators |
US18/077,135 Pending US20230105746A1 (en) | 2019-08-02 | 2022-12-07 | Systems and methods for robotic control under contact |
Family Applications Before (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/630,795 Pending US20220281120A1 (en) | 2019-08-02 | 2020-07-30 | Robotic manipulators |
US16/944,020 Abandoned US20210031368A1 (en) | 2019-08-02 | 2020-07-30 | Robotic system for picking and placing objects from and into a constrained space |
US17/630,720 Pending US20220258355A1 (en) | 2019-08-02 | 2020-07-30 | Systems and method for robotics control under contact |
US16/943,884 Active 2041-06-08 US11548152B2 (en) | 2019-08-02 | 2020-07-30 | Systems and methods for robotic control under contact |
US17/630,804 Pending US20220274256A1 (en) | 2019-08-02 | 2020-07-30 | A robotic system for picking and placing objects from and into a constrained space |
US16/943,915 Abandoned US20210031373A1 (en) | 2019-08-02 | 2020-07-30 | Robotic manipulators |
Country Status (4)
Country | Link |
---|---|
US (7) | US20220281120A1 (en) |
EP (1) | EP4007679A4 (en) |
AU (1) | AU2020325084A1 (en) |
WO (3) | WO2021025956A1 (en) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7267994B2 (en) * | 2017-08-14 | 2023-05-02 | コンタクタイル ピーティーワイ リミテッド | Friction-based tactile sensor to measure grip security |
US11775699B2 (en) * | 2019-05-02 | 2023-10-03 | Dassault Systemes Americas Corp. | Extracting grasping cues from tool geometry for digital human models |
US20220281120A1 (en) * | 2019-08-02 | 2022-09-08 | Dextrous Robotics, Inc. | Robotic manipulators |
US11389968B2 (en) * | 2019-10-02 | 2022-07-19 | Toyota Research Institute, Inc. | Systems and methods for determining pose of objects held by flexible end effectors |
US11331799B1 (en) * | 2019-12-31 | 2022-05-17 | X Development Llc | Determining final grasp pose of robot end effector after traversing to pre-grasp pose |
US11642787B2 (en) * | 2020-06-30 | 2023-05-09 | Samsung Electronics Co., Ltd. | Trajectory generation of a robot using a neural network |
US20210299866A1 (en) * | 2020-12-23 | 2021-09-30 | Intel Corporation | Robotic manipulation planning based on probalistic elastoplastic deformation material point method |
WO2022251657A2 (en) * | 2021-05-28 | 2022-12-01 | Idaho Forest Group, LLC | Robotic wrapping system |
US20230138629A1 (en) * | 2021-10-28 | 2023-05-04 | Toyota Research Institute, Inc. | Structures and sensor assemblies having engagement structures for securing a compliant substrate assembly |
JP2023130891A (en) * | 2022-03-08 | 2023-09-21 | 株式会社安川電機 | Robot system, planning system, robot control method, and planning program |
US20240104487A1 (en) * | 2022-03-18 | 2024-03-28 | Xpo Logistics, Inc. | Intelligent shipment analysis and routing |
TR2022014644A2 (en) * | 2022-09-23 | 2022-10-21 | Ahmet Saygin Oeguelmues | MANIPULATOR AND MOTOR SYSTEM WITH SEVEN-DEGREES OF FREEDOM TWO-Spherical LINEAR ACTUATOR |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120290133A1 (en) * | 2011-05-10 | 2012-11-15 | Seiko Epson Corporation | Robot hand and robot |
US20180001472A1 (en) * | 2015-01-26 | 2018-01-04 | Duke University | Specialized robot motion planning hardware and methods of making and using same |
US20190283241A1 (en) * | 2018-03-19 | 2019-09-19 | Kabushiki Kaisha Toshiba | Holding device, transport system, and controller |
US20190308320A1 (en) * | 2018-04-05 | 2019-10-10 | Omron Corporation | Object recognition processing apparatus and method, and object picking apparatus and method |
US20210178591A1 (en) * | 2018-08-23 | 2021-06-17 | Realtime Robotics, Inc. | Collision detection useful in motion planning for robotics |
US20220379473A1 (en) * | 2019-11-19 | 2022-12-01 | Hitachi, Ltd. | Trajectory plan generation device, trajectory plan generation method, and trajectory plan generation program |
US11548152B2 (en) * | 2019-08-02 | 2023-01-10 | Dextrous Robotics, Inc. | Systems and methods for robotic control under contact |
Family Cites Families (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58184866A (en) * | 1982-04-22 | 1983-10-28 | Murata Giken Kk | Processing system of transmission disable information of facsimile multiple address device |
JP2560262B2 (en) | 1994-11-09 | 1996-12-04 | 工業技術院長 | Two-finger micro hand mechanism |
WO2004028753A2 (en) * | 2002-09-26 | 2004-04-08 | Barrett Technology, Inc. | Intelligent, self-contained robotic hand |
US7854108B2 (en) * | 2003-12-12 | 2010-12-21 | Vision Robotics Corporation | Agricultural robot system and method |
US7688016B2 (en) * | 2005-09-28 | 2010-03-30 | Canadian Space Agency | Robust impedance-matching of manipulators interacting with unknown environments |
US8033189B2 (en) * | 2005-12-28 | 2011-10-11 | Honda Motor Co., Ltd. | Robot skin |
EP1815949A1 (en) | 2006-02-03 | 2007-08-08 | The European Atomic Energy Community (EURATOM), represented by the European Commission | Medical robotic system with manipulator arm of the cylindrical coordinate type |
US20080247857A1 (en) | 2007-04-05 | 2008-10-09 | Ichiro Yuasa | End effector and robot for transporting substrate |
JP2008260110A (en) * | 2007-04-13 | 2008-10-30 | Honda Motor Co Ltd | Work gripping device and work gripping method |
US20090306825A1 (en) * | 2008-04-21 | 2009-12-10 | Ying Li | Manipulation system and method |
US8276959B2 (en) | 2008-08-08 | 2012-10-02 | Applied Materials, Inc. | Magnetic pad for end-effectors |
JP5267213B2 (en) * | 2009-03-02 | 2013-08-21 | 株式会社安川電機 | Multi-fingered hand and robot |
DE102010029745A1 (en) * | 2010-06-07 | 2011-12-08 | Kuka Laboratories Gmbh | Workpiece handling system and method for manipulating workpieces by means of cooperating manipulators |
JP2014108466A (en) | 2012-11-30 | 2014-06-12 | Fanuc Ltd | Electric hand with force sensor |
US9095978B2 (en) * | 2012-12-07 | 2015-08-04 | GM Global Technology Operations LLC | Planning a grasp approach, position, and pre-grasp pose for a robotic grasper based on object, grasper, and environmental constraint data |
US20140265394A1 (en) | 2013-03-13 | 2014-09-18 | Varian Semiconductor Equipment Associates, Inc. | Composite end effectors |
US9199376B2 (en) | 2013-03-14 | 2015-12-01 | GM Global Technology Operations LLC | Intuitive grasp control of a multi-axis robotic gripper |
US9102055B1 (en) * | 2013-03-15 | 2015-08-11 | Industrial Perception, Inc. | Detection and reconstruction of an environment to facilitate robotic interaction with the environment |
US9796089B2 (en) * | 2013-03-15 | 2017-10-24 | Carnegie Mellon University | Supervised autonomous robotic system for complex surface inspection and processing |
US9512912B1 (en) * | 2013-06-24 | 2016-12-06 | Redwood Robotics, Inc. | Robot actuator utilizing a differential pulley transmission |
WO2015058297A1 (en) * | 2013-10-25 | 2015-04-30 | Vakanski Aleksandar | Image-based trajectory robot programming planning approach |
US9597797B2 (en) | 2013-11-01 | 2017-03-21 | Brain Corporation | Apparatus and methods for haptic training of robots |
US9283676B2 (en) * | 2014-06-20 | 2016-03-15 | GM Global Technology Operations LLC | Real-time robotic grasp planning |
US9554512B2 (en) * | 2014-09-12 | 2017-01-31 | Washington State University | Robotic systems, methods, and end-effectors for harvesting produce |
JP6166305B2 (en) | 2015-05-08 | 2017-07-19 | ファナック株式会社 | Load parameter setting device and load parameter setting method |
US20190060019A1 (en) | 2016-01-28 | 2019-02-28 | Transenterix Surgical, Inc. | Force estimation using robotic manipulator force torque sensors |
JP6586532B2 (en) | 2016-03-03 | 2019-10-02 | グーグル エルエルシー | Deep machine learning method and apparatus for robot gripping |
US11065767B2 (en) * | 2016-04-22 | 2021-07-20 | Mitsubishi Electric Corporation | Object manipulation apparatus and object manipulation method for automatic machine that picks up and manipulates an object |
US9774409B1 (en) * | 2016-08-17 | 2017-09-26 | T-Mobile Usa, Inc. | Radio frequency shielded robotic telecommunication device testing platform |
IT201600112324A1 (en) | 2016-11-08 | 2018-05-08 | A B L S P A | MACHINE FOR THE REMOVAL OF SEEDS |
DK3537867T3 (en) | 2016-11-08 | 2023-11-06 | Dogtooth Tech Limited | ROBOT FRUIT PICKING SYSTEM |
US10583557B2 (en) * | 2017-02-10 | 2020-03-10 | GM Global Technology Operations LLC | Redundant underactuated robot with multi-mode control framework |
JP2018202641A (en) | 2017-05-31 | 2018-12-27 | セイコーエプソン株式会社 | Liquid discharge device and cable |
JP6807280B2 (en) * | 2017-06-02 | 2021-01-06 | 日立Geニュークリア・エナジー株式会社 | Remote work robot control system and remote work robot control method |
US10773382B2 (en) * | 2017-09-15 | 2020-09-15 | X Development Llc | Machine learning methods and apparatus for robotic manipulation and that utilize multi-task domain adaptation |
US11576735B2 (en) * | 2017-11-15 | 2023-02-14 | Steerable Instruments N.V. | Controllable steerable instrument |
WO2019146007A1 (en) | 2018-01-24 | 2019-08-01 | 三菱電機株式会社 | Position control device and position control method |
US11707955B2 (en) * | 2018-02-21 | 2023-07-25 | Outrider Technologies, Inc. | Systems and methods for automated operation and handling of autonomous trucks and trailers hauled thereby |
JP6857332B2 (en) * | 2018-03-13 | 2021-04-14 | オムロン株式会社 | Arithmetic logic unit, arithmetic method, and its program |
CN108858199B (en) | 2018-07-27 | 2020-04-07 | 中国科学院自动化研究所 | Method for grabbing target object by service robot based on vision |
US11465279B2 (en) * | 2018-11-29 | 2022-10-11 | X Development Llc | Robot base position planning |
US10597264B1 (en) * | 2018-12-20 | 2020-03-24 | Advanced Construction Robotics, Inc. | Semi-autonomous system for carrying and placing elongate objects |
JP7168514B2 (en) * | 2019-04-05 | 2022-11-09 | 川崎重工業株式会社 | Holding device and robot equipped with the same |
US10618172B1 (en) | 2019-05-31 | 2020-04-14 | Mujin, Inc. | Robotic system with error detection and dynamic packing mechanism |
KR20220016819A (en) * | 2019-06-05 | 2022-02-10 | 소니그룹주식회사 | Control device and method, and program |
JP7204587B2 (en) | 2019-06-17 | 2023-01-16 | 株式会社東芝 | OBJECT HANDLING CONTROL DEVICE, OBJECT HANDLING DEVICE, OBJECT HANDLING METHOD AND OBJECT HANDLING PROGRAM |
US11213947B2 (en) | 2019-06-27 | 2022-01-04 | Intel Corporation | Apparatus and methods for object manipulation via action sequence optimization |
US11285612B2 (en) * | 2019-08-20 | 2022-03-29 | X Development Llc | Coordinating agricultural robots |
US11772262B2 (en) * | 2019-10-25 | 2023-10-03 | Dexterity, Inc. | Detecting slippage from robotic grasp |
- 2020
- 2020-07-30 US US17/630,795 patent/US20220281120A1/en active Pending
- 2020-07-30 WO PCT/US2020/044303 patent/WO2021025956A1/en unknown
- 2020-07-30 US US16/944,020 patent/US20210031368A1/en not_active Abandoned
- 2020-07-30 EP EP20849928.5A patent/EP4007679A4/en active Pending
- 2020-07-30 US US17/630,720 patent/US20220258355A1/en active Pending
- 2020-07-30 WO PCT/US2020/044319 patent/WO2021025960A1/en active Application Filing
- 2020-07-30 US US16/943,884 patent/US11548152B2/en active Active
- 2020-07-30 US US17/630,804 patent/US20220274256A1/en active Pending
- 2020-07-30 US US16/943,915 patent/US20210031373A1/en not_active Abandoned
- 2020-07-30 WO PCT/US2020/044297 patent/WO2021025955A1/en active Application Filing
- 2020-07-30 AU AU2020325084A patent/AU2020325084A1/en not_active Abandoned
- 2022
- 2022-12-07 US US18/077,135 patent/US20230105746A1/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120290133A1 (en) * | 2011-05-10 | 2012-11-15 | Seiko Epson Corporation | Robot hand and robot |
US20180001472A1 (en) * | 2015-01-26 | 2018-01-04 | Duke University | Specialized robot motion planning hardware and methods of making and using same |
US20190283241A1 (en) * | 2018-03-19 | 2019-09-19 | Kabushiki Kaisha Toshiba | Holding device, transport system, and controller |
US20190308320A1 (en) * | 2018-04-05 | 2019-10-10 | Omron Corporation | Object recognition processing apparatus and method, and object picking apparatus and method |
US20210178591A1 (en) * | 2018-08-23 | 2021-06-17 | Realtime Robotics, Inc. | Collision detection useful in motion planning for robotics |
US11548152B2 (en) * | 2019-08-02 | 2023-01-10 | Dextrous Robotics, Inc. | Systems and methods for robotic control under contact |
US20220379473A1 (en) * | 2019-11-19 | 2022-12-01 | Hitachi, Ltd. | Trajectory plan generation device, trajectory plan generation method, and trajectory plan generation program |
Also Published As
Publication number | Publication date |
---|---|
US20220274256A1 (en) | 2022-09-01 |
US20210031373A1 (en) | 2021-02-04 |
EP4007679A1 (en) | 2022-06-08 |
US20210031368A1 (en) | 2021-02-04 |
AU2020325084A1 (en) | 2022-02-24 |
EP4007679A4 (en) | 2023-08-09 |
US20220281120A1 (en) | 2022-09-08 |
US20210031375A1 (en) | 2021-02-04 |
WO2021025960A1 (en) | 2021-02-11 |
WO2021025956A1 (en) | 2021-02-11 |
WO2021025955A1 (en) | 2021-02-11 |
US20220258355A1 (en) | 2022-08-18 |
US11548152B2 (en) | 2023-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230105746A1 (en) | Systems and methods for robotic control under contact | |
US20210187735A1 (en) | Positioning a Robot Sensor for Object Classification | |
Bagnell et al. | An integrated system for autonomous robotics manipulation | |
US11090808B2 (en) | Control device, picking system, distribution system, program, control method and production method | |
EP3566824B1 (en) | Method, apparatus, computer-readable storage media and a computer program for robotic programming | |
JP2015520040A (en) | Training and operating industrial robots | |
US11097421B2 (en) | Control device, picking system, distribution system, program, control method and production method | |
US11426864B2 (en) | Robot manipulator system and methods for providing supplemental securement of objects | |
US20210023702A1 (en) | Systems and methods for determining a type of grasp for a robotic end-effector | |
JPWO2018185861A1 (en) | Control device, picking system, distribution system, program, and control method | |
US20210016454A1 (en) | Control of modular end-of-arm tooling for robotic manipulators | |
US11633861B2 (en) | Systems, methods and associated components for robotic manipulation of physical objects | |
Sarantopoulos et al. | Human-inspired robotic grasping of flat objects | |
Patel et al. | Improving grasp performance using in-hand proximity and contact sensing | |
US10933526B2 (en) | Method and robotic system for manipulating instruments | |
Bolano et al. | Towards a vision-based concept for gesture control of a robot providing visual feedback | |
Shaju et al. | Conceptual design and simulation study of an autonomous indoor medical waste collection robot | |
Henst | Declaration concerning the TU/e Code of Scientific Conduct | |
Gwozdz et al. | Enabling semi-autonomous manipulation on Irobot’s Packbot | |
GB2622222A (en) | Hand-Eye Calibration for a Robotic Manipulator | |
Khurana | Human-Robot Collaborative Control for Inspection and Material Handling using Computer Vision and Joystick | |
CN117519469A (en) | Space interaction device and method applied to man-machine interaction | |
Macias | Visual and gesture guided robotic block stacking | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
AS | Assignment |
Owner name: DEXTROUS ROBOTICS, INC., TENNESSEE Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DRUMWRIGHT, EVAN;ZAPOLSKY, SAMUEL;SIGNING DATES FROM 20230110 TO 20230111;REEL/FRAME:062341/0975 |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |