CN109318226B - Robot control device, robot control method, and storage medium


Info

Publication number
CN109318226B
Authority
CN
China
Prior art keywords
robot
workpiece
picking
posture
predicted
Prior art date
Legal status
Active
Application number
CN201810606950.7A
Other languages
Chinese (zh)
Other versions
CN109318226A (en)
Inventor
中岛茜
小岛岳史
林剑之介
藤井春香
Current Assignee
Omron Corp
Original Assignee
Omron Corp
Priority date
Filing date
Publication date
Application filed by Omron Corp filed Critical Omron Corp
Publication of CN109318226A
Application granted
Publication of CN109318226B
Legal status: Active

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/0093 Programme-controlled manipulators co-operating with conveyor means
    • B25J9/16 Programme controls
    • B25J9/1602 Programme controls characterised by the control system, structure, architecture
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1628 Programme controls characterised by the control loop
    • B25J9/1633 Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666 Avoiding collision or forbidden zones
    • B25J9/1669 Programme controls characterised by programming, planning systems for manipulators characterised by special application, e.g. multi-arm co-operation, assembly, grasping
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B65 CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65G TRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G47/00 Article or material-handling devices associated with conveyors; Methods employing such devices
    • B65G47/74 Feeding, transfer, or discharging devices of particular kinds or types
    • B65G47/90 Devices for picking-up and depositing articles or materials
    • B65G61/00 Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for
    • B65G2203/00 Indexing code relating to control or detection of the articles or the load carriers during conveying
    • B65G2203/04 Detection means
    • B65G2203/041 Camera
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37555 Camera detects orientation, position workpiece, points of workpiece
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39106 Conveyor, pick up article, object from conveyor, bring to test unit, place it
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40007 Optimize sequence of pick and place operations upon arrival of workpiece on conveyor
    • G05B2219/40022 Snatching, dynamic pick, effector contacts object, moves with object
    • G05B2219/40053 Pick 3-D object from pile of objects
    • G05B2219/40554 Object recognition to track object on conveyor

Abstract

A robot control device, a robot control method, and a storage medium that can quickly pick up a conveyed workpiece. A robot control device (300) includes a workpiece posture calculation unit (310), an arrival prediction unit (320), a robot posture calculation unit (330), and a trajectory data generation unit (340). The arrival prediction unit (320) obtains a predicted picking position at which a workpiece (W) conveyed by a conveying device (200) can be picked by a picking device (120) or the like, based on conveying speed information (Ivt) supplied from the conveying device (200) or the like and sensing information (Ise) supplied from an image acquisition device (410).

Description

Robot control device, robot control method, and storage medium
Technical Field
The present invention relates to a control technique for a robot for picking an object.
Background
A so-called Bin-Picking system is known in which objects (hereinafter also referred to as "workpieces") randomly stacked in a stationary container are picked up one by one by a robot and transferred to a predetermined position. In the Bin-Picking system, operations such as measuring the position and orientation of each workpiece, specifying the workpiece to be taken out next based on the measurement result, gripping (that is, picking) that workpiece with a robot arm, and transferring it are performed (for example, see patent document 1).
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. 2014-180704
Disclosure of Invention
Technical problem to be solved by the invention
However, as described above, the conventional Bin-Picking system is premised on the workpiece to be picked and the container storing it being stationary. Therefore, in order for a robot to pick a workpiece conveyed by a conveyor or the like, the conveyor must be stopped each time so that the workpiece to be picked is stationary, and there is a problem that picking the workpiece by the robot takes a long time.
The present invention has been made in view of the above-described circumstances, and an object thereof is to provide a control technique for a robot capable of quickly picking a workpiece without stopping the robot.
Means for solving the problems
A robot control device according to an aspect of the present invention is a robot control device for controlling a robot that picks up a workpiece conveyed by a conveyor, the robot control device including: a workpiece attitude calculation unit that calculates an attitude of the workpiece based on the sensing information of the workpiece output from the measurement device; an arrival prediction unit that obtains a predicted picking position predicted as a position at which the workpiece being conveyed is picked by the robot, based on the sensing information of the workpiece and the conveying speed of the conveying device; a robot posture calculation unit that calculates a posture of the robot at the picking prediction position based on the calculated posture of the workpiece and the picking prediction position; and a trajectory data generation unit that acquires the posture of the work start position of the robot, and generates trajectory data indicating the work trajectory of the robot from the work start position to the picking prediction position based on the acquired posture of the robot at the work start position and the acquired posture of the robot at the picking prediction position.
According to the above configuration, the predicted picking position, i.e., the position at which the workpiece being conveyed is predicted to be picked by the robot, is obtained based on the sensing information of the workpiece and the conveying speed of the conveying device. Further, since trajectory data indicating the operation trajectory from the work start position of the robot to the predicted picking position is generated based on the posture of the robot at the work start position and the posture of the robot at the predicted picking position, the picking operation of the robot can be controlled in consideration of the movement of the workpiece caused by conveyance, even when the workpiece to be picked continues to be conveyed between the start of the robot's operation and the picking operation. As a result, the picking work can be performed more quickly than in the related art.
A robot control method according to another aspect of the present invention is a robot control method for controlling a robot that picks up a workpiece conveyed by a conveying device, wherein the robot control method includes the steps of: calculating a posture of the workpiece based on sensing information of the workpiece output from the measuring device; calculating a predicted picking position predicted as a position at which the workpiece being conveyed is picked by the robot based on the sensing information of the workpiece and the conveying speed of the conveying device; calculating the posture of the robot at the picking predicted position based on the calculated posture of the workpiece and the picking predicted position; and acquiring a posture of the work start position of the robot, and generating trajectory data indicating a work trajectory of the robot from the work start position to the picking prediction position based on the acquired posture of the robot at the work start position and the posture of the robot at the picking prediction position.
According to another aspect of the present invention, there is stored a robot control program for causing a computer that controls a robot that picks up a workpiece conveyed by a conveying device to operate as: a workpiece attitude calculation unit that calculates an attitude of the workpiece based on the sensing information of the workpiece output from the measurement device; an arrival prediction unit that obtains a predicted picking position predicted as a position at which the workpiece being conveyed is picked by the robot, based on the sensing information of the workpiece and the conveying speed of the conveying device; a robot posture calculation unit that calculates a posture of the robot at the picking prediction position based on the calculated posture of the workpiece and the picking prediction position; and a trajectory data generation unit that acquires the posture of the work start position of the robot, and generates trajectory data indicating the work trajectory of the robot from the work start position to the picking prediction position based on the acquired posture of the robot at the work start position and the acquired posture of the robot at the picking prediction position.
According to the present invention, it is possible to provide a control technique for a robot capable of quickly picking a workpiece without stopping the robot.
Drawings
Fig. 1 is a diagram showing the overall structure of a robot picking system.
Fig. 2 is a diagram showing a hardware configuration of the robot controller.
Fig. 3 is a block diagram showing the configuration of the robot controller.
Fig. 4 is a flowchart showing the robot control process.
Fig. 5 is a block diagram showing a configuration of a robot control device according to a modification.
Fig. 6 is a flowchart showing a robot control process of a modification.
Fig. 7 is a block diagram showing a configuration of a robot control device according to an application example.
Description of reference numerals:
1 … robot picking system; 100 … robot; 110 … arm; 120 … picking device; 200 … conveying device; 300, 300a, 300b … robot control device; 301 … CPU; 302 … memory; 303 … input device; 304 … output device; 305 … storage device; 306 … communication device; 310 … workpiece posture calculation unit; 320 … arrival prediction unit; 330 … robot posture calculation unit; 340 … trajectory data generation unit; 341 … required operation time calculation unit; 342 … speed synchronization unit; 343 … interference determination unit; 344 … speed/acceleration calculation unit; 350 … grip data generation unit; 360 … placement posture calculation unit; 370 … trajectory data selection unit; 400 … workpiece measuring device; 410 … image acquisition device.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the drawings. The same elements are denoted by the same reference numerals, and redundant description thereof is omitted. The following embodiments are examples for illustrating the present invention, and the present invention is not intended to be limited to these embodiments. The present invention can be variously modified within a range not departing from the gist thereof.
A. The present embodiment
A-1. structure
< robot picking system >
Fig. 1 is a diagram showing a robot picking system 1 according to the present embodiment. As shown in fig. 1, the robot picking system 1 is a system in which a robot 100 picks workpieces W in a housing C conveyed by a conveyor 200 one by one and transfers the workpieces W to a predetermined rack D (or a conveyor or the like).
The conveying device 200 is, for example, various conveyors installed in a factory, and a plurality of workpieces W accommodated in the casing C are conveyed by the conveying device 200. In the present embodiment, a case is assumed in which a plurality of workpieces W are conveyed in a state of being stacked in bulk in the casing C, but the plurality of workpieces W may be conveyed in a state of being directly placed on a conveyor (i.e., a state of not being accommodated in the casing C).
The workpiece W is, for example, a metal member, various kinds of workpieces such as food, or a processed product. The workpiece W picked by the robot 100 from the casing C is transferred to, for example, a predetermined position on the rack D.
The robot picking system 1 includes: the robot 100, which picks up the workpiece W from the casing C and transfers the workpiece W onto the rack D without stopping the workpiece W conveyed by the conveyor 200; the robot controller 300, which controls the operation of the robot 100; and the workpiece measuring apparatus 400, which acquires an image of the workpiece W and measures the posture, position, and the like of the workpiece W.
< robot >
The robot 100 is a so-called articulated robot including an arm 110 and a picking device (manipulator) 120. The robot 100 has a six-degree-of-freedom structure required for performing a work in a three-dimensional space. However, the present invention is not intended to be limited to a six-degree-of-freedom robot, and may be applied to a robot having a seven-degree-of-freedom structure in which a redundant degree of freedom is added to six degrees of freedom.
A picking device 120 is mounted to the front end of the arm 110. The picking device 120 of the present embodiment is, for example, a parallel hand, a multi-fingered multi-joint hand, or the like, and holds the workpiece W by opening and closing a plurality of members. The opening and closing operations of the picking device 120 are controlled in accordance with a control signal supplied from the robot control device 300.
< workpiece measuring apparatus >
The workpiece measuring apparatus 400 outputs image data of the workpiece W to the robot controller 300. Specifically, the workpiece measuring apparatus 400 includes an image acquisition device 410 that acquires image data of the workpiece W, a CPU, a memory such as a ROM or a RAM that stores a control program executed by the CPU, a communication interface, and the like. The image acquisition device 410 can be, for example, an image sensor. In the workpiece measuring apparatus 400, the CPU reads and executes the control program from the memory and outputs the image data of the workpiece W as sensing information to the robot control apparatus 300 via the communication interface.
< robot control device >
The robot controller 300 provides a control signal to the robot 100 to perform central control of each part of the robot 100. The robot controller 300 is configured by a computer connected to the robot 100 so as to be capable of bidirectional communication.
Fig. 2 is a diagram showing the hardware configuration of the robot controller 300.
The robot controller 300 is constituted by an information processing device such as a personal computer. The robot control device 300 includes a CPU 301, a memory 302 such as a ROM or a RAM that stores various control programs executed by the CPU 301, an input device 303 including a keyboard, a mouse, operation buttons, and the like, an output device 304 including a liquid crystal panel and the like, a storage device 305 such as a hard disk, a communication device 306 having various communication interfaces, and the like.
Fig. 3 is a block diagram showing the configuration of the robot controller 300.
In the robot control device 300, the CPU 301 reads, interprets, and executes software stored in the memory 302 or the like, thereby realizing the units described below.
As shown in fig. 3, the robot controller 300 includes a workpiece posture calculation unit 310, an arrival prediction unit 320, a robot posture calculation unit 330, and a trajectory data generation unit 340.
< workpiece posture calculation unit >
The workpiece posture calculation unit 310 detects the workpiece W based on the sensing information Ise supplied from the workpiece measuring device 400, and obtains posture data Dwp1 indicating the posture of the workpiece W to be picked at the work start position (i.e., the position of the workpiece W at the time of measurement). In the present embodiment, the workpiece W to be picked is detected by the workpiece posture calculation unit 310, but instead, the workpiece measuring device 400 may detect the workpiece W from the image data of the workpiece W and output sensing information indicating the coordinate values, the coordinate system, and the like of the workpiece W to the workpiece posture calculation unit 310, and the workpiece posture calculation unit 310 may obtain the posture data Dwp1 based on that sensing information. The workpiece posture calculation unit 310 also obtains posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position, based on the arrival prediction information Ipa of the workpiece W output from the arrival prediction unit 320 (specifically, predicted picking position information Ipp indicating the predicted picking position of the workpiece W and predicted arrival time information Iat indicating the predicted arrival time) and the posture data Dwp1 of the workpiece W. When calculating the posture of the workpiece W to be picked at the work start position, the workpiece posture calculation unit 310 performs matching with a CAD model (model data). For this matching, the CAD model of the workpiece W is stored in advance in the memory or the like of the workpiece posture calculation unit 310. The workpiece posture calculation unit 310 matches the sensing information Ise supplied from the workpiece measuring device 400 against the CAD model and calculates the posture of the workpiece W using the matching result. This makes it possible to calculate the posture with higher accuracy than when a reference CAD model of the workpiece W is not used.
As the posture of the workpiece W, for example, a roll-pitch-yaw expression using three rotation axes is used. The workpiece posture calculation unit 310 applies, to the posture data Dwp1 at the work start position of the workpiece W, the amount of movement along the XYZ axes to the predicted picking position of the workpiece W and the amount of rotation (roll, pitch, and yaw angles) about the XYZ axes, thereby calculating the posture of the workpiece W at the predicted picking position and obtaining the posture data Dwp2.
When the posture of the workpiece W changes during movement and the amount of change is known in advance, the amount of change can be applied as a correction. After obtaining the posture data Dwp1 at the work start position of the workpiece W and the posture data Dwp2 at the predicted picking position in the above manner, the workpiece posture calculation unit 310 outputs the posture data Dwp2 at the predicted picking position to the robot posture calculation unit 330. In the present embodiment, the workpiece posture calculation unit 310 obtains the posture data Dwp1 indicating the posture of the workpiece W to be picked at the work start position, but instead, the workpiece measuring device 400 may calculate the posture data Dwp1 and output it to the workpiece posture calculation unit 310 together with the sensing information indicating the position, distance, and the like of the workpiece W.
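As a minimal illustration of the pose update described above (the function names, the use of NumPy/SciPy, and the specific rotation convention are assumptions for this sketch, not part of the patent), the posture data Dwp2 could be obtained by composing the measured pose Dwp1 with the translation and the known rotation change accumulated during conveyance:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def predict_workpiece_pose(position_xyz, rpy, translation_xyz, delta_rpy):
    """Compose the measured workpiece pose (Dwp1) with the motion during conveyance
    to estimate the pose at the predicted picking position (Dwp2).

    position_xyz / rpy : position [m] and roll-pitch-yaw [rad] at measurement time
    translation_xyz    : XYZ displacement until the predicted picking position
    delta_rpy          : change of roll, pitch, yaw during conveyance (often zero for a
                         belt conveyor; a known change can be compensated here)
    """
    predicted_position = np.asarray(position_xyz) + np.asarray(translation_xyz)
    # Compose rotations: apply the conveyance-induced rotation on top of the measured one.
    predicted_rotation = R.from_euler("xyz", delta_rpy) * R.from_euler("xyz", rpy)
    return predicted_position, predicted_rotation.as_euler("xyz")

# Example: a workpiece moves 0.5 m along the conveyor (+X) without rotating.
pos2, rpy2 = predict_workpiece_pose([0.2, 0.0, 0.05], [0.0, 0.0, 0.1],
                                    [0.5, 0.0, 0.0], [0.0, 0.0, 0.0])
```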
< arrival predicting section >
The arrival prediction unit 320 obtains a position at which the workpiece W conveyed by the conveying device 200 can be picked by the picking device 120 (i.e., the robot) (hereinafter, also referred to as "picking prediction position"), a time at which the workpiece W can reach the picking prediction position (hereinafter, also referred to as "arrival prediction time"), and the like, based on the information Ivt indicating the conveying speed supplied from the conveying device 200 and the like (hereinafter, also referred to as "conveying speed information") and the sensing information Ise supplied from the image acquisition device 410. As an example of the picking prediction position, for example, the picking prediction position may be set within a range of X degrees to the left and right and a radius Ym from the front center of the robot 100, but the picking prediction position is not limited to this, and may be set as appropriate in consideration of layout design of a factory or the like. After the arrival predicting unit 320 obtains the predicted picking position and the predicted arrival time of the workpiece W, it outputs the predicted picking position information Ipp indicating the predicted picking position of the workpiece W and the predicted arrival time information Iat indicating the predicted arrival time to the workpiece posture calculating unit 310 as the predicted arrival information Ipa of the workpiece W.
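A minimal sketch of this arrival prediction, under the assumption of a straight conveyor and a fixed picking zone (the function names and units are illustrative, not from the patent):

```python
def predict_arrival(workpiece_x, pick_zone_x, conveyor_speed, t_now=0.0):
    """Estimate when and where a conveyed workpiece can be picked.

    workpiece_x    : workpiece position along the conveying direction at measurement [m]
    pick_zone_x    : position of the picking zone in front of the robot [m]
    conveyor_speed : conveying speed [m/s] (from the conveying speed information Ivt)

    Returns (predicted_picking_position, predicted_arrival_time).
    """
    if conveyor_speed <= 0.0:
        raise ValueError("conveyor must be moving towards the picking zone")
    time_to_zone = (pick_zone_x - workpiece_x) / conveyor_speed
    return pick_zone_x, t_now + time_to_zone

# Example: workpiece sensed 1.2 m upstream of the picking zone on a 0.3 m/s conveyor.
pick_pos, t_arrival = predict_arrival(workpiece_x=0.0, pick_zone_x=1.2, conveyor_speed=0.3)
# -> predicted picking position 1.2 m, predicted arrival time 4.0 s
```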
< robot posture calculation unit >
The robot posture calculation unit 330 calculates the posture of the robot 100 at the predicted picking position based on the posture data Dwp2 of the workpiece W at the predicted picking position supplied from the workpiece posture calculation unit 310. After calculating the posture of the robot 100 at the predicted picking position, the robot posture calculation unit 330 outputs the calculation result to the trajectory data generation unit 340 as posture data Drp2 of the robot 100 at the predicted picking position. In the present embodiment, the posture data Dwp2 of the workpiece W at the predicted picking position is obtained by the workpiece posture calculation unit 310, but the robot posture calculation unit 330 may instead obtain the posture data Dwp2 using the arrival prediction information Ipa of the workpiece W supplied from the arrival prediction unit 320 and the posture data Dwp1 of the workpiece W at the work start position supplied from the workpiece posture calculation unit 310.
< trajectory data generating section >
The trajectory data generation unit 340 generates trajectory data Dtr of the robot 100 based on the posture data Drp1 indicating the posture of the robot 100 at the work start position, which is supplied from the robot 100, and the posture data Drp2 indicating the posture at the predicted picking position, which is supplied from the robot posture calculation unit 330. Here, the trajectory data Dtr of the robot 100 is data indicating trajectory information from the work start position of the robot 100 to the predicted picking position. The trajectory information can be obtained by specifying the initial state of the robot 100 at the work start position based on the posture of the robot 100 and calculating a trajectory from that initial state to the predicted picking position. After generating the trajectory data Dtr, the trajectory data generation unit 340 outputs the trajectory data Dtr, or a control signal Sc for controlling the operation of the robot 100, to the robot 100. The robot 100 performs operations of the arm 110 and its joints and the opening and closing operations of the picking device 120 based on the control signal Sc output from the trajectory data generation unit 340.
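The patent does not commit to a particular trajectory representation; as one minimal sketch (linear interpolation in joint space is purely an illustrative choice), the trajectory data Dtr from the posture at the work start position to the posture at the predicted picking position could be expressed as a list of intermediate joint configurations:

```python
import numpy as np

def generate_trajectory(q_start, q_pick, n_points=50):
    """Generate simple trajectory data Dtr: joint-space waypoints from the posture at
    the work start position (Drp1) to the posture at the predicted picking position
    (Drp2). Linear joint interpolation is only an illustrative choice."""
    q_start = np.asarray(q_start, dtype=float)
    q_pick = np.asarray(q_pick, dtype=float)
    steps = np.linspace(0.0, 1.0, n_points)[:, None]
    return q_start + steps * (q_pick - q_start)   # shape: (n_points, n_joints)

# Example for a six-axis robot (joint angles in radians).
dtr = generate_trajectory([0, -0.5, 1.0, 0, 0.5, 0], [0.3, -0.2, 0.8, 0.1, 0.6, -0.3])
```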
A-2. work
Fig. 4 is a flowchart illustrating a robot control process executed by the robot control device 300 of the present embodiment.
The robot controller 300 first acquires the image data (sensing information) of the workpiece W from the workpiece measuring device 400 (step S1), and then acquires the conveying speed of the conveying device 200 (step S2). Specifically, the arrival prediction unit 320 of the robot controller 300 acquires the conveying speed of the conveying device 200 based on the conveying speed information Ivt supplied from the conveying device 200. The arrival prediction unit 320 receives the sensing information Ise from the workpiece measuring device 400, and obtains the predicted picking position and the predicted arrival time of the workpiece W based on the received sensing information Ise and the acquired conveying speed (step S3). The arrival prediction unit 320 outputs the predicted picking position information Ipp indicating the predicted picking position of the workpiece W and the predicted arrival time information Iat indicating the predicted arrival time to the workpiece posture calculation unit 310 as the arrival prediction information Ipa of the workpiece W. In the present embodiment, the arrival prediction information Ipa of the workpiece W is output to the workpiece posture calculation unit 310, but the posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position may instead be calculated by the robot posture calculation unit 330; in that case, the arrival prediction information Ipa may be output to the robot posture calculation unit 330.
Meanwhile, the workpiece posture calculation unit 310 detects the workpiece W based on the sensing information Ise supplied from the workpiece measuring device 400 and obtains the posture data Dwp1 indicating the posture of the workpiece W to be picked at the work start position. It then obtains the posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position, based on the arrival prediction information Ipa of the workpiece W output from the arrival prediction unit 320 (i.e., the predicted picking position information Ipp indicating the predicted picking position of the workpiece W and the predicted arrival time information Iat indicating the predicted arrival time) and the posture data Dwp1 of the workpiece W (step S4). The workpiece posture calculation unit 310 outputs the obtained posture data Dwp2 of the workpiece W to the robot posture calculation unit 330.
The robot posture calculation unit 330 calculates the posture of the robot 100 at the predicted picking position based on the posture data Dwp2 of the workpiece W at the predicted picking position supplied from the workpiece posture calculation unit 310 (step S5). After the calculation, the robot posture calculation unit 330 outputs the result to the trajectory data generation unit 340 as the posture data Drp2 of the robot 100 at the predicted picking position.
The trajectory data generation unit 340 generates the trajectory data Dtr of the robot 100 based on the posture data Drp1 indicating the posture of the robot 100 at the work start position, which is supplied from the robot 100, and the posture data Drp2 indicating the posture at the predicted picking position, which is supplied from the robot posture calculation unit 330 (step S6), and outputs the trajectory data Dtr or the control signal Sc to the robot 100 (step S7). The robot 100 performs operations of the arm 110 and each joint, the opening and closing operations of the picking device 120, and the like based on the control signal Sc output from the trajectory data generation unit 340, and transfers the workpiece W moving on the conveying device 200 to a predetermined position on the rack D. The series of processes described above is repeatedly executed by the robot controller 300 for each workpiece W. By executing this robot control process, each workpiece W on the conveyor 200 can be transferred to a predetermined position on the rack D without stopping the conveyor 200.
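Tying steps S1 to S7 together, one possible structure for the per-workpiece control cycle is sketched below, reusing the illustrative helpers from the previous sketches; all device interfaces (measuring_device, conveyor, robot), their methods, and the fixed picking-zone position are hypothetical placeholders, not APIs defined by the patent:

```python
def control_cycle(measuring_device, conveyor, robot, pick_zone_x=1.2):
    """One pass of the robot control process (steps S1-S7), repeated per workpiece.
    Every interface used here is a hypothetical placeholder."""
    sensing = measuring_device.acquire_image()                       # S1: sensing information Ise
    speed = conveyor.get_speed()                                     # S2: conveying speed Ivt
    pick_pos, t_arrival = predict_arrival(sensing.x, pick_zone_x, speed)  # S3: arrival prediction
    wp_pose_pick = predict_workpiece_pose(sensing.position, sensing.rpy,
                                          [pick_pos - sensing.x, 0, 0],
                                          [0, 0, 0])                 # S4: posture data Dwp2
    q_pick = robot.inverse_kinematics(wp_pose_pick)                  # S5: robot posture Drp2
    dtr = generate_trajectory(robot.current_joints(), q_pick)        # S6: trajectory data Dtr
    robot.execute(dtr)                                               # S7: output control signal Sc
```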
B. Modification example
Structure B-1
Fig. 5 is a block diagram showing the configuration of a robot control device 300a according to a modification. The hardware configuration of the robot control device 300a shown in fig. 5 is the same as that of the robot control device 300 of the present embodiment. However, the robot control device 300a differs from the robot control device 300 of the present embodiment in that it includes a required operation time calculation unit 341, a speed synchronization unit 342, an interference determination unit 343, a speed/acceleration calculation unit 344, and a grip data generation unit 350. In other respects it is the same as the robot control device 300 shown in fig. 3, so corresponding parts are given the same reference numerals and detailed description thereof is omitted. Similarly to the robot control device 300, the robot control device 300a realizes the units shown in fig. 5 by having the CPU 301 read, interpret, and execute software stored in the memory 302 or the like.
< required operation time calculation unit >
The required operation time calculation unit 341 calculates the required operation time of the robot 100 from the current position of the robot to the predicted picking position of the workpiece W based on the posture data Drp1, Drp2, and the like of the robot 100. By taking into account the required operation time calculated by the required operation time calculation unit 341, the trajectory data generation unit 340 can calculate the trajectory data Dtr without lagging behind the movement of the workpiece W (i.e., the conveyance of the workpiece W until it is picked up by the robot 100).
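As a rough illustration (the constant-joint-speed model is an assumption made here, not stated in the patent), the required operation time can be estimated from the largest joint displacement and each joint's maximum speed:

```python
import numpy as np

def required_operation_time(q_start, q_pick, max_joint_speed):
    """Estimate the time the robot needs to move from the current posture to the
    posture at the predicted picking position, assuming each joint moves at most
    at its maximum speed (a deliberately simple model)."""
    dq = np.abs(np.asarray(q_pick) - np.asarray(q_start))
    return float(np.max(dq / np.asarray(max_joint_speed)))  # slowest joint dominates

# Example: the third joint limits the motion here.
t_req = required_operation_time([0, 0, 0, 0, 0, 0],
                                [0.4, 0.8, 1.2, 0.2, 0.3, 0.1],
                                [1.0, 1.0, 0.5, 2.0, 2.0, 2.0])
```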
< speed synchronization section >
The speed synchronization unit 342 synchronizes the conveying speed of the conveying device 200 with the operating speed of the robot 100 (i.e., makes the relative speed between the two zero) after the picking device (hand) 120 reaches the workpiece W. Specifically, after the hand 120 reaches the workpiece W, the difference between the conveying speed of the conveying device 200 and the operating speed of the hand 120 is sequentially detected using a synchronous encoder or the like, and the operations of the conveying device 200 and the hand 120 are controlled so that the difference becomes zero. Because the conveying speed of the conveying device 200 is synchronized with the operating speed of the robot 100 after the hand 120 reaches the predicted picking position, the picking device 120 can perform a stable picking operation on the workpiece W. Note that, when the hand 120 moves at a constant speed before reaching the workpiece W, it may be operated at a constant speed matched to the conveying speed of the conveying device 200.
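A minimal sketch of such synchronization using a proportional correction is shown below; the control law and the gain are assumptions, since the patent only states that the detected speed difference is driven to zero:

```python
def synchronized_hand_speed(conveyor_speed, hand_speed, gain=0.5):
    """Return an updated hand speed command that reduces the relative speed between
    the conveying device and the robot hand towards zero (repeated each control cycle)."""
    speed_error = conveyor_speed - hand_speed
    return hand_speed + gain * speed_error

# Example: the hand (0.10 m/s) gradually matches a 0.30 m/s conveyor.
v = 0.10
for _ in range(10):
    v = synchronized_hand_speed(0.30, v)
```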
< interference judging section >
The interference determination unit (constraint condition determination unit) 343 determines, based on the trajectory data generated by the trajectory data generation unit 340, whether or not the motion (trajectory) of the robot 100 interferes with an obstacle, following either of the two approaches (1) and (2) below.
(1) Interference determination is performed on each partially generated trajectory as it is generated, and trajectory data that does not collide with an obstacle is generated by connecting the non-interfering trajectories.
(2) Interference determination is performed on the entire trajectory from the work start position of the robot 100 to the predicted picking position, and if there is no interference, that trajectory is used as the trajectory data.
When the interference determination unit 343 determines that the robot 100 does not collide with an obstacle, the trajectory data generation unit 340 generates the trajectory data Dtr of the robot 100 and outputs the trajectory data Dtr or the control signal Sc to the robot 100. This can prevent the robot 100 from colliding with an obstacle during the picking operation. In the present modification, interference between the robot 100 and an obstacle is exemplified as one of the constraint conditions, but the present modification is not intended to be limited thereto. Generation of the trajectory within the movable range of the robot 100, and the velocity and acceleration of the robot 100, may also be taken into account as constraint conditions.
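The two approaches (1) and (2) can be illustrated with the small sketch below, assuming a per-configuration collision test in_collision(q) is available (the checker itself, the sampling resolution, and the function names are assumptions):

```python
def segment_is_collision_free(q_from, q_to, in_collision, resolution=10):
    """Approach (1): sample intermediate configurations on one partial trajectory
    segment and reject the segment if any sample interferes with an obstacle.
    `in_collision(q)` is a hypothetical per-configuration collision test."""
    for i in range(resolution + 1):
        alpha = i / resolution
        q = [a + alpha * (b - a) for a, b in zip(q_from, q_to)]
        if in_collision(q):
            return False
    return True

def trajectory_is_collision_free(waypoints, in_collision):
    """Approach (2): check the whole trajectory from the work start position to the
    predicted picking position before using it as trajectory data."""
    return all(segment_is_collision_free(a, b, in_collision)
               for a, b in zip(waypoints[:-1], waypoints[1:]))
```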
< speed/acceleration calculation section >
When the trajectory data generation unit 340 generates the trajectory data Dtr, the velocity/acceleration calculation unit 344 reflects the optimal velocity/acceleration of each joint of the robot 100 in the trajectory data Dtr of the robot 100. The optimization of the velocity/acceleration of each joint is performed from roughly two viewpoints. One is to realize smooth operation of each joint of the robot 100 by trapezoidal speed control or the like at points where the operation of the robot 100 changes (change points of the operation). The other is to realize reliable gripping of the workpiece W by the robot 100 by moving each joint of the robot 100 slowly when gripping the workpiece W.
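A sketch of the trapezoidal speed control mentioned above is shown below; the concrete profile shape, the parameters, and the assumption that the motion is long enough to contain a constant-speed phase are illustrative choices, not taken from the patent:

```python
def trapezoidal_speed(t, v_max, accel, total_time):
    """Speed at time t for a trapezoidal profile: ramp up, cruise, ramp down.
    Assumes total_time >= 2 * v_max / accel so a constant-speed phase exists.
    Used here only to illustrate smoothing a joint's motion at change points."""
    t_ramp = v_max / accel
    if t < 0 or t > total_time:
        return 0.0
    if t < t_ramp:                      # acceleration phase
        return accel * t
    if t > total_time - t_ramp:         # deceleration phase
        return accel * (total_time - t)
    return v_max                        # constant-speed phase

# Example: peak 1.0 rad/s, 2.0 rad/s^2 acceleration, 3 s motion, sampled every 0.1 s.
samples = [trapezoidal_speed(t / 10, 1.0, 2.0, 3.0) for t in range(31)]
```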
< grip data generation unit >
The grip data generation unit 350 generates grip data such as the position and angle required for the picking device (hand) 120 to stably grip the workpiece W. The grip data generation unit 350 executes a grip plan based on the sensing information Ise supplied from the workpiece measuring device 400, information indicating the weight and friction of the workpiece W, and the like, selects, for example, an optimal gripping point, and generates the grip data Dgr necessary to grip the workpiece W at that gripping point. Here, the grip plan is the planning problem of obtaining an arrangement of the hand 120 capable of stably gripping the workpiece W when the position and posture of the workpiece W are given, and the gripping point is the position and posture of the hand coordinate system of the hand 120 at the time of gripping. The grip data generation unit 350 generates the grip data Dgr and outputs the generated grip data Dgr to the robot posture calculation unit 330. In this way, by selecting a gripping position based on the grip plan and specifying the position and angle at which the picking device 120 grips the workpiece W, a stable picking operation can be performed. The grip data generation unit 350 may also be provided in the workpiece measuring apparatus 400.
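One illustrative way to select an "optimal gripping point" from candidate grips, using the weight and friction information mentioned above, is sketched below; the candidate structure and the scoring formula are entirely assumptions layered on the patent text:

```python
def select_grip_point(candidates, weight, friction_coeff):
    """Choose the gripping point expected to hold the workpiece most stably.
    `candidates` is a list of dicts with a grip pose and a geometric quality score;
    the scoring formula below is purely illustrative."""
    def stability(c):
        # Heavier workpieces and lower friction reduce the score of marginal grips.
        return c["quality"] * friction_coeff / max(weight, 1e-6)
    return max(candidates, key=stability)

# Example: two candidate grips on a 0.3 kg workpiece with friction coefficient 0.4.
best = select_grip_point([{"pose": "side", "quality": 0.7},
                          {"pose": "top", "quality": 0.9}], weight=0.3, friction_coeff=0.4)
```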
B-2. work
Fig. 6 is a flowchart illustrating the robot control process executed by the robot control device 300a of the modification. Steps corresponding to those of the robot control process shown in fig. 4 are denoted by the same reference numerals, and detailed description thereof is omitted.
The robot control device 300a first acquires the image data of the workpiece W from the workpiece measuring device 400 (step S1), and then acquires the conveying speed of the conveyor 200 (step S2). Specifically, the arrival prediction unit 320 of the robot control device 300a acquires the conveying speed of the conveying device 200 based on the conveying speed information Ivt supplied from the conveying device 200. The arrival prediction unit 320 receives the sensing information Ise from the workpiece measuring device 400, and obtains the predicted picking position and the predicted arrival time of the workpiece W based on the received sensing information Ise and the acquired conveying speed (step S3). The arrival prediction unit 320 outputs the predicted picking position information Ipp indicating the predicted picking position of the workpiece W and the predicted arrival time information Iat indicating the predicted arrival time to the workpiece posture calculation unit 310 as the arrival prediction information Ipa of the workpiece W. In the present modification as well, the posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position may be calculated by the robot posture calculation unit 330; in that case, the arrival prediction information Ipa may be output to the robot posture calculation unit 330.
Meanwhile, the workpiece posture calculation unit 310 detects the workpiece W based on the sensing information Ise supplied from the workpiece measuring device 400 and obtains the posture data Dwp1 indicating the posture of the workpiece W to be picked at the work start position. It then obtains the posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position, based on the arrival prediction information Ipa of the workpiece W output from the arrival prediction unit 320 (that is, the predicted picking position information Ipp indicating the predicted picking position of the workpiece W and the predicted arrival time information Iat indicating the predicted arrival time) and the posture data Dwp1 of the workpiece W (step S4). The workpiece posture calculation unit 310 outputs the obtained posture data Dwp2 of the workpiece W to the grip data generation unit 350.
The gripping data generation unit 350 executes a gripping plan based on the attitude data Dwp2 of the predicted picking position of the workpiece W supplied from the workpiece attitude calculation unit 310, the sensing information Ise supplied from the workpiece measuring device 400, information indicating the weight and friction of the workpiece W, and the like, selects an optimal gripping point, and generates gripping data Dgr required to grip the workpiece W at the optimal gripping point. The grip data generation unit 350 outputs the generated grip data Dgr to the robot posture calculation unit 330.
The robot posture calculation unit 330 calculates the posture of the robot 100 at the predicted picking position based on the grip data Dgr output from the grip data generation unit 350 (step S5). Then, the robot posture calculation unit 330 outputs the calculation result to the trajectory data generation unit 340 as the posture data Drp2 of the robot 100 at the predicted picking position. The trajectory data generation unit 340 generates the trajectory data Dtr of the robot 100 based on the posture data Drp1 indicating the posture of the robot 100 at the work start position, which is supplied from the robot 100, and the posture data Drp2 indicating the posture at the predicted picking position, which is supplied from the robot posture calculation unit 330 (step S6).
The interference determination unit 343 determines whether or not the motion (trajectory) of the robot 100 interferes with an obstacle based on the trajectory data generated by the trajectory data generation unit 340 (step S6A). After the interference determination unit 343 confirms that the robot 100 does not interfere with an obstacle, the speed synchronization unit 342 calculates, when the trajectory data generation unit 340 generates the trajectory data Dtr, the synchronization speed for synchronizing the conveying speed of the conveying device 200 with the operating speed of the robot 100 after the picking device (hand) 120 reaches the workpiece W (step S6B).
Meanwhile, when the trajectory data generation unit 340 generates the trajectory data Dtr, the velocity/acceleration calculation unit 344 calculates the optimal velocity/acceleration of each joint of the robot 100 (step S6C). Further, the required operation time calculation unit 341 calculates the required operation time of the robot 100 from the current position of the robot to the predicted picking position of the workpiece W based on the posture data Drp1, Drp2, and the like of the robot 100 (step S6D). The required operation time calculation unit 341 then determines whether or not the obtained required operation time is within the lower limit operation time (set time) Tmin (step S6E). When determining that the obtained required operation time exceeds the lower limit operation time Tmin (no in step S6E), the required operation time calculation unit 341 returns to step S6 and repeats the above-described series of operations in order to regenerate the trajectory data Dtr of the robot 100.
On the other hand, when determining that the obtained required operation time is within the lower limit operation time Tmin (yes in step S6E), the required operation time calculation unit 341 instructs the trajectory data generation unit 340 to output the trajectory data Dtr. The trajectory data generation unit 340 outputs the trajectory data Dtr generated in step S6 to the robot 100 as the control signal Sc in accordance with the instruction from the required operation time calculation unit 341 (step S7). The robot 100 performs operations of the arm 110 and its joints, the opening and closing operations of the picking device 120, and the like based on the control signal Sc output from the trajectory data generation unit 340, and transfers the workpiece W moving on the conveying device 200 to a predetermined position on the rack D. In this way, the operation plan of the robot 100 may be optimized in consideration of the presence or absence of an obstacle, the speed and acceleration of the robot 100, and the time required for the operation of the robot 100. The lower limit operation time Tmin may be calculated in advance based on, for example, the conveying speed at which the conveying device 200 conveys the workpiece W, and the calculated lower limit operation time Tmin may be set in the required operation time calculation unit 341. The lower limit operation time Tmin may also be set and changed as appropriate by a line operator or the like.
C. Application example
Here, fig. 7 is a block diagram showing the configuration of the robot control device 300b of the application example. The hardware configuration of the robot control device 300b shown in fig. 7 is the same as that of the robot control device 300 of the present embodiment. However, the robot control device 300b differs from the robot control device 300 of the present embodiment in that it includes the placement posture calculation unit 360 and the trajectory data selection unit 370. In other respects it is the same as the robot control device 300 shown in fig. 3, so corresponding parts are given the same reference numerals and detailed description thereof is omitted. Similarly to the robot control device 300, the robot control device 300b realizes the units shown in fig. 7 by having the CPU 301 read, interpret, and execute software stored in the memory 302 or the like.
< Placement posture calculation section >
The placement posture calculation unit 360 calculates the posture of the workpiece W when the workpiece W picked by the robot 100 is placed at a predetermined position (on the rack D or the conveying device 200). With this configuration, for example, the workpieces W transferred by the robot 100 can be placed in an aligned state in a pallet or magazine placed on the rack D or the conveying device 200. The posture of the workpiece W at the time of placement can be calculated using the sensing information Ise output from the workpiece measuring device 400 and information on the placement of the workpiece W input from various other sensors and the like.
< trajectory data selection section >
The trajectory data selection unit 370 selects the trajectory data Dtr best suited for picking from the plurality of pieces of trajectory data Dtr generated by the trajectory data generation unit 340, based on the predicted arrival position and predicted arrival time of the workpiece W, the velocity and acceleration that can be given to the robot 100, and the like. This enables an optimal path plan to be generated. Note that it is assumed here that the trajectory data generation unit 340 generates a plurality of pieces of trajectory data Dtr for one workpiece W, or generates a plurality of pieces of trajectory data Dtr for a plurality of workpieces W.
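As a small illustration of such a selection (the candidate structure and the cost criterion are assumptions), the trajectory data Dtr best suited for picking could be chosen like this:

```python
def select_trajectory(candidates, t_arrival):
    """Pick the candidate trajectory whose duration best fits the predicted arrival
    time while staying within the robot's speed limits. Each candidate is a dict
    with 'duration', 'peak_speed' and 'speed_limit' fields (hypothetical structure,
    not defined in the patent)."""
    feasible = [c for c in candidates if c["peak_speed"] <= c["speed_limit"]
                and c["duration"] <= t_arrival]
    if not feasible:
        return None   # no candidate reaches the workpiece in time
    return min(feasible, key=lambda c: c["duration"])
```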
The robot 100 may be a mobile robot (e.g., an autonomous transport robot that moves while building a map by itself), a robot mounted on an automatic guided vehicle (AGV), or an articulated robot combined with linear axes in the XYZ directions. In this case, the trajectory data generation unit 340 may generate trajectory data of the mobile robot or the automatic guided vehicle in addition to the trajectory data for the robot 100 to pick the workpiece W. This allows the robot 100 to pick the workpiece W at any place.
In the present embodiment, the physical robot 100 is caused to perform the picking operation, but the present invention is not limited to this, and a robot on a simulator may be caused to perform the picking operation. In this way, the picking work of a virtual robot can be simulated instead of operating an actual robot.
D. Others
In the above-described embodiments and the like, the case where a plurality of workpieces W are put in the casing C is exemplified, but various types of containers (for example, a space-saving container or a stocker) and the like may be used. By placing the workpiece W in the container, the posture of the workpiece W is easily changed to an arbitrary posture.
The conveying device 200 may be a mobile robot, an automatic conveying vehicle, a movable table, or the like, in addition to the conveyor. By using a mobile robot or an automatic transport vehicle as a transport device, the workpiece W can be transported to an arbitrary place.
The workpiece measuring apparatus 400 may also be configured using a three-dimensional distance image sensor, a distance measuring sensor, or a combination of these with an image sensor, instead of the image sensor alone. By using a three-dimensional distance image sensor or the like, the position, shape, inclination, and the like of the workpiece W can be obtained with higher accuracy.
The picking device 120 may be a suction device or the like, in addition to a hand device such as a parallel hand, a multi-fingered hand, or a multi-fingered multi-joint hand. In other words, various sorting devices capable of holding the work W can be used.
When calculating the posture of the workpiece W to be picked at the work start position, the workpiece posture calculation unit 310 may perform matching with data generated from measurement results, or pattern matching using a 2D sensor, instead of matching with a CAD model (model data). The matching between the sensing information Ise of the workpiece W and the model data may be performed by the workpiece posture calculation unit 310, or the matching with the CAD model may be performed by the workpiece measuring device 400.
The arrival prediction unit 320 may use an encoder value, a rotation speed, and the like of the conveyor supplied from the conveying device 200 when generating the predicted picking position information Ipp indicating the predicted picking position of the workpiece W, the posture data Dwp2 indicating the posture of the workpiece W at the predicted picking position, and the predicted arrival time information Iat indicating the predicted arrival time. The predicted picking position information Ipp, the posture data Dwp2 of the workpiece W at the predicted picking position, and the predicted arrival time information Iat may also be generated by calculating the moving speed and direction of the workpiece W. By using these pieces of information, the position, posture, predicted arrival time, and the like of the workpiece W at an arbitrary position can be obtained, in addition to the predicted picking position, the posture of the workpiece W at the predicted picking position, and the predicted arrival time of the workpiece W.
In addition, when generating the trajectory data Dtr of the robot 100, the trajectory data generation unit 340 may use random-sampling methods such as RRT (Rapidly-exploring Random Tree) and PRM (Probabilistic Roadmap Method), or trajectory optimization methods such as CHOMP (Covariant Hamiltonian Optimization for Motion Planning). According to this aspect, the trajectory data Dtr can be generated more quickly. However, the present invention is not limited to this, and the trajectory data Dtr of the robot 100 may also be generated by a cell decomposition method or a potential field method.
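The patent names RRT, PRM, and CHOMP only as options; the following deliberately small RRT-style sketch in joint space illustrates the random-sampling idea (the step size, sampling range, goal test, and the collision test in_collision are all assumptions):

```python
import random

def rrt_plan(q_start, q_goal, in_collision, joint_limits, step=0.1, iters=2000):
    """Very small RRT sketch: grow a tree from the start configuration and connect
    to the goal once a collision-free node gets close enough."""
    nodes = [tuple(q_start)]
    parent = {tuple(q_start): None}

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def steer(a, b):
        d = dist(a, b)
        if d <= step:
            return tuple(b)
        return tuple(x + step * (y - x) / d for x, y in zip(a, b))

    for _ in range(iters):
        sample = tuple(random.uniform(lo, hi) for lo, hi in joint_limits)
        nearest = min(nodes, key=lambda n: dist(n, sample))
        new = steer(nearest, sample)
        if in_collision(new):
            continue
        nodes.append(new)
        parent[new] = nearest
        if dist(new, q_goal) <= step:          # reached the goal region
            path, n = [tuple(q_goal)], new
            while n is not None:
                path.append(n)
                n = parent[n]
            return list(reversed(path))
    return None   # no path found within the iteration budget
```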
The trajectory data generation unit 340 may be configured so that obstacle data (including data specifying a position and a shape) relating to an obstacle, such as the conveyor of the conveying device 200 or the casing C, is input in advance. By inputting the obstacle data in advance, the trajectory data generation unit 340 can generate trajectory data along which the robot 100 does not collide with the obstacle. The obstacle data may be generated using various sensors or the like and output to the trajectory data generation unit 340.
In the present specification, the term "section" refers not only to a physical structure but also to a case where the processing executed by the "section" is realized by software. In addition, one "section" and processing executed by the apparatus may be realized by two or more physical structures or apparatuses, or two or more "sections" and apparatuses may be realized by one physical means or apparatus.
In the present specification, the steps in each of the above-described processes may be arbitrarily changed in order or executed in parallel within a range not inconsistent with the contents of the processes.
Note that the program for executing each process described in this specification may be stored in a recording medium. By using this recording medium, the above program can be installed in the robot controller 300. Here, the recording medium storing the program may be a non-transitory recording medium. The non-transitory recording medium is not particularly limited, and may be a recording medium such as a CD-ROM.
(attached note 1)
A robot control device comprising at least one memory and at least one hardware processor connected to the memory, for controlling a robot for picking a workpiece conveyed by a conveyor,
the hardware processor operates as the following means by executing a predetermined program stored in the memory,
a workpiece attitude calculation unit that calculates an attitude of the workpiece based on the sensing information of the workpiece output from the measuring device;
an arrival prediction unit that obtains a predicted picking position predicted as a position at which the workpiece being conveyed is picked by the robot, based on the sensing information of the workpiece and the conveying speed of the conveying device;
a robot posture calculation unit that calculates a posture of the robot at the predicted picking position based on the calculated posture of the workpiece and the predicted picking position; and
and a trajectory data generation unit that acquires a posture of the robot at a work start position, and generates trajectory data indicating a work trajectory of the robot from the work start position to the picking predicted position based on the acquired posture of the robot at the work start position and the acquired posture of the robot at the picking predicted position.
(attached note 2)
A robot control method for controlling a robot that picks up a workpiece conveyed by a conveyor by at least one hardware processor,
the hardware processor performs the steps of:
calculating a posture of the workpiece based on sensing information of the workpiece output from a measuring device;
obtaining a predicted picking position predicted as a position at which the workpiece being conveyed is picked by the robot, based on the sensing information of the workpiece and the conveying speed of the conveying device;
calculating a posture of the robot at the picking predicted position based on the calculated posture of the workpiece and the picking predicted position; and
acquiring a posture of the robot at a work start position, and generating trajectory data representing a work trajectory of the robot from the work start position to the picking predicted position based on the acquired posture of the robot at the work start position and the posture of the robot at the picking predicted position.
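To make the arrival-prediction step above concrete, the following minimal Python sketch predicts the picking position and time under the simplifying assumptions of a straight conveyor moving at a constant speed and a robot that picks on a fixed line x = x_pick_line; all names are illustrative and not taken from the present disclosure.

def predict_picking_position(p_detected, t_detected, conveyor_speed,
                             conveyor_dir, x_pick_line):
    # p_detected: workpiece position (x, y, z) measured at time t_detected.
    # conveyor_dir: unit vector of the conveying direction; conveyor_speed in m/s.
    # x_pick_line: x coordinate of the plane on which the robot picks.
    vx = conveyor_speed * conveyor_dir[0]
    if vx <= 0.0:
        raise ValueError("workpiece is not moving toward the picking line")
    dt = (x_pick_line - p_detected[0]) / vx
    if dt < 0.0:
        raise ValueError("workpiece has already passed the picking line")
    p_pick = [p_detected[i] + conveyor_speed * conveyor_dir[i] * dt for i in range(3)]
    return p_pick, t_detected + dt

The predicted picking time obtained in this way can then be compared with the required operation time of the robot from its work start position, as in the claims below.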

Claims (14)

1. A robot control device for controlling a robot that picks up a workpiece conveyed by a conveying device, the robot control device comprising:
a workpiece posture calculation unit that calculates a posture of the workpiece based on sensing information of the workpiece output from a measuring device;
an arrival prediction unit that obtains, based on the sensing information of the workpiece and a conveying speed of the conveying device, a predicted picking position at which the workpiece being conveyed is predicted to be picked by the robot;
a robot posture calculation unit that calculates a posture of the robot at the predicted picking position based on the calculated posture of the workpiece and the predicted picking position; and
a trajectory data generation unit that acquires a posture of the robot at a work start position, and generates trajectory data indicating a work trajectory of the robot from the work start position to the predicted picking position based on the acquired posture of the robot at the work start position and the calculated posture of the robot at the predicted picking position,
wherein the trajectory data generation unit generates, by using obstacle data specifying a position and a shape of an obstacle moving on the conveying device, the trajectory data such that the robot does not collide with the obstacle.
2. The robot control device according to claim 1, wherein,
the robot control device further includes a required operation time calculation unit that calculates a required operation time of the robot from the work start position to the predicted picking position based on the posture of the robot at the work start position and the posture of the robot at the predicted picking position,
the trajectory data generation unit outputs, to the robot, the trajectory data for which the required operation time is determined to be within a set time.
3. The robot control device according to claim 1, wherein,
the robot control device further includes a speed synchronization unit configured to synchronize the conveying speed of the conveying device with the operating speed of the robot when the robot reaches the predicted picking position.
4. The robot control device according to claim 1, wherein,
the robot control device further includes a grip data generation unit that generates grip data including a position and an angle of the robot required for the robot to stably hold the workpiece, based on the sensing information supplied from the measuring device.
5. The robot control device according to claim 1, wherein,
the robot control device further includes a constraint condition determination unit that determines whether the robot interferes with an obstacle during operation of the robot, based on the trajectory data generated by the trajectory data generation unit.
6. The robot control device according to claim 1, wherein,
the robot control device further includes a speed/acceleration calculation unit that calculates a speed and an acceleration of each joint of the robot when at least one of a change in operation of the robot and a holding of the workpiece by the robot is performed.
7. The robot control device according to claim 1, wherein,
the workpiece posture calculation unit calculates the posture of the workpiece by matching the sensing information of the workpiece output from the measuring device with model data of the workpiece.
8. The robot control device according to claim 1, wherein,
the arrival prediction unit obtains the predicted picking position using an encoder value and a rotational speed of the conveying device, and a moving speed and a direction of the workpiece, in addition to the sensing information of the workpiece and the conveying speed of the conveying device.
9. The robot control device according to claim 1, wherein,
the trajectory data generation unit generates the trajectory data by using a random sampling method or a trajectory optimization method.
10. The robot control device according to claim 1, wherein,
the robot control device further includes a placement posture calculation unit that calculates a posture of the workpiece when the robot places the workpiece at a predetermined position after picking the workpiece based on the trajectory data.
11. The robot control device according to claim 1, wherein,
the trajectory data generation unit generates a plurality of the trajectory data,
the robot control device further includes a trajectory data selection unit that selects trajectory data suitable for the picking from among the plurality of trajectory data based on the predicted picking position.
12. The robot control device according to claim 1, wherein,
the robot is a robot on a simulator.
13. A robot control method for controlling a robot that picks up a workpiece conveyed by a conveying device, wherein the robot control method comprises the steps of:
calculating a posture of the workpiece based on sensing information of the workpiece output from a measuring device;
obtaining, based on the sensing information of the workpiece and a conveying speed of the conveying device, a predicted picking position at which the workpiece being conveyed is predicted to be picked by the robot;
calculating a posture of the robot at the predicted picking position based on the calculated posture of the workpiece and the predicted picking position; and
acquiring a posture of the robot at a work start position, and generating trajectory data representing a work trajectory of the robot from the work start position to the predicted picking position based on the acquired posture of the robot at the work start position and the posture of the robot at the predicted picking position,
wherein, in the step of generating the trajectory data, the trajectory data is generated, by using obstacle data specifying a position and a shape of an obstacle moving on the conveying device, such that the robot does not collide with the obstacle.
14. A storage medium storing a robot control program for causing a computer that controls a robot that picks up a workpiece conveyed by a conveying device to operate as:
a workpiece posture calculation unit that calculates a posture of the workpiece based on sensing information of the workpiece output from a measuring device;
an arrival prediction unit that obtains, based on the sensing information of the workpiece and a conveying speed of the conveying device, a predicted picking position at which the workpiece being conveyed is predicted to be picked by the robot;
a robot posture calculation unit that calculates a posture of the robot at the predicted picking position based on the calculated posture of the workpiece and the predicted picking position; and
a trajectory data generation unit that acquires a posture of the robot at a work start position, and generates trajectory data indicating a work trajectory of the robot from the work start position to the predicted picking position based on the acquired posture of the robot at the work start position and the calculated posture of the robot at the predicted picking position,
wherein the trajectory data generation unit generates, by using obstacle data specifying a position and a shape of an obstacle moving on the conveying device, the trajectory data such that the robot does not collide with the obstacle.
CN201810606950.7A 2017-08-01 2018-06-13 Robot control device, robot control method, and storage medium Active CN109318226B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-149437 2017-08-01
JP2017149437A JP7116901B2 (en) 2017-08-01 2017-08-01 ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD AND ROBOT CONTROL PROGRAM

Publications (2)

Publication Number Publication Date
CN109318226A CN109318226A (en) 2019-02-12
CN109318226B true CN109318226B (en) 2021-11-02

Family

ID=62705470

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810606950.7A Active CN109318226B (en) 2017-08-01 2018-06-13 Robot control device, robot control method, and storage medium

Country Status (4)

Country Link
US (1) US10569414B2 (en)
EP (1) EP3437807A3 (en)
JP (1) JP7116901B2 (en)
CN (1) CN109318226B (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6500852B2 (en) * 2016-07-11 2019-04-17 株式会社安川電機 Robot system, robot control method, robot controller
JP6879238B2 (en) * 2018-03-13 2021-06-02 オムロン株式会社 Work picking device and work picking method
US11185978B2 (en) * 2019-01-08 2021-11-30 Honda Motor Co., Ltd. Depth perception modeling for grasping objects
EP3699832A1 (en) * 2019-02-19 2020-08-26 Tata Consultancy Services Limited Systems and methods for optimizing scheduling of non-preemptive tasks in multi-robotic environment
JP7275759B2 (en) * 2019-03-28 2023-05-18 セイコーエプソン株式会社 OBJECT DETECTION METHOD, OBJECT DETECTION DEVICE, AND ROBOT SYSTEM
KR20200116741A (en) * 2019-04-02 2020-10-13 현대자동차주식회사 Control method and control system of manupulator
CN110032965B (en) * 2019-04-10 2023-06-27 南京理工大学 Visual positioning method based on remote sensing image
CN109986564A (en) * 2019-05-20 2019-07-09 上海应用技术大学 Industrial machinery arm paths planning method
US10647528B1 (en) * 2019-05-31 2020-05-12 Mujin, Inc. Robotic system for palletizing packages using real-time placement simulation
CN111605938B (en) * 2019-05-31 2021-04-30 牧今科技 Robotic system for palletizing packages using real-time placement simulation
US10696493B1 (en) 2019-05-31 2020-06-30 Mujin, Inc. Robotic system with packing mechanism
US11077554B2 (en) 2019-05-31 2021-08-03 Mujin, Inc. Controller and control method for robotic system
US10618172B1 (en) 2019-05-31 2020-04-14 Mujin, Inc. Robotic system with error detection and dynamic packing mechanism
US10679379B1 (en) 2019-05-31 2020-06-09 Mujin, Inc. Robotic system with dynamic packing mechanism
CN111559544B (en) * 2019-05-31 2021-05-11 牧今科技 Robot system with error detection and dynamic packaging mechanism
US10696494B1 (en) 2019-05-31 2020-06-30 Mujin, Inc. Robotic system for processing packages arriving out of sequence
WO2020244778A1 (en) * 2019-06-07 2020-12-10 Bystronic Laser Ag Sorting system, mobile robot, method for operating a sorting system, computer program product and computer-readable medium
JP7328017B2 (en) * 2019-06-11 2023-08-16 ファナック株式会社 Robot system and controller
JP7018637B2 (en) * 2019-07-08 2022-02-14 TechMagic株式会社 Automatic dishwashing system, automatic dishwashing method, automatic dishwashing program and storage media
JP7351702B2 (en) * 2019-10-04 2023-09-27 ファナック株式会社 Workpiece conveyance system
JP2021091055A (en) * 2019-12-12 2021-06-17 株式会社キーエンス measuring device
KR102300752B1 (en) * 2020-01-29 2021-09-10 한국과학기술원 Method and Apparatus for Collision-Free Trajectory Optimization of Redundant Manipulator given an End-Effector Path
US11548158B2 (en) * 2020-04-17 2023-01-10 Abb Schweiz Ag Automatic sensor conflict resolution for sensor fusion system
JP7386140B2 (en) * 2020-07-15 2023-11-24 Pacraft株式会社 Bag feeding device and bag feeding method
KR102597583B1 (en) * 2020-12-29 2023-11-02 세메스 주식회사 Device and method for diagnosing abnormalities in transfer robots
CN112659133A (en) * 2020-12-31 2021-04-16 软控股份有限公司 Glue grabbing method, device and equipment based on machine vision
US20230071384A1 (en) * 2021-09-09 2023-03-09 Intrinsic Innovation Llc In-hand pose refinement for pick and place automation
WO2023209827A1 (en) * 2022-04-26 2023-11-02 ファナック株式会社 Robot, robot control device, and work robot system
CN115258508A (en) * 2022-07-21 2022-11-01 京东科技控股股份有限公司 Method and device for sorting items and computer-readable storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1986006050A1 (en) * 1985-04-08 1986-10-23 Odetics, Inc. Robotic materials handling system
US5040056A (en) * 1990-01-29 1991-08-13 Technistar Corporation Automated system for locating and transferring objects on a conveyor belt
JP2007030087A (en) * 2005-07-26 2007-02-08 Fanuc Ltd Physical distribution tracking device
CN101726722A (en) * 2008-10-27 2010-06-09 精工爱普生株式会社 Workpiece detecting system, picking apparatus, picking method, and transport system
CN105645010A (en) * 2014-12-02 2016-06-08 发那科株式会社 Device and method of transferring articles by using robot
US9457970B1 (en) * 2015-03-30 2016-10-04 Google Inc. Modular cross-docking system
JP2017064910A (en) * 2015-07-31 2017-04-06 ファナック株式会社 Machine learning device for learning taking-out operation of workpiece, robot system, and machine learning method
CN106985161A (en) * 2016-12-22 2017-07-28 北京京东尚科信息技术有限公司 Article grasping system and method

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000061875A (en) * 1998-08-25 2000-02-29 Matsushita Electric Works Ltd Robot hand
JP2004001122A (en) * 2002-05-31 2004-01-08 Suzuki Motor Corp Picking device
FR2896441B1 (en) * 2006-01-23 2009-07-03 Jerome Grosbois METHOD AND SYSTEM FOR AUTOMATED REALIZATION OF WORKPIECE (S)
EP1886772B1 (en) * 2006-08-01 2008-11-12 Albert Handtmann Maschinenfabrik GmbH &amp; Co. KG Apparatus and process for transporting objects and gripping them from below
PL2146821T3 (en) * 2007-04-26 2012-08-31 Adept Tech Inc Vacuum gripping apparatus
JP5446887B2 (en) * 2010-01-06 2014-03-19 セイコーエプソン株式会社 Control device, robot, robot system, and robot tracking method
JP2012187651A (en) * 2011-03-09 2012-10-04 Omron Corp Image processing apparatus, image processing system, and guidance apparatus therefor
JP5810562B2 (en) * 2011-03-15 2015-11-11 オムロン株式会社 User support device directed to image processing system, program thereof, and image processing device
JP5316580B2 (en) * 2011-05-17 2013-10-16 株式会社安川電機 Robot system
JP5806606B2 (en) * 2011-12-01 2015-11-10 キヤノン株式会社 Information processing apparatus and information processing method
JP5911299B2 (en) * 2011-12-27 2016-04-27 キヤノン株式会社 Information processing apparatus, information processing apparatus control method, and program
JP5975685B2 (en) * 2012-03-09 2016-08-23 キヤノン株式会社 Information processing apparatus and information processing method
DE102012013030A1 (en) * 2012-06-29 2014-04-24 Liebherr-Verzahntechnik Gmbh Device for the automatic removal of workpieces arranged in a container
US9064920B2 (en) * 2012-07-22 2015-06-23 Varian Semiconductor Equipment Associates, Inc. Electrostatic charge removal for solar cell grippers
JP5754454B2 (en) 2013-03-18 2015-07-29 株式会社安川電機 Robot picking system and workpiece manufacturing method
JP5616478B1 (en) * 2013-04-18 2014-10-29 ファナック株式会社 Robot system equipped with a robot for transporting workpieces
CA2951523C (en) * 2013-06-11 2021-06-01 Somatis Sensor Solutions LLC Systems and methods for sensing objects
JP6005299B2 (en) * 2013-11-28 2016-10-12 三菱電機株式会社 Robot system and control method of robot system
CN109650012A (en) * 2014-07-02 2019-04-19 多宾有限公司 System and method with the drag conveyor welded for high-speed production
JP2016147330A (en) * 2015-02-10 2016-08-18 株式会社三次元メディア Control apparatus based on object recognition
US9486921B1 (en) * 2015-03-26 2016-11-08 Google Inc. Methods and systems for distributing remote assistance to facilitate robotic object manipulation
US20170075331A1 (en) * 2015-09-11 2017-03-16 Yaskawa America, Inc. Apparatus, system, and method for configuring and programming control of a robot
JP2017068553A (en) * 2015-09-30 2017-04-06 株式会社三次元メディア Analysis system
ES2929729T3 (en) * 2015-11-13 2022-12-01 Berkshire Grey Operating Company Inc Classification systems to provide classification of a variety of objects
JP6540472B2 (en) * 2015-11-18 2019-07-10 オムロン株式会社 Simulation apparatus, simulation method, and simulation program
CN108778636B (en) * 2016-02-08 2021-11-19 伯克希尔格雷股份有限公司 System and method for providing treatment of various objects using motion planning
MX2018011440A (en) * 2016-03-24 2019-01-10 Masonite Corp Wood door slab processing system, and related methods.
US9925663B2 (en) * 2016-07-07 2018-03-27 Universal City Studios Llc Movable hardstop for a robotic component
JP2018027581A (en) * 2016-08-17 2018-02-22 株式会社安川電機 Picking system

Also Published As

Publication number Publication date
CN109318226A (en) 2019-02-12
EP3437807A2 (en) 2019-02-06
EP3437807A3 (en) 2019-06-19
US20190039237A1 (en) 2019-02-07
JP7116901B2 (en) 2022-08-12
US10569414B2 (en) 2020-02-25
JP2019025618A (en) 2019-02-21

Similar Documents

Publication Publication Date Title
CN109318226B (en) Robot control device, robot control method, and storage medium
JP5620445B2 (en) Article takeout device for determining holding position and posture of robot based on selection condition
US10589424B2 (en) Robot control device, robot, and robot system
US10864632B2 (en) Direct teaching method of robot
US9469035B2 (en) Component supply apparatus
KR101686517B1 (en) Working support robot system
CN104589354A (en) Robot control device, robot system, and robot
JP6444499B1 (en) Control device, picking system, distribution system, program, and control method
JP6258556B1 (en) Control device, picking system, distribution system, program, control method, and production method
CN111328305B (en) Control apparatus, work robot, program, and control method
JP2007098501A (en) Robot system
JPWO2018185855A1 (en) Control device, picking system, distribution system, program, control method, and production method
US11752621B2 (en) Article transport system having plurality of movable parts
CN112672857A (en) Route generation device, route generation method, and route generation program
JP5446887B2 (en) Control device, robot, robot system, and robot tracking method
EP3904015B1 (en) System and method for setting up a robotic assembly operation
KR20210041048A (en) Data generation device, data generation method, data generation program and remote operation system
JP2013013948A (en) Robot, and method for controlling robot
JP2021088019A (en) Robot system and method for controlling robot system
JP6641804B2 (en) How to set the moving route of the carrier
Larouche et al. Investigation of impedance controller for autonomous on-orbit servicing robot
Su et al. Collaborative assembly operation between two modular robots based on the optical position feedback
WO2018180298A1 (en) Robot teaching device, method for controlling robot teaching device, and robot teaching program
WO2024075394A1 (en) Control device, and control method
JP7286524B2 (en) Picking robot, picking method and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant