WO2023013699A1 - Robot control device, robot control system, and robot control method
Robot control device, robot control system, and robot control method
- Publication number
- WO2023013699A1 (PCT/JP2022/029853)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- control unit
- measurement object
- finger
- end effector
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Definitions
- the present disclosure relates to a robot control device, a robot control system, and a robot control method.
- Conventionally, there has been known a robot control device that creates teaching data based on image recognition processing of alignment marks on a work target (see, for example, Patent Document 1).
- a robot control device includes a control unit that controls a robot having sensors and end effectors.
- the control unit acquires positional information of the end effector with respect to a measurement object placed in the robot's motion space, and corrects a coordinate system relating to the motion of the robot based on the positional information.
- a robot control system includes the robot control device and the robot.
- a robot control method controls a robot having sensors and end effectors.
- the robot control method includes acquiring position information of the end effector with respect to a measurement object arranged in a motion space of the robot, and correcting a coordinate system related to the motion of the robot based on the position information.
- FIG. 1 is a block diagram showing a configuration example of a robot control system according to an embodiment
- FIG. 1 is a schematic diagram showing a configuration example of a robot control system according to an embodiment
- FIG. 4 is a schematic diagram showing a configuration example of an end effector
- FIG. 4 is a flow chart showing an example procedure of a robot control method according to an embodiment
- FIG. 10 is a schematic diagram showing an operation example of contacting an object to be measured with the outer side of the first finger in a state where the first finger and the second finger are spread apart
- FIG. 10 is a schematic diagram showing an operation example of contacting the measurement object with the side surface of the first finger in a state in which the first finger and the second finger are spread apart
- FIG. 10 is a schematic diagram showing an operation example of contacting the measurement object with the tip of the first finger in a state where the first finger and the second finger are spread apart;
- FIG. 10 is a schematic diagram showing an operation example of contacting an object to be measured with the outer side of the first finger in a state where the first finger and the second finger are closed;
- FIG. 10 is a schematic diagram showing an operation example of contacting the measurement object with the side surface of the first finger or the second finger in a state where the first finger and the second finger are closed;
- FIG. 10 is a schematic diagram showing an operation example of contacting an object to be measured with the tip of the first finger or the second finger in a state where the first finger and the second finger are closed;
- FIG. 11 is a schematic diagram showing an example of movement in which the first finger or the second finger is in contact with the upper surface of the object to be measured and moves in the X_RB axis direction along the upper surface;
- FIG. 10 is a schematic diagram showing an operation example in which the first finger or the second finger moves in the Y_RB axis direction along the upper surface while being in contact with the upper surface of the object to be measured;
- FIG. 10 is a schematic diagram showing an operation example in which the first finger or the second finger is in contact with the upper surface of the measurement object and simultaneously moves in the X_RB axis direction and the Y_RB axis direction along the upper surface;
- FIG. 10 is a schematic diagram showing an example of an operation in which the first finger and the second finger hold the measurement object and rotate around the measurement object;
- FIG. 3 is a schematic diagram showing a configuration example of a measurement object having marks;
- FIG. 4 is a schematic diagram showing a configuration example of a measurement object having a mark and a contact area;
- FIG. 4 is a schematic diagram showing a configuration example of a measurement object without marks.
- FIG. 4 is a schematic diagram showing a configuration example of a cylindrical measurement object;
- FIG. 4 is a schematic diagram showing an example of arranging measurement objects at diagonal positions on the upper surface of the workbench;
- According to the present disclosure, robot calibration accuracy can be improved.
- a robot control system 1 includes a robot 40, a robot control device 10, and a spatial information acquisition section 20.
- the robot 40 operates in a predetermined motion space.
- the space information acquisition unit 20 captures the motion space in which the robot 40 operates, and generates depth information of the motion space.
- the spatial information acquisition unit 20 calculates the distance to the measurement point located on the surface of the object 50 existing in the motion space.
- the distance from the spatial information acquisition unit 20 to the measurement point is also called depth.
- Depth information is information about the depth measured for each measurement point. In other words, the depth information is information about the distance to the measurement point located on the surface of the object 50 existing in the motion space.
- the depth information may be expressed as a depth map that associates the direction viewed from the spatial information acquisition unit 20 and the depth in that direction.
- the spatial information acquisition unit 20 generates depth information of the motion space based on the (X_CA, Y_CA, Z_CA) coordinate system.
- the space information acquisition unit 20 may generate an image of the motion space. An image obtained by capturing the motion space is also referred to as a motion space image.
- the robot control device 10 may perform calibration based on an image of the motion space captured by the space information acquisition unit 20. Further, the robot control device 10 can correct the coordinate system related to the motion of the robot by detecting the position information of the robot 40 with respect to the measurement object 52 (see FIG. 3, etc.) arranged in the motion space.
- the coordinate system relating to the motion of the robot is, for example, the coordinate system of the motion space, the coordinate system of the robot 40, or the coordinate system of the spatial information acquisition section 20.
- the robot control device 10 may detect position information of the robot 40 with respect to the measurement object 52 by detecting various physical quantities such as load or pressure.
- when the robot control device 10 detects a load or pressure, it can regard the position of the robot 40 with respect to the measurement object 52 as being at zero distance (contact) or at a fixed known distance. Further, the robot control device 10 may detect the position information of the robot 40 with respect to the measurement object 52 by measuring the distance between the robot 40 and the measurement object 52. To this end, the robot 40 has a sensor 48 capable of detecting position information with respect to the measurement object 52.
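- As a rough illustration of how a load reading can be turned into position information (this sketch and its names, such as read_force and FORCE_THRESHOLD, are illustrative assumptions, not part of the disclosure), the fingertip position at the moment the detected load exceeds a threshold can be taken as the contact point on the measurement object 52:

```python
# Minimal sketch: derive a contact position from a load reading.
# read_force() and get_fingertip_position() are hypothetical stand-ins
# for the force sensor 444 and the robot's forward kinematics.

FORCE_THRESHOLD = 0.5  # [N] loads above this value are treated as contact

def detect_contact_position(read_force, get_fingertip_position):
    """Return the fingertip coordinates at the moment contact is detected,
    or None if no contact has occurred."""
    if read_force() > FORCE_THRESHOLD:
        # At contact the finger-to-object distance is ~0, so the fingertip
        # position approximates a point on the object's surface.
        return get_fingertip_position()
    return None
```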
- the robot control device 10 operates the robot 40 based on the depth information generated by the spatial information acquisition section 20.
- the robot controller 10 controls and operates the robot 40 based on the (X_RB, Y_RB, Z_RB) coordinate system.
- the robot 40 may be installed on the workbench 70, for example.
- the robot control device 10 may cause the robot 40 to perform work with the object 50 positioned within the motion space of the robot 40 as the work target.
- the robot control device 10 may recognize the object 50 positioned within the motion space of the robot 40 as an obstacle.
- the robot control device 10 may perform calibration before causing the robot 40 to perform a task.
- the robot controller 10 may correct the coordinates of the measurement object 52, or correct the coordinate system, by bringing the robot 40 into contact with the measurement object 52 and detecting the load acting on the robot 40.
- the (X_RB, Y_RB, Z_RB) coordinate system is also called the coordinate system of the robot 40 .
- the (X_CA, Y_CA, Z_CA) coordinate system is also called the coordinate system of the spatial information acquisition unit 20 .
- the coordinate system of the robot 40 may be set as the same coordinate system as the coordinate system of the spatial information acquisition unit 20, or may be set as a different coordinate system.
- the robot control device 10 converts the depth information generated in the coordinate system of the spatial information acquisition unit 20 into the coordinate system of the robot 40 before using it. Note that the coordinate system of the spatial information acquisition unit 20 may be calibrated in advance with the coordinate system of the motion space.
- the coordinate system of the spatial information acquisition unit 20, expressed for example as the (X_CA, Y_CA, Z_CA) coordinate system obtained by camera calculation, may be calibrated to match the coordinate system (X, Y, Z) indicating the motion space. Note that prior calibration of the coordinate system of the spatial information acquisition unit 20 need not necessarily be performed.
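- Concretely, using depth information in the robot's frame amounts to applying a rigid transformation between the two coordinate systems; the following sketch assumes the standard rotation-plus-translation model and is not taken from the disclosure:

```python
import numpy as np

def camera_to_robot(p_ca: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Convert a point from the (X_CA, Y_CA, Z_CA) coordinate system of the
    spatial information acquisition unit 20 to the (X_RB, Y_RB, Z_RB)
    coordinate system of the robot 40.

    R (3x3 rotation) and t (3-vector translation) describe the assumed pose
    of the spatial information acquisition unit 20 in the robot's frame,
    e.g. as obtained by calibration."""
    return R @ p_ca + t
```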
- the number of robots 40 and robot control devices 10 is not limited to one as illustrated, and may be two or more. The number of spatial information acquisition units 20 may be one for one motion space as illustrated, or may be two or more. Each component will be specifically described below.
- the robot control device 10 includes a control section 11 , a storage section 12 and an interface 13 .
- the interface 13 is also called I/F 13.
- the control unit 11 may include at least one processor to implement various functions of the robot control device 10 .
- the processor may execute programs that implement various functions of the robot controller 10 .
- a processor may be implemented as a single integrated circuit.
- An integrated circuit is also called an IC (Integrated Circuit).
- a processor may be implemented as a plurality of communicatively coupled integrated and discrete circuits.
- the processor may be configured including a CPU (Central Processing Unit).
- the processor may be configured including a DSP (Digital Signal Processor) or a GPU (Graphics Processing Unit). Processors may be implemented based on various other known technologies.
- the storage unit 12 may be configured including an electromagnetic storage medium such as a magnetic disk, or may be configured including a memory such as a semiconductor memory or a magnetic memory.
- the storage unit 12 may be configured as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- the storage unit 12 stores various information, programs executed by the control unit 11, and the like.
- the storage unit 12 may function as a work memory for the control unit 11 .
- the control unit 11 may include at least part of the storage unit 12 .
- the I/F 13 acquires an image of the motion space of the robot 40 captured by the space information acquisition unit 20 and outputs the image to the control unit 11 .
- the I/F 13 acquires information on the robot 40 and outputs it to the control unit 11 .
- the information on the robot 40 includes information on the sensor 48, which will be described later.
- the I/F 13 acquires information for controlling the robot 40 from the control unit 11 and outputs the information to the robot 40 .
- the I/F 13 may include a communication device capable of communicating with the spatial information acquisition unit 20 and the robot 40 by wire or wirelessly.
- a communication device may be configured to be able to communicate with communication schemes based on various communication standards.
- a communication device may be configured according to known communication technologies. A detailed description of the hardware of the communication device and the like is omitted.
- the function of the communication device may be realized by one communication interface, or may be realized by separate communication interfaces for each connection destination.
- the control unit 11 may be configured to communicate with the spatial information acquisition unit 20 and the robot 40 via the I/F 13.
- the robot 40 is equipped with a sensor 48, as illustrated in FIG.
- the robot 40 also includes an arm 42, an end effector 44 attached to the arm 42, and a mark 46 installed on the end effector 44, as illustrated in FIG.
- Robot 40 further comprises an interface 49, although this is not essential.
- the interface 49 is also called I/F 49. Note that the mark 46 may be placed on the arm 42 instead of the end effector 44.
- the arm 42 may be configured as, for example, a 6-axis or 7-axis vertical articulated robot.
- the arm 42 may be configured as a 3-axis or 4-axis horizontal articulated (SCARA) robot.
- Arm 42 may be configured as a 2-axis or 3-axis Cartesian robot.
- Arm 42 may be configured as a parallel link robot or the like.
- the number of axes forming the arm 42 is not limited to the illustrated one.
- the end effector 44 may include, for example, a gripping hand configured to grip a work object.
- the grasping hand may have multiple fingers. The number of fingers of the grasping hand may be two or more. The fingers of the grasping hand may have one or more joints.
- the end effector 44 may include a suction hand configured to be able to suction a work target.
- the end effector 44 may include a scooping hand configured to scoop the work object.
- the end effector 44 may include a tool such as a drill, and may be configured to perform various machining operations such as drilling holes in a work object.
- the end effector 44 is not limited to these examples, and may be configured to perform various other operations. Sensor information may be handled by the end effector 44 or the robot 40.
- the end effector 44 may comprise a controller 440, as shown in FIG.
- the robot 40 may include the controller 410 .
- the controller 440 of the end effector 44 or the controller 410 of the robot 40 may capture the sensor information.
- the control unit 440 of the end effector 44 or the control unit 410 of the robot 40 may output to the robot control device 10 state information (for example, the amount of displacement or the amount of contact) that can be estimated from the sensor information.
- the sensor information may be handled by the robot control device 10 side, may be handled by the control section 410 of the robot 40 , or may be handled by the control section 440 of the end effector 44 .
- the end effector 44 of the robot 40 is configured as a grasping hand, and includes a first finger 441, a second finger 442, and a drive section 443.
- the number of fingers or suction nozzles provided in the end effector 44 is not limited to two, and may be one, or three or more.
- the object to be worked on by the end effector 44 is represented as the measurement object 52 .
- the end effector 44 is configured to contact the measurement object 52 with at least one finger or a holding portion such as a suction nozzle.
- the end effector 44 may be configured to hold the measurement object 52 with a holding portion such as three or more fingers or a suction nozzle.
- the robot 40 can control the position of the end effector 44 by operating the arm 42 .
- the end effector 44 may have an axis that serves as a reference for the direction in which it acts on the work object. If the end effector 44 has an axis, the robot 40 can control the orientation of the end effector 44 axis by moving the arm 42 .
- the robot 40 controls the start and end of the motion of the end effector 44 acting on the work piece.
- the robot 40 can move or process a work object by controlling the position of the end effector 44 or the direction of the axis of the end effector 44 and controlling the motion of the end effector 44 .
- the sensor 48 detects the state of each component of the robot 40 .
- Sensors 48 may detect information regarding the actual position or orientation of each component of robot 40 or the velocity or acceleration of each component of robot 40 .
- the sensors 48 may detect loads acting on each component of the robot 40 .
- the sensor 48 may detect the current flowing through the motor or the torque of the motor that drives each component of the robot 40 .
- Sensors 48 can detect information resulting from the actual movement of robot 40 .
- Sensors 48 may detect the distance between robot 40 and other objects.
- the sensor 48 includes a force sensor 444 that detects the direction or magnitude of the load acting on the end effector 44 when the end effector 44 contacts the work object. In this embodiment, the force sensor 444 is assumed to be mounted on the end effector 44.
- the force sensor 444 may include, for example, a strain gauge, but is not limited to this.
- the sensor 48 further includes a tactile sensor 445, although this is not essential. It is assumed that the tactile sensor 445 is mounted on the first finger 441 or the second finger 442 .
- the tactile sensor 445 may include, for example, a pressure sensor, but is not limited to this.
- the sensor 48 may further include a distance sensor.
- a distance sensor may be provided at the tip of the end effector 44, for example. Specifically, the distance sensor may be mounted on the first finger 441 or the second finger 442 .
- the robot control device 10 recognizes the position of the mark 46, or the position of the end effector 44 on which the mark 46 is installed, based on the image of the mark 46 captured by the spatial information acquisition unit 20. Further, the robot control device 10 recognizes the state of the robot 40 based on the image of the mark 46 captured by the space information acquisition section 20. The robot control device 10 can execute calibration of the robot 40 by comparing the state of the robot 40 obtained based on the detection result of the sensor 48 with the state of the robot 40 obtained based on the image of the mark 46.
- the spatial information acquisition unit 20 acquires spatial information regarding the motion space of the robot 40 .
- the spatial information acquisition unit 20 may photograph the motion space and acquire an image of the motion space as the spatial information. As illustrated in FIG. 2, the spatial information acquisition unit 20 may photograph a working object such as an object 50 or a measuring object 52 that exists in the working space.
- the spatial information acquisition section 20 may be configured as a camera.
- the spatial information acquisition section 20 may be configured as a 3D stereo camera.
- the 3D stereo camera photographs an object 50 existing in the motion space, calculates the distance to a measurement point located on the surface of the object 50 as depth, and generates depth information.
- the spatial information acquisition unit 20 may be configured as a LiDAR (light detection and ranging).
- the spatial information acquisition unit 20 may acquire depth information of the motion space as the spatial information.
- the spatial information acquisition unit 20 is not limited to these devices and may be configured as various devices.
- the spatial information acquisition unit 20 may acquire various types of information as the spatial information, without being limited to the image or depth information of the motion space.
- the spatial information acquisition section 20 may include an imaging device.
- the spatial information acquisition section 20 may further include an optical system.
- the space information acquisition unit 20 may output the captured image of the motion space to the robot control device 10 .
- the space information acquisition unit 20 may generate depth information in the motion space of the robot 40 and output it to the robot control device 10 .
- the space information acquisition unit 20 may generate point group information in the motion space of the robot 40 and output it to the robot control device 10. That is, the spatial information may be output in the form of point cloud data. In other words, the point cloud information may serve as the spatial information.
- the point group information is information on a set of measurement points located on the surface of the object 50 existing in the motion space, and is information including coordinate information or color information on each measurement point.
- the point group information can also be said to be data representing the object 50 in the measurement space with a plurality of points. When the spatial information is in the form of point cloud data, its data density can be made lower than that of the initial data acquired by the spatial information acquisition section 20.
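- One common way to obtain the lower data density mentioned above is voxel-grid downsampling of the point cloud; the NumPy sketch below is an illustrative assumption, since the disclosure does not specify how the density is reduced:

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Reduce an (N, 3) point cloud by keeping one point per occupied voxel."""
    voxel_idx = np.floor(points / voxel_size).astype(np.int64)  # voxel of each point
    _, keep = np.unique(voxel_idx, axis=0, return_index=True)   # first point per voxel
    return points[np.sort(keep)]

cloud = np.random.rand(10000, 3)        # stand-in for measured surface points
sparse = voxel_downsample(cloud, 0.05)  # coarser representation of the object 50
```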
- the spatial information acquisition unit 20 has an FOV (Field Of View).
- the FOV corresponds to the imaging range of the spatial information acquisition unit 20.
- the spatial information acquisition unit 20 can photograph the range included in the FOV.
- the actual field of view size of the spatial information acquisition section 20 is determined based on the FOV of the spatial information acquisition section 20 and the depth information.
- the robot control device 10 detects the mark 46 of the robot 40 based on the actual field-of-view size of the space information acquisition unit 20 and spatial information, including an image captured by the space information acquisition unit 20 in which the mark 46 of the robot 40 appears.
- the robot control device 10 can calculate the position and orientation of the mark 46 based on the spatial information including the image or the like by analyzing the spatial information including the image or the like of the mark 46 using a predetermined algorithm.
- a predetermined algorithm may include, for example, a mathematical formula or a table, or may include a program specifying arithmetic processing.
- the predetermined algorithm may include parameters for correcting calculation results based on spatial information including images and the like.
- the robot control device 10 operates the robot 40 so as to act on a work object such as the object 50 or the measurement object 52 existing in the motion space, or so as to avoid the object 50 or the measurement object 52.
- specifically, the robot control device 10 operates the robot 40 so as to act on the work object, or so as to avoid the object 50 or the measurement object 52, based on the image of the object 50 or the measurement object 52 captured by the spatial information acquisition unit 20.
- the control unit 11 of the robot control device 10 acquires the state of the robot 40 based on the position and orientation of the mark 46 captured in the image of the space information acquisition unit 20, and can thereby obtain the positional relationship between the robot 40 and the object 50 or measurement object 52. Separately, the control unit 11 acquires the state of the robot 40 based on the sensor 48 of the robot 40.
- the state based on the sensor 48 of the robot 40 represents the position and orientation of the robot 40 with higher accuracy than the state based on the captured image of the spatial information acquisition unit 20 .
- the control unit 11 can control the robot 40 in the motion space with high accuracy by matching the state of the robot 40 based on the captured image of the space information acquisition unit 20 with the state of the robot 40 based on the sensor 48 of the robot 40.
- the operation of matching the state of the robot 40 based on the spatial information including the image captured by the spatial information acquisition unit 20 with the state of the robot 40 based on the sensor 48 of the robot 40 is also called first calibration.
- the control unit 11 executes the first calibration at least once. Specifically, the control unit 11 corrects the coordinate system so that the depth information generated in the (X_CA, Y_CA, Z_CA) coordinate system by the space information acquisition unit 20 matches the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40.
- control unit 11 can transform the coordinate system of the spatial information acquisition unit 20 into the coordinate system of the robot 40 by executing the first calibration.
- the control unit 11 may estimate the relative positional relationship between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the robot 40, and match the coordinate system of the spatial information acquisition unit 20 to the coordinate system of the robot 40 based on the estimated relative positional relationship.
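- The estimation of this relative positional relationship can be illustrated with the classic least-squares rigid alignment (Kabsch/SVD) of corresponding mark positions. This is a sketch under the assumption that matched 3D positions of the mark 46 are available in both coordinate systems; the disclosure does not prescribe this particular algorithm:

```python
import numpy as np

def fit_rigid_transform(p_cam: np.ndarray, p_rob: np.ndarray):
    """Least-squares rotation R and translation t with p_rob ≈ R @ p_cam + t.

    p_cam: (N, 3) mark 46 positions in the (X_CA, Y_CA, Z_CA) system.
    p_rob: (N, 3) the same positions in the (X_RB, Y_RB, Z_RB) system.
    """
    c_cam, c_rob = p_cam.mean(axis=0), p_rob.mean(axis=0)
    H = (p_cam - c_cam).T @ (p_rob - c_rob)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_rob - R @ c_cam
    return R, t
```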
- the control unit 11 may perform the first calibration with at least part of the FOV of the spatial information acquisition unit 20 as a range.
- the control unit 11 performs the first calibration within the calibration range 60 shown in FIG.
- a calibration range 60 is shown as a region surrounded by a two-dot chain line in FIG.
- a calibration range 60 corresponds to a range in which the first calibration of the robot 40 is performed.
- Calibration range 60 may include the work area of robot 40 .
- the calibration range 60 may be the range where the working area of the robot 40 and the FOV overlap.
- control unit 11 sets a point for executing the first calibration by moving the mark 46 of the robot 40 within the calibration range 60 .
- the points for performing the first calibration are also referred to as calibration positions.
- the control unit 11 moves the mark 46 of the robot 40 to the calibration position and causes the spatial information acquisition unit 20 to photograph the mark 46 .
- the control unit 11 calculates the position and orientation of the mark 46 based on spatial information including an image of the mark 46 and the like.
- the control unit 11 corrects the position and orientation of the mark 46 calculated based on the spatial information, including the image and the like, so that they match the position and orientation of the mark 46 determined based on the detection result of the sensor 48 of the robot 40.
- Correction of the position and orientation of the mark 46 based on spatial information including an image or the like corresponds to the first calibration.
- the position and orientation of the mark 46 are also referred to as mark position and orientation.
- the first calibration corresponds to correction of mark position and orientation.
- a calibration position corresponds to a position for correcting the mark position and orientation.
- control unit 11 may perform the first calibration as described below.
- the control unit 11 generates control information for the robot 40 for moving the mark 46 of the robot 40 to the calibration position.
- the control unit 11 operates the robot 40 based on the control information to move the mark 46 of the robot 40 to the calibration position.
- the control unit 11 acquires spatial information including an image of the mark 46 from the spatial information acquiring unit 20 .
- the control unit 11 calculates the position and orientation of the mark 46 based on spatial information including images and the like.
- the position and orientation of the mark 46 calculated based on the spatial information including the image etc. are also referred to as the mark position and orientation based on the spatial information including the image etc.
- the control unit 11 calculates the position and orientation of the mark 46 determined based on the detection result of the sensor 48 of the robot 40 .
- the position and orientation of the mark 46 calculated based on the detection result of the sensor 48 are also referred to as the mark position and orientation based on the sensor 48 .
- the control unit 11 compares the mark position/orientation based on spatial information including an image and the mark position/orientation based on the sensor 48 .
- the control unit 11 corrects the mark position/orientation based on the spatial information including the image or the like so that the mark position/orientation based on the spatial information including the image or the like matches the mark position/orientation based on the sensor 48 .
- the control unit 11 may correct an algorithm for calculating the mark position and orientation based on spatial information including images.
- the control unit 11 may correct the parameters included in the algorithm, or may correct the formula, table, or program. When a plurality of calibration positions are set, the control unit 11 moves the robot 40 to each calibration position, acquires spatial information including an image of the mark 46 at each calibration position, and corrects the mark position and orientation based on that spatial information.
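- Putting the above steps together, a hypothetical driver loop might collect mark positions from both sources at each calibration position and then fit the correction, for example with fit_rigid_transform from the earlier sketch; move_robot_to, mark_pos_from_image, and mark_pos_from_sensor are assumed interfaces, not names from the disclosure:

```python
import numpy as np

def run_first_calibration(calib_positions, move_robot_to,
                          mark_pos_from_image, mark_pos_from_sensor):
    """Collect paired mark 46 positions at every calibration position and
    fit a camera-to-robot correction (positions only, for brevity)."""
    cam_pts, rob_pts = [], []
    for pos in calib_positions:
        move_robot_to(pos)                     # place the mark 46 at the position
        cam_pts.append(mark_pos_from_image())  # from the spatial information
        rob_pts.append(mark_pos_from_sensor()) # from the sensor 48
    # Correct the camera coordinate system toward the robot coordinate system.
    return fit_rigid_transform(np.array(cam_pts), np.array(rob_pts))
```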
- the control unit 11 sets the calibration range 60 in advance before executing the first calibration, and sets calibration positions included in the calibration range 60.
- the control unit 11 generates control information for the robot 40 so as to move the robot 40 to the calibration position.
- the control unit 11 generates, as a calibration item, information specifying the mark position and orientation when the robot 40 is moved to the calibration position and the recognition result of the mark 46 of the robot 40 .
- the calibration item is, for example, coordinate information: coordinate information indicating the mark position and orientation based on the detection result of the sensor 48 of the robot 40 when the robot 40 is moved to the calibration position, or coordinate information indicating the position and orientation of the mark 46 based on the recognition result of the mark 46 by the spatial information acquisition unit 20.
- the control unit 11 may generate calibration items as described below.
- the control unit 11 acquires, for example, information on the real field size of the spatial information acquisition unit 20 or information on the FOV from the spatial information acquisition unit 20 .
- the control unit 11 sets the calibration range 60 based on the actual field of view size or FOV of the spatial information acquisition unit 20 and the work area of the robot 40 .
- the control unit 11 may set the calibration range 60 based on the position of the object 50 or the measurement object 52 in the motion space of the robot 40 .
- the control unit 11 may set the calibration range 60 based on depth information or point group information of the object 50 detected by the spatial information acquisition unit 20 or the measurement object 52 .
- the shape of the calibration range 60 may be set, for example, to a truncated quadrangular pyramid.
- the shape of the calibration range 60 is not limited to these and may be set to various other shapes.
- the control unit 11 matches the mark position/orientation based on the sensor 48 of the robot 40 with the mark position/orientation based on the image of the spatial information acquisition unit 20 . Specifically, the controller 11 moves the robot 40 to the first position.
- the control unit 11 generates control information for operating the robot 40 so that the mark 46 of the robot 40 assumes a predetermined position and posture, and controls the robot 40 based on the control information to move the robot 40 to the first position. move.
- the first position may be a predetermined position included in the FOV of the spatial information acquisition section 20 .
- the first position may be the center position of the FOV of the spatial information acquisition unit 20, for example.
- the control unit 11 obtains an image of the mark 46 when the robot 40 moves to the first position, and calculates the position and orientation of the mark 46 as the mark position and orientation based on the image. The control unit 11 also calculates the mark position and orientation based on the sensor 48. Based on the comparison between the mark position/orientation based on the image and the mark position/orientation based on the sensor 48, the control unit 11 corrects the control information of the robot 40 so that, in the image, the position of the robot 40 matches the first position determined from the detection result of the sensor 48. The control unit 11 moves the robot 40 based on the corrected control information and updates the state of the robot 40 so that the position of the robot 40 in the coordinate system of the robot 40 and the position of the robot 40 in the coordinate system of the space information acquisition unit 20 match. In other words, it can be said that the control unit 11 updates the state of the robot 40 so that the position of the robot 40 becomes the first position in the image.
- the control unit 11 may generate a position that is a candidate for a calibration position different from the first position within the calibration range 60 .
- a position that is a candidate for the calibration position is also referred to as a second position.
- the second position is included in calibration range 60 .
- the control unit 11 estimates the state of the robot 40 when the robot 40 moves to the second position by simulating the motion of the robot 40 . That is, the control unit 11 calculates the state of the robot 40 assuming that the robot 40 moves to the second position. As a result, the controller 11 can determine whether the robot 40 can move to the second position.
- when the state of the robot 40 assumed to have moved to the second position is a state in which the robot 40 is not in contact with the object 50, the measurement object 52, or the like, the joints are within their ranges of motion, and the robot is not at a singular point, the control unit 11 registers the second position as a calibration position. When registering the second position as the calibration position, the control unit 11 generates, as a plurality of calibration items, information specifying the mark position and orientation based on the detection result of the sensor 48 of the robot 40 when the robot 40 is moved to the second position and the mark position and orientation based on the recognition result of the mark 46 of the robot 40.
- the control unit 11 may generate a new second position of a different position and determine whether the new second position can be registered as the calibration position.
- the control unit 11 may determine that the state of the robot 40 is not joint-restricted when the numerical value representing the angle of the joint of the robot 40 is within the range of motion.
- the control unit 11 may determine that the state of the robot 40 is the joint-restricted state when the numerical value representing the angle of the joint of the robot 40 is outside the range of motion.
- a singular point corresponds to a posture of the robot 40 where the robot 40 is structurally uncontrollable. If the trajectory for operating the robot 40 includes a singular point, the robot 40 moves (runs away) at high speed near the singular point and stops at the singular point.
- the singular points of the robot 40 are of the following three types: (1) points outside the work area, reached when controlling the robot 40 to near the outer limits of the work area (the work area is the area corresponding to the motion space of the robot 40); (2) points directly above or below the robot base, even within the work area; (3) points where the joint angle one before the tip joint of the arm 42 of the robot 40 is 0 degrees or 180 degrees (wrist alignment singular points).
- the control unit 11 may determine that the state of the robot 40 is the state of singularity when the numerical value representing the state of the robot 40 matches the numerical value representing the state of singularity.
- the control unit 11 may determine that the state of the robot 40 is the state of singularity when the difference between the numerical value representing the state of the robot 40 and the numerical value representing the state of singularity is less than a predetermined value.
- the numerical value representing the state of the robot 40 may include, for example, the angle of the joint of the arm 42 or the torque of the motor that drives the robot 40 .
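- The feasibility check for a candidate second position can be sketched as follows. Joint limits are tested directly; closeness to a singular point is approximated here by a manipulability measure derived from the manipulator Jacobian, which is one common stand-in for the "difference from the singular state" mentioned above (jacobian_fn is an assumed kinematics routine):

```python
import numpy as np

def is_valid_calibration_pose(joint_angles, joint_limits, jacobian_fn,
                              singularity_eps=1e-3):
    """Return True if the pose respects joint ranges of motion and is not
    close to a singular point (illustrative criteria only)."""
    for q, (lo, hi) in zip(joint_angles, joint_limits):
        if not lo <= q <= hi:                # joint-restricted state
            return False
    J = jacobian_fn(joint_angles)            # 6xN manipulator Jacobian
    manipulability = np.sqrt(np.linalg.det(J @ J.T))
    return manipulability > singularity_eps  # near zero => near a singularity
```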
- control unit 11 sets the calibration range 60 and sets the calibration positions of the first position and the second position within the calibration range 60 . Further, the control unit 11 can generate a calibration item as information specifying the mark position and orientation of the robot 40 when the robot 40 is moved to the calibration position.
- the control unit 11 performs the first calibration so that the calibration item regarding the recognition result of the mark 46 matches the calibration item regarding the detection result of the sensor 48 of the robot 40. Specifically, the controller 11 moves the robot 40 to the calibration position. The control unit 11 acquires, via the space information acquisition unit 20, the recognition result of the mark 46 of the robot 40 when the robot 40 moves to the calibration position. The control unit 11 calculates the relative positional relationship of the mark position/orientation calibration item acquired from the recognition result of the mark 46 with respect to the mark position/orientation calibration item based on the sensor 48 of the robot 40.
- the relative positional relationship corresponds to the coordinate difference and angle difference between the mark position and orientation specified by both calibration items and the recognition result of the mark 46 .
- the control unit 11 corrects the coordinate system of the spatial information acquisition unit 20 to match the coordinate system of the robot 40 so that the coordinate error and angle error corresponding to the relative positional relationship between the two calibration items become zero or close to zero (that is, less than a predetermined value). By doing so, the control unit 11 can match the recognition result of the mark 46 when the robot 40 moves to the calibration position with the mark position and orientation specified based on the sensor 48 of the robot 40.
- the control unit 11 can set the calibration position by generating a calibration item. Conversely, the calibration position corresponds to the position to which the robot 40 is moved to generate the calibration item. By applying the calibration item to the control of the robot 40, the controller 11 can move the robot 40 to the calibration position and perform calibration. Specifically, the control unit 11 performs the first calibration so as to correct the (X_CA, Y_CA, Z_CA) coordinate system of the spatial information acquisition unit 20 to match the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40. The control unit 11 may identify the relationship between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the robot 40 by executing the first calibration.
- control unit 11 may perform further calibration using a measurement object 52 such as a pin placed in the motion space of the robot 40 .
- the calibration performed using the measurement object 52 to improve the accuracy of the first calibration is also called second calibration.
- the control unit 11 recognizes the measurement object 52 based on the image of the spatial information acquisition unit 20 and acquires the position of the measurement object 52 .
- the control unit 11 may recognize the measurement object 52 by image recognition of an image of the measurement object 52 .
- the measurement object 52 may have a measurement object mark for recognizing the position of the measurement object 52 in an image.
- the measurement object mark may be configured identically to the mark 46 mounted on the robot 40; for this reason, the measurement object mark is also referred to as the mark 46 in the present disclosure.
- the control unit 11 may acquire the position of the measurement object 52 based on an image of the measurement object mark of the measurement object 52 . In this embodiment, the controller 11 presses the first finger 441 or the second finger 442 of the end effector 44 against the object 52 to be measured.
- the control unit 11 detects the load acting on the first finger 441 or the second finger 442 from the measurement object 52 using the force sensor 444 or the tactile sensor 445 .
- the control unit 11 calculates the position of the measurement object 52 based on the detection result of the force sensor 444 or the tactile sensor 445 .
- the control unit 11 may estimate, based on the calculation result of the position of the measurement object 52, whether or not the coordinate system that affects the operation of the robot 40 can be corrected, and may correct the coordinate system if the correction is possible. As will be described later, the second calibration may be performed by bringing the robot 40 into contact with the measurement object 52 by manual operation.
- the number of measurement objects 52 arranged in the motion space is not limited to one, and may be two or more. That is, one or more measurement objects 52 may be arranged in the operating space.
- the control unit 11 detects one or more measurement objects 52 appearing in the motion space image, and acquires the coordinates of each measurement object 52 in the coordinate system of the spatial information acquisition unit 20 . Also, the control unit 11 acquires the coordinates of each measurement object 52 based on the position information of the end effector 44 with respect to each measurement object 52 .
- based on the acquired coordinates, the control unit 11 may perform the second calibration by correcting the coordinate system of the spatial information acquisition unit 20 or the coordinate system of the robot 40.
- the coordinate system is corrected so that the depth information generated in the (X_CA, Y_CA, Z_CA) coordinate system by the spatial information acquisition unit 20 matches the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40.
- the control unit 11 may correct the coordinate system, for example, in the rotational direction or the translational direction.
- the control unit 11 may correct the coordinate system so as to expand or contract, for example.
- the control unit 11 may correct distortion of the coordinate system, for example.
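- Corrections in the rotational and translational directions together with expansion or contraction can be illustrated by fitting a similarity transform (Umeyama-style estimation) between the camera-recognized positions of the measurement objects 52 and their contact-measured positions. This is an assumed formulation; distortion correction would require a richer model than the one below:

```python
import numpy as np

def fit_similarity_transform(src: np.ndarray, dst: np.ndarray):
    """Scale s, rotation R, translation t with dst ≈ s * R @ src + t.

    src: (N, 3) object positions recognized in the camera coordinate system.
    dst: (N, 3) the same positions obtained by contact with the end effector 44.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    src0, dst0 = src - c_src, dst - c_dst
    H = src0.T @ dst0 / len(src)             # 3x3 cross-covariance
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # reflection guard
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    var_src = (src0 ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src   # isotropic scale (expansion/contraction)
    t = c_dst - s * R @ c_src
    return s, R, t
```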
- the measurement object 52 may be configured to be less deformable when the end effector 44 contacts it.
- the rigidity of the measurement object 52 may be increased so that the measurement object 52 does not deform due to contact with the end effector 44 .
- the stiffness of the measurement object 52 may be determined based on the magnitude and direction of loads acting on the end effector 44 and the measurement object 52 when the end effector 44 is brought into contact with the measurement object 52 .
- the measurement object 52 may be configured to have a portion smaller than the distance between the first finger 441 and the second finger 442 when the first finger 441 and the second finger 442 are in the widest open state. By doing so, the control unit 11 can pinch the measurement object 52 between the first finger 441 and the second finger 442 . Also, the measurement object 52 may be configured to have a size capable of mounting the measurement object mark appearing in the image of the spatial information acquisition unit 20 . Also, the measurement object 52 may be configured to have a size that allows the position to be recognized in the image of the spatial information acquisition unit 20 .
- the measurement object 52 may be placed in the space where the robot 40 performs work.
- the measurement object 52 is located within the calibration range 60, for example.
- the measurement object 52 may be configured so that the position contacted by the end effector 44 can be easily changed.
- the measurement object 52 may be configured to be adjustable in height in the Z_RB axis direction when placed on a work table 70 having an upper surface extending along the X_RB axis and the Y_RB axis.
- the measurement object 52 may be configured as a block that can be assembled so that it can be stacked along the Z_RB axis.
- the control unit 11 of the robot control device 10 may execute the robot control method including the procedure of the flowchart illustrated in FIG. 4 so as to execute the second calibration for improving the accuracy of the first calibration.
- the robot control method may be implemented as a robot control program that is executed by a processor that configures the control unit 11 .
- the robot control program may be stored on a non-transitory computer-readable medium.
- the control unit 11 recognizes the measurement object 52 based on the image of the spatial information acquisition unit 20 (step S1).
- the control unit 11 brings the end effector 44 of the robot 40 into contact with the measurement object 52 (step S2).
- the control unit 11 acquires the detection result of the force sensor 444 or the tactile sensor 445 when the end effector 44 contacts the measurement object 52, and detects the load (step S3).
- the end effector 44 of the robot 40 may come into contact with the measurement object 52 by manual operation. Therefore, the control unit 11 can acquire not only the detection result of the force sensor 444 or the tactile sensor 445 when the end effector 44 contacts the measurement object 52, but also the state of the robot 40 when the end effector 44 is brought into contact with the measurement object 52 by manual operation.
- the combination of the detection result of the force sensor 444 or the tactile sensor 445 when the end effector 44 is in contact with the measurement object 52 and the detection result of the state when the end effector 44 is brought into contact with the measurement object 52 by manual operation is also called contact information. The control unit 11 determines whether or not the end effector 44 has come into contact with all the measurement objects 52 and the loads have been detected (step S4). When the load has not been detected for all the measurement objects 52 (step S4: NO), the control unit 11 returns to step S1 and repeats steps S1 to S3. In this embodiment, the end effector 44 is brought into contact with the measurement object 52; however, when the position information can be acquired by other means such as a distance sensor, the end effector 44 does not need to be brought into contact with the measurement object 52.
- when the load is detected for all the measurement objects 52 (step S4: YES), the control unit 11 corrects the coordinate system of the spatial information acquisition unit 20 based on the load detection results (step S5). After executing the procedure of step S5, the control unit 11 ends the execution of the procedure of the flowchart of FIG. 4.
- the control unit 11 may correct the coordinates recognized as the position of the measurement object 52 in the coordinate system of the spatial information acquisition unit 20.
- the control unit 11 may execute the procedure of the flowchart illustrated in FIG. 4 after calibration based on the image of the spatial information acquisition unit 20, or after calibration using another method.
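- The flowchart procedure of steps S1 to S5 can be summarized as the following loop; every function name is a hypothetical wrapper around an operation described in the text, not an API from the disclosure:

```python
def second_calibration(objects, recognize, touch, position_from_load, correct_frame):
    """Sketch of steps S1-S5 of the second calibration."""
    measured = []
    for obj in objects:
        cam_pos = recognize(obj)              # S1: recognize from the image
        touch(obj)                            # S2: bring end effector 44 into contact
        contact_pos = position_from_load(obj) # S3: position from the detected load
        measured.append((cam_pos, contact_pos))
    # S4 is implicit: the loop ends once every measurement object 52 is contacted.
    correct_frame(measured)                   # S5: correct the camera coordinate system
```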
- the control unit 11 brings the end effector 44 into contact with the measurement object 52 in various ways, and obtains the position of the measurement object 52 based on the detection results of the force sensor 444 or the tactile sensor 445 .
- An example of a mode of bringing the end effector 44 into contact with the measurement object 52 will be described below.
- the position of the measurement object 52 may be, for example, the center coordinates of the measurement object mark (mark 46), the coordinates of the edge of the measurement object 52, or the like.
- the control unit 11 may bring the end effector 44 into contact with the measurement object 52 so that the measurement object 52 is sandwiched between the first finger 441 and the second finger 442 of the end effector 44, as illustrated. In this case, the inner side of the first finger 441 or the second finger 442 contacts the measurement object 52.
- the control unit 11 may calculate the position of the measurement object 52 based on the load acting on the first finger 441 or the second finger 442 from the measurement object 52, detected with the force sensor 444, and the position of the first finger 441 or the second finger 442.
- the control unit 11 may calculate the position of the measurement object 52 based on the detection result of the tactile sensor 445 installed inside the first finger 441 or the second finger 442 .
- when the control unit 11 controls the first finger 441 and the second finger 442 to be aligned along the X_RB axis and brings the inner side of the first finger 441 or the second finger 442 into contact with the measurement object 52, the position of the measurement object 52 along the X_RB axis can be calculated. That is, the control unit 11 can calculate the X_RB axis component of the coordinates of the measurement object 52.
- by controlling the first finger 441 and the second finger 442 to be aligned along the Y_RB axis and bringing the inner side of the first finger 441 or the second finger 442 into contact with the measurement object 52, the control unit 11 can calculate the Y_RB axis component of the coordinates of the measurement object 52.
- by controlling the first finger 441 and the second finger 442 to line up along the Z_RB axis and bringing the inner side of the first finger 441 or the second finger 442 into contact with the measurement object 52, the control unit 11 can calculate the Z_RB axis component of the coordinates of the measurement object 52.
- by controlling the first finger 441 and the second finger 442 to be aligned along each of the three axes and bringing the inner side of the first finger 441 or the second finger 442 into contact with the measurement object 52, the control unit 11 can calculate the three-dimensional coordinates of the measurement object 52.
- the three-dimensional coordinates of the measurement object 52 can also be calculated by controlling the first finger 441 and the second finger 442 to line up along three mutually independent directions and bringing the inner side of the first finger 441 or the second finger 442 into contact with the measurement object 52. In this case, for example, each component of the coordinates (X_RB, Y_RB, Z_RB) of the measurement object 52 can be taken as an intermediate value between the corresponding components of the coordinates of the first finger 441 and the second finger 442.
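- The "intermediate value" computation mentioned above reduces, per axis, to the midpoint of the two inner contact positions; a minimal sketch:

```python
def object_center_from_pinch(p_finger1, p_finger2):
    """Estimate the (X_RB, Y_RB, Z_RB) coordinates of the measurement object 52
    as the midpoint of the two inner contact positions (illustrative values)."""
    return tuple((a + b) / 2.0 for a, b in zip(p_finger1, p_finger2))

# Fingers contacting at x = 0.102 and x = 0.138 give x_obj = 0.120.
center = object_center_from_pinch((0.102, 0.300, 0.050), (0.138, 0.300, 0.050))
```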
- with the first finger 441 and the second finger 442 spread open, the control unit 11 may move the end effector 44 and bring a surface of the first finger 441 or the second finger 442 other than the inner side into contact with the measurement object 52.
- the control unit 11 can calculate the coordinates of the measurement object 52 by moving the end effector 44 along each of the three axes and bringing it into contact with the measurement object 52 .
- the control unit 11 may move the end effector 44 along three mutually independent directions to contact the measurement object 52 . In this case, for example, the end effector 44 having one suction nozzle can also be used.
- the control unit 11 moves the end effector 44 in the negative direction of the X_RB axis to bring the outer side of the first finger 441 (the negative direction side of the X_RB axis) into contact with the measurement object 52 .
- the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis direction based on the result of detecting the load acting on the first finger 441 by the force sensor 444 and the position of the first finger 441 .
- when the tactile sensor 445 is installed on the outer side of the first finger 441, the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441.
- considering the dimensions of the measurement object 52, the X_RB component of the coordinates of the measurement object 52 may be a value obtained by adding or subtracting half the width of the measurement object 52 to or from the X_RB component of the coordinates of the outer surface of the first finger 441.
- the control unit 11 moves the end effector 44 in the positive direction of the Y_RB axis to bring the side surface of the first finger 441 into contact with the measurement object 52.
- the control unit 11 can calculate the position of the measurement object 52 in the Y_RB axis direction based on the result of detecting the load acting on the first finger 441 with the force sensor 444 and the position of the first finger 441 .
- when the tactile sensor 445 is installed on the side surface of the first finger 441, the control unit 11 can calculate the position of the measurement object 52 in the Y_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441.
- considering the dimensions of the measurement object 52, the Y_RB component of the coordinates of the measurement object 52 may be a value obtained by adding or subtracting half the width of the measurement object 52 to or from the Y_RB component of the coordinates of the side surface of the first finger 441.
- the control unit 11 moves the end effector 44 in the negative direction of the Z_RB axis to bring the tip of the first finger 441 (the end on the negative direction side of the Z_RB axis) into contact with the surface of the measurement object 52 on the positive direction side of the Z_RB axis.
- the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the result of detecting the load acting on the first finger 441 with the force sensor 444 and the position of the first finger 441.
- alternatively, the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441.
- the Z_RB component of the coordinates of the measurement object 52 may be the same as the Z_RB component of the coordinates of the tip surface of the first finger 441.
- the control unit 11 may move the end effector 44 with the first finger 441 and the second finger 442 closed, and bring the first finger 441 or the second finger 442 into contact with the measurement object 52.
- the control unit 11 can calculate the coordinates of the measurement object 52 by moving the end effector 44 along each of the three axes and bringing it into contact with the measurement object 52.
- the control unit 11 may instead move the end effector 44 along three mutually independent directions to contact the measurement object 52. In the following example, contact is made from only one side of each of the three axes or directions, but contact may be made from both sides of each axis or direction.
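When contact is made from both sides of an axis, the center coordinate on that axis follows without knowing the object's width; a sketch reusing the hypothetical probe_surface helper from the earlier example:

```python
def center_from_both_sides(robot, axis, reposition):
    """Touch the object from the positive and the negative direction of one
    axis and take the midpoint, so the object's width need not be known.
    `reposition` is a hypothetical routine that lifts the finger clear and
    places it on the opposite side of the measurement object."""
    face_pos = probe_surface(robot, axis, direction=-1)  # approach from +side
    reposition(robot, axis)
    face_neg = probe_surface(robot, axis, direction=+1)  # approach from -side
    return 0.5 * (face_pos + face_neg)
```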
- the control unit 11 moves the end effector 44 in the negative direction of the X_RB axis to bring the outer surface of the first finger 441 (the surface on the negative direction side of the X_RB axis) into contact with the measurement object 52.
- the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis direction based on the result of detecting the load acting on the first finger 441 with the force sensor 444 and the position of the first finger 441.
- alternatively, the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441.
- taking the dimensions of the measurement object 52 into account, the X_RB component of the coordinates of the measurement object 52 may be a value obtained by adding or subtracting half the width of the measurement object 52 to or from the X_RB component of the coordinates of the outer surface of the first finger 441.
- the control unit 11 moves the end effector 44 in the positive direction of the Y_RB axis to bring the side surface of the first finger 441 or the second finger 442 (on the negative direction side of the X_RB axis) into contact with the measurement object 52.
- the control unit 11 can calculate the position of the measurement object 52 in the Y_RB axis direction based on the load acting on the first finger 441 or the second finger 442, detected by the force sensor 444, and the position of the first finger 441 or the second finger 442.
- when the tactile sensor 445 is installed on the side surface of the first finger 441 or the second finger 442, the control unit 11 can calculate the position of the measurement object 52 in the Y_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441 or the second finger 442.
- taking the dimensions of the measurement object 52 into account, the Y_RB component of the coordinates of the measurement object 52 may be a value obtained by adding or subtracting half the width of the measurement object 52 to or from the Y_RB component of the coordinates of the side surface of the finger.
- the control unit 11 moves the end effector 44 in the negative direction of the Z_RB axis to bring the tip of the first finger 441 or the second finger 442 (the end on the negative direction side of the Z_RB axis) into contact with the measurement object 52.
- the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the load acting on the first finger 441 or the second finger 442, detected by the force sensor 444, and the position of the first finger 441 or the second finger 442.
- alternatively, the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the detection result of the tactile sensor 445 and the position of the first finger 441 or the second finger 442.
- the Z_RB component of the coordinates of the measurement object 52 may be the same as the Z_RB component of the coordinates of the tip surface of the first finger 441.
- the control unit 11 may move the end effector 44 in the X_RB axis direction or the Y_RB axis direction while the first finger 441 or the second finger 442 is touching the surface of the measurement object 52 on the positive direction side of the Z_RB axis. That is, the control unit 11 may move the end effector 44 in the planar direction of the upper surface while the end effector 44 is in contact with the upper surface of the measurement object 52.
- in this way, the control unit 11 can collectively calculate the Z_RB axis coordinate and the X_RB axis or Y_RB axis coordinate of the measurement object 52.
- the measurement object 52 may be arranged in the motion space such that the surface (upper surface) of the measurement object 52 on the positive direction side of the Z_RB axis appears in the motion space image.
- the control unit 11 brings the first finger 441 or the second finger 442 into contact with the surface of the measurement object 52 on the positive direction side of the Z_RB axis.
- the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the detection result of the force sensor 444 or the tactile sensor 445 and the position of the first finger 441 or the second finger 442.
- the control unit 11 then moves the first finger 441 or the second finger 442 along the X_RB axis or the Y_RB axis.
- at some point, the first finger 441 or the second finger 442 moves off the surface of the measurement object 52 on the positive direction side of the Z_RB axis; that is, the first finger 441 or the second finger 442 no longer contacts that surface.
- the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis direction or the Y_RB axis direction based on the detection result that the contact between the first finger 441 or the second finger 442 and the surface of the measurement object 52 on the positive direction side of the Z_RB axis has been lost.
- the control unit 11 may detect a change in the force acting on the first finger 441 or the second finger 442 in the Z_RB axis direction based on the detection result of the force sensor 444 or the tactile sensor 445.
- the control unit 11 can calculate the position of the measurement object 52 in the X_RB axis or Y_RB axis direction based on this change in force in the Z_RB axis direction. Assume, for example, that the control unit 11 moves the first finger 441 or the second finger 442 in the X_RB axis direction.
- when the load acting on the first finger 441 or the second finger 442 in the Z_RB axis direction decreases by a predetermined value or more, the control unit 11 may determine that the first finger 441 or the second finger 442 has reached the edge of the measurement object 52 in the X_RB axis direction. The control unit 11 may calculate the position at which this determination is made as the position of the measurement object 52 in the X_RB axis direction.
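One way to picture this edge detection is a drag loop that watches for the Z_RB-direction load to drop by more than a set amount; as before, the driver interface and the numeric thresholds are hypothetical placeholders:

```python
def find_edge_by_load_drop(robot, axis, step=0.5e-3,
                           drop_threshold=1.0, max_steps=4000):
    """Drag the finger along `axis` (0 = X_RB, 1 = Y_RB) while it presses on
    the object's top surface; when the Z_RB-direction load falls by more
    than drop_threshold relative to the pressed baseline, report the current
    coordinate as the object's edge on that axis."""
    baseline = robot.read_force()[2]        # Z_RB load while pressed down
    delta = [0.0, 0.0, 0.0]
    delta[axis] = step
    for _ in range(max_steps):
        robot.move_by(delta)
        if baseline - robot.read_force()[2] >= drop_threshold:
            return robot.get_position()[axis]  # finger just left the surface
    raise RuntimeError("edge not found within travel range")
```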
- the control unit 11 moves the end effector 44 in the positive direction of the X_RB axis while the first finger 441 or the second finger 442 is in contact with the surface of the measurement object 52 on the positive direction side of the Z_RB axis.
- the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the load acting on the first finger 441 or the second finger 442, detected by the force sensor 444 or the tactile sensor 445, and the position of the first finger 441 or the second finger 442.
- the control unit 11 can also calculate the position of the measurement object 52 in the X_RB axis direction based on the detection result that the first finger 441 or the second finger 442 has lost contact with the measurement object 52 and the position of the first finger 441 or the second finger 442.
- the control unit 11 moves the end effector 44 in the positive direction of the Y_RB axis while the first finger 441 or the second finger 442 is in contact with the surface of the measurement object 52 on the positive direction side of the Z_RB axis.
- the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the load acting on the first finger 441 or the second finger 442, detected by the force sensor 444 or the tactile sensor 445, and the position of the first finger 441 or the second finger 442.
- the control unit 11 can also calculate the position of the measurement object 52 in the Y_RB axis direction based on the detection result that the first finger 441 or the second finger 442 has lost contact with the measurement object 52 and the position of the first finger 441 or the second finger 442.
- the control unit 11 may move the end effector 44 simultaneously in the positive direction of the X_RB axis and the positive direction of the Y_RB axis while the first finger 441 or the second finger 442 is in contact with the surface of the measurement object 52 on the positive direction side of the Z_RB axis. That is, the control unit 11 moves the first finger 441 or the second finger 442 in a direction oblique to both the X_RB axis and the Y_RB axis. In this case, the control unit 11 can calculate the position of the measurement object 52 in the Z_RB axis direction based on the load acting on the first finger 441 or the second finger 442, detected by the force sensor 444 or the tactile sensor 445, and the position of the first finger 441 or the second finger 442. Further, the control unit 11 can calculate the positions of the measurement object 52 in the X_RB axis direction and the Y_RB axis direction based on the detection result that the first finger 441 or the second finger 442 has lost contact with the measurement object 52 and the position of the first finger 441 or the second finger 442.
- in FIGS. 7A, 7B and 7C, the first finger 441 and the second finger 442 are shown closed, but they may be in either a closed or an open state.
- the control unit 11 may rotate the end effector 44 so that the first finger 441 and the second finger 442 move along the circumference of the measurement object 52 while holding the measurement object 52 between them.
- the shape of the measurement object 52 may be a cylindrical shape with a circular upper surface.
- the control unit 11 may move at least one of the first finger 441 and the second finger 442 so as to approach the other finger, bring the first finger 441 and the second finger 442 into contact with the measurement object 52, and control the robot 40 to rotate around the measurement object 52 while holding the measurement object 52.
- the control unit 11 may rotate the end effector 44 based on the detection results of the tactile sensors 445 installed on the inner surfaces of the first finger 441 and the second finger 442.
- the control unit 11 can calculate the center coordinates of the measurement object 52 in the X_RB axis direction and the Y_RB axis direction based on the locus of movement of the first finger 441 or the second finger 442.
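One illustrative way to turn such a locus of finger positions into center coordinates — a sketch only, as the present disclosure does not fix the computation — is an algebraic least-squares circle fit (Kåsa method) over the recorded (X_RB, Y_RB) contact points:

```python
import numpy as np

def fit_circle_center(points_xy):
    """Least-squares (Kasa) circle fit: given an (N, 2) array of X_RB/Y_RB
    contact positions recorded while rotating around the object, solve
    x^2 + y^2 = 2*a*x + 2*b*y + c for the center (a, b) and radius."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx ** 2 + cy ** 2)
    return (cx, cy), radius

# e.g. three points on a unit-radius circle centered at (0.4, 0.1):
pts = np.array([[1.4, 0.1], [0.4, 1.1], [-0.6, 0.1]])
print(fit_circle_center(pts))  # -> ((0.4, 0.1), 1.0)
```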
- the shape of the measurement object 52 is not limited to a cylinder, and may be a prism shape having a polygonal upper surface.
- the shape of the measurement object 52 is not limited to these and may be other various shapes.
- the measurement object 52 may be placed in the motion space such that the upper surface of the measurement object 52 is shown in the motion space image.
- the control unit 11 may control the robot 40 so that the end effector 44 contacts at least one surface of the measurement object 52.
- in this way, the control unit 11 can improve the detection accuracy of the position of the measurement object 52.
- in the examples above, the coordinates of the measurement object 52 are obtained based on the end effector 44 being in contact with the measurement object 52; however, if the positional relationship between the end effector 44 and the measurement object 52 is known, the end effector 44 does not have to contact the measurement object 52.
- for example, if the first finger 441 of the end effector 44 is equipped with a distance sensor capable of non-contact measurement, the coordinates of the measurement object 52 can be calculated in the same manner as described above. Each component of the coordinates of the measurement object 52 may also be calculated based on different sensors.
- the control unit 11 may control the robot 40 so that the end effector 44 contacts the surface of the measurement object 52 that appears in the motion space image. By doing so, the control unit 11 can easily correct the position of the measurement object 52 obtained from the spatial information acquisition unit 20 using the position of the measurement object 52 obtained by contact. As a result, the precision of the coordinate system of the spatial information acquisition unit 20 can be improved.
- the description above covers the first calibration and the second calibration of the robot 40 configured to grip the object 50 with the first finger 441 and the second finger 442.
- however, the second calibration of the present embodiment is not limited to the robot 40 having two fingers; it may be applied to a robot 40 having three or more fingers, or to a robot 40 having another type of holding portion. The second calibration of the present embodiment may also be applied to a robot 40 having, in addition to the holding portion, a jig such as a rod for measurement.
- as described above, according to the robot control device 10 and the robot control method of the present embodiment, the robot 40 is brought into contact with the measurement object 52 placed in the motion space and the load is detected, whereby the coordinates of the measurement object 52 are detected.
- in this way, the coordinates of the measurement object 52 based on the image of the spatial information acquisition unit 20 can be corrected to match the coordinates of the measurement object 52 based on the contact of the robot 40.
- the coordinate system of the spatial information acquisition unit 20 can thus be corrected in accordance with the coordinate system of the robot 40. As a result, the calibration accuracy of the robot 40 can be improved.
- furthermore, the accuracy of calibration can be ensured even if the calculation accuracy of the depth information in the spatial information acquisition unit 20 is lowered. As a result, the cost of the spatial information acquisition unit 20 can be reduced. In addition, the influence on calibration accuracy of changing the configuration or the arrangement of the spatial information acquisition unit 20 can be reduced.
- since the coordinates or the coordinate system can be corrected through the contact of the robot 40 with the measurement object 52, the accuracy of calibration can be ensured without visual confirmation by an operator. As a result, workload and operating costs can be reduced, and automation of calibration is facilitated. The coordinates or the coordinate system can also be corrected even if the workspace of the robot 40 is not of uniform height.
- in the present embodiment, the robot 40 is installed on the workbench 70, but the robot 40 may be installed on a support stand other than the workbench 70. Even if the robot 40 is positioned on a support stand different from the workbench 70, the work object and the like are placed on the workbench 70, so the coordinate system of the motion space may be taken as the coordinate system of the workbench 70. Therefore, by performing the calibration of the present disclosure, accurate work can be performed even if the robot 40 is not installed on the workbench 70.
- in the embodiments described above, the control unit 11 of the robot control device 10 first performs the first calibration between the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20, and then performs the second calibration based on the measurement object 52.
- however, the control unit 11 of the robot control device 10 may perform the second calibration first.
- in that case, the control unit 11 of the robot control device 10 may acquire the coordinates of the measurement object 52 in the coordinate system of the robot 40 in the same manner as described above. The control unit 11 may then perform calibration by correcting the robot coordinates so that the acquired coordinates of the measurement object 52 in the coordinate system of the robot 40 become the origin of the coordinate system of the robot 40. Alternatively, for example, the control unit 11 may acquire the coordinates (X, Y, Z) of a specific location in the motion space through user input or the like, and perform calibration by correcting the coordinate system of the robot 40 to the coordinate system of the motion space based on the input motion space coordinates. Note that the robot 40 may be brought into contact with the measurement object 52 manually, or may be moved manually to the vicinity of the measurement object 52 and then moved in a certain direction by the control unit 11 so that the robot 40 comes into contact with the measurement object 52.
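A minimal sketch of the origin-shift variant described above, assuming the contact-measured coordinates of the measurement object 52 are available as a 3-vector (the numeric values are illustrative only):

```python
import numpy as np

def rebase_robot_frame(p_object_rb):
    """Return a function that re-expresses robot coordinates in a frame
    whose origin is the contact-measured measurement-object position."""
    origin = np.asarray(p_object_rb, dtype=float)
    return lambda p: np.asarray(p, dtype=float) - origin

to_object_frame = rebase_robot_frame([0.400, 0.100, 0.055])
print(to_object_frame([0.450, 0.100, 0.155]))  # -> [0.05 0.   0.1 ]
```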
- the robot control device 10 may acquire the input from the user via the interface 13 .
- the robot control system 1 may further include a terminal device having a user interface that is connected to the interface 13 of the robot control device 10 by wire or wirelessly.
- the user interface inputs information from the user and outputs information to the user.
- the user interface includes, for example, a touch sensor.
- the touch sensor detects contact with a user's finger, stylus pen, or the like, and identifies the contact position.
- the touch sensor may be integrated with the display to form a touch panel display.
- the control unit 11 may then complete the calibration and start the work of the robot 40. Further, when the robot control system 1 includes a camera as the spatial information acquisition unit 20, the control unit 11 may determine a conversion formula between the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20 by performing calibration between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the motion space, or between the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20. Note that when the positions of the robot 40 and the spatial information acquisition unit 20 are fixed from the beginning, the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20 can be converted into each other without calibration, so calibration need not be performed.
- the measurement object 52 may be configured to have the mark 46.
- the measurement object 52 may be configured to have a mark 46 that functions as a measurement object mark.
- the measurement object 52 may be configured to have a measurement object mark and the mark 46 separately.
- the measurement object 52 may have, for example, an isotropic planar shape when the measurement object 52 is viewed from above, and the upper surface may be flat.
- the measurement object 52 is, for example, a cube, a rectangular parallelepiped, a quadrangular prism, or a polygonal prism such as a triangular prism.
- when the measurement object 52 has a top surface including corners or straight sides, such as that of a polygonal prism, the presence of the corners or sides can improve the accuracy of determining the coordinate position when the end effector 44 is moved along the top surface of the measurement object 52. In addition, the space occupied in the motion space by the jigs for performing the first calibration and the second calibration can be reduced.
- the measurement object 52 may be configured to have the mark 46 and further have a portion that is contacted by the end effector 44 of the robot 40 .
- the upper surface of the measurement object 52 can be used for the second calibration.
- the accuracy of the second calibration can be enhanced.
- the measurement object 52 may be configured to have no marks 46 . If the robot control system 1 does not have a camera as the spatial information acquisition unit 20, the second calibration can be performed by manually bringing the robot 40 closer to the measurement object 52 with the configuration illustrated in FIG. 9C.
- the measurement object 52 may be cylindrical and configured to have a mark 46 on the top surface of the cylindrical shape. By doing so, the accuracy of the second calibration can be enhanced.
- the measurement objects 52 may be arranged at diagonal positions on the upper surface of the workbench 70. In this way, any tilt or distortion of the upper surface of the workbench 70 can be corrected in the coordinate system of the motion space of the robot 40.
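To make the tilt correction concrete, one illustrative approach — an assumption of this sketch rather than a prescribed method — is to probe the workbench-top height by contact at three or more measurement-object positions and fit a plane whose normal indicates the tilt to compensate:

```python
import numpy as np

def fit_table_plane(points_rb):
    """Fit z = a*x + b*y + c to N >= 3 contact-measured (X_RB, Y_RB, Z_RB)
    points on the workbench top; returns (a, b, c). The unit normal of the
    fitted plane, (-a, -b, 1)/norm, indicates the table's tilt."""
    pts = np.asarray(points_rb, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # (a, b, c)

# e.g. three probed positions on a slightly tilted table
pts = [(0.0, 0.0, 0.000), (0.5, 0.0, 0.005), (0.0, 0.4, -0.002)]
a, b, c = fit_table_plane(pts)
print(a, b, c)  # -> 0.01 -0.005 0.0
```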
- embodiments according to the present disclosure are not limited to the specific configurations of the embodiments described above. Embodiments of the present disclosure can extend to any novel feature described in the present disclosure, or any combination thereof, and to any novel method or process step described, or any combination thereof.
- in the present disclosure, expressions such as "first" and "second" are identifiers for distinguishing configurations.
- configurations distinguished by expressions such as "first" and "second" in the present disclosure may have their numbers exchanged.
- for example, the first calibration can exchange the identifiers "first" and "second" with the second calibration. The exchange of identifiers is performed simultaneously.
- the configurations remain distinct after the exchange of identifiers.
- identifiers may be deleted; configurations whose identifiers have been deleted are distinguished by reference signs.
- the identifiers such as "first" and "second" in the present disclosure are not to be used as a basis for interpreting the order of the configurations or as grounds for the existence of identifiers with smaller numbers.
- 1: robot control system
- 10: robot control device (11: control unit, 12: storage unit, 13: interface)
- 20: spatial information acquisition unit
- 40: robot (410: control unit, 42: arm, 44: end effector, 440: control unit, 441: first finger, 442: second finger, 443: drive unit, 444: force sensor, 445: tactile sensor, 46: mark, 48: sensor, 49: interface)
- 50: object
- 52: measurement object (54: contactable range)
- 60: calibration range
- 70: workbench
Description
As illustrated in FIGS. 1 and 2, a robot control system 1 according to one embodiment includes a robot 40, a robot control device 10, and a spatial information acquisition unit 20. The robot 40 operates in a predetermined motion space. The spatial information acquisition unit 20 photographs the motion space in which the robot 40 operates and generates depth information of the motion space. As described later, the spatial information acquisition unit 20 calculates the distance to measurement points located on the surface of an object 50 present in the motion space. The distance from the spatial information acquisition unit 20 to a measurement point is also referred to as depth. The depth information is information on the depth measured for each measurement point; in other words, it is information on the distances to the measurement points located on the surface of the object 50 present in the motion space. The depth information may be expressed as a depth map that associates each direction viewed from the spatial information acquisition unit 20 with the depth in that direction. The spatial information acquisition unit 20 generates the depth information of the motion space based on the (X_CA, Y_CA, Z_CA) coordinate system. The spatial information acquisition unit 20 may also generate an image of the motion space; an image obtained by photographing the motion space is also referred to as a motion space image.
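As a minimal illustration of the depth-map representation described above, the following Python sketch converts a depth map into 3D points in the (X_CA, Y_CA, Z_CA) coordinate system; the pinhole-camera intrinsics fx, fy, cx, cy are hypothetical parameters assumed for this sketch and are not specified in the present disclosure.

```python
import numpy as np

def depth_map_to_points(depth, fx, fy, cx, cy):
    """Convert a depth map (H x W array of depths, in meters) to an (N, 3)
    array of (X_CA, Y_CA, Z_CA) points, assuming a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth reading
```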
The robot control device 10 includes a control unit 11, a storage unit 12, and an interface 13. The interface 13 is also referred to as the I/F 13.
As illustrated in FIG. 1, the robot 40 includes a sensor 48. As illustrated in FIG. 2, the robot 40 includes an arm 42, an end effector 44 attached to the arm 42, and a mark 46 placed on the end effector 44. The robot 40 further includes an interface 49, although this is not essential. The interface 49 is also referred to as the I/F 49. The mark 46 may be placed on the arm 42 instead of the end effector 44.
The spatial information acquisition unit 20 acquires spatial information on the motion space of the robot 40. The spatial information acquisition unit 20 may photograph the motion space and acquire an image of the motion space as the spatial information. As illustrated in FIG. 2, the spatial information acquisition unit 20 may photograph work objects such as the object 50 or the measurement object 52 present in the motion space. The spatial information acquisition unit 20 may be configured as a camera. A 3D stereo camera photographs the object 50 present in the motion space, calculates the distance to measurement points located on the surface of the object 50 as depth, and generates depth information. The spatial information acquisition unit 20 may be configured as a 3D stereo camera. The spatial information acquisition unit 20 may also be configured as LiDAR (light detection and ranging). LiDAR measures the distance to measurement points located on the surface of the object 50 present in the motion space and generates depth information. That is, the spatial information acquisition unit 20 may acquire depth information of the motion space as the spatial information. The spatial information acquisition unit 20 is not limited to these and may be configured as various other devices, and it may acquire, as the spatial information, not only an image or depth information of the motion space but also various other kinds of information. The spatial information acquisition unit 20 may include an image sensor and may further include an optical system. The spatial information acquisition unit 20 may output an image of the motion space to the robot control device 10. The spatial information acquisition unit 20 may generate depth information on the motion space of the robot 40 and output it to the robot control device 10, or may generate point cloud information on the motion space of the robot 40 and output it to the robot control device 10. That is, the spatial information may be output in the form of point cloud data; in other words, the point cloud information may contain the spatial information. The point cloud information is information on the set of measurement points located on the surface of the object 50 present in the motion space, and includes the coordinate information or color information of each measurement point. The point cloud information can also be said to be data representing the object 50 in the measurement space as a plurality of points. By providing the spatial information in the form of point cloud data, the data density can be made smaller than that of spatial information based on the initial data acquired by the spatial information acquisition unit 20.
The robot control device 10 operates the robot 40 so as to act on work objects such as the object 50 or the measurement object 52 present in the motion space, or so as to avoid the object 50 or the measurement object 52. The robot control device 10 operates the robot 40 so as to act on, or avoid, work objects such as the object 50 or the measurement object 52 based on captured images of them taken by the spatial information acquisition unit 20.
The control unit 11 of the robot control device 10 can acquire the state of the robot 40 based on the position and orientation of the mark 46 shown in the image captured by the spatial information acquisition unit 20, and can acquire the positional relationship between the robot 40 and the object 50 or the measurement object 52. Meanwhile, the control unit 11 acquires the state of the robot 40 based on the sensor 48 of the robot 40. The state based on the sensor 48 of the robot 40 represents the position and orientation of the robot 40 with higher accuracy than the state based on the image captured by the spatial information acquisition unit 20. Therefore, by matching the state of the robot 40 based on the captured image of the spatial information acquisition unit 20 to the state of the robot 40 based on the sensor 48 of the robot 40, the control unit 11 can control the robot 40 with high accuracy in the motion space. The operation of matching the state of the robot 40 based on spatial information, including images captured by the spatial information acquisition unit 20, to the state of the robot 40 based on the sensor 48 of the robot 40 is also referred to as first calibration. The control unit 11 executes the first calibration at least once. Specifically, the control unit 11 executes the first calibration so that the depth information generated by the spatial information acquisition unit 20 in the (X_CA, Y_CA, Z_CA) coordinate system matches the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40. In other words, by executing the first calibration, the control unit 11 becomes able to transform the coordinate system of the spatial information acquisition unit 20 into the coordinate system of the robot 40. The control unit 11 may estimate the relative positional relationship between the coordinate system of the spatial information acquisition unit 20 and the coordinate system of the robot 40, and align the coordinate system of the spatial information acquisition unit 20 with the coordinate system of the robot 40 based on the estimated relative positional relationship.
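One illustrative way to estimate such a relative positional relationship — offered as a sketch under the assumption that corresponding point pairs observed in the (X_CA, Y_CA, Z_CA) and (X_RB, Y_RB, Z_RB) systems are available, not as the method prescribed by the present disclosure — is a least-squares rigid-transform fit (the Kabsch method):

```python
import numpy as np

def fit_rigid_transform(p_cam, p_rb):
    """Least-squares rotation R and translation t such that
    p_rb ~= R @ p_cam + t, from N >= 3 corresponding points, shape (N, 3)."""
    c_cam = p_cam.mean(axis=0)
    c_rb = p_rb.mean(axis=0)
    H = (p_cam - c_cam).T @ (p_rb - c_rb)    # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = c_rb - R @ c_cam
    return R, t
```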
The control unit 11 sets a calibration range 60 in advance before executing the first calibration. The control unit 11 also sets calibration positions included in the calibration range 60; that is, the control unit 11 sets the calibration positions within the calibration range 60.
(1) Points outside the work area when the robot 40 is controlled to near the outer limit of the work area. (The work area is the area corresponding to the motion space of the robot 40.)
(2) Points directly above and directly below the robot base when the robot 40 is controlled there, even within the work area.
(3) Points at which the joint angle of the joint immediately preceding the joint at the tip of the arm 42 of the robot 40 becomes 0 or 180 degrees (wrist alignment singular points).
The control unit 11 executes the first calibration so that the calibration item of the tip position and orientation based on the recognition result of the mark 46 matches the calibration item of the mark position and orientation based on the detection result of the sensor 48 of the robot 40. Specifically, the control unit 11 moves the robot 40 to a calibration position. The control unit 11 acquires, through the spatial information acquisition unit 20, the recognition result of the mark 46 of the robot 40 when the robot 40 has moved to the calibration position. The control unit 11 calculates the relative positional relationship of the calibration item of the mark position and orientation acquired as the recognition result of the mark 46 with respect to the calibration item of the mark position and orientation based on the sensor 48 of the robot 40. The relative positional relationship corresponds to the difference in coordinates and the difference in angle between the mark positions and orientations specified by the two calibration items. The control unit 11 corrects the coordinate system of the spatial information acquisition unit 20 to match the coordinate system of the robot 40 so that the coordinate error and the angle error corresponding to the relative positional relationship between the two calibration items become zero or close to zero (that is, so that the errors fall below predetermined values). In this way, the control unit 11 can calculate the relative positional relationship by matching the recognition result of the mark 46 when the robot 40 has moved to the calibration position with the mark position and orientation specified by the sensor 48 of the robot 40.
The number of measurement objects 52 placed in the motion space is not limited to one and may be two or more; that is, one or more measurement objects 52 may be placed in the motion space. The control unit 11 detects the one or more measurement objects 52 shown in the motion space image and acquires the coordinates of each measurement object 52 in the coordinate system of the spatial information acquisition unit 20. The control unit 11 also acquires the coordinates of each measurement object 52 based on the position information of the end effector 44 with respect to each measurement object 52. Specifically, in the present embodiment, based on the detection result of the load acting on the end effector 44 brought into contact with the measurement object, the control unit 11 acquires the coordinates of each measurement object 52 in the coordinate system of the robot 40, treating the position of the end effector 44 with respect to each measurement object 52 as zero or as a constant distance while the load is being detected. The control unit 11 may perform the second calibration by correcting the coordinate system of the spatial information acquisition unit 20 or the coordinate system of the robot 40 based on the coordinates of each measurement object 52 in the coordinate system of the spatial information acquisition unit 20 and the coordinates of each measurement object 52 in the coordinate system of the robot 40. Specifically, for example, the coordinate system may be corrected so that the depth information generated by the spatial information acquisition unit 20 in the (X_CA, Y_CA, Z_CA) coordinate system matches the (X_RB, Y_RB, Z_RB) coordinate system of the robot 40. The control unit 11 may correct the coordinate system in, for example, the rotational direction or the translational direction. The control unit 11 may correct the coordinate system so as to, for example, enlarge or reduce it, and may also correct, for example, distortion of the coordinate system.
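Because the correction described above may involve rotation, translation, and enlargement or reduction, one illustrative computation (a sketch only; the present disclosure does not prescribe a specific algorithm) is the Umeyama similarity fit, which adds a scale factor to the rigid fit sketched earlier:

```python
import numpy as np

def fit_similarity_transform(p_cam, p_rb):
    """Scale s, rotation R, translation t with p_rb ~= s * R @ p_cam + t
    (Umeyama, 1991), from N >= 3 corresponding measurement-object coords."""
    c_cam, c_rb = p_cam.mean(axis=0), p_rb.mean(axis=0)
    q_cam, q_rb = p_cam - c_cam, p_rb - c_rb
    H = q_cam.T @ q_rb / len(p_cam)           # cross-covariance
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # reflection guard
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    var_cam = (q_cam ** 2).sum() / len(p_cam)
    s = np.trace(np.diag(S) @ D) / var_cam    # optimal isotropic scale
    t = c_rb - s * R @ c_cam
    return s, R, t
```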
The measurement object 52 may be configured so that it does not deform easily when the end effector 44 contacts it. For example, the rigidity of the measurement object 52 may be increased so that the measurement object 52 is not deformed by the contact of the end effector 44. The rigidity of the measurement object 52 may be determined based on the magnitude and direction of the load acting on each of the end effector 44 and the measurement object 52 when the end effector 44 is brought into contact with the measurement object 52.
The control unit 11 of the robot control device 10 may execute a robot control method including the procedure of the flowchart illustrated in FIG. 4 so as to execute the second calibration for increasing the accuracy of the first calibration. The robot control method may be implemented as a robot control program to be executed by the processor constituting the control unit 11. The robot control program may be stored in a non-transitory computer-readable medium.
The control unit 11 brings the end effector 44 into contact with the measurement object 52 in various manners and acquires the position of the measurement object 52 based on the detection results of the force sensor 444 or the tactile sensor 445. Examples of manners of bringing the end effector 44 into contact with the measurement object 52 are described above. In this case, the position of the measurement object 52 may be, for example, the center coordinates of the measurement object mark (mark 46) or the coordinates of an edge of the measurement object 52.
As described above, according to the robot control device 10 and the robot control method according to the present embodiment, the coordinates of the measurement object 52 are detected by bringing the robot 40 into contact with the measurement object 52 placed in the motion space and detecting the load. In this way, the coordinates of the measurement object 52 based on the image of the spatial information acquisition unit 20 can be corrected to match the coordinates of the measurement object 52 based on the contact of the robot 40. Further, the coordinate system of the spatial information acquisition unit 20 can be corrected to match the coordinate system of the robot 40. As a result, the calibration accuracy of the robot 40 can be improved.
Other embodiments are described below.
In the embodiments described above, the control unit 11 of the robot control device 10 first performs the first calibration between the coordinate system of the robot 40 and the coordinate system of the spatial information acquisition unit 20, and then executes the second calibration based on the measurement object 52. However, as another embodiment of the present disclosure, the control unit 11 of the robot control device 10 may perform the second calibration first.
As illustrated in FIG. 9A, the measurement object 52 may be configured to have the mark 46. The measurement object 52 may be configured to have a mark 46 that functions as a measurement object mark. The measurement object 52 may be configured to have a measurement object mark and the mark 46 separately.
As illustrated in FIG. 10, the measurement objects 52 may be arranged at diagonal positions on the upper surface of the workbench 70. In this way, even if the upper surface of the workbench 70 is tilted or distorted, this can be corrected in the coordinate system of the motion space of the robot 40.
10 robot control device (11: control unit, 12: storage unit, 13: interface)
20 spatial information acquisition unit
40 robot (410: control unit, 42: arm, 44: end effector, 440: control unit, 441: first finger, 442: second finger, 443: drive unit, 444: force sensor, 445: tactile sensor, 46: mark, 48: sensor, 49: interface)
50 object
52 measurement object (54: contactable range)
60 calibration range
70 workbench
Claims (14)
- 1. A robot control device comprising a control unit that controls a robot having a sensor and an end effector, wherein the control unit acquires position information of the end effector with respect to a measurement object placed in a motion space of the robot, and corrects a coordinate system relating to motion of the robot based on the position information.
- 2. The robot control device according to claim 1, wherein the control unit acquires coordinates of the measurement object based on spatial information on the motion space, controls the robot so as to move the end effector to the measurement object based on the coordinates of the measurement object, and corrects the coordinate system relating to the motion of the robot based on the position information of the end effector with respect to the measurement object.
- 3. The robot control device according to claim 2, wherein the measurement object includes a measurement object mark, and the control unit calculates the coordinates of the measurement object based on spatial information on the measurement object mark.
- 4. The robot control device according to any one of claims 1 to 3, wherein the control unit corrects the coordinate system relating to the motion of the robot based on position information of the end effector with respect to a plurality of the measurement objects.
- 5. The robot control device according to any one of claims 1 to 4, wherein the control unit acquires, as the position information, contact information based on contact of the end effector with the measurement object.
- 6. The robot control device according to any one of claims 1 to 5, wherein the control unit corrects a coordinate system based on spatial information on the motion space of the robot, based on information indicating that loss of contact between the end effector and the measurement object has been detected while the end effector is moved in contact with the measurement object.
- 7. The robot control device according to claim 6, wherein the control unit moves the end effector in the planar direction of the upper surface of the measurement object while the end effector is in contact with the measurement object at the upper surface.
- 8. The robot control device according to any one of claims 1 to 4, wherein the end effector includes a holding portion, and the control unit brings the end effector into contact with the measurement object by controlling the robot so as to move the holding portion.
- 9. The robot control device according to claim 8, wherein, when the holding portion includes at least two fingers, the control unit moves at least one of the fingers so as to approach another of the fingers, and controls the robot so as to rotate around the measurement object while the fingers are in contact with the measurement object.
- 10. The robot control device according to any one of claims 1 to 9, wherein the control unit controls the robot so that the end effector contacts at least one surface of the measurement object.
- 11. The robot control device according to claim 10, wherein the control unit controls the robot so that the end effector contacts a surface of the measurement object that is shown in a motion space image included in spatial information on the motion space of the robot.
- 12. The robot control device according to any one of claims 1 to 10, wherein the shape of the measurement object is a cylindrical shape having a circular upper surface or a prismatic shape having a polygonal upper surface, and the measurement object is placed in the motion space so that the upper surface of the measurement object is shown in a motion space image included in spatial information on the motion space of the robot.
- 13. A robot control system comprising the robot control device according to any one of claims 1 to 12 and the robot.
- 14. A robot control method for controlling a robot having a sensor and an end effector, the method comprising: acquiring position information of the end effector with respect to a measurement object placed in a motion space of the robot; and correcting a coordinate system relating to motion of the robot based on the position information.
Priority Applications (3)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202280053851.5A (CN117769482A) | 2021-08-03 | 2022-08-03 | Robot control device, robot control system, and robot control method |
| EP22853117.4A (EP4382259A1) | 2021-08-03 | 2022-08-03 | Robot control device, robot control system, and robot control method |
| JP2023540395A (JPWO2023013699A1) | 2021-08-03 | 2022-08-03 | |
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021-127725 | 2021-08-03 | | |
| JP2021127725 | | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2023013699A1 (ja) | 2023-02-09 |
Family
ID=85156013
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2022/029853 (WO2023013699A1) | Robot control device, robot control system, and robot control method | 2021-08-03 | 2022-08-03 |
Country Status (4)

| Country | Link |
|---|---|
| EP (1) | EP4382259A1 (ja) |
| JP (1) | JPWO2023013699A1 (ja) |
| CN (1) | CN117769482A (zh) |
| WO (1) | WO2023013699A1 (ja) |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH05301183 (ja) | 1992-04-28 | 1993-11-16 | Fujitsu Ltd | Robot control device and robot control method |
| JP2012011531A (ja) * | 2010-07-05 | 2012-01-19 | Yaskawa Electric Corp | Robot device and gripping method using robot device |
| JP2015085458A (ja) * | 2013-10-31 | 2015-05-07 | Seiko Epson Corp | Robot control device, robot system, and robot |
| JP2016052695A (ja) * | 2014-09-03 | 2016-04-14 | Canon Inc | Robot device and control method of robot device |
- 2022
  - 2022-08-03 CN CN202280053851.5A patent/CN117769482A/zh active Pending
  - 2022-08-03 EP EP22853117.4A patent/EP4382259A1/en active Pending
  - 2022-08-03 JP JP2023540395A patent/JPWO2023013699A1/ja active Pending
  - 2022-08-03 WO PCT/JP2022/029853 patent/WO2023013699A1/ja active Application Filing
Also Published As

| Publication number | Publication date |
|---|---|
| EP4382259A1 (en) | 2024-06-12 |
| CN117769482A (zh) | 2024-03-26 |
| JPWO2023013699A1 (ja) | 2023-02-09 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22853117; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 202280053851.5; Country of ref document: CN. Ref document number: 2023540395; Country of ref document: JP |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | ENP | Entry into the national phase | Ref document number: 2022853117; Country of ref document: EP; Effective date: 20240304 |