WO2020121396A1 - Robot calibration system and robot calibration method - Google Patents
Robot calibration system and robot calibration method
- Publication number
- WO2020121396A1 (PCT/JP2018/045423)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinate system
- coordinate
- camera
- vision
- robot
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
Definitions
- This specification discloses technology relating to a robot calibration system and a robot calibration method that calibrate the correspondence between the coordinate system of a robot, which works by imaging a workpiece supplied to a work area from above with a fixed camera and recognizing the position of the workpiece, and the coordinate system of the fixed camera.
- In a known robot control system, a fixed camera is fixed facing downward above the arm movable area of a robot, and the operation of the robot arm is controlled while a workpiece supplied to the arm movable area is imaged by the fixed camera and its position is recognized; such a system includes a calibrator that calibrates the relationship between the coordinate system of the fixed camera and the coordinate system of the robot.
- In this known calibration method, a mark is attached to the robot arm, the mark is brought within the field of view of the fixed camera and imaged, and the position of the mark is recognized in the coordinate system of the fixed camera, thereby calibrating the correspondence between the coordinate system of the fixed camera and the coordinate system of the robot.
- This makes it possible to measure the amount and direction of the deviation of the origin of the fixed camera's coordinate system from the origin of the robot's coordinate system, so that deviation of the origin can be corrected. However, the tilt angle of the coordinate axes of the fixed camera's coordinate system relative to the coordinate axes of the robot's coordinate system cannot be measured merely by imaging the mark attached to the robot arm with the fixed camera. It is therefore impossible to correct the tilt of the coordinate axes of the fixed camera's coordinate system caused by a mounting error or the like, and that tilt degrades the calibration accuracy.
- To solve the above problem, the disclosed robot calibration system comprises: a robot that performs predetermined work on a workpiece supplied to a work area; a hand camera attached to an arm of the robot; a fixed camera that is fixed facing downward at a location higher than the arm movable area of the robot and images the workpiece supplied to the work area from above; an image processing unit that processes the image captured by the hand camera to recognize the position of the imaging target as coordinate values in a coordinate system whose origin is a reference point of that image (hereinafter the "vision coordinate system of the hand camera"), and processes the image captured by the fixed camera to recognize the position of the imaging target as coordinate values in a coordinate system whose origin is a reference point of that image (hereinafter the "vision coordinate system of the fixed camera"); a coordinate conversion parameter calculation unit that calculates coordinate conversion parameters for converting coordinate values in the vision coordinate system of the fixed camera into coordinate values in the world coordinate system, which is the coordinate system used to control the operation of the robot arm; a coordinate conversion unit that uses the coordinate conversion parameters to convert coordinate values in the vision coordinate system of the fixed camera into coordinate values in the world coordinate system; a control unit that controls the position of the arm with coordinate values in the world coordinate system; and a calibration jig arranged at a predetermined position within the field of view of the fixed camera.
- The coordinate conversion parameters include an origin correction parameter that corrects the origin of the vision coordinate system of the fixed camera and a coordinate axis tilt correction parameter that corrects the tilt of the coordinate axes of the vision coordinate system of the fixed camera.
- When the coordinate conversion parameter calculation unit calculates the coordinate conversion parameters, the control unit controls the operation of the robot arm to move the hand camera so that the calibration jig fits within the field of view of the hand camera, and the image processing unit processes the image of the calibration jig captured by the hand camera to recognize information about the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera, and also processes the image of the calibration jig captured by the fixed camera to recognize information about the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera. The coordinate conversion parameter calculation unit then calculates the origin correction parameter and the coordinate axis tilt correction parameter based on the relationship between the information about the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera and the information about the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera, both recognized by the image processing unit.
- In this way, the calibration jig arranged at the predetermined position is imaged by both the fixed camera and the hand camera, and information about the position and tilt angle of the calibration jig is recognized in each vision coordinate system by image processing, so the origin correction parameter can be calculated based on the positional relationship of the calibration jig between the two coordinate systems, and the coordinate axis tilt correction parameter can be calculated based on the relationship between the tilt angles of the calibration jig in the two coordinate systems.
- As a result, even if the coordinate axes of the vision coordinate system of the fixed camera are tilted with respect to the coordinate axes of the world coordinate system, which is the robot's coordinate system, the tilt of the coordinate axes and the deviation of the origin of the vision coordinate system can be accurately corrected with the coordinate axis tilt correction parameter and the origin correction parameter, so that coordinate values in the vision coordinate system of the fixed camera can be accurately converted into coordinate values in the world coordinate system.
- FIG. 1 is a front view showing the appearance of the robot control system.
- FIG. 2 is a block diagram showing the electrical configuration of the robot control system.
- FIG. 3 is a plan view of the calibration jig.
- FIG. 4 is a flow chart showing the flow of processing in the first half of the calibration execution program.
- FIG. 5 is a flowchart showing the flow of processing in the latter half of the calibration execution program.
- The robot 11 is, for example, a 5-axis vertical articulated robot and comprises: a fixed base 13 installed on the factory floor 12; a first arm 15 provided on the fixed base 13 so as to be rotatable about a first joint shaft 14 (J1); a second arm 17 pivotably provided at the tip of the first arm 15 by a second joint shaft 16 (J2); a third arm 19 pivotably provided at the tip of the second arm 17 by a third joint shaft 18 (J3); a wrist portion 21 pivotably provided at the tip of the third arm 19 by a fourth joint shaft 20 (J4); and an end effector 23 attached to the wrist portion 21 so as to be rotatable about a fifth joint shaft 22 (J5) and replaceable.
- The end effector 23 attached to the wrist portion 21 is thus pivoted by the fourth joint shaft 20, which is the joint shaft of the wrist portion 21.
- In this case, the end effector 23 may be, for example, a suction nozzle, a hand, a gripper, a welding machine, or the like.
- The first to fifth joint shafts 14, 16, 18, 20, 22 of the robot 11 are driven by servo motors 25 to 29 (see FIG. 2), respectively.
- As shown in FIG. 2, each of the servo motors 25 to 29 is provided with an encoder 31 to 35 for detecting its rotation angle, and the rotation angle information detected by each encoder 31 to 35 is fed back to the control unit 37 via a servo amplifier 36.
- The control unit 37 feedback-controls the servo motors 25 to 29 via the servo amplifier 36 so that the rotation angles detected by the encoders 31 to 35 match their respective target rotation angles, thereby feedback-controlling the positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 of the robot 11 to their respective target positions.
- In the configuration example of FIG. 2, the servo amplifier 36 is a multi-axis amplifier that feedback-controls the plural servo motors 25 to 29, but each of the servo motors 25 to 29 may instead be feedback-controlled by its own separate servo amplifier.
- A supply device 39 that supplies a workpiece (not shown) to be worked on to a work area 38 at a constant height is installed at a predetermined position within the arm movable area of the robot 11 (the area in which the end effector 23 on the tip side of the wrist portion 21 can move).
- The supply device 39 may be configured as a conveyor, or a parts feeder of any structure, such as a vibration-type parts feeder, may be used; the point is that the height position of the work area 38 should be a known, constant height.
- A hand camera 40 for imaging, from above, a workpiece supplied to the work area 38 or the calibration jig 55 arranged at a predetermined position is attached to the wrist portion 21, which is the arm tip of the robot 11.
- The hand camera 40 is a two-dimensional camera that captures two-dimensional images and is fixed so that its optical axis is parallel to the fifth joint shaft 22 and its lens 41 is located on the tip side of the fifth joint shaft 22 (the arm tip side).
- A fixed camera 51 is fixed, facing downward with its lens 52 directed down, at a predetermined position of a fixed structure 50 (for example, the ceiling of a robot protection fence) installed at a location higher than the arm movable area of the robot 11.
- The fixed camera 51 is used as a two-dimensional camera that images, from above, the workpiece supplied to the work area 38 or the calibration jig 55 arranged at a predetermined position.
- The calibration jig 55 is formed from a resin plate, glass plate, metal plate, or the like into a quadrangular plate shape, and small circular calibration marks 56 are formed on its upper surface at a predetermined pitch in a matrix (lattice-point) pattern.
- The robot control unit 42, which controls the operation of the robot 11 configured as described above, comprises an image processing unit 43, a coordinate conversion parameter calculation unit 45, a coordinate conversion unit 44, a control unit 37, a servo amplifier 36, and so on, as shown in FIG. 2.
- The image processing unit 43 processes the two-dimensional image captured by the hand camera 40 or the fixed camera 51 and recognizes the position of the workpiece on the work area 38, or the position of a specific calibration mark 56 of the calibration jig 55, as coordinate values in a two-dimensional orthogonal coordinate system (hereinafter the "vision coordinate system") whose origin is a reference point of the image (for example, the image center).
- The coordinate axes of this vision coordinate system are the Xv axis and the Yv axis, which are orthogonal to each other on the image.
- When the hand camera 40 images the workpiece on the work area 38 or the calibration jig 55, its optical axis is directed vertically downward, so the image captured by the hand camera 40, like the image captured by the fixed camera 51, is an image of a horizontal plane, and the Xv and Yv axes of the vision coordinate system are orthogonal coordinate axes on that horizontal plane.
- By this image processing, the image processing unit 43 recognizes the position of the workpiece or of the specific calibration mark 56 of the calibration jig 55 as pixel-unit coordinate values in the vision coordinate system.
- While the robot 11 is operating, the coordinate conversion unit 44 converts the coordinate values in the two-dimensional vision coordinate system, recognized as the workpiece position by the image processing of the image processing unit 43, into coordinate values in the world coordinate system used to control the positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 of the robot 11.
- This world coordinate system is a three-dimensional orthogonal coordinate system whose origin is a reference point; its coordinate axes are the orthogonal axes on the horizontal plane (the X and Y axes) and a vertically upward axis (the Z axis).
- The unit of the coordinate values in this world coordinate system is a unit of length (for example, μm).
- The origin (reference point) of the world coordinate system is, for example, the center of the arm movable area of the robot 11 (the area in which the end effector 23 on the tip side of the wrist portion 21 can move).
- When converting coordinate values in the vision coordinate system of the fixed camera 51 into coordinate values in the world coordinate system during operation of the robot 11, the coordinate conversion unit 44 uses the coordinate conversion parameters calculated by the coordinate conversion parameter calculation unit 45.
- The coordinate conversion parameters include an origin correction parameter that corrects the origin of the vision coordinate system of the fixed camera 51 and a coordinate axis tilt correction parameter that corrects the tilt of the coordinate axes of the vision coordinate system of the fixed camera 51.
- The method of calculating the origin correction parameter and the coordinate axis tilt correction parameter of the fixed camera 51 will be described later.
- On the other hand, when coordinate values in the vision coordinate system of the hand camera 40 are converted into coordinate values in the world coordinate system, the tilt angle Rz of the coordinate axes of the hand camera 40's vision coordinate system with respect to the coordinate axes of the world coordinate system and the position of its origin are corrected.
- This correction is performed by obtaining a rotation matrix representing the tilt angle Rz of the coordinate axes and a translation vector to the position (X1w, Y1w) of the vision coordinate origin in the world coordinate system; the coordinate values (Xav, Yav) in the vision coordinate system of the hand camera 40 are thereby corrected and converted into coordinate values (Xw', Yw') in the world coordinate system.
- The tilt angle Rz of the coordinate axes of the vision coordinate system of the hand camera 40 with respect to the X and Y coordinate axes of the world coordinate system is calculated by the following equation.
- Rz = [installation angle of the hand camera 40] + [rotation angle of the first joint shaft 14]
- Here, the installation angle of the hand camera 40 is its installation angle with respect to the wrist portion 21 of the robot 11; this angle is measured when the hand camera 40 is mounted on the wrist portion 21, and the value stored in the non-volatile memory (not shown) of the robot control unit 42 is referred to.
- As for the rotation angle of the first joint shaft 14, the control unit 37 of the robot control unit 42 acquires in real time the output value of the encoder 31 of the servo motor 25 of the first joint shaft 14 while the arm position of the robot 11 (the position of the hand camera 40) is stopped at the imaging position, and calculates the rotation angle of the first joint shaft 14 from it.
- The coordinate values of the origin of the vision coordinate system of the hand camera 40 are calibrated when the hand camera 40 is assembled to the robot 11 so that they match the world-coordinate values that the control unit 37 of the robot control unit 42 designates as the imaging position of the hand camera 40.
- Since the relative positional relationship between the arm position of the robot 11 and the hand camera 40 is always constant, the distance between the reference point of the robot 11 and the optical axis center of the hand camera 40 is measured and reflected as a correction value in the coordinate command values from the control unit 37; as a result, when the robot 11 is commanded to a position (X1w, Y1w) in the world coordinate system, the origin of the vision coordinate system of the hand camera 40 is controlled so as to move to that position (X1w, Y1w).
- The control unit 37 of the robot control unit 42 calculates the coordinate values (X1w, Y1w) of the origin of the vision coordinate system of the hand camera 40 in real time, based on the rotation angles of the servo motors 25 to 29 detected by the encoders 31 to 35 and the relative positional relationship between the hand camera 40 and the wrist portion 21 (end effector 23).
- During operation of the robot 11, the control unit 37 of the robot control unit 42 sets the target positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 as coordinate values in the world coordinate system, based on the workpiece position converted by the coordinate conversion unit 44 into coordinate values of the three-dimensional world coordinate system, and controls the positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 using those world-coordinate values.
- In step 101, the arms 15, 17, 19 of the robot 11 are operated to a position where the hand camera 40 is out of the field of view of the fixed camera 51, retracting the hand camera 40.
- In step 102, guidance instructing the operator to set the calibration jig 55 at a predetermined position in the work area 38, located at the center of the field of view of the fixed camera 51, is given by display and/or voice. Following this guidance, the operator sets the calibration jig 55 at the predetermined position in the work area 38.
- In step 104, the fixed camera 51 captures an image of the calibration jig 55, and in the next step 105, the image captured by the fixed camera 51 is processed.
- The position of the central calibration mark 56(A), which is one specific point of the calibration jig 55, is recognized as coordinate values in the vision coordinate system of the fixed camera 51, and the positions of the calibration marks 56(B, C) at two corners, which are two specific points, are likewise recognized as coordinate values in the vision coordinate system of the fixed camera 51.
- The tilt angle, in the vision coordinate system of the fixed camera 51, of the straight line connecting the calibration marks 56(B, C) at the two corners is then calculated.
- Further, the positions of two calibration marks 56(A, D) near the center of the calibration jig 55 are recognized as coordinate values in the vision coordinate system of the fixed camera 51, and the actual length between the positions of the calibration marks 56(A, D) is divided by the number of pixels between their positions on the image to calculate the resolution of the fixed camera 51.
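A minimal sketch of this resolution calculation is shown below. Function and variable names, and the numeric values, are illustrative assumptions rather than values taken from the patent.

```python
import math

def camera_resolution_um_per_px(pos_a_px, pos_d_px, actual_distance_um):
    """Resolution of the fixed camera 51: the actual length between calibration marks
    56(A) and 56(D) divided by the pixel distance between their image positions."""
    pixel_distance = math.hypot(pos_d_px[0] - pos_a_px[0], pos_d_px[1] - pos_a_px[1])
    return actual_distance_um / pixel_distance

# Example: marks A and D are 10 mm (10,000 um) apart on the jig and 250 px apart in the image.
resolution = camera_resolution_um_per_px((320.0, 240.0), (570.0, 240.0), 10_000.0)
print(resolution)  # 40.0 um per pixel
```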
- In step 106 of FIG. 5, the arm position of the robot 11 is controlled so that the calibration jig 55 fits within the field of view of the hand camera 40, and the hand camera 40 is moved to a predetermined position above the central calibration mark 56(A), which is one specific point of the calibration jig 55.
- In step 107, the hand camera 40 images the calibration jig 55, and in the next step 108, the image captured by the hand camera 40 is processed to recognize the position of the calibration mark 56(A) at the center of the calibration jig 55 as coordinate values in the vision coordinate system of the hand camera 40.
- The process then proceeds to step 109, in which the hand camera 40 is moved above the calibration mark 56(B) at one corner of the calibration jig 55, and to step 110, in which the hand camera 40 images the calibration jig 55; in the next step 111, the image captured by the hand camera 40 is processed to recognize the position of the calibration mark 56(B) at that corner as coordinate values in the vision coordinate system of the hand camera 40.
- In step 115, the origin correction parameter and the coordinate axis tilt correction parameter for converting coordinate values in the vision coordinate system of the fixed camera 51 into coordinate values in the world coordinate system are calculated as follows.
- When calculating the origin correction parameter, the difference (ΔX, ΔY) between the coordinate values of the position of the central calibration mark 56(A) of the calibration jig 55 recognized by processing the image captured by the fixed camera 51 and the coordinate values of the position of the same calibration mark 56(A) recognized by processing the image captured by the hand camera 40 is calculated as the origin correction parameter.
- In the vision coordinate system of the fixed camera 51, the coordinate values of a recognition target are obtained in pixel units, whereas coordinate values in the world coordinate system are in units of length (for example, [μm]). For this reason, the pixel-unit coordinate values of the vision coordinate system of the fixed camera 51 are multiplied by the resolution described above to convert them into coordinate values in the same length unit as the world coordinate system (for example, [μm]).
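The following sketch illustrates one way the results of steps 105 to 115 could be combined into the two correction parameters and then applied to convert fixed-camera pixel coordinates into world coordinates. It assumes that the mark positions obtained via the hand camera have already been expressed in the robot-side (world) frame and that composing the two corrections as a single rotation-plus-translation is acceptable; the patent specifies the inputs but not this exact formulation.

```python
import math

def calc_conversion_parameters(mark_a_world_um, mark_a_fixed_um,
                               angle_bc_world_rad, angle_bc_fixed_rad):
    """Coordinate axis tilt correction = difference between the tilt angles of the
    line B-C seen via the hand camera (robot side) and via the fixed camera.
    Origin correction = offset that maps mark A's rotated fixed-camera position
    onto its robot-side position."""
    dtheta = angle_bc_world_rad - angle_bc_fixed_rad
    xa = math.cos(dtheta) * mark_a_fixed_um[0] - math.sin(dtheta) * mark_a_fixed_um[1]
    ya = math.sin(dtheta) * mark_a_fixed_um[0] + math.cos(dtheta) * mark_a_fixed_um[1]
    origin_corr = (mark_a_world_um[0] - xa, mark_a_world_um[1] - ya)
    return origin_corr, dtheta

def fixed_vision_to_world(p_px, resolution_um_per_px, origin_corr_um, dtheta_rad):
    """Convert a fixed-camera pixel coordinate to world coordinates: scale pixels to
    length units, rotate by the tilt correction, then translate by the origin correction."""
    x = p_px[0] * resolution_um_per_px
    y = p_px[1] * resolution_um_per_px
    xw = math.cos(dtheta_rad) * x - math.sin(dtheta_rad) * y + origin_corr_um[0]
    yw = math.sin(dtheta_rad) * x + math.cos(dtheta_rad) * y + origin_corr_um[1]
    return xw, yw
```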
- In this way, the calibration work is completed when the operator removes the calibration jig 55 set at the predetermined position in the work area 38.
- In the example above, the calibration mark 56(A) at the center of the calibration jig 55 is used as the single specific point for origin correction, but a calibration mark 56 at another position may be used as the specific point for origin correction instead; the point is simply that information about the position of the calibration jig 55 can be acquired by image processing.
- Likewise, the calibration marks 56(B, C) at the two corners of the calibration jig 55 are used as the two specific points for coordinate axis tilt correction, but two calibration marks 56 at other positions may be used instead; the tilt angle of the straight line connecting the two specific points need only be obtainable as information about the tilt angle of the calibration jig 55.
- Alternatively, a straight line may be formed on the upper surface of the calibration jig 55 and its tilt angle measured by image processing, or the tilt angle of the calibration jig 55 may be measured by recognizing one edge of the calibration jig 55 by image processing; the point is simply that information about the tilt angle of the calibration jig 55 can be acquired by image processing.
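As one concrete illustration of measuring the jig's tilt angle from its outline by image processing, a sketch using OpenCV is shown below. The patent does not prescribe any particular library or algorithm; this is an assumed implementation for illustration only.

```python
import cv2

def jig_tilt_angle_deg(gray_image):
    """Threshold the image, take the largest contour (assumed to be the outline of the
    calibration jig 55), and return the rotation angle of its minimum-area bounding box."""
    _, binary = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    jig_contour = max(contours, key=cv2.contourArea)
    (_cx, _cy), (_w, _h), angle = cv2.minAreaRect(jig_contour)
    return angle
```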
- When the hand camera 40 performs the imaging and image processing (steps 106 to 114), the world-coordinate values of the calibration marks (A), (B), and (C) to which the hand camera 40 moves in steps 106, 109, and 112 have already been recognized through the image processing of the fixed camera in step 105, so these world-coordinate values are commanded automatically by the control unit 37. That is, the operator does not need to re-teach the world-coordinate values of the calibration marks (A), (B), and (C) according to the placement position of the calibration jig 55.
- The calibration execution program of FIGS. 4 and 5 described above is executed by the robot control unit 42 when the installation work of the robot 11 is completed. In addition, when it is determined that recalibration is necessary, for example because work errors of the robot 11 occur more frequently, or at the time of inspection of the robot 11, the robot control unit 42 may re-execute the calibration execution program of FIGS. 4 and 5 to re-measure each correction parameter.
- Each time the correction parameters are re-measured, they may simply be updated to the measured values. Alternatively, the correction parameters measured at the completion of the installation work of the robot 11 may be stored in the non-volatile memory of the robot control unit 42, each re-measured correction parameter may be compared with the stored value at the time of re-measurement, and each correction parameter may be updated to the re-measured value only when the difference between the two exceeds a predetermined threshold value.
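A minimal sketch of this threshold-based update policy follows. The parameter names, threshold values, and storage format are illustrative assumptions.

```python
def maybe_update(stored_params, remeasured_params, thresholds):
    """Keep each stored correction parameter unless the re-measured value differs
    from it by more than the corresponding threshold."""
    updated = {}
    for name, new_value in remeasured_params.items():
        old_value = stored_params[name]
        updated[name] = new_value if abs(new_value - old_value) > thresholds[name] else old_value
    return updated

# Example with values assumed to have been stored at installation time.
stored = {"origin_dx_um": 120.0, "origin_dy_um": -80.0, "axis_tilt_deg": 0.35}
remeasured = {"origin_dx_um": 121.0, "origin_dy_um": -95.0, "axis_tilt_deg": 0.36}
thresholds = {"origin_dx_um": 5.0, "origin_dy_um": 5.0, "axis_tilt_deg": 0.05}
print(maybe_update(stored, remeasured, thresholds))
# Only origin_dy_um is updated, because |(-95.0) - (-80.0)| = 15.0 exceeds its threshold.
```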
- In this way, even when the relative positional or angular relationship between the fixed camera 51 and the robot 11, the installation angle of the fixed camera 51, the relationship between the fixed camera 51 and the work area 38, and the like change over time, each correction parameter can be re-measured and corrected to an appropriate value.
- As described above, in the present embodiment, the calibration jig 55 arranged at a predetermined position is imaged by both the fixed camera 51 and the hand camera 40, and information about the position and tilt angle of the calibration jig 55 is recognized in each vision coordinate system by image processing; therefore, the origin correction parameter can be calculated based on the positional relationship of the calibration jig 55 between the two coordinate systems, and the coordinate axis tilt correction parameter can be calculated based on the relationship between the tilt angles of the calibration jig 55 in the two coordinate systems.
- Accordingly, even if the coordinate axes of the vision coordinate system of the fixed camera 51 are tilted with respect to the coordinate axes of the world coordinate system, which is the coordinate system of the robot 11, the tilt of the coordinate axes and the deviation of the origin of the vision coordinate system are accurately corrected with the coordinate axis tilt correction parameter and the origin correction parameter, so that coordinate values in the vision coordinate system of the fixed camera 51 can be accurately converted into coordinate values in the world coordinate system.
- In the embodiment above, the calibration marks 56 arranged on the upper surface of the calibration jig 55 are circular, but they may have another shape such as a quadrangle.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
This invention is provided with: a hand camera (40) attached to an arm of a robot (11); a fixed camera (51) that is fixed facing downward at a site higher than the area in which the robot arm can move; a coordinate conversion parameter calculation unit (45) for calculating coordinate conversion parameters for converting coordinate values in the vision coordinate system of the fixed camera to coordinate values in a global coordinate system, which is a coordinate system of the robot; and a calibration tool (55) disposed at a prescribed position. The coordinate conversion parameters include an origin correction parameter for correcting the origin of the vision coordinate system of the fixed camera and a coordinate axis tilt correction parameter for correcting the tilt of the coordinate axes of the vision coordinate system of the fixed camera. The origin correction parameter and the coordinate axis tilt correction parameter are calculated on the basis of the relationship between information relating to the position and tilt angle of the calibration tool in the vision coordinate system of the hand camera and information relating to the position and tilt angle of the calibration tool in the vision coordinate system of the fixed camera.
Description
This specification discloses technology relating to a robot calibration system and a robot calibration method that calibrate the correspondence between the coordinate system of a robot, which works by imaging a workpiece supplied to a work area from above with a fixed camera and recognizing the position of the workpiece, and the coordinate system of the fixed camera.
In recent years, as described in Patent Document 1 (Japanese Patent Laid-Open No. 2013-215866), there are robot control systems in which a fixed camera is fixed facing downward above the arm movable area of a robot and the operation of the robot arm is controlled while a workpiece supplied to the arm movable area is imaged by the fixed camera and its position is recognized; such a system includes a calibrator that calibrates the relationship between the coordinate system of the fixed camera and the coordinate system of the robot. In the calibration method of Patent Document 1, a mark is attached to the robot arm, the mark is brought within the field of view of the fixed camera and imaged, and the position of the mark is recognized in the coordinate system of the fixed camera, thereby calibrating the correspondence between the coordinate system of the fixed camera and the coordinate system of the robot.
To improve the accuracy of calibration, the directions of the coordinate axes (the X and Y axes) of the coordinate system of the fixed camera must exactly match the directions of the coordinate axes (the X and Y axes) of the coordinate system of the robot. In practice, however, some deviation between the directions of the coordinate axes of the fixed camera's coordinate system and those of the robot's coordinate system is unavoidable because of mounting errors when the fixed camera is attached to the fixed structure above the arm movable area of the robot (such as the ceiling of a robot protection fence).
With the calibration method of Patent Document 1, attaching a mark to the robot arm, bringing the mark within the field of view of the fixed camera, imaging it, and recognizing the position of the mark in the coordinate system of the fixed camera makes it possible to measure the amount and direction of the deviation of the origin of the fixed camera's coordinate system from the origin of the robot's coordinate system, so that deviation of the origin can be corrected. However, the tilt angle of the coordinate axes of the fixed camera's coordinate system relative to the coordinate axes of the robot's coordinate system cannot be measured merely by imaging the mark attached to the robot arm with the fixed camera. It is therefore impossible to correct the tilt of the coordinate axes of the fixed camera's coordinate system caused by a mounting error or the like, and that tilt degrades the accuracy of calibration.
To solve the above problem, the robot calibration system disclosed herein comprises: a robot that performs predetermined work on a workpiece supplied to a work area; a hand camera attached to an arm of the robot; a fixed camera that is fixed facing downward at a location higher than the arm movable area of the robot and images the workpiece supplied to the work area from above; an image processing unit that processes the image captured by the hand camera to recognize the position of the imaging target as coordinate values in a coordinate system whose origin is a reference point of that image (hereinafter the "vision coordinate system of the hand camera") and processes the image captured by the fixed camera to recognize the position of the imaging target as coordinate values in a coordinate system whose origin is a reference point of that image (hereinafter the "vision coordinate system of the fixed camera"); a coordinate conversion parameter calculation unit that calculates coordinate conversion parameters for converting coordinate values in the vision coordinate system of the fixed camera into coordinate values in the world coordinate system, which is the coordinate system used to control the operation of the robot arm; a coordinate conversion unit that uses the coordinate conversion parameters to convert coordinate values in the vision coordinate system of the fixed camera into coordinate values in the world coordinate system; a control unit that controls the position of the arm with coordinate values in the world coordinate system; and a calibration jig arranged at a predetermined position within the field of view of the fixed camera. The coordinate conversion parameters include an origin correction parameter that corrects the origin of the vision coordinate system of the fixed camera and a coordinate axis tilt correction parameter that corrects the tilt of the coordinate axes of the vision coordinate system of the fixed camera. When the coordinate conversion parameter calculation unit calculates the coordinate conversion parameters, the control unit controls the operation of the robot arm to move the hand camera so that the calibration jig fits within the field of view of the hand camera, and the image processing unit processes the image of the calibration jig captured by the hand camera to recognize information about the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera and also processes the image of the calibration jig captured by the fixed camera to recognize information about the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera; the coordinate conversion parameter calculation unit then calculates the origin correction parameter and the coordinate axis tilt correction parameter based on the relationship between the information about the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera and the information about the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera, both recognized by the image processing unit.
In this way, the calibration jig arranged at the predetermined position is imaged by both the fixed camera and the hand camera, and information about the position and tilt angle of the calibration jig is recognized in each vision coordinate system by image processing; the origin correction parameter can therefore be calculated based on the positional relationship of the calibration jig between the two coordinate systems, and the coordinate axis tilt correction parameter can be calculated based on the relationship between the tilt angles of the calibration jig in the two coordinate systems. As a result, even if the coordinate axes of the vision coordinate system of the fixed camera are tilted with respect to the coordinate axes of the world coordinate system, which is the coordinate system of the robot, the tilt of the coordinate axes and the deviation of the origin of the vision coordinate system are accurately corrected with the coordinate axis tilt correction parameter and the origin correction parameter, so that coordinate values in the vision coordinate system of the fixed camera can be accurately converted into coordinate values in the world coordinate system.
An embodiment disclosed in this specification will be described below with reference to the drawings.
First, the configuration of the robot 11 will be described with reference to FIG. 1.
The robot 11 is, for example, a 5-axis vertical articulated robot and comprises: a fixed base 13 installed on the factory floor 12; a first arm 15 provided on the fixed base 13 so as to be rotatable about a first joint shaft 14 (J1); a second arm 17 pivotably provided at the tip of the first arm 15 by a second joint shaft 16 (J2); a third arm 19 pivotably provided at the tip of the second arm 17 by a third joint shaft 18 (J3); a wrist portion 21 pivotably provided at the tip of the third arm 19 by a fourth joint shaft 20 (J4); and an end effector 23 attached to the wrist portion 21 so as to be rotatable about a fifth joint shaft 22 (J5) and replaceable. The end effector 23 attached to the wrist portion 21 is thus pivoted by the fourth joint shaft 20, which is the joint shaft of the wrist portion 21.
In this case, the end effector 23 may be, for example, a suction nozzle, a hand, a gripper, a welding machine, or the like. The first to fifth joint shafts 14, 16, 18, 20, 22 of the robot 11 are driven by servo motors 25 to 29 (see FIG. 2), respectively. As shown in FIG. 2, each of the servo motors 25 to 29 is provided with an encoder 31 to 35 for detecting its rotation angle, and the rotation angle information detected by each encoder 31 to 35 is fed back to the control unit 37 via a servo amplifier 36. The control unit 37 feedback-controls the servo motors 25 to 29 via the servo amplifier 36 so that the rotation angles detected by the encoders 31 to 35 match their respective target rotation angles, thereby feedback-controlling the positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 of the robot 11 to their respective target positions.
In the configuration example of FIG. 2, the servo amplifier 36 is a multi-axis amplifier that feedback-controls the plural servo motors 25 to 29, but each of the servo motors 25 to 29 may instead be feedback-controlled by its own separate servo amplifier.
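The per-joint feedback control described above can be pictured with the following minimal sketch. The gain, units, and the interface to the servo amplifier 36 are illustrative assumptions; a real system would use the vendor's servo control loop.

```python
def feedback_step(target_angles_deg, encoder_angles_deg, kp=0.5):
    """Return one command per joint that drives each detected rotation angle toward
    its target rotation angle (simple proportional correction)."""
    return [kp * (target - actual)
            for target, actual in zip(target_angles_deg, encoder_angles_deg)]

# Example: five joints J1..J5, with current angles reported by the encoders 31 to 35.
targets = [10.0, 45.0, -30.0, 0.0, 90.0]
current = [9.5, 44.0, -29.0, 0.2, 88.5]
commands = feedback_step(targets, current)  # values to be applied via the servo amplifier 36
```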
A supply device 39 that supplies a workpiece (not shown) to be worked on to a work area 38 at a constant height is installed at a predetermined position within the arm movable area of the robot 11 (the area in which the end effector 23 on the tip side of the wrist portion 21 can move). The supply device 39 may be configured as a conveyor, or a parts feeder of any structure, such as a vibration-type parts feeder, may be used; the point is that the height position of the work area 38 should be a known, constant height.
As shown in FIG. 1, a hand camera 40 for imaging, from above, a workpiece supplied to the work area 38 or the calibration jig 55 arranged at a predetermined position is attached to the wrist portion 21, which is the arm tip of the robot 11. The hand camera 40 is a two-dimensional camera that captures two-dimensional images and is fixed so that its optical axis is parallel to the fifth joint shaft 22 and its lens 41 is located on the tip side of the fifth joint shaft 22 (the arm tip side).
Further, a fixed camera 51 is fixed, facing downward with its lens 52 directed down, at a predetermined position of a fixed structure 50 (for example, the ceiling of a robot protection fence) installed at a location higher than the arm movable area of the robot 11. The fixed camera 51 is used as a two-dimensional camera that images, from above, the workpiece supplied to the work area 38 or the calibration jig 55 arranged at a predetermined position.
As shown in FIG. 3, the calibration jig 55 is formed from a resin plate, glass plate, metal plate, or the like into a quadrangular plate shape, and small circular calibration marks 56 are formed on its upper surface at a predetermined pitch in a matrix (lattice-point) pattern.
As will be described later, when calculating the coordinate conversion parameters for converting coordinate values in the vision coordinate system of the fixed camera 51 into coordinate values in the world coordinate system, which is the coordinate system used to control the arm position of the robot 11, the calibration jig 55 arranged at the predetermined position is imaged in turn by both the fixed camera 51 and the hand camera 40. When imaging a workpiece supplied to the work area 38, however, the hand camera 40 and the fixed camera 51 are used selectively depending on the required resolution, the position of the work area 38 within the arm movable area of the robot 11, and so on. Here, the fields of view of the fixed camera 51 and of the hand camera 40 are narrower than the arm movable area of the robot 11. Comparing the hand camera 40 and the fixed camera 51, the fixed camera 51 has the wider field of view, but the hand camera 40 has the finer resolution (actual length of the imaging target per pixel) and allows higher-precision image processing. Therefore, for example, when the workpiece on the work area 38 is small and a fine resolution is required, the hand camera 40 is used, and the hand camera 40 is also used for a work area 38 located outside the field of view of the fixed camera 51. Furthermore, when a plurality of work areas 38 fit within the field of view of the fixed camera 51, imaging them together with the fixed camera 51 allows the workpieces on the plurality of work areas 38 to be image-processed efficiently. Alternatively, when there are a plurality of work areas 38, the hand camera 40 may be moved to each work area 38 to image the workpiece.
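The selective use of the two cameras described above can be summarized by a simple rule like the following sketch. The function name and inputs are illustrative assumptions; the patent states only the qualitative criteria.

```python
def choose_camera(required_resolution_um_per_px,
                  fixed_camera_resolution_um_per_px,
                  work_area_in_fixed_camera_fov):
    """Use the hand camera 40 when finer resolution than the fixed camera 51 provides is
    needed, or when the work area 38 lies outside the fixed camera's field of view;
    otherwise the fixed camera 51 (with its wider field of view) can be used."""
    if not work_area_in_fixed_camera_fov:
        return "hand camera 40"
    if required_resolution_um_per_px < fixed_camera_resolution_um_per_px:
        return "hand camera 40"
    return "fixed camera 51"
```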
The robot control unit 42, which controls the operation of the robot 11 configured as described above, comprises an image processing unit 43, a coordinate conversion parameter calculation unit 45, a coordinate conversion unit 44, a control unit 37, a servo amplifier 36, and so on, as shown in FIG. 2. The image processing unit 43 processes the two-dimensional image captured by the hand camera 40 or the fixed camera 51 and recognizes the position of the workpiece on the work area 38, or the position of a specific calibration mark 56 of the calibration jig 55, as coordinate values in a two-dimensional orthogonal coordinate system (hereinafter the "vision coordinate system") whose origin is a reference point of the image (for example, the image center). The coordinate axes of this vision coordinate system are the Xv axis and the Yv axis, which are orthogonal to each other on the image. When the hand camera 40 images the workpiece on the work area 38 or the calibration jig 55, its optical axis is directed vertically downward, so the image captured by the hand camera 40, like the image captured by the fixed camera 51, is an image of a horizontal plane, and the Xv and Yv axes of the vision coordinate system are orthogonal coordinate axes on that horizontal plane. By this image processing, the image processing unit 43 recognizes the position of the workpiece or of the specific calibration mark 56 of the calibration jig 55 as pixel-unit coordinate values in the vision coordinate system.
Meanwhile, while the robot 11 is operating, the coordinate conversion unit 44 converts the coordinate values in the two-dimensional vision coordinate system, recognized as the workpiece position by the image processing of the image processing unit 43, into coordinate values in the world coordinate system used to control the positions of the arms 15, 17, 19, the wrist portion 21, and the end effector 23 of the robot 11. This world coordinate system is a three-dimensional orthogonal coordinate system whose origin is a reference point; its coordinate axes are the orthogonal axes on the horizontal plane (the X and Y axes) and a vertically upward axis (the Z axis). The unit of the coordinate values in this world coordinate system is a unit of length (for example, μm). The origin (reference point) of the world coordinate system is, for example, the center of the arm movable area of the robot 11 (the area in which the end effector 23 on the tip side of the wrist portion 21 can move).
When converting coordinate values in the vision coordinate system of the fixed camera 51 into coordinate values in the world coordinate system during operation of the robot 11, the coordinate conversion unit 44 uses the coordinate conversion parameters calculated by the coordinate conversion parameter calculation unit 45. The coordinate conversion parameters include an origin correction parameter that corrects the origin of the vision coordinate system of the fixed camera 51 and a coordinate axis tilt correction parameter that corrects the tilt of the coordinate axes of the vision coordinate system of the fixed camera 51. The method of calculating the origin correction parameter and the coordinate axis tilt correction parameter of the fixed camera 51 will be described later.
On the other hand, when coordinate values in the vision coordinate system of the hand camera 40 are converted into coordinate values in the world coordinate system, the tilt angle Rz of the coordinate axes of the vision coordinate system of the hand camera 40 with respect to the coordinate axes of the world coordinate system and the position of its origin are corrected. This correction is performed by obtaining a rotation matrix representing the tilt angle Rz of the coordinate axes and a translation vector to the position (X1w, Y1w) of the vision coordinate origin in the world coordinate system; the coordinate values (Xav, Yav) in the vision coordinate system of the hand camera 40 can thereby be corrected by the following equation and converted into coordinate values (Xw', Yw') in the world coordinate system.
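The equation referred to here is reproduced in the original publication as an image. Assuming the standard planar rotation-plus-translation implied by the rotation matrix for Rz and the translation vector to (X1w, Y1w), it takes the following form (a reconstruction consistent with the surrounding text, not a verbatim copy of the patent's figure):

```latex
\begin{pmatrix} X_w' \\ Y_w' \end{pmatrix}
=
\begin{pmatrix} \cos R_z & -\sin R_z \\ \sin R_z & \cos R_z \end{pmatrix}
\begin{pmatrix} X_{av} \\ Y_{av} \end{pmatrix}
+
\begin{pmatrix} X_{1w} \\ Y_{1w} \end{pmatrix}
```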
To perform the coordinate conversion with the above equation, it is necessary to measure the tilt angle Rz of the coordinate axes of the vision coordinate system of the hand camera 40 and the coordinate values of its origin.
Since the hand camera 40 is fixed to the wrist portion 21 of the robot 11, the relative positional relationship between the arm position of the robot 11 and the hand camera 40 is always kept constant. Therefore, the tilt angle Rz of the coordinate axes of the vision coordinate system of the hand camera 40 with respect to the X and Y coordinate axes of the world coordinate system is calculated by the following equation.
Rz = [installation angle of the hand camera 40] + [rotation angle of the first joint shaft 14]
Here, the installation angle of the hand camera 40 is its installation angle with respect to the wrist portion 21 of the robot 11; this angle is measured when the hand camera 40 is mounted on the wrist portion 21, and the value stored in the non-volatile memory (not shown) of the robot control unit 42 is referred to.
As for the method of detecting the rotation angle of the first joint shaft 14, the control unit 37 of the robot control unit 42 acquires, in real time, the output value of the encoder 31 of the servo motor 25 of the first joint shaft 14 while the arm position of the robot 11 (the position of the hand camera 40) is stopped at the imaging position, and calculates the rotation angle of the first joint shaft 14 from it.
The coordinate values of the origin of the vision coordinate system of the hand camera 40, on the other hand, are calibrated when the hand camera 40 is assembled to the robot 11 so that they match the world-coordinate values that the control unit 37 of the robot control unit 42 designates as the imaging position of the hand camera 40. Since the relative positional relationship between the arm position of the robot 11 and the hand camera 40 is always constant, the distance between the reference point of the robot 11 and the optical axis center of the hand camera 40 is measured and reflected as a correction value in the coordinate command values from the control unit 37; as a result, when the robot 11 is commanded to a position (X1w, Y1w) in the world coordinate system, the origin of the vision coordinate system of the hand camera 40 is controlled so as to move to that position (X1w, Y1w).
The control unit 37 of the robot control unit 42 calculates the coordinate value (X1w, Y1w) of the origin of the vision coordinate system of the hand camera 40 in real time, based on the rotation angles of the servo motors 25 to 29 detected by the encoders 31 to 35 and on the relative positional relationship between the hand camera 40 and the wrist 21 (end effector 23).
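The following sketch illustrates, under an assumed planar two-link (SCARA-like) geometry, how the origin (X1w, Y1w) of the hand camera's vision coordinate system could be derived from joint angles and a fixed wrist-to-camera offset. The link lengths, joint layout, and offset values are hypothetical; the actual kinematic model of the robot 11 is not specified in this document.

```python
import math

# Assumed planar link lengths [mm] and wrist-to-camera offset measured at assembly (illustrative)
L1, L2 = 250.0, 200.0
CAMERA_OFFSET = (30.0, 0.0)

def hand_camera_vision_origin(theta1_deg: float, theta2_deg: float) -> tuple[float, float]:
    """Compute (X1w, Y1w): world-coordinate position of the hand camera's vision origin."""
    t1 = math.radians(theta1_deg)
    t12 = t1 + math.radians(theta2_deg)
    # Wrist position from forward kinematics of the two horizontal links
    xw = L1 * math.cos(t1) + L2 * math.cos(t12)
    yw = L1 * math.sin(t1) + L2 * math.sin(t12)
    # Rotate the fixed wrist-to-camera offset into world coordinates and add it
    ox, oy = CAMERA_OFFSET
    xc = xw + ox * math.cos(t12) - oy * math.sin(t12)
    yc = yw + ox * math.sin(t12) + oy * math.cos(t12)
    return xc, yc

print(hand_camera_vision_origin(30.0, -15.0))
```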
During operation of the robot 11, the control unit 37 of the robot control unit 42 sets the target positions of the arms 15, 17, 19, the wrist 21, and the end effector 23 of the robot 11 as world coordinate values, based on the workpiece position converted into three-dimensional world coordinate values by the coordinate conversion unit 44, and controls the positions of the arms 15, 17, 19, the wrist 21, and the end effector 23 using world coordinate values.
Next, the processing for calculating the origin correction parameter, the coordinate axis tilt correction parameter, and the resolution used when converting coordinate values of the vision coordinate system of the fixed camera 51 into coordinate values of the world coordinate system will be described. This calculation processing is executed automatically by the robot control unit 42 according to the calibration execution program of FIGS. 4 and 5 when the installation work of the robot 11 is completed.
When this program is started, first, in step 101, the arms 15, 17, 19 of the robot 11 are operated to retract the hand camera 40 to a position outside the field of view of the fixed camera 51. The process then proceeds to step 102, where guidance instructing the operator to set the calibration jig 55 at a predetermined position in the work area 38 located at the center of the field of view of the fixed camera 51 is given to the operator by display and/or voice. Following this guidance, the operator sets the calibration jig 55 at the predetermined position in the work area 38.
In the next step 103, the robot control unit 42 waits until the operator completes the work of setting the calibration jig 55 at the predetermined position in the work area 38. Completion of this setting work may be detected automatically by a sensor or the like, or the operator may manually input setting-completion information to the robot control unit 42 when the work is finished.
When the setting of the calibration jig 55 is completed, the process proceeds to step 104, where the fixed camera 51 images the calibration jig 55. In the next step 105, the image captured by the fixed camera 51 is processed to recognize the position of the central calibration mark 56(A), which serves as one specific point of the calibration jig 55, as a coordinate value of the vision coordinate system of the fixed camera 51, and to recognize the positions of the calibration marks 56(B, C) at two corner portions, which serve as two specific points, as coordinate values of the vision coordinate system of the fixed camera 51. The tilt angle of the straight line connecting the two corner calibration marks 56(B, C) in the vision coordinate system of the fixed camera 51 is then calculated. In addition, the positions of the two calibration marks 56(A, D) near the center of the calibration jig 55 are recognized as coordinate values of the vision coordinate system of the fixed camera 51, and the resolution of the fixed camera 51 is calculated by dividing the actual length between these two mark positions by the number of pixels between them in the image.
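A sketch of the two quantities computed in step 105 is shown below: the tilt angle of the line through marks B and C in the fixed camera's vision coordinate system, and the resolution obtained from marks A and D. The pixel coordinates and the physical A-D distance are placeholder values, not figures from the disclosure.

```python
import math

# Assumed pixel coordinates recognized in the fixed camera's vision coordinate system (illustrative)
mark_A = (640.0, 480.0)
mark_B = (120.0, 112.0)
mark_C = (1160.0, 118.0)
mark_D = (840.0, 481.0)
DIST_A_D_UM = 20000.0   # assumed actual distance between marks A and D [um]

def line_tilt_deg(p1, p2) -> float:
    """Tilt angle of the straight line connecting two marks, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

def resolution_um_per_pixel(p1, p2, actual_um: float) -> float:
    """Actual length between two marks divided by their pixel distance."""
    pixels = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return actual_um / pixels

tilt_fixed_BC = line_tilt_deg(mark_B, mark_C)
resolution = resolution_um_per_pixel(mark_A, mark_D, DIST_A_D_UM)
print(f"tilt of B-C line: {tilt_fixed_BC:.3f} deg, resolution: {resolution:.2f} um/pixel")
```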
Thereafter, the process proceeds to step 106 in FIG. 5, where the arm position of the robot 11 is controlled so that the calibration jig 55 fits within the field of view of the hand camera 40, and the hand camera 40 is moved to a predetermined position above the central calibration mark 56(A), which is one specific point of the calibration jig 55. The process then proceeds to step 107, where the hand camera 40 images the calibration jig 55, and in the next step 108 the image captured by the hand camera 40 is processed to recognize the position of the central calibration mark 56(A) of the calibration jig 55 as a coordinate value of the vision coordinate system of the hand camera 40.
The process then proceeds to step 109, where the hand camera 40 is moved above the calibration mark 56(B) at one corner of the calibration jig 55. In step 110 the hand camera 40 images the calibration jig 55, and in the next step 111 the captured image is processed to recognize the position of the calibration mark 56(B) at that corner as a coordinate value of the vision coordinate system of the hand camera 40.
The process then proceeds to step 112, where the hand camera 40 is moved above the calibration mark 56(C) at the other corner of the calibration jig 55. In step 113 the hand camera 40 images the calibration jig 55, and in the next step 114 the captured image is processed to recognize the position of the calibration mark 56(C) at the other corner as a coordinate value of the vision coordinate system of the hand camera 40.
After that, the process proceeds to step 115, where the origin correction parameter and the coordinate axis tilt correction parameter for converting coordinate values of the vision coordinate system of the fixed camera 51 into coordinate values of the world coordinate system are calculated as follows.
The origin correction parameter is calculated as the difference value (ΔX, ΔY) between the coordinate value of the position of the central calibration mark 56(A) of the calibration jig 55 recognized by processing the image captured by the fixed camera 51 and the coordinate value of the position of the same central calibration mark 56(A) recognized by processing the image captured by the hand camera 40.
To calculate the coordinate axis tilt correction parameter, the tilt angle of the straight line connecting the calibration marks 56(B, C) at the two corners of the calibration jig 55 is calculated from the image captured by the hand camera 40, and the difference between the tilt angle of the line connecting the two corner calibration marks 56(B, C) in the vision coordinate system of the fixed camera 51, calculated in step 105, and the tilt angle of the line connecting the same marks in the vision coordinate system of the hand camera 40 is taken as the coordinate axis tilt correction parameter.
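The two correction parameters of step 115 reduce to simple differences, as sketched below. The mark coordinates are placeholders, and it is assumed here that both sets of coordinates have already been expressed in the same length unit; the disclosure itself only specifies the two difference computations.

```python
import math

def line_tilt_deg(p1, p2):
    """Tilt angle of the line through two points, in degrees."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))

# Mark A position recognized by each camera (placeholder values, same length unit)
A_fixed = (1012.4, 733.8)   # fixed camera, vision coordinate system
A_hand = (1010.1, 735.5)    # hand camera, vision coordinate system

# Marks B and C recognized by each camera (placeholder values)
B_fixed, C_fixed = (120.0, 112.0), (1160.0, 118.0)
B_hand, C_hand = (118.5, 110.2), (1158.8, 114.9)

# Origin correction parameter: difference of the mark A positions
delta_x = A_fixed[0] - A_hand[0]
delta_y = A_fixed[1] - A_hand[1]

# Coordinate axis tilt correction parameter: difference of the B-C line tilt angles
delta_rz = line_tilt_deg(B_fixed, C_fixed) - line_tilt_deg(B_hand, C_hand)

print(f"origin correction (dX, dY) = ({delta_x:.3f}, {delta_y:.3f}), tilt correction = {delta_rz:.4f} deg")
```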
In the vision coordinate system of the fixed camera 51, the coordinate value of a recognition target is recognized in units of pixels, whereas the coordinate values of the world coordinate system are expressed in units of length (for example, [μm]). It is therefore necessary to convert the pixel-unit coordinate values of the vision coordinate system of the fixed camera 51 into coordinate values in the same length unit as the world coordinate system (for example, [μm]).
Therefore, the pixel-unit coordinate values of the vision coordinate system of the fixed camera 51 are multiplied by the resolution [μm/pixel] of the fixed camera 51 calculated in step 105, thereby converting them into coordinate values in the same length unit as the world coordinate system (for example, [μm]).
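Putting the pieces together, converting a fixed-camera vision coordinate (in pixels) into a world coordinate (in μm) could look like the sketch below: scale by the resolution, rotate by the coordinate axis tilt correction parameter, and shift by the origin correction parameter. The order of rotation and translation and all numeric values are assumptions for illustration; the document states only that these three quantities are used.

```python
import math

RESOLUTION_UM_PER_PX = 19.2              # assumed resolution from step 105
TILT_CORRECTION_DEG = 0.85               # assumed coordinate axis tilt correction parameter
ORIGIN_CORRECTION_UM = (1530.0, -240.0)  # assumed origin correction parameter (dX, dY)

def fixed_vision_to_world(px: float, py: float) -> tuple[float, float]:
    """Convert a fixed-camera vision coordinate [pixel] into a world coordinate [um]."""
    # 1) pixel -> length unit of the world coordinate system
    x_um, y_um = px * RESOLUTION_UM_PER_PX, py * RESOLUTION_UM_PER_PX
    # 2) correct the tilt of the vision coordinate axes
    t = math.radians(TILT_CORRECTION_DEG)
    xr = x_um * math.cos(t) - y_um * math.sin(t)
    yr = x_um * math.sin(t) + y_um * math.cos(t)
    # 3) correct the origin offset
    return xr + ORIGIN_CORRECTION_UM[0], yr + ORIGIN_CORRECTION_UM[1]

print(fixed_vision_to_world(812.0, 455.0))
```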
After the calibration execution program of FIGS. 4 and 5 ends, the operator removes the calibration jig 55 set at the predetermined position in the work area 38, and the calibration work is complete.
In the calibration execution program of FIGS. 4 and 5, the calibration mark 56(A) at the center of the calibration jig 55 is used as the single specific point for origin correction, but a single calibration mark 56 at another position may be used instead; the point is simply that information on the position of the calibration jig 55 can be acquired by image processing.
Likewise, in the calibration execution program of FIGS. 4 and 5, the calibration marks 56(B, C) at the two corners of the calibration jig 55 are used as the two specific points for coordinate axis tilt correction, but two calibration marks 56 at other positions may be used instead; the point is that the tilt angle of the straight line connecting the two specific points can be acquired as information on the tilt angle of the calibration jig 55. Alternatively, a straight line may be formed on the upper surface of the calibration jig 55 and its tilt angle measured by image processing, or one edge of the calibration jig 55 may be recognized by image processing to measure the tilt angle of the calibration jig 55. In short, it suffices that angle information on the tilt angle of the calibration jig 55 can be acquired by image processing.
Furthermore, in the calibration execution program of FIGS. 4 and 5, the imaging and image processing by the fixed camera 51 (steps 104 and 105) are performed before the imaging and image processing by the hand camera 40 (steps 106 to 114). The world coordinate values of the calibration marks (A), (B), and (C) to which the hand camera moves in steps 106, 109, and 112 have therefore already been recognized in the image processing of the fixed camera in step 105, and these world coordinate values are instructed automatically by the control unit 37. In other words, the operator does not need to re-specify the world coordinate values of the calibration marks (A), (B), and (C) according to the placement position of the calibration jig 55.
The calibration execution program of FIGS. 4 and 5 described above is executed by the robot control unit 42 when the installation work of the robot 11 is completed. However, when a situation arises in which recalibration is judged necessary, such as an increase in the frequency of work errors of the robot 11, the robot control unit 42 may re-execute the calibration execution program of FIGS. 4 and 5 to remeasure each correction parameter. Alternatively, the program may be re-executed and each correction parameter remeasured at the time of inspection or repair of the robot 11, at the time of an abnormal stop (error stop), or whenever the accumulated operating time, accumulated operating days, or number of error occurrences (error rate) of the robot 11 reaches a predetermined value.
In this case, each correction parameter may be updated with the measured value every time it is remeasured. Alternatively, the correction parameters measured when the installation work of the robot 11 was completed may be stored in the non-volatile memory of the robot control unit 42, and when the correction parameters are remeasured, each remeasured value may be compared with the value stored in the non-volatile memory and updated to the remeasured value only when the difference between the two exceeds a predetermined threshold.
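A minimal sketch of this threshold-based update policy is given below; the parameter names, the threshold values, and the use of a plain dictionary standing in for the non-volatile memory of the robot control unit 42 are assumptions made for illustration.

```python
# Stored parameters measured at installation time (stand-in for non-volatile memory)
stored = {"origin_dx": 1530.0, "origin_dy": -240.0, "tilt_deg": 0.85}
# Thresholds above which a remeasured value replaces the stored one (assumed values)
thresholds = {"origin_dx": 50.0, "origin_dy": 50.0, "tilt_deg": 0.10}

def update_if_exceeds(remeasured: dict) -> dict:
    """Update each stored correction parameter only when the remeasured value
    differs from the stored value by more than its threshold."""
    for key, new_value in remeasured.items():
        if abs(new_value - stored[key]) > thresholds[key]:
            stored[key] = new_value
    return stored

print(update_if_exceeds({"origin_dx": 1601.3, "origin_dy": -238.0, "tilt_deg": 0.87}))
```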
In either case, when, for example, the relative position/angle relationship between the fixed camera 51 and the robot 11 changes over time, the installation angle of the fixed camera 51 changes over time, or the relative position/angle relationship between the fixed camera 51 and the work area 38 (the set position of the calibration jig) changes over time and recalibration becomes necessary, each correction parameter can be remeasured and corrected to an appropriate value.
According to the present embodiment described above, the calibration jig 55 placed at a predetermined position is imaged by both the fixed camera 51 and the hand camera 40, and information on the position and tilt angle of the calibration jig 55 is recognized in each vision coordinate system by image processing. The origin correction parameter can therefore be calculated from the positional relationship of the calibration jig 55 between the two coordinate systems, and the coordinate axis tilt correction parameter can be calculated from the relationship of the tilt angles of the calibration jig 55 between the two coordinate systems. As a result, even if the coordinate axes of the vision coordinate system of the fixed camera 51 are tilted with respect to the coordinate axes of the world coordinate system, which is the coordinate system of the robot 11, the tilt of the coordinate axes and the deviation of the origin of the vision coordinate system can be accurately corrected by the coordinate axis tilt correction parameter and the origin correction parameter, and coordinate values of the vision coordinate system of the fixed camera 51 can be accurately converted into coordinate values of the world coordinate system.
In this embodiment, the calibration marks 56 arranged on the upper surface of the calibration jig 55 are circular, but they may have another shape, such as a quadrangle.
Needless to say, the present invention can be implemented with various modifications without departing from the gist thereof, such as appropriately changing the configuration of the robot 11.
11... Robot, 14... First joint shaft, 15... First arm, 16... Second joint shaft, 17... Second arm, 18... Third joint shaft, 19... Third arm, 20... Fourth joint shaft, 21... Wrist (arm tip), 22... Fifth joint shaft, 23... End effector, 25-29... Servo motors, 31-35... Encoders, 36... Servo amplifier, 37... Control unit, 38... Work area, 39... Supply device, 40... Hand camera, 41... Lens, 42... Robot control unit, 43... Image processing unit, 44... Coordinate conversion unit, 45... Coordinate conversion parameter calculation unit, 50... Fixed structure, 51... Fixed camera, 52... Lens, 55... Calibration jig, 56... Calibration mark
Claims (5)
- A robot calibration system comprising:
a robot that performs a predetermined work on a workpiece supplied to a work area;
a hand camera attached to an arm of the robot;
a fixed camera that is fixed downward at a position higher than an arm movable region of the robot and that images the workpiece supplied to the work area from above;
an image processing unit that processes an image captured by the hand camera to recognize the position of an imaging target as coordinate values of a coordinate system whose origin is a reference point of that image (hereinafter "vision coordinate system of the hand camera"), and that processes an image captured by the fixed camera to recognize the position of an imaging target as coordinate values of a coordinate system whose origin is a reference point of that image (hereinafter "vision coordinate system of the fixed camera");
a coordinate conversion parameter calculation unit that calculates coordinate conversion parameters for converting coordinate values of the vision coordinate system of the fixed camera into coordinate values of a world coordinate system, which is the coordinate system used to control operation of the arm of the robot;
a coordinate conversion unit that converts coordinate values of the vision coordinate system of the fixed camera into coordinate values of the world coordinate system using the coordinate conversion parameters;
a control unit that controls the position of the arm of the robot with coordinate values of the world coordinate system; and
a calibration jig placed at a predetermined position within the field of view of the fixed camera,
wherein the coordinate conversion parameters include an origin correction parameter for correcting the origin of the vision coordinate system of the fixed camera and a coordinate axis tilt correction parameter for correcting the tilt of the coordinate axes of the vision coordinate system of the fixed camera,
when the coordinate conversion parameter calculation unit calculates the coordinate conversion parameters, the control unit controls operation of the arm of the robot so that the calibration jig fits within the field of view of the hand camera and moves the hand camera, and the image processing unit processes an image of the calibration jig captured by the hand camera to recognize information on the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera, and processes an image of the calibration jig captured by the fixed camera to recognize information on the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera, and
the coordinate conversion parameter calculation unit calculates the origin correction parameter and the coordinate axis tilt correction parameter based on the relationship between the information on the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera and the information on the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera, as recognized by the image processing unit.
- The robot calibration system according to claim 1, wherein, when the coordinate conversion parameter calculation unit calculates the origin correction parameter, the control unit controls operation of the arm of the robot so that the calibration jig fits within the field of view of the hand camera and moves the hand camera, and the image processing unit processes an image of the calibration jig captured by the hand camera to recognize the position of one specific point of the calibration jig, as information on the position of the calibration jig, as a coordinate value of the vision coordinate system of the hand camera, and processes an image of the calibration jig captured by the fixed camera to recognize the position of the same specific point, as information on the position of the calibration jig, as a coordinate value of the vision coordinate system of the fixed camera, and
the coordinate conversion parameter calculation unit calculates the origin correction parameter based on the positional relationship between the coordinate value of the position of the specific point in the vision coordinate system of the hand camera recognized by the image processing unit and the coordinate value of the position of the specific point in the vision coordinate system of the fixed camera.
- The robot calibration system according to claim 1 or 2, wherein, when the coordinate conversion parameter calculation unit calculates the coordinate axis tilt correction parameter, the control unit controls operation of the arm of the robot so that the calibration jig fits within the field of view of the hand camera and moves the hand camera, and the image processing unit processes an image of the calibration jig captured by the hand camera to recognize the positions of two specific points of the calibration jig as coordinate values of the vision coordinate system of the hand camera, and processes an image of the calibration jig captured by the fixed camera to recognize the positions of the same two specific points as coordinate values of the vision coordinate system of the fixed camera, and
the coordinate conversion parameter calculation unit calculates the tilt angle of the calibration jig in the vision coordinate system of the hand camera based on the coordinate values of the positions of the two specific points in the vision coordinate system of the hand camera, calculates information on the tilt angle of the calibration jig in the vision coordinate system of the fixed camera based on the coordinate values of the positions of the two specific points in the vision coordinate system of the fixed camera, and calculates the coordinate axis tilt correction parameter based on the difference between the two pieces of information on the tilt angle of the calibration jig calculated in the vision coordinate system of the hand camera and in the vision coordinate system of the fixed camera.
- The robot calibration system according to any one of claims 1 to 3, wherein an operator sets the calibration jig at a predetermined position within the field of view of the fixed camera when the calibration jig is to be imaged by the hand camera and the fixed camera, and removes it after imaging is completed.
- A robot calibration method using a robot that performs a predetermined work on a workpiece supplied to a work area, a hand camera attached to an arm of the robot, a fixed camera that is fixed downward at a position higher than an arm operation range of the robot and that images the workpiece supplied to the work area from above, a control unit that controls operation of the arm of the robot with coordinate values of a world coordinate system, and a calibration jig placed at a predetermined position within the field of view of the fixed camera, the method comprising:
a step of processing an image of the calibration jig captured by the fixed camera and recognizing information on the position and tilt angle of the calibration jig in a coordinate system whose origin is a reference point of that image (hereinafter "vision coordinate system of the fixed camera");
a step of controlling operation of the arm of the robot so that the calibration jig fits within the field of view of the hand camera, moving the hand camera, processing an image of the calibration jig captured by the hand camera, and recognizing information on the position and tilt angle of the calibration jig in a coordinate system whose origin is a reference point of that image (hereinafter "vision coordinate system of the hand camera"); and
a step of calculating coordinate conversion parameters for converting coordinate values of the vision coordinate system of the fixed camera into coordinate values of the world coordinate system based on the relationship between the information on the position and tilt angle of the calibration jig in the vision coordinate system of the fixed camera and the information on the position and tilt angle of the calibration jig in the vision coordinate system of the hand camera.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/045423 WO2020121396A1 (en) | 2018-12-11 | 2018-12-11 | Robot calibration system and robot calibration method |
JP2020558830A JP7153085B2 (en) | 2018-12-11 | 2018-12-11 | ROBOT CALIBRATION SYSTEM AND ROBOT CALIBRATION METHOD |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2018/045423 WO2020121396A1 (en) | 2018-12-11 | 2018-12-11 | Robot calibration system and robot calibration method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020121396A1 true WO2020121396A1 (en) | 2020-06-18 |
Family
ID=71077156
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/045423 WO2020121396A1 (en) | 2018-12-11 | 2018-12-11 | Robot calibration system and robot calibration method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7153085B2 (en) |
WO (1) | WO2020121396A1 (en) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112109374A (en) * | 2020-08-26 | 2020-12-22 | 合肥工业大学 | Method for positioning and controlling assembling and disassembling of bending die based on computer vision system |
CN112208113A (en) * | 2020-08-13 | 2021-01-12 | 苏州赛米维尔智能装备有限公司 | Automatic heat-conducting cotton attaching device based on visual guidance and attaching method thereof |
CN113093650A (en) * | 2021-04-14 | 2021-07-09 | 曹智军 | Data communication method for workpiece visual positioning of numerical control machine tool |
CN113276115A (en) * | 2021-05-21 | 2021-08-20 | 南京航空航天大学 | Hand-eye calibration method and device without robot movement |
CN113601158A (en) * | 2021-08-23 | 2021-11-05 | 深圳职业技术学院 | Bolt feeding and pre-tightening system based on visual positioning and control method |
CN113635311A (en) * | 2021-10-18 | 2021-11-12 | 杭州灵西机器人智能科技有限公司 | Method and system for out-of-hand calibration of eye for fixing calibration plate |
CN113753530A (en) * | 2021-10-12 | 2021-12-07 | 华南农业大学 | Machine vision tea branch mandarin orange gesture recognition and automatic adjusting device |
CN113843797A (en) * | 2021-10-08 | 2021-12-28 | 北京工业大学 | Automatic dismounting method for part hexagon bolt in non-structural environment based on monocular and binocular mixed vision |
CN114830911A (en) * | 2022-05-19 | 2022-08-02 | 苏州大学 | Intelligent weeding method and device and storage medium |
CN114872038A (en) * | 2022-04-13 | 2022-08-09 | 欣旺达电子股份有限公司 | Micro-needle buckling vision self-calibration system and calibration method thereof |
CN114939865A (en) * | 2021-02-17 | 2022-08-26 | 精工爱普生株式会社 | Calibration method |
CN116277035A (en) * | 2023-05-15 | 2023-06-23 | 北京壹点灵动科技有限公司 | Robot control method and device, processor and electronic equipment |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102719072B1 (en) * | 2024-02-26 | 2024-10-18 | 주식회사 포디아이비젼 | Apparatus and method for hand-eye calibration |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014151427A (en) * | 2013-02-14 | 2014-08-25 | Canon Inc | Robot system and control method therefor |
US9193073B1 (en) * | 2014-10-15 | 2015-11-24 | Quanta Storage Inc. | Robot calibration apparatus for calibrating a robot arm |
JP2016052695A (en) * | 2014-09-03 | 2016-04-14 | キヤノン株式会社 | Robot device and control method of robot device |
JP2017100240A (en) * | 2015-12-01 | 2017-06-08 | セイコーエプソン株式会社 | Control device, robot and robot system |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH01305303A (en) * | 1988-06-03 | 1989-12-08 | Yamatake Honeywell Co Ltd | Method and apparatus for visual measurement |
JP2009269110A (en) * | 2008-05-02 | 2009-11-19 | Olympus Corp | Assembly equipment |
JP2012030320A (en) * | 2010-07-30 | 2012-02-16 | Dainippon Screen Mfg Co Ltd | Work system, working robot controller, and work program |
- 2018-12-11 WO PCT/JP2018/045423 patent/WO2020121396A1/en active Application Filing
- 2018-12-11 JP JP2020558830A patent/JP7153085B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014151427A (en) * | 2013-02-14 | 2014-08-25 | Canon Inc | Robot system and control method therefor |
JP2016052695A (en) * | 2014-09-03 | 2016-04-14 | キヤノン株式会社 | Robot device and control method of robot device |
US9193073B1 (en) * | 2014-10-15 | 2015-11-24 | Quanta Storage Inc. | Robot calibration apparatus for calibrating a robot arm |
JP2017100240A (en) * | 2015-12-01 | 2017-06-08 | セイコーエプソン株式会社 | Control device, robot and robot system |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112208113A (en) * | 2020-08-13 | 2021-01-12 | 苏州赛米维尔智能装备有限公司 | Automatic heat-conducting cotton attaching device based on visual guidance and attaching method thereof |
CN112109374A (en) * | 2020-08-26 | 2020-12-22 | 合肥工业大学 | Method for positioning and controlling assembling and disassembling of bending die based on computer vision system |
CN114939865A (en) * | 2021-02-17 | 2022-08-26 | 精工爱普生株式会社 | Calibration method |
CN114939865B (en) * | 2021-02-17 | 2023-06-02 | 精工爱普生株式会社 | Calibration method |
CN113093650A (en) * | 2021-04-14 | 2021-07-09 | 曹智军 | Data communication method for workpiece visual positioning of numerical control machine tool |
CN113093650B (en) * | 2021-04-14 | 2023-08-11 | 曹智军 | Data communication method for visual positioning of workpiece of numerical control machine tool |
CN113276115A (en) * | 2021-05-21 | 2021-08-20 | 南京航空航天大学 | Hand-eye calibration method and device without robot movement |
CN113601158B (en) * | 2021-08-23 | 2023-06-02 | 深圳职业技术学院 | Bolt feeding pre-tightening system based on visual positioning and control method |
CN113601158A (en) * | 2021-08-23 | 2021-11-05 | 深圳职业技术学院 | Bolt feeding and pre-tightening system based on visual positioning and control method |
CN113843797A (en) * | 2021-10-08 | 2021-12-28 | 北京工业大学 | Automatic dismounting method for part hexagon bolt in non-structural environment based on monocular and binocular mixed vision |
CN113843797B (en) * | 2021-10-08 | 2023-08-01 | 北京工业大学 | Automatic disassembly method for part hexagonal bolt under non-structural environment based on single-binocular hybrid vision |
CN113753530A (en) * | 2021-10-12 | 2021-12-07 | 华南农业大学 | Machine vision tea branch mandarin orange gesture recognition and automatic adjusting device |
CN113753530B (en) * | 2021-10-12 | 2024-04-12 | 华南农业大学 | Machine vision tea branch citrus gesture recognition and automatic adjustment device |
CN113635311A (en) * | 2021-10-18 | 2021-11-12 | 杭州灵西机器人智能科技有限公司 | Method and system for out-of-hand calibration of eye for fixing calibration plate |
CN114872038A (en) * | 2022-04-13 | 2022-08-09 | 欣旺达电子股份有限公司 | Micro-needle buckling vision self-calibration system and calibration method thereof |
CN114872038B (en) * | 2022-04-13 | 2023-12-01 | 欣旺达电子股份有限公司 | Micro-needle buckling vision self-calibration system and calibration method thereof |
CN114830911A (en) * | 2022-05-19 | 2022-08-02 | 苏州大学 | Intelligent weeding method and device and storage medium |
CN116277035A (en) * | 2023-05-15 | 2023-06-23 | 北京壹点灵动科技有限公司 | Robot control method and device, processor and electronic equipment |
CN116277035B (en) * | 2023-05-15 | 2023-09-12 | 北京壹点灵动科技有限公司 | Robot control method and device, processor and electronic equipment |
Also Published As
Publication number | Publication date |
---|---|
JPWO2020121396A1 (en) | 2021-09-02 |
JP7153085B2 (en) | 2022-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020121396A1 (en) | Robot calibration system and robot calibration method | |
US10099380B2 (en) | Robot, robot control device, and robot system | |
JP4267005B2 (en) | Measuring apparatus and calibration method | |
JP2022028672A (en) | System and method for automatic hand-eye calibration of vision system for robot motion | |
US20110029131A1 (en) | Apparatus and method for measuring tool center point position of robot | |
US11267142B2 (en) | Imaging device including vision sensor capturing image of workpiece | |
US11964396B2 (en) | Device and method for acquiring deviation amount of working position of tool | |
US20130123982A1 (en) | Calibration method for tool center point of a robot manipulator | |
WO2018092243A1 (en) | Working-position correcting method and working robot | |
JP6869159B2 (en) | Robot system | |
JP2013063474A (en) | Robot system and imaging method | |
JP7057841B2 (en) | Robot control system and robot control method | |
JP6900290B2 (en) | Robot system | |
JP7281910B2 (en) | robot control system | |
JP2007122705A (en) | Welding teaching point correction system and calibration method | |
US20110118876A1 (en) | Teaching line correcting apparatus, teaching line correcting method, and program thereof | |
JP6912529B2 (en) | How to correct the visual guidance robot arm | |
JP2009125839A (en) | Weld teaching position correction system | |
JP2021013983A (en) | Apparatus and method for acquiring deviation of moving locus of moving machine | |
WO2018173192A1 (en) | Articulated robot parallelism determination method and articulated robot inclination adjustment device | |
CN114571199B (en) | Screw locking machine and screw positioning method | |
JPH06218684A (en) | Instruction device for operation type manipulator/and automatic work by operation type manipulator | |
JP6965422B2 (en) | Camera parallelism judgment method | |
CN113905859B (en) | Robot control system and robot control method | |
JP2000117466A (en) | Teaching method of yag laser beam machine, and its device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18942686 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020558830 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18942686 Country of ref document: EP Kind code of ref document: A1 |