WO2022180674A1 - Camera positional deviation measuring device and positional deviation measuring method - Google Patents


Info

Publication number
WO2022180674A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
hand
measurement
robot
measurement hole
Prior art date
Application number
PCT/JP2021/006829
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Tomoki Kawasaki (川崎 智紀)
Original Assignee
株式会社Fuji
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Fuji filed Critical 株式会社Fuji
Priority to JP2023501705A priority Critical patent/JP7562827B2/ja
Priority to PCT/JP2021/006829 priority patent/WO2022180674A1/ja
Priority to CN202180091564.9A priority patent/CN116806186A/zh
Publication of WO2022180674A1 publication Critical patent/WO2022180674A1/ja

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • This specification discloses a camera positional deviation measuring device and a positional deviation measuring method.
  • Conventionally, a robot system has been proposed that includes a robot having a robot arm and a robot camera, and an external camera installed separately from the robot, and that obtains calibration data for the external camera by imaging a calibration pattern with the pre-calibrated robot camera and with the external camera (see, for example, Patent Document 1).
  • In this system, the calibration pattern is imaged by the robot camera and the external camera to acquire respective pattern images, and the calibration data for the external camera is obtained based on the acquired pattern images and the known calibration data for the robot camera.
  • In view of this, the main object of the present disclosure is to appropriately measure the positional deviation of a camera attached to the arm of a robot.
  • In order to achieve the above object, the camera positional deviation measuring device of the present disclosure is a device that measures the positional deviation of a hand camera in a robot system including a robot having a robot arm with the hand camera attached thereto, and an external camera installed outside the robot. The device includes: a jig member installed above the external camera and having a first measurement hole penetrating it vertically; a first position acquisition unit that images the jig member with the external camera and acquires the position of the first measurement hole from the resulting captured image; a second position acquisition unit that moves the hand camera to the position acquired by the first position acquisition unit, images the jig member with the hand camera, and acquires the position of the first measurement hole from the resulting captured image; and a positional deviation measuring unit that moves the hand of the robot to the position of the first measurement hole acquired by the second position acquisition unit, images the hand of the robot with the external camera, and measures the positional deviation of the camera from the resulting captured image.
  • In short, the camera positional deviation measuring device of the present disclosure measures the positional deviation of a hand camera attached to a robot arm using an external camera, and includes a jig member having a first measurement hole installed above the external camera. The device acquires the position of the first measurement hole by imaging the jig member with the external camera, moves the hand camera to the acquired position, and images the jig member with the hand camera to acquire the position of the first measurement hole again. The device then moves the hand of the robot to the acquired position of the first measurement hole, images the hand of the robot with the external camera, and measures the positional deviation of the camera. In this way, the positional deviation of the camera attached to the arm of the robot can be measured appropriately.
  • FIG. 1 is an external perspective view of a robot system.
  • FIG. 2 is a block diagram showing the electrical connection relationship between the robot and the control device.
  • FIG. 3 is an explanatory diagram illustrating the world coordinate system.
  • FIG. 4 is an explanatory diagram illustrating the base coordinate system, the camera coordinate system, and the mechanical interface coordinate system.
  • FIG. 5 is an explanatory diagram showing the hand camera position measurement procedure.
  • FIG. 6 is an external view of the marked end effector.
  • FIG. 7 is a flowchart showing the work camera position measurement process for hand camera position measurement.
  • FIG. 8 is an explanatory diagram showing how the work camera position is measured.
  • FIG. 9 is an explanatory diagram illustrating the deviation amount of the work camera position.
  • FIG. 10 is an explanatory diagram showing how the hole jig is attached.
  • FIG. 11 is a flowchart showing an example of the hole position measurement process for hand camera position measurement.
  • FIG. 12 is an explanatory diagram showing how the three-point holes of the hole jig are imaged by the work camera.
  • FIG. 13 is a flowchart showing an example of the hand camera angle measurement process.
  • FIG. 14 is an explanatory diagram showing how the hole jig is imaged by the hand camera.
  • FIG. 15 is an explanatory diagram showing the phases of the three-point holes measured by the work camera and the hand camera.
  • FIG. 16 is a flowchart showing an example of the hand camera position measurement process.
  • FIG. 17 is an explanatory diagram showing how the marked end effector is imaged by the work camera.
  • FIG. 18 is an explanatory diagram illustrating the distance from the hand camera center to the center hole.
  • FIG. 19 is an explanatory diagram showing the relationship between the mark pin position (average value), the center position of the hole for hand camera position measurement, and the amount of positional deviation.
  • FIG. 1 is an external perspective view of the robot.
  • FIG. 2 is a block diagram showing the electrical connection relationship between the robot and the controller.
  • the robot system 1 is a work robot system, and for example, picks up a work W (component) supplied by a work supply unit P and mounts it on a board S.
  • This robot system 1 includes a workbench 2, a robot 10, a hand camera 60 attached to a robot arm 20 of the robot 10, a work camera 70 serving as an external camera, and a control device 90 that controls the robot 10. The workbench 2 is provided with a vertically extending main pole 3a and a vertically extending sub pole 3b separated from the main pole 3a.
  • the robot 10 is a SCARA robot in this embodiment and includes a base 11 and a robot arm 20 .
  • the base 11 is fixed to the workbench 2 and supports the base end side of the robot arm 20 .
  • the robot arm 20 includes a first arm 21 , a first arm drive section 30 , a second arm 22 , a second arm drive section 40 , a shaft 23 and a shaft drive section 50 .
  • The base end of the first arm 21 is connected to the base 11 via the first joint shaft J1, and is configured to be rotatable (horizontally pivotable) in a horizontal plane with respect to the base 11 as the first joint shaft J1 rotates.
  • The base end of the second arm 22 is connected to the distal end of the first arm 21 via the second joint shaft J2, and is configured to be rotatable (horizontally pivotable) in the horizontal plane with respect to the first arm 21 as the second joint shaft J2 rotates.
  • The shaft 23 is connected to the distal end of the second arm 22 via the third joint axis J3, is rotatable about the third joint axis J3 with respect to the second arm 22, and is configured to be able to ascend and descend along the axial direction of the third joint axis J3. In the robot system 1 of the present embodiment, a work holding portion 24 that picks up and holds a work W is provided as an end effector at the tip of the shaft 23.
  • Examples of the work holding unit 24 include a suction nozzle that sucks the work W with negative pressure, a mechanical chuck that grips the work W with a pair of claws, and an electromagnetic chuck that attracts the work W with an electromagnet.
  • the first arm driving section 30 includes a motor 32 and an encoder 34.
  • the rotating shaft of the motor 32 is connected to the first joint shaft J1 via a speed reducer (not shown).
  • the first arm drive unit 30 drives the motor 32 to rotate the first arm 21 about the first joint axis J1 by torque transmitted to the first joint axis J1 through the speed reducer.
  • the encoder 34 is attached to the rotary shaft of the motor 32 and configured as a rotary encoder that detects the amount of rotational displacement of the motor 32 .
  • the second arm driving section 40 includes a motor 42 and an encoder 44, similar to the first arm driving section 30.
  • the rotating shaft of the motor 42 is connected to the second joint shaft J2 via a speed reducer (not shown).
  • the second arm drive unit 40 drives the motor 42 to rotate the second arm 22 about the second joint axis J2 by torque transmitted to the second joint axis J2 via the speed reducer.
  • the encoder 44 is attached to the rotating shaft of the motor 42 and configured as a rotary encoder that detects the amount of rotational displacement of the motor 42 .
  • the shaft drive unit 50 includes motors 52a, 52b and encoders 54a, 54b.
  • a rotating shaft of the motor 52a is connected to the shaft 23 via a belt (not shown).
  • the shaft drive unit 50 rotates the shaft 23 around its axis by driving the motor 52a.
  • a rotating shaft of the motor 52b is connected to the shaft 23 via a ball screw mechanism (not shown).
  • By driving the motor 52b, the shaft drive unit 50 moves the shaft 23 vertically, with the ball screw mechanism converting the rotary motion of the motor 52b into linear motion.
  • the encoder 54 a is configured as a rotary encoder that detects the amount of rotational displacement of the shaft 23 .
  • The encoder 54b is configured as a linear encoder that detects the vertical position of the shaft 23.
  • the control device 90 includes a CPU 91, a ROM 92 for storing processing programs, a RAM 93 as a work memory, a storage device 94 such as an HDD or SSD, and an input/output interface (not shown).
  • Position signals from the encoders 34, 44, 54a, 54b, image signals from the hand camera 60, and the like are input to the control device 90 via an input/output interface.
  • Driving signals to the motors 32, 42, 52a, 52b and the like are output from the control device 90 via the input/output interface.
  • the hand camera 60 is attached to the tip of the second arm 22 .
  • the hand camera 60 captures an image of the workpiece W in the workpiece supply unit P from above and outputs the captured image to the control device 90 .
  • the control device 90 recognizes the position of the workpiece W by processing the captured image. Since the hand camera 60 is attached to the second arm 22, the robot arm 20 cannot move the hand camera 60 vertically in this embodiment.
  • the work camera 70 is installed between the work supply section P and the board S on the workbench 2 .
  • the work camera 70 is housed in a housing box 71 having a rectangular opening 71a at the top.
  • the work camera 70 captures an image of the work W held by the work holding section 24 of the robot 10 from below and outputs the captured image to the control device 90 .
  • the control device 90 determines whether or not the work W is normally held by the work holding section 24 by processing the captured image.
  • The operation of the robot system 1 includes a work position recognition operation, a picking operation, a picking confirmation operation, and a mounting operation.
  • In the work position recognition operation, the work W is imaged by the hand camera 60 and the position of the work W is recognized.
  • In the picking operation, the hand (work holding unit 24) of the robot system 1 is moved to the recognized position of the work W to pick up the work W.
  • In the picking confirmation operation, the work W picked up by the hand is moved above the work camera 70, and the work W is imaged by the work camera 70 to check whether the picked-up state of the work W is good.
  • In the mounting operation, the picked-up work W is mounted on the board S.
  • the robot system 1 has a world coordinate system ⁇ w , a base coordinate system ⁇ b , a camera coordinate system ⁇ c , and a mechanical interface coordinate system ⁇ m as coordinate systems.
  • the world coordinate system ⁇ w has its origin set at the tip of the main pole 3a, the X-axis set in the direction passing through the tip of the main pole 3a and the tip of the sub-pole 3b, the Z-axis set in the vertical direction, and the Y-axis is set in a direction perpendicular to the X and Z axes.
  • the origin of the base coordinate system ⁇ b is set at the bottom surface of the base 11 of the robot 10, and the X-, Y-, and Z-axes are set so as to match the corresponding axes of the world coordinate system ⁇ w .
  • the origin of the mechanical interface coordinate system ⁇ m is set at the tip of the robot arm 20, and the X-, Y-, and Z-axes are set to match the corresponding axes of the world coordinate system ⁇ w .
  • The camera coordinate system Σc has its origin set at the bottom surface of the second arm 22 at the center of the third joint axis J3, and its X-, Y-, and Z-axes are set to match the corresponding axes of the world coordinate system Σw.
  • the world coordinate system ⁇ w , the base coordinate system ⁇ b , the camera coordinate system ⁇ c , and the mechanical interface coordinate system ⁇ m are mutually transformed using a transformation matrix.
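  • The mutual transformation among these coordinate systems can be illustrated with homogeneous transformation matrices. The following Python sketch is illustrative only; the base-origin offset used is a hypothetical value, not one taken from this specification:

```python
import numpy as np

def make_transform(theta_z, tx, ty, tz):
    """Homogeneous transform: rotation theta_z about Z, then translation (tx, ty, tz)."""
    c, s = np.cos(theta_z), np.sin(theta_z)
    return np.array([[c, -s, 0.0, tx],
                     [s,  c, 0.0, ty],
                     [0.0, 0.0, 1.0, tz],
                     [0.0, 0.0, 0.0, 1.0]])

# Hypothetical pose of the base coordinate system in the world frame:
T_w_b = make_transform(0.0, 250.0, 100.0, 0.0)   # base origin at (250, 100, 0) mm

# A point expressed in the base frame converts to the world frame by
# multiplying with the transform; the inverse matrix converts back.
p_b = np.array([50.0, 20.0, 30.0, 1.0])          # homogeneous coordinates
p_w = T_w_b @ p_b                                # base -> world
p_b_again = np.linalg.inv(T_w_b) @ p_w           # world -> base
```

Chaining such matrices likewise converts between the world, base, camera, and mechanical interface coordinate systems.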
  • In the work position recognition operation, the CPU 91 of the control device 90 first sets target positions Xbtag, Ybtag, Zbtag of the hand camera 60, as viewed from the base 11 (the base coordinate system Σb shown in FIG. 1), for capturing an image of the work W supplied from the work supply section P.
  • Subsequently, the CPU 91 solves the inverse kinematics for the target positions Xbtag, Ybtag, Zbtag to calculate the angle θJ1 of the first joint axis J1 and the angle θJ2 of the second joint axis J2 that bring the position of the hand camera 60 into agreement with the target positions.
  • The CPU 91 then calculates estimated positions Xbest, Ybest, Zbest of the hand camera 60 by solving the forward kinematics based on the calculated angles θJ1 and θJ2, taking into account error factors such as the twist and deflection of the first arm 21 and the deflection of the second arm 22.
  • After calculating the estimated positions, the CPU 91 obtains the differences between the estimated positions Xbest, Ybest, Zbest and the target positions Xbtag, Ybtag, Zbtag to calculate positional deviation amounts ΔXb, ΔYb, ΔZb. The CPU 91 then offsets the target positions by the positional deviation amounts and solves the inverse kinematics again to calculate angle command values θJ1* and θJ2* for moving the hand camera 60 to the target positions.
  • The CPU 91 controls the corresponding motors 32 and 42 by feedback control so that the angle of the first joint axis J1 detected by the encoder 34 coincides with the calculated angle command value θJ1* and the angle of the second joint axis J2 detected by the encoder 44 coincides with the calculated angle command value θJ2*.
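  • The correction sequence above (inverse kinematics, forward kinematics including error factors, then offsetting the target by the estimated deviation) can be sketched for a planar two-link arm. The link lengths and the simple additive "droop" error term below are hypothetical stand-ins for the twist and deflection factors, not values from this specification:

```python
import numpy as np

L1, L2 = 300.0, 250.0  # hypothetical link lengths (mm)

def inverse_kinematics(x, y):
    """Joint angles (rad) that place the arm tip at (x, y), elbow-down branch."""
    d2 = x * x + y * y
    cos_j2 = (d2 - L1 * L1 - L2 * L2) / (2.0 * L1 * L2)
    j2 = np.arccos(np.clip(cos_j2, -1.0, 1.0))
    j1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(j2), L1 + L2 * np.cos(j2))
    return j1, j2

def forward_kinematics(j1, j2, droop=0.0):
    """Tip position; `droop` stands in for twist/deflection error factors."""
    x = L1 * np.cos(j1) + L2 * np.cos(j1 + j2)
    y = L1 * np.sin(j1) + L2 * np.sin(j1 + j2) + droop
    return x, y

# Target position, a priori IK solution, and the predicted position with error:
x_tag, y_tag = 400.0, 150.0
j1, j2 = inverse_kinematics(x_tag, y_tag)
x_est, y_est = forward_kinematics(j1, j2, droop=0.5)

# Offset the target by the predicted deviation and solve the IK again:
dx, dy = x_est - x_tag, y_est - y_tag
j1_cmd, j2_cmd = inverse_kinematics(x_tag - dx, y_tag - dy)
x_chk, y_chk = forward_kinematics(j1_cmd, j2_cmd, droop=0.5)
```

With the offset applied, the predicted position including the error term lands back on the original target.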
  • Next, the CPU 91 captures an image of the work W supplied from the work supply section P with the hand camera 60, and recognizes the position of the work W by performing image processing on the captured image. The CPU 91 then sets the target positions Xbtag and Ybtag of the hand (work holding unit 24) for picking up the work W to the recognized position of the work W, moves the hand to the target positions Xbtag, Ybtag, Zbtag, and proceeds to the picking operation.
  • The target positions Xbtag and Ybtag are set based on the recognized position of the work W, and the target position Zbtag is set based on height information of the work W input in advance.
  • In the picking operation, the CPU 91 first calculates, by inverse kinematics, the angle θJ1 of the first joint axis J1, the angle θJ2 of the second joint axis J2, the angle (shaft angle) θJ4 of the shaft 23, and the elevation position Zs of the shaft 23 for moving the hand to the set target positions (Xbtag, Ybtag, Zbtag).
  • Subsequently, the CPU 91 calculates estimated positions Xbest, Ybest, Zbest of the hand by forward kinematics based on the calculated θJ1, θJ2, θJ4, and Zs, taking into account error factors such as the twist and deflection of the first arm 21 and the deflection of the second arm 22.
  • The CPU 91 calculates the positional deviation amounts ΔXb, ΔYb, ΔZb by taking the differences between the estimated positions Xbest, Ybest, Zbest of the hand and the target positions Xbtag, Ybtag, Zbtag.
  • The CPU 91 then offsets the target positions Xbtag, Ybtag, Zbtag by the positional deviation amounts ΔXb, ΔYb, ΔZb and solves the inverse kinematics again to calculate angle command values θJ1*, θJ2*, θJ4* and an elevation position command value Zs* for moving the hand to the target positions.
  • The CPU 91 controls the corresponding motors 32, 42, 52a, 52b by feedback control based on the respective command values.
  • As described above, the robot system 1 captures an image of the work W with the hand camera 60 attached to the robot arm 20, recognizes the position of the work W, moves the tip of the robot arm 20 to the recognized position, and picks up the work W. Therefore, unless the positional relationship between the robot system 1 and the hand camera 60 is properly grasped, the position of the work W cannot be accurately recognized by the hand camera 60; in other words, the robot system 1 cannot pick up the work W appropriately. In the present embodiment, therefore, the positional deviation of the hand camera 60 from its design position is measured in advance, and the measured positional deviation is reflected in the control of the work position recognition operation so that the position of the work W can be accurately recognized. It is assumed that the robot system 1 has already been calibrated.
  • FIG. 5 is an explanatory diagram showing the hand camera position measurement procedure.
  • In step S10, the marked end effector EE (mark pin) is attached to the tip of the shaft 23.
  • FIG. 6 is an external view of the marked end effector EE.
  • the marked end effector EE is a pin-shaped member that extends vertically while attached to the shaft 23 .
  • a mark M is provided at the tip of the pin of the marked end effector EE.
  • In step S20, the robot system 1 executes a work camera position measurement process for hand camera position measurement, which measures the position of the work camera 70 in the world coordinate system Σw (work camera positions Xwwc, Ywwc, θwwc for hand camera position measurement).
  • FIG. 7 is a flowchart showing an example of work camera position measurement processing for hand camera position measurement executed by the CPU 91 of the control device 90 .
  • In this process, the CPU 91 first receives an input of the control point Zm (a device-specific value) of the marked end effector EE from the operator (step S100).
  • the control point Zm is the distance between the tip of the shaft 23 and the tip of the marked end effector EE (mark M).
  • the coordinates of the tip of the marked end effector EE are set at a position Zm away from the origin of the mechanical interface coordinate system ⁇ m in the Z-axis direction.
  • the CPU 91 waits until the operator presses the work camera position measurement button (step S110).
  • the CPU 91 moves the end effector EE with the mark above the work camera 70 (step S120).
  • Then, the CPU 91 measures the position of the work camera 70 (work camera positions Xwwc, Ywwc, θwwc) by imaging the mark M with the work camera 70 (step S130).
  • Specifically, while moving the tip of the marked end effector EE in the X-axis direction and in the Y-axis direction so as to pass through the design center position of the work camera 70, the CPU 91 captures images of the mark M with the work camera 70.
  • the CPU 91 obtains the movement trajectory of the mark M in the obtained captured image in the X-axis direction and the Y-axis direction, respectively.
  • the CPU 91 obtains the positional deviation amounts ⁇ x and ⁇ y of the work camera 70 by calculating the difference between the image center of the captured image and the intersection point of each movement trajectory obtained.
  • the CPU 91 calculates the rotation deviation amount ⁇ of the work camera 70 from the difference in the rotation direction between the X-axis direction of the captured image and the X-axis direction of the movement locus.
  • The work camera positions Xwwc and Ywwc are obtained by offsetting the design center position of the work camera 70 by the positional deviation amounts Δx and Δy, and the work camera position θwwc is obtained by offsetting the design orientation of the work camera 70 by the rotational deviation amount Δθ.
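  • The measurement of steps S120 to S130 can be sketched as follows: fit the two mark trajectories, intersect them to obtain the positional deviation of the camera center, and take the angle of the X-direction trajectory as the rotational deviation. The synthetic data and the line-fitting approach below are illustrative, not taken from the specification:

```python
import numpy as np

def camera_deviation(traj_x, traj_y, image_center=(0.0, 0.0)):
    """Work camera deviation from two mark trajectories.

    traj_x: mark positions (pixels) while the pin moves along the world X axis.
    traj_y: mark positions while the pin moves along the world Y axis.
    """
    xs1, ys1 = np.array(traj_x, dtype=float).T
    m1, c1 = np.polyfit(xs1, ys1, 1)         # X pass: y = m1*x + c1
    xs2, ys2 = np.array(traj_y, dtype=float).T
    m2, c2 = np.polyfit(ys2, xs2, 1)         # Y pass (near vertical): x = m2*y + c2
    # Intersection of the two trajectories = measured camera center.
    yi = (m1 * c2 + c1) / (1.0 - m1 * m2)
    xi = m2 * yi + c2
    dx, dy = xi - image_center[0], yi - image_center[1]
    dtheta = np.arctan(m1)                   # rotation of the image X axis
    return dx, dy, dtheta

# Synthetic check: camera center offset by (3, -2) px and rotated by 0.02 rad.
theta, ox, oy = 0.02, 3.0, -2.0
ts = np.linspace(-20.0, 20.0, 9)
traj_x = [(ox + t * np.cos(theta), oy + t * np.sin(theta)) for t in ts]
traj_y = [(ox - t * np.sin(theta), oy + t * np.cos(theta)) for t in ts]
dx, dy, dtheta = camera_deviation(traj_x, traj_y)
```

The recovered (dx, dy, dtheta) reproduce the injected offset and rotation.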
  • After measuring the work camera positions Xwwc, Ywwc, θwwc in this way, the CPU 91 registers them in the storage device 94 (step S140) and ends this process.
  • In step S30, a hole jig 80 for hand camera position measurement is attached to the opening 71a provided at the top of the housing box 71.
  • FIG. 10 is an explanatory diagram showing how the hole jig is attached.
  • The hole jig 80 is fixed by a fixture 82 while fitted in the opening 71a of the housing box 71.
  • The hole jig 80 has a center hole 81a, which is located directly above the work camera 70 and penetrates the jig vertically, and three-point holes 81b, which are located around the center hole 81a at intervals in the circumferential direction and also penetrate the jig vertically.
  • the number of holes of the three-point hole 81b is not limited to three, and may be two or may be four or more.
  • In step S40, the robot system 1 executes a hole position measurement process for hand camera position measurement, which measures the position of the center hole 81a of the hole jig 80 in the world coordinate system Σw (center hole positions Xwwv, Ywwv, Zwwv) using the work camera 70.
  • FIG. 11 is a flowchart showing an example of the hole position measurement process for hand camera position measurement executed by the CPU 91 of the control device 90.
  • the CPU 91 first waits until the operator presses the hand camera position measurement button (step S200).
  • Next, the CPU 91 images the hole jig 80 with the work camera 70 and measures the spacing am of the three-point holes 81b of the hole jig 80 from the captured image (step S210).
  • FIGS. 12A and 12B are explanatory diagrams showing how the three-point holes of the hole jig are imaged by the work camera. The spacing am of the three-point holes 81b is measured as the distance between the three-point holes 81b appearing in the captured image.
  • The CPU 91 calculates the center hole position Zwwv for hand camera position measurement by the following equation (1) based on the measured spacing am of the three-point holes 81b (step S220), and registers the calculated center hole position Zwwv in the storage device 94 (step S230).
  • In equation (1), a0 indicates the design spacing of the three-point holes 81b, WD indicates the working distance of the work camera 70, and Z indicates the imaging height of the work camera 70.
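  • Equation (1) itself is not reproduced here; as one plausible pinhole-model form consistent with the quantities named above, the height can be related to the measured spacing as follows. This is an illustrative sketch, not the patent's formula:

```python
def jig_height_from_spacing(a_m, a_0, wd):
    """Estimate the distance from the work camera to the hole jig.

    Pinhole-model sketch (NOT the patent's equation (1)): a pattern of
    design spacing a_0 imaged at the working distance wd appears at its
    nominal size, so the apparent spacing a_m scales inversely with the
    actual distance to the jig.
    """
    return wd * a_0 / a_m
```

For example, a pattern that appears 1.25 times larger than its design spacing lies at 80% of the working distance.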
  • the CPU 91 measures the center hole positions X wwv and Y wwv of the hole jig 80 in the world coordinate system ⁇ w from the captured image obtained in step S210 (step S240). Then, the CPU 91 registers the measured center hole positions X wwv , Y wwv in the storage device 94 (step S250), and terminates this process.
  • In step S50, the robot system 1 executes a hand camera angle measurement process for measuring the angle (hand camera position θci) of the hand camera 60.
  • FIG. 13 is a flow chart showing an example of hand camera angle measurement processing executed by the CPU 91 of the control device 90 .
  • In this process, the CPU 91 first images the hole jig 80 for hand camera position measurement with the work camera 70, measures the phase θwwv of the three-point holes 81b of the hole jig 80 in the world coordinate system Σw (step S300), and registers the measured three-point hole phase θwwv in the storage device 94 (step S310).
  • The three-point hole phase θwwv is measured by obtaining the angle, with respect to the X axis, of the triangle connecting the holes of the three-point holes 81b appearing in the captured image. Subsequently, as shown in FIG. 14, the CPU 91 moves the control point of the hand camera 60 to the work camera positions Xwwc and Ywwc for hand camera position measurement registered in step S20 and the center hole position Zwwv registered in step S40 (step S320).
  • Next, the CPU 91 images the hole jig 80 with the hand camera 60 and measures the phase θwhv of the three-point holes 81b of the hole jig 80 in the world coordinate system Σw from the captured image in the same manner as in step S300 (step S330). The CPU 91 then calculates the hand camera position θci based on the three-point hole phase θwwv measured with the work camera 70 in steps S300 and S310 and the three-point hole phase θwhv measured with the hand camera 60 in step S330 (step S340), registers the calculated hand camera position θci in the storage device 94 (step S350), and ends this process.
  • FIG. 15 is an explanatory diagram showing the three-point hole phase ⁇ measured with the work camera and the hand camera.
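  • The phase measurement and the angle calculation of steps S300 to S340 can be sketched as follows. The convention used for the triangle's phase (angle of the vector from the centroid to the first hole) is an assumption for illustration; the specification does not fix a particular convention here:

```python
import numpy as np

def three_hole_phase(holes):
    """Phase of the triangle connecting the three hole centers, relative to
    the X axis. Convention (assumed): angle of the vector from the triangle
    centroid to the first hole."""
    pts = np.array(holes, dtype=float)
    v = pts[0] - pts.mean(axis=0)
    return float(np.arctan2(v[1], v[0]))

def hand_camera_angle(theta_wwv, theta_whv):
    """Hand camera angle theta_ci as the difference between the phase seen by
    the hand camera and by the calibrated work camera, wrapped to (-pi, pi]."""
    d = theta_whv - theta_wwv
    return float(np.arctan2(np.sin(d), np.cos(d)))

# Synthetic check: the same three-hole pattern, rotated 0.1 rad between views.
angles = np.pi / 2 + np.array([0.0, 2.0, 4.0]) * np.pi / 3.0
holes_wc = [(5.0 * np.cos(t), 5.0 * np.sin(t)) for t in angles]
holes_hc = [(5.0 * np.cos(t + 0.1), 5.0 * np.sin(t + 0.1)) for t in angles]
theta_ci = hand_camera_angle(three_hole_phase(holes_wc),
                             three_hole_phase(holes_hc))
```

Because the two measurements view the same physical holes, any consistent phase convention yields the same difference θci.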
  • In step S60, the robot system 1 executes a hand camera position measurement process for measuring the position of the hand camera 60 (hand camera positions Xci, Yci).
  • FIG. 16 is a flowchart showing an example of hand camera position measurement processing executed by the control device.
  • the CPU 91 first waits until the operator presses the hand camera position measurement button (step S400).
  • Next, the CPU 91 moves the control point of the hand camera 60 to the work camera positions Xwwc and Ywwc for hand camera position measurement registered in step S20 and the center hole position Zwwv for hand camera position measurement registered in step S40 (step S410).
  • the CPU 91 takes an image of the hole jig 80 with the hand camera 60, and measures the center hole positions X whv and Y whv from the obtained captured image (step S420).
  • the CPU 91 calculates the distance L between the image center of the captured image and the measured center hole positions X whv and Y whv (step S430).
  • the distance L can be calculated by the following equation (2), where the center of the image is the origin, the center hole position X whv is a, and the center hole position Y whv is b.
  • the CPU 91 determines whether or not the calculated distance L is equal to or less than a predetermined allowable value Lref (step S440).
  • If the CPU 91 determines that the distance L is not equal to or less than the allowable value Lref, it returns to step S410 and repeats the process until the distance L becomes equal to or less than the allowable value Lref. Note that the process may proceed to step S450 regardless of whether the distance L is equal to or less than the allowable value Lref.
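  • The re-centering loop of steps S410 to S440 can be sketched with equation (2), L = sqrt(a^2 + b^2), as the convergence criterion. The callbacks, the allowable value Lref, and the toy camera-offset model below are hypothetical:

```python
import math

L_REF = 0.05       # allowable value Lref (hypothetical units)
MAX_TRIES = 10

def centering_loop(measure_hole, move_to, target):
    """Move -> image -> check, repeated until the center hole sits within
    L_REF of the image center."""
    for _ in range(MAX_TRIES):
        move_to(target)
        a, b = measure_hole()              # hole center in the image (a, b)
        L = math.hypot(a, b)               # equation (2): L = sqrt(a^2 + b^2)
        if L <= L_REF:
            return a, b, L
        # Nudge the control point toward where the hole appears in the image.
        target = (target[0] + a, target[1] + b)
    raise RuntimeError("did not converge within L_REF")

# Toy model: the hand camera is mounted with a small (hypothetical) offset.
true_hole = (10.0, 20.0)
offset = (0.4, -0.3)
state = {}

def move_to(t):
    state["t"] = t

def measure_hole():
    tx, ty = state["t"]
    return (true_hole[0] - tx - offset[0], true_hole[1] - ty - offset[1])

a, b, L = centering_loop(measure_hole, move_to, true_hole)
```

After convergence the commanded position differs from the hole position by exactly the camera mounting offset, which is what the subsequent steps go on to measure.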
  • Next, the CPU 91 moves the control point of the marked end effector EE (mark pin) to the work camera positions Xwwc and Ywwc for hand camera position measurement registered in step S20 and the center hole position Zwwv registered in step S40 (step S450). Subsequently, the CPU 91 rotates the marked end effector EE to angles of 0°, 90°, 180°, and 270°, images the mark M with the work camera 70 at each angle, and measures the mark pin positions (X0, Y0), (X90, Y90), (X180, Y180), (X270, Y270) in the world coordinate system Σw (step S460).
  • the CPU 91 calculates the average values X mc and Y mc of the mark pin positions (X0, Y0), (X90, Y90), (X180, Y180), (X270, Y270) by the following equation (3).
  • Next, the CPU 91 calculates the differences between the center hole positions Xwwv, Ywwv for hand camera position measurement measured by the work camera 70 in step S40 and the mark pin positions (average values) Xmc, Ymc by the following equation (4), thereby calculating the positional deviation amounts Xme and Yme of the center of the end effector from the center positions Xwwv and Ywwv of the hole for hand camera position measurement (step S190).
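  • Averaging the mark positions measured at 0°, 90°, 180°, and 270° (equation (3)) cancels any eccentricity of the mark M with respect to the rotation axis of the shaft 23; the deviation then follows as a difference (equation (4)). A sketch with synthetic data, where the sign convention of the difference is assumed for illustration:

```python
import numpy as np

def averaged_pin_position(pin_positions):
    """Equation (3): average the mark pin positions measured at 0, 90, 180,
    and 270 degrees; over a full turn the eccentricity of the mark relative
    to the rotation axis cancels out."""
    return np.array(pin_positions, dtype=float).mean(axis=0)

def deviation_from_hole(center_hole, pin_avg):
    """Equation (4): deviation of the end effector center from the center
    hole position (sign convention assumed)."""
    return np.subtract(pin_avg, center_hole)

# Synthetic check: rotation axis at (5.0, 3.0), mark eccentric by 0.2.
axis = np.array([5.0, 3.0])
angles = np.deg2rad([0.0, 90.0, 180.0, 270.0])
measured = [axis + 0.2 * np.array([np.cos(t), np.sin(t)]) for t in angles]
avg = averaged_pin_position(measured)       # eccentricity cancels
dev = deviation_from_hole((4.9, 3.1), avg)  # hypothetical hole center
```

The average lands back on the rotation axis even though every individual measurement is displaced by the eccentricity.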
  • The CPU 91 then transforms the positional deviation amounts Xme and Yme from the world coordinate system Σw into the camera coordinate system Σc, and calculates new hand camera positions Xci and Yci by offsetting the design hand camera positions Xmo and Ymo (device-specific values) by the transformed positional deviation amounts Xrme and Yrme (step S470).
  • The positional deviation amounts Xme and Yme are converted by rotating them by the phase difference θrm between the world coordinate system Σw and the camera coordinate system Σc.
  • Here, let θrw be the phase of the base origin of the robot 10 in the world coordinate system Σw, and let θJ1 and θJ2 be the rotation angles of the first joint axis J1 and the second joint axis J2, respectively, when the center hole 81a of the hole jig 80 is imaged by the hand camera 60.
  • The phase difference θrm is calculated by the following equation (5). The converted positional deviation amounts Xrme and Yrme can therefore be calculated using the rotation matrices of the following equations (6) and (7).
  • The new hand camera positions Xci and Yci can then be calculated by the following equation (8) using the design hand camera positions Xmo and Ymo, which are device-specific values. After calculating the new hand camera positions Xci and Yci in this way, the CPU 91 registers them in the storage device 94 (step S480).
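  • Equations (5) to (9) can be sketched as follows. Since the equation bodies are not reproduced here, the form θrm = θrw + θJ1 + θJ2, the plain 2D rotation, and the Euclidean change amount are assumptions chosen to match the surrounding description:

```python
import numpy as np

def to_camera_frame(x_me, y_me, theta_rw, theta_j1, theta_j2):
    """Rotate the deviation from the world frame into the camera frame.

    Assumed forms: theta_rm = theta_rw + theta_j1 + theta_j2 (equation (5))
    and a rotation by -theta_rm (equations (6) and (7))."""
    theta_rm = theta_rw + theta_j1 + theta_j2
    c, s = np.cos(theta_rm), np.sin(theta_rm)
    x_rme = c * x_me + s * y_me
    y_rme = -s * x_me + c * y_me
    return x_rme, y_rme

def update_hand_camera_position(x_mo, y_mo, x_rme, y_rme):
    """Equation (8), assumed form: offset the design hand camera position by
    the converted deviation."""
    return x_mo + x_rme, y_mo + y_rme

def change_amount(x_new, y_new, x_old, y_old):
    """Equation (9), assumed Euclidean form: magnitude of the position update."""
    return float(np.hypot(x_new - x_old, y_new - y_old))
```

With θrm = 0 the deviation passes through unchanged, and with θrm = π/2 the world X deviation maps onto the negative camera Y axis.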
  • Subsequently, the CPU 91 calculates the hand camera position change amount E using the following equation (9) (step S490), and determines whether or not the hand camera position change amount E is equal to or less than a predetermined allowable value Eref (step S500).
  • When the CPU 91 determines that the hand camera position change amount E is not equal to or less than the allowable value Eref, it returns to step S400.
  • When the change amount E is equal to or less than the allowable value Eref, the correct hand camera positions Xci, Yci, θci are stored in the storage device 94.
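The convergence check above amounts to repeating the measurement until the update to the registered camera position falls within a tolerance. In this sketch, equation (9) is assumed to be the Euclidean distance between the previously registered and newly calculated positions, and `measure_once` is a hypothetical stand-in for one full measurement pass; neither name appears in the disclosure.

```python
import math

def position_change(prev_pos, new_pos):
    """Assumed form of Eq. (9): Euclidean distance between the previously
    registered and the newly calculated hand camera positions."""
    return math.hypot(new_pos[0] - prev_pos[0], new_pos[1] - prev_pos[1])

def calibrate(measure_once, initial_pos, e_ref, max_iter=20):
    """Repeat the measurement pass until the change amount E is at or
    below the allowable value Eref.

    measure_once -- hypothetical callback: given the currently registered
                    position, performs one measurement pass and returns a
                    new (Xci, Yci)
    """
    pos = initial_pos
    for _ in range(max_iter):
        new_pos = measure_once(pos)                   # one measurement pass
        if position_change(pos, new_pos) <= e_ref:    # converged: register
            return new_pos
        pos = new_pos                                 # not converged: repeat
    raise RuntimeError("hand camera position did not converge")
```

A callback that halves the remaining error each pass, for example, converges in a handful of iterations.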
  • Since the CPU 91 controls the position of the hand camera 60 using the hand camera positions Xci, Yci, θci stored in the storage device 94, the position of the work W (object) imaged by the hand camera 60 can be grasped accurately, and work on the work W can be performed more accurately.
  • Finally, the marked end effector EE and the hole jig 80 for hand camera position measurement are removed (step S70). This completes the hand camera position measurement procedure.
  • the robot 10 corresponds to the robot
  • the hand camera 60 corresponds to the hand camera
  • the work camera 70 corresponds to the external camera
  • the hole jig 80 corresponds to the jig member
  • the center hole 81a corresponds to the first measurement hole
  • the CPU 91 of the control device 90 that executes the processes of steps S240 and S250 of the hand camera position measurement hole position measurement process corresponds to the first position acquisition unit
  • the CPU 91 of the control device 90 that executes the processes of steps S410 and S420 of the hand camera position measurement process corresponds to the second position acquisition unit, and the CPU 91 that executes the processes of steps S450 to S480 of the hand camera position measurement process corresponds to the positional deviation measurement unit. Further, the CPU 91 of the control device 90 that executes the work camera position measurement process for hand camera position measurement corresponds to the external camera position acquisition unit, and the CPU 91 that executes the processes of steps S300 and S310 of the hand camera angle measurement process corresponds to the first phase acquisition unit.
  • the CPU 91 of the control device 90 that executes the processes of steps S320 and S330 of the hand camera angle measurement process corresponds to the second phase acquisition unit
  • the CPU 91 of the control device 90 that executes the process of step S340 of the hand camera angle measurement process corresponds to the phase shift measurement unit
  • the CPU 91 of the control device 90 that executes the processes of steps S210 and S220 of the hand camera position measurement hole position measurement process corresponds to the height acquisition unit
  • In the above embodiment, the robot 10 is configured as a horizontal multi-joint robot (SCARA robot); however, the robot is not limited to this and may have any other configuration, such as a vertical multi-joint robot.
  • As described above, the camera positional deviation measuring device of the present disclosure measures the positional deviation of a hand camera in a system including a robot that has a robot arm with the hand camera attached thereto, and an external camera installed outside the robot. The device comprises: a jig member installed above the external camera and having a first measurement hole penetrating vertically; a first position acquisition unit that images the jig member with the external camera and acquires the position of the first measurement hole from the obtained captured image of the first measurement hole; a second position acquisition unit that moves the hand camera to the acquired position, images the jig member with the hand camera, and acquires the position of the first measurement hole from the obtained captured image of the first measurement hole; and a positional deviation measurement unit that moves the hand of the robot to the position of the first measurement hole acquired by the second position acquisition unit, images the hand of the robot with the external camera, and measures the positional deviation of the hand camera.
  • In this way, the camera positional deviation measuring device of the present disclosure includes a jig member that is installed above the external camera and has a first measurement hole penetrating vertically. The device acquires the position of the first measurement hole by imaging the jig member with the external camera, moves the hand camera to the acquired position, and again acquires the position of the first measurement hole by imaging the jig member with the hand camera. It then moves the hand of the robot to the acquired position of the first measurement hole, images the hand of the robot with the external camera, and measures the positional deviation of the hand camera. Accordingly, the positional deviation of the camera attached to the arm of the robot can be measured appropriately.
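The measurement sequence summarized above might be outlined as follows. Every object and method name here is a placeholder invented for illustration; the disclosure does not define such an API, and the real device performs these steps with the acquisition units described in the text.

```python
def measure_hand_camera_positional_deviation(work_camera, hand_camera, robot, jig):
    """Hypothetical outline of the disclosed measurement sequence.
    All objects and methods are placeholders for illustration only."""
    # 1. Image the jig's first measurement hole with the external (work)
    #    camera and obtain its position.
    hole_pos_external = work_camera.locate_hole(jig.first_hole)

    # 2. Move the hand camera to that position and image the same hole
    #    with the hand camera.
    robot.move_hand_camera_to(hole_pos_external)
    hole_pos_hand = hand_camera.locate_hole(jig.first_hole)

    # 3. Move the robot's hand to the hole position acquired by the hand
    #    camera and image the hand with the external camera.
    robot.move_hand_to(hole_pos_hand)
    hand_pos = work_camera.locate_hand(robot.hand)

    # 4. The difference between the commanded position and where the
    #    external camera observed the hand is the positional deviation.
    return (hand_pos[0] - hole_pos_hand[0], hand_pos[1] - hole_pos_hand[1])
```

The returned pair corresponds to the X and Y deviation that the device would then use to correct the registered hand camera position.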
  • In the camera positional deviation measuring device of the present disclosure, the jig member may further have a plurality of second measurement holes penetrating vertically, separately from the first measurement hole, and the device may comprise: an external camera position acquisition unit that moves the hand of the robot above the external camera, images the hand with the external camera, and acquires the position of the external camera from the obtained captured image of the hand; a first phase acquisition unit that images the jig member with the external camera and acquires the phase of the second measurement holes from the obtained captured image of the second measurement holes; a second phase acquisition unit that moves the hand camera to the position of the external camera, images the jig member with the hand camera, and acquires the phase of the second measurement holes from the obtained captured image of the second measurement holes; and a phase shift measurement unit that measures the phase shift of the hand camera based on the phases of the second measurement holes respectively acquired by the first phase acquisition unit and the second phase acquisition unit.
  • In the camera positional deviation measuring device of the present disclosure, the jig member may have a plurality of second measurement holes penetrating vertically, separately from the first measurement hole, and the device may comprise a height acquisition unit that images the jig member with the external camera and acquires the height of the first measurement hole from the obtained captured image of the second measurement holes, wherein the second position acquisition unit may move the hand camera to the position acquired by the first position acquisition unit and the height acquired by the height acquisition unit. In this way, even if there is an error in the installation height of the jig member, the positional deviation of the hand camera can be measured satisfactorily.
  • Likewise, in the camera positional deviation measuring device of the present disclosure, the jig member may have a plurality of second measurement holes penetrating vertically, separately from the first measurement hole, and the device may comprise a height acquisition unit that images the jig member with the external camera and acquires the height of the first measurement hole from the obtained captured image of the second measurement holes, wherein the positional deviation measurement unit may move the hand of the robot to the position acquired by the second position acquisition unit and the height acquired by the height acquisition unit. In this way, even if there is an error in the installation height of the jig member, the positional deviation of the hand camera can be measured satisfactorily.
  • Although the present disclosure has been described in the form of a camera positional deviation measuring device, it may also take the form of a camera positional deviation measuring method.
  • the present disclosure can be used in the robot system manufacturing industry and the like.
  • 1 robot system, 2 workbench, 3a main pole, 3b sub pole, 10 robot, 11 base, 20 robot arm, 21 first arm, 22 second arm, 23 shaft, 24 work holder, 30 first arm drive unit, 31 motor, 32 motor, 33a motor, 33b motor, 34 encoder, 40 second arm drive unit, 42 motor, 44 encoder, 50 shaft drive unit, 52a motor, 52b motor, 54a encoder, 54b encoder, 60 hand camera, 70 work camera, 71 housing box, 71a opening, 80 hole jig, 81a center hole, 81b 3-point hole, 82 fixture, 90 control device, 91 CPU, 92 ROM, 93 RAM, 94 storage device, EE marked end effector, J1 first joint axis, J2 second joint axis, J3 third joint axis, M mark, P workpiece supply unit, S board.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
PCT/JP2021/006829 2021-02-24 2021-02-24 カメラの位置ずれ測定装置および位置ずれ測定方法 WO2022180674A1 (ja)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2023501705A JP7562827B2 (ja) 2021-02-24 2021-02-24 カメラの位置ずれ測定装置および位置ずれ測定方法
PCT/JP2021/006829 WO2022180674A1 (ja) 2021-02-24 2021-02-24 カメラの位置ずれ測定装置および位置ずれ測定方法
CN202180091564.9A CN116806186A (zh) 2021-02-24 2021-02-24 相机的位置偏移测定装置及位置偏移测定方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/006829 WO2022180674A1 (ja) 2021-02-24 2021-02-24 カメラの位置ずれ測定装置および位置ずれ測定方法

Publications (1)

Publication Number Publication Date
WO2022180674A1 true WO2022180674A1 (ja) 2022-09-01

Family

ID=83047831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/006829 WO2022180674A1 (ja) 2021-02-24 2021-02-24 カメラの位置ずれ測定装置および位置ずれ測定方法

Country Status (3)

Country Link
JP (1) JP7562827B2 (ja)
CN (1) CN116806186A (zh)
WO (1) WO2022180674A1 (ja)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60159905A (ja) * 1984-01-30 1985-08-21 Hitachi Ltd 視覚を備えたロボツトの制御装置
JPS6427885A (en) * 1987-07-21 1989-01-30 Nippon Avionics Co Ltd Part fixture
JP2016052695A (ja) * 2014-09-03 2016-04-14 キヤノン株式会社 ロボット装置、およびロボット装置の制御方法
JP2017100240A (ja) * 2015-12-01 2017-06-08 セイコーエプソン株式会社 制御装置、ロボットおよびロボットシステム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018012184A (ja) * 2016-07-22 2018-01-25 セイコーエプソン株式会社 制御装置、ロボットおよびロボットシステム
CN111791226B (zh) * 2019-12-31 2021-12-03 深圳市豪恩声学股份有限公司 通过机器人实现组装的方法、装置及机器人


Also Published As

Publication number Publication date
CN116806186A (zh) 2023-09-26
JPWO2022180674A1 (ja)
JP7562827B2 (ja) 2024-10-07

Similar Documents

Publication Publication Date Title
JP6661028B2 (ja) 作業位置補正方法
JP6661027B2 (ja) 作業ロボット
JP2012055999A (ja) 物体把持システム、物体把持方法、プログラム、およびロボットシステム
CN105313127A (zh) 机器人、机器人的控制方法以及机器人的控制装置
JP4289619B2 (ja) 多関節ロボットのツール位置補正方法
JP2019069493A (ja) ロボットシステム
JP2007122705A (ja) 溶接教示位置補正システム及びキャリブレーション方法
WO2007138756A1 (ja) 回転中心点算出方法、回転軸線算出方法、プログラムの作成方法、動作方法およびロボット装置
KR20070122271A (ko) 로봇 자세 제어 시스템 및 로봇 자세 제어 방법
WO2022180674A1 (ja) カメラの位置ずれ測定装置および位置ずれ測定方法
CN114939865B (zh) 校准方法
US11230015B2 (en) Robot system
JP2016203282A (ja) エンドエフェクタの姿勢変更機構を備えたロボット
US20240157567A1 (en) Picking system
WO2023032400A1 (ja) 自動搬送装置、及びシステム
JP2024113790A (ja) キャリブレーション方法、キャリブレーション装置およびロボットシステム
JP2024098591A (ja) ロボットの制御方法およびロボットシステム
JP6578671B2 (ja) ロボット、ロボットの制御方法、及びロボットの制御装置
WO2022254613A1 (ja) カメラの位置ずれ補正方法およびロボット装置
JP7510514B2 (ja) オフセット値設定方法およびロボット制御装置
US12330317B2 (en) Calibration method and robot system
CN113905859A (zh) 机器人控制系统及机器人控制方法
EP4385684A1 (en) Robot inspection system
JP2023137158A (ja) ロボットハンドのキャリブレーション方法およびロボットシステム
WO2022118374A1 (ja) スカラロボットの制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21927778

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2023501705

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 202180091564.9

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21927778

Country of ref document: EP

Kind code of ref document: A1