WO2022091328A1 - ロボットシステム、ロボットアーム、エンドエフェクタ、およびアダプタ - Google Patents
- Publication number
- WO2022091328A1 (PCT/JP2020/040779)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image pickup
- pickup device
- image
- robot system
- distance
- Prior art date
Links
- 239000012636 effector Substances 0.000 title claims description 262
- 238000003384 imaging method Methods 0.000 claims abstract description 87
- 230000003287 optical effect Effects 0.000 claims description 77
- 230000036544 posture Effects 0.000 description 45
- 238000012545 processing Methods 0.000 description 37
- 238000010586 diagram Methods 0.000 description 11
- 238000000034 method Methods 0.000 description 11
- 238000006073 displacement reaction Methods 0.000 description 8
- 230000000694 effects Effects 0.000 description 5
- 238000013459 approach Methods 0.000 description 4
- 238000001514 detection method Methods 0.000 description 3
- 238000009434 installation Methods 0.000 description 3
- 238000005259 measurement Methods 0.000 description 3
- 230000002093 peripheral effect Effects 0.000 description 3
- 230000003321 amplification Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000001678 irradiating effect Effects 0.000 description 2
- 230000005484 gravity Effects 0.000 description 1
- 238000005286 illumination Methods 0.000 description 1
- 230000004807 localization Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000003550 marker Substances 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 229910052751 metal Inorganic materials 0.000 description 1
- 239000002245 particle Substances 0.000 description 1
- 239000011347 resin Substances 0.000 description 1
- 229920005989 resin Polymers 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0014—Image feed-back for automatic industrial control, e.g. robot with camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/08—Gripping heads and other end effectors having finger members
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37046—Use simultaneous several pairs of stereo cameras, synchronized
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37425—Distance, range
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37567—3-D vision, stereo vision, with two cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/39—Robotics, robotics to robotics hand
- G05B2219/39575—Wrist, flexible wrist
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40584—Camera, non-contact sensor mounted on wrist, indep from gripper
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40613—Camera, laser scanner on end effector, hand eye manipulator, local
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40617—Agile eye, control position of camera, active vision, pan-tilt camera, follow object
Definitions
- the present invention relates to a robot system, a robot arm, an end effector, and an adapter.
- Patent Document 1 describes a configuration in which an image pickup device is attached to a robot arm.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising a first image pickup device and a second image pickup device attached to the robot arm, a control unit that controls the robot system, and a distance information acquisition unit that acquires information about the distance of an object. The control unit can change the baseline length, which is the distance between the first image pickup device and the second image pickup device, and the distance information acquisition unit acquires the information about the distance of the object based on the baseline length.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising a first image pickup device and a second image pickup device attached to the robot arm, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the robot arm.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an end effector attached to the robot arm, and a first image pickup device and a second image pickup device attached to the end effector, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the end effector.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an adapter for attaching an end effector to the robot arm, and a first image pickup device and a second image pickup device attached to the adapter, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the adapter.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising a first image pickup device and a second image pickup device attached to the robot arm, wherein the relative position between the first image pickup device and the second image pickup device is variable.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an end effector attached to the robot arm, and a first image pickup device and a second image pickup device attached to the end effector, wherein the relative position between the first image pickup device and the second image pickup device is variable.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an adapter for attaching an end effector to the robot arm, and a first image pickup device and a second image pickup device attached to the adapter, wherein the relative position between the first image pickup device and the second image pickup device is variable.
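The aspects above rely on the standard stereo relation between baseline length and depth. The patent text contains no code; the following is a minimal illustrative sketch of that relation (function name and all numeric values are hypothetical, chosen only for demonstration):

```python
def depth_from_disparity(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Distance Z (mm) to a point seen by a rectified stereo pair.

    For a rectified pair, Z = f * B / d, where f is the focal length in
    pixels, B is the baseline (camera separation), and d is the disparity
    in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the pair")
    return focal_px * baseline_mm / disparity_px

# Widening the baseline increases the disparity observed for the same point,
# which is why a variable baseline lets the system trade depth resolution
# against working range.
z_narrow = depth_from_disparity(focal_px=800.0, baseline_mm=50.0, disparity_px=10.0)
z_wide = depth_from_disparity(focal_px=800.0, baseline_mm=100.0, disparity_px=20.0)
```

Doubling the baseline doubles the disparity for a point at the same distance, so both calls above recover the same depth from different measurement geometries.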
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising a first image pickup device movable with respect to the robot arm, and a distance information acquisition unit that acquires information about the distance of an object based on images captured by the first image pickup device. The first image pickup device captures a first image of the object at a first position and a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information about the distance of the object based on the first image and the second image.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an end effector attached to the robot arm, a first image pickup device movable with respect to the end effector, and a distance information acquisition unit that acquires information about the distance of an object based on images captured by the first image pickup device. The first image pickup device captures a first image of the object at a first position and a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information about the distance of the object based on the first image and the second image.
- One aspect of the robot system of the present invention is a robot system having a robot arm with a movable portion, comprising an adapter for attaching an end effector to the robot arm, a first image pickup device movable with respect to the adapter, and a distance information acquisition unit that acquires information about the distance of an object based on images captured by the first image pickup device. The first image pickup device captures a first image of the object at a first position and a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information about the distance of the object based on the first image and the second image.
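In the single-moving-camera aspects above, the two capture positions play the role of the two cameras of a stereo pair: the arm's motion between the first and second position supplies the baseline. A minimal sketch of this "motion stereo" idea, with hypothetical names and values:

```python
def motion_stereo_depth(focal_px: float,
                        pos1_mm: tuple[float, float, float],
                        pos2_mm: tuple[float, float, float],
                        disparity_px: float) -> float:
    """Depth from two images taken by one camera at two known positions.

    The baseline is the distance the arm moved the camera between captures;
    the depth then follows the usual rectified relation Z = f * B / d.
    """
    baseline_mm = sum((a - b) ** 2 for a, b in zip(pos1_mm, pos2_mm)) ** 0.5
    if baseline_mm == 0 or disparity_px <= 0:
        raise ValueError("captures must be from distinct positions with positive disparity")
    return focal_px * baseline_mm / disparity_px
```

Because the arm position acquisition units report where each capture was taken, the baseline needs no dedicated sensor; it is derived from the arm's own pose information.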
- One aspect of the robot system of the present invention comprises a robot arm, three or more image pickup devices that image an object, and a control unit that acquires information about the distance of the object based on image information acquired by two image pickup devices among the three or more image pickup devices.
- One aspect of the robot system of the present invention comprises a robot arm, three or more image pickup devices that image an object, and a control unit that controls at least one of the robot arm and an end effector connected to the robot arm based on image information acquired by two image pickup devices among the three or more image pickup devices.
- One aspect of the robot system of the present invention includes a robot arm and three or more image pickup devices, wherein the three or more image pickup devices are located around any of the robot arm, an end effector connected to the robot arm, and an adapter for attaching the end effector to the robot arm.
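With three or more cameras, the control unit must pick which two to use as a stereo pair. The patent does not specify a selection algorithm; one plausible policy, consistent with the FIG. 12 examples where the chosen baseline varies with zoom magnification, is to pick the pair whose separation is closest to a target baseline (all names and values below are illustrative assumptions):

```python
import itertools
import math


def select_pair(cam_positions: list[tuple[float, float]],
                target_baseline_mm: float) -> tuple[int, int]:
    """Return indices of the two cameras whose separation best matches
    the target baseline (e.g. a smaller baseline at high zoom)."""
    return min(
        itertools.combinations(range(len(cam_positions)), 2),
        key=lambda ij: abs(math.dist(cam_positions[ij[0]],
                                     cam_positions[ij[1]]) - target_baseline_mm),
    )


# Three cameras in a line; a 70 mm target baseline selects the last two.
pair = select_pair([(0.0, 0.0), (30.0, 0.0), (100.0, 0.0)], 70.0)
```

Any other policy (maximizing baseline, preferring unoccluded views, and so on) could be substituted; the point is only that pairwise distance computations over the mounted cameras suffice to drive the choice.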
- One aspect of the robot arm of the present invention has a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- One aspect of the end effector of the present invention is an end effector attached to a robot arm, having a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- One aspect of the adapter of the present invention is an adapter for attaching an end effector to a robot arm, having a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- One aspect of the robot arm of the present invention includes a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the relative position between the first image pickup device and the second image pickup device is variable.
- One aspect of the end effector of the present invention is an end effector attached to a robot arm, having a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the relative position between the first image pickup device and the second image pickup device is variable.
- One aspect of the adapter of the present invention is an adapter for attaching an end effector to a robot arm, having a first holding portion that holds a first image pickup device and a second holding portion that holds a second image pickup device, wherein the relative position between the first image pickup device and the second image pickup device is variable.
- One aspect of the robot arm of the present invention has a holding portion that holds three or more image pickup devices for imaging an object, and a control unit that acquires information about the distance of the object based on image information acquired by two image pickup devices among the three or more image pickup devices.
- FIG. 1 is a perspective view showing the robot system of the first embodiment.
- FIG. 2 is a block diagram showing a part of the configuration of the robot system of the first embodiment.
- FIG. 3 is a perspective view showing a part of the robot arm, the end effector, the adapter, and the image pickup apparatus of the first embodiment.
- FIG. 4 is a plan view showing a part of the robot arm, the end effector, the adapter, and the image pickup apparatus of the first embodiment.
- FIG. 5 is a view of a part of the end effector, the first image pickup device, and the second image pickup device as viewed from the tip side in the central axis direction.
- FIG. 6 is a view of a part of the end effector, the first image pickup device, and the second image pickup device as viewed from the tip side in the central axis direction, showing a case where the first image pickup device and the second image pickup device are located at predetermined initial positions.
- FIG. 7 is a perspective view showing a part of the robot system of the second embodiment.
- FIG. 8 is a view of a part of the robot system of the third embodiment as viewed from the tip side in the central axis direction.
- FIG. 9 is a view of a part of the robot system of the fourth embodiment as viewed from the tip side in the central axis direction.
- FIG. 10 is a view of a part of the robot system of the fifth embodiment as viewed from the tip side in the central axis direction.
- FIG. 11 is a diagram for explaining a part of the procedure when the robot system of the fifth embodiment acquires information regarding the distance of the object.
- FIG. 12A is a diagram showing an example of a case where the zoom magnification of the image pickup apparatus according to the fifth embodiment is relatively small and two images having a relatively large baseline length are selected.
- FIG. 12B is a diagram showing an example of a case where the zoom magnification of the image pickup apparatus according to the fifth embodiment is relatively large and two images having a relatively large baseline length are selected.
- FIG. 12C is a diagram showing an example of a case where the zoom magnification of the image pickup apparatus according to the fifth embodiment is relatively large and two images having a relatively small baseline length are selected.
- FIG. 13 is a perspective view showing the robot system of the sixth embodiment.
- FIG. 14 is a perspective view showing the robot system of the seventh embodiment.
- FIG. 15 is a perspective view showing the robot system of the eighth embodiment.
- FIG. 16 is a perspective view showing the robot system of the ninth embodiment.
- FIG. 1 is a perspective view showing the robot system 10 of the present embodiment.
- FIG. 2 is a block diagram showing a part of the configuration of the robot system 10 of the present embodiment.
- the robot system 10 includes a robot 20, an image pickup device 30, a control unit 40, and a display unit 50.
- the robot 20 works on the object W on the workbench WB, for example.
- the robot 20 has a robot arm 21, an end effector 22, and an adapter 23.
- the robot arm 21 has an arm portion 24 as a movable portion. In this embodiment, a plurality of arm portions 24 are provided.
- the robot arm 21 is, for example, an articulated arm formed by connecting a plurality of arm portions 24.
- the arm portion 24 includes, for example, five arm portions of a first arm portion 24a, a second arm portion 24b, a third arm portion 24c, a fourth arm portion 24d, and a fifth arm portion 24e.
- the first arm portion 24a, the second arm portion 24b, the third arm portion 24c, the fourth arm portion 24d, and the fifth arm portion 24e are connected in this order from the installation surface of the robot arm 21.
- the robot arm 21 has an arm driving unit 25 and an arm position acquisition unit 26.
- the arm drive unit 25 is, for example, a servo motor.
- the arm drive unit 25 is provided for each arm unit 24, for example. That is, for example, five arm drive units 25 are provided.
- the arm drive unit 25 provided on the first arm unit 24a displaces the first arm unit 24a with reference to the installation surface of the robot 20.
- the arm drive portion 25 provided on the second arm portion 24b displaces the second arm portion 24b with reference to the first arm portion 24a.
- the arm drive portion 25 provided on the third arm portion 24c displaces the third arm portion 24c with reference to the second arm portion 24b.
- the arm drive portion 25 provided on the fourth arm portion 24d displaces the fourth arm portion 24d with reference to the third arm portion 24c.
- the arm drive portion 25 provided on the fifth arm portion 24e displaces the fifth arm portion 24e with reference to the fourth arm portion 24d.
- Each arm drive unit 25 rotates each arm unit 24, for example.
- the arm position acquisition unit 26 includes, for example, a rotary encoder (not shown).
- the arm position acquisition unit 26 is provided for each arm unit 24, for example. That is, for example, five arm position acquisition units 26 are provided.
- the arm position acquisition unit 26 provided on the first arm unit 24a can detect the displacement amount of the first arm unit 24a with respect to the installation surface of the robot 20.
- the arm position acquisition portion 26 provided on the second arm portion 24b can detect the displacement amount of the second arm portion 24b with respect to the first arm portion 24a.
- the arm position acquisition portion 26 provided in the third arm portion 24c can detect the displacement amount of the third arm portion 24c with respect to the second arm portion 24b.
- the arm position acquisition portion 26 provided on the fourth arm portion 24d can detect the displacement amount of the fourth arm portion 24d with respect to the third arm portion 24c.
- the arm position acquisition portion 26 provided in the fifth arm portion 24e can detect the displacement amount of the fifth arm portion 24e with respect to the fourth arm portion 24d.
- the displacement amount of each arm unit 24 that can be detected by each arm position acquisition unit 26 includes, for example, the rotation angle of each arm unit 24 detected by a rotary encoder (not shown).
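The arm portions 24 form a serial chain: each rotary encoder reports a displacement relative to the previous arm portion, so the tip pose is obtained by composing the per-joint displacements in order. The patent gives no formulas; the following simplified planar sketch illustrates the composition (the real arm is spatial with five joints, and all link lengths and angles here are hypothetical):

```python
import math


def planar_fk(joint_angles_rad: list[float],
              link_lengths_mm: list[float]) -> tuple[float, float]:
    """Tip position of a planar serial arm.

    Each encoder angle is measured relative to the previous arm portion,
    so the absolute heading accumulates along the chain, mirroring how the
    arm position acquisition units 26 report relative displacements.
    """
    x = y = heading = 0.0
    for theta, length in zip(joint_angles_rad, link_lengths_mm):
        heading += theta  # relative rotation from this joint's encoder
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y


# Straight arm: the tip sits at the summed link length along x.
tip = planar_fk([0.0, 0.0], [100.0, 50.0])
```

The same accumulation is what lets the control unit know where a camera mounted near the tip was located when each image was captured.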
- FIG. 3 is a perspective view showing a part of the robot arm 21 of the present embodiment, an end effector 22, an adapter 23, and an image pickup device 30.
- FIG. 4 is a plan view showing a part of the robot arm 21 of the present embodiment, an end effector 22, an adapter 23, and an image pickup device 30.
- the end effector 22 is attached to the robot arm 21.
- the end effector 22 is attached to the tip end portion of the fifth arm portion 24e via the adapter 23.
- the end effector 22 is detachably attached to, for example, the robot arm 21.
- the end effector 22 can be replaced with another end effector.
- an end effector having various shapes, structures, and functions can be appropriately adopted according to the work performed by the robot 20.
- Examples of the end effector 22 attached to the robot arm 21 include a robot hand capable of gripping the object W, a processing head for laser processing or ultrasonic processing, a camera, an injector that ejects molten metal, molten resin, or particles for blast processing, a manipulator, and an air blower.
- the end effector 22 is a multi-finger type robot hand capable of gripping an object W on the workbench WB.
- the end effector 22 has a base portion 22a, a plurality of finger portions 22b, an end effector driving unit 28, and an end effector position acquisition unit 29.
- the base portion 22a is connected to the fifth arm portion 24e via, for example, the adapter 23.
- the base portion 22a has, for example, a columnar shape centered on the central axis CL shown as appropriate in each figure.
- the central axis CL appropriately shown in each figure is the central axis of the end effector 22, the adapter 23, and the fifth arm portion 24e.
- the direction parallel to the central axis CL is referred to as the "central axis direction", and is appropriately indicated by the Z axis in each figure.
- the positive side (+ Z side) of the Z axis in the central axis direction is called the "tip side”
- the negative side (-Z side) in the central axis direction is called the "base end side”.
- the radial direction centered on the central axis CL is simply referred to as the "radial direction"
- the circumferential direction around the central axis CL is simply referred to as "circumferential direction”.
- the base portion 22a is provided with a guide rail portion 22e.
- the guide rail portion 22e is, for example, an annular groove that surrounds the base portion 22a along the circumferential direction.
- the plurality of finger portions 22b project from the base portion 22a toward the tip end side (+ Z side) in the central axial direction.
- the end effector 22 can grip the object W by a plurality of finger portions 22b.
- the number of finger portions 22b is not particularly limited.
- the end effector drive unit 28 can drive the end effector 22.
- the end effector drive unit 28 has a rotation drive unit 22c.
- the rotation drive unit 22c is provided inside, for example, the base portion 22a.
- the rotation drive unit 22c is, for example, a servomotor capable of rotating the base unit 22a around the central axis CL.
- the end effector drive unit 28 has a finger drive unit that drives a plurality of finger units 22b.
- the finger drive unit is provided for each of the plurality of finger units 22b, for example.
- One finger drive unit may be provided for each finger portion 22b, or a plurality of finger drive portions may be provided for each finger portion 22b.
- the finger drive unit for example, displaces the angle of the finger portion 22b with respect to the base portion 22a.
- the finger drive unit is, for example, a servomotor.
- the end effector position acquisition unit 29 can acquire the relative position of the end effector 22 with respect to the adapter 23.
- the end effector position acquisition unit 29 has a rotation position acquisition unit 22d.
- the rotation position acquisition unit 22d is provided inside, for example, the base portion 22a.
- the rotation position acquisition unit 22d can detect the rotation position of the end effector 22.
- the rotation position acquisition unit 22d can detect, for example, the rotation angle of the base portion 22a around the central axis CL.
- the rotation position acquisition unit 22d is, for example, a rotary encoder.
- the end effector position acquisition unit 29 may include, for example, a sensor capable of detecting the position and angle of the finger portion 22b with respect to the base portion 22a.
- the camera unit 60 is attached to the end effector 22.
- the camera unit 60 is fixed to the base 22a.
- the camera unit 60 includes a support portion 61, a first camera 62, and a second camera 63.
- the support portion 61 projects from the base portion 22a toward the tip end side (+ Z side) in the central axial direction.
- the end portion of the support portion 61 on the base end side (−Z side) is connected to the outer peripheral surface of the end portion on the tip end side of the base portion 22a.
- the first camera 62 and the second camera 63 are fixed to the end portion on the distal end side of the support portion 61.
- the first camera 62 and the second camera 63 can image the object W gripped by the plurality of finger portions 22b and the finger portions 22b.
- the first camera 62 and the second camera 63 constitute a stereo camera. The camera unit 60 is not shown in FIG. 4.
- the camera unit 60 includes an image sensor 64, a memory 65, and a digital signal processing unit 66.
- the image sensor 64 is, for example, a CCD image sensor, a CMOS image sensor, or the like.
- an image sensor 64 is provided in each of the first camera 62 and the second camera 63. Each image sensor 64 converts an optical signal incident on its camera into an analog electric signal, converts that analog signal into a digital image signal, and outputs the digital image signal.
- the digital signal processing unit 66 performs image processing such as digital amplification, color interpolation processing, and white balance processing on the digital image signal output from the image sensor 64.
- the digital image signal processed by the digital signal processing unit 66 may be temporarily stored in the memory 65, or may be output to the control unit 40 without being stored in the memory 65.
- the digital image signal output from the digital signal processing unit 66 to the control unit 40 is output to the distance information acquisition unit 44, which will be described later.
- the memory 65 can store a digital image signal output from the image sensor 64 and a digital image signal output from the digital signal processing unit 66.
- the memory 65 is, for example, a volatile memory.
- the memory 65 may be a non-volatile memory.
- the digital image signal output from the image sensor 64 is, for example, stored in the memory 65, then sent from the memory 65 to the digital signal processing unit 66, where it undergoes image processing.
- the memory 65 and the digital signal processing unit 66 may each be provided singly in the camera unit 60 and shared by the image sensor 64 of the first camera 62 and the image sensor 64 of the second camera 63, or may be provided separately for each image sensor 64. A part or the whole of the memory 65 and the digital signal processing unit 66 may instead be provided outside the camera unit 60, for example in the control unit 40. The camera unit 60 may also be configured with only one camera.
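The sensor-to-DSP flow described above (analog capture, digitization, then digital amplification and white balance) can be sketched as a toy pipeline. This is not the patent's implementation; the function, gains, and channel values are illustrative assumptions:

```python
def process_frame(raw: list[tuple[float, float, float]],
                  gain: float = 1.5,
                  wb: tuple[float, float, float] = (1.1, 1.0, 0.9)) -> list[tuple[float, float, float]]:
    """Toy digital pipeline mirroring the digital signal processing unit 66:
    digital amplification followed by per-channel white balance, with
    clamping to the 8-bit range. `raw` is a list of (R, G, B) pixels in [0, 255].
    """
    out = []
    for r, g, b in raw:
        out.append((
            min(255.0, r * gain * wb[0]),  # amplified, red-balanced
            min(255.0, g * gain * wb[1]),  # amplified, green-balanced
            min(255.0, b * gain * wb[2]),  # amplified, blue-balanced
        ))
    return out
```

In the real device this stage sits between the image sensor 64 (or the buffering memory 65) and the control unit 40, which forwards the processed signal to the distance information acquisition unit.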
- the adapter 23 is a member for attaching the end effector 22 to the robot arm 21. As shown in FIG. 4, the adapter 23 has a support portion 23a, a pedestal portion 23b, a connection portion 23c, and a pedestal drive portion 23d.
- the support portion 23a is a portion connected to the robot arm 21.
- the support portion 23a is detachably connected to, for example, the tip end portion of the fifth arm portion 24e.
- the support portion 23a has a recess 23g that is recessed from the tip end side (+Z side) in the central axis direction toward the base end side (−Z side).
- the support portion 23a may be fixed to the robot arm 21 so as not to be detachable.
- the pedestal portion 23b is arranged on the tip end side (+ Z side) of the support portion 23a.
- the pedestal portion 23b is a portion to which the end effector 22 is connected.
- the base portion 22a of the end effector 22 is detachably connected to the end portion on the tip end side (+ Z side) of the pedestal portion 23b.
- the end effector 22 may be non-detachably fixed to the pedestal portion 23b.
- the portion of the pedestal portion 23b on the base end side ( ⁇ Z side) is inserted into, for example, the inside of the recess 23g.
- a gap is provided between the support portion 23a and the pedestal portion 23b, and the support portion 23a and the pedestal portion 23b are not in direct contact with each other.
- connection portion 23c is provided inside the recess 23g.
- the connecting portion 23c is provided between the support portion 23a and the pedestal portion 23b.
- the connecting portion 23c connects the support portion 23a and the pedestal portion 23b. That is, in the present embodiment, the support portion 23a and the pedestal portion 23b are indirectly connected to each other via the connection portion 23c without directly contacting each other.
- the connecting portion 23c supports the mass of the pedestal portion 23b and the mass of the end effector 22.
- the connection portion 23c has a damper element 23e and a spring element 23f.
- the spring element 23f may be, for example, an elastic member whose elastic force can be adjusted.
- the robot system 10 may have an adjusting portion capable of adjusting the elastic force of the spring element 23f.
- the spring element 23f may be, for example, an air spring.
- the elastic force of the spring element 23f may be adjusted by adjusting the air pressure of the air spring by the adjusting unit.
- a plurality of connecting portions 23c are provided, for example.
- the plurality of connecting portions 23c include, for example, a connecting portion 23c having a damper element 23e and a spring element 23f that are displaced by receiving a force in the central axis direction, and a connecting portion 23c having a damper element 23e and a spring element 23f that are displaced by receiving a force in a direction orthogonal to the central axis direction.
- the connection portion 23c can reduce, for example, the vibration of the end effector 22 and the vibration given from the outside.
- the connecting portion 23c suppresses the displacement of the end effector 22 and the displacement of the pedestal portion 23b due to the own weight of the end effector 22 and the own weight of the pedestal portion 23b.
- the connection portion 23c can support the end effector 22 in the direction of gravity regardless of the posture of the end effector 22.
- the connecting portion 23c may have, for example, in addition to the damper element 23e and the spring element 23f, another element capable of reducing the vibration of the end effector 22.
- the other element includes, for example, a piezoelectric element (piezo element).
- the pedestal drive unit 23d is provided between the support portion 23a and the pedestal portion 23b inside the recess 23g, for example.
- the pedestal drive unit 23d can displace the position of the pedestal portion 23b with respect to the support portion 23a.
- the pedestal drive unit 23d can move the end effector 22 connected to the pedestal unit 23b by moving the pedestal unit 23b.
- the pedestal drive unit 23d has, for example, a plurality of linear motors 27.
- the linear motor 27 is, for example, a voice coil motor.
- the linear motor 27 has a magnetic field generation unit 27a and a magnet unit 27b.
- one of the magnetic field generating portion 27a and the magnet portion 27b is attached to the support portion 23a, and the other is attached to the pedestal portion 23b.
- the magnetic field generating portion 27a is attached to the support portion 23a, and the magnet portion 27b is attached to the pedestal portion 23b.
- the magnetic field generation portion 27a may be attached to the pedestal portion 23b, and the magnet portion 27b may be attached to the support portion 23a.
- the magnetic field generating unit 27a is, for example, a coil.
- a magnetic field is generated by supplying an electric current to the magnetic field generating portion 27a.
- a repulsive force or an attractive force is generated between the magnetic field generating portion 27a and the magnet portion 27b due to the magnetic field generated from the magnetic field generating portion 27a and the magnetic field generated from the magnet portion 27b. Due to this repulsive force or attractive force, the magnet portion 27b is displaced with respect to the magnetic field generating portion 27a.
- the linear motor 27 displaces the pedestal portion 23b to which the magnet portion 27b is attached with respect to the support portion 23a to which the magnetic field generation portion 27a is attached.
- the pedestal drive unit 23d can drive the pedestal portion 23b in a non-contact state, without the support portion 23a and the pedestal portion 23b coming into direct contact with each other.
- the plurality of linear motors 27 include, for example, a linear motor 27 capable of displacing the pedestal portion 23b in the central axis direction with respect to the support portion 23a, and a linear motor 27 capable of displacing the pedestal portion 23b in a direction orthogonal to the central axis direction with respect to the support portion 23a.
- the adapter 23 may have any configuration as long as the end effector 22 can be attached to the robot arm 21.
- as the configuration of the adapter 23, for example, the configuration of the adapter described in international application PCT/JP2019/016043 may be adopted.
- a plurality of image pickup devices 30 are provided in this embodiment.
- Two image pickup devices 30 are provided, for example, a first image pickup device 31 and a second image pickup device 32.
- each of the first image pickup device 31 and the second image pickup device 32 may be, for example, an RGB camera or an infrared camera.
- a stereo camera is configured by the first image pickup device 31 and the second image pickup device 32.
- the first image pickup device 31 and the second image pickup device 32 are attached to the end effector 22.
- the first image pickup device 31 and the second image pickup device 32 are arranged around the end effector 22.
- the first image pickup device 31 and the second image pickup device 32 are located, for example, radially outside the base portion 22a and are arranged along the circumferential direction.
- the optical axis AX1 of the first image pickup device 31 and the optical axis AX2 of the second image pickup device 32 are parallel to each other.
- the optical axes AX1 and AX2 are, for example, parallel to the central axis CL.
- "the optical axes of the plurality of image pickup devices are parallel to each other” means that the optical axes of the plurality of image pickup devices are strictly parallel to each other and the optical axes of the plurality of image pickup devices are parallel to each other. It also includes cases where they are substantially parallel to each other.
- the case where the optical axes of the plurality of image pickup devices are substantially parallel to each other includes, for example, the case where the optical axes of the plurality of image pickup devices are inclined relative to each other by 5° or less.
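The 5° tolerance above can be checked numerically. A minimal sketch, assuming each optical axis is represented by a direction vector; the function name and tolerance handling are illustrative, not part of the disclosure:

```python
import math

def axes_substantially_parallel(v1, v2, tol_deg=5.0):
    """Return True when the angle between two optical-axis direction
    vectors is within tol_deg degrees (the example tolerance for
    "substantially parallel" optical axes)."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # clamp to avoid domain errors from floating-point rounding
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_angle)) <= tol_deg
```

For example, two axes inclined by 3° relative to each other would count as substantially parallel, while axes inclined by 10° would not.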
- FIG. 5 is a view of a part of the end effector 22, the first image pickup device 31, and the second image pickup device 32 as viewed from the tip side (+ Z side) in the central axis direction.
- FIG. 6 is a view of a part of the end effector 22, the first image pickup device 31, and the second image pickup device 32 as viewed from the tip end side (+Z side) in the central axis direction, and shows the case where the first image pickup device 31 and the second image pickup device 32 are located at predetermined initial positions. In FIGS. 5 and 6, the finger portion 22b of the end effector 22 and the camera unit 60 are not shown.
- At least one of the first image pickup device 31 and the second image pickup device 32 is movable with respect to the end effector 22.
- both the first image pickup device 31 and the second image pickup device 32 are movable with respect to the end effector 22.
- the relative positions of the first image pickup device 31 and the second image pickup device 32 are variable.
- at least one of the first image pickup device 31 and the second image pickup device 32 is movable in a predetermined circumferential direction around the end effector 22.
- the "predetermined circumferential direction" is the circumferential direction around the central axis CL around the base 22a.
- both the first image pickup device 31 and the second image pickup device 32 are movable in a predetermined circumferential direction around the end effector 22. That is, in the present embodiment, one of the first image pickup device 31 and the second image pickup device 32 is movable in the predetermined circumferential direction around the end effector 22, and the other is also movable in the predetermined circumferential direction around the end effector 22. As shown in FIG. 6, in the present embodiment, the first image pickup device 31 and the second image pickup device 32 are in contact with each other in the circumferential direction when they are located at the initial positions shown in FIG. 6.
- the first image pickup device 31 includes a housing 31a, a first drive unit 31b, a first position acquisition unit 31c, a lens 31e, and an image pickup element 31f.
- the housing 31a has, for example, an opening on the tip side (+ Z side) and has a cylindrical shape extending in the central axis direction.
- the central axis of the housing 31a coincides with, for example, the optical axis AX1 of the first image pickup apparatus 31.
- the housing 31a is attached to the base 22a of the end effector 22 via the slider 31d.
- the slider 31d is fixed to, for example, a radial inner portion in a portion on the base end side ( ⁇ Z side) of the housing 31a.
- the slider 31d connects the housing 31a and the base portion 22a of the end effector 22. That is, in the present embodiment, the first image pickup apparatus 31 is connected to the end effector 22 via the slider 31d.
- the slider 31d is connected to the guide rail portion 22e of the end effector 22.
- the slider 31d can move in the circumferential direction along the guide rail portion 22e. As a result, the first image pickup apparatus 31 can move in the circumferential direction along the guide rail portion 22e.
- the lens 31e is fitted in the opening on the tip end side (+ Z side) of the housing 31a.
- the lens 31e is, for example, a lens having a circular shape when viewed in the central axis direction.
- the optical axis AX1 of the first image pickup apparatus 31 passes through the center of the lens 31e.
- the image sensor 31f is arranged inside the housing 31a.
- the image sensor 31f is, for example, a CCD image sensor, a CMOS image sensor, or the like.
- Light incident on the housing 31a is incident on the image pickup device 31f via the lens 31e.
- the image pickup element 31f converts the incident optical signal into an analog electric signal, and converts the converted analog electric signal into a digital image signal for output.
- the image sensor 31f has a rectangular shape when viewed in the central axis direction.
- the long side of the image pickup element 31f is orthogonal to the radial direction of the first image pickup apparatus 31 passing through the optical axis AX1.
- the first image pickup device 31 is movable in the circumferential direction while maintaining a state in which the long side of the image pickup element 31f is orthogonal to the radial direction passing through the optical axis AX1 of the first image pickup device 31 when viewed in the central axis direction.
- the first drive unit 31b is arranged inside, for example, the housing 31a.
- the first drive unit 31b is, for example, a servomotor.
- the first drive unit 31b moves the first image pickup device 31 in the circumferential direction around the end effector 22.
- the entire first image pickup device 31, including the first drive unit 31b, moves in the circumferential direction together with the slider 31d.
- the first position acquisition unit 31c is arranged inside, for example, the housing 31a.
- the first position acquisition unit 31c is, for example, a rotary encoder.
- the first position acquisition unit 31c can acquire position information in the circumferential direction of the first image pickup apparatus 31 by detecting the rotation of the first drive unit 31b.
- the first position acquisition unit 31c detects, for example, the number of rotations of the first drive unit 31b, taking the number of rotations of the first drive unit 31b as zero when the first image pickup device 31 is located at the initial position shown in FIG. 6, and thereby detects the circumferential position of the first image pickup device 31.
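The encoder-based position detection above can be sketched as follows. The counts-per-revolution and reduction-ratio values are hypothetical parameters, not values from the document, which only states that rotation of the first drive unit 31b is counted from zero at the initial position:

```python
def circumferential_angle_deg(encoder_counts,
                              counts_per_motor_rev=4096,
                              motor_revs_per_device_rev=200.0):
    """Convert accumulated rotary-encoder counts of the drive unit
    into the circumferential angle of the image pickup device,
    taking the count as zero at the initial position."""
    motor_revs = encoder_counts / counts_per_motor_rev
    return (motor_revs / motor_revs_per_device_rev) * 360.0

# 50 motor revolutions at the assumed 200:1 reduction -> a quarter turn
angle = circumferential_angle_deg(4096 * 50)
```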
- the second image pickup device 32 includes a housing 32a, a second drive unit 32b, a second position acquisition unit 32c, a lens 32e, and an image pickup element 32f.
- the housing 32a has, for example, an opening on the tip side (+ Z side) and has a cylindrical shape extending in the central axis direction.
- the central axis of the housing 32a coincides with, for example, the optical axis AX2 of the second image pickup apparatus 32.
- the housing 32a is attached to the base 22a of the end effector 22 via the slider 32d.
- the slider 32d is fixed to, for example, a radial inner portion in a portion on the base end side ( ⁇ Z side) of the housing 32a.
- the slider 32d connects the housing 32a and the base 22a of the end effector 22. That is, in the present embodiment, the second image pickup apparatus 32 is connected to the end effector 22 via the slider 32d.
- the slider 32d is connected to the guide rail portion 22e of the end effector 22.
- the slider 32d can move in the circumferential direction along the guide rail portion 22e. As a result, the second image pickup apparatus 32 can move in the circumferential direction along the guide rail portion 22e.
- the guide rail portion 22e corresponds to the first holding portion that holds the first image pickup device 31 and corresponds to the second holding portion that holds the second image pickup device 32. That is, in the present embodiment, the end effector 22 has a guide rail portion 22e as a first holding portion in which the first image pickup device 31 is held and a second holding portion in which the second image pickup device 32 is held.
- in the present embodiment, the first image pickup device 31 is movably held by the guide rail portion 22e as the first holding portion, and the second image pickup device 32 is movably held by the guide rail portion 22e as the second holding portion.
- the lens 32e is fitted in the opening on the tip end side (+ Z side) of the housing 32a.
- the lens 32e is, for example, a lens having a circular shape when viewed in the central axis direction.
- the optical axis AX2 of the second image pickup apparatus 32 passes through the center of the lens 32e.
- the image sensor 32f is arranged inside the housing 32a.
- the image sensor 32f is, for example, a CCD image sensor, a CMOS image sensor, or the like.
- Light incident on the housing 32a is incident on the image pickup device 32f via the lens 32e.
- the image pickup element 32f converts the incident optical signal into an analog electric signal, and converts the converted analog electric signal into a digital image signal for output.
- the image sensor 32f has a rectangular shape when viewed in the central axis direction.
- the long side of the image pickup element 32f is orthogonal to the radial direction of the second image pickup apparatus 32 passing through the optical axis AX2.
- the second image pickup device 32 is movable in the circumferential direction while maintaining a state in which the long side of the image pickup element 32f is orthogonal to the radial direction passing through the optical axis AX2 of the second image pickup device 32 when viewed in the central axis direction.
- the image pickup device 32f has, for example, the same shape and size as the image pickup device 31f of the first image pickup device 31.
- the "long side of the image pickup device” is the long side of the image pickup device in the rectangular region where light is incident.
- the image pickup devices 31f and 32f shown in each figure show only the main body portion having a rectangular region in which light is incident.
- the image pickup elements 31f and 32f may have a portion other than the main body portion, such as a frame portion that holds the main body portion on which light is incident. In this case, even if the outer shapes of the image pickup elements 31f and 32f are not rectangular when viewed in the direction of the optical axes AX1 and AX2, the long side of each of the image pickup elements 31f and 32f is the long side of the rectangular region of that element on which light is incident.
- the second drive unit 32b is arranged inside, for example, the housing 32a.
- the second drive unit 32b is, for example, a servo motor.
- the second drive unit 32b moves the second image pickup device 32 in the circumferential direction around the end effector 22.
- the entire second image pickup device 32, including the second drive unit 32b, moves in the circumferential direction together with the slider 32d.
- the first drive unit 31b and the second drive unit 32b constitute a drive unit 33 that drives the image pickup apparatus 30.
- the drive unit 33 can move at least one of the first image pickup device 31 and the second image pickup device 32 with respect to the end effector 22.
- the drive unit 33 can move both the first image pickup device 31 and the second image pickup device 32 with respect to the end effector 22 by each drive unit provided in each image pickup device 30.
- the second position acquisition unit 32c is arranged inside, for example, the housing 32a.
- the second position acquisition unit 32c is, for example, a rotary encoder.
- the second position acquisition unit 32c can acquire position information in the circumferential direction of the second image pickup apparatus 32 by detecting the rotation of the second drive unit 32b.
- the second position acquisition unit 32c detects, for example, the number of rotations of the second drive unit 32b, taking the number of rotations of the second drive unit 32b as zero when the second image pickup device 32 is located at the initial position shown in FIG. 6, and thereby detects the circumferential position of the second image pickup device 32.
- the first position acquisition unit 31c and the second position acquisition unit 32c constitute a position acquisition unit 34 that acquires at least the position information of the first image pickup apparatus 31.
- the position acquisition unit 34 can acquire both the position information of the first image pickup device 31 and the position information of the second image pickup device 32 by each position acquisition unit provided in each image pickup device 30.
- the position acquisition unit 34 can acquire, for example, the circumferential position of the first image pickup device 31 and the circumferential position of the second image pickup device 32.
- each image pickup apparatus 30 has a memory 35 and a digital signal processing unit 36, respectively.
- the digital signal processing unit 36 performs image processing such as digital amplification, color interpolation processing, and white balance processing on the digital image signal output from the image sensor of each image pickup device 30.
- the digital image signal processed by the digital signal processing unit 36 may be temporarily stored in the memory 35, or may be output to the control unit 40 without being stored in the memory 35.
- the digital image signal output from the digital signal processing unit 36 to the control unit 40 is output to the distance information acquisition unit 44, which will be described later.
- the memory 35 can store a digital image signal output from the image pickup element of each image pickup device 30 and a digital image signal output from the digital signal processing unit 36.
- the memory 35 is, for example, a volatile memory.
- the memory 35 may be a non-volatile memory.
- the digital image signal output from the image sensor of each image pickup device 30 is, for example, stored in the memory 35, then sent from the memory 35 to the digital signal processing unit 36, and subjected to image processing in the digital signal processing unit 36.
- the memory 35 and the digital signal processing unit 36 are configured to be provided one in each of the image pickup devices 30, but the present invention is not limited to this.
- the memory 35 and the digital signal processing unit 36 may each be provided singly for the two image pickup devices 30 and used for both the image pickup element 31f of the first image pickup device 31 and the image pickup element 32f of the second image pickup device 32. Further, a part or the whole of the memory 35 and the digital signal processing unit 36 may be provided outside the image pickup devices 30, for example in the control unit 40.
- the control unit 40 controls the robot system 10. As shown in FIG. 2, in the present embodiment, the control unit 40 includes an arm control unit 41, an end effector control unit 42, an image pickup device control unit 43, and a distance information acquisition unit 44.
- Each of the arm control unit 41, the end effector control unit 42, the image pickup device control unit 43, and the distance information acquisition unit 44 may be realized by dedicated hardware, or may be realized by a memory and a microprocessor.
- the arm control unit 41 controls the arm drive unit 25.
- Information regarding the position and posture of the arm unit 24 is input to the arm control unit 41 from the arm position acquisition unit 26, and information regarding the distance of the object W is input to the arm control unit 41 from the distance information acquisition unit 44.
- the arm control unit 41 controls the arm drive unit 25 based on the information regarding the position and posture of the arm unit 24 and the information regarding the distance of the object W. More specifically, the arm control unit 41 calculates, for example, target values of the position and posture of the arm unit 24 based on the information regarding the distance of the object W, and controls the arm drive unit 25 by feedback control using the information from the arm position acquisition unit 26 so that the position and posture of the arm unit 24 reach the target values.
- control unit 40 controls at least one of the position and the posture of the robot arm 21 by controlling the arm drive unit 25 by the arm control unit 41.
- the target value of the position and the posture of the arm unit 24 may be input to the arm control unit 41 from the outside.
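The feedback loop described above, in which the measured position and posture are driven toward target values, can be sketched as a simple proportional controller. The gain and the three-component pose are hypothetical simplifications; the document does not specify the control law used by the arm control unit 41:

```python
def p_control_step(measured, target, gain=0.5):
    """One proportional-feedback step: each measured component moves
    toward its target in proportion to the error, as when the arm
    control unit 41 drives the arm unit 24 using information from
    the arm position acquisition unit 26."""
    return [m + gain * (t - m) for m, t in zip(measured, target)]

pose = [0.0, 0.0, 0.0]        # measured position of the arm tip
target = [10.0, 0.0, 5.0]     # target computed from the object distance
for _ in range(20):           # the error shrinks geometrically each step
    pose = p_control_step(pose, target)
```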
- the end effector control unit 42 controls the end effector drive unit 28.
- Information regarding the position and posture of the end effector 22 is input to the end effector control unit 42 from the end effector position acquisition unit 29, and information regarding the distance of the object W is input to the end effector control unit 42 from the distance information acquisition unit 44.
- the end effector control unit 42 controls the end effector drive unit 28 based on the information regarding the position and posture of the end effector 22 and the information regarding the distance of the object W. More specifically, the end effector control unit 42 calculates target values of the position and posture of the end effector 22 based on the information regarding the distance of the object W, and controls the end effector drive unit 28 by feedback control using the information from the end effector position acquisition unit 29 so that the position and posture of the end effector 22 reach the target values.
- the control unit 40 controls at least one of the position and the posture of the end effector 22 by controlling the end effector drive unit 28 by the end effector control unit 42.
- the target values of the position and posture of the end effector 22 may be input to the end effector control unit 42 from the outside.
- the image pickup device control unit 43 controls the drive unit 33 of the image pickup device 30.
- Information regarding the position of the image pickup device 30 is input to the image pickup device control unit 43 from the position acquisition unit 34 of the image pickup device 30. More specifically, the circumferential position information of the first image pickup device 31 is input to the image pickup device control unit 43 from the first position acquisition unit 31c, and the circumferential position information of the second image pickup device 32 is input from the second position acquisition unit 32c. Further, information regarding the distance of the object W is input to the image pickup device control unit 43 from the distance information acquisition unit 44.
- the image pickup device control unit 43 controls, for example, the first drive unit 31b and the second drive unit 32b based on the position information of the image pickup device 30 input from the position acquisition unit 34 and the information regarding the distance of the object W input from the distance information acquisition unit 44. As a result, the control unit 40 controls the position of the image pickup device 30 by controlling the drive unit 33 via the image pickup device control unit 43.
- the image pickup device control unit 43 can change the baseline length L, which is the distance between the first image pickup device 31 and the second image pickup device 32.
- the baseline length L is the distance between the optical axis AX1 of the first image pickup device 31 and the optical axis AX2 of the second image pickup device 32.
- for example, the baseline length L can be changed from a baseline length L1 to a baseline length L2 that is larger than the baseline length L1. In this way, the baseline length L between the first image pickup device 31 and the second image pickup device 32 can be increased.
- the control unit 40 can change the baseline length L, which is the distance between the first image pickup device 31 and the second image pickup device 32, by controlling the drive unit 33 of the image pickup device 30 via the image pickup device control unit 43.
- the image pickup device control unit 43 calculates, for example, the target value of the baseline length L to be changed based on the information regarding the distance of the object W.
- for example, when the distance of the object W is relatively large, the image pickup device control unit 43 relatively increases the baseline length L, and when the distance of the object W is relatively small, the image pickup device control unit 43 relatively decreases the baseline length L.
- the target value of the baseline length L may be input to the image pickup apparatus control unit 43 from the outside.
- the target value of the baseline length L may be input to the image pickup apparatus control unit 43 from the distance information acquisition unit 44.
- the control unit 40 changes the baseline length L by the image pickup device control unit 43 according to the work content of the robot system 10. For example, when searching for an object W on the workbench WB, the control unit 40 relatively increases the baseline length L. On the other hand, the control unit 40 makes the baseline length L relatively small, for example, when the end effector 22 is brought closer to the object W after finding the object W to be operated. At this time, the control unit 40 may reduce the baseline length L as the end effector 22 approaches the object W.
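The work-dependent baseline policy above (a large L while searching the workbench WB, a smaller L as the end effector 22 approaches the object W) can be sketched as follows; all numeric limits are hypothetical values, not taken from the document:

```python
def target_baseline_mm(distance_mm, l_min=30.0, l_max=120.0, far_mm=1000.0):
    """Choose a target baseline length L from the distance to the
    object W: the maximum baseline at or beyond far_mm, shrinking
    linearly toward l_min as the distance approaches zero."""
    frac = max(0.0, min(1.0, distance_mm / far_mm))
    return l_min + frac * (l_max - l_min)
```

With these example values, a search at 1 m or more uses the full 120 mm baseline, while an approach at 0.5 m uses 75 mm.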
- the image pickup device control unit 43 controls the drive unit 33 after the power of the robot system 10 is turned on to move the image pickup device 30 to a predetermined initial position. For example, after the power of the robot system 10 is turned on, the image pickup device control unit 43 moves the first image pickup device 31 to the predetermined initial position shown in FIG. 6 and moves the second image pickup device 32 to the predetermined initial position shown in FIG. 6.
- the movement of the first image pickup device 31 and the second image pickup device 32 to their initial positions is performed, for example, after the power of the robot system 10 is turned on and before the first image pickup device 31 and the second image pickup device 32 are used.
- the movement of the first image pickup apparatus 31 and the second image pickup apparatus 32 to the initial positions may be performed, for example, immediately after the power of the robot system 10 is turned on.
- the image pickup device control unit 43 can suitably and easily move each of the first image pickup device 31 and the second image pickup device 32 to its initial position by, for example, bringing the first image pickup device 31 and the second image pickup device 32 into contact with each other in the circumferential direction.
- the robot system 10 may have a sensor capable of detecting that the first image pickup device 31 and the second image pickup device 32 are in contact with each other in the circumferential direction.
- the first image pickup device 31 moves to a predetermined initial position after the power of the robot system 10 is turned on. More specifically, both the first image pickup device 31 and the second image pickup device 32 move to predetermined initial positions after the power of the robot system 10 is turned on. Alternatively, of the first image pickup device 31 and the second image pickup device 32, only the first image pickup device 31 may move to a predetermined initial position after the power of the robot system 10 is turned on, or only the second image pickup device 32 may move to a predetermined initial position after the power of the robot system 10 is turned on.
- when the control unit 40 moves the first image pickup device 31 and the second image pickup device 32 via the image pickup device control unit 43, the member to which the image pickup device 30 is attached, that is, the end effector 22 in the present embodiment, is kept stationary. That is, in the present embodiment, the movement of the first image pickup device 31 and the movement of the second image pickup device 32 are performed in a state where the member to which the first image pickup device 31 and the second image pickup device 32 are attached is stationary.
- the movement of the first image pickup device 31 to a predetermined initial position and the movement of the second image pickup device 32 to a predetermined initial position are also the members to which the first image pickup device 31 and the second image pickup device 32 are attached ( This is performed in a state where the end effector 22) is stationary.
- the control unit 40 may move at least one of the first image pickup device 31 and the second image pickup device 32 so that both the first image pickup device 31 and the second image pickup device 32 are positioned where the object W can be imaged.
- the case where the object W cannot be imaged by the image pickup device 30 is, for example, a case where an obstacle is arranged between the image pickup device 30 and the object W and the object W does not appear in the image captured by the image pickup device 30.
- the control unit 40 may move the first image pickup device 31 and the second image pickup device 32 to a position that does not interfere with the work of the end effector 22 according to the work of the end effector 22 on the object W.
- the distance information acquisition unit 44 acquires information regarding the distance of the object W.
- Information on the distance of the object W includes, for example, the distance from the image pickup device 30 to the object W, the distance from the end effector 22 to the object W, the distance from the robot arm 21 to the object W, the distances between a plurality of objects W, three-dimensional point cloud data of the object W, and the like.
- Information on the images captured by the image pickup elements 31f and 32f is input to the distance information acquisition unit 44.
- the distance information acquisition unit 44 acquires the baseline length L based on the position information of the first image pickup apparatus 31 acquired by the position acquisition unit 34.
- The distance information acquisition unit 44 acquires the baseline length L based on the position information of the first image pickup device 31 acquired by the first position acquisition unit 31c and the position information of the second image pickup device 32 acquired by the second position acquisition unit 32c.
- The distance information acquisition unit 44 calculates the distance between the optical axis AX1 of the first image pickup device 31 and the optical axis AX2 of the second image pickup device 32 from the circumferential position of the first image pickup device 31 and the circumferential position of the second image pickup device 32, and thereby acquires the baseline length L.
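Under the assumption that both devices ride on a circular guide of radius r around the central axis CL with their optical axes parallel to that axis, the baseline is simply the chord between the two optical axes. The following sketch illustrates the calculation; the radius and angle values are hypothetical and not taken from this disclosure:

```python
import math

def baseline_from_angles(radius_mm, theta1_deg, theta2_deg):
    """Chord length between two parallel optical axes located on a
    circle of radius ``radius_mm`` at the given circumferential angles."""
    d_theta = math.radians(abs(theta1_deg - theta2_deg))
    return 2.0 * radius_mm * math.sin(d_theta / 2.0)

# Devices diametrically opposite on a 50 mm radius rail -> baseline = diameter
L = baseline_from_angles(50.0, 0.0, 180.0)  # 100.0 mm
```

Moving either device along the guide changes only the angular separation, so this single formula covers every reachable baseline.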
- the distance information acquisition unit 44 may acquire the baseline length L from another portion such as the image pickup device control unit 43.
- The distance information acquisition unit 44 acquires information on the distance of the object W based on the acquired baseline length L, the first image acquired by the first image pickup device 31, and the second image acquired by the second image pickup device 32.
- The posture of the image pickup element 31f of the first image pickup device 31 and the posture of the image pickup element 32f of the second image pickup device 32 differ from each other when viewed in the central axis direction. Therefore, the distance information acquisition unit 44 rotates at least one of the first image acquired by the first image pickup device 31 and the second image acquired by the second image pickup device 32 so that the direction (orientation) of the first image and the direction (orientation) of the second image match.
- The distance information acquisition unit 44 adjusts the orientation of the acquired images by rotating at least one of the first image acquired by the first image pickup device 31 and the second image acquired by the second image pickup device 32.
- The distance information acquisition unit 44 may rotate only the first image acquired by the first image pickup device 31 to align the direction of the first image with the direction of the second image, may rotate only the second image acquired by the second image pickup device 32 to align the direction of the second image with the direction of the first image, or may rotate both the first image acquired by the first image pickup device 31 and the second image acquired by the second image pickup device 32 to align the direction of the first image with the direction of the second image.
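Aligning orientations in software can be as simple as rotating one image in 90-degree steps before matching. A minimal sketch, in which plain nested lists stand in for real image buffers:

```python
def rotate90_cw(img):
    """Rotate an image, given as rows of pixel values, 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def rotate(img, quarter_turns):
    """Rotate by the given number of clockwise quarter turns."""
    for _ in range(quarter_turns % 4):
        img = rotate90_cw(img)
    return img
```

For arbitrary (non-multiple-of-90) postures a full image-warping routine would be needed, but the same principle of bringing both images into one orientation before stereo matching applies.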
- The distance information acquisition unit 44 measures the distance from the image pickup device 30 to the object W using the baseline length L together with the first image and the second image whose directions have been aligned by the rotation.
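With the images brought to a common orientation, the standard parallel-axis stereo relation gives depth from disparity. A sketch, assuming pinhole cameras with the focal length expressed in pixels (the numeric values are illustrative only):

```python
def stereo_depth_mm(baseline_mm, focal_px, disparity_px):
    """Parallel-axis stereo triangulation: Z = f * L / d.
    A larger baseline L yields a larger disparity at the same depth,
    which is why widening the baseline improves far-range accuracy."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_mm * focal_px / disparity_px

# e.g. a 70 mm baseline, 800 px focal length, 20 px disparity -> 2800 mm
```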
- the control unit 40 controls at least one of the robot arm 21 and the end effector 22 based on the information regarding the distance of the object W thus acquired.
- the display unit 50 displays information based on the information regarding the distance.
- the information regarding the distance includes, for example, information regarding the distance of the object W, information regarding the baseline length L which is the distance between the first image pickup device 31 and the second image pickup device 32, and the like.
- the information based on the information about the distance may be the information about the distance itself or the information obtained from the information about the distance.
- On the display unit 50, for example, the current distance to the object W and the current baseline length L are displayed.
- the display unit 50 may display the changes in the distance to the object W and the baseline length L in a graph.
- the display unit 50 may have any structure as long as it can display information based on the information regarding the distance.
- the display unit 50 may be provided separately from the robot 20, for example, or may be provided on the robot arm 21.
- the display unit 50 is controlled by the control unit 40.
- At least one of the first image pickup device 31 and the second image pickup device 32 attached to the end effector 22 is movable with respect to the end effector 22. Therefore, by moving at least one of the first image pickup device 31 and the second image pickup device 32, the distance between the first image pickup device 31 and the second image pickup device 32 can be changed. Thereby, the baseline length L, which is the distance between the first image pickup device 31 and the second image pickup device 32, can be changed.
- When the baseline length L is relatively large, the resolution for an object W relatively far from the image pickup device 30 can be made relatively high, and the distance from the image pickup device 30 to a relatively distant object W can be detected accurately. However, in this case, an object W relatively close to the image pickup device 30 does not appear in the images captured by the image pickup device 30, so the distance to a relatively close object W cannot be detected.
- When the baseline length L is relatively small, an object W relatively close to the image pickup device 30 can be imaged, but it is difficult to focus on and image an object W relatively far from the image pickup device 30, and it is difficult to accurately detect the distance from the image pickup device 30 to a relatively distant object W.
- the position of the object W for which the distance can be suitably detected differs depending on the size of the baseline length L. Therefore, for example, in an image pickup device in which the size of the baseline length L is fixed, the position of the object W with respect to the image pickup device, which can suitably acquire information on the distance, is limited. As a result, when only the image pickup device is used, the work contents of the robot system may be limited.
- In the present embodiment, the baseline length L between the first image pickup device 31 and the second image pickup device 32 attached to the end effector 22 can be changed as described above. Therefore, by making the baseline length L relatively large when the distance between the end effector 22 and the object W is relatively large, and making the baseline length L relatively small when the distance between the end effector 22 and the object W is relatively small, the distance to the object W can be measured accurately using only the image pickup devices 30 attached to the end effector 22, even if the distance between the end effector 22 and the object W changes to some extent.
- Since the baseline length L can be made relatively large, the distance to the object W can be detected more accurately by stereo matching.
- Since the baseline length L can also be made relatively small, the degree of overlap (the overlapping portion) between the image captured by the first image pickup device 31 and the image captured by the second image pickup device 32 can be increased, and the distance to the object W can be measured by performing stereo matching over a relatively wide range of the image captured by each image pickup device 30.
- the robot system 10 can perform work on the object W regardless of the distance between the end effector 22 and the object W. Therefore, it is possible to prevent the work content of the robot system 10 from being restricted. Thereby, the workability for the object W can be improved.
- For example, the work of imaging the workbench WB from a relatively long distance to search for the object W, the work of bringing the end effector 22 closer to the found object W, the work of gripping the object W with the end effector 22, and the work of moving the gripped object W to another place can each be carried out while suitably acquiring the distance to the object W using only the first image pickup device 31 and the second image pickup device 32 attached to the end effector 22.
- For example, if an image pickup device whose baseline length L is fixed is provided on facility equipment instead of being attached to the end effector,
- the robot system can still perform the work while grasping the distance to the object.
- However, a cost for installing the equipment is required.
- Further, the robot system can be used only in the place where the equipment is provided.
- the robot system 10 can perform the work on the object W without providing the equipment provided on the ceiling or the like as described above. This eliminates the cost of installing the equipment. Further, the robot system 10 can be used even in a place where the above equipment is not provided. Therefore, the degree of freedom of the place where the robot system 10 can be used can be improved.
- When either the first image pickup device 31 or the second image pickup device 32 is in a position where the object W cannot be imaged, that image pickup device is moved with respect to the end effector 22. Therefore, it is easy to enable the object W to be imaged by both the first image pickup device 31 and the second image pickup device 32 without moving the end effector 22. As a result, information regarding the distance of the object W can be suitably acquired regardless of the position and posture of the end effector 22.
- At least one of the first image pickup device 31 and the second image pickup device 32 can also be moved to a position suitable for the moving path of the robot arm 21 and the end effector 22 and for the surrounding environment in which the robot arm 21 and the end effector 22 are arranged. For example, when the robot arm 21 and the end effector 22 are moved with respect to the object W, the first image pickup device 31 and the second image pickup device 32 can be moved so as not to come into contact with other objects or the like. Therefore, the degree of freedom of movement of the robot arm 21 and the end effector 22 can be improved.
- the inertia when the robot 20 moves can be optimized.
- For example, by arranging the first image pickup device 31 and the second image pickup device 32 on opposite sides of the central axis CL, it is easy to minimize the total inertia of the end effector 22, the first image pickup device 31, and the second image pickup device 32. This makes it possible to suitably move the end effector 22 to which the first image pickup device 31 and the second image pickup device 32 are attached.
- At least one of the first image pickup device 31 and the second image pickup device 32 can be moved to a position where the first image pickup device 31 and the second image pickup device 32 function as counterweights with respect to the gripped object W. As a result, the total inertia of the end effector 22, the first image pickup device 31, the second image pickup device 32, and the object W can be minimized. Therefore, the end effector 22 can easily be moved suitably in a state where the end effector 22 grips the object W.
- When the distance between the object W and the first image pickup device 31 and the second image pickup device 32 changes within the range in which the object W can be imaged by the first image pickup device 31 and the second image pickup device 32, the baseline length L may be increased as the first image pickup device 31 and the second image pickup device 32 get closer to the object W.
- By increasing the baseline length L, the detection accuracy of the distance to the object W can be improved. Therefore, as long as the object W remains within the imageable range, the distance to the object W can be acquired more accurately by increasing the baseline length L as the object W approaches, making it easy to perform precise work on the object W.
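The accuracy gain from a larger baseline follows from differentiating Z = fL/d: a matching error of one disparity pixel produces a depth error of roughly Z²/(fL), so doubling L halves the error at a given range. A sketch with hypothetical numbers:

```python
def depth_error_mm(depth_mm, baseline_mm, focal_px, disparity_err_px=1.0):
    """First-order depth uncertainty of parallel-axis stereo:
    dZ ~= Z**2 / (f * L) * dd."""
    return depth_mm ** 2 / (focal_px * baseline_mm) * disparity_err_px

near = depth_error_mm(500.0, 40.0, 800.0)  # narrow baseline
wide = depth_error_mm(500.0, 80.0, 800.0)  # doubled baseline -> half the error
```

This is why widening the baseline as the object approaches keeps the relative depth error small enough for precise gripping work.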
- the optical axis AX1 of the first image pickup device 31 and the optical axis AX2 of the second image pickup device 32 are parallel to each other. Therefore, it is easy to suitably acquire the distance to the object W based on the first image acquired by the first image pickup apparatus 31 and the second image acquired by the second image pickup apparatus 32.
- The control unit 40 that controls the robot system 10 includes the image pickup device control unit 43, which can change the baseline length L, that is, the distance between the first image pickup device 31 and the second image pickup device 32, and the distance information acquisition unit 44. Therefore, the baseline length L can be easily changed according to the work content of the robot system 10, and the distance information acquisition unit 44 can easily acquire information regarding the distance of the object W.
- The position acquisition unit 34 for acquiring at least the position information of the first image pickup device 31 is provided, and the distance information acquisition unit 44 acquires the baseline length L based on the position information of the first image pickup device 31 acquired by the position acquisition unit 34. Therefore, the distance information acquisition unit 44 can appropriately acquire the baseline length L and suitably acquire information on the distance of the object W based on the acquired baseline length L.
- The control unit 40 changes the baseline length L according to the work content of the robot system 10. Therefore, the baseline length L between the first image pickup device 31 and the second image pickup device 32 can be suitably changed according to the work content of the robot system 10, and the work can be suitably performed by the robot system 10. Specifically, for example, when searching for the object W from a relatively long distance, the distance to a relatively distant object W can be accurately acquired by making the baseline length L relatively large, which makes it easy to search for the object W. Further, for example, when performing the work of gripping the object W with the end effector 22, the distance to a relatively close object W can be acquired by making the baseline length L relatively small, which makes it easy for the end effector 22 to suitably grip the object W.
- The distance information acquisition unit 44 adjusts the orientation of the acquired images by rotating at least one of the first image acquired by the first image pickup device 31 and the second image acquired by the second image pickup device 32. Therefore, even if the image pickup element 31f of the first image pickup device 31 and the image pickup element 32f of the second image pickup device 32 are arranged in different postures, the direction of the first image acquired by the first image pickup device 31 can be aligned with the direction of the second image acquired by the second image pickup device 32. As a result, the distance information acquisition unit 44 can suitably obtain information on the distance of the object W based on the images acquired by each image pickup device 30, regardless of the relative position, relative posture, and the like between the first image pickup device 31 and the second image pickup device 32. Therefore, even when at least one of the first image pickup device 31 and the second image pickup device 32 is moved so that the first image pickup device 31 and the second image pickup device 32 are in an arbitrary position and posture, information on the distance of the object W can be suitably obtained.
- The first image pickup device 31 and the second image pickup device 32 are arranged around the end effector 22. Therefore, it is easy to measure the distance between the end effector 22 and the object W from the images captured by the first image pickup device 31 and the second image pickup device 32. Further, by arranging the first image pickup device 31 and the second image pickup device 32 on the radially outer side of the base 22a as in the present embodiment, the first image pickup device 31 and the second image pickup device 32 are less likely to hinder the work of the end effector 22 gripping the object W.
- At least one of the first image pickup device 31 and the second image pickup device 32 is movable in a predetermined circumferential direction around the end effector 22. Therefore, by moving at least one of the first image pickup device 31 and the second image pickup device 32, the circumferential distance between the first image pickup device 31 and the second image pickup device 32 can be changed, and the baseline length L between the first image pickup device 31 and the second image pickup device 32 can be easily changed.
- the first image pickup apparatus 31 moves to a predetermined initial position after the power of the robot system 10 is turned on. Therefore, before using the first image pickup device 31, the first position acquisition unit 31c in the first image pickup device 31 can be calibrated. As a result, even if the first image pickup device 31 is moved, the position of the first image pickup device 31 can be accurately detected with reference to a predetermined initial position.
- both the first image pickup device 31 and the second image pickup device 32 move to a predetermined initial position after the power of the robot system 10 is turned on. Therefore, each position detection unit of each image pickup device 30 can be calibrated, and the position of each image pickup device 30 can be detected with high accuracy.
- The baseline length L between the first imaging device 31 and the second imaging device 32 can be changed with high accuracy, and information on the distance of the object W can be more preferably acquired based on the baseline length L. Further, in the present embodiment, by bringing the first image pickup device 31 and the second image pickup device 32 into contact with each other in the circumferential direction, the first image pickup device 31 and the second image pickup device 32 can be easily moved to the initial positions.
- The movement of the first image pickup device 31 to the predetermined initial position is performed in a state where the member to which the first image pickup device 31 is attached, that is, the end effector 22 in the present embodiment, is stationary. Therefore, it is easier to move the first image pickup device 31 as compared with the case where the first image pickup device 31 is moved to the initial position while the end effector 22 is moving. Further, since the movement of the first image pickup apparatus 31 while the end effector 22 is moving can be suppressed, the complexity of the movement calculation of the end effector 22 and the movement calculation of the robot arm 21 can be suppressed. In the present embodiment, the movement of the second image pickup apparatus 32 to the predetermined initial position is also performed in a state where the end effector 22 is stationary.
- the second image pickup apparatus 32 can be easily moved to the initial position. Further, since it is possible to suppress the movement of the second image pickup apparatus 32 while the end effector 22 is moving, it is possible to further suppress the complexity of the movement calculation of the end effector 22 and the movement calculation of the robot arm 21.
- The movement of the first image pickup device 31 with respect to the end effector 22 and the movement of the second image pickup device 32 with respect to the end effector 22 are all performed in a state where the member to which the first image pickup device 31 and the second image pickup device 32 are attached, that is, the end effector 22 in the present embodiment, is stationary. Therefore, while the end effector 22 is moving, the first image pickup device 31 and the second image pickup device 32 do not move relative to the end effector 22. As a result, it is possible to further suppress the complexity of the movement calculation of the end effector 22 and the movement calculation of the robot arm 21.
- a display unit 50 for displaying information based on information on distance is provided. Therefore, an operator or the like of the robot system 10 can easily acquire information on the distance by looking at the display unit 50.
- the direction of the acquired image is adjusted by rotating at least one of the first image acquired by the first imaging device 31 and the second image acquired by the second imaging device 32.
- the control unit 40 may adjust the direction of the acquired image by rotating at least one of the image pickup element 31f of the first image pickup device 31 and the image pickup element 32f of the second image pickup device 32.
- In this case, the control unit 40 rotates at least one of the image pickup element 31f of the first image pickup device 31 and the image pickup element 32f of the second image pickup device 32 so that, for example, the long side of the image pickup element 31f and the long side of the image pickup element 32f are parallel to each other.
- the direction of the image acquired by each image pickup device 30 can be adjusted without performing processing such as rotation on the image after being captured. Therefore, the load of image processing by the control unit 40 can be reduced as compared with the case where the acquired image is subjected to processing such as rotation.
- The control unit 40 may rotate the entire first image pickup device 31 together with the image pickup element 31f, or may rotate only the image pickup element 31f of the first image pickup device 31.
- the control unit 40 rotates the image pickup element 31f around the optical axis AX1.
- the first image pickup device 31 is rotatably attached to the end effector 22 around the optical axis AX1.
- The control unit 40 may rotate the entire second image pickup device 32 together with the image pickup element 32f, or may rotate only the image pickup element 32f of the second image pickup device 32.
- the control unit 40 rotates the image pickup element 32f around the optical axis AX2.
- the second image pickup device 32 is rotatably attached to the end effector 22 around the optical axis AX2.
- the end effector 22 may have a first holding portion that holds the first imaging device 31 immovably.
- the end effector 22 may have a guide rail portion 22e as a second holding portion that movably holds the second imaging device 32.
- the end effector 22 may have a second holding portion that holds the second imaging device 32 immovably.
- the end effector 22 may have a guide rail portion 22e as a first holding portion that movably holds the first imaging device 31.
- At least the first image pickup apparatus 31 may move to a predetermined end position before the power of the robot system 10 is turned off.
- For example, the control unit 40 may move both the first image pickup device 31 and the second image pickup device 32 to a predetermined end position by bringing the first image pickup device 31 and the second image pickup device 32 into contact with each other in the circumferential direction.
- the predetermined end position may be the same as the predetermined initial position or may be different from the predetermined initial position.
- When the predetermined end position is the same as the predetermined initial position, the first image pickup device 31 and the second image pickup device 32 are already located at the predetermined initial position when the power of the robot system 10 is turned on again. Therefore, it is not necessary to provide a step of moving the first image pickup device 31 and the second image pickup device 32 to the predetermined initial position after the power of the robot system 10 is turned on. As a result, even if the first image pickup device 31 and the second image pickup device 32 are moved immediately after the power of the robot system 10 is turned on, the position acquisition unit 34 can suitably acquire the position of each image pickup device 30.
- the movement of the first image pickup device 31 to a predetermined end position is performed, for example, in a state where the member to which the first image pickup device 31 is attached is stationary. Therefore, the first image pickup apparatus 31 can be easily moved to the end position. Further, since it is possible to suppress the movement of the first image pickup apparatus 31 while the end effector 22 is moving, it is possible to further suppress the complexity of the movement calculation of the end effector 22 and the movement calculation of the robot arm 21.
- the movement of the second image pickup device 32 to a predetermined end position is performed, for example, in a state where the member to which the second image pickup device 32 is attached is stationary. Therefore, the second image pickup apparatus 32 can be easily moved to the end position. Further, since it is possible to suppress the movement of the second image pickup apparatus 32 while the end effector 22 is moving, it is possible to further suppress the complexity of the movement calculation of the end effector 22 and the movement calculation of the robot arm 21.
- Within the overlapping portion (the portion commonly reflected in both images) between the image captured by the first imaging device 31 and the image captured by the second imaging device 32, the distance to a featured portion, such as the object W, can be measured anywhere in the overlapping portion. Therefore, the control unit 40 may measure the distance over the entire overlapping portion between the image captured by the first image pickup device 31 and the image captured by the second image pickup device 32, and create a depth map for the overlapping portion.
- the depth map is an image showing distance information by, for example, a difference in color, a shade of color, and the like.
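A depth map in this sense is just the stereo relation applied per pixel over the overlapping portion. A minimal sketch, in which nested lists stand in for image buffers and invalid disparities are left as None:

```python
def make_depth_map(disparity_map, baseline_mm, focal_px):
    """Per-pixel depth (mm) from a per-pixel disparity map (px);
    non-positive disparities mark pixels with no stereo match."""
    return [[baseline_mm * focal_px / d if d > 0 else None for d in row]
            for row in disparity_map]

depths = make_depth_map([[10.0, 20.0], [0.0, 40.0]], 70.0, 800.0)
```

Rendering such a map as colors or shades, as described above, is then a matter of mapping each depth value to a display palette.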
- The control unit 40 may measure the distance to the object W using the images captured by the first image pickup device 31 and the second image pickup device 32 together with the image captured by the camera unit 60.
- For example, when the distance to the object W measured from the images captured by the first image pickup device 31 and the second image pickup device 32 becomes equal to or less than a predetermined distance, the control unit 40 may power on the camera unit 60 and measure the distance to the object W using each image captured by the first image pickup device 31 and the second image pickup device 32 together with the image captured by the camera unit 60.
- The control unit 40 may switch, according to the distance to the object W, between a first imaging mode in which the distance to the object W is measured using the images captured by the first imaging device 31 and the second imaging device 32, and a second imaging mode in which the distance to the object W is measured using the image captured by the camera unit 60. In this case, for example, when the distance to the object W is larger than a predetermined distance, the control unit 40 measures the distance to the object W in the first imaging mode, and when the distance to the object W is equal to or less than the predetermined distance, the control unit 40 may measure the distance to the object W in the second imaging mode.
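The switching rule described above can be sketched as a simple threshold test; the threshold value and mode labels below are illustrative, not taken from this disclosure:

```python
def select_imaging_mode(distance_to_object_mm, switch_distance_mm=300.0):
    """Use the stereo pair (first imaging mode) while the object is far,
    and the camera unit (second imaging mode) once it is within range."""
    if distance_to_object_mm > switch_distance_mm:
        return "first"   # stereo pair: devices 31 and 32
    return "second"      # camera unit 60
```

In practice a small hysteresis band around the switch distance would avoid rapid mode toggling when the measured distance hovers near the threshold.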
- FIG. 7 is a perspective view showing a part of the robot system 110 of the present embodiment.
- the finger portion 22b of the end effector 122 and the camera unit 60 are not shown.
- Configurations that are the same as those of the above-described embodiment are given the same reference numerals as appropriate, and their description may be omitted.
- The second image pickup device 132 is fixed to the end effector 122. That is, in the present embodiment, the second image pickup apparatus 132 does not move relative to the end effector 122. Therefore, in the present embodiment, only the first image pickup apparatus 31 is movable in a predetermined circumferential direction around the end effector 122. As described above, in the present embodiment, one of the first image pickup device 31 and the second image pickup device 132 is movable in a predetermined circumferential direction around the end effector 122, and the other is fixed to a predetermined portion of the end effector 122.
- the second image pickup device 132 is fixed to, for example, the base 122a.
- The base portion 122a has a hole portion 122f that is recessed from the surface of the base portion 122a on the tip end side (+Z side) toward the base end side (−Z side).
- the hole portion 122f is, for example, a circular hole centered on the central axis CL.
- a portion on the base end side (-Z side) of the second image pickup apparatus 132 is fitted and held in the hole portion 122f.
- the hole portion 122f corresponds to the second holding portion that holds the second image pickup apparatus 132. That is, in the present embodiment, the end effector 122 has a hole portion 122f as a second holding portion.
- the second image pickup apparatus 132 is held immovably by the hole portion 122f as the second holding portion.
- the portion on the tip end side (+ Z side) of the second image pickup apparatus 132 projects from the center of the surface on the tip end side of the base portion 122a toward the tip end side.
- the base portion 122a has the same configuration as the base portion 22a of the first embodiment described above, except that the hole portion 122f is provided.
- the optical axis AX2a of the second image pickup device 132 is, for example, parallel to the optical axis AX1 of the first image pickup device 31 and coincides with the central axis CL.
- The second image pickup device 132 includes a cylindrical housing 132a, a lens 132e fitted in an opening on the tip end side (+Z side) of the housing 132a, and an image pickup element 132f arranged in the housing 132a.
- Each part of the second image pickup apparatus 132 can be formed in the same manner as each part of the second image pickup apparatus 32 of the first embodiment described above. Unlike the second image pickup device 32 of the first embodiment, the second image pickup device 132 does not have the second drive unit 32b and the second position acquisition unit 32c.
- the baseline length L between the first image pickup device 31 and the second image pickup device 132 is fixed.
- The distance information acquisition unit 44 acquires information on the distance of the object W based on the baseline length L and the two images acquired from the first image pickup device 31 and the second image pickup device 132, as in the first embodiment described above.
- The distance information acquisition unit 44 can also acquire information regarding the distance of the object W based on two images captured by the first image pickup device 31 alone. Specifically, for example, the first image is acquired by the first image pickup device 31 when it is located at the first position P1 shown by the solid line in FIG. 7, and the second image is acquired by the first image pickup device 31 when it is located at the second position P2 indicated by the alternate long and short dash line in FIG. 7. In the present embodiment, the distance information acquisition unit 44 can also acquire information regarding the distance of the object W based on the first image and the second image thus obtained.
- the distance information acquisition unit 44 can also obtain information on the distance of the object W based on the first image and the second image acquired by the first image pickup device 31 and on the baseline length La, which is the distance between the first image pickup device 31 at the first position P1 and the first image pickup device 31 at the second position P2.
- the baseline length La is the distance between the optical axis AX1a of the first image pickup device 31 at the first position P1 and the optical axis AX1b of the first image pickup device 31 at the second position P2.
- the baseline length La is determined by the first position P1 and the second position P2.
- the control unit 40 can change the baseline length La by changing the first position P1 and the second position P2 from which the first image pickup apparatus 31 acquires an image.
- the control unit 40 changes the baseline length La according to, for example, the work content of the robot system 110.
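Why the control unit would change La with the work content can be illustrated with a standard stereo error estimate. The sizing rule below (dZ ≈ Z²·dd / (f·L)) is a common textbook approximation, not a formula from this disclosure; all names are hypothetical.

```python
def required_baseline(z_m, depth_res_m, focal_px, disp_res_px=1.0):
    """Baseline needed to resolve `depth_res_m` at working range `z_m`.

    Derived from dZ ~ Z^2 * dd / (f * L): close-range fine work can use
    a short baseline, while distant targets need a longer one.
    """
    return z_m ** 2 * disp_res_px / (focal_px * depth_res_m)

# Resolving 1 mm at 0.5 m with an 800 px focal length:
L = required_baseline(0.5, 0.001, 800.0)
print(L)  # 0.3125 (metres)
```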
- the position acquisition unit 134 acquires position information regarding the first position P1 and the second position P2.
- the position information regarding the first position P1 and the second position P2 includes, for example, information on the first position P1, information on the second position P2, and information indicating the relative positional relationship between the first position P1 and the second position P2.
- the position acquisition unit 134 includes only the first position acquisition unit 31c of the first image pickup apparatus 31.
- the position acquisition unit 134 acquires the position information of the first position P1 and the second position P2 based on the rotation position of the drive unit 133, for example.
- the drive unit 133 includes only the first drive unit 31b of the first image pickup device 31.
- the distance information acquisition unit 44 acquires the baseline length La based on the position information regarding the first position P1 and the second position P2 acquired by the position acquisition unit 134.
- the distance information acquisition unit 44 can select, for example according to the work content of the robot system 110, whether to acquire information on the distance of the object W using the images acquired by each of the first image pickup device 31 and the second image pickup device 132, or using the two images acquired by the first image pickup device 31 at the mutually different first position P1 and second position P2.
- when acquiring information on the distance of the object W using the images acquired by each of the first image pickup device 31 and the second image pickup device 132, the distance information acquisition unit 44, as in the first embodiment, adjusts the direction of the acquired images by rotating at least one of the image acquired by the first image pickup device 31 and the image acquired by the second image pickup device 132.
- similarly, the distance information acquisition unit 44 adjusts the direction of the acquired images by rotating at least one of the first image acquired at the first position P1 by the first image pickup device 31 and the second image acquired at the second position P2 by the first image pickup device 31.
- the distance information acquisition unit 44 may rotate only the first image acquired at the first position P1 to align the direction of the first image with the direction of the second image, or may rotate only the second image acquired at the second position P2 to align the direction of the second image with the direction of the first image.
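The orientation adjustment described above can be sketched as follows, using quarter-turn rotation for simplicity. This is an illustration only: sensors at arbitrary circumferential positions would require arbitrary-angle resampling, and the function names are assumptions.

```python
import numpy as np

def align_orientation(image, quarter_turns):
    """Rotate an image by multiples of 90 degrees so the views from
    positions P1 and P2 share one orientation before stereo matching."""
    return np.rot90(image, k=quarter_turns)

first = np.arange(6).reshape(2, 3)       # image taken at P1
second = np.rot90(first, k=1)            # same scene, sensor rotated at P2
aligned = align_orientation(second, -1)  # undo the quarter turn
print(np.array_equal(aligned, first))    # True
```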
- the first image pickup device 31 is movable in a predetermined circumferential direction around the end effector 122, and the second image pickup device 132 is fixed to a predetermined portion of the end effector 122. Therefore, it is easier to simplify the structure of the robot system 110 as compared with the case where both the first image pickup device 31 and the second image pickup device 132 are movably provided.
- the first image pickup apparatus 31 captures the first image of the object W at the first position P1, and captures the second image of the object W at the second position P2 different from the first position P1.
- the distance information acquisition unit 44 acquires information regarding the distance of the object W based on the first image and the second image. Therefore, it is possible to acquire information on the distance of the object W using only the image acquired by one first image pickup apparatus 31. As a result, even if the second image pickup device 132 is not provided, information regarding the distance of the object W can be acquired only by one first image pickup device 31. Further, the baseline length La can be changed by changing the relative position between the first position P1 and the second position P2.
- the distance information acquisition unit 44 adjusts the orientation of the acquired images by rotating at least one of the first image acquired at the first position P1 by the first image pickup device 31 and the second image acquired at the second position P2 by the first image pickup device 31. Therefore, even if the image pickup element 31f of the first image pickup device 31 at the first position P1 and the image pickup element 31f at the second position P2 are arranged in different postures, the direction of the first image acquired at the first position P1 and the direction of the second image acquired at the second position P2 can be matched. As a result, regardless of the first position P1 and the second position P2, the distance information acquisition unit 44 can suitably acquire information regarding the distance of the object W based on the images acquired by the first image pickup apparatus 31.
- control unit 40 can change the baseline length La, which is the distance between the first image pickup device 31 at the first position P1 and the first image pickup device 31 at the second position P2.
- the distance information acquisition unit 44 acquires information on the distance of the object W based on the baseline length La. Therefore, it is possible to acquire information on the distance of the object W at different baseline lengths La using only the first image pickup apparatus 31.
- the position acquisition unit 134 acquires the position information regarding the first position P1 and the second position P2, and the distance information acquisition unit 44 acquires the baseline length La based on the position information regarding the first position P1 and the second position P2 acquired by the position acquisition unit 134. Therefore, the distance information acquisition unit 44 can appropriately acquire the baseline length La and, based on it, suitably acquire information on the distance of the object W.
- control unit 40 changes the baseline length La according to the work content of the robot system 110. Therefore, the baseline length La, which is the distance between the first image pickup device 31 at the first position P1 and the first image pickup device 31 at the second position P2, can be suitably changed according to the work content of the robot system 110. Thereby, each work can be suitably performed by the robot system 110.
- the distance information acquisition unit 44 adjusts the orientation of the acquired images by rotating at least one of the first image acquired by the first image pickup device 31 and the second image acquired by the second image pickup device 32. Therefore, even if the image pickup element 31f of the first image pickup device 31 and the image pickup element 32f of the second image pickup device 32 are arranged in different postures, the direction of the first image acquired by the first image pickup device 31 and the direction of the second image acquired by the second image pickup device 32 can be aligned. As a result, regardless of the relative position, relative posture, and the like between the first image pickup device 31 and the second image pickup device 32, the distance information acquisition unit 44 can suitably acquire information on the distance of the object W based on the images acquired by the image pickup devices 30. Therefore, even when at least one of the first image pickup device 31 and the second image pickup device 32 is moved so that they are in an arbitrary position and posture, information on the distance of the object W can be suitably obtained.
- the first image pickup device 31 may be movable in the radial direction. In this case, by changing the radial position of the first image pickup device 31, the baseline length, which is the distance between the first image pickup device 31 and the second image pickup device 132, can be changed. Further, in the present embodiment, the second image pickup apparatus 132 may not be provided. Even in this case, as described above, information regarding the distance of the object W can be acquired using only the first image pickup apparatus 31.
- the first image pickup device 31 may be movable such that the long side of the image pickup element 31f at the first position P1 and the long side of the image pickup element 31f at the second position P2 are parallel to each other. This makes it possible to align the directions of the images captured at the first position P1 and the second position P2 by the first image pickup device 31 without applying processing such as rotation to the images after capture. Therefore, the load of image processing on the control unit 40 can be reduced as compared with the case where the acquired images are rotated.
- the first image pickup device 31 is rotatably attached around the optical axis AX1, for example.
- the control unit 40 rotates the first image pickup device 31 around the optical axis AX1 according to the circumferential position of the first image pickup device 31, and adjusts it so that the long sides of the image pickup element 31f always face the same direction.
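Under one simple (assumed) mounting convention, where the camera body turns with the slider as it travels around the end effector, the compensation the control unit applies is just the negative of the circumferential travel. The sketch below illustrates that assumption only and is not taken from the disclosure.

```python
def counter_rotation_deg(circumferential_deg):
    """Angle to turn the first image pickup device about its own
    optical axis AX1 so the sensor's long side keeps a fixed direction
    while the device moves by `circumferential_deg` around the end
    effector. Hypothetical convention: the mount rotates one-for-one
    with the slider, so the compensation is the negated travel."""
    return -(circumferential_deg % 360.0)

print(counter_rotation_deg(90.0))   # -90.0
print(counter_rotation_deg(450.0))  # -90.0 (full turns cancel out)
```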
- control unit 40 may adjust the direction of the image acquired by the first image pickup device 31 by rotating the image pickup element 31f of the first image pickup device 31.
- the first image pickup device 31 may be movable with respect to the adapter 23.
- the first image pickup device 31 may be movable in a predetermined circumferential direction around the robot arm 21.
- the first image pickup device 31 may be movable in a predetermined circumferential direction around the adapter 23.
- the distance information acquisition unit 44 can obtain information about the distance of the object W based on the first image and the second image captured by the first image pickup device 31 at the mutually different first position P1 and second position P2.
- FIG. 8 is a view of a part of the robot system 210 of the present embodiment as viewed from the tip side (+ Z side) in the central axis direction.
- the finger portion 22b of the end effector 22 and the camera unit 60 are not shown.
- configurations that are the same as those of the above-described embodiments are given the same reference numerals as appropriate, and their description may be omitted.
- the robot system 210 of the present embodiment has three or more image pickup devices 230 that image an object W.
- the image pickup device 230 is provided with, for example, an image pickup device 230a, an image pickup device 230b, and an image pickup device 230c.
- the three image pickup devices 230a, 230b, 230c are arranged side by side on a predetermined axis VA.
- the axis VA is, for example, a virtual axis extending in a direction orthogonal to both the central axis direction and the radial direction (left-right direction in FIG. 8).
- the image pickup device 230a, the image pickup device 230b, and the image pickup device 230c are arranged at equal intervals along the axial direction of the axis VA, for example.
- the image pickup device 230b is located between the image pickup device 230a and the image pickup device 230c.
- the optical axis AX3a of the image pickup device 230a, the optical axis AX3b of the image pickup device 230b, and the optical axis AX3c of the image pickup device 230c extend, for example, in the central axis direction and are parallel to each other.
- the image pickup element 235a of the image pickup device 230a, the image pickup element 235b of the image pickup device 230b, and the image pickup element 235c of the image pickup device 230c are rectangular when viewed in the central axis direction.
- the image pickup element 235a, the image pickup element 235b, and the image pickup element 235c are arranged in the same posture.
- the three image pickup devices 230a, 230b, 230c are arranged so that the long sides of the image pickup elements 235a, 235b, 235c in the three image pickup devices 230a, 230b, 230c are parallel to each other.
- the robot system 210 has a holding member 230d that holds the three image pickup devices 230a, 230b, and 230c, and a slider 230e that connects the holding member 230d to the base 22a of the end effector 22.
- the three image pickup devices 230a, 230b, and 230c are fixed to the holding member 230d so as not to be relatively movable with each other.
- the slider 230e is connected to a guide rail portion 22e provided on the base portion 22a of the end effector 22, similarly to the sliders 31d and 32d of the first embodiment.
- the slider 230e can move around the base portion 22a in the circumferential direction along the guide rail portion 22e.
- the robot system 210 has a drive unit that moves the slider 230e in the circumferential direction.
- the slider 230e is moved in the circumferential direction along the guide rail portion 22e by the drive unit, so that the holding member 230d and the three image pickup devices 230a, 230b, 230c held by the holding member 230d move in the circumferential direction.
- the three image pickup devices 230a, 230b, 230c move in the circumferential direction while the long sides of the image pickup elements 235a, 235b, 235c remain parallel to each other.
- the control unit 40 acquires information on the distance of the object W based on the image information of the object W acquired by two image pickup devices 230 among the three image pickup devices 230a, 230b, 230c. For example, the control unit 40 selects two image pickup devices 230 from the three image pickup devices 230a, 230b, and 230c, and acquires information about the distance of the object W based on the image information acquired by the two selected image pickup devices 230.
- the two image pickup devices 230 to be selected include three patterns of the image pickup device 230a and the image pickup device 230b, the image pickup device 230b and the image pickup device 230c, and the image pickup device 230a and the image pickup device 230c.
- the baseline length L3, which is the distance between the image pickup device 230a and the image pickup device 230b, and the baseline length L4, which is the distance between the image pickup device 230a and the image pickup device 230c, are different from each other. That is, the baseline length is different between the case where the image pickup device 230a and the image pickup device 230b are selected as the two image pickup devices 230 and the case where the image pickup device 230a and the image pickup device 230c are selected.
- the control unit 40 can change the baseline length by changing the two image pickup devices 230 to be selected when acquiring the information regarding the distance of the object W.
- the baseline length L3 is smaller than, for example, the baseline length L4.
- the baseline length L3 is the distance between the optical axis AX3a of the image pickup apparatus 230a and the optical axis AX3b of the image pickup apparatus 230b.
- the baseline length L4 is the distance between the optical axis AX3a of the image pickup apparatus 230a and the optical axis AX3c of the image pickup apparatus 230c.
- the baseline length which is the distance between the image pickup device 230b and the image pickup device 230c, is the same as, for example, the baseline length L3, which is the distance between the image pickup device 230a and the image pickup device 230b.
- the baseline length which is the distance between the image pickup device 230b and the image pickup device 230c, is the distance between the optical axis AX3b of the image pickup device 230b and the optical axis AX3c of the image pickup device 230c.
- the control unit 40 changes the two image pickup devices 230 to be selected according to, for example, the work content of the robot system 210, and changes the baseline length.
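Selecting two of the collinear image pickup devices to realize a desired baseline can be sketched as below. The positions, units, and function names are illustrative assumptions, not values from the disclosure.

```python
from itertools import combinations

def select_pair(positions_mm, target_baseline_mm):
    """Pick the two collinear cameras (by index) whose spacing along
    the virtual axis VA is closest to a desired baseline."""
    return min(
        combinations(range(len(positions_mm)), 2),
        key=lambda p: abs(abs(positions_mm[p[0]] - positions_mm[p[1]])
                          - target_baseline_mm),
    )

# Cameras 230a, 230b, 230c at 0, 40, 80 mm along VA:
# a 75 mm target baseline selects the outer pair (230a, 230c).
pair = select_pair([0.0, 40.0, 80.0], 75.0)
print(pair)  # (0, 2)
```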
- the control unit 40 causes the distance information acquisition unit 44 to acquire information on the distance of the object W based on the images acquired by the two selected image pickup devices 230 and the baseline length between the two image pickup devices 230.
- the control unit 40 controls at least one of the robot arm 21 and the end effector 22 based on the information regarding the distance of the object W thus acquired.
- the control unit 40 controls at least one of the robot arm 21 and the end effector 22 connected to the robot arm 21, based on the image information acquired by two image pickup devices 230 among the three image pickup devices 230a, 230b, 230c.
- the control unit 40 selects two image pickup devices 230 from the three image pickup devices 230a, 230b, and 230c, and controls at least one of the robot arm 21 and the end effector 22 based on the image information acquired by the two selected image pickup devices 230.
- when the object W does not appear in at least one of the images captured by the two selected image pickup devices 230, for example, the control unit 40 moves at least one of the robot arm 21 and the end effector 22 so that both of the two selected image pickup devices 230 can image the object W.
- alternatively, the control unit 40 may move the three image pickup devices 230 in the circumferential direction so that both of the two selected image pickup devices 230 can image the object W.
- control unit 40 controls so as to synchronize the imaging by at least two of the three imaging devices 230a, 230b, 230c.
- the control unit 40 controls, for example, to synchronize the imaging by the two selected imaging devices 230 described above.
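One way to realize the synchronized capture described above is a shared software trigger, as sketched below. This is a minimal illustration only; `cameras` is a hypothetical list of zero-argument capture callables, and real systems would more likely use a hardware trigger line.

```python
import threading

def synchronized_capture(cameras):
    """Trigger all given cameras on one shared event so their frames
    are taken at (nearly) the same instant, as the control unit does
    for the two selected image pickup devices."""
    go = threading.Event()
    results = [None] * len(cameras)

    def worker(i, cam):
        go.wait()            # block until the common trigger fires
        results[i] = cam()

    threads = [threading.Thread(target=worker, args=(i, c))
               for i, c in enumerate(cameras)]
    for t in threads:
        t.start()
    go.set()                 # fire all captures together
    for t in threads:
        t.join()
    return results

frames = synchronized_capture([lambda: "frame-a", lambda: "frame-b"])
print(frames)  # ['frame-a', 'frame-b']
```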
- the control unit 40 acquires information regarding the distance of the object W based on the images of the object W acquired by two image pickup devices 230 among the three image pickup devices 230a, 230b, 230c. Therefore, information on the distance of the object W can be suitably acquired by changing which two image pickup devices 230 provide the images, according to the work content of the robot system 210, the object W, and the like.
- the baseline length can be changed depending on whether the images acquired by the two image pickup devices 230a and 230b are used or the images acquired by the two image pickup devices 230a and 230c are used. Therefore, by changing the two image pickup devices 230 that are appropriately selected according to the distance to the object W and the like, information on the distance of the object W can be suitably acquired.
- the control unit 40 selects two image pickup devices 230 from the three image pickup devices 230a, 230b, 230c, and acquires information regarding the distance of the object W based on the image information acquired by the two selected image pickup devices 230. Therefore, when acquiring information regarding the distance of the object W, it suffices to capture images with two of the three image pickup devices 230a, 230b, 230c, and the remaining image pickup device 230 need not capture an image. Therefore, the load on the control unit 40 when acquiring information on the distance of the object W can be reduced.
- the control unit 40 controls at least one of the robot arm 21 and the end effector 22 connected to the robot arm 21, based on the image information acquired by two image pickup devices 230 among the three image pickup devices 230a, 230b, 230c. Therefore, information such as the position of the object W and the environment in which the robot system 210 is arranged can be acquired from the images acquired by the two image pickup devices 230, and the robot arm 21 and the end effector 22 can be moved suitably according to the work content of the robot system 210 and the like.
- the control unit 40 selects two image pickup devices 230 from the three image pickup devices 230a, 230b, 230c, and the image acquired by the two selected image pickup devices 230. Based on the information, it controls at least one of the robot arm 21 and the end effector 22 connected to the robot arm 21. Therefore, when controlling at least one of the robot arm 21 and the end effector 22 based on the image information acquired by the two image pickup devices 230, it is not necessary to perform image pickup by the remaining one image pickup device 230. As a result, the load on the control unit 40 when controlling the robot arm 21 and the end effector 22 can be reduced.
- control unit 40 controls to synchronize the imaging by at least two of the three imaging devices 230a, 230b, 230c. Therefore, the object W can be suitably imaged at the same timing by at least two image pickup devices 230. Thereby, information regarding the distance of the object W can be suitably acquired based on the images acquired by at least two image pickup devices 230.
- the three image pickup devices 230a, 230b, 230c are arranged side by side on a predetermined axis VA. Therefore, as in the present embodiment, the image pickup devices 230a, 230b, 230c can be arranged side by side with the postures of the image pickup elements 235a, 235b, 235c aligned. Thereby, even when information regarding the distance of the object W is acquired using the images acquired by any two of the three image pickup devices 230a, 230b, 230c, it is easy to acquire that information without rotating the images acquired by the image pickup devices 230. Therefore, the load on the control unit 40 when acquiring information on the distance of the object W can be reduced.
- the optical axes AX3a, AX3b, and AX3c of the three image pickup devices 230a, 230b, and 230c are parallel to each other. Therefore, whichever two of the three image pickup devices 230a, 230b, and 230c supply the images, information regarding the distance of the object W can be suitably acquired from the two images.
- the three image pickup devices 230a, 230b, 230c are arranged so that the long sides of the image pickup elements 235a, 235b, 235c in the three image pickup devices 230a, 230b, 230c are parallel to each other. Therefore, it is easy to acquire information on the distance of the object W from the acquired images without rotating the images acquired by the image pickup devices 230. This makes it possible to reduce the load on the control unit 40 when acquiring information on the distance of the object W.
- four or more image pickup devices 230 may be arranged side by side on a predetermined axis VA. In three or more image pickup devices 230 arranged side by side on the axis VA, the distances between two adjacent image pickup devices 230 may be different from each other.
- FIG. 9 is a view of a part of the robot system 310 of the present embodiment as viewed from the tip side (+ Z side) in the central axis direction.
- the finger portion 22b of the end effector 22 and the camera unit 60 are not shown.
- configurations that are the same as those of the above-described embodiments are given the same reference numerals as appropriate, and their description may be omitted.
- the robot system 310 of the present embodiment has a connecting member 336 that connects the first image pickup device 331 and the second image pickup device 332.
- the connecting member 336 is, for example, a guide rail.
- the connecting member 336 has a groove 336a extending linearly.
- the groove 336a is, for example, a groove recessed from the tip end side (+ Z side) to the base end side ( ⁇ Z side).
- the groove 336a is opened, for example, at both ends in the direction in which the groove 336a extends.
- the robot system 310 has a first slider 331g for attaching the first image pickup device 331 to the connecting member 336, and a second slider 332g for attaching the second image pickup device 332 to the connecting member 336.
- the first slider 331g is fixed to the first image pickup apparatus 331.
- the second slider 332g is fixed to the second image pickup apparatus 332.
- the first slider 331g and the second slider 332g are fitted in the groove 336a so as to be movable in the direction in which the groove 336a extends, for example.
- the first image pickup apparatus 331 and the second image pickup apparatus 332 are connected by the connecting member 336 via the first slider 331g and the second slider 332g.
- the first slider 331g and the second slider 332g are restricted from moving and rotating relative to the connecting member 336 in directions other than the direction in which the groove 336a extends.
- the first slider 331g is movable in the circumferential direction and is rotatably attached around the optical axis AX1c of the first image pickup apparatus 331.
- the second slider 332g is movable in the circumferential direction and is rotatably attached around the optical axis AX2c of the second image pickup apparatus 332.
- the first image pickup device 331 and the second image pickup device 332 can move in the circumferential direction around the base portion 22a of the end effector 22.
- the first image pickup device 331 can rotate around the optical axis AX1c of the first image pickup device 331 together with the first slider 331g.
- the second image pickup device 332 can rotate around the optical axis AX2c of the second image pickup device 332 together with the second slider 332g.
- the long side of the image pickup element 331f of the first image pickup device 331 and the long side of the image pickup element 332f of the second image pickup device 332 are, for example, parallel to each other and parallel to the direction in which the groove 336a extends.
- when the first image pickup device 331 and the second image pickup device 332 move in the circumferential direction from the positions shown by the alternate long and short dash lines in FIG. 9, the connecting member 336 also moves according to the position of the first image pickup device 331 and the position of the second image pickup device 332. In the example of FIG. 9, the connecting member 336 moves upward while rotating about an axis extending in the central axis direction.
- the first image pickup device 331 and the second image pickup device 332 move relative to the connecting member 336 in the direction in which the groove 336a extends, depending on the position in the circumferential direction.
- the relative change between the position of the first image pickup device 331 and the position of the second image pickup device 332 in the direction in which the groove 336a extends increases the baseline length, which is the distance between the first image pickup device 331 and the second image pickup device 332. Change.
- the baseline length between the first image pickup device 331 and the second image pickup device 332 is the distance between the optical axis AX1c of the first image pickup device 331 and the optical axis AX2c of the second image pickup device 332.
- the first image pickup device 331 and the second image pickup device 332 move in the circumferential direction from the position shown by the alternate long and short dash line in FIG. 9, the first image pickup device 331 and the second image pickup device 332 approach each other. It moves in the direction and the baseline length becomes smaller.
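The geometric reason the baseline shrinks as the two devices move around the end effector can be sketched with the chord-length formula. This is a generic illustration of circle geometry, not a formula from the disclosure, and the parameter names are assumptions.

```python
import math

def chord_baseline(radius_mm, angle_deg):
    """Straight-line distance between two cameras separated by
    `angle_deg` on a circle of radius `radius_mm` around the end
    effector: L = 2 * R * sin(theta / 2)."""
    return 2.0 * radius_mm * math.sin(math.radians(angle_deg) / 2.0)

# On a 100 mm circle, a 60-degree separation gives a 100 mm baseline
# (equilateral chord); shrinking the angle shrinks the baseline.
print(chord_baseline(100.0, 60.0))  # 100.0
```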
- the first slider 331g and the second slider 332g are restricted from moving and rotating relative to the connecting member 336 in directions other than the direction in which the groove 336a extends. Therefore, in response to a change in at least one of the position and posture of the connecting member 336 accompanying the movement of the first image pickup device 331 and the second image pickup device 332, the first slider 331g and the second slider 332g rotate around the optical axes AX1c and AX2c of the respective image pickup devices so as to maintain their relative postures with the connecting member 336.
- the first image pickup device 331 to which the first slider 331g is fixed and the second image pickup device 332 to which the second slider 332g is fixed likewise rotate around their respective optical axes so as to maintain their relative postures with the connecting member 336. Therefore, even if the first image pickup device 331 and the second image pickup device 332 move in the circumferential direction, the relative posture between the image pickup element 331f of the first image pickup device 331 and the image pickup element 332f of the second image pickup device 332 can be maintained.
- the connecting member 336 can hold the first image pickup device 331 and the second image pickup device 332 so that their relative posture is maintained in a predetermined posture.
- Other configurations of the robot system 310 can be the same as the other configurations of the robot system of each embodiment described above.
- in the present embodiment, at least one of the first image pickup device 331 and the second image pickup device 332 is movable. Therefore, information on the distance of the object W can be suitably obtained based on the image acquired by the first image pickup device 331 and the image acquired by the second image pickup device 332, without rotating the images acquired by the respective image pickup devices.
- a connecting member 336 that connects the first image pickup device 331 and the second image pickup device 332 is provided, and the connecting member 336 can maintain the relative posture of the first image pickup device 331 and the second image pickup device 332 in a predetermined posture. Therefore, for example, even if a drive unit for rotating the first image pickup device 331 and the second image pickup device 332 around the respective optical axes AX1c and AX2c is not provided, the connecting member 336 makes it easy to maintain the relative posture between the image pickup element 331f of the first image pickup device 331 and the image pickup element 332f of the second image pickup device 332 in a posture in which their long sides are parallel to each other. As a result, even if at least one of the first image pickup device 331 and the second image pickup device 332 is moved to change the baseline length, information on the distance of the object W can be obtained easily and suitably based on the images acquired by the first image pickup device 331 and the second image pickup device 332.
- only the image pickup element 331f of the first image pickup device 331 may be rotatable around the optical axis AX1c, and only the image pickup element 332f of the second image pickup device 332 may be rotatable around the optical axis AX2c.
- at least one of the image pickup element 331f of the first image pickup device 331 and the image pickup element 332f of the second image pickup device 332 may be movable so that the long sides of the two image pickup elements are parallel to each other.
- FIG. 10 is a view of a part of the robot system 410 of the present embodiment as viewed from the tip side (+ Z side) in the central axis direction.
- the finger portion 22b of the end effector 22 and the camera unit 60 are not shown.
- FIG. 11 is a diagram for explaining a part of the procedure when the robot system 410 of the present embodiment acquires information regarding the distance of the object W.
- configurations identical to those of the embodiments described above are given the same reference numerals, and their description may be omitted as appropriate.
- three or more image pickup devices 430 are provided.
- 24 image pickup devices 430 are provided.
- the plurality of image pickup devices 430 are arranged side by side along the circumferential direction around the end effector 22. That is, in the present embodiment, the three or more image pickup devices 430 are arranged side by side on a predetermined circumference.
- the predetermined circumference is a circumference centered on the central axis CL.
- the plurality of image pickup devices 430 are arranged at equal intervals, for example, along the circumferential direction. That is, in the present embodiment, the three or more image pickup devices 430 are arranged at equal intervals on a predetermined circumference.
- the optical axes AX4 of three or more image pickup devices 430 are parallel to each other.
- the long side of the image pickup element 435 of each image pickup device 430 is orthogonal to the radial direction passing through the optical axis AX4 of that image pickup device 430 when viewed in the central axis direction.
- the N image pickup devices 430 are arranged symmetrically N times around the central axis CL. That is, in the present embodiment, the 24 image pickup devices 430 are symmetrically arranged 24 times around the central axis CL.
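- The geometry of such an N-fold symmetric arrangement can be sketched as follows. This is a minimal Python illustration (the ring radius of 50.0 and the plain (x, y) coordinates are assumptions for the example, not values from the embodiment); it shows that pairing image pickup devices that sit further apart on the circumference yields a longer stereo baseline, up to the diameter for diametrically opposed devices:

```python
import math

def camera_ring_positions(n, radius):
    """(x, y) positions of n image pickup devices placed at equal
    angular intervals on a circle centred on the central axis CL."""
    return [(radius * math.cos(2 * math.pi * k / n),
             radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def baseline_length(p1, p2):
    """Distance between two camera positions: the stereo baseline
    obtained when that pair of image pickup devices is selected."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

positions = camera_ring_positions(24, radius=50.0)
b_adjacent = baseline_length(positions[0], positions[1])   # short baseline
b_opposite = baseline_length(positions[0], positions[12])  # full diameter
```

Choosing which pair of the 24 devices to use thus amounts to choosing a baseline length from a discrete set of chord lengths.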
- the image pickup device 430 is fixed to, for example, the end effector 22. More specifically, the image pickup apparatus 430 is fixed to, for example, the outer peripheral surface of the base 22a. That is, the end effector 22 has, for example, a holding portion for holding three or more image pickup devices 430 on the outer peripheral surface of the base portion 22a.
- the control unit 40 selects two images from the three or more images acquired by the three or more image pickup devices 430, and acquires information regarding the distance of the object W based on the information of the two selected images.
- the control unit 40 selects two images from, for example, 24 images acquired by each of the 24 image pickup devices 430 based on the information regarding the occlusion of the object W.
- the information regarding the occlusion of the object W includes, for example, information on whether the object W appears in an image, information on the shielding state of the object W, information on the ratio of the object W that appears in an image, and the like.
- the control unit 40 selects, for example, the two images in which the object W appears most suitably from the 24 acquired images.
- the control unit 40 acquires information regarding the distance of the object W based on the two selected images.
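- This two-step procedure can be sketched as follows (a minimal Python illustration; the visibility scores and calibration numbers are hypothetical, and the distance follows the standard parallel-axis stereo relation Z = f·B/d rather than any specific implementation stated in the embodiment):

```python
def select_two_images(candidates):
    """Pick the two images in which the object W appears most suitably.
    Each entry is (image_id, visible_fraction), where visible_fraction
    is the unoccluded ratio of the object W in that image (0.0-1.0)."""
    ranked = sorted(candidates, key=lambda e: e[1], reverse=True)
    return ranked[0][0], ranked[1][0]

def depth_from_disparity(f_px, baseline, disparity_px):
    """Parallel-axis stereo: distance Z = f * B / d, with the focal
    length f and the disparity d both expressed in pixels."""
    return f_px * baseline / disparity_px

scores = [("cam0", 0.30), ("cam5", 0.95), ("cam12", 0.80), ("cam18", 0.10)]
pair = select_two_images(scores)
z = depth_from_disparity(f_px=1200.0, baseline=80.0, disparity_px=48.0)
```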
- the control unit 40 selects the image F1 acquired by the image pickup device 431 and the image F2 acquired by the image pickup device 432 among the plurality of image pickup devices 430.
- the image pickup element 435a of the image pickup device 431 and the image pickup element 435b of the image pickup device 432 are arranged in postures different from each other.
- the control unit 40 cuts out a part of the acquired two images F1 and F2 along the rectangular frame Fs.
- the long side of the rectangular frame Fs is parallel to the virtual line IL connecting the optical axis of the image pickup apparatus 431 and the optical axis of the image pickup apparatus 432.
- the control unit 40 acquires information regarding the distance of the object W based on a part of the two images F1 and F2 cut out.
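- Because the two selected image pickup devices sit at different angles on the ring, the virtual line IL is generally not horizontal in the images; rotating image coordinates so that the long side of the frame Fs lies along IL lets the correspondence search run along image rows as in an ordinary rectified stereo pair. A minimal sketch of that coordinate rotation (the coordinates are hypothetical, not taken from the embodiment):

```python
import math

def baseline_angle(c1, c2):
    """Angle of the virtual line IL connecting the optical-axis
    positions of the two selected image pickup devices."""
    return math.atan2(c2[1] - c1[1], c2[0] - c1[0])

def rotate_point(p, angle):
    """Rotate an image point by -angle so that IL becomes parallel to
    the x-axis, i.e. to the long side of the cut-out frame Fs."""
    c, s = math.cos(-angle), math.sin(-angle)
    return (p[0] * c - p[1] * s, p[0] * s + p[1] * c)

angle = baseline_angle((0.0, 0.0), (30.0, 40.0))
aligned = rotate_point((30.0, 40.0), angle)  # lands on the x-axis
```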
- control unit 40 selects two images from the three or more images acquired by the three or more image pickup devices 430, and controls at least one of the robot arm 21 and the end effector 22 connected to the robot arm 21 based on the information of the two selected images. In the present embodiment, the control unit 40 also performs control so as to synchronize the imaging by the three or more imaging devices 430.
- Other configurations of the robot system 410 can be the same as the other configurations of the robot system of each embodiment described above.
- the control unit 40 selects two images from the three or more images acquired by the three or more image pickup devices 430, and acquires information regarding the distance of the object W based on the information of the two selected images. Therefore, the two images most suitable for acquiring information regarding the distance of the object W can be selected from the three or more images acquired by the three or more image pickup devices 430, and the information can be obtained from them. Thereby, information regarding the distance of the object W can be suitably acquired according to the work content of the robot system 410, the environment in which the robot system 410 is arranged, the position and posture of the robot arm 21, and the like.
- the control unit 40 selects two images from the three or more images acquired by the plurality of image pickup devices 430 based on the information regarding the occlusion of the object W. Therefore, even if at least a part of the object W cannot be imaged by a part of the image pickup apparatus 430 due to a shield or the like, it is possible to preferably select two images in which the object W is imaged. As a result, even when a part of the object W is shielded by a shield or the like, it is easy to suitably acquire information on the distance of the object W.
- the control unit 40 selects two images from the three or more images acquired by the three or more image pickup devices 430, and controls at least one of the robot arm 21 and the end effector 22 connected to the robot arm 21 based on the information of the two selected images. Therefore, by selecting, from the plurality of images acquired by the image pickup devices 430, the two images containing the information best suited to moving the robot arm 21 and the end effector 22, the robot arm 21 and the end effector 22 can be moved favorably.
- three or more image pickup devices 430 are arranged side by side on a predetermined circumference. Therefore, a relatively large number of image pickup devices 430 can be arranged side by side around the base portion 22a of the end effector 22, for example, as in the present embodiment. As a result, the image pickup device 430 can be made less likely to protrude from the robot 20 as compared with the case where the same number of image pickup devices 430 are arranged in a straight line. Therefore, even if a relatively large number of image pickup devices 430 are attached to the robot 20, the robot 20 can be easily moved.
- the plurality of image pickup devices 430 are arranged side by side along the circumference, it is possible to easily image the object W from various angles by the plurality of image pickup devices 430. Therefore, it is easier to more preferably acquire information on the distance of the object W by using a plurality of image pickup devices 430.
- three or more image pickup devices 430 are arranged at equal intervals on the predetermined circumference. Therefore, compared with the case where the image pickup devices 430 are arranged at unequal intervals, bias in the number of image pickup devices 430 that can image the object W depending on the position and posture of the member (for example, the end effector 22) to which they are attached is less likely to occur. This makes it easy to acquire information regarding the distance of the object W using the image pickup devices 430 regardless of the position and posture of the member to which they are attached.
- the control unit 40 may select two images from the three or more images acquired by the three or more image pickup devices 430 based on at least one of: distance information related to the object W obtained in advance, information regarding the occlusion of the object W, focal length information of the image pickup devices 430, and information regarding the shape change of the images obtained by the three or more image pickup devices 430.
- the distance information related to the object W obtained in advance is, for example, the distance from the robot 20 to the object W when the robot 20 and the object W are arranged in the initial position in the initial posture, and the object W. It includes the distance to the shield arranged in the vicinity, the distance between the plurality of objects W when the plurality of objects W are arranged in the initial position in the initial posture, and the like.
- the distance from the robot 20 to the object W includes, for example, the distance from a certain part of the robot arm 21 to the object W, the distance from a certain part of the end effector 22 to the object W, the distance from a certain part of the adapter 23 to the object W, and the like.
- by selecting two images based on the distance information related to the object W obtained in advance, the control unit 40 can select two images having a baseline length suitable for the position of the object W, and can thus suitably acquire information regarding the distance of the object W.
- the control unit 40 can select two images having a suitable baseline length according to the focal length of the image pickup device 430 by selecting two images based on the focal length information of the image pickup device 430.
- when the zoom magnification of the image pickup apparatus 430 is large and two images having a large baseline length are selected, the overlapping portion of the two images (the range in which the same feature portion appears in both images) becomes smaller. Therefore, for example, when the zoom magnification of the image pickup apparatus 430 is relatively large, the overlapping portion of the two images can be increased by selecting two images having a relatively small baseline length. Thereby, the distance to the object W can be more suitably obtained based on the two images.
- FIGS. 12A to 12C are diagrams for explaining that the overlapping portion of the two images changes depending on the baseline length and the zoom magnification.
- FIG. 12A is a diagram showing an example of a case where two images F1a and F2a having a relatively large baseline length are selected when the zoom magnification of the image pickup apparatus 430 is relatively small.
- FIG. 12B is a diagram showing an example of a case where two images F1b and F2b having a relatively large baseline length are selected when the zoom magnification of the image pickup apparatus 430 is relatively large.
- FIG. 12C is a diagram showing an example of a case where two images F1c and F2c having a relatively small baseline length are selected when the zoom magnification of the image pickup apparatus 430 is relatively large.
- FIGS. 12A to 12C illustrate the case where the objects W are a tree T and a vehicle V.
- in FIG. 12A, it is assumed that the entire tree T and the entire vehicle V appear in both of the two images F1a and F2a. In this case, the overlapping portion OPa of the two images F1a and F2a (the range in which the same feature portion appears in both images) includes the entire tree T and the entire vehicle V. Therefore, the distance to each object W can be obtained for the entire tree T and the entire vehicle V.
- as shown in FIG. 12B, when two images F1b and F2b are selected such that the zoom magnification is larger than in FIG. 12A while the baseline length is the same as in FIG. 12A, the range captured in the images F1b and F2b is narrower than the range captured in the images F1a and F2a of FIG. 12A; however, because the baseline length remains relatively large, the displacement of the position at which the object W appears in each of the images F1b and F2b also remains relatively large. Therefore, as shown in FIG. 12B, a part of the object W may be cut off in each of the images F1b and F2b, and the overlapping portion of the images F1b and F2b may become small.
- in this case, the overlapping portion OPb includes only a part of the tree T and a part of the vehicle V. Therefore, even if the entire tree T or the entire vehicle V appears in one of the images, the distance to the object W cannot be acquired for portions not included in the overlapping portion OPb.
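- The trade-off illustrated in FIGS. 12A to 12C can be sketched with a one-dimensional model: two parallel cameras separated by the baseline each see a strip of width 2·Z·tan(half FOV) at depth Z, and the overlap is what remains after subtracting the baseline. This simplified model and its numbers are assumptions for illustration, not formulas given in the embodiment:

```python
import math

def overlap_width(baseline, depth, half_fov_deg):
    """Width of the region at `depth` visible to BOTH of two parallel
    cameras separated by `baseline`; zooming in narrows the field of
    view and therefore shrinks the overlapping portion."""
    strip = 2.0 * depth * math.tan(math.radians(half_fov_deg))
    return max(0.0, strip - baseline)

wide_fov = overlap_width(baseline=100.0, depth=1000.0, half_fov_deg=30.0)  # cf. Fig. 12A
zoomed = overlap_width(baseline=100.0, depth=1000.0, half_fov_deg=10.0)    # cf. Fig. 12B
short_b = overlap_width(baseline=20.0, depth=1000.0, half_fov_deg=10.0)    # cf. Fig. 12C
```

Zooming in with the baseline unchanged shrinks the overlap, while shortening the baseline at the same zoom restores part of it, matching the figure sequence.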
- the information regarding the shape change of the images obtained by the three or more image pickup devices 430 includes, for example, information about the difference in the appearance of the object W in any two of the images acquired by the three or more image pickup devices 430.
- the difference in the appearance of the object W in the images captured by different image pickup devices 430 differs depending on the shape of the object W, the shadow generated by the illumination applied to the object W, and the like.
- the information regarding the shape change of the images obtained by the three or more image pickup devices 430 includes, for example, the degree of matching of the appearance of the object W between two images, shape information of the object W, information on the lighting irradiating the object W, and the like.
- control unit 40 may perform matching for every combination of two images selected from the plurality of images acquired by the image pickup devices 430, and may obtain, as a parameter, the degree of matching of the appearance of the object W in each pair of images. Further, information regarding the shape change of the images may be input to the control unit 40 in advance.
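- The exhaustive pairwise matching described above can be sketched as follows (a minimal Python illustration; the image identifiers and matching-degree values are hypothetical):

```python
from itertools import combinations

def best_pair_by_matching(pair_scores):
    """Return the unordered image pair in which the appearance of the
    object W matches best; `pair_scores` maps (id_a, id_b) -> degree."""
    return max(pair_scores, key=pair_scores.get)

ids = ["cam0", "cam1", "cam2", "cam3"]
# One matching degree per C(4, 2) = 6 combinations (hypothetical values).
pair_scores = dict(zip(combinations(ids, 2), [0.4, 0.9, 0.3, 0.7, 0.5, 0.6]))
chosen = best_pair_by_matching(pair_scores)
```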
- by selecting two images based on the information regarding the shape change of the images obtained by the three or more image pickup devices 430, the control unit 40 can suitably select two images in consideration of the difference in the appearance of the object W in each image, and can thus suitably acquire information regarding the distance of the object W.
- the control unit 40 may select two image pickup devices 430 from the three or more image pickup devices 430 based on at least one of: the distance information related to the object W obtained in advance, the information regarding the occlusion of the object W, the focal length information of the image pickup devices 430, and the information regarding the shape change of the images obtained by the three or more image pickup devices 430. In this case, imaging can be performed only by the two selected image pickup devices 430 to acquire information regarding the distance of the object W. Therefore, the load on the control unit 40 can be reduced compared with the case where two images are selected from images captured by all the image pickup devices 430.
- that is, the control unit 40 may determine the image pickup devices 430 to be used for imaging by selecting them based on this information. In this case, the shape of the object W, differences in the apparent brightness of the object W, and the like may be acquired from the distance information related to the object W obtained in advance.
- the three or more image pickup devices 430 may be arranged at non-equal intervals on a predetermined circumference. Further, the three or more image pickup devices 430 may be arranged side by side on a predetermined axis in the same manner as the image pickup devices 230a, 230b, 230c of the third embodiment described above. In this case, the three or more image pickup devices 430 may be arranged so that the long sides of the image pickup devices 435 in the three or more image pickup devices 430 are parallel to each other. The three or more image pickup devices 430 may be arranged in a matrix. Further, the three or more image pickup devices 430 may be a TOF camera (Time Of Flight Camera).
- the three or more image pickup devices 430 are arranged around the end effector 22, but the present invention is not limited to this.
- the three or more image pickup devices 430 may be arranged around any one of the robot arm 21, the end effector 22 connected to the robot arm 21, and the adapter 23 for attaching the end effector 22 to the robot arm 21. Even when the three or more image pickup devices 430 are arranged around the robot arm 21 or around the adapter 23, the same effects as when they are arranged around the end effector 22 described above can be obtained.
- when three or more image pickup devices 430 are arranged around the robot arm 21, the robot arm 21 may have a holding portion for holding the three or more image pickup devices 430 that image the object W. Further, when three or more image pickup devices 430 are arranged around the adapter 23, the adapter 23 may have a holding portion for holding the three or more image pickup devices 430 that image the object W. In these cases as well, the control unit 40 may acquire information regarding the distance of the object W based on the information of the images of the object W acquired by two of the three or more image pickup devices 430 held by the robot arm 21 or the adapter 23.
- control unit 40 may perform the operation of acquiring information regarding the distance of the object W based on the images acquired by two of the three or more image pickup devices 430 multiple times, using a different pair of image pickup devices 430 each time. In this case, the control unit 40 can acquire the information regarding the distance of the object W more accurately by comparing the information acquired in the multiple operations. In particular, when at least a part of the object W is shielded by a shield as viewed from some of the image pickup devices 430, using the multiple results obtained with multiple pairs of image pickup devices 430 makes it possible to acquire information regarding the distance of the object W while minimizing the influence of the shielding.
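- Combining the estimates obtained from several different pairs can be sketched as taking a robust statistic such as the median, so that a pair whose view of the object W was partially occluded does not dominate the result. The median rule and the numbers below are assumptions for illustration; the embodiment only states that the results are compared with each other:

```python
from statistics import median

def fuse_distance_estimates(estimates):
    """Median of distance estimates from several camera pairs; robust
    to the occasional outlier caused by a partially occluded view."""
    return median(estimates)

# Five pairings; one pair saw the object W behind a shield (outlier).
fused = fuse_distance_estimates([1980.0, 2010.0, 1995.0, 2005.0, 3500.0])
```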
- control unit 40 may move the end effector 22 by moving the robot arm 21 or the like after the object W has been imaged by all the image pickup devices 430, and may then re-image the object W from another location with all the image pickup devices 430.
- the control unit 40 can acquire an image obtained by capturing the object W from various angles.
- since the number of image pickup devices 430 is relatively large, many images capturing the object W from different angles can be acquired even if the end effector 22 is moved and imaging is repeated only a small number of times.
- information such as the position and posture of the end effector 22 at the time of imaging and the position and posture of the image pickup device 430 that captured the image may be associated with each of the acquired images.
- the control unit 40 may control the robot system 410 by a visual servo using the acquired image.
- the control unit 40 performs control, for example, to move the image pickup device 430 to a position where the image pickup device 430 can capture the target image of the object W.
- when the image of the object W captured by the image pickup apparatus 430 at the initial position differs significantly from the target image of the object W, it is difficult to establish correspondences between the images, and it may be difficult to move the image pickup apparatus 430 to a position where the target image can be captured.
- in such a case, by using images captured from different angles as intermediate images, it is easy to bring the image captured by the image pickup apparatus 430 closer to the target image; that is, it is easy to suitably move the image pickup apparatus 430 to a position where the target image of the object W can be captured. Further, when distance information from the object W at the time each intermediate image was captured is associated with the intermediate image, the control unit 40 may arrange the intermediate images in order starting from the one captured at the position farthest from the object W, and bring the image captured by the image pickup apparatus 430 closer to the target image by passing through the plurality of intermediate images in that order.
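- Ordering the intermediate images by their associated distance information can be sketched as follows (the image identifiers and distances are hypothetical):

```python
def order_intermediate_images(images):
    """Sort intermediate images from far to near using the distance to
    the object W associated with each image, so visual servoing can
    pass through them in that order.  Entries are (image_id, distance)."""
    return [i for i, d in sorted(images, key=lambda e: e[1], reverse=True)]

waypoints = order_intermediate_images(
    [("viewA", 400.0), ("viewB", 1200.0), ("viewC", 800.0)])
```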
- the robot system 410 may have, for example, a plurality of general-purpose cameras that can simply capture an image of the object W, in addition to the plurality of image pickup devices 430 described above.
- the control unit 40 may construct a 3D model of the object W by using a plurality of images captured by a plurality of general-purpose cameras.
- by using the 3D model, for example, it is possible to further improve the accuracy of acquiring information regarding the distance of the object W using the plurality of image pickup devices 430.
- for example, twelve image pickup devices 430 may be provided side by side at equal intervals along the circumferential direction, and two general-purpose cameras may be provided side by side in the circumferential direction between each pair of adjacent image pickup devices 430.
- in this case, 24 general-purpose cameras are provided in total.
- as the general-purpose camera, for example, a relatively inexpensive camera such as one used in a smartphone can be used.
- control unit 40 may perform SLAM (Simultaneous Localization and Mapping). That is, the control unit 40 may simultaneously estimate the self-position of the robot system 410 and create a map of the environment in which the robot system 410 is arranged.
- FIG. 13 is a perspective view showing the robot system 510 of the present embodiment.
- configurations identical to those of the embodiments described above are given the same reference numerals, and their description may be omitted as appropriate.
- the image pickup device 530 includes a first image pickup device 531 attached to the robot arm 521 and a second image pickup device 32 attached to the end effector 22.
- the first image pickup apparatus 531 is arranged, for example, around the fifth arm portion 524e.
- the first image pickup apparatus 531 is connected to the guide rail portion 521a provided on the fifth arm portion 524e by, for example, the same structure as that by which the first image pickup apparatus 31 is connected to the guide rail portion 22e in the first embodiment.
- the guide rail portion 521a has an annular shape surrounding the fifth arm portion 524e.
- the first image pickup apparatus 531 is movable in a predetermined circumferential direction around the robot arm 521 along the guide rail portion 521a.
- the robot arm 521 has the same configuration as the robot arm 21 of the first embodiment except that the guide rail portion 521a is provided.
- Other configurations of the robot system 510 of the present embodiment can be made in the same manner as the other configurations of the robot system of each of the above-described embodiments.
- the first image pickup device 531 and the second image pickup device 32 are attached to different members and can move relative to the members to which they are attached. Therefore, compared with the case where the two image pickup devices are attached to the same member, it is possible to prevent the movement of the first image pickup device 531 and the movement of the second image pickup device 32 from being hindered by the other image pickup device. As a result, each of the first image pickup device 531 and the second image pickup device 32 can be suitably moved relative to the member to which it is attached.
- FIG. 14 is a perspective view showing the robot system 610 of the present embodiment.
- configurations identical to those of the embodiments described above are given the same reference numerals, and their description may be omitted as appropriate.
- the robot system 610 of the present embodiment has a projection device 670 that projects light SL.
- the projection device 670 is arranged around the robot arm 621.
- the projection device 670 is arranged, for example, around the fifth arm portion 624e.
- the projection device 670 is connected to the guide rail portion 621a of the fifth arm portion 624e.
- the guide rail portion 621a has the same configuration as the guide rail portion 521a of the sixth embodiment except that the projection device 670 is connected instead of the image pickup device.
- the projection device 670 can move along the guide rail portion 621a in a predetermined circumferential direction around the robot arm 621.
- the projection device 670 projects, for example, light SL having a grid pattern onto the object W.
- the structure of the projection device 670 is not particularly limited as long as it can project the light SL.
- the first image pickup device 31 and the second image pickup device 32 perform imaging and acquire images in a state where the light SL is projected by the projection device 670.
- the control unit 40 moves the projection device 670, the first image pickup device 31, and the second image pickup device 32 so that the light SL projected onto the object W by the projection device 670 can be imaged by the first image pickup device 31 and the second image pickup device 32.
- the robot arm 621 has the same configuration as the robot arm 21 of the first embodiment except that the guide rail portion 621a is provided. Other configurations of the robot system 610 can be made in the same manner as the other configurations of the robot system of each embodiment described above.
- the first image pickup device 31 and the second image pickup device 32 perform imaging and acquire images in a state where the light SL is projected by the projection device 670. Therefore, the object W onto which the light SL is projected can be imaged by the first image pickup device 31 and the second image pickup device 32. As a result, images of the object W can be more suitably acquired by the first image pickup device 31 and the second image pickup device 32. Further, by making the light SL projected from the projection device 670 light having a pattern such as a grid pattern, it is also possible to measure the three-dimensional shape of the object W based on the pattern appearing in the images acquired by the first image pickup device 31 and the second image pickup device 32.
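- A grid-pattern light SL and the associated camera-projector triangulation can be sketched as follows. Treating the projector as an inverse camera gives the same relation as parallel-axis stereo; this is a standard structured-light formulation offered as an illustration under assumed values, not a formula stated in the embodiment:

```python
def grid_pattern(width, height, pitch):
    """Binary mask of a grid pattern such as the light SL projected by
    the projection device 670: 1 on grid lines, 0 elsewhere."""
    return [[1 if (x % pitch == 0 or y % pitch == 0) else 0
             for x in range(width)] for y in range(height)]

def depth_from_projected_feature(f_px, baseline, x_cam, x_proj):
    """Camera-projector triangulation: with the projector treated as an
    inverse camera, Z = f * B / (x_cam - x_proj)."""
    return f_px * baseline / (x_cam - x_proj)

pattern = grid_pattern(8, 8, pitch=4)
z = depth_from_projected_feature(f_px=1000.0, baseline=120.0,
                                 x_cam=90.0, x_proj=30.0)
```

Locating each grid intersection of the projected pattern in the captured images and applying the relation above yields the three-dimensional shape measurement mentioned in the text.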
- only the first image pickup device 31 of the first image pickup device 31 and the second image pickup device 32 may perform imaging and acquire an image in a state where the light SL is projected by the projection device 670. Likewise, only the second image pickup device 32 of the two may perform imaging and acquire an image in that state. Further, in the present embodiment, the robot system 610 may have only the first image pickup device 31 of the first image pickup device 31 and the second image pickup device 32. In this case, the first image pickup device 31 may execute imaging in a state where the light SL is projected by the projection device 670 to acquire an image, and information regarding the distance of the object W may be acquired by the same method as the method, described in the second embodiment, by which the first image pickup device 31 acquires information regarding the distance of the object W.
- the projection device 670 may be arranged around the end effector 22 or around the adapter 23. Around which of the robot arm 621, the end effector 22, and the adapter 23 the projection device 670 is arranged is determined as appropriate according to the positions at which the first image pickup device 31 and the second image pickup device 32 are attached, the work contents of the robot system 610, and the like. By arranging the projection device 670 around such a part, the light SL projected from the projection device 670 is less likely to be blocked by parts of the robot system 610 and can easily be projected onto the object W.
- the projection device 670 may be fixed so as not to be relatively movable with respect to the attached member. Further, a plurality of projection devices 670 may be provided. In this case, the plurality of projection devices 670 may be attached to different members.
- FIG. 15 is a perspective view showing the robot system 710 of the present embodiment.
- configurations identical to those of the embodiments described above are given the same reference numerals, and their description may be omitted as appropriate.
- the adapter 723 has an annular guide rail portion 723h along the circumferential direction.
- the adapter 723 has the same configuration as the adapter 23 of the first embodiment except that it has a guide rail portion 723h.
- the end effector 722 has the same configuration as the end effector 22 of the first embodiment except that the guide rail portion 22e is not provided and the first image pickup device 731 and the second image pickup device 732 are not attached.
- the first image pickup device 731 and the second image pickup device 732 are attached to the adapter 723.
- the first image pickup device 731 and the second image pickup device 732 are arranged around the adapter 723.
- the first image pickup apparatus 731 is connected to the guide rail portion 723h via the slider 731d.
- the second image pickup apparatus 732 is connected to the guide rail portion 723h via the slider 732d. That is, in the present embodiment, the adapter 723 has the guide rail portion 723h as both a first holding portion that holds the first image pickup device 731 and a second holding portion that holds the second image pickup device 732.
- the sliders 731d and 732d extend radially outward from the guide rail portion 723h and project radially outward from the end effector 722.
- the first image pickup device 731 and the second image pickup device 732 provided at the radial outer ends of the sliders 731d and 732d are located radially outside the end effector 722.
- At least one of the first image pickup device 731 and the second image pickup device 732 is movable with respect to the adapter 723. At least one of the first image pickup device 731 and the second image pickup device 732 is movable in a predetermined circumferential direction around the adapter 723. In this embodiment, both the first imaging device 731 and the second imaging device 732 are movable with respect to the adapter 723 and are movable in a predetermined circumferential direction around the adapter 723. That is, in the present embodiment, the first image pickup device 731 and the second image pickup device 732 are movably held by the guide rail portion 723h as the first holding portion and the second holding portion. The relative positions of the first image pickup device 731 and the second image pickup device 732 are variable. Other configurations of the robot system 710 can be made in the same manner as the other configurations of the robot system of each embodiment described above.
- According to the present embodiment, effects similar to those obtained in each of the embodiments described above can be obtained.
- the adapter 723 may have a first holding portion that holds the first image pickup device 731 immovably.
- the adapter 723 may have the guide rail portion 723h as a second holding portion that movably holds the second image pickup device 732.
- the adapter 723 may have a second holding portion that holds the second image pickup device 732 immovably.
- the adapter 723 may have the guide rail portion 723h as a first holding portion that movably holds the first image pickup device 731.
- one of the first image pickup device 731 and the second image pickup device 732 may be movable in a predetermined circumferential direction around the adapter 723, with the other fixed to a predetermined portion of the adapter 723.
- FIG. 16 is a perspective view showing the robot system 810 of the present embodiment.
- descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
- the first image pickup device 831 and the second image pickup device 832 are attached to the robot arm 521.
- the first image pickup device 831 and the second image pickup device 832 are arranged around the robot arm 521. More specifically, the first image pickup device 831 and the second image pickup device 832 are arranged around the fifth arm portion 524e.
- the first image pickup device 831 and the second image pickup device 832 are connected to the guide rail portion 521a by, for example, the same structure by which the first image pickup device 31 and the second image pickup device 32 are connected in the first embodiment. That is, in the present embodiment, the robot arm 521 has the guide rail portion 521a serving as both a first holding portion that holds the first image pickup device 831 and a second holding portion that holds the second image pickup device 832.
- At least one of the first image pickup device 831 and the second image pickup device 832 is movable with respect to the robot arm 521, and is movable in a predetermined circumferential direction around the robot arm 521. In the present embodiment, both the first image pickup device 831 and the second image pickup device 832 are movable with respect to the robot arm 521 and are movable in the predetermined circumferential direction around the robot arm 521. That is, the first image pickup device 831 and the second image pickup device 832 are movably held by the guide rail portion 521a serving as the first holding portion and the second holding portion. The relative positions of the first image pickup device 831 and the second image pickup device 832 are variable. The other configurations of the robot system 810 can be the same as those of the robot systems of the embodiments described above.
- with the first image pickup device 831 and the second image pickup device 832 attached to the robot arm 521, it is possible to obtain the same effects as those obtained by the first image pickup device 31 and the second image pickup device 32 attached to the end effector 22 in the first embodiment.
- the robot arm 521 may have a first holding portion that holds the first image pickup device 831 immovably.
- the robot arm 521 may have the guide rail portion 521a as a second holding portion that movably holds the second image pickup device 832.
- the robot arm 521 may have a second holding portion that holds the second image pickup device 832 immovably.
- the robot arm 521 may have the guide rail portion 521a as a first holding portion that movably holds the first image pickup device 831.
- one of the first image pickup device 831 and the second image pickup device 832 may be movable in a predetermined circumferential direction around the robot arm 521, with the other fixed to a predetermined portion of the robot arm 521.
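When the two image pickup devices ride on a circular guide rail as in these embodiments, the baseline length realized by a given pair of angular positions is simply the chord of the circle. A minimal sketch of that geometric relationship, assuming a radius and angle parameterization not stated in the publication:

```python
import math

def chord_baseline(radius_m, theta1_rad, theta2_rad):
    """Baseline (chord length) between two cameras placed at angles
    theta1 and theta2 on a circular guide rail of the given radius."""
    dtheta = abs(theta2_rad - theta1_rad) % (2 * math.pi)
    # use the shorter arc so the chord is computed consistently
    dtheta = min(dtheta, 2 * math.pi - dtheta)
    return 2.0 * radius_m * math.sin(dtheta / 2.0)
```

For example, two cameras on opposite sides of a 0.1 m rail realize the maximum baseline of 0.2 m, while a quarter-circle separation gives about 0.14 m.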
- the position of the image pickup device may be calibrated by any method.
- the position of the image pickup device may be calibrated by arranging a panel or the like on which a specific mark is written at a specific distance with respect to the image pickup device and taking an image of the mark.
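A panel with a mark of known size placed at a known distance allows a pinhole-model calibration of the camera. As a hedged sketch (the function names and single-parameter model are illustrative simplifications, not the publication's method, which is not specified in detail):

```python
def focal_length_px(mark_width_px, mark_width_m, distance_m):
    """Pinhole-model estimate of the focal length in pixels from a mark
    of known physical width imaged at a known distance: f = w * Z / W."""
    return mark_width_px * distance_m / mark_width_m

def distance_from_mark(mark_width_px, mark_width_m, focal_px):
    """Inverse use: once calibrated, recover the distance to the mark."""
    return focal_px * mark_width_m / mark_width_px
```

A full calibration would additionally estimate the principal point and lens distortion, but this one-parameter relation captures why a mark at a known distance suffices to fix the scale.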
- the image pickup device may be movable only between a plurality of predetermined locations. In this case, the positions to which the image pickup device can move may be structurally determined, so that the position of the image pickup device can be grasped structurally.
- the image pickup device may move relative to the member to which it is attached.
- the image pickup device may move linearly with respect to the member, or may move along a curved path other than an arc.
- the image pickup devices may move along movement paths different from one another.
- the structure of the drive unit that moves the image pickup device relative to the member to which it is attached is not particularly limited.
- the plurality of image pickup devices may include different types of image pickup devices.
- the plurality of image pickup devices may include, for example, an infrared camera and an RGB camera. In this case, an object may be imaged from the same position by each of the infrared camera and the RGB camera to acquire an image.
- the robot system may have an external sensor capable of detecting at least one of the position, posture, shape, etc. of the robot.
- the external sensor may be placed on the ceiling of the place where the robot is placed, or may be placed on the floor of the place where the robot is placed.
- the external sensor may be, for example, a sensor capable of detecting the position and posture of the robot arm, a sensor capable of detecting the position and posture of the end effector, or a sensor capable of detecting the position and posture of the adapter.
- the external sensor may be, for example, a laser tracker.
- the external sensor may detect the position information of each part based on, for example, a distance measurement result obtained by the Time of Flight (TOF) method, which uses the difference between the irradiation timing at which light is emitted and the light-receiving timing at which the reflected light is received.
- the external sensor may detect the position information of each part by determining the geometrical positional relationship in the manner of triangulation, based on measurement of the positions at which reflected light is produced when light is emitted along a plurality of optical paths.
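The TOF relation mentioned above reduces to a one-line formula: the round-trip travel time of the light pulse, multiplied by the speed of light and halved, gives the one-way distance. A minimal sketch (timestamp-based interface is an illustrative assumption):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(emit_time_s, receive_time_s):
    """One-way distance from the round-trip time of flight of a light
    pulse: d = c * (t_receive - t_emit) / 2."""
    return C * (receive_time_s - emit_time_s) / 2.0
```

A 20 ns round trip, for instance, corresponds to roughly 3 m of one-way distance, which shows why picosecond-level timing is needed for millimeter accuracy.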
- the external sensor may have a variable-focus lens (for example, a zoom lens) in the optical system of the light-receiving portion that receives the reflected light. Further, for the position detection of each part by the external sensor, a distance-measuring method using an optical frequency comb of ultrashort optical pulses may be used.
- the external sensor may be able to detect the position and orientation of the image pickup device.
- the control unit may move the image pickup device based on the information of the image pickup device obtained by the external sensor.
- the image pickup apparatus may be provided with a marker that can be detected by an external sensor.
- the external sensor may be an image pickup device whose baseline length can be changed.
- the control unit may change the baseline length of the external sensor, for example, according to the distance between the object on which the work is performed by the end effector and the external sensor.
- the control unit may reduce the baseline length of the external sensor when the object gripped by the end effector is brought close to the external sensor by moving the robot arm.
- the baseline length of the external sensor can be changed, for example, by the same method as the baseline length changing method appropriately described in each of the above-described embodiments.
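The relationship exploited when changing the baseline length is the standard stereo triangulation formula Z = f·B/d: for a fixed focal length f and depth Z, a shorter baseline B produces a smaller disparity d, which is why the baseline is reduced as the gripped object approaches the sensor. A hedged sketch of both directions of this relation (the function and parameter names are illustrative, not from the publication):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo triangulation for a rectified pinhole pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def baseline_for_target_disparity(depth_m, focal_px, target_disparity_px):
    """Choose a baseline so an object at the given depth produces the
    desired disparity, e.g. to keep it inside the matcher's search range."""
    return depth_m * target_disparity_px / focal_px
```

With a 1000 px focal length, halving the object distance halves the baseline needed to keep the same disparity, matching the behavior described for the control unit above.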
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
Abstract
Description
The scope of the present invention is not limited to the following embodiments and can be arbitrarily modified within the scope of the technical idea of the present invention. In the following drawings, the scale, number, and the like of each structure may be shown differently from those of the actual structure in order to make each configuration easier to understand.
FIG. 1 is a perspective view showing the robot system 10 of the present embodiment. FIG. 2 is a block diagram showing part of the configuration of the robot system 10 of the present embodiment.
As shown in FIG. 1, the robot system 10 includes a robot 20, an image pickup device 30, a control unit 40, and a display unit 50. The robot 20 performs work on, for example, an object W on a workbench WB.
As shown in FIGS. 3 and 4, the end effector 22 is attached to the robot arm 21. In the present embodiment, the end effector 22 is attached to the tip of the fifth arm portion 24e via the adapter 23. The end effector 22 is, for example, detachably attached to the robot arm 21 and can be replaced with another end effector.
FIG. 7 is a perspective view showing part of the robot system 110 of the present embodiment. In FIG. 7, the finger portions 22b of the end effector 122 and the camera unit 60 are not shown. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 8 is a view of part of the robot system 210 of the present embodiment as seen from the tip side (+Z side) in the central axis direction. In FIG. 8, the finger portions 22b of the end effector 22 and the camera unit 60 are not shown. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 9 is a view of part of the robot system 310 of the present embodiment as seen from the tip side (+Z side) in the central axis direction. In FIG. 9, the finger portions 22b of the end effector 22 and the camera unit 60 are not shown. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 10 is a view of part of the robot system 410 of the present embodiment as seen from the tip side (+Z side) in the central axis direction. In FIG. 10, the finger portions 22b of the end effector 22 and the camera unit 60 are not shown. FIG. 11 is a diagram for explaining part of the procedure by which the robot system 410 of the present embodiment acquires information regarding the distance of the object W. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 13 is a perspective view showing the robot system 510 of the present embodiment. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 14 is a perspective view showing the robot system 610 of the present embodiment. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 15 is a perspective view showing the robot system 710 of the present embodiment. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
FIG. 16 is a perspective view showing the robot system 810 of the present embodiment. Descriptions of configurations identical to those of the embodiments described above may be omitted, with the same reference numerals assigned as appropriate.
20…Robot; 21, 521, 621…Robot arm
22, 122, 722…End effector
22e, 521a, 723h…Guide rail portion (first holding portion, second holding portion)
23, 723…Adapter
24…Arm portion (movable portion)
30, 230, 230a, 230b, 230c, 430, 431, 432, 530…Image pickup device
31, 331, 531, 731, 831…First image pickup device
31f, 32f, 132f, 235a, 235b, 235c, 331f, 332f, 435…Imaging element
32, 132, 332, 732, 832…Second image pickup device
34, 134…Position acquisition unit
40…Control unit
44…Distance information acquisition unit
50…Display unit
122f…Hole portion (second holding portion)
336…Connecting member
670…Projection device; AX1, AX1a, AX1b, AX1c, AX2, AX2a, AX2c, AX3a, AX3b, AX3c, AX4…Optical axis
L, L1, L2, L3, L4, La…Baseline length
P1…First position
P2…Second position
SL…Light
VA…Axis
W…Object
Claims (71)
- A robot system having a robot arm with a movable portion, the robot system comprising: a first image pickup device and a second image pickup device attached to the robot arm; a control unit that controls the robot system; and a distance information acquisition unit that acquires information regarding the distance of an object, wherein the control unit is capable of changing a baseline length that is the distance between the first image pickup device and the second image pickup device, and the distance information acquisition unit acquires the information regarding the distance of the object based on the baseline length.
- The robot system according to claim 1, wherein the control unit changes the baseline length according to the work content of the robot system.
- The robot system according to claim 1 or 2, further comprising a connecting member that connects the first image pickup device and the second image pickup device, wherein the connecting member is capable of holding the first image pickup device and the second image pickup device while maintaining their relative posture at a predetermined posture.
- The robot system according to any one of claims 1 to 3, wherein the distance information acquisition unit adjusts the orientation of an acquired image by rotating at least one of a first image acquired by the first image pickup device and a second image acquired by the second image pickup device.
- The robot system according to any one of claims 1 to 4, wherein the control unit adjusts the orientation of an acquired image by rotating at least one of an imaging element of the first image pickup device and an imaging element of the second image pickup device.
- The robot system according to any one of claims 1 to 5, wherein the first image pickup device and the second image pickup device are arranged around the robot arm.
- The robot system according to claim 6, wherein at least one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the robot arm.
- A robot system having a robot arm with a movable portion, the robot system comprising a first image pickup device and a second image pickup device attached to the robot arm, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the robot arm.
- A robot system having a robot arm with a movable portion, the robot system comprising: an end effector attached to the robot arm; and a first image pickup device and a second image pickup device attached to the end effector, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the end effector.
- A robot system having a robot arm with a movable portion, the robot system comprising: an adapter for attaching an end effector to the robot arm; and a first image pickup device and a second image pickup device attached to the adapter, wherein at least one of the first image pickup device and the second image pickup device is movable with respect to the adapter.
- A robot system having a robot arm with a movable portion, the robot system comprising a first image pickup device and a second image pickup device attached to the robot arm, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- A robot system having a robot arm with a movable portion, the robot system comprising: an end effector attached to the robot arm; and a first image pickup device and a second image pickup device attached to the end effector, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- A robot system having a robot arm with a movable portion, the robot system comprising: an adapter for attaching an end effector to the robot arm; and a first image pickup device and a second image pickup device attached to the adapter, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- The robot system according to any one of claims 8 to 13, wherein the optical axis of the first image pickup device and the optical axis of the second image pickup device are parallel to each other.
- The robot system according to any one of claims 8 to 14, wherein at least one of the first image pickup device and the second image pickup device moves such that the long sides of the imaging element of the first image pickup device and the imaging element of the second image pickup device are parallel to each other.
- The robot system according to any one of claims 8 to 15, further comprising a connecting member that connects the first image pickup device and the second image pickup device, wherein the connecting member is capable of holding the first image pickup device and the second image pickup device while maintaining their relative posture at a predetermined posture.
- The robot system according to any one of claims 8 to 16, further comprising: a control unit that controls the robot system; and a distance information acquisition unit that acquires information regarding the distance of an object, wherein the control unit is capable of changing a baseline length that is the distance between the first image pickup device and the second image pickup device, and the distance information acquisition unit acquires the information regarding the distance of the object based on the baseline length.
- The robot system according to claim 17, further comprising a position acquisition unit that acquires position information of at least the first image pickup device, wherein the distance information acquisition unit acquires the baseline length based on the position information of the first image pickup device acquired by the position acquisition unit.
- The robot system according to claim 17 or 18, wherein the control unit changes the baseline length according to the work content of the robot system.
- The robot system according to any one of claims 17 to 19, wherein the distance information acquisition unit adjusts the orientation of an acquired image by rotating at least one of a first image acquired by the first image pickup device and a second image acquired by the second image pickup device.
- The robot system according to any one of claims 17 to 19, wherein the control unit adjusts the orientation of an acquired image by rotating at least one of the imaging element of the first image pickup device and the imaging element of the second image pickup device.
- The robot system according to claim 8 or 11, wherein the first image pickup device and the second image pickup device are arranged around the robot arm.
- The robot system according to claim 22, wherein at least one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the robot arm.
- The robot system according to claim 22, wherein one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the robot arm, and the other is also movable in the predetermined circumferential direction or is fixed to a predetermined portion of the robot arm.
- The robot system according to claim 9 or 12, wherein the first image pickup device and the second image pickup device are arranged around the end effector.
- The robot system according to claim 25, wherein at least one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the end effector.
- The robot system according to claim 25, wherein one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the end effector, and the other is also movable in the predetermined circumferential direction or is fixed to a predetermined portion of the end effector.
- The robot system according to claim 10 or 13, wherein the first image pickup device and the second image pickup device are arranged around the adapter.
- The robot system according to claim 28, wherein at least one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the adapter.
- The robot system according to claim 28, wherein one of the first image pickup device and the second image pickup device is movable in a predetermined circumferential direction around the adapter, and the other is also movable in the predetermined circumferential direction or is fixed to a predetermined portion of the adapter.
- A robot system having a robot arm with a movable portion, the robot system comprising: a first image pickup device movable with respect to the robot arm; and a distance information acquisition unit that acquires information regarding the distance of an object based on images captured by the first image pickup device, wherein the first image pickup device captures a first image of the object at a first position and captures a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information regarding the distance of the object based on the first image and the second image.
- A robot system having a robot arm with a movable portion, the robot system comprising: an end effector attached to the robot arm; a first image pickup device movable with respect to the end effector; and a distance information acquisition unit that acquires information regarding the distance of an object based on images captured by the first image pickup device, wherein the first image pickup device captures a first image of the object at a first position and captures a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information regarding the distance of the object based on the first image and the second image.
- A robot system having a robot arm with a movable portion, the robot system comprising: an adapter for attaching an end effector to the robot arm; a first image pickup device movable with respect to the adapter; and a distance information acquisition unit that acquires information regarding the distance of an object based on images captured by the first image pickup device, wherein the first image pickup device captures a first image of the object at a first position and captures a second image of the object at a second position different from the first position, and the distance information acquisition unit acquires the information regarding the distance of the object based on the first image and the second image.
- The robot system according to any one of claims 31 to 33, wherein the first image pickup device moves such that the long sides of the imaging element of the first image pickup device at the first position and the imaging element of the first image pickup device at the second position are parallel to each other.
- The robot system according to any one of claims 31 to 34, further comprising a control unit that controls the robot system, wherein the control unit is capable of changing a baseline length that is the distance between the first image pickup device at the first position and the first image pickup device at the second position, and the distance information acquisition unit acquires the information regarding the distance of the object based on the baseline length.
- The robot system according to claim 35, further comprising a position acquisition unit that acquires position information regarding the first position and the second position, wherein the distance information acquisition unit acquires the baseline length based on the position information regarding the first position and the second position acquired by the position acquisition unit.
- The robot system according to claim 35 or 36, wherein the control unit changes the baseline length according to the work content of the robot system.
- The robot system according to any one of claims 35 to 37, wherein the distance information acquisition unit adjusts the orientation of an acquired image by rotating at least one of the first image acquired by the first image pickup device at the first position and the second image acquired by the first image pickup device at the second position.
- The robot system according to any one of claims 35 to 37, wherein the control unit adjusts the orientation of an image acquired by the first image pickup device by rotating the imaging element of the first image pickup device.
- The robot system according to claim 31, wherein the first image pickup device is movable in a predetermined circumferential direction around the robot arm.
- The robot system according to claim 32, wherein the first image pickup device is movable in a predetermined circumferential direction around the end effector.
- The robot system according to claim 33, wherein the first image pickup device is movable in a predetermined circumferential direction around the adapter.
- The robot system according to any one of claims 8 to 42, wherein at least the first image pickup device moves to a predetermined initial position after the power of the robot system is turned on.
- The robot system according to claim 43, wherein the movement of the first image pickup device to the predetermined initial position is performed while the member to which the first image pickup device is attached is stationary.
- The robot system according to any one of claims 8 to 44, wherein at least the first image pickup device moves to a predetermined end position before the power of the robot system is turned off.
- The robot system according to claim 45, wherein the movement of the first image pickup device to the predetermined end position is performed while the member to which the first image pickup device is attached is stationary.
- The robot system according to any one of claims 8 to 46, further comprising a projection device that projects light, wherein at least the first image pickup device performs imaging and acquires an image while light is projected by the projection device.
- A robot system comprising: a robot arm; three or more image pickup devices that image an object; and a control unit that acquires information regarding the distance of the object based on information of images of the object acquired by two image pickup devices among the three or more image pickup devices.
- The robot system according to claim 48, wherein the control unit selects two images from among three or more images acquired by the three or more image pickup devices and acquires the information regarding the distance of the object based on information of the selected two images.
- The robot system according to claim 48, wherein the control unit selects the two image pickup devices from among the three or more image pickup devices and acquires the information regarding the distance of the object based on information of the images acquired by the selected two image pickup devices.
- A robot system comprising: a robot arm; three or more image pickup devices that image an object; and a control unit that controls at least one of the robot arm and an end effector connected to the robot arm based on information of images acquired by two image pickup devices among the three or more image pickup devices.
- The robot system according to claim 51, wherein the control unit selects two images from among three or more images acquired by the three or more image pickup devices and controls at least one of the robot arm and the end effector connected to the robot arm based on information of the selected two images.
- The robot system according to claim 51, wherein the control unit selects the two image pickup devices from among the three or more image pickup devices and controls at least one of the robot arm and the end effector connected to the robot arm based on information of the images acquired by the selected two image pickup devices.
- The robot system according to claim 49 or 52, wherein the control unit selects the two images based on at least one of distance information related to the object obtained in advance, information regarding occlusion of the object, focal length information of the image pickup devices, and information regarding shape changes in the images obtained by the three or more image pickup devices.
- The robot system according to claim 50 or 53, wherein the control unit selects the two image pickup devices based on at least one of distance information related to the object obtained in advance, information regarding occlusion of the object, focal length information of the image pickup devices, and information regarding shape changes in the images obtained by the three or more image pickup devices.
- The robot system according to any one of claims 48 to 55, wherein the control unit performs control so as to synchronize imaging by at least two of the three or more image pickup devices.
- The robot system according to any one of claims 17 to 21, 31 to 33, and 48 to 50, further comprising a display unit that displays information based on the information regarding the distance.
- A robot system comprising: a robot arm; and three or more image pickup devices, wherein the three or more image pickup devices are arranged around any one of the robot arm, an end effector connected to the robot arm, and an adapter for attaching the end effector to the robot arm.
- The robot system according to claim 58, wherein the three or more image pickup devices are arranged side by side on a predetermined axis.
- The robot system according to claim 58, wherein the three or more image pickup devices are arranged side by side on a predetermined circumference.
- The robot system according to claim 60, wherein the three or more image pickup devices are arranged at equal intervals on the circumference.
- The robot system according to any one of claims 58 to 61, wherein the optical axes of the three or more image pickup devices are parallel to one another.
- The robot system according to any one of claims 58 to 62, wherein the three or more image pickup devices are arranged such that the long sides of their respective imaging elements are parallel.
- The robot system according to any one of claims 58 to 63, further comprising a projection device that projects light, wherein the projection device is arranged around any one of the robot arm, the end effector, and the adapter.
- A robot arm comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- An end effector to be attached to a robot arm, the end effector comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- An adapter for attaching an end effector to a robot arm, the adapter comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the first image pickup device is movably held by the first holding portion, or the second image pickup device is movably held by the second holding portion.
- A robot arm comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- An end effector to be attached to a robot arm, the end effector comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- An adapter for attaching an end effector to a robot arm, the adapter comprising: a first holding portion that holds a first image pickup device; and a second holding portion that holds a second image pickup device, wherein the relative position of the first image pickup device and the second image pickup device is variable.
- A robot arm comprising: a holding portion that holds three or more image pickup devices that image an object; and a control unit that acquires information regarding the distance of the object based on information of images of the object acquired by two image pickup devices among the three or more image pickup devices.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237015508A KR20230104161A (ko) | 2020-10-30 | 2020-10-30 | 로봇 시스템, 로봇 아암, 엔드 이펙터, 및 어댑터 |
US18/034,452 US20240046401A1 (en) | 2020-10-30 | 2020-10-30 | Robot system, robot arm, end effector, and adapter |
CN202080106828.9A CN116438042A (zh) | 2020-10-30 | 2020-10-30 | 机器人系统、机器人手臂、端点效果器以及配接器 |
JP2022558737A JPWO2022091328A5 (ja) | 2020-10-30 | | Robot system |
EP20959846.5A EP4238719A1 (en) | 2020-10-30 | 2020-10-30 | Robot system, robot arm, end effector, and adapter |
PCT/JP2020/040779 WO2022091328A1 (ja) | 2020-10-30 | 2020-10-30 | ロボットシステム、ロボットアーム、エンドエフェクタ、およびアダプタ |
TW110140421A TW202222519A (zh) | 2020-10-30 | 2021-10-29 | 機器人系統、機器人手臂、端點效果器以及配接器 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/040779 WO2022091328A1 (ja) | 2020-10-30 | 2020-10-30 | ロボットシステム、ロボットアーム、エンドエフェクタ、およびアダプタ |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022091328A1 true WO2022091328A1 (ja) | 2022-05-05 |
Family
ID=81382127
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/040779 WO2022091328A1 (ja) | 2020-10-30 | 2020-10-30 | ロボットシステム、ロボットアーム、エンドエフェクタ、およびアダプタ |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240046401A1 (ja) |
EP (1) | EP4238719A1 (ja) |
KR (1) | KR20230104161A (ja) |
CN (1) | CN116438042A (ja) |
TW (1) | TW202222519A (ja) |
WO (1) | WO2022091328A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010131685A (ja) | 2008-12-03 | 2010-06-17 | Seiko Epson Corp | ロボット装置および撮像方法 |
JP2013013950A (ja) * | 2011-07-01 | 2013-01-24 | Seiko Epson Corp | ロボット、ロボット制御装置、ロボット制御方法、およびロボット制御プログラム |
WO2018043525A1 (ja) * | 2016-09-02 | 2018-03-08 | 倉敷紡績株式会社 | ロボットシステム、ロボットシステム制御装置、およびロボットシステム制御方法 |
JP2019013802A (ja) * | 2013-05-30 | 2019-01-31 | 国立研究開発法人産業技術総合研究所 | 撮像システム及び手術支援システム |
-
2020
- 2020-10-30 EP EP20959846.5A patent/EP4238719A1/en active Pending
- 2020-10-30 CN CN202080106828.9A patent/CN116438042A/zh active Pending
- 2020-10-30 WO PCT/JP2020/040779 patent/WO2022091328A1/ja unknown
- 2020-10-30 US US18/034,452 patent/US20240046401A1/en active Pending
- 2020-10-30 KR KR1020237015508A patent/KR20230104161A/ko active Search and Examination
-
2021
- 2021-10-29 TW TW110140421A patent/TW202222519A/zh unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010131685A (ja) | 2008-12-03 | 2010-06-17 | Seiko Epson Corp | ロボット装置および撮像方法 |
JP2013013950A (ja) * | 2011-07-01 | 2013-01-24 | Seiko Epson Corp | ロボット、ロボット制御装置、ロボット制御方法、およびロボット制御プログラム |
JP2019013802A (ja) * | 2013-05-30 | 2019-01-31 | 国立研究開発法人産業技術総合研究所 | 撮像システム及び手術支援システム |
WO2018043525A1 (ja) * | 2016-09-02 | 2018-03-08 | 倉敷紡績株式会社 | ロボットシステム、ロボットシステム制御装置、およびロボットシステム制御方法 |
Also Published As
Publication number | Publication date |
---|---|
US20240046401A1 (en) | 2024-02-08 |
CN116438042A (zh) | 2023-07-14 |
EP4238719A1 (en) | 2023-09-06 |
KR20230104161A (ko) | 2023-07-07 |
JPWO2022091328A1 (ja) | 2022-05-05 |
TW202222519A (zh) | 2022-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10837756B2 (en) | Multi-dimensional measuring system with measuring instrument having 360° angular working range | |
JP5290324B2 (ja) | 空間内において少なくとも1つのオブジェクトを最終姿勢に高精度で位置決めするための方法およびシステム | |
US10107618B2 (en) | Coordinate measuring machine | |
US20050131582A1 (en) | Process and device for determining the position and the orientation of an image reception means | |
US11230011B2 (en) | Robot system calibration | |
EP1215017B1 (en) | Robot teaching apparatus | |
JP2020116734A (ja) | ロボットモーション用のビジョンシステムの自動ハンドアイ校正のためのシステム及び方法 | |
EP3272473B1 (en) | Teaching device and method for generating control information | |
WO2018043525A1 (ja) | ロボットシステム、ロボットシステム制御装置、およびロボットシステム制御方法 | |
JP2016536150A (ja) | 向上した正確度を伴うロボット配置および操作 | |
JP2009269110A (ja) | 組立装置 | |
JP2009241247A (ja) | ステレオ画像型検出移動装置 | |
JP2010152664A (ja) | 画像を利用したセンサレスモータ駆動ロボット | |
JP6869159B2 (ja) | ロボットシステム | |
JP6897396B2 (ja) | 制御装置、ロボットシステムおよび制御方法 | |
JP2003094367A (ja) | 手先視覚付ロボットハンド | |
WO2021241187A1 (ja) | 形状測定装置および形状測定方法 | |
WO2018043524A1 (ja) | ロボットシステム、ロボットシステム制御装置、およびロボットシステム制御方法 | |
JP2018194542A (ja) | 画像処理システム、画像処理装置および画像処理プログラム | |
JP2006297559A (ja) | キャリブレーションシステムおよびロボットのキャリブレーション方法 | |
WO2022091328A1 (ja) | ロボットシステム、ロボットアーム、エンドエフェクタ、およびアダプタ | |
JP3239277B2 (ja) | 知能ロボット | |
JP2004015965A (ja) | 球面モータ | |
JP7343349B2 (ja) | ロボット、測定用治具、およびツール先端位置の決定方法 | |
US11813740B2 (en) | Camera and robot system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20959846 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022558737 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20237015508 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020959846 Country of ref document: EP Effective date: 20230530 |