WO2021070922A1 - Correction system, correction method, robot system, and control device - Google Patents


Info

Publication number
WO2021070922A1
WO2021070922A1 (PCT/JP2020/038254)
Authority
WO
WIPO (PCT)
Prior art keywords: mounting portion, camera, robot, teaching, attached
Application number
PCT/JP2020/038254
Other languages
French (fr)
Japanese (ja)
Inventor
剛彦 村田
匡志 庄司
佳典 毛笠
Original Assignee
川崎重工業株式会社 (Kawasaki Heavy Industries, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 川崎重工業株式会社 (Kawasaki Heavy Industries, Ltd.)
Priority to CN202080069459.0A (CN114555271B)
Publication of WO2021070922A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B23: MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23K: SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K 11/00: Resistance welding; Severing by resistance heating
    • B23K 11/10: Spot welding; Stitch welding
    • B23K 11/11: Spot welding
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00: Controls for manipulators
    • B25J 13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00: Programme-controlled manipulators
    • B25J 9/16: Programme controls
    • B25J 9/1679: Programme controls characterised by the tasks executed
    • B25J 9/1692: Calibration of manipulator
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/42: Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine

Definitions

  • This disclosure relates to a correction system, a correction method, a robot system and a control device.
  • Patent Document 1 discloses a teaching position correction system for a welding robot.
  • In that teaching position correction system, an imaging device is attached to, or exchanged for, one of the two opposed electrodes of the welding gun.
  • The optical axis of the imaging device is coaxial with the axis of that electrode.
  • The system corrects the teaching position of the welding gun so that the welding point can be imaged, based on the position information of the welding point of the work in the image captured by the imaging device, the distance from the imaging device to the welding point, and the clearance between the work and the other electrode.
  • In the above prior art, the imaging device is attached to, or substituted for, one electrode of the welding gun, and the length of the imaging device is generally greater than the length of the electrode.
  • If the electrodes are closed onto the work, the imaging device may therefore be crushed, so imaging is performed, for example, with a gap left between the electrodes. In that case the distance between the imaging device and the welding point becomes large, the accuracy of the welding point position information obtained by processing the image becomes low, and the correction accuracy of the teaching position may decrease.
  • An object of the present disclosure is to provide a correction system, a correction method, a robot system, and a control device that improve the correction accuracy of teaching data.
  • A correction system according to one aspect of the present disclosure is a correction system that corrects teaching data of a robot, and includes: a camera attached to a first mounting portion, of a first mounting portion and a second mounting portion of a robot gun of the robot that face each other, the first mounting portion being operable in a first direction; a mounting tool that attaches the camera to the first mounting portion so that the direction of the optical axis of the camera is offset from the first mounting portion; and a control device.
  • When the robot gun is positioned, according to the teaching data, at a teaching position for pressing the first mounting portion against a predetermined hitting point position of a work placed between the first mounting portion and the second mounting portion, the control device causes the camera to image a dot mark attached to the predetermined hitting point position, detects the position of the dot mark using the image captured by the camera, detects a corresponding position, which is the position of the robot gun for pressing the first mounting portion against the dot mark, and corrects the teaching data based on the difference between the corresponding position and the teaching position.
  • A robot system according to one aspect of the present disclosure includes the correction system according to one aspect of the present disclosure and the robot, and the control device controls the operation of the robot.
  • A correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, and includes: when a robot gun of the robot is positioned, according to the teaching data, at a teaching position for pressing a first mounting portion against a predetermined hitting point position of a work placed between the first mounting portion and a second mounting portion of the robot gun that face each other, causing a camera attached to the first mounting portion to image a dot mark attached to the predetermined hitting point position; detecting the position of the dot mark using the image captured by the camera; detecting a corresponding position, which is the position of the robot gun for pressing the first mounting portion against the dot mark; and correcting the teaching data based on the difference between the corresponding position and the teaching position.
  • The camera is attached to the first mounting portion, which is operable in a first direction, so that the direction of its optical axis is offset from the first mounting portion.
  • A control device according to one aspect of the present disclosure is a control device that executes the correction method according to one aspect of the present disclosure.
  • FIG. 1 is a schematic view showing an example of a robot system according to an embodiment.
  • FIG. 2 is a side view showing an example of the configuration of the welding gun according to the embodiment.
  • FIG. 3 is a side view showing an example of a configuration in which an imaging device is mounted in place of an electrode tip in the welding gun of FIG. 2.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the robot system according to the embodiment.
  • FIG. 5 is a block diagram showing an example of the functional configuration of the robot system according to the embodiment.
  • FIG. 6 is a diagram showing an example of marking of a dot mark according to an embodiment.
  • FIG. 7 is a side view showing an example of a state at the time of imaging of the dot mark by the camera in the correction mode.
  • FIG. 8 is a diagram showing an example of an image captured by the camera in the state of FIG. 7.
  • FIG. 9 is a flowchart showing an example of the operation of the robot system according to the embodiment in the correction mode.
  • FIG. 10 is a side view showing an example of the configuration of the image pickup device for the welding gun according to the modified example.
  • A correction system according to one aspect of the present disclosure is a correction system that corrects teaching data of a robot, and includes: a camera attached to a first mounting portion, of a first mounting portion and a second mounting portion of a robot gun of the robot that face each other, the first mounting portion being operable in a first direction; a mounting tool that attaches the camera to the first mounting portion so that the direction of the optical axis of the camera is offset from the first mounting portion; and a control device.
  • When the robot gun is positioned, according to the teaching data, at a teaching position for pressing the first mounting portion against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion, the control device causes the camera to image a dot mark attached to the predetermined hitting point position, detects the position of the dot mark using the image captured by the camera, detects a corresponding position, which is the position of the robot gun for pressing the first mounting portion against the dot mark, and corrects the teaching data based on the difference between the corresponding position and the teaching position.
  • According to the above aspect, since the camera is attached to the first mounting portion so that the direction of its optical axis is offset, the amount by which the camera protrudes from the first mounting portion in the first direction can be kept small.
  • The camera can therefore image the dot mark while the first mounting portion is operated toward the hitting point position, and can capture an image that shows the relative position of the dot mark with high accuracy and high image quality. It is thus possible to improve the detection accuracy of the corresponding position and the correction accuracy of the teaching data.
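  • Purely as an illustration (not part of the disclosure), the correction flow summarized above can be sketched in Python; every helper name below (move_gun_to, capture_image, detect_mark_pose, gun_pose_for_mark) is a hypothetical placeholder for the robot, camera, and image-processing interfaces, and the numeric update of the teaching data is shown separately in the sketch accompanying the description of the correction unit 20f further below.

```python
import numpy as np

def correct_teaching_point(teaching_pose, move_gun_to, capture_image,
                           detect_mark_pose, gun_pose_for_mark):
    """Hedged sketch of one correction-mode cycle for a single hitting point.

    All callables are hypothetical stand-ins for the robot controller, the
    camera on the first mounting portion, and the image processing device.
    """
    move_gun_to(teaching_pose)            # position the robot gun at the teaching position
    image = capture_image()               # image the dot mark while close to the work
    mark_pose = detect_mark_pose(image)   # position (and posture) of the dot mark
    # Corresponding position: gun pose that would press the first mounting
    # portion onto the detected dot mark.
    corresponding_pose = np.asarray(gun_pose_for_mark(mark_pose), dtype=float)
    teaching_pose = np.asarray(teaching_pose, dtype=float)
    # The teaching data is corrected based on this difference.
    difference = corresponding_pose - teaching_pose
    return teaching_pose + difference     # equal to corresponding_pose when applied fully
```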
  • In the correction system according to one aspect of the present disclosure, when the robot gun is located at the teaching position, the control device may operate the first mounting portion in the first direction toward the work according to the teaching data, and may cause the camera to image the dot mark in a state where the camera is close to the work.
  • In the correction system according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the mounting tool may be attached to the first mounting portion in place of the electrode.
  • According to the above aspect, since the camera is attached to the first mounting portion in place of the electrode, the camera can image the portion of the work with which the electrode would come into contact. Such a camera can capture an image that shows, with high accuracy and high image quality, the relative position of the dot mark corresponding to the portion contacted by the electrode.
  • In the correction system according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the mounting tool may be configured so that it can be attached to the first mounting portion while the electrode remains attached to the first mounting portion.
  • According to the above aspect, the camera can capture an image in a state where the first mounting portion has been moved in the first direction and the electrode is in contact with the work. Since such an image shows the actual hitting point position of the electrode together with the dot mark, it enables highly accurate and simple detection of the relative position between the actual hitting point position and the dot mark.
  • In the correction system according to one aspect of the present disclosure, the length by which the mounting tool and the camera attached to the first mounting portion protrude from the first mounting portion in the first direction may be equal to or less than the length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
  • According to the above aspect, the camera and the mounting tool approach the work only as closely as the electrode would, or less, so the camera and the mounting tool are kept from being pressed against the work and damaged.
  • In the correction system according to one aspect of the present disclosure, the mounting tool may offset the direction of the optical axis of the camera so that the direction of the optical axis of the camera intersects the first direction.
  • According to the above aspect, the length occupied by the camera in the first direction can be kept small, so space for mounting the camera on the first mounting portion can be saved.
  • In the correction system according to one aspect of the present disclosure, the mounting tool may offset the direction of the optical axis of the camera so that the direction of the optical axis of the camera is parallel to the first direction.
  • According to the above aspect, distortion of the dot mark in the image captured by the camera is suppressed, so the image processing for detecting the dot mark can be simplified.
  • In the correction system according to one aspect of the present disclosure, the teaching position may include the three-dimensional position and posture of the robot gun at the teaching position, the corresponding position may include the three-dimensional position and posture of the robot gun at the corresponding position, and the control device may correct the teaching data based on the difference between the three-dimensional position and posture of the robot gun at the corresponding position and the three-dimensional position and posture of the robot gun at the teaching position.
  • In the correction system according to one aspect of the present disclosure, the control device may move the robot gun at the teaching position so as to press the second mounting portion against the work before imaging with the camera.
  • According to the above aspect, the position of the work between the first mounting portion and the second mounting portion is kept constant, so the process of detecting the positions of the work and the dot mark in the direction from the first mounting portion toward the second mounting portion can be simplified.
  • The dot mark may be a marking including a center display unit indicating the center of the marking and a directional display unit indicating the rotational orientation of the marking about the center.
  • the robot system according to one aspect of the present disclosure includes a correction system according to one aspect of the present disclosure and the robot, and the control device controls the operation of the robot. According to the above aspect, the same effect as that of the correction system according to one aspect of the present disclosure can be obtained.
  • A correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, and includes: when a robot gun of the robot is positioned, according to the teaching data, at a teaching position for pressing a first mounting portion against a predetermined hitting point position of a work placed between the first mounting portion and a second mounting portion of the robot gun that face each other, causing a camera attached to the first mounting portion to image a dot mark attached to the predetermined hitting point position; detecting the position of the dot mark using the image captured by the camera; detecting a corresponding position, which is the position of the robot gun for pressing the first mounting portion against the dot mark; and correcting the teaching data based on the difference between the corresponding position and the teaching position.
  • The camera is attached to the first mounting portion, which is operable in a first direction, so that the direction of its optical axis is offset from the first mounting portion. According to the above aspect, the same effect as that of the correction system according to one aspect of the present disclosure can be obtained.
  • The correction method according to one aspect of the present disclosure may further include operating the first mounting portion in the first direction toward the work according to the teaching data when the robot gun is located at the teaching position, and the imaging of the dot mark by the camera may be performed in a state where the camera is close to the work.
  • In the correction method according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the camera may be attached to the first mounting portion in place of the electrode.
  • In the correction method according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the camera may be configured so that it can be attached to the first mounting portion while the electrode remains attached to the first mounting portion.
  • In the correction method according to one aspect of the present disclosure, the length by which the camera attached to the first mounting portion protrudes from the first mounting portion in the first direction may be equal to or less than the length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
  • the direction of the optical axis of the camera may be offset so that the direction of the optical axis of the camera intersects the first direction.
  • the direction of the optical axis of the camera may be offset so that the direction of the optical axis of the camera is parallel to the first direction.
  • In the correction method according to one aspect of the present disclosure, the teaching position may include the three-dimensional position and posture of the robot gun at the teaching position, the corresponding position may include the three-dimensional position and posture of the robot gun at the corresponding position, and the teaching data may be corrected based on the difference between the three-dimensional position and posture of the robot gun at the corresponding position and the three-dimensional position and posture of the robot gun at the teaching position.
  • the correction method according to one aspect of the present disclosure may further include moving the robot gun at the teaching position so as to press the second mounting portion against the work before imaging with the camera.
  • In the correction method according to one aspect of the present disclosure, the dot mark may be a marking including a center display unit indicating the center of the marking and a directional display unit indicating the rotational orientation of the marking about the center.
  • the control device is a control device that executes the correction method according to one aspect of the present disclosure. According to the above aspect, the same effect as the correction method according to one aspect of the present disclosure can be obtained.
  • FIG. 1 is a schematic view showing an example of the robot system 1 according to the embodiment.
  • the robot system 1 according to the embodiment includes a robot 10, a robot control device 20, an image processing device 30, an input device 40, and an image pickup device 50.
  • the robot control device 20, the image processing device 30, and the image pickup device 50 constitute a teaching data correction system 2.
  • the robot control device 20 and the image processing device 30 constitute the control device 3.
  • the robot system 1 can make the robot 10 automatically operate according to the instructed operation procedure to execute a predetermined work.
  • the robot system 1 can cause the robot 10 to manually operate according to the operation information input via the input device 40 to execute the work.
  • the robot system 1 can execute a process of automatically correcting the teaching data with the data of the taught operation procedure.
  • the robot system 1 operates by selecting one of an automatic operation mode in which the robot 10 is automatically operated, a manual operation mode in which the robot 10 is manually operated, and a correction mode in which the teaching data is automatically corrected.
  • the work executed by the robot 10 is a welding work, for example, a spot welding work.
  • the work executed by the robot 10 may be a welding work other than spot welding, or may be a work other than the welding work.
  • Such work may include positioning a movable portion of the robot 10 with respect to an object, such as drilling, screwing, sealing, and the like.
  • the robot 10 as described above is an industrial robot.
  • the robot 10 includes an end effector 11 that actually welds a welded portion of a welding object W, which is an example of a work, and a robot arm 12 that moves the end effector 11 to the welded portion.
  • the end effector 11 is a welding gun which is an example of a robot gun.
  • the "end effector 11" is also referred to as a "welding gun 11".
  • the object W to be welded is, for example, two stacked thin plate-like objects.
  • the configuration of the robot arm 12 is not particularly limited as long as the position and posture of the welding gun 11 at the tip can be changed, but in the present embodiment, the robot arm 12 is a vertical articulated robot arm.
  • the robot arm 12 may be configured as, for example, a horizontal articulated type, a polar coordinate type, a cylindrical coordinate type, a rectangular coordinate type, or another type of robot arm.
  • the robot arm 12 is fixedly arranged on an installation surface such as a floor surface, but may be arranged and movable on a transport vehicle or the like.
  • The robot arm 12 includes links 12a to 12f arranged sequentially from the base toward the tip, joints JT1 to JT6 that sequentially connect the links 12a to 12f, and arm drive devices M1 to M6 that rotationally drive the joints JT1 to JT6.
  • the operations of the arm drive devices M1 to M6 are controlled by the robot control device 20.
  • Each of the arm drive devices M1 to M6 uses electric power as a power source and has a servomotor as an electric motor for driving, but the arm drive devices M1 to M6 are not limited to this.
  • the number of joints of the robot arm 12 is not limited to 6, but may be 7 or more, or 1 or more and 5 or less.
  • the joint JT1 rotatably connects the installation surface of the robot arm 12 and the base end portion of the link 12a around an axis in the vertical direction perpendicular to the installation surface.
  • the joint JT2 rotatably connects the tip end of the link 12a and the base end of the link 12b around a horizontal axis parallel to the installation surface.
  • the joint JT3 rotatably connects the tip end of the link 12b and the base end of the link 12c around an axis in the horizontal direction.
  • the joint JT4 rotatably connects the tip end of the link 12c and the base end of the link 12d around the longitudinal axis of the link 12c.
  • the joint JT5 rotatably connects the tip end of the link 12d and the base end of the link 12e about an axis in a direction orthogonal to the longitudinal direction of the link 12d.
  • the joint JT6 connects the tip end of the link 12e and the base end of the link 12f in a twistable and rotatable manner with respect to the link 12e.
  • the tip of the link 12f constitutes a mechanical interface and is connected to the welding gun 11.
  • FIG. 2 is a side view showing an example of the configuration of the welding gun 11 according to the embodiment.
  • FIG. 3 is a side view showing an example of a configuration in which the imaging device 50 is mounted in place of the electrode tip 11d in the welding gun 11 of FIG.
  • the welding gun 11 is detachably attached to the tip of the link 12f.
  • the welding gun 11 includes a mounting portion 11a, a main body portion 11b, and a moving device 11c.
  • the mounting portion 11a is configured to be connected to the mechanical interface of the link 12f and supports the main body portion 11b.
  • the main body portion 11b is composed of a U-shaped member and is connected to the mounting portion 11a.
  • the main body portion 11b is made of the same material as the mounting portion 11a and is integrated with the mounting portion 11a.
  • the main body portion 11b is connected to the mounting portion 11a in the vicinity of the end portion 11ba of the U-shaped end portions 11ba and 11bb.
  • the main body portion 11b has a movable first mounting portion 11bc at the end portion 11ba, and a second mounting portion 11bd fixed to the main body portion 11b at the end portion 11bb.
  • the mounting portions 11bc and 11bd are arranged so as to face each other in the direction D1, and the first mounting portion 11bc can move in the direction D1 approaching the second mounting portion 11bd and in the direction D2 away from the second mounting portion 11bd.
  • the directions D1 and D2 are opposite to each other.
  • Direction D1 is an example of the first direction.
  • the moving device 11c is arranged at the end 11ba and moves the first mounting portion 11bc in the directions D1 and D2.
  • The moving device 11c includes a movement drive device 11ca and a movement drive mechanism 11cc, and the movement drive device 11ca drives the movement drive mechanism 11cc.
  • The movement drive device 11ca uses electric power as a power source and has a servomotor as an electric motor, but is not limited to this, and its operation is controlled by the robot control device 20.
  • The movement drive mechanism 11cc transmits the driving force of the movement drive device 11ca to the first mounting portion 11bc and moves the first mounting portion 11bc in the directions D1 and D2; specifically, it converts the rotational driving force of the movement drive device 11ca into a linear driving force and transmits it to the first mounting portion 11bc.
  • The movement drive mechanism 11cc has, for example, a ball screw structure: its nut is rotationally driven by the movement drive device 11ca, which moves the rod-shaped screw connected to the first mounting portion 11bc in the axial directions D1 and D2.
  • The movement drive device 11ca is not limited to an electric motor, and may be, for example, a hydraulic or pneumatic piston, an electric linear actuator, or the like; the movement drive device 11ca and the movement drive mechanism 11cc may have any configuration that can move the first mounting portion 11bc in the directions D1 and D2.
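  • As a purely illustrative aside, the conversion from servomotor rotation to linear travel of the first mounting portion 11bc in a ball screw structure can be sketched as follows; the 5 mm lead is an assumed placeholder value, not a figure from the disclosure.

```python
import math

def ball_screw_travel(motor_angle_rad: float, lead_mm: float = 5.0) -> float:
    """Linear travel (mm) produced by a ball screw for a given rotation (rad).

    One full revolution (2*pi rad) advances the screw by one lead.
    The default 5 mm lead is an assumed example value.
    """
    return motor_angle_rad / (2.0 * math.pi) * lead_mm

# Example: three revolutions of the nut move the screw by about 15 mm.
print(ball_screw_travel(3 * 2 * math.pi))
```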
  • the second mounting portion 11bd is configured so that the electrode tip 11d, which is an example of an electrode for welding, can be attached and detached.
  • the electrode tip 11d may be configured to be attached by being inserted into a hole included in the second mounting portion 11bd.
  • the shape of the electrode tip 11d is a cylindrical shape having a hemispherical tip, but the shape is not limited to this.
  • the welding gun 11 includes a contact sensor 11e that detects contact between the electrode tip 11d mounted on the second mounting portion 11bd and the welding object W. The contact sensor 11e transmits a detection signal indicating contact between the electrode tip 11d and the welding object W to the robot control device 20.
  • The configuration of the contact sensor 11e is not particularly limited as long as it can detect the contact. In the present embodiment, a weak current is applied to the conductive electrode tip 11d on the second mounting portion 11bd, and a signal representing the change in that current when the tip contacts the conductive welding object W is transmitted as the detection signal.
  • The first mounting portion 11bc is configured so that the electrode tip 11d can be attached and detached. Further, the first mounting portion 11bc is configured so that the imaging device 50 can be attached and detached in place of the electrode tip 11d.
  • the electrode tip 11d and the image pickup apparatus 50 can be exchanged with each other and attached to the first mounting portion 11bc.
  • the electrode tip 11d and the image pickup apparatus 50 may be configured to be attached by being inserted into a hole included in the first mounting portion 11bc.
  • the image pickup device 50 has a camera 51 and a fixture 52.
  • the camera 51 is a small camera that captures a digital image.
  • An example of the camera 51 is an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) image sensor and a CCD (Charge Coupled Device) image sensor.
  • the camera 51 is controlled in operation by the robot control device 20, and transmits a signal of the captured image to the robot control device 20 and / or the image processing device 30.
  • In the present embodiment, the camera 51 is a monocular camera, but it is not limited to this; the camera 51 may be configured to capture images for detecting the position of a subject, such as a compound-eye (stereo) camera, a TOF camera (Time-of-Flight camera), a pattern-light projection camera such as a fringe projection camera, or a camera using the light-section method.
  • the attachment 52 attaches the camera 51 to the first attachment portion 11bc.
  • the attachment 52 holds the camera 51 and is configured to be detachably attached to the first attachment portion 11bc.
  • The fixture 52 mounts the camera 51 on the first mounting portion 11bc so that the direction of the optical axis 51a of the camera 51 is offset from the first mounting portion 11bc.
  • Specifically, the fixture 52 offsets the direction of the optical axis 51a from the first mounting portion 11bc so that the direction of the optical axis 51a of the camera 51 intersects the direction D1; in the present embodiment, the optical axis 51a and the direction D1 intersect obliquely.
  • In the present embodiment, the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc and the optical axis 51a of the camera 51 mounted on the first mounting portion 11bc via the fixture 52 intersect, but they may instead be in a skew relationship without intersecting.
  • the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc is coaxial with the axis of the electrode tip 11d mounted on the second mounting portion 11bd.
  • the axis 11da is also the axis of the first mounting portion 11bc.
  • the fixture 52 integrally includes a cylindrical connecting portion 52a connected to the first mounting portion 11bc and a cylindrical accommodating portion 52b extending from the connecting portion 52a.
  • the accommodating portion 52b accommodates and holds the camera 51 and its harness and the like, and exposes the lens of the camera 51 at the end portion connected to the connecting portion 52a.
  • The axis of the connecting portion 52a connected to the first mounting portion 11bc is coaxial with the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc, but is not limited to this.
  • the axis of the accommodating portion 52b is coaxial with the optical axis 51a of the camera 51, but is not limited thereto.
  • In the state of being mounted on the first mounting portion 11bc, the accommodating portion 52b extends from the connecting portion 52a toward the direction D2 side, along a direction that obliquely intersects the axis of the connecting portion 52a.
  • The accommodating portion 52b is arranged so as not to interfere with the main body portion 11b of the welding gun 11 or the moving device 11c, and so that its amount of protrusion from the connecting portion 52a in the direction D1 is kept small or eliminated.
  • Such a fixture 52, when mounted on the first mounting portion 11bc, keeps the length protruding from the first mounting portion 11bc in the direction D1 small.
  • In the present embodiment, the length by which the fixture 52 and the camera 51, mounted on the first mounting portion 11bc, protrude from the first mounting portion 11bc in the direction D1 is equal to or less than the length by which the electrode tip 11d protrudes from the first mounting portion 11bc in the direction D1.
  • Therefore, even when the first mounting portion 11bc is moved in the direction D1 as in the welding operation, the fixture 52 and the camera 51 mounted on the first mounting portion 11bc are kept from being strongly pressed against the welding object W between the first mounting portion 11bc and the second mounting portion 11bd and being damaged.
  • the input device 40 receives input of commands, information, data, etc. by the user of the robot system 1 and outputs the commands, information, data, etc. to the robot control device 20.
  • the input device 40 is connected to the robot control device 20 via wired communication or wireless communication.
  • the format of wired communication and wireless communication may be any format.
  • the input device 40 receives an input of a command for executing any of the automatic operation mode, the manual operation mode, and the correction mode, and outputs the command to the robot control device 20.
  • the input device 40 may include a teaching device such as a teaching pendant for teaching the robot 10 the operation procedure of a predetermined welding operation.
  • the robot control device 20 controls the entire robot system 1.
  • the robot control device 20 may include a computer device.
  • the image processing device 30 generates image data from the signal of the image received from the camera 51, and performs image processing on the image data.
  • the image processing device 30 may include a computer device.
  • the image processing device 30 detects the three-dimensional position and orientation of the subject projected on the image data by performing image processing.
  • the "three-dimensional position" may be a three-dimensional position in the three-dimensional space
  • the "posture” may be the three-dimensional posture in the three-dimensional space. It may be a two-dimensional posture in a two-dimensional plane such as a plane of an image or a plane along a plane intersecting the plane.
  • the camera 51 takes an image of a dot mark attached to a predetermined dot position on the surface of the welding object W, and the image processing device 30 detects the three-dimensional position and orientation of the dot mark projected on the image data.
  • The predetermined hitting point position is the position to be welded when the robot 10 performs the welding operation according to the teaching data in the automatic operation mode, that is, the position against which the electrode tip 11d of the welding gun 11 is to be pressed.
  • Examples of dot marks are spot weld marks made on the surface of the object W to be welded, markings including figures and the like. When the dot mark has no directionality, only the three-dimensional position of the dot mark may be detected.
  • In the correction mode, when the robot 10 performs the same operation as the welding work by automatic operation according to the teaching data, the teaching data is corrected so as to reduce the difference that arises between the actual hitting point position at which the robot 10 would actually weld and the predetermined hitting point position at which the robot 10 should originally weld.
  • A dot mark is attached to such a predetermined hitting point position. The dot mark may be attached to the surface of the welding object W by marking a position determined by calculation or the like, or a spot weld mark may be formed on the surface of the welding object W by an instructor manually operating the robot.
  • FIG. 4 is a block diagram showing an example of the hardware configuration of the robot system 1 according to the embodiment.
  • The robot control device 20 includes, as components, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a memory 204, input/output I/Fs (interfaces) 205 to 207, an arm drive circuit 208, and a gun drive circuit 209.
  • Each of the above components is connected via bus, wired or wireless communication. Not all of the above components are essential.
  • the CPU 201 is a processor and controls the entire operation of the robot control device 20.
  • the ROM 202 is composed of a non-volatile semiconductor memory or the like, and stores a program, data, or the like for causing the CPU 201 to control the operation.
  • the RAM 203 is composed of a volatile semiconductor memory or the like, and temporarily stores a program executed by the CPU 201 and data in the middle of processing or processed.
  • The memory 204 is composed of a semiconductor memory such as a volatile memory or a non-volatile memory, or a storage device such as a hard disk drive (HDD) or an SSD (Solid State Drive), and stores various information.
  • the memory 204 may be a device external to the robot control device 20.
  • the program for operating the CPU 201 is stored in the ROM 202 or the memory 204 in advance.
  • the CPU 201 reads a program from the ROM 202 or the memory 204 into the RAM 203 and develops the program.
  • the CPU 201 executes each coded instruction in the program expanded in the RAM 203.
  • Each function of the robot control device 20 may be realized by a computer system including the CPU 201, the ROM 202, the RAM 203, and the like, may be realized by a dedicated hardware circuit such as an electronic circuit or an integrated circuit, or may be realized by a combination of the computer system and hardware circuits.
  • the first input / output I / F 205 is connected to the input device 40 and inputs / outputs information, data, commands, etc. to the input device 40.
  • the first input / output I / F 205 may include a circuit for converting a signal or the like.
  • the second input / output I / F 206 is connected to the image processing device 30 and inputs / outputs information, data, commands, and the like to the image processing device 30.
  • the second input / output I / F 206 may include a circuit for converting a signal or the like.
  • the third input / output I / F 207 is connected to the camera 51 and inputs / outputs information, data, commands, etc. to the camera 51.
  • the third input / output I / F 207 may include a circuit or the like for driving the camera 51.
  • the arm drive circuit 208 supplies electric power to the servomotors of the arm drive devices M1 to M6 of the robot 10 and controls the drive of the servomotors in accordance with the command of the CPU 201.
  • The gun drive circuit 209 supplies electric power to the servomotor of the movement drive device 11ca of the welding gun 11 and controls the drive of the servomotor in accordance with commands from the CPU 201.
  • the image processing device 30 includes a CPU 301, a ROM 302, a RAM 303, a memory 304, and input / output I / F 305 to 306 as components. Each of the above components is connected via bus, wired or wireless communication. Not all of the above components are essential.
  • the configurations of the CPU 301, ROM 302, RAM 303, and memory 304 are the same as those of the robot control device 20.
  • the first input / output I / F 305 is connected to the robot control device 20 and inputs / outputs information, data, commands, etc. to the robot control device 20.
  • the second input / output I / F 306 is connected to the camera 51 and inputs / outputs information, data, commands, etc. to the camera 51.
  • the second input / output I / F 306 receives the signal of the image captured by the camera 51.
  • the input / output I / F 305 to 306 may include a circuit for converting a signal or the like.
  • The robot control device 20 and the image processing device 30 as described above may each be composed of, for example, a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like.
  • The plurality of functions of the robot control device 20 and of the image processing device 30 may each be realized by individual single chips, or may be realized by a single chip that includes some or all of them. Each circuit may be a general-purpose circuit or a dedicated circuit.
  • As such an integrated circuit, an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, a reconfigurable processor in which the connections and/or settings of circuit cells inside the LSI can be reconfigured, or an ASIC (Application Specific Integrated Circuit) in which circuits for multiple functions are integrated into one for a specific application may be used.
  • FIG. 5 is a block diagram showing an example of the functional configuration of the robot system 1 according to the embodiment.
  • The robot control device 20 includes, as functional components, an image pickup control unit 20a, a mode determination unit 20b, a manual command generation unit 20c, an automatic command generation unit 20d, an operation control unit 20e, a correction unit 20f, and a storage unit 20g.
  • The operation control unit 20e includes an arm control unit 20ea and a gun control unit 20eb. Not all of the above functional components are essential.
  • the functions of the functional components other than the storage unit 20g are realized by the CPU 201 and the like, and the functions of the storage unit 20g are realized by the memory 204, the ROM 202 and / or the RAM 203.
  • the storage unit 20g stores various information and enables reading of the stored information.
  • the storage unit 20g stores a program for operating the robot control device 20.
  • the storage unit 20g stores the teaching data 20ga stored by the teaching for causing the robot 10 to perform a predetermined welding operation.
  • the teaching method of the robot 10 is teaching by programming, and the teaching data 20ga is offline teaching data.
  • the teaching method of the robot 10 may be, for example, direct teaching by the instructor moving the robot 10 in direct contact, teaching by remote control using a teaching pendant, teaching by a master / slave, or the like.
  • When the robot 10 operates according to the teaching data 20ga, the actually welded position may not match the position that should originally be welded, due to factors such as individual differences in the operation of the robot 10 and differences in the skill level of the instructor. Therefore, the teaching data 20ga needs to be corrected.
  • The teaching data 20ga includes a gun teaching position set as the position of the welding gun 11 for welding at each welding position included in the welding work, an electrode teaching position set as the position of the first mounting portion 11bc during the welding, and the like.
  • the welding position is the position of the hitting point where the electrode tip 11d is pressed onto the object to be welded.
  • the gun teaching position may include the three-dimensional position and orientation of the welding gun 11.
  • the electrode teaching position is a position relative to the welding gun 11 of the first mounting portion 11bc, and may be the amount of movement of the first mounting portion 11bc.
  • the teaching data 20ga may include the time at each gun teaching position and the time at each electrode teaching position. Further, the teaching data 20ga may include a force applied by the welding gun 11 to the welding object at each gun teaching position, or may include a force applied by the first mounting portion 11bc to the welding object via the electrode tip 11d.
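  • For illustration only, the contents of the teaching data 20ga listed above can be pictured as a simple data structure; the field names and the pose representation are hypothetical, since the disclosure only states what the data may include.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class WeldTeachingPoint:
    """One hitting point in hypothetical teaching data (illustrative sketch only)."""
    gun_pose: List[float]                    # 3D position and posture of the welding gun 11
    electrode_position: float                # position of the first mounting portion 11bc
    gun_pose_time: Optional[float] = None    # time at the gun teaching position
    electrode_time: Optional[float] = None   # time at the electrode teaching position
    gun_force: Optional[float] = None        # force applied by the gun to the welding object
    electrode_force: Optional[float] = None  # force applied via the electrode tip 11d

@dataclass
class TeachingData:
    """Hypothetical container corresponding to the teaching data 20ga."""
    points: List[WeldTeachingPoint] = field(default_factory=list)
```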
  • the storage unit 20g may store the relationship between the three-dimensional position and posture of the welding spot and the three-dimensional position and posture of the welding gun 11 for welding to the spot.
  • the three-dimensional position of the hitting point may be the three-dimensional position of the center of the hitting point.
  • The posture of the hitting point is not particularly limited; it may be, for example, the amount and direction of inclination, with respect to the vertical axis, of the surface formed by the hitting point, the horizontal orientation of a specific point on the hitting point with respect to the center of the hitting point, or the three-dimensional direction of a specific point on the hitting point with respect to the center of the hitting point.
  • the storage unit 20g stores information on the welding gun 11, the electrode tip 11d, the object to be welded, and the imaging device 50.
  • The information on the welding gun 11 includes the distance between the first mounting portion 11bc, retracted to the end portion 11ba of the welding gun 11, and the second mounting portion 11bd, the movable amount of the first mounting portion 11bc, and the like.
  • the information on the electrode tip 11d includes dimensions such as the length of the electrode tip 11d mounted on the first mounting portion 11bc and the second mounting portion 11bd.
  • Information on the object to be welded includes dimensions such as the type, material and thickness of the object to be welded.
  • the information of the image pickup apparatus 50 includes the information of the camera 51 and the fixture 52.
  • the information of the camera 51 includes the camera parameters of the camera 51, and the camera parameters include internal parameters related to the camera 51 itself and external parameters related to the surrounding environment of the camera 51.
  • the information on the fixture 52 may include information on the positional relationship between the connecting portion 52a and the accommodating portion 52b, such as the angle and separation distance between the axial center of the connecting portion 52a and the axial center of the accommodating portion 52b.
  • the above information may be stored in the storage unit 20g by input via the input device 40, respectively.
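  • For illustration, the internal (intrinsic) and external (extrinsic) camera parameters mentioned above are commonly represented as an intrinsic matrix plus a rotation and translation of the camera relative to a reference frame; all numbers below are placeholders, not values from the disclosure.

```python
import numpy as np

# Internal parameters of the camera 51 itself: focal lengths and principal
# point in pixels (placeholder values).
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

# External parameters: rotation R and translation t of the camera with respect
# to its surroundings, e.g. the first mounting portion 11bc (placeholders).
R = np.eye(3)
t = np.zeros(3)

def project_point(p_cam: np.ndarray) -> np.ndarray:
    """Project a 3D point given in camera coordinates to pixel coordinates."""
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]

print(project_point(np.array([0.01, 0.02, 0.5])))  # approx. (336, 272)
```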
  • the image pickup control unit 20a controls the image pickup operation of the camera 51.
  • The image pickup control unit 20a operates in the correction mode and causes the camera 51 of the imaging device 50 mounted on the first mounting portion 11bc of the welding gun 11 to image, at designated timings, the welding object having the dot mark at the hitting point position.
  • the robot 10 performs the same operation as the welding operation according to the teaching data 20ga.
  • the object to be welded is located between the first mounting portion 11bc and the second mounting portion 11bd of the welding gun 11.
  • When the welding gun 11 is positioned, according to the teaching data 20ga, at the gun teaching position for pressing the first mounting portion 11bc against the predetermined hitting point position of the welding object, the image pickup control unit 20a causes the camera 51 to image the dot mark attached to the predetermined hitting point position. Specifically, when the welding gun 11 is located at the gun teaching position, the first mounting portion 11bc is operated in the direction D1 as if to press the electrode tip 11d against the welding object according to the teaching data 20ga, and the image pickup control unit 20a causes the camera 51 to image the dot mark at the timing when the camera approaches the welding object.
  • That timing may be the timing at which the camera 51 is closest to the welding object, or may be a predetermined timing while the first mounting portion 11bc is moving in the direction D1 toward the welding object or in the direction D2 away from the welding object.
  • the image processing device 30 may include an image pickup control unit 20a.
  • The mode determination unit 20b determines the mode to be executed by the robot system 1 from among the automatic operation mode, the manual operation mode, and the correction mode, according to a mode-designating command received via the input device 40, and causes the other functional components to operate according to the determined mode.
  • In the manual operation mode, or when the instructor performs teaching, the manual command generation unit 20c generates an operation command for causing the robot 10 to perform the operation corresponding to the operation information output from the input device 40, and outputs it to the operation control unit 20e.
  • the automatic command generation unit 20d generates an operation command for automatically causing the robot 10 to perform a predetermined welding operation according to the teaching data 20ga in the automatic operation mode and the correction mode, and outputs the operation command to the operation control unit 20e.
  • the automatic command generation unit 20d acquires welding work information via the input device 40, and reads and uses the teaching data 20ga corresponding to the welding work from the storage unit 20g.
  • In the automatic operation mode, the automatic command generation unit 20d generates an operation command for pressing the electrode tip 11d on the first mounting portion 11bc against the welding object to pressurize it, whereas in the correction mode it generates an operation command in which the imaging device 50 on the first mounting portion 11bc is brought close to the welding object without pressurizing it, or is not brought into contact with it at all.
  • the operation command includes commands such as the three-dimensional position and posture of the welding gun 11, the position of the first mounting portion 11bc with respect to the welding gun 11, and the time at each position.
  • the operation command may include commands such as a force applied by the welding gun 11 to the welding object at each position and a force applied by the first mounting portion 11bc to the welding object via the electrode tip 11d.
  • the motion control unit 20e controls the motion of the robot 10 according to the motion command.
  • The arm control unit 20ea of the operation control unit 20e generates commands for operating the servomotors of the arm drive devices M1 to M6 of the robot arm 12 so that the three-dimensional position and posture of the welding gun 11 follow the operation command, and outputs the commands to the arm drive devices M1 to M6.
  • the arm control unit 20ea acquires the rotation amount and drive current of each of the servomotors of the arm drive devices M1 to M6 as feedback information and uses them to generate the above command.
  • The gun control unit 20eb generates a command for operating the servomotor of the movement drive device 11ca of the welding gun 11 so that the position of the first mounting portion 11bc follows the operation command, and outputs the command to the movement drive device 11ca.
  • The gun control unit 20eb acquires the rotation amount and the drive current of the servomotor of the movement drive device 11ca as feedback information and uses them to generate the above command.
  • the correction unit 20f operates in the correction mode and corrects the teaching data 20ga.
  • The correction unit 20f receives, from the image processing device 30, the three-dimensional position and posture of the dot mark on the surface of the welding object detected from the image captured by the camera 51. Based on the three-dimensional position and posture of the dot mark, the correction unit 20f detects the corresponding position, which is the position of the welding gun 11 for actually pressing the electrode tip 11d of the first mounting portion 11bc of the welding gun 11 against the center of the dot mark, that is, the position of the welding gun 11 for actually executing welding at the center of the dot mark.
  • the corresponding position includes the three-dimensional position and posture of the welding gun 11, but may include only the three-dimensional position of the welding gun 11, for example, when the posture of the welding gun 11 is constant.
  • The correction unit 20f may, for example, detect the corresponding position based on the relationship, stored in the storage unit 20g, between the three-dimensional position and posture of a hitting point and the three-dimensional position and posture of the welding gun 11 for welding at that hitting point.
  • The correction unit 20f corrects the teaching data 20ga based on the difference between the corresponding position of the welding gun 11 for actually executing welding at the center of the dot mark and the gun teaching position of the welding gun 11 that was set for executing welding at the center of the dot mark.
  • the correction unit 20f may correct the gun teaching position so as to reduce the difference, or may correct the gun teaching position by replacing it with a corresponding position, for example.
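  • For illustration only, the correction of the gun teaching position by the correction unit 20f can be sketched as below; the 6-element pose representation (x, y, z, roll, pitch, yaw) and the blending gain are assumptions, since the disclosure only states that the difference is reduced or that the teaching position is replaced with the corresponding position.

```python
import numpy as np

def correct_gun_teaching_pose(teaching_pose, corresponding_pose, gain: float = 1.0):
    """Return a corrected gun teaching pose (illustrative sketch).

    teaching_pose, corresponding_pose : assumed 6-vectors (x, y, z, roll, pitch, yaw).
    gain : 1.0 replaces the teaching pose with the corresponding pose;
           values between 0 and 1 only reduce the difference.
    Simple subtraction of orientation angles is adequate only for small differences.
    """
    teaching = np.asarray(teaching_pose, dtype=float)
    corresponding = np.asarray(corresponding_pose, dtype=float)
    difference = corresponding - teaching
    return teaching + gain * difference

# Example: a taught pose 2 mm off in x and 1 degree off in yaw.
taught = [100.0, 50.0, 30.0, 0.0, 0.0, 0.0]
actual = [102.0, 50.0, 30.0, 0.0, 0.0, np.deg2rad(1.0)]
print(correct_gun_teaching_pose(taught, actual))  # equals `actual` for gain=1.0
```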
  • the image processing device 30 includes an extraction unit 30a, a mark position detection unit 30b, and a storage unit 30c as functional components. Not all of the above functional components are required.
  • the functions of the extraction unit 30a and the mark position detection unit 30b are realized by the CPU 301 and the like, and the functions of the storage unit 30c are realized by the memory 304, the ROM 302 and / or the RAM 303.
  • the storage unit 30c stores various information and enables reading of the stored information.
  • the storage unit 30c stores a program for operating the image processing device 30.
  • the storage unit 30c stores the information of the dot mark.
  • the dot mark information may include information on the shape, size, color and texture of the spot weld mark, image data of the spot weld mark, and the like.
  • the dot mark information may include image data of the marking and information on the shape, dimensions, and arrangement of the marking and the figures constituting the marking.
  • the marking of the dot mark has directionality.
  • the marking is composed of a plurality of figures, and the plurality of figures represent the center of the marking and the directionality which is the direction of the marking.
  • FIG. 6 is a diagram showing an example of marking of a dot mark according to an embodiment.
  • the marking M in FIG. 6 is an outer circle Ma, an equilateral triangle Mb inscribed in the outer circle Ma, an inner circle Mc inside the equilateral triangle Mb, and a point Md near the inside of one corner of the equilateral triangle Mb. It is composed.
  • The inner circle Mc represents the center of the marking M and is an example of the center display unit.
  • the equilateral triangle Mb and the point Md represent the direction of the marking M and are an example of the directional display unit.
  • Not only front image data, which is image data of the dot mark captured from the front, but also oblique image data, which is image data of the dot mark captured from various angles, may be stored in the storage unit 30c as the image data of the dot mark. In the oblique image data, the dot mark appears distorted.
  • the storage unit 30c may store information on the welding gun 11, the electrode tip 11d, the welding object W, and the imaging device 50. These pieces of information may be stored in the storage unit 30c by being transmitted from the robot control device 20 to the image processing device 30.
  • the extraction unit 30a detects a dot mark from the subject projected on the image data captured by the camera 51.
  • the extraction unit 30a executes binary conversion or the like on the image data to detect the edge.
  • The extraction unit 30a compares the image data before and after the conversion with the image data of spot weld marks, executes shape pattern matching, color pattern matching, and/or texture pattern matching, and thereby detects the image of a spot weld mark.
  • The extraction unit 30a may detect the angle between the axis 11da of the electrode tip 11d at the first mounting portion 11bc and the optical axis 51a of the camera 51, based on information such as the angle between the axis of the connecting portion 52a of the fixture 52 and the axis of the accommodating portion 52b, and may correct the distortion of the subject in image data captured from an oblique direction based on the detection result.
  • The extraction unit 30a may detect the image of a spot weld mark using the image data after distortion correction.
  • the extraction unit 30a executes a binary conversion or the like on the image data to detect an edge.
  • the extraction unit 30a detects the marking image by comparing the converted image data with the marking image data and performing shape pattern matching.
  • the extraction unit 30a detects the line segment and the arc by executing a Hough transform or the like on the converted image data.
  • The extraction unit 30a detects, in the image data after the Hough transform, the figures included in the marking and figures similar to them, and detects, from among the combinations of the detected figures, the combination that forms the marking as the marking image.
  • The combination of figures includes the shapes and arrangement of the figures.
  • The extraction unit 30a detects similar figures in a manner that takes into account the distortion of the subject in image data captured from an oblique direction.
  • In the case of the marking M, the extraction unit 30a detects a combination of an outermost circle or ellipse, a triangle inside that circle, a circle or ellipse inside the triangle, and a point near the inside of one corner of the triangle.
  • the extraction unit 30a may correct the distortion of the subject in the image data captured from an oblique direction, and detect the marking using the image data after the distortion correction.
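The extraction described in the items above can be illustrated with a minimal sketch. The following assumes the OpenCV library; the function name detect_marking, the threshold values, and the way the detected circles are assigned to Ma, Mc, and Md are illustrative assumptions and not part of the disclosure.

```python
import cv2
import numpy as np

def detect_marking(image_bgr):
    """Rough sketch: binarize / edge-detect the image, then look for the
    circle-triangle-circle-point combination that makes up the marking M.
    All thresholds are placeholders."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.Canny(binary, 50, 150)

    # Circles (outer circle Ma, inner circle Mc, point Md) via a Hough transform.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=5,
                               param1=100, param2=30, minRadius=2, maxRadius=200)
    # Line segments that could form the sides of the equilateral triangle Mb.
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=20, maxLineGap=5)
    if circles is None or lines is None:
        return None

    # Sort the detected circles by radius: the largest is taken as Ma, the
    # next as Mc, and the smallest as the point Md.  An ellipse fit could be
    # used instead when the image was captured from an oblique direction.
    circles = sorted(np.round(circles[0]).astype(int), key=lambda c: c[2],
                     reverse=True)
    if len(circles) < 3:
        return None
    ma, mc, md = circles[0], circles[1], circles[-1]
    return {"outer_px": tuple(ma), "center_px": (mc[0], mc[1]),
            "point_px": (md[0], md[1])}
```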
  • The mark position detection unit 30b detects the three-dimensional position and posture of the dot mark by using the image data captured by the camera 51 and the dot mark information detected in the image data.
  • The mark position detection unit 30b also uses information on the welding gun 11, the electrode tip 11d, the welding object W, and the fixture 52 of the imaging device 50 for this detection.
  • The operation of the mark position detection unit 30b will be described by taking the case of the marking M as an example.
  • FIG. 7 is a side view showing an example of a state at the time of imaging of the dot mark by the camera 51 in the correction mode.
  • FIG. 8 is a diagram showing an example of an image captured by the camera 51 in the state of FIG. 7.
  • In the image Ia in which the marking M appears, the mark position detection unit 30b detects the pixel coordinates of the pixel pMc at the center of the marking M, that is, at the center of the inner circle Mc, and the pixel coordinates of the pixel pMd at the center of the circular point Md.
  • The pixel coordinates are coordinates in the image coordinate system of the image Ia, in units of pixels.
  • The position of the optical axis 51a of the camera 51 corresponds to the pixel pI at the center of the image Ia.
  • Based on the positional relationship between the pixel pMc and the pixel pI, the mark position detection unit 30b calculates the direction of the line of sight LMc from the camera 51 to the center of the inner circle Mc shown in FIG. 7 and the angle θ between the line of sight LMc and the optical axis 51a.
  • The intersection Pd between the axis 11da of the first mounting portion 11bc and the surface of the welding object W is the center of the hitting point position of the electrode tip 11d of the first mounting portion 11bc when welding is performed according to the teaching data 20ga.
  • The mark position detection unit 30b calculates the distance da between the tip of the first mounting portion 11bc and the surface of the welding object W at the time of imaging by the camera 51. Specifically, the mark position detection unit 30b acquires feedback information at the time of imaging by the camera 51 from the gun control unit 20eb of the robot control device 20 and detects the position of the first mounting portion 11bc based on the feedback information. The mark position detection unit 30b then calculates the distance da based on the position of the first mounting portion 11bc, the distance between the first mounting portion 11bc and the second mounting portion 11bd, and the thickness of the welding object W.
  • The mark position detection unit 30b calculates the three-dimensional position of the center of the inner circle Mc based on the protrusion length of the fixture 52 from the first mounting portion 11bc in the direction D1, the direction of the line of sight LMc and its angle θ with respect to the optical axis 51a, and the distance da.
  • The mark position detection unit 30b may acquire the protrusion length from the storage unit 20g of the robot control device 20, or may use the protrusion length stored in advance in the storage unit 30c.
  • The mark position detection unit 30b calculates the three-dimensional position of the center of the point Md in the same manner as the three-dimensional position of the center of the inner circle Mc.
  • The mark position detection unit 30b calculates the three-dimensional position of the center of the marking M and the posture of the marking M in three-dimensional space based on the three-dimensional positions of the center of the inner circle Mc and the center of the point Md.
  • The posture of the marking M may be expressed in any form; for example, it may be the horizontal orientation of the center of the point Md with respect to the center of the marking M, or the three-dimensional orientation of the point Md with respect to the center of the marking M.
  • The mark position detection unit 30b transmits the three-dimensional position and posture of the center of the marking M to the correction unit 20f of the robot control device 20.
  • Even when the dot mark is a spot weld mark, the mark position detection unit 30b detects the three-dimensional position and posture of the spot weld mark in the same manner as for the marking.
  • The mark position detection unit 30b can detect the posture of the spot weld mark by detecting the three-dimensional position of the center of the spot weld mark and the three-dimensional position of at least part of the outer periphery of the spot weld mark.
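The geometric computation performed by the mark position detection unit 30b can be sketched as follows under a simple pinhole-camera assumption. The parameter names (focal_px, plane_point, plane_normal, and so on) are illustrative; in the actual system they would come from the calibrated camera 51, the distance da, and the geometry of the fixture 52 described above.

```python
import math

def mark_center_3d(p_mark_px, p_center_px, focal_px, cam_origin, cam_axes,
                   plane_point, plane_normal):
    """Sketch of the computation in the mark position detection unit 30b.

    p_mark_px    : (u, v) pixel of the marking centre (pixel pMc)
    p_center_px  : (u, v) pixel on the optical axis 51a (pixel pI)
    focal_px     : focal length of the camera 51 expressed in pixels
    cam_origin   : position of the camera 51 in robot coordinates
    cam_axes     : (forward, right, up) unit vectors of the camera frame
    plane_point  : a point on the surface of the welding object W
                   (e.g. derived from the distance da and the fixture geometry)
    plane_normal : unit normal of that surface
    Returns the 3-D position of the marking centre and the angle theta between
    the line of sight LMc and the optical axis 51a."""
    forward, right, up = cam_axes
    du = p_mark_px[0] - p_center_px[0]
    dv = p_mark_px[1] - p_center_px[1]

    # Direction of the line of sight LMc in robot coordinates (pinhole model).
    sight = [forward[i] * focal_px + right[i] * du + up[i] * dv for i in range(3)]
    norm = math.sqrt(sum(c * c for c in sight))
    sight = [c / norm for c in sight]

    # Angle theta between the line of sight LMc and the optical axis 51a.
    theta = math.acos(max(-1.0, min(1.0, sum(f * s for f, s in zip(forward, sight)))))

    # Intersect the line of sight with the workpiece surface.
    denom = sum(n * s for n, s in zip(plane_normal, sight))
    t = sum(n * (p - o) for n, p, o in zip(plane_normal, plane_point, cam_origin)) / denom
    return [cam_origin[i] + t * sight[i] for i in range(3)], theta
```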
  • FIG. 9 is a flowchart showing an example of the operation of the robot system 1 according to the embodiment in the correction mode.
  • In step S1, the electrode tip 11d of the first mounting portion 11bc of the welding gun 11 is replaced with the imaging device 50 by the user. That is, the fixture 52 of the imaging device 50 is attached to the first mounting portion 11bc.
  • In step S2, a command to execute the correction mode is input to the input device 40 by the user and accepted by the robot control device 20.
  • In step S3, the robot control device 20 causes the robot 10 to operate automatically according to the teaching data 20ga in the storage unit 20g.
  • In step S4, the robot control device 20 causes the robot 10 to move the welding gun 11 to the hitting point position, among the plurality of hitting point positions included in the teaching data 20ga, at which the welding gun 11 should be placed next.
  • In step S5, the robot control device 20 causes the robot 10 to adjust the position and posture of the welding gun 11 with respect to the hitting point position. Specifically, the robot control device 20 adjusts the posture of the welding gun 11 so that the axis 11da of the first mounting portion 11bc is perpendicular to the surface of the welding object W at the hitting point position. Further, the robot control device 20 moves the welding gun 11 in the direction D2 with respect to the welding object W so that the electrode tip 11d of the second mounting portion 11bd is brought into contact with the welding object W. The robot control device 20 detects this contact based on the detection signal of the contact sensor 11e.
  • In step S6, the robot control device 20 causes the moving device 11c of the welding gun 11 to perform the same operation as during welding, that is, the welding operation. Specifically, the robot control device 20 causes the moving device 11c to move the first mounting portion 11bc in the direction D1, bring it closest to the welding object W, and then move it in the direction D2 away from the welding object W. At this time, the robot control device 20 does not press the imaging device 50 against the welding object W to apply pressure as in the automatic operation mode; it brings the imaging device 50 into contact with the welding object W without applying pressure, or does not bring it into contact at all.
  • In step S7, the robot control device 20 causes the camera 51 to capture an image at a predetermined timing while the first mounting portion 11bc approaches the welding object W or while the first mounting portion 11bc moves away from the welding object W.
  • The robot control device 20 may or may not temporarily stop the first mounting portion 11bc at the time of imaging.
  • The camera 51 transmits the signal of the captured image, in association with the information on the hitting point position, to the image processing device 30, where it is stored as image data in the storage unit 30c; the signal may instead be transmitted to the robot control device 20 and stored in the storage unit 20g.
  • In step S8, the robot control device 20 determines whether or not the welding operation has been completed for all the hitting point positions included in the teaching data 20ga.
  • The robot control device 20 proceeds to step S9 when it has been completed (Yes in step S8), and proceeds to step S4 when it has not been completed (No in step S8).
  • In step S9, the image processing device 30 detects the dot mark appearing in the image data by processing the image data captured at each hitting point position.
  • In step S10, the image processing device 30 detects the three-dimensional position and posture of the dot mark using the image data at each hitting point position and the dot mark information detected in the image data.
  • The image processing device 30 associates the information on each hitting point position with the three-dimensional position and posture of the dot mark at that hitting point position and transmits them to the robot control device 20.
  • In step S11, for each hitting point position, the robot control device 20 detects the corresponding position of the welding gun 11 for actually executing welding at the center of the dot mark, based on the three-dimensional position and posture of the dot mark.
  • In step S12, the robot control device 20 corrects the teaching data 20ga based on the difference between the teaching position of the welding gun 11 set for executing welding at each hitting point position and the corresponding position of the welding gun 11 at that hitting point position.
  • In this way, the robot system 1 can automatically image the dot marks corresponding to the respective hitting point positions in order and correct the teaching data 20ga.
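The flow of steps S4 to S12 can be summarized in a short sketch. The objects robot, gun, camera, and image_proc and their methods are hypothetical stand-ins for the robot control device 20, the welding gun 11, the camera 51, and the image processing device 30; this is an outline of the control logic of FIG. 9, not an actual API.

```python
def run_correction_mode(teaching_data, robot, gun, camera, image_proc):
    """Sketch of the FIG. 9 flow: image every hitting point position, then
    detect each dot mark and correct the stored teaching pose (steps S4-S12)."""
    captured = {}
    for point in teaching_data.hitting_points:
        robot.move_gun_to(point.teaching_pose)                   # S4
        robot.align_gun_axis_normal_to_work(point)               # S5
        gun.touch_work_with_second_tip()                         # S5, contact sensor 11e
        gun.run_welding_motion_without_pressure()                # S6
        captured[point.id] = camera.capture()                    # S7
        # S8: the loop itself covers every hitting point position.

    for point in teaching_data.hitting_points:
        mark = image_proc.detect_dot_mark(captured[point.id])                # S9
        mark_pose = image_proc.estimate_mark_pose(mark, captured[point.id])  # S10
        corresponding_pose = robot.gun_pose_for_mark(mark_pose)              # S11
        # S12: correcting by the difference between the corresponding position
        # and the teaching position amounts to adopting the corresponding pose.
        point.teaching_pose = corresponding_pose
    return teaching_data
```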
  • In a modified example, the configuration of the imaging device 50A that can be attached to and detached from the welding gun 11 differs from that of the embodiment.
  • The fixture 52A of the imaging device 50A according to the modified example offsets the direction of the optical axis 51a from the first mounting portion 11bc so that the direction of the optical axis 51a of the camera 51 is parallel to the direction D1; specifically, it attaches the camera 51 to the first mounting portion 11bc with the optical axis 51a offset from the operation path of the first mounting portion 11bc in the directions D1 and D2.
  • The present modification will be described focusing on the points that differ from the embodiment, and description of the points that are the same as in the embodiment will be omitted as appropriate.
  • FIG. 10 is a side view showing an example of the configuration of the imaging device 50A for the welding gun 11 according to the modified example.
  • The imaging device 50A includes the camera 51 and the fixture 52A.
  • The fixture 52A integrally includes a cylindrical connecting portion 52Aa and a cylindrical accommodating portion 52Ab.
  • The axis of the accommodating portion 52Ab is parallel to the axis of the connecting portion 52Aa. Therefore, the optical axis 51a of the camera 51 mounted on the first mounting portion 11bc via the fixture 52A is parallel to the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc.
  • The axis of the accommodating portion 52Ab is located away from the axis of the connecting portion 52Aa in a direction perpendicular to the axis of the connecting portion 52Aa. That is, the optical axis 51a and the axis 11da are separated from each other.
  • The accommodating portion 52Ab extends from the connecting portion 52Aa in the direction D2.
  • The accommodating portion 52Ab can be arranged so as to prevent interference with the main body portion 11b of the welding gun 11 and the moving device 11c, and so as to keep its amount of protrusion from the connecting portion 52Aa in the direction D1 small or to prevent it from protruding from the connecting portion 52Aa at all.
  • The camera 51 mounted on the first mounting portion 11bc via the fixture 52A as described above can generate an image of the dot mark with little distortion when the dot mark on the welding object W is imaged in the correction mode. This makes it possible to simplify the image processing in the image processing device 30.
  • In the modified example, the axis of the accommodating portion 52Ab is located away from the axis of the connecting portion 52Aa, but the configuration is not limited to this, and it may be coaxial with the axis of the connecting portion 52Aa.
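With the parallel-offset fixture 52A, relating a mark position measured in the camera frame to the axis 11da of the first mounting portion 11bc reduces to subtracting a fixed, known offset. A minimal sketch, assuming the offset of the optical axis 51a from the axis 11da is known from the geometry of the fixture 52A (the numbers in the example are placeholders):

```python
def mark_relative_to_electrode_axis(mark_in_camera, axis_offset):
    """mark_in_camera: (x, y, z) of the dot mark in the frame of the camera 51,
    whose z axis is the optical axis 51a (parallel to the direction D1 here).
    axis_offset: (x, y, z) of a point on the axis 11da, expressed in the same
    camera frame; this offset is fixed by the geometry of the fixture 52A.
    Returns the mark position relative to the electrode axis 11da."""
    return tuple(m - o for m, o in zip(mark_in_camera, axis_offset))


# Example with placeholder numbers: the optical axis 51a is offset 40 mm
# sideways from the axis 11da, and the mark lies 120 mm from the camera.
print(mark_relative_to_electrode_axis((2.0, -1.5, 120.0), (40.0, 0.0, 0.0)))
```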
  • In the embodiment and the modified example, the robot control device 20 causes the robot 10 to move the welding gun 11 to each hitting point position according to the teaching data 20ga, further moves the first mounting portion 11bc toward the welding object W, and causes the camera 51 to capture an image, but the method is not limited to this.
  • For example, the robot control device 20 may cause the robot 10 to move the welding gun 11 to each hitting point position according to the teaching data 20ga and cause the camera 51 to capture the image without moving the first mounting portion 11bc.
  • Alternatively, the robot control device 20 may move the first mounting portion 11bc in the direction D1 toward the welding object W, stop the first mounting portion 11bc at a position before the electrode tip 11d would come into contact with the welding object W, that is, at a position on the direction D2 side of the contact position, and then pull it back in the direction D2.
  • The robot control device 20 may cause the camera 51 to capture the image at the stop position, or may cause it to capture the image in the course of this movement.
  • In the correction mode, the robot control device 20 brings the electrode tip 11d of the second mounting portion 11bd of the welding gun 11 into contact with the welding object W before performing imaging with the camera 51, but the configuration is not limited to this, and this contact need not be made.
  • For example, the image processing device 30 may detect the distance between the camera 51 and the welding object W by processing the image data captured by the camera 51.
  • In the embodiment and the modified example, the fixtures 52 and 52A of the imaging devices 50 and 50A are configured to be attached to the first mounting portion 11bc of the welding gun 11 in place of the electrode tip 11d.
  • However, the fixtures 52 and 52A may be configured to be attached to the first mounting portion 11bc at a position different from that of the electrode tip 11d.
  • The fixtures 52 and 52A may be configured so that they can be attached to the first mounting portion 11bc even while the electrode tip 11d is attached to the first mounting portion 11bc. That is, the fixtures 52 and 52A may be configured to be attached to the first mounting portion 11bc together with the electrode tip 11d.
  • For example, the fixtures 52 and 52A may be attached to the first mounting portion 11bc laterally with respect to the direction D1.
  • In the embodiment and the modified example, the robot control device 20 and the image processing device 30 are separate devices, but the configuration is not limited to this, and they may be included in a single device. Further, each of the robot control device 20 and the image processing device 30 may be composed of two or more devices.
  • The technique of the present disclosure may be a correction method, or may be a control device that executes the above correction method.
  • For example, a correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, and includes: moving a robot gun of the robot to a teaching position for pressing a first mounting portion, of mutually opposing first and second mounting portions of the robot gun, against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion according to the teaching data; causing a camera attached to the first mounting portion to image a dot mark attached to the predetermined hitting point position when the robot gun is located at the teaching position; detecting the position of the dot mark using the image captured by the camera; detecting a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the dot mark; and correcting the teaching data based on the difference between the corresponding position and the teaching position. The camera is attached to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion, which is operable in a first direction.
  • Such a correction method may be realized by a circuit such as a CPU or an LSI, by an IC card, by a stand-alone module, or the like.
  • The technique of the present disclosure may be a program for causing a computer to execute the above correction method, or may be a non-transitory computer-readable recording medium on which such a program is recorded. Needless to say, the above program can be distributed via a transmission medium such as the Internet.
  • The numbers used above, such as ordinal numbers and quantities, are all examples for concretely explaining the technique of the present disclosure, and the present disclosure is not limited to the illustrated numbers.
  • The connection relationships between the components are illustrated for the purpose of concretely explaining the technique of the present disclosure, and the connection relationships that realize the functions of the present disclosure are not limited thereto.
  • The division of blocks in the functional block diagram is an example; a plurality of blocks may be realized as one block, one block may be divided into a plurality of blocks, and/or some functions may be transferred to another block.
  • A single piece of hardware or software may process the functions of a plurality of blocks having similar functions in parallel or in a time-division manner.

Abstract

This correction system (2) is provided with: a camera (51) which is mounted on a first mounting unit (11bc), operable in a first direction, of a robot gun (11) of a robot (10); a fitting tool (52) by which the camera is fitted to the first mounting unit so that the optical axis direction of the camera is offset from the first mounting unit; and a control device (3). When the robot gun is positioned, according to teaching data, at a teaching position for pressing the first mounting unit against a prescribed hitting point position of a workpiece, the control device causes the camera to capture an image of a hitting point mark at the prescribed hitting point position, detects the position of the hitting point mark by using the captured image, detects a corresponding position of the robot gun for pressing the first mounting unit against the hitting point mark, and corrects the teaching data on the basis of the difference between the corresponding position and the teaching position.

Description

Correction system, correction method, robot system, and control device
Cross-reference to related applications
This application claims the priority of Japanese Patent Application No. 2019-186010 filed with the Japan Patent Office on October 9, 2019, the entirety of which is incorporated herein by reference.
This disclosure relates to a correction system, a correction method, a robot system, and a control device.
Conventionally, techniques for automatically correcting the teaching data of a robot are known. For example, Patent Document 1 discloses a teaching position correction system for a welding robot. The teaching position correction system comprises an imaging device that is attached to, or exchanged with, one of the two opposed electrodes of a welding gun. The optical axis of the imaging device is coaxial with the axis of that electrode. The teaching position correction system corrects the teaching position of the welding gun so that the imaging device can image the welding point, based on the position information of the welding point of the work in the image captured by the imaging device, the distance from the imaging device to the welding point, and the clearance between the work and the other electrode.
Patent Document 1: Japanese Unexamined Patent Publication No. 2008-132525
In the system of Patent Document 1, the imaging device is attached to, or exchanged with, one electrode of the welding gun. The length of the imaging device is generally greater than that of the electrode. If, for example, the electrodes of the welding gun are operated according to the teaching data so as to close the gap between the electrodes, the imaging device may be crushed. For this reason, imaging with the imaging device is performed, for example, with the gap between the electrodes left open. In this case, the distance between the imaging device and the welding point becomes large, the accuracy of the welding point position information obtained by processing the image of the welding point becomes low, and the correction accuracy of the teaching position may therefore become low.
Accordingly, an object of the present disclosure is to provide a correction system, a correction method, a robot system, and a control device that improve the correction accuracy of teaching data.
In order to achieve the above object, a correction system according to one aspect of the present disclosure is a correction system that corrects teaching data of a robot, and includes: a camera attached to a first mounting portion, of mutually opposing first and second mounting portions of a robot gun of the robot, that is operable in a first direction; a fixture that attaches the camera to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion; and a control device. When the robot gun is located at a teaching position for pressing the first mounting portion against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion according to the teaching data, the control device causes the camera to image a dot mark attached to the predetermined hitting point position, detects the position of the dot mark using the image captured by the camera, detects a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the dot mark, and corrects the teaching data based on the difference between the corresponding position and the teaching position.
A robot system according to one aspect of the present disclosure includes the correction system according to the above aspect and the robot, and the control device controls the operation of the robot.
A correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, and includes: moving a robot gun of the robot to a teaching position for pressing a first mounting portion, of mutually opposing first and second mounting portions of the robot gun, against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion according to the teaching data; causing a camera attached to the first mounting portion to image a dot mark attached to the predetermined hitting point position when the robot gun is located at the teaching position; detecting the position of the dot mark using the image captured by the camera; detecting a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the dot mark; and correcting the teaching data based on the difference between the corresponding position and the teaching position. The camera is attached to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion, which is operable in a first direction.
A control device according to one aspect of the present disclosure is a control device that executes the correction method according to the above aspect.
According to the technique of the present disclosure, it is possible to improve the correction accuracy of teaching data.
FIG. 1 is a schematic view showing an example of a robot system according to an embodiment.
FIG. 2 is a side view showing an example of the configuration of a welding gun according to the embodiment.
FIG. 3 is a side view showing an example of a configuration in which an imaging device is mounted in place of an electrode tip in the welding gun of FIG. 2.
FIG. 4 is a block diagram showing an example of the hardware configuration of the robot system according to the embodiment.
FIG. 5 is a block diagram showing an example of the functional configuration of the robot system according to the embodiment.
FIG. 6 is a diagram showing an example of the marking of a dot mark according to the embodiment.
FIG. 7 is a side view showing an example of a state at the time of imaging of a dot mark by the camera in the correction mode.
FIG. 8 is a diagram showing an example of an image captured by the camera in the state of FIG. 7.
FIG. 9 is a flowchart showing an example of the operation of the robot system according to the embodiment in the correction mode.
FIG. 10 is a side view showing an example of the configuration of an imaging device for the welding gun according to a modified example.
First, examples of aspects of the present disclosure will be described. A correction system according to one aspect of the present disclosure is a correction system that corrects teaching data of a robot, and includes: a camera attached to a first mounting portion, of mutually opposing first and second mounting portions of a robot gun of the robot, that is operable in a first direction; a fixture that attaches the camera to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion; and a control device. When the robot gun is located at a teaching position for pressing the first mounting portion against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion according to the teaching data, the control device causes the camera to image a dot mark attached to the predetermined hitting point position, detects the position of the dot mark using the image captured by the camera, detects a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the dot mark, and corrects the teaching data based on the difference between the corresponding position and the teaching position.
According to the above aspect, since the camera is attached to the first mounting portion such that the direction of its optical axis is offset, the amount by which the camera protrudes from the first mounting portion in the first direction can be suppressed. As a result, when imaging with the camera, the camera can be brought close to the dot mark without being crushed between the first mounting portion and the work even if, for example, the first mounting portion is operated in the first direction according to the teaching data. The camera can therefore image the dot mark while the first mounting portion is moved toward the hitting point position, and can capture an image that represents the relative position of the dot mark with high accuracy and high image quality. Accordingly, the detection accuracy of the corresponding position and the correction accuracy of the teaching data can be improved.
In the correction system according to one aspect of the present disclosure, when the robot gun is located at the teaching position, the control device may operate the first mounting portion in the first direction toward the work according to the teaching data and cause the camera to image the dot mark while the camera is close to the work.
According to the above aspect, the camera can capture an image that represents the relative position of the dot mark with high accuracy and high image quality, and the correction accuracy of the teaching data can be improved.
In the correction system according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the fixture may be attached to the first mounting portion in place of the electrode.
According to the above aspect, since the camera is attached to the first mounting portion in place of the electrode, the camera can image the portion of the work that the electrode contacts. Such a camera can capture an image that represents, with high accuracy and high image quality, the relative position of the dot mark corresponding to the portion contacted by the electrode.
In the correction system according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the fixture may be configured so that it can be attached to the first mounting portion while the electrode is attached to the first mounting portion.
According to the above configuration, the camera can capture, for example, an image of a state in which the first mounting portion has been moved in the first direction and the electrode is in contact with the work. Since such an image shows the actual hitting point position of the electrode and the dot mark together, it enables highly accurate and simple detection of the relative position between the actual hitting point position and the dot mark.
In the correction system according to one aspect of the present disclosure, the length by which the fixture and the camera attached to the first mounting portion protrude from the first mounting portion in the first direction may be less than or equal to the length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
According to the above aspect, when the first mounting portion is operated in the first direction according to the teaching data, the camera and the fixture approach the work about as closely as the electrode would, or less, while the camera and the fixture are prevented from being pressed against the work and damaged.
In the correction system according to one aspect of the present disclosure, the fixture may offset the direction of the optical axis of the camera so that the direction of the optical axis of the camera intersects the first direction.
According to the above aspect, the length occupied by the camera in the first direction can be kept small, which saves space for mounting the camera on the first mounting portion.
In the correction system according to one aspect of the present disclosure, the fixture may offset the direction of the optical axis of the camera so that the direction of the optical axis of the camera is parallel to the first direction.
According to the above aspect, the distortion of the dot mark captured by the camera is suppressed, which makes it possible to simplify the image processing for detecting the dot mark.
In the correction system according to one aspect of the present disclosure, the teaching position may include the three-dimensional position and posture of the robot gun at the teaching position, the corresponding position may include the three-dimensional position and posture of the robot gun at the corresponding position, and the control device may correct the teaching data based on the difference between the three-dimensional position and posture of the robot gun at the corresponding position and the three-dimensional position and posture of the robot gun at the teaching position.
According to the above aspect, the accuracy of the correction of the teaching data is improved.
In the correction system according to one aspect of the present disclosure, the control device may move the robot gun at the teaching position so as to press the second mounting portion against the work before imaging with the camera.
According to the above aspect, the position of the work between the first mounting portion and the second mounting portion is kept constant. The process of detecting the positions of the work and the dot mark in the direction from the first mounting portion toward the second mounting portion can therefore be simplified.
In the correction system according to one aspect of the present disclosure, the dot mark may be a marking including a center display unit that indicates a center and a directional display unit that indicates an orientation in rotation around the center.
According to the above aspect, the three-dimensional position and posture of the robot gun can be detected as the corresponding position of the robot gun with respect to the dot mark. Correcting the teaching data using such a corresponding position enables highly accurate correction.
A robot system according to one aspect of the present disclosure includes the correction system according to one aspect of the present disclosure and the robot, and the control device controls the operation of the robot. According to this aspect, the same effects as those of the correction system according to one aspect of the present disclosure are obtained.
A correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, and includes: moving a robot gun of the robot to a teaching position for pressing a first mounting portion, of mutually opposing first and second mounting portions of the robot gun, against a predetermined hitting point position of a work between the first mounting portion and the second mounting portion according to the teaching data; causing a camera attached to the first mounting portion to image a dot mark attached to the predetermined hitting point position when the robot gun is located at the teaching position; detecting the position of the dot mark using the image captured by the camera; detecting a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the dot mark; and correcting the teaching data based on the difference between the corresponding position and the teaching position. The camera is attached to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion, which is operable in a first direction. According to this aspect, the same effects as those of the correction system according to one aspect of the present disclosure are obtained.
The correction method according to one aspect of the present disclosure may further include operating the first mounting portion in the first direction toward the work according to the teaching data when the robot gun is located at the teaching position, and the imaging of the dot mark by the camera may be performed while the camera is close to the work.
In the correction method according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the camera may be attached to the first mounting portion in place of the electrode.
In the correction method according to one aspect of the present disclosure, electrodes for welding may be attachable to and detachable from the first mounting portion and the second mounting portion, and the camera may be configured so that it can be attached to the first mounting portion while the electrode is attached to the first mounting portion.
In the correction method according to one aspect of the present disclosure, the length by which the camera attached to the first mounting portion protrudes from the first mounting portion in the first direction may be less than or equal to the length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
In the correction method according to one aspect of the present disclosure, the direction of the optical axis of the camera may be offset so that it intersects the first direction.
In the correction method according to one aspect of the present disclosure, the direction of the optical axis of the camera may be offset so that it is parallel to the first direction.
In the correction method according to one aspect of the present disclosure, the teaching data may be corrected based on the difference between the three-dimensional position and posture of the robot gun at the corresponding position and the three-dimensional position and posture of the robot gun at the teaching position, where the teaching position includes the three-dimensional position and posture of the robot gun at the teaching position and the corresponding position includes the three-dimensional position and posture of the robot gun at the corresponding position.
The correction method according to one aspect of the present disclosure may further include moving the robot gun at the teaching position so as to press the second mounting portion against the work before imaging with the camera.
In the correction method according to one aspect of the present disclosure, the dot mark may be a marking including a center display unit that indicates a center and a directional display unit that indicates an orientation in rotation around the center.
A control device according to one aspect of the present disclosure is a control device that executes the correction method according to one aspect of the present disclosure. According to this aspect, the same effects as those of the correction method according to one aspect of the present disclosure are obtained.
(Embodiment)
Hereinafter, an embodiment of the present disclosure will be described with reference to the drawings. The embodiment described below shows a comprehensive or specific example. Among the components in the following embodiment, components not described in the independent claims indicating the broadest concept are described as optional components. Each figure in the accompanying drawings is a schematic diagram and is not necessarily drawn precisely. In each figure, substantially identical components are denoted by the same reference numerals, and duplicate description may be omitted or simplified. In the present specification and claims, a "device" can mean not only a single device but also a system composed of a plurality of devices.
[Robot system configuration]
The configuration of the robot system 1 according to the embodiment will be described. FIG. 1 is a schematic view showing an example of the robot system 1 according to the embodiment. As shown in FIG. 1, the robot system 1 according to the embodiment includes a robot 10, a robot control device 20, an image processing device 30, an input device 40, and an imaging device 50. The robot control device 20, the image processing device 30, and the imaging device 50 constitute a teaching data correction system 2. The robot control device 20 and the image processing device 30 constitute a control device 3.
The robot system 1 can cause the robot 10 to operate automatically according to a taught operation procedure and execute a predetermined work. The robot system 1 can also cause the robot 10 to operate manually according to operation information input via the input device 40 and execute a work. The robot system 1 can further execute a process of automatically correcting teaching data, which is the data of the taught operation procedure. The robot system 1 operates by selecting one of an automatic operation mode in which the robot 10 is operated automatically, a manual operation mode in which the robot 10 is operated manually, and a correction mode in which the teaching data is corrected automatically. In the present embodiment, the work executed by the robot 10 is welding work, for example, spot welding work. The work executed by the robot 10 may be welding work other than spot welding, or work other than welding. Such work may be work that involves positioning a movable part of the robot 10 with respect to an object, such as drilling, screw fastening, or sealing.
The robot 10 described above is an industrial robot. The robot 10 includes an end effector 11 that actually welds the welding points of a welding object W, which is an example of a work, and a robot arm 12 that moves the end effector 11 to the welding points. For example, the end effector 11 is a welding gun, which is an example of a robot gun. Hereinafter, the "end effector 11" is also referred to as the "welding gun 11". The welding object W is, for example, two thin plate-like objects laid one on the other.
The configuration of the robot arm 12 is not particularly limited as long as it can change the position and posture of the welding gun 11 at its tip; in the present embodiment, the robot arm 12 is a vertical articulated robot arm. The robot arm 12 may instead be configured as, for example, a horizontal articulated, polar coordinate, cylindrical coordinate, rectangular coordinate, or other type of robot arm.
The robot arm 12 is fixed to an installation surface such as a floor, but it may be arranged on a carrier vehicle or the like and be movable. The robot arm 12 includes links 12a to 12f arranged in order from its base toward its tip, joints JT1 to JT6 that sequentially connect the links 12a to 12f, and arm drive devices M1 to M6 that rotationally drive the joints JT1 to JT6, respectively. The operations of the arm drive devices M1 to M6 are controlled by the robot control device 20. Each of the arm drive devices M1 to M6 uses electric power as a power source and has a servomotor as the electric motor that drives it, but they are not limited to this. The number of joints of the robot arm 12 is not limited to six, and may be seven or more, or one or more and five or less.
The joint JT1 rotatably connects the installation surface of the robot arm 12 and the base end of the link 12a around an axis in the vertical direction, perpendicular to the installation surface. The joint JT2 rotatably connects the tip end of the link 12a and the base end of the link 12b around an axis in the horizontal direction, parallel to the installation surface. The joint JT3 rotatably connects the tip end of the link 12b and the base end of the link 12c around a horizontal axis. The joint JT4 rotatably connects the tip end of the link 12c and the base end of the link 12d around the longitudinal axis of the link 12c. The joint JT5 rotatably connects the tip end of the link 12d and the base end of the link 12e around an axis orthogonal to the longitudinal direction of the link 12d. The joint JT6 connects the tip end of the link 12e and the base end of the link 12f so that the link 12f can twist and rotate with respect to the link 12e. The tip end of the link 12f constitutes a mechanical interface and is connected to the welding gun 11.
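As an aid to understanding the serial structure of the joints JT1 to JT6 described above, the following is a rough forward-kinematics sketch that chains one homogeneous transform per joint. The link lengths and the exact axis conventions are hypothetical, since the disclosure does not specify them; the sketch only illustrates how the pose of the mechanical interface at the tip of the link 12f follows from the six joint angles.

```python
import numpy as np

def rot_z(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def rot_y(q):
    c, s = np.cos(q), np.sin(q)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]])

def trans(x, y, z):
    t = np.eye(4)
    t[:3, 3] = [x, y, z]
    return t

def gun_mount_pose(q, lengths):
    """Pose of the mechanical interface at the tip of the link 12f for joint
    angles q = (JT1..JT6). Here JT1, JT4 and JT6 rotate about the local z axis
    and JT2, JT3 and JT5 about the local y axis, roughly matching the joint
    description above; `lengths` are hypothetical link lengths."""
    l1, l2, l3, l4 = lengths
    T = rot_z(q[0]) @ trans(0, 0, l1)          # JT1: vertical axis
    T = T @ rot_y(q[1]) @ trans(0, 0, l2)      # JT2: horizontal axis
    T = T @ rot_y(q[2]) @ trans(0, 0, l3)      # JT3: horizontal axis
    T = T @ rot_z(q[3]) @ trans(0, 0, l4)      # JT4: twist about link 12c
    T = T @ rot_y(q[4])                        # JT5: orthogonal to link 12d
    T = T @ rot_z(q[5])                        # JT6: twist of link 12f
    return T
```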
FIG. 2 is a side view showing an example of the configuration of the welding gun 11 according to the embodiment. FIG. 3 is a side view showing an example of a configuration in which the imaging device 50 is mounted in place of the electrode tip 11d in the welding gun 11 of FIG. 2. As shown in FIGS. 2 and 3, the welding gun 11 is detachably attached to the tip of the link 12f. The welding gun 11 includes a mounting portion 11a, a main body portion 11b, and a moving device 11c. The mounting portion 11a is configured to be connected to the mechanical interface of the link 12f and supports the main body portion 11b. The main body portion 11b is composed of a U-shaped member and is connected to the mounting portion 11a. In the present embodiment, the main body portion 11b is made of the same material as the mounting portion 11a and is integrated with the mounting portion 11a. The main body portion 11b is connected to the mounting portion 11a in the vicinity of the end portion 11ba, of the U-shaped end portions 11ba and 11bb. The main body portion 11b has a movable first mounting portion 11bc at the end portion 11ba and a second mounting portion 11bd fixed to the main body portion 11b at the end portion 11bb. The mounting portions 11bc and 11bd are arranged to face each other in a direction D1, and the first mounting portion 11bc can move in the direction D1, approaching the second mounting portion 11bd, and in a direction D2, moving away from it. The directions D1 and D2 are opposite to each other. The direction D1 is an example of the first direction.
The moving device 11c is arranged at the end portion 11ba and moves the first mounting portion 11bc in the directions D1 and D2. The moving device 11c includes a movement drive device 11ca and a movement drive mechanism 11cb. The movement drive device 11ca drives the movement drive mechanism 11cb. The movement drive device 11ca uses electric power as a power source and has a servomotor as an electric motor, but it is not limited to this. The operation of the movement drive device 11ca is controlled by the robot control device 20.
The movement drive mechanism 11cb transmits the driving force of the movement drive device 11ca to the first mounting portion 11bc and moves the first mounting portion 11bc in the directions D1 and D2. The movement drive mechanism 11cb converts the rotational driving force of the movement drive device 11ca into a linear driving force and transmits it to the first mounting portion 11bc. The movement drive mechanism 11cb has, for example, a ball screw structure, in which a nut is rotationally driven by the movement drive device 11ca to move a rod-shaped screw connected to the first mounting portion 11bc in the axial directions D1 and D2. The movement drive device 11ca is not limited to an electric motor, and may be, for example, a hydraulic or pneumatic piston, an electric linear actuator, or the like. The movement drive device 11ca and the movement drive mechanism 11cb only need to be configured to move the first mounting portion 11bc in the directions D1 and D2.
The second mounting portion 11bd is configured so that an electrode tip 11d, which is an example of an electrode for welding, can be attached to and detached from it. For example, the electrode tip 11d may be configured to be attached by being inserted into a hole in the second mounting portion 11bd. In the present embodiment, the shape of the electrode tip 11d is a cylindrical shape with a hemispherical tip, but it is not limited to this. The welding gun 11 includes a contact sensor 11e that detects contact between the electrode tip 11d mounted on the second mounting portion 11bd and the welding object W. The contact sensor 11e transmits a detection signal indicating contact between the electrode tip 11d and the welding object W to the robot control device 20. The configuration of the contact sensor 11e is not particularly limited as long as it can detect the contact; in the present embodiment, a weak current is applied to the conductive electrode tip 11d of the second mounting portion 11bd, and a signal representing the change in the current at the time of contact with the conductive welding object W is transmitted as the detection signal.
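The contact detection performed by the contact sensor 11e can be illustrated with a small sketch: a weak current is monitored, and contact is reported when the measured value deviates from the no-contact baseline. The threshold and the sample values are placeholders, as the disclosure does not specify the actual circuit or its parameters.

```python
def detect_contact(current_samples, baseline, threshold=0.05):
    """Return the index of the first sample whose current deviates from the
    no-contact baseline by more than `threshold`, or None if no contact occurs.
    `current_samples` is an iterable of measured current values (placeholders)."""
    for i, value in enumerate(current_samples):
        if abs(value - baseline) > threshold:
            return i
    return None


# Example: the current jumps at index 3, which is reported as the contact sample.
print(detect_contact([0.00, 0.01, 0.00, 0.40, 0.42], baseline=0.0))
```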
The first mounting portion 11bc is configured so that the electrode tip 11d can be attached and detached. Furthermore, the first mounting portion 11bc is configured so that the imaging device 50 can be attached and detached in place of the electrode tip 11d; the electrode tip 11d and the imaging device 50 can thus be interchanged on the first mounting portion 11bc. For example, the electrode tip 11d and the imaging device 50 may be configured to be attached by being inserted into a hole of the first mounting portion 11bc.
The imaging device 50 has a camera 51 and a fixture 52. The camera 51 is a small camera that captures digital images. Examples of the camera 51 include cameras having an image sensor such as a CMOS (Complementary Metal-Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor. The operation of the camera 51 is controlled by the robot control device 20, and the camera 51 transmits signals of captured images to the robot control device 20 and/or the image processing device 30. In the present embodiment, the camera 51 is a monocular camera, but is not limited to this. For example, the camera 51 may have a configuration for capturing images used to detect the position of a subject, such as a compound-eye camera, a TOF (Time-of-Flight) camera, a pattern-light projection camera such as a fringe projection camera, or a camera using a light-section method.
The fixture 52 attaches the camera 51 to the first mounting portion 11bc. The fixture 52 holds the camera 51 and is configured to be detachably mounted on the first mounting portion 11bc. When mounted on the first mounting portion 11bc, the fixture 52 attaches the camera 51 to the first mounting portion 11bc such that the direction of the optical axis 51a of the camera 51 is offset from the first mounting portion 11bc. In the present embodiment, the fixture 52 offsets the direction of the optical axis 51a from the first mounting portion 11bc so that the optical axis 51a of the camera 51 intersects the direction D1; specifically, it attaches the camera 51 to the first mounting portion 11bc with the optical axis 51a offset from the motion path of the first mounting portion 11bc in the directions D1 and D2. The direction of the optical axis 51a of the camera 51 and the direction D1 intersect obliquely. In the present embodiment, the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc and the optical axis 51a of the camera 51 mounted on the first mounting portion 11bc via the fixture 52 intersect, but they may instead be in a skew relationship without intersecting. The axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc is coaxial with the axis of the electrode tip 11d mounted on the second mounting portion 11bd. The axis 11da is also the axis of the first mounting portion 11bc.
The fixture 52 integrally includes a cylindrical connecting portion 52a connected to the first mounting portion 11bc and a cylindrical accommodating portion 52b extending from the connecting portion 52a. The accommodating portion 52b accommodates and holds the camera 51, its harness and the like, and exposes the lens of the camera 51 at the end connected to the connecting portion 52a. The axis of the connecting portion 52a connected to the first mounting portion 11bc is coaxial with the axis 11da of the electrode tip 11d mounted on the first mounting portion 11bc, but is not limited to this. The axis of the accommodating portion 52b is coaxial with the optical axis 51a of the camera 51, but is not limited to this. With the fixture mounted on the first mounting portion 11bc, the accommodating portion 52b extends from the connecting portion 52a toward the direction D2 along a direction obliquely intersecting the direction of the axis of the connecting portion 52a. The accommodating portion 52b can therefore be arranged so as to keep its protrusion from the connecting portion 52a in the direction D1 small, or not to protrude from the connecting portion 52a at all, while avoiding interference with the body portion 11b of the welding gun 11 and the moving device 11c.
Such a fixture 52, when attached to the first mounting portion 11bc, can keep the length by which it protrudes from the first mounting portion 11bc in the direction D1 small. In the present embodiment, in the attached state, the length by which the fixture 52 and the camera 51 protrude from the first mounting portion 11bc in the direction D1 is equal to or less than the length by which the electrode tip 11d protrudes from the first mounting portion 11bc in the direction D1. As a result, even when the first mounting portion 11bc is moved in the direction D1 for the welding motion, the fixture 52 and the camera 51 attached to the first mounting portion 11bc are prevented from being pressed hard against the welding object W between the first mounting portion 11bc and the second mounting portion 11bd and being damaged.
As shown in FIG. 1, the input device 40 receives inputs of commands, information, data and the like from a user of the robot system 1 and outputs these commands, information, data and the like to the robot control device 20. The input device 40 is connected to the robot control device 20 via wired or wireless communication, and the form of the wired or wireless communication may be any form. For example, the input device 40 receives an input of a command to execute any one of the automatic operation mode, the manual operation mode and the correction mode, and outputs the command to the robot control device 20. The input device 40 may include a teaching device such as a teaching pendant for teaching the robot 10 the operating procedure of a predetermined welding operation.
The robot control device 20 controls the entire robot system 1. For example, the robot control device 20 may include a computer device.
The image processing device 30 generates image data from the image signals received from the camera 51 and performs image processing on the image data. For example, the image processing device 30 may include a computer device. Through the image processing, the image processing device 30 detects the three-dimensional position and posture of a subject shown in the image data. In this specification and the claims, a "three-dimensional position" is a three-dimensional position in three-dimensional space, and a "posture" may be a three-dimensional posture in three-dimensional space or a two-dimensional posture in a two-dimensional plane such as the plane of the image or a plane along a plane intersecting that plane. For example, the camera 51 captures an image of a spot mark placed at a predetermined spot position on the surface of the welding object W, and the image processing device 30 detects the three-dimensional position and posture of the spot mark shown in the image data. The predetermined spot position is the position to be welded when the robot 10 performs the welding operation according to the teaching data in the automatic operation mode, that is, the position against which the electrode tip 11d of the welding gun 11 is to be pressed. Examples of the spot mark include a spot welding trace made on the surface of the welding object W and a marking including a figure or the like. When the spot mark has no directionality, only the three-dimensional position of the spot mark may be detected.
For example, in the correction mode, the teaching data is corrected so as to reduce the difference between the actual spot position, which is the position where the robot 10 actually performs welding when operating automatically according to the teaching data in the same manner as in the welding work, and the predetermined spot position where the robot 10 should originally perform welding. A spot mark is placed at such a predetermined spot position. The spot mark may be placed on the surface of the welding object W by marking a position determined by calculation or the like, or may be left on the surface of the welding object W as a welding trace by having a teacher manually operate the robot 10 to actually perform the welding work.
[Hardware configuration of the robot system]
The hardware configuration of the robot system 1 will be described. FIG. 4 is a block diagram showing an example of the hardware configuration of the robot system 1 according to the embodiment. As shown in FIG. 4, the robot control device 20 includes, as components, a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a memory 204, input/output I/Fs (interfaces) 205 to 207, an arm drive circuit 208, and a gun drive circuit 209. These components are connected to one another via a bus, wired communication or wireless communication. Not all of the above components are essential.
For example, the CPU 201 is a processor and controls the overall operation of the robot control device 20. The ROM 202 is composed of a non-volatile semiconductor memory or the like and stores programs, data and the like for causing the CPU 201 to control the operation. The RAM 203 is composed of a volatile semiconductor memory or the like and temporarily stores programs executed by the CPU 201 and data being processed or already processed. The memory 204 is composed of a semiconductor memory such as a volatile or non-volatile memory, or a storage device such as a hard disk drive (HDD) or an SSD (Solid State Drive), and stores various information. The memory 204 may be a device external to the robot control device 20.
For example, a program for operating the CPU 201 is stored in advance in the ROM 202 or the memory 204. The CPU 201 reads the program from the ROM 202 or the memory 204 into the RAM 203, expands it there, and executes each coded instruction in the program expanded in the RAM 203.
Each function of the robot control device 20 may be realized by a computer system including the CPU 201, the ROM 202, the RAM 203 and the like, may be realized by a dedicated hardware circuit such as an electronic circuit or an integrated circuit, or may be realized by a combination of such a computer system and hardware circuits.
The first input/output I/F 205 is connected to the input device 40 and inputs and outputs information, data, commands and the like to and from the input device 40. The first input/output I/F 205 may include a circuit for converting signals or the like. The second input/output I/F 206 is connected to the image processing device 30 and inputs and outputs information, data, commands and the like to and from the image processing device 30. The second input/output I/F 206 may include a circuit for converting signals or the like. The third input/output I/F 207 is connected to the camera 51 and inputs and outputs information, data, commands and the like to and from the camera 51. The third input/output I/F 207 may include a circuit for driving the camera 51 or the like.
The arm drive circuit 208 supplies electric power to the servomotors of the arm drive devices M1 to M6 of the robot 10 and controls the driving of those servomotors in accordance with commands from the CPU 201. The gun drive circuit 209 supplies electric power to the servomotor of the moving drive device 11ca of the welding gun 11 and controls the driving of that servomotor in accordance with commands from the CPU 201.
The image processing device 30 includes, as components, a CPU 301, a ROM 302, a RAM 303, a memory 304, and input/output I/Fs 305 and 306. These components are connected to one another via a bus, wired communication or wireless communication. Not all of the above components are essential. The configurations of the CPU 301, the ROM 302, the RAM 303 and the memory 304 are the same as those of the robot control device 20. The first input/output I/F 305 is connected to the robot control device 20 and inputs and outputs information, data, commands and the like to and from the robot control device 20. The second input/output I/F 306 is connected to the camera 51 and inputs and outputs information, data, commands and the like to and from the camera 51. For example, the second input/output I/F 306 receives signals of images captured by the camera 51. The input/output I/Fs 305 and 306 may include circuits for converting signals or the like.
The robot control device 20 and the image processing device 30 described above may each be configured by, for example, a microcontroller, an MPU (Micro Processing Unit), an LSI (Large Scale Integration), a system LSI, a PLC (Programmable Logic Controller), a logic circuit, or the like. The plurality of functions of the robot control device 20 may be realized by individual single chips, or may be realized by a single chip including some or all of them. Each circuit may be a general-purpose circuit or a dedicated circuit. As the LSI, an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, a reconfigurable processor in which the connections and/or settings of circuit cells inside the LSI can be reconfigured, or an ASIC (Application Specific Integrated Circuit) in which circuits with a plurality of functions are integrated into one for a specific application, may be used.
[Functional configuration of the robot system]
The functional configuration of the robot system 1 will be described. FIG. 5 is a block diagram showing an example of the functional configuration of the robot system 1 according to the embodiment. As shown in FIG. 5, the robot control device 20 includes, as functional components, an imaging control unit 20a, a mode determination unit 20b, a manual command generation unit 20c, an automatic command generation unit 20d, an operation control unit 20e, a correction unit 20f, and a storage unit 20g. The operation control unit 20e includes an arm control unit 20ea and a gun control unit 20eb. Not all of the above functional components are essential. The functions of the functional components other than the storage unit 20g are realized by the CPU 201 and the like, and the function of the storage unit 20g is realized by the memory 204, the ROM 202 and/or the RAM 203.
The storage unit 20g stores various information and allows the stored information to be read out. For example, the storage unit 20g stores a program for operating the robot control device 20. The storage unit 20g also stores teaching data 20ga that has been stored through teaching for causing the robot 10 to perform a predetermined welding operation.
In the present embodiment, the teaching method of the robot 10 is teaching by programming, and the teaching data 20ga is offline teaching data. The teaching method of the robot 10 may instead be, for example, direct teaching in which a teacher moves the robot 10 by touching it directly, teaching by remote operation using a teaching pendant, teaching by a master-slave arrangement, or the like. In welding work performed by the robot 10 according to offline teaching data, the actually welded position may not coincide with the position that should originally be welded, due to factors such as individual differences in the operation of the robot 10. In welding work performed by the robot 10 according to teaching data taught by a teacher, the actually welded position may not coincide with the position that should originally be welded, due to factors such as differences in the skill of the teacher. For these reasons, the teaching data 20ga needs to be corrected.
The teaching data 20ga includes a gun teaching position set as the position of the welding gun 11 for welding at each welding position included in the welding work, an electrode teaching position set as the position of the first mounting portion 11bc during welding, and the like. For example, the welding position is the position of the spot on the welding object against which the electrode tip 11d is pressed. The gun teaching position may include the three-dimensional position and posture of the welding gun 11. The electrode teaching position is the position of the first mounting portion 11bc relative to the welding gun 11, and may be an amount of movement of the first mounting portion 11bc. The teaching data 20ga may include the time at each gun teaching position and the time at each electrode teaching position. The teaching data 20ga may also include the force applied by the welding gun 11 to the welding object at each gun teaching position, and the force applied by the first mounting portion 11bc to the welding object via the electrode tip 11d.
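The following is a minimal sketch of one way such teaching data could be organized as a data structure. All class and field names (TeachingPoint, gun_position_xyz, electrode_stroke_mm, and so on) are hypothetical illustrations and are not taken from the embodiment itself.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TeachingPoint:
    # Gun teaching position: 3D position and posture of the welding gun 11.
    gun_position_xyz: Tuple[float, float, float]
    gun_orientation_rpy: Tuple[float, float, float]
    # Electrode teaching position: movement of the first mounting portion 11bc
    # relative to the welding gun 11 during welding.
    electrode_stroke_mm: float
    # Optional items the teaching data may also contain.
    time_s: Optional[float] = None
    gun_force_n: Optional[float] = None
    electrode_force_n: Optional[float] = None

@dataclass
class TeachingData:
    points: List[TeachingPoint] = field(default_factory=list)
```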
The storage unit 20g may also store the relationship between the three-dimensional position and posture of a welding spot and the three-dimensional position and posture of the welding gun 11 for welding at that spot. The three-dimensional position of the spot may be the three-dimensional position of the center of the spot. The posture of the spot is not particularly limited; for example, it may be the amount and direction of inclination of the surface formed by the spot with respect to the vertical axis, the azimuth that is the horizontal direction of a specific point on the spot with respect to the center of the spot, the three-dimensional direction of a specific point on the spot with respect to the center of the spot, or the like.
The storage unit 20g also stores information on the welding gun 11, the electrode tips 11d, the welding object and the imaging device 50. The information on the welding gun 11 includes the distance between the first mounting portion 11bc retracted to the end portion 11ba of the welding gun 11 and the second mounting portion 11bd, the movable amount of the first mounting portion 11bc, and the like. The information on the electrode tips 11d includes dimensions such as the lengths of the electrode tips 11d mounted on the first mounting portion 11bc and the second mounting portion 11bd. The information on the welding object includes the type and material of the welding object and dimensions such as its thickness. The information on the imaging device 50 includes information on the camera 51 and the fixture 52. The information on the camera 51 includes camera parameters of the camera 51; the camera parameters include intrinsic parameters relating to the camera 51 itself and extrinsic parameters relating to the environment around the camera 51. The information on the fixture 52 may include information on the positional relationship between the connecting portion 52a and the accommodating portion 52b, such as the angle and the separation distance between the axis of the connecting portion 52a and the axis of the accommodating portion 52b. Each of these pieces of information may be stored in the storage unit 20g through input via the input device 40.
The imaging control unit 20a controls the imaging operation of the camera 51. For example, the imaging control unit 20a operates in the correction mode and causes the camera 51 of the imaging device 50 attached to the first mounting portion 11bc of the welding gun 11 to image, at a predetermined timing, the welding object on which a spot mark is placed at the spot position. In the present embodiment, in the correction mode, the robot 10 performs the same motion as the welding work according to the teaching data 20ga. The welding object is located between the first mounting portion 11bc and the second mounting portion 11bd of the welding gun 11. When the welding gun 11 is at the gun teaching position for pressing the first mounting portion 11bc against the predetermined spot position of the welding object according to the teaching data 20ga, the imaging control unit 20a causes the camera 51 to image the spot mark placed at the predetermined spot position. Specifically, when the welding gun 11 is at the gun teaching position, the imaging control unit 20a causes the camera 51 to image the spot mark at the timing when the first mounting portion 11bc, moved in the direction D1 according to the teaching data 20ga as it would be to press the electrode tip 11d against the welding object, has approached the welding object. This timing may be the timing at which the camera 51 comes closest to the welding object, or may be a predetermined timing during movement of the first mounting portion 11bc in the direction D1 toward the welding object or in the direction D2 away from the welding object. The image processing device 30 may include the imaging control unit 20a.
The mode determination unit 20b determines which mode, among the automatic operation mode, the manual operation mode and the correction mode, the robot system 1 executes, in accordance with a mode-designating command given via the input device 40, and causes the other functional components to operate according to the determined mode.
In the manual operation mode, or when a teacher executes teaching, the manual command generation unit 20c generates an operation command for causing the robot 10 to perform an operation corresponding to the operation information output from the input device 40, and outputs it to the operation control unit 20e.
In the automatic operation mode and the correction mode, the automatic command generation unit 20d generates an operation command for causing the robot 10 to automatically perform a predetermined welding operation according to the teaching data 20ga, and outputs it to the operation control unit 20e. The automatic command generation unit 20d acquires information on the welding work via the input device 40, and reads and uses the teaching data 20ga corresponding to that welding work from the storage unit 20g. For example, in the automatic operation mode, the automatic command generation unit 20d generates an operation command for pressing the electrode tip 11d of the first mounting portion 11bc against the welding object to apply pressure, whereas in the correction mode it generates an operation command that brings the imaging device 50 on the first mounting portion 11bc close to the welding object and into contact with it without applying pressure, or does not bring it into contact at all.
The operation command includes commands such as the three-dimensional position and posture of the welding gun 11, the position of the first mounting portion 11bc relative to the welding gun 11, and the time at each position. The operation command may also include commands such as the force applied by the welding gun 11 to the welding object at each position and the force applied by the first mounting portion 11bc to the welding object via the electrode tip 11d.
The operation control unit 20e controls the operation of the robot 10 in accordance with the operation command. The arm control unit 20ea of the operation control unit 20e generates commands for operating the servomotors of the arm drive devices M1 to M6 of the robot arm 12 so that the three-dimensional position and posture of the welding gun 11 follow the operation command, and outputs them to the arm drive devices M1 to M6. The arm control unit 20ea acquires the rotation amount and drive current of each servomotor of the arm drive devices M1 to M6 as feedback information and uses them to generate the above commands. The gun control unit 20eb generates a command for operating the servomotor of the moving drive device 11ca of the welding gun 11 so that the position of the first mounting portion 11bc follows the operation command, and outputs it to the moving drive device 11ca. The gun control unit 20eb acquires the rotation amount and drive current of the servomotor of the moving drive device 11ca as feedback information and uses them to generate the above command.
The correction unit 20f operates in the correction mode and corrects the teaching data 20ga. The correction unit 20f receives, from the image processing device 30, the three-dimensional position and posture of the spot mark on the surface of the welding object detected from the image captured by the camera 51. Based on the three-dimensional position and posture of the spot mark, the correction unit 20f further detects the corresponding position, which is the position of the welding gun 11 for actually pressing the electrode tip 11d of the first mounting portion 11bc of the welding gun 11 against the center of the spot mark, that is, the position of the welding gun 11 for actually performing welding at the center of the spot mark. The corresponding position includes the three-dimensional position and posture of the welding gun 11, but may include only the three-dimensional position of the welding gun 11 when, for example, the posture of the welding gun 11 is constant. The correction unit 20f may detect the corresponding position based on, for example, the relationship stored in the storage unit 20g between the three-dimensional position and posture of a spot and the three-dimensional position and posture of the welding gun 11 for welding at that spot.
The correction unit 20f then corrects the teaching data 20ga based on the difference between the corresponding position of the welding gun 11 for actually performing welding at the center of the spot mark and the gun teaching position of the welding gun 11 set for performing welding at the center of that spot mark. The correction unit 20f may correct the gun teaching position so as to reduce the difference; for example, it may correct the gun teaching position by replacing it with the corresponding position.
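A minimal sketch of this replacement strategy is shown below. It assumes the stored "spot pose to gun pose" relationship is available as a callable spot_to_gun_pose and that teaching points carry the hypothetical fields introduced in the earlier sketch; none of these names come from the embodiment itself.

```python
def correct_gun_teaching_position(teaching_point, mark_pose, spot_to_gun_pose):
    # Corresponding position: the gun pose that would actually weld the center
    # of the detected spot mark.
    corresponding = spot_to_gun_pose(mark_pose)
    # One simple way to reduce the difference is to replace the gun teaching
    # position with the corresponding position, as mentioned above.
    teaching_point.gun_position_xyz = corresponding.gun_position_xyz
    teaching_point.gun_orientation_rpy = corresponding.gun_orientation_rpy
    return teaching_point
```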
The image processing device 30 includes, as functional components, an extraction unit 30a, a mark position detection unit 30b, and a storage unit 30c. Not all of the above functional components are essential. The functions of the extraction unit 30a and the mark position detection unit 30b are realized by the CPU 301 and the like, and the function of the storage unit 30c is realized by the memory 304, the ROM 302 and/or the RAM 303.
The storage unit 30c stores various information and allows the stored information to be read out. For example, the storage unit 30c stores a program for operating the image processing device 30. The storage unit 30c also stores information on the spot marks. When the spot mark is a spot welding trace, the spot mark information may include information on the shape, dimensions, color and texture of the spot welding trace, image data of the spot welding trace, and the like. When the spot mark is a marking, the spot mark information may include image data of the marking, information on the shapes, dimensions and arrangement of the marking and the figures constituting the marking, and the like.
In the present embodiment, the marking used as the spot mark has directionality. For example, as shown in FIG. 6, the marking is composed of a plurality of figures, and these figures represent the center of the marking and its directionality, that is, the orientation of the marking. FIG. 6 is a diagram showing an example of the marking of the spot mark according to the embodiment. The marking M in FIG. 6 is composed of an outer circle Ma, an equilateral triangle Mb inscribed in the outer circle Ma, an inner circle Mc inside the equilateral triangle Mb, and a point Md near the inside of one corner of the equilateral triangle Mb. The inner circle Mc represents the center of the marking M and is an example of a center indicating portion. The equilateral triangle Mb and the point Md represent the orientation of the marking M and are an example of a direction indicating portion.
As the image data of the spot mark, not only front image data captured from directly in front of the spot mark but also oblique image data captured from various angles with respect to the spot mark may be stored in the storage unit 30c. In the oblique image data, the spot mark appears distorted.
The storage unit 30c may also store information on the welding gun 11, the electrode tip 11d, the welding object W and the imaging device 50. These pieces of information may be stored in the storage unit 30c by being transmitted from the robot control device 20 to the image processing device 30.
In the correction mode, the extraction unit 30a detects the spot mark from among the subjects shown in the image data captured by the camera 51.
For example, when the spot mark is a spot welding trace, the extraction unit 30a performs binary conversion or the like on the image data and detects edges. The extraction unit 30a compares the image data before and after the conversion with the image data of spot welding traces and performs shape pattern matching, color pattern matching and/or texture pattern matching, thereby detecting the image of a spot welding trace. The extraction unit 30a may detect the angle between the axis 11da of the electrode tip 11d at the first mounting portion 11bc and the optical axis 51a of the camera 51 and the like, based on information such as the angle between the axis of the connecting portion 52a of the fixture 52 and the axis of the accommodating portion 52b, and may correct the distortion of the subject in the image data captured from the oblique direction based on the detection result. The extraction unit 30a may then detect the image of the spot welding trace using the distortion-corrected image data.
For example, when the spot mark is a marking, the extraction unit 30a performs binary conversion or the like on the image data and detects edges. The extraction unit 30a compares the converted image data with the image data of the marking and performs shape pattern matching, thereby detecting the image of the marking. Alternatively, the extraction unit 30a detects line segments and arcs by performing a Hough transform or the like on the converted image data. The extraction unit 30a then detects, in the Hough-transformed image data, figures included in the marking and figures similar to them, and detects, from among the combinations of detected figures, the combination forming the marking as the image of the marking. The combination of figures includes the shapes and arrangement of the figures. The extraction unit 30a detects similar figures so as to take into account the distortion of the subject in image data captured from an oblique direction. For example, in the case of the marking M in FIG. 6, the extraction unit 30a detects a combination including an outermost circle or ellipse, a triangle inside it, a circle or ellipse inside the triangle, and a point inside one corner of the triangle. The extraction unit 30a may instead correct the distortion of the subject in the image data captured from the oblique direction and detect the marking using the distortion-corrected image data.
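The following is a minimal sketch of this kind of extraction using OpenCV, shown only as one plausible implementation rather than the actual processing of the extraction unit 30a. It collects circle and line-segment candidates (for the outer circle Ma, the inner circle Mc, the point Md and the sides of the triangle Mb); matching the candidates against the full combination forming the marking M is omitted, and all threshold values are illustrative assumptions.

```python
import cv2
import numpy as np

def find_marking_candidates(image_bgr):
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Edge detection corresponding to the binarization/edge step above.
    edges = cv2.Canny(gray, 50, 150)
    # Hough transform for line segments (candidate sides of the triangle Mb).
    segments = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                               minLineLength=10, maxLineGap=3)
    # Hough transform for circles (candidates for Ma, Mc and the point Md).
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=10,
                               param1=150, param2=40, minRadius=3, maxRadius=100)
    segments = [] if segments is None else segments[:, 0].tolist()
    circles = [] if circles is None else circles[0].tolist()
    return circles, segments  # (x, y, r) and (x1, y1, x2, y2) candidates
```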
The mark position detection unit 30b detects the three-dimensional position and posture of the spot mark using the image data captured by the camera 51 and the information on the spot mark detected in that image data. For this detection, the mark position detection unit 30b also uses the information on the welding gun 11, the electrode tip 11d, the welding object W and the fixture 52 of the imaging device 50. The operation of the mark position detection unit 30b will be described taking the case of the marking M as an example.
FIG. 7 is a side view showing an example of the state at the time of imaging the spot mark with the camera 51 in the correction mode. FIG. 8 is a diagram showing an example of the image captured by the camera 51 in the state of FIG. 7. As shown in FIG. 8, in the image Ia in which the marking M is shown, the mark position detection unit 30b detects the pixel coordinates of the pixel pMc at the center of the marking M, that is, the center of the inner circle Mc, and the pixel coordinates of the pixel pMd at the center of the circular point Md. The pixel coordinates are coordinates, in units of pixels, in the image coordinate system of the image Ia.
The position of the optical axis 51a of the camera 51 corresponds to the pixel pI at the center of the image Ia. Based on the positional relationship between the pixel pMc and the pixel pI, the mark position detection unit 30b calculates the included angle α between the optical axis 51a and the line of sight LMc from the camera 51 to the center of the inner circle Mc shown in FIG. 7, and the direction of the line of sight LMc with respect to the optical axis 51a. The intersection Pd between the axis 11da of the first mounting portion 11bc and the surface of the welding object W is the center of the spot position of the electrode tip 11d of the first mounting portion 11bc when welding is performed according to the teaching data 20ga.
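As a minimal sketch, the included angle α and the direction of the line of sight can be derived from the pixel coordinates under a simple pinhole camera model using the intrinsic parameters mentioned earlier. The parameter names fx, fy, cx and cy are the conventional intrinsic-parameter names, not values given in the embodiment.

```python
import math

def line_of_sight(px, py, fx, fy, cx, cy):
    # Normalized ray direction in the camera frame; the optical axis 51a is (0, 0, 1).
    x = (px - cx) / fx
    y = (py - cy) / fy
    norm = math.sqrt(x * x + y * y + 1.0)
    ray = (x / norm, y / norm, 1.0 / norm)
    alpha = math.acos(ray[2])       # included angle between the ray and the optical axis
    bearing = math.atan2(y, x)      # direction of the line of sight around the optical axis
    return ray, alpha, bearing
```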
The mark position detection unit 30b further calculates the distance da between the tip of the first mounting portion 11bc and the surface of the welding object W at the time of imaging by the camera 51. Specifically, the mark position detection unit 30b acquires, from the gun control unit 20eb of the robot control device 20, the feedback information at the time of imaging by the camera 51, and detects the position of the first mounting portion 11bc based on that feedback information. The mark position detection unit 30b calculates the distance da based on the position of the first mounting portion 11bc, the distance between the first mounting portion 11bc and the second mounting portion 11bd, and the thickness of the welding object W.
The mark position detection unit 30b calculates the three-dimensional position of the center of the inner circle Mc based on the length by which the fixture 52 protrudes from the first mounting portion 11bc in the direction D1, the direction of the line of sight LMc with respect to the optical axis 51a and the included angle α, and the distance da. The mark position detection unit 30b may acquire the protrusion length from the storage unit 20g of the robot control device 20, or may acquire the protrusion length stored in advance in the storage unit 30c.
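A minimal sketch of this position calculation is given below as a ray-plane intersection. It assumes the camera-frame ray from the previous sketch, a known pose of the camera frame in a tool frame of the first mounting portion 11bc (derived from the geometry of the fixture 52, including its protrusion length), and that the surface of the welding object W can be approximated locally as a plane perpendicular to the direction D1; all names are hypothetical.

```python
import numpy as np

def mark_center_3d(ray_cam, R_tool_cam, t_tool_cam, da):
    # Tool frame: origin at the tip of the first mounting portion 11bc,
    # z axis along the direction D1 toward the welding object W.
    # R_tool_cam, t_tool_cam: rotation (3x3) and translation (3,) of the camera
    # frame in the tool frame, obtained from the fixture 52 geometry.
    # da: distance from the tip of 11bc to the surface of W along D1.
    ray_tool = np.asarray(R_tool_cam, dtype=float) @ np.asarray(ray_cam, dtype=float)
    origin_tool = np.asarray(t_tool_cam, dtype=float)
    # The surface of W is approximated by the plane z = da in the tool frame.
    t = (da - origin_tool[2]) / ray_tool[2]
    return origin_tool + t * ray_tool   # 3D position of the center of Mc
```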
The mark position detection unit 30b further calculates the three-dimensional position of the center of the point Md in the same manner as the calculation of the three-dimensional position of the center of the inner circle Mc. Based on the three-dimensional positions of the center of the inner circle Mc and the center of the point Md, the mark position detection unit 30b calculates the three-dimensional position of the center of the marking M and the posture of the marking M in three-dimensional space. The posture of the marking M may be any posture; for example, it may be the azimuth that is the horizontal direction of the center of the point Md with respect to the center of the marking M, the three-dimensional direction of the point Md with respect to the center of the marking M, or the amount and direction of inclination of the surface formed by the marking M with respect to the vertical axis. The mark position detection unit 30b transmits the three-dimensional position and posture of the center of the marking M to the correction unit 20f of the robot control device 20.
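As a small illustrative sketch, one of the posture representations mentioned above, namely the horizontal bearing of the center of the point Md as seen from the center of the marking M, can be obtained from the two calculated 3D positions as follows; the function name and argument layout are hypothetical.

```python
import math

def marking_azimuth(mc_xyz, md_xyz):
    # Horizontal direction of the point Md center relative to the marking center Mc.
    dx = md_xyz[0] - mc_xyz[0]
    dy = md_xyz[1] - mc_xyz[1]
    return math.atan2(dy, dx)
```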
When the spot mark is a spot welding trace, the mark position detection unit 30b detects the three-dimensional position and posture of the spot welding trace in the same manner as for the marking. For example, the mark position detection unit 30b can detect the posture of the spot welding trace by detecting the three-dimensional position of the center of the spot welding trace and the three-dimensional position of at least part of its outer periphery.
[Operation of the robot system]
Among the operations of the robot system 1 according to the embodiment, the operation in the correction mode will be described. FIG. 9 is a flowchart showing an example of the operation of the robot system 1 according to the embodiment in the correction mode.
As shown in FIG. 9, first, in step S1, the user replaces the electrode tip 11d on the first mounting portion 11bc of the welding gun 11 with the imaging device 50; that is, the fixture 52 of the imaging device 50 is attached to the first mounting portion 11bc. Next, in step S2, a command to execute the correction mode is input to the input device 40 by the user and received by the robot control device 20.
Next, in step S3, the robot control device 20 causes the robot 10 to operate automatically according to the teaching data 20ga in the storage unit 20g. Next, in step S4, the robot control device 20 causes the robot 10 to move the welding gun 11 to the next spot position, among the plurality of spot positions included in the teaching data 20ga, at which the welding gun 11 is to be placed.
Next, in step S5, the robot control device 20 causes the robot 10 to adjust the position and posture of the welding gun 11 with respect to the spot position. Specifically, the robot control device 20 adjusts the posture of the welding gun 11 so that the axis 11da of the first mounting portion 11bc is perpendicular to the surface of the welding object W at the spot position. Furthermore, the robot control device 20 moves the welding gun 11 in the direction D2 with respect to the welding object W, thereby bringing the electrode tip 11d of the second mounting portion 11bd into contact with the welding object W. The robot control device 20 detects this contact based on the detection signal of the contact sensor 11e.
Next, in step S6, the robot control device 20 causes the moving device 11c of the welding gun 11 to perform the same motion as during welding, that is, the welding motion. Specifically, the robot control device 20 causes the moving device 11c to move the first mounting portion 11bc in the direction D1, bring it closest to the welding object W, and then move it in the direction D2 away from the welding object W. At this time, the robot control device 20 does not press the imaging device 50 against the welding object W to apply pressure as in the automatic operation mode, but brings the imaging device 50 into contact with the welding object W without applying pressure, or does not bring it into contact at all.
Next, in step S7, the robot control device 20 causes the camera 51 to image the surface of the welding object W at a predetermined timing during the process in which the first mounting portion 11bc approaches the welding object W or the process in which it moves away from the welding object W. The robot control device 20 may or may not temporarily stop the first mounting portion 11bc at the time of imaging. The camera 51 transmits the signal of the captured image, associated with the information on the spot position, to the image processing device 30 to be stored as image data in the storage unit 30c; the signal may instead be transmitted to the robot control device 20 and stored in the storage unit 20g.
Next, in step S8, the robot control device 20 determines whether the welding motion has been completed for all the spot positions included in the teaching data 20ga. The robot control device 20 proceeds to step S9 if it has been completed (Yes in step S8), and returns to step S4 if it has not been completed (No in step S8).
In step S9, the image processing device 30 processes the image data captured at each spot position, thereby detecting the spot mark shown in that image data. Next, in step S10, the image processing device 30 detects the three-dimensional position and posture of the spot mark using the image data at each spot position and the information on the spot mark detected in that image data. The image processing device 30 associates the information on each spot position with the three-dimensional position and posture of the spot mark at that spot position and transmits them to the robot control device 20.
Next, in step S11, for each spot position, the robot control device 20 detects, based on the three-dimensional position and posture of the spot mark, the corresponding position of the welding gun 11 for actually performing welding at the center of that spot mark. Next, in step S12, for each spot position, the robot control device 20 corrects the teaching data 20ga based on the difference between the gun teaching position of the welding gun 11 set for performing welding at that spot position and the corresponding position of the welding gun 11 at that spot position.
Through the processing of steps S1 to S12, the robot system 1 can automatically image the spot marks corresponding to the respective spot positions in order and correct the teaching data 20ga.
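The following is a minimal sketch of the correction-mode flow of steps S3 to S12, written as Python-style pseudocode. The objects robot, camera and image_processor and all of their methods are hypothetical placeholders for the operations performed by the robot control device 20, the camera 51 and the image processing device 30.

```python
def run_correction_mode(teaching_data, robot, camera, image_processor):
    captured = []
    for point in teaching_data.points:                    # S4: next spot position
        robot.move_gun_to(point)                          # S4: move the welding gun 11
        robot.adjust_gun_posture(point)                   # S5: axis 11da normal to W, 11bd in contact
        robot.do_welding_motion(point, pressurize=False)  # S6: welding motion without pressing
        captured.append((point, camera.capture_image()))  # S7: image the spot mark
    # S8: the loop above ends once all spot positions have been processed.
    for point, image in captured:
        mark_pose = image_processor.detect_mark_pose(image)                 # S9-S10
        corresponding = image_processor.corresponding_gun_pose(mark_pose)   # S11
        point.replace_gun_teaching_position(corresponding)                  # S12: correct 20ga
    return teaching_data
```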
(Modification example)
In the robot system according to the modification, the configuration of the imaging device 50A that can be attached to and detached from the welding gun 11 differs from that of the embodiment. The fixture 52A of the imaging device 50A according to the modification offsets the direction of the optical axis 51a from the first mounting portion 11bc so that the direction of the optical axis 51a of the camera 51 is parallel to the direction D1; specifically, it attaches the camera 51 to the first mounting portion 11bc with the optical axis 51a offset from the motion path of the first mounting portion 11bc in the directions D1 and D2. In the following, this modification will be described focusing on the points that differ from the embodiment, and descriptions of points that are the same as in the embodiment will be omitted as appropriate.
FIG. 10 is a side view showing an example of the configuration of the imaging device 50A of the welding gun 11 according to the modification. The imaging device 50A includes the camera 51 and a fixture 52A. The fixture 52A integrally includes a cylindrical connecting portion 52Aa and a cylindrical accommodating portion 52Ab. The axis of the accommodating portion 52Ab is parallel to the axis of the connecting portion 52Aa. Therefore, the optical axis 51a of the camera 51 attached to the first mounting portion 11bc via the fixture 52A is parallel to the axis 11da of the electrode tip 11d attached to the first mounting portion 11bc. Furthermore, in this modification, the axis of the accommodating portion 52Ab is located away from the axis of the connecting portion 52Aa in a direction perpendicular to the axis of the connecting portion 52Aa. That is, the optical axis 51a and the axis 11da are separated from each other. The accommodating portion 52Ab extends from the connecting portion 52Aa in the direction D2. The accommodating portion 52Ab can therefore be arranged so as to avoid interference with the main body portion 11b and the moving device 11c of the welding gun 11 while keeping its protrusion from the connecting portion 52Aa in the direction D1 small, or so as not to protrude from the connecting portion 52Aa at all.
When the camera 51 attached to the first mounting portion 11bc via the fixture 52A as described above images the hitting point mark on the welding object W in the correction mode, it can generate an image in which the hitting point mark appears with little distortion. This makes it possible to simplify the image processing in the image processing device 30.
In this modification, the axis of the accommodating portion 52Ab, which is offset from the axis of the connecting portion 52Aa, is located away from the axis of the connecting portion 52Aa; however, the arrangement is not limited to this, and the axis may be coaxial with the axis of the connecting portion 52Aa.
(Other embodiments)
Examples of the embodiments of the present disclosure have been described above, but the present disclosure is not limited to the above embodiment and modification. That is, various modifications and improvements are possible within the scope of the present disclosure. For example, forms obtained by applying various modifications to the embodiment and the modification, and forms constructed by combining constituent elements of different embodiments and modifications, are also included within the scope of the present disclosure.
For example, in the embodiment and the modification, in the correction mode, the robot control device 20 causes the robot 10 to move the welding gun 11 to each hitting point position in accordance with the teaching data 20ga, then moves the first mounting portion 11bc toward the welding object W and causes the camera 51 to capture an image; however, the operation is not limited to this. For example, the robot control device 20 may cause the robot 10 to move the welding gun 11 to each hitting point position in accordance with the teaching data 20ga and cause the camera 51 to capture an image without moving the first mounting portion 11bc relative to the welding gun 11. Alternatively, the robot control device 20 may move the first mounting portion 11bc in the direction D1 toward the welding object W, stop it at a position short of the position where the electrode tip 11d would contact the welding object W, that is, at a position on the direction D2 side of that contact position, and then pull it back in the direction D2. In this case, the robot control device 20 may cause the camera 51 to capture an image at the stop position or during the movement.
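As a sketch of the alternative just described, in which the first mounting portion stops short of contact and is then pulled back, one possible motion sequence is shown below; it reuses the capture helper sketched earlier, and the interface names and the clearance value are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the non-contact capture sequence: advance the first
# mounting portion in direction D1, stop a clearance short of where the
# electrode tip would touch the welding object, capture there, then retract
# in direction D2.
def capture_without_contact(robot, camera, store, hit_point_id,
                            contact_travel_mm, clearance_mm=2.0):
    stop_travel = max(contact_travel_mm - clearance_mm, 0.0)
    robot.advance_first_mounting_portion(stop_travel)        # move in direction D1
    image = capture_at_hit_point(robot, camera, store, hit_point_id,
                                 pause_for_shot=True)        # capture at the stop position
    robot.retract_first_mounting_portion(stop_travel)        # pull back in direction D2
    return image
```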
In the embodiment and the modification, the robot control device 20 brings the electrode tip 11d of the second mounting portion 11bd of the welding gun 11 into contact with the welding object before executing imaging with the camera 51 in the correction mode; however, the operation is not limited to this, and the electrode tip need not be brought into contact. In this case, the image processing device 30 may detect the distance between the camera 51 and the welding object by processing the image data captured by the camera 51.
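If the distance is to be recovered from the image data alone, one possible approach (an assumption, not something the disclosure specifies) is the pinhole relation between the mark's known physical size and its apparent size in pixels, given a calibrated focal length.

```python
# Pinhole-model estimate of the camera-to-workpiece distance from a mark of
# known physical size: distance = focal_length_px * real_size / apparent_size_px.
def distance_from_mark(apparent_size_px, real_size_mm, focal_length_px):
    """Estimate the distance (mm) from the camera to the hitting point mark."""
    if apparent_size_px <= 0:
        raise ValueError("mark not detected or degenerate size")
    return focal_length_px * real_size_mm / apparent_size_px
```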
In the embodiment and the modification, the fixtures 52 and 52A of the imaging devices 50 and 50A are configured to be attached to the first mounting portion 11bc of the welding gun 11 in place of the electrode tip 11d; however, the configuration is not limited to this. For example, the fixtures 52 and 52A may be configured to be attached to the first mounting portion 11bc at a position different from that of the electrode tip 11d. In this case, the fixtures 52 and 52A may be configured so that they can be attached to the first mounting portion 11bc even while the electrode tip 11d is mounted on the first mounting portion 11bc. That is, the fixtures 52 and 52A may be configured to be attached to the first mounting portion 11bc together with the electrode tip 11d. For example, the fixtures 52 and 52A may be attached to the first mounting portion 11bc on a side with respect to the direction D1.
In the embodiment and the modification, the robot control device 20 and the image processing device 30 are separate devices; however, the configuration is not limited to this, and they may be included in a single device. Each of the robot control device 20 and the image processing device 30 may also be composed of two or more devices.
The technology of the present disclosure may be a correction method, or may be a control device that executes the correction method. For example, a correction method according to one aspect of the present disclosure is a correction method for correcting teaching data of a robot, the method including: moving a robot gun of the robot, in accordance with the teaching data, to a teaching position for pressing a first mounting portion against a predetermined hitting point position of a workpiece between the first mounting portion and a second mounting portion that face each other; causing a camera attached to the first mounting portion to image a hitting point mark placed at the predetermined hitting point position when the robot gun is located at the teaching position; detecting the position of the hitting point mark using the image captured by the camera; detecting a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the hitting point mark; and correcting the teaching data based on a difference between the corresponding position and the teaching position, wherein the camera is attached to the first mounting portion such that the direction of the optical axis of the camera is offset from the first mounting portion, which is movable in a first direction. Such a correction method may be realized by a circuit such as a CPU or an LSI, an IC card, a single module, or the like.
The technology of the present disclosure may be a program that causes a computer to execute the above correction method, or may be a non-transitory computer-readable recording medium on which the program is recorded. Needless to say, the program can be distributed via a transmission medium such as the Internet.
The numbers such as ordinal numbers and quantities used above are all examples for specifically describing the technology of the present disclosure, and the present disclosure is not limited to the illustrated numbers. The connection relationships between constituent elements are likewise illustrated for specifically describing the technology of the present disclosure, and the connection relationships that realize the functions of the present disclosure are not limited to these.
The division of blocks in the functional block diagram is an example; a plurality of blocks may be realized as one block, one block may be divided into a plurality of blocks, and/or some functions may be transferred to another block. In addition, a single piece of hardware or software may process the functions of a plurality of blocks having similar functions in parallel or in a time-division manner.
1 Robot system
2 Correction system
3 Control device
10 Robot
11 Welding gun (robot gun)
11bc First mounting portion
11bd Second mounting portion
11d Electrode tip (electrode)
20 Robot control device
30 Image processing device
50, 50A Imaging device
51 Camera
52, 52A Fixture

Claims (22)

1.  A correction system for correcting teaching data of a robot, the correction system comprising:
    a camera attached to a first mounting portion, of a first mounting portion and a second mounting portion of a robot gun of the robot that face each other, the first mounting portion being movable in a first direction;
    a fixture that attaches the camera to the first mounting portion such that a direction of an optical axis of the camera is offset from the first mounting portion; and
    a control device,
    wherein the control device:
    causes the camera to image a hitting point mark placed at a predetermined hitting point position of a workpiece between the first mounting portion and the second mounting portion, when the robot gun is located at a teaching position for pressing the first mounting portion against the predetermined hitting point position in accordance with the teaching data;
    detects a position of the hitting point mark using the image captured by the camera;
    detects a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the hitting point mark; and
    corrects the teaching data based on a difference between the corresponding position and the teaching position.

2.  The correction system according to claim 1, wherein the control device:
    operates the first mounting portion in the first direction toward the workpiece in accordance with the teaching data when the robot gun is located at the teaching position; and
    causes the camera, in a state of being close to the workpiece, to image the hitting point mark.

3.  The correction system according to claim 1 or 2, wherein
    an electrode for welding is attachable to and detachable from each of the first mounting portion and the second mounting portion, and
    the fixture is attached to the first mounting portion in place of the electrode.

4.  The correction system according to claim 1 or 2, wherein
    an electrode for welding is attachable to and detachable from each of the first mounting portion and the second mounting portion, and
    the fixture is configured to be attachable to the first mounting portion while the electrode is attached to the first mounting portion.

5.  The correction system according to claim 3 or 4, wherein a length by which the fixture and the camera attached to the first mounting portion protrude from the first mounting portion in the first direction is less than or equal to a length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
6.  The correction system according to any one of claims 1 to 5, wherein the fixture offsets the direction of the optical axis of the camera such that the direction of the optical axis of the camera intersects the first direction.

7.  The correction system according to any one of claims 1 to 5, wherein the fixture offsets the direction of the optical axis of the camera such that the direction of the optical axis of the camera is parallel to the first direction.

8.  The correction system according to any one of claims 1 to 7, wherein
    the teaching position includes a three-dimensional position and posture of the robot gun at the teaching position,
    the corresponding position includes a three-dimensional position and posture of the robot gun at the corresponding position, and
    the control device corrects the teaching data based on a difference between the three-dimensional position and posture of the robot gun at the corresponding position and the three-dimensional position and posture of the robot gun at the teaching position.

9.  The correction system according to any one of claims 1 to 8, wherein, at the teaching position, the control device moves the robot gun so as to press the second mounting portion against the workpiece before imaging with the camera.

10.  The correction system according to any one of claims 1 to 9, wherein the hitting point mark is a marking including a center indicating portion that indicates a center and an orientation indicating portion that indicates an orientation of rotation about the center.

11.  A robot system comprising:
    the correction system according to any one of claims 1 to 10; and
    the robot,
    wherein the control device controls operation of the robot.
12.  A correction method for correcting teaching data of a robot, the correction method comprising:
    moving a robot gun of the robot, in accordance with the teaching data, to a teaching position for pressing a first mounting portion against a predetermined hitting point position of a workpiece between the first mounting portion and a second mounting portion of the robot gun that face each other;
    causing a camera attached to the first mounting portion to image a hitting point mark placed at the predetermined hitting point position when the robot gun is located at the teaching position;
    detecting a position of the hitting point mark using the image captured by the camera;
    detecting a corresponding position, which is a position of the robot gun for pressing the first mounting portion against the hitting point mark; and
    correcting the teaching data based on a difference between the corresponding position and the teaching position,
    wherein the camera is attached to the first mounting portion such that a direction of an optical axis of the camera is offset from the first mounting portion, which is movable in a first direction.

13.  The correction method according to claim 12, further comprising operating the first mounting portion in the first direction toward the workpiece in accordance with the teaching data when the robot gun is located at the teaching position,
    wherein the imaging of the hitting point mark by the camera is performed in a state in which the camera is close to the workpiece.

14.  The correction method according to claim 12 or 13, wherein
    an electrode for welding is attachable to and detachable from each of the first mounting portion and the second mounting portion, and
    the camera is attached to the first mounting portion in place of the electrode.

15.  The correction method according to claim 12 or 13, wherein
    an electrode for welding is attachable to and detachable from each of the first mounting portion and the second mounting portion, and
    the camera is configured to be attachable to the first mounting portion while the electrode is attached to the first mounting portion.

16.  The correction method according to claim 14 or 15, wherein a length by which the camera attached to the first mounting portion protrudes from the first mounting portion in the first direction is less than or equal to a length by which the electrode attached to the first mounting portion protrudes from the first mounting portion in the first direction.
17.  The correction method according to any one of claims 12 to 16, wherein the direction of the optical axis of the camera is offset such that the direction of the optical axis of the camera intersects the first direction.

18.  The correction method according to any one of claims 12 to 16, wherein the direction of the optical axis of the camera is offset such that the direction of the optical axis of the camera is parallel to the first direction.

19.  The correction method according to any one of claims 12 to 18, wherein
    the teaching data is corrected based on a difference between a three-dimensional position and posture of the robot gun at the corresponding position and a three-dimensional position and posture of the robot gun at the teaching position,
    the teaching position includes the three-dimensional position and posture of the robot gun at the teaching position, and
    the corresponding position includes the three-dimensional position and posture of the robot gun at the corresponding position.

20.  The correction method according to any one of claims 12 to 19, further comprising, at the teaching position, moving the robot gun so as to press the second mounting portion against the workpiece before imaging with the camera.

21.  The correction method according to any one of claims 12 to 20, wherein the hitting point mark is a marking including a center indicating portion that indicates a center and an orientation indicating portion that indicates an orientation of rotation about the center.

22.  A control device that executes the correction method according to any one of claims 12 to 21.
PCT/JP2020/038254 2019-10-09 2020-10-09 Correction system, correction method, robot system, and control device WO2021070922A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080069459.0A CN114555271B (en) 2019-10-09 2020-10-09 Correction system, correction method, robot system, and control device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-186010 2019-10-09
JP2019186010A JP7290537B2 (en) 2019-10-09 2019-10-09 Correction system, correction method, robot system and controller

Publications (1)

Publication Number Publication Date
WO2021070922A1 true WO2021070922A1 (en) 2021-04-15

Family

ID=75381139

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/038254 WO2021070922A1 (en) 2019-10-09 2020-10-09 Correction system, correction method, robot system, and control device

Country Status (3)

Country Link
JP (1) JP7290537B2 (en)
CN (1) CN114555271B (en)
WO (1) WO2021070922A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024009484A1 (en) * 2022-07-07 2024-01-11 ファナック株式会社 Control device and control method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07325611A (en) * 1994-05-31 1995-12-12 Toyota Motor Corp Automatic correcting method for off-line teaching data
JPH0924476A (en) * 1995-07-13 1997-01-28 Dengensha Mfg Co Ltd Method for teaching spotting position of robot welding gun
JP2005138223A (en) * 2003-11-06 2005-06-02 Fanuc Ltd Positional data correcting device for robot
JP2007122705A (en) * 2005-09-30 2007-05-17 Nachi Fujikoshi Corp Welding teaching point correction system and calibration method
JP2008132525A (en) * 2006-11-29 2008-06-12 Nachi Fujikoshi Corp Teaching-position correcting system of welding-robot and teaching-position correcting method of welding-robot
JP2008178887A (en) * 2007-01-23 2008-08-07 Nachi Fujikoshi Corp Image capturing apparatus and spot welding robot system
JP2009125839A (en) * 2007-11-21 2009-06-11 Nachi Fujikoshi Corp Weld teaching position correction system
JP2014184530A (en) * 2013-03-25 2014-10-02 Toyota Motor Corp Teaching system and teaching correction method
JP2018202559A (en) * 2017-06-06 2018-12-27 ファナック株式会社 Teaching position correction device and teaching position correction method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3805317B2 (en) * 2003-03-17 2006-08-02 ファナック株式会社 Teaching position correction method and teaching position correction apparatus
JP5850962B2 (en) * 2014-02-13 2016-02-03 ファナック株式会社 Robot system using visual feedback
JP2021003794A (en) * 2019-06-27 2021-01-14 ファナック株式会社 Device and method for acquiring deviation amount of work position of tool

Also Published As

Publication number Publication date
CN114555271B (en) 2023-10-03
CN114555271A (en) 2022-05-27
JP7290537B2 (en) 2023-06-13
JP2021058988A (en) 2021-04-15

Similar Documents

Publication Publication Date Title
US20210114221A1 (en) Method of teaching robot and robot system
CN107053167B (en) Control device, robot, and robot system
JP6429473B2 (en) Robot system, robot system calibration method, program, and computer-readable recording medium
JP6468741B2 (en) Robot system and robot system calibration method
US11267142B2 (en) Imaging device including vision sensor capturing image of workpiece
CN104175031B (en) A kind of welding robot system with autonomous centering capacity carries out the method for welding
JP5606241B2 (en) Visual cognitive system and method for humanoid robot
US11466974B2 (en) Image capturing apparatus and machine tool
KR101988937B1 (en) Method and apparatus for calibration of a robot hand and a camera
US10335895B2 (en) Friction stir welding device and friction stir welding method
JP2010112859A (en) Robot system, robot control device, and method for controlling robot
JP6869159B2 (en) Robot system
US11173608B2 (en) Work robot and work position correction method
WO2021070922A1 (en) Correction system, correction method, robot system, and control device
JP2018202542A (en) Measurement device, system, control method, and manufacturing method of article
EP3345729B1 (en) Robot system with camera
CN111283685A (en) Vision teaching method of robot based on vision system
JP4899099B2 (en) Work robot position measurement device
US20190037204A1 (en) Calibration method and calibration tool of camera
JP2020138294A (en) Robot system
WO2021117701A1 (en) Master/slave system and control method
JP7183372B1 (en) Marker detection device and robot teaching system
CN114599485B (en) Master-slave system and control method
CN210072704U (en) Camera calibration system
CN115319323B (en) Tube plate welding method, system, welding robot and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20873691

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20873691

Country of ref document: EP

Kind code of ref document: A1