WO2022270580A1 - Offline teaching device and offline teaching method - Google Patents
Offline teaching device and offline teaching method
- Publication number
- WO2022270580A1 (PCT/JP2022/025097)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B23—MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
- B23K—SOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
- B23K9/00—Arc welding or cutting
- B23K9/12—Automatic feeding or moving of electrodes or work for spot or seam welding or cutting
- B23K9/127—Means for tracking lines during arc welding or cutting
- B23K9/1272—Geometry oriented, e.g. beam optical tracking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/005—Manipulators for mechanical processing tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/022—Optical sensing devices using lasers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/42—Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37449—Inspection path planner
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45066—Inspection robot
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45104—Lasrobot, welding robot
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/45—Nc applications
- G05B2219/45135—Welding
Definitions
- the present disclosure relates to an offline teaching device and an offline teaching method.
- Patent Document 1 discloses an off-line teaching device that displays the motion trajectory of a robot when a teaching program is executed in a model diagram, and simultaneously displays some of a plurality of position detection commands and some of a plurality of welding commands.
- the off-line teaching device includes a display unit that displays a teaching program and model diagrams, a storage unit that stores instructions constituting the teaching program and model data of the model diagrams, and a control unit that controls the display unit and the storage unit.
- the teaching program includes a position detection program composed of a plurality of position detection instructions and a welding program composed of a plurality of welding instructions.
- each of the instructions, position detection program, and welding program that constitute the teaching program is created by the operator.
- the present disclosure provides an offline teaching device and an offline teaching method for more efficiently creating a teaching program for a sensor scanning operation executed by a welding robot.
- the present disclosure provides an off-line teaching device comprising: an input unit capable of accepting operator operations; an acquisition unit that acquires three-dimensional shape data of a workpiece produced by welding, the motion trajectory of the welding, and the scan range of a sensor that scans the external shape of the workpiece; a generation unit that generates a three-dimensional region to be scanned by the sensor based on the acquired scan range and scan section; and a control unit that arranges at least one of the three-dimensional regions on the three-dimensional shape data of the workpiece based on the operator operation input to the input unit, and, based on the arranged three-dimensional region and the motion trajectory of the welding, creates and outputs a teaching program for causing a welding robot that performs the welding to scan the three-dimensional region.
- the present disclosure also provides an offline teaching method performed by an offline teaching device configured with one or more computers communicably connected to an input device capable of accepting operator operations. The method includes: acquiring three-dimensional shape data of a workpiece produced by welding, the motion trajectory of the welding, and the scan range of a sensor that scans the external shape of the workpiece; generating a three-dimensional region to be scanned by the sensor based on the acquired scan range and scan section; arranging at least one of the three-dimensional regions on the three-dimensional shape data of the workpiece based on the operator operation obtained from the input device; and creating and outputting a teaching program for causing a welding robot that performs the welding to scan the three-dimensional region based on the arranged three-dimensional region and the motion trajectory of the welding.
- the present disclosure further provides an offline teaching method performed, by a worker operating an input device, on an offline teaching device configured with one or more computers communicably connected to the input device. The method includes: inputting three-dimensional shape data of a workpiece produced by welding into the computer; inputting a scan section for scanning the external shape of the workpiece into the computer; and creating a teaching program for causing a welding robot that performs the welding to scan a three-dimensional region based on scan points corresponding to the scan section in the three-dimensional shape data.
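The teaching flow described above (acquire shape data and scan range, generate scan regions along scan sections, place them, emit a program) can be sketched roughly as follows. All class and function names here are hypothetical illustrations, not part of the disclosed device; the sketch only makes the data flow concrete.

```python
from dataclasses import dataclass

@dataclass
class ScanRange:
    """2-D readable cross-section of the sensor (hypothetical fields)."""
    width: float   # readable width (m)
    depth: float   # readable depth (m)

@dataclass
class ScanRegion:
    """3-D region obtained by sweeping the scan range along a scan section."""
    start: float   # position along the weld line where scanning starts (m)
    end: float     # position where scanning ends (m)
    range_: ScanRange

def generate_region(scan_range, section):
    """Sweep the sensor's 2-D scan range along a scan section to get a 3-D region."""
    start, end = section
    return ScanRegion(start=start, end=end, range_=scan_range)

def create_teaching_program(regions):
    """Emit a simple command list: approach, scan, retract per placed region."""
    program = []
    for r in sorted(regions, key=lambda r: r.start):
        program.append(("APPROACH", r.start))
        program.append(("SCAN", r.start, r.end))
        program.append(("RETRACT", r.end))
    return program
```

For example, placing two regions on the workpiece and calling `create_teaching_program` yields six commands ordered along the weld line.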
- FIG. 1 shows an internal configuration example of an inspection control device, a robot control device, a host device, and an offline teaching device according to Embodiment 1;
- A diagram explaining an example of the effective scan range of the sensor
- A diagram explaining an example of the scan effective area of the sensor
- A diagram showing an example of a 3D model
- FIGS. 4A and 4B are diagrams for explaining an example of copy processing of the scan effective area according to the first embodiment;
- FIG. 11 is a diagram for explaining example 1 of processing for deleting a scan effective area according to the first embodiment;
- FIG. 10 is a diagram for explaining example 2 of processing for deleting a scan effective area according to the first embodiment;
- FIG. 11 is a diagram for explaining an example of scan effective area division processing according to the first embodiment;
- FIG. 5 is a diagram for explaining an example of various operations associated with scan effective areas according to Embodiment 1;
- FIG. 11 is a diagram for explaining an example of processing for changing the scan effective area and an example of processing for rotating the scan effective area according to the second embodiment;
- FIG. 11 is a diagram for explaining each of scan effective area change processing example 1, change processing example 2, and change processing example 3;
- A diagram showing an example of a model workpiece
- FIG. 11 is a diagram for explaining an example of copy processing of the scan effective area according to the second embodiment;
- FIG. 11 is a diagram for explaining an example of rotation processing of the scan effective area according to the second embodiment;
- FIG. 11 is a diagram for explaining an example of various operations associated with scan effective areas according to the second embodiment;
- A device configuration capable of constructing a virtual production facility using an off-line teaching device is conventionally known, as in Patent Document 1.
- Such an off-line teaching device simultaneously displays some position detection instructions corresponding to the motion trajectory of the welding robot and some welding instructions, making it easy for the operator to identify edit points when creating the teaching program, which can help improve the efficiency and accuracy of the created program.
- consider an offline teaching device that teaches scan locations in virtual space. By visualizing the range that can be scanned at a predetermined position in the horizontal direction (on the XY plane), the offline teaching device shows the operator both the taught scan location and the range that the sensor can scan, and thereby supports the teaching work required to execute the visual inspection.
- however, the operator had to actually perform the visual inspection using the created teaching program and then correct the teaching points based on the sensor scan results (visual inspection results), which required considerable work.
- the object to be welded (for example, metal) is defined as the "original workpiece", and the object produced (manufactured) by the main welding is defined as the "workpiece".
- the "workpiece" is not limited to a workpiece produced by a single main welding, and may be a composite workpiece produced by two or more rounds of main welding.
- the process of producing a workpiece by joining an original workpiece with another original workpiece using the welding robot is defined as "main welding".
- FIG. 1 is a schematic diagram showing a system configuration example of a welding system 100 according to Embodiment 1.
- the welding system 100 includes a host device 1 connected to an external storage ST, an input interface UI1, and a monitor MN1, a robot control device 2, an inspection control device 3, a sensor 4, an offline teaching device 5, a monitor MN3, an input device UI3, a welding robot MC1, and a monitor MN2.
- the sensor 4 is illustrated as a separate body from the welding robot MC1, but may be integrated with the welding robot MC1 (see FIG. 2).
- the monitor MN2 is not an essential component and may be omitted.
- the host device 1 comprehensively controls the start and completion of the main welding executed by the welding robot MC1 via the robot control device 2. For example, the host device 1 reads welding-related information previously input or set by a user (for example, a welding operator or a system administrator; the same applies hereinafter) from the external storage ST, generates an execution command for the main welding using the welding-related information, and transmits it to the corresponding robot control device 2. When the main welding by the welding robot MC1 is completed, the host device 1 receives from the robot control device 2 a main welding completion report indicating that the main welding by the welding robot MC1 has been completed, updates the status of the corresponding main welding to completed, and records it in the external storage ST.
- the execution command for the main welding described above is not limited to being generated by the host device 1.
- for example, the execution command may be generated by the operation panel (for example, a PLC: Programmable Logic Controller) of the equipment in the factory where the main welding is performed, or by the operation panel (for example, a teach pendant) of the robot control device 2.
- the teach pendant is a device for operating the welding robot MC1 connected to the robot control device 2.
- the host device 1 comprehensively controls the start and completion of the bead visual inspection using the robot control device 2, the inspection control device 3, and the sensor 4. For example, when the host device 1 receives a main welding completion report from the robot control device 2, it generates a bead visual inspection execution command for the workpiece produced by the welding robot MC1 and transmits it to the robot control device 2 and the inspection control device 3. When the bead visual inspection is completed, the host device 1 receives from the inspection control device 3 a visual inspection report indicating that the bead visual inspection is completed, updates the status of the corresponding bead visual inspection to completed, and records it in the external storage ST.
- the welding-related information is information indicating the details of the main welding performed by the welding robot MC1, and is created in advance for each main welding process and registered in the external storage ST.
- the welding-related information includes, for example, the number of original workpieces used in the main welding, the IDs of the original workpieces used in the main welding, lot information and names of the original workpieces, workpiece information including the welding locations (for example, weld line information and position information of the weld lines), the scheduled execution date of the main welding, the number of workpieces to be produced, and the various welding conditions for the main welding.
- the welding-related information is not limited to the items described above, and may further include already-created teaching programs for the welding operation and the scanning operation (see below), and information such as the welding operation setting information and scan operation setting information used to create these teaching programs.
- the welding conditions include, for example, the material and thickness of the original workpiece, the material and wire diameter of the welding wire 301, the type and flow rate of the shielding gas, the set average value of the welding current, the set average value of the welding voltage, the feed speed and feed amount of the welding wire 301, the number of welding passes, the welding time, and the like. In addition, for example, information indicating the type of the main welding (for example, TIG welding, MAG welding, or pulse welding) and the moving speed and moving time of the manipulator 200 may be included.
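As a rough illustration, the welding conditions listed above map naturally onto a structured record. The field names below are hypothetical, chosen only to mirror the enumerated items; the patent does not specify a data format.

```python
from dataclasses import dataclass

@dataclass
class WeldingConditions:
    """Illustrative container for the welding conditions enumerated above."""
    workpiece_material: str
    workpiece_thickness_mm: float
    wire_material: str
    wire_diameter_mm: float
    shielding_gas: str
    gas_flow_l_per_min: float
    avg_current_a: float
    avg_voltage_v: float
    wire_feed_speed_m_per_min: float
    welding_type: str = "MAG"   # e.g. "TIG", "MAG", "pulse"

    def validate(self):
        """Reject obviously unusable settings before they reach the robot."""
        assert self.wire_diameter_mm > 0 and self.workpiece_thickness_mm > 0
        assert self.avg_current_a > 0 and self.avg_voltage_v > 0
        return self
```

A record like this would be registered in the external storage ST per main welding process and read back by the host device 1 when generating an execution command.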
- based on the execution command for the main welding transmitted from the host device 1, the robot control device 2 causes the welding robot MC1 to start executing the main welding using the original workpieces specified by the execution command.
- the welding-related information described above is not limited to being managed by the host device 1 with reference to the external storage ST, and may be managed by the robot control device 2, for example. In this case, since the robot control device 2 can grasp when the main welding is completed, the actual execution date of the welding process may be managed instead of the scheduled execution date in the welding-related information.
- although the type of the main welding does not matter, a process of joining a plurality of original workpieces to produce one workpiece will be described as an example in order to make the explanation easier to understand.
- the host device 1 is connected to the monitor MN1, the input interface UI1, and the external storage ST so as to be able to input and output data, and is also connected to the robot control device 2 so as to be able to perform data communication. The host device 1 may be a terminal device P1 that integrally includes the monitor MN1 and the input interface UI1, and may further integrally include the external storage ST.
- the terminal device P1 is a PC (Personal Computer) used by the user prior to execution of the main welding.
- the terminal device P1 is not limited to the PC described above, and may be a computer device having a communication function such as a smart phone or a tablet terminal.
- the monitor MN1 may be configured using a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electroluminescence).
- the monitor MN1 may display a screen output from the host device 1, for example, showing a notification that the main welding is completed or a notification that the bead visual inspection is completed.
- a speaker (not shown) may be connected to the host device 1 in place of or together with the monitor MN1, and the above notifications may be output as audio via the speaker.
- the input interface UI1 is a user interface that detects a user's input operation and outputs it to the host device 1, and may be configured using a mouse, keyboard, touch panel, or the like, for example.
- the input interface UI1 receives, for example, an input operation when the user creates welding-related information, or an input operation for transmitting a command to execute the main welding to the robot control device 2.
- the external storage ST is configured using, for example, a hard disk drive or a solid state drive.
- the external storage ST stores, for example, welding-related information data created for each main welding, the status (production status) of the work Wk produced by the main welding, and work information of the work Wk (see above).
- the external storage ST may store the welding operation teaching program created by the offline teaching device 5 and the scanning operation teaching program for each welding line. Each of the teaching programs for the welding operation and the scanning operation will be described later.
- the robot control device 2 is connected so as to be capable of data communication with each of the host device 1, the inspection control device 3, and the offline teaching device 5, and is also connected so as to be capable of data communication with the welding robot MC1.
- when the robot control device 2 receives the main welding execution command transmitted from the host device 1, it creates a main welding program based on the welding operation teaching program corresponding to the execution command, and controls the welding robot MC1 to perform the main welding.
- when the robot control device 2 detects the completion of the main welding, it generates a main welding completion report indicating that the main welding is completed and notifies the host device 1. Thereby, the host device 1 can properly detect the completion of the main welding by the robot control device 2.
- the robot control device 2 may detect the completion of the main welding by, for example, determining it based on a signal indicating the completion of the main welding from a sensor (not shown) provided in the wire feeding device 300, or by any other known method; the method for detecting the completion of the main welding is not limited.
- the welding robot MC1 is connected to the robot control device 2 so that data communication is possible.
- the welding robot MC1 performs the main welding commanded by the host device 1 under the control of the corresponding robot control device 2.
- the welding robot MC1 moves the sensor 4 based on the scanning operation teaching program, thereby executing the bead appearance inspection commanded by the host device 1 .
- the inspection control device 3 is connected to enable data communication with each of the host device 1, the robot control device 2, the sensor 4, and the offline teaching device 5.
- when the inspection control device 3 receives the bead visual inspection execution command transmitted from the host device 1, it performs, with the sensor 4, a bead appearance inspection (for example, an inspection of whether or not the weld bead formed on the workpiece satisfies a predetermined welding standard) on the workpiece Wk produced by the welding robot MC1, following the scanning operation teaching program for the corresponding workpiece Wk.
- the inspection control device 3 performs the bead appearance inspection based on a comparison between the input data regarding the shape of the weld bead acquired by the sensor 4 as a result of the scanning operation (for example, point cloud data that can specify the three-dimensional shape of the weld bead) and master data of a non-defective workpiece predetermined for each workpiece.
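One simple way to implement such a comparison (a sketch only; the patent does not specify the algorithm) is to measure, for each scanned point, the distance to the nearest master-data point and fail the bead if too many points deviate beyond a tolerance:

```python
import math

def nearest_distance(p, master):
    """Distance from point p to the closest point in the master point cloud."""
    return min(math.dist(p, m) for m in master)

def inspect_bead(scan, master, tol=0.5, max_defect_ratio=0.05):
    """Return (passed, defect_ratio): the bead passes if at most
    max_defect_ratio of the scanned points deviate more than tol
    from the non-defective master data."""
    defects = sum(1 for p in scan if nearest_distance(p, master) > tol)
    ratio = defects / len(scan)
    return ratio <= max_defect_ratio, ratio
```

A production implementation would first align the scanned cloud to the master (e.g. by registration) and use a spatial index instead of the brute-force minimum, but the pass/fail criterion is the same idea.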
- the bead visual inspection performed by the welding robot MC1 in the first embodiment is not limited to the visual inspection of the weld bead, and may include the visual inspection of the weld bead together with other visual inspections (for example, the presence or absence of component mounting on the workpiece Wk).
- the operator can more efficiently utilize the scanning effective area of the sensor 4 and simultaneously perform appearance inspections having different purposes based on the appearance inspection results.
- the scan effective area referred to here indicates a three-dimensional area in which the sensor 4 can read the external shape by scanning.
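As an illustration, a scan effective area can be modeled as an axis-aligned box in the sensor coordinate frame; the representation below is hypothetical, chosen only to make the notion of a "three-dimensional area the sensor can read" concrete:

```python
from dataclasses import dataclass

@dataclass
class ScanEffectiveArea:
    """Axis-aligned 3-D box (in the sensor frame) that the sensor can read."""
    min_corner: tuple  # (x, y, z) lower bound
    max_corner: tuple  # (x, y, z) upper bound

    def contains(self, point):
        """True if the point lies inside the readable volume."""
        return all(lo <= v <= hi
                   for lo, v, hi in zip(self.min_corner, point, self.max_corner))

    def translated(self, offset):
        """Copy of the area shifted by offset — the kind of operation behind
        the copy processing that places the same area along the weld line."""
        return ScanEffectiveArea(
            tuple(a + d for a, d in zip(self.min_corner, offset)),
            tuple(a + d for a, d in zip(self.max_corner, offset)))
```

A `contains` test like this is what lets the teaching device check whether a weld bead on the 3D model falls inside a placed area.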
- the inspection control device 3 performs the bead visual inspection, generates a visual inspection report including the inspection judgment result and a notification that the bead visual inspection is completed, transmits it to the host device 1, and outputs it to the monitor MN2. When the inspection control device 3 determines that a defect has been detected in the bead visual inspection of the workpiece, it generates a visual inspection report including visual inspection results containing information on the defective section for repair welding of the defect, and transmits it to the host device 1 and the robot control device 2. In addition, when the inspection control device 3 determines that a defect has been detected by the bead visual inspection of the workpiece, it creates a repair welding program for performing corrections such as repairing the defective portion, using the visual inspection results including the information on the defective section. The inspection control device 3 associates this repair welding program with the visual inspection results and transmits them to the host device 1 or the robot control device 2.
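A minimal sketch of turning defective-section information into a repair welding program (names and command format are hypothetical) is interval merging followed by one re-weld command per merged section:

```python
def merge_sections(sections, gap=2.0):
    """Merge defective intervals (start, end) along the weld line that lie
    closer than `gap`, so nearby defects are repaired in a single pass."""
    merged = []
    for s, e in sorted(sections):
        if merged and s - merged[-1][1] <= gap:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

def repair_program(sections):
    """One REWELD command per merged defective section."""
    return [("REWELD", s, e) for s, e in merge_sections(sections)]
```

The real repair program would also carry welding conditions and torch poses for each section; only the sectioning logic is shown here.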
- the sensor 4 is connected so as to be capable of data communication with the inspection control device 3.
- the sensor 4 is attached to the welding robot MC1, and three-dimensionally scans the workpiece Wk or the stage STG (see FIG. 2) on which the workpiece Wk is placed, according to the driving of the manipulator 200 under the control of the robot control device 2.
- the sensor 4 acquires the three-dimensional shape data of the workpiece Wk placed on the stage STG, or three-dimensional shape data (for example, point cloud data) that can specify the shape, size, position, and the like of the weld bead, and transmits it to the inspection control device 3.
- the monitor MN2 may be configured using a display device such as LCD or organic EL.
- the monitor MN2 displays a screen output from the inspection control device 3, for example, a notification that the bead visual inspection has been completed, or a screen showing the notification and the result of the bead visual inspection.
- a speaker (not shown) may be connected to the inspection control device 3 in place of or together with the monitor MN2, and the notifications may be output as audio via the speaker.
- the offline teaching device 5 is connected to the robot control device 2, the inspection control device 3, the monitor MN3, and the input device UI3 so that they can communicate with each other.
- the offline teaching device 5 stores, as setting information, welding line position information for each workpiece Wk for which a teaching program is to be created or which has already been created.
- the offline teaching device 5 constructs virtual production equipment (for example, a virtual welding robot, a virtual workpiece, and a virtual stage), and creates a welding operation teaching program and a scanning operation teaching program for the workpiece Wk based on control commands and various data transmitted from the input device UI3, or on various data output from the robot control device 2 or the inspection control device 3 (for example, input data related to the shape of the weld bead or the workpiece Wk, 3D model data, and position information of the weld line).
- the offline teaching device 5 transmits the created welding operation teaching program and the created scanning operation teaching program to the robot control device 2.
- the created scanning operation teaching program may be sent to the inspection control device 3 as well as the robot control device 2 .
- the offline teaching device 5 also stores the created teaching program for the welding operation and the created teaching program for the scanning operation for each workpiece Wk.
- the weld line position information here is information indicating the position of the weld line formed on the workpiece Wk.
- the welding operation teaching program referred to here is a program that is created based on the weld line and causes the welding robot MC1 to perform the main welding.
- the welding operation teaching program includes information on the position, distance, and angle (posture) of the welding torch 400, as well as information such as the welding conditions.
- the scanning operation teaching program referred to here is a program that is created based on the weld line and causes the welding robot MC1 to perform an appearance inspection of the workpiece Wk or of at least one weld bead created by the main welding.
- the scanning operation teaching program includes information on the position, distance, and angle (orientation) of the sensor 4 for performing the various operations (for example, approach, retract, avoidance, scan, etc.) required for the visual inspection of the created weld bead, the workpiece Wk, and the like using the sensor 4.
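As a concrete, purely illustrative sketch of the data such a teaching program carries, the position/distance/angle steps and welding conditions described above could be modeled as follows; all class and field names here are hypothetical and not part of the patent:

```python
from dataclasses import dataclass, field

@dataclass
class TeachStep:
    """One taught point: where the torch or sensor is placed and how it is oriented."""
    position: tuple[float, float, float]  # x, y, z in the workpiece coordinate system
    angle: tuple[float, float, float]     # posture (e.g. roll, pitch, yaw) of the tool
    distance: float                       # stand-off distance from the workpiece surface
    operation: str = "scan"               # e.g. "approach", "retract", "avoidance", "scan", "weld"

@dataclass
class TeachingProgram:
    """A welding- or scanning-operation teaching program for one workpiece."""
    workpiece_id: str
    kind: str                             # "welding" or "scanning"
    steps: list[TeachStep] = field(default_factory=list)
    welding_conditions: dict = field(default_factory=dict)  # current, voltage, speed, ...

# Example: a two-step scanning program along a weld line
program = TeachingProgram(
    workpiece_id="Wk-001",
    kind="scanning",
    steps=[
        TeachStep((0.0, 0.0, 50.0), (0.0, 45.0, 0.0), 50.0, "approach"),
        TeachStep((100.0, 0.0, 50.0), (0.0, 45.0, 0.0), 50.0, "scan"),
    ],
)
```

A welding-operation program would carry the same kind of step list, with `welding_conditions` populated instead of left empty.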
- the monitor MN3 may be configured using a display device such as LCD or organic EL.
- the monitor MN3 displays images of the virtual production equipment (for example, a virtual welding robot, a virtual workpiece, a virtual stage, etc.) transmitted from the offline teaching device 5, the motion trajectory of the welding torch 400 based on the welding operation teaching program, the motion trajectory of the sensor 4 based on the scanning operation teaching program, and the like.
- the monitor MN3 also displays an image in which the motion trajectory of the sensor 4, the motion trajectory of the welding torch 400, or the like is superimposed on the image of the virtual production facility transmitted from the offline teaching device 5.
- the input device UI3 is a user interface that detects a user's input operation and outputs it to the offline teaching device 5, and may be configured using, for example, a mouse, keyboard, or touch panel.
- the input device UI3 receives input of the position information of the welding line of the workpiece Wk, welding setting information, scan setting information, 3D models, and the like used for creating the scanning operation and welding operation teaching programs, as well as input operations on scanning operation and welding operation teaching programs that have already been created.
- the monitor MN3 and the input device UI3 may be a terminal device P3 (for example, a PC, a notebook PC, a tablet terminal, etc.) configured integrally.
- FIG. 2 is a diagram showing an internal configuration example of the inspection control device 3, the robot control device 2, the host device 1, and the offline teaching device 5 according to the first embodiment.
- the monitors MN1 and MN2 and the input interface UI1 are omitted from FIG.
- a work Wk shown in FIG. 2 is a work to be subjected to the bead appearance inspection. This work Wk may be a work produced by main welding, or a so-called repair work that has been repaired one or more times by repair welding.
- the welding robot MC1 shown in FIG. 2 is configured to include the sensor 4, but the sensor 4 may be attached to another robot (for example, an inspection robot for performing visual inspection, a repair welding robot for performing repair welding, etc.).
- under the control of the robot control device 2, the welding robot MC1 executes the main welding process based on the welding operation teaching program using the welding torch 400, the bead visual inspection process based on the scanning operation teaching program using the sensor 4, and the like.
- the welding robot MC1 may also scan the appearance of the workpiece Wk using the sensor 4 to acquire the external shape of the workpiece Wk and the position information of the weld bead formed on the workpiece Wk, which are used to create the welding operation and scanning operation teaching programs.
- the welding robot MC1 performs arc welding, for example, in the main welding process. However, the welding robot MC1 may perform welding other than arc welding (for example, laser welding and gas welding).
- the welding robot MC1 includes at least a manipulator 200, a wire feeding device 300, a welding wire 301, and a welding torch 400.
- the manipulator 200 has articulated arms, and moves each arm based on control signals from the robot control unit 24 of the robot control device 2.
- manipulator 200 can change the positional relationship between work Wk and welding torch 400 (for example, the angle of welding torch 400 with respect to work Wk) and the positional relationship between work Wk and sensor 4 by driving the arm.
- the wire feeding device 300 controls the feeding speed of the welding wire 301 based on the control signal from the robot control device 2.
- Wire feeding device 300 may include a sensor (not shown) capable of detecting the remaining amount of welding wire 301 .
- the robot control device 2 can detect completion of the main welding process based on the output of this sensor.
- Welding wire 301 is held by welding torch 400 .
- Electric power is supplied to welding torch 400 from power supply device 500, whereby an arc is generated between the tip of welding wire 301 and work Wk, and arc welding is performed.
- illustration and explanation of the configuration for supplying the shielding gas to the welding torch 400 are omitted.
- the host device 1 uses the welding-related information input or set in advance by the user to generate execution commands for various processes such as main welding or bead visual inspection, and transmits them to the robot control device 2.
- since the sensor 4 is integrally attached to the welding robot MC1, the bead visual inspection execution command is sent to both the robot control device 2 and the inspection control device 3.
- the host device 1 has a configuration including at least a communication unit 10 , a processor 11 and a memory 12 .
- the communication unit 10 is connected to enable data communication with each of the robot control device 2 and the external storage ST.
- the communication unit 10 transmits to the robot control device 2 an execution command for various processes of final welding or bead visual inspection generated by the processor 11 .
- the communication unit 10 receives a main welding completion report and a visual inspection report transmitted from the robot control device 2 and outputs them to the processor 11 .
- the main welding execution command may include, for example, a control signal for controlling each of the manipulator 200, the wire feeding device 300, and the power supply device 500 provided in the welding robot MC1.
- the processor 11 is configured using, for example, a CPU (Central Processing Unit) or FPGA (Field Programmable Gate Array), and cooperates with the memory 12 to perform various types of processing and control. Specifically, the processor 11 functionally implements the cell control unit 13 by referring to the program held in the memory 12 and executing the program.
- the memory 12 has, for example, a RAM (Random Access Memory) as a work memory that is used when executing the processing of the processor 11, and a ROM (Read Only Memory) that stores a program that defines the processing of the processor 11. Data generated or acquired by the processor 11 is temporarily stored in the RAM. A program that defines the processing of the processor 11 is written in the ROM.
- the memory 12 also stores welding-related information data read from the external storage ST, the status of the workpiece Wk, and the workpiece information (described later) of the workpiece Wk transmitted from the robot control device 2.
- based on the welding-related information stored in the external storage ST, the cell control unit 13 generates an execution command for performing main welding, bead visual inspection of the workpiece Wk, appearance scan of the workpiece Wk, or repair welding.
- based on the welding-related information stored in the external storage ST and the welding operation and scanning operation teaching programs created by the offline teaching device 5 and transmitted from the robot control device 2, the cell control unit 13 creates a main welding program for main welding, a visual inspection program for driving the welding robot MC1 for the bead visual inspection of the workpiece Wk, or an appearance scanning program for driving the welding robot MC1 for the appearance scan. Further, the cell control unit 13 creates execution commands for each of these created programs.
- note that the visual inspection program or the appearance scanning program may be created in advance for each workpiece Wk and stored in the external storage ST, in which case the cell control unit 13 reads and acquires the program from the external storage ST.
- the cell control unit 13 may generate a different execution command for each of various processes of final welding performed by the welding robot MC1.
- the execution commands for main welding, appearance inspection, and appearance scanning generated by the cell control unit 13 are transmitted via the communication unit 10 to the corresponding robot control device 2, or to each of the robot control device 2 and the inspection control device 3.
- the robot control device 2 refers to the corresponding program based on the execution command for final welding, bead visual inspection, or visual scanning sent from the host device 1.
- the robot controller 2 controls the welding robot MC1 (eg, sensor 4, manipulator 200, wire feeder 300, power supply 500) based on the referenced program.
- the robot control device 2 has a configuration including at least a communication unit 20 , a processor 21 and a memory 22 .
- the communication unit 20 is connected to enable data communication with the host device 1, the inspection control device 3, the welding robot MC1, and the offline teaching device 5, respectively. Although the illustration is simplified in FIG. 2 , there are connections between the robot control unit 24 and the manipulator 200 , between the robot control unit 24 and the wire feeding device 300 , and between the power control unit 25 and the power supply device 500 . Data is transmitted and received between them via the communication unit 20 .
- the communication unit 20 receives an execution command for final welding or bead visual inspection transmitted from the host device 1 .
- the communication unit 20 receives the welding line position information, the welding operation teaching program, and the scanning operation teaching program transmitted from the offline teaching device 5 .
- the communication unit 20 transmits work information of the work Wk produced by the final welding to the host device 1 .
- the work information includes not only the ID of the work Wk, but also the ID, name, welding location, and welding conditions of the original work used for final welding.
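The workpiece information fields listed above (the workpiece ID plus the ID, name, welding location, and welding conditions of the original workpiece) can be pictured as a simple record; the field names below are assumptions made for illustration only:

```python
from dataclasses import dataclass

@dataclass
class WorkInfo:
    """Work information reported to the host device after main welding (hypothetical field names)."""
    work_id: str
    original_work_id: str    # ID of the original workpiece used for the main welding
    original_work_name: str
    welding_location: str    # e.g. a weld line or joint identifier
    welding_conditions: dict # current, voltage, speed, etc.

info = WorkInfo("Wk-001", "OW-123", "bracket-A", "line-3",
                {"current_A": 180, "voltage_V": 22, "speed_mm_s": 8})
```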
- the processor 21 is configured using, for example, a CPU or FPGA, and cooperates with the memory 22 to perform various types of processing and control. Specifically, the processor 21 refers to the program held in the memory 22 and executes it, thereby functionally realizing the main welding program creation unit 23, the robot control unit 24, and the power supply control unit 25. Further, the processor 21 calculates the parameters for controlling the welding robot MC1 (specifically, each of the manipulator 200, the wire feeding device 300, and the power supply device 500).
- the memory 22 has, for example, a RAM as a work memory that is used when executing the processing of the processor 21, and a ROM that stores a program that defines the processing of the processor 21. Data generated or acquired by the processor 21 is temporarily stored in the RAM. A program that defines the processing of the processor 21 is written in the ROM.
- the memory 22 stores welding-related information in which the execution command data for main welding or bead visual inspection transmitted from the host device 1, the workpiece information of the workpiece Wk produced by the main welding, and the position information of the weld line are associated with each other.
- the welding-related information of the workpiece Wk for which the welding operation and scanning operation teaching programs have been transmitted from the offline teaching device 5 may include those teaching programs, as well as the welding line position information, welding operation setting information, and scanning operation setting information used to create each teaching program.
- based on the main welding execution command transmitted from the host device 1 via the communication unit 20, the main welding program creation unit 23 creates the main welding program to be executed by the welding robot MC1, using the workpiece information of the workpiece Wk (for example, workpiece ID, name, workpiece coordinate system, original workpiece information, welding line position information, etc.) and the welding operation teaching program associated with this workpiece information.
- the main welding program may include various parameters for controlling the power supply device 500, the manipulator 200, the wire feeding device 300, the welding torch 400, and the like during execution of the main welding, such as the welding current, welding voltage, offset amount, welding speed, and attitude of the welding torch 400. Note that this main welding program may be stored in the processor 21 or in the RAM in the memory 22.
- the robot control unit 24 generates control signals for driving the welding robot MC1 (specifically, each of the sensor 4, the manipulator 200, the wire feeding device 300, and the power supply device 500) based on the main welding program created by the main welding program creation unit 23, and transmits the generated control signals to the welding robot MC1.
- in the bead visual inspection, the robot control unit 24 drives the manipulator 200 and the sensor 4 of the welding robot MC1 based on the visual inspection program (created using the scanning operation teaching program).
- the sensor 4 attached to the welding robot MC1 moves along with the operation of the welding robot MC1 and, by scanning the weld bead of the workpiece Wk, can acquire input data related to the shape of the weld bead (for example, point cloud data that can specify the three-dimensional shape of the weld bead), or, by partially scanning the workpiece Wk, can acquire input data related to the partial shape of the workpiece Wk corresponding to another appearance inspection location (for example, point cloud data that can specify the three-dimensional shape of the workpiece Wk).
- the power control unit 25 drives the power supply device 500 based on the calculation result of the main welding program generated by the main welding program creation unit 23 .
- based on the bead visual inspection execution command transmitted from the host device 1, the inspection control device 3 controls the bead visual inspection process on the workpiece Wk produced by the main welding by the welding robot MC1 or on the workpiece Wk repaired by one or more repair weldings.
- the bead appearance inspection is, for example, an inspection of whether or not the weld bead formed on the workpiece Wk satisfies a predetermined welding standard (for example, the welding quality standard required by each user).
- the inspection control device 3 inspects the appearance shape of the weld bead formed on the workpiece Wk based on the input data regarding the shape of the weld bead acquired by the sensor 4 (for example, point cloud data that can specify the three-dimensional shape of the weld bead).
- the inspection control device 3 transmits input data relating to the weld bead acquired by the sensor 4 or the shape of the workpiece Wk to the offline teaching device 5 .
- the inspection control device 3 includes at least a communication unit 30 , a processor 31 , a memory 32 and an inspection result storage unit 33 .
- the communication unit 30 is connected to each of the host device 1, the robot control device 2, the sensor 4, and the offline teaching device 5 so that data communication is possible. Although the illustration is simplified in FIG. 2, data is transmitted and received between the shape detection control section 35 and the sensor 4 via the communication section 30, respectively.
- the communication unit 30 receives a bead visual inspection execution command transmitted from the host device 1 .
- the communication unit 30 transmits the inspection determination result of the bead appearance inspection using the sensor 4 to the host device 1, and transmits the three-dimensional shape data of the weld bead acquired by the sensor 4 to the offline teaching device 5.
- the processor 31 is configured using, for example, a CPU or FPGA, and cooperates with the memory 32 to perform various types of processing and control. Specifically, the processor 31 refers to the program held in the memory 32 and executes it, thereby functionally realizing the determination threshold storage unit 34, the shape detection control unit 35, the data processing unit 36, the inspection result determination unit 37, and the repair welding program creation unit 38.
- the memory 32 has, for example, a RAM as a work memory that is used when executing the processing of the processor 31, and a ROM that stores a program that defines the processing of the processor 31. Data generated or acquired by the processor 31 is temporarily stored in the RAM. A program that defines the processing of the processor 31 is written in the ROM. Further, the memory 32 may store the scan operation teaching program transmitted from the offline teaching device 5 and the work information in association with each other.
- the inspection result storage unit 33 is configured using, for example, a hard disk or solid state drive.
- the inspection result storage unit 33 stores, as an example of the data generated or acquired by the processor 31, data indicating the result of the bead visual inspection of the welded portion of the work Wk (for example, work or repair work).
- the data indicating the result of this bead appearance inspection is generated by, for example, the inspection result determination unit 37 (specifically, the first inspection determination unit 371 and the second inspection determination unit 372 to the N-th inspection determination unit 37N included in the inspection result determination unit 37).
- the determination threshold storage unit 34 is configured by, for example, a cache memory provided in the processor 31, and stores information, set in advance by user operation, on each threshold value (for example, each threshold value set for each type of welding defect) corresponding to the welding location and to each bead appearance inspection process of the first inspection determination unit 371 and the second inspection determination unit 372 to the N-th inspection determination unit 37N.
- the respective thresholds are, for example, the allowable range of positional deviation of the weld bead, the thresholds for the length, height, and width of the weld bead, and the thresholds for perforations, pits, undercuts, and spatter.
- the determination threshold storage unit 34 can store, as each threshold for the bead visual inspection after repair welding, an allowable range (for example, a minimum allowable value, a maximum allowable value, etc.) that satisfies the minimum welding standard (quality) required by a customer or the like. Note that these thresholds are used in the process of determining whether the inspection results generated by the first inspection determination unit 371 and the second inspection determination unit 372 to the N-th inspection determination unit 37N included in the inspection result determination unit 37 pass the bead visual inspection. Furthermore, the determination threshold storage unit 34 may store an upper limit on the number of bead appearance inspections for each welding location.
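A minimal sketch of how per-feature thresholds like those held by the determination threshold storage unit 34 might be applied, assuming each threshold is an allowable (minimum, maximum) range; the feature names and numeric values below are invented for illustration:

```python
# Hypothetical per-feature allowable ranges (minimum, maximum) for one welding location.
THRESHOLDS = {
    "bead_length_mm":  (95.0, 105.0),
    "bead_height_mm":  (1.0, 3.0),
    "bead_width_mm":   (4.0, 8.0),
    "misalignment_mm": (0.0, 0.5),
}

def passes_inspection(measured: dict) -> bool:
    """Pass only if every measured feature lies within its allowable range."""
    return all(lo <= measured[name] <= hi
               for name, (lo, hi) in THRESHOLDS.items())

ok = passes_inspection({"bead_length_mm": 100.2, "bead_height_mm": 2.1,
                        "bead_width_mm": 6.0, "misalignment_mm": 0.2})
bad = passes_inspection({"bead_length_mm": 100.2, "bead_height_mm": 3.8,
                         "bead_width_mm": 6.0, "misalignment_mm": 0.2})
```

In the second call, the bead height 3.8 mm exceeds its allowable maximum, so the inspection fails.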
- when the number of repairs of a defective portion by repair welding exceeds the predetermined upper limit, the inspection control device 3 determines that it is difficult or impossible to repair the defective portion by automatic repair welding by the welding robot MC1, which suppresses a decrease in the operating rate of the welding system 100.
- based on the bead visual inspection execution command for the welded portion of the workpiece Wk (for example, workpiece or repair workpiece) transmitted from the host device 1, the shape detection control unit 35 acquires the input data related to the shape of the weld bead (for example, point cloud data that can specify the three-dimensional shape of the weld bead) acquired and transmitted by the sensor 4.
- the shape detection control unit 35 also acquires the input data related to the shape of the workpiece Wk (for example, point cloud data that can identify the three-dimensional shape of the workpiece Wk) acquired and transmitted by the sensor 4.
- in accordance with the driving of the manipulator 200 by the robot control device 2 described above, when the sensor 4 reaches a position where it can image the weld bead or the workpiece Wk (in other words, can detect the three-dimensional shape of the welding location or the workpiece Wk), the shape detection control unit 35 causes the sensor 4 to emit a laser beam and acquires input data regarding the shape of the weld bead or the workpiece Wk.
- the shape detection control section 35 passes the input data to the data processing section 36 .
- when the data processing unit 36 acquires the input data (see above) regarding the shape of the weld bead from the shape detection control unit 35, it converts the data into a data format suitable for the first inspection determination to the N-th inspection determination in the inspection result determination unit 37.
- the conversion of the data format may include, as so-called preprocessing, correction processing that removes unnecessary point cloud data (for example, noise) contained in the input data (that is, point cloud data); the preprocessing described above may also be omitted.
- the data processing unit 36 generates image data representing the three-dimensional shape of the weld bead as a data format suitable for the first inspection determination, for example by performing statistical processing on the input shape data.
- the data processing unit 36 may perform edge enhancement correction that emphasizes the peripheral portion of the weld bead in order to emphasize the position and shape of the weld bead as the data for the first inspection determination.
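The noise-removal preprocessing mentioned above could, in the simplest case, be a statistical outlier filter over the point cloud heights. The sketch below assumes points are (x, y, z) tuples and uses a global mean/standard-deviation cutoff, which is far cruder than a production point-cloud filter:

```python
from statistics import mean, stdev

def remove_outliers(points, k=2.0):
    """Drop points whose height (z) deviates more than k standard deviations
    from the mean height -- a crude stand-in for point-cloud noise removal."""
    zs = [z for _, _, z in points]
    mu, sigma = mean(zs), stdev(zs)
    return [p for p in points if abs(p[2] - mu) <= k * sigma]

# A flat 2 mm-high profile with one 50 mm noise spike
cloud = [(float(x), 0.0, 2.0) for x in range(20)] + [(3.0, 0.0, 50.0)]
clean = remove_outliers(cloud)
```

Only the spike is removed; the 20 genuine surface points survive the filter.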
- the data processing unit 36 counts the number of times the bead appearance inspection has been performed for each defective welding location, and may determine that correction of the defective welding portion by automatic repair welding is difficult or impossible when the welding inspection result does not improve and the number of bead appearance inspections exceeds the number previously stored in the memory 32.
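The per-location inspection counter described here can be sketched as follows, with the upper limit (stored in the memory 32 in the patent) represented by a hypothetical constant:

```python
from collections import Counter

MAX_INSPECTIONS = 3  # hypothetical per-location upper limit

attempts = Counter()

def record_inspection(location: str) -> bool:
    """Count one bead appearance inspection for a defective location and report
    whether automatic repair welding should still be attempted."""
    attempts[location] += 1
    return attempts[location] <= MAX_INSPECTIONS

# Four consecutive failed inspections of the same location
results = [record_inspection("joint-7") for _ in range(4)]
```

After the limit is exceeded, `record_inspection` returns False, which is where the system would hand the location over for manual repair.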
- in this case, the inspection result determination unit 37 generates an alert screen including the position of the defective welding location and the type of the welding defect (for example, perforation, pit, undercut, spatter, protrusion), and transmits the generated alert screen to the host device 1 via the communication unit 30.
- the alert screen sent to the host device 1 is displayed on the monitor MN1. This alert screen may be displayed on the monitor MN2.
- the data processing unit 36 performs the bead visual inspection based on a comparison between the input data regarding the shape of the weld bead acquired by the sensor 4 and the master data of a non-defective workpiece predetermined for each workpiece, using the thresholds for the bead appearance inspection stored in the determination threshold storage unit 34.
- the data processing unit 36 creates a visual inspection report including the defect determination results as inspection determination results (that is, information indicating the presence or absence of defects that require repair welding) and information on the defect section for each defect location, stores it in the inspection result storage unit 33, and transmits it to the host device 1 or the robot control device 2 via the communication unit 30.
- when the data processing unit 36 determines that there is no defective portion requiring repair welding in the workpiece Wk to be inspected, it creates a visual inspection report including an inspection determination result indicating that the bead visual inspection has passed, stores it in the inspection result storage unit 33, and transmits it to the host device 1 via the communication unit 30.
- when the data processing unit 36 acquires the input data (see above) regarding the shape of the workpiece Wk from the shape detection control unit 35, it converts the data into a data format suitable for the arithmetic processing executed by the offline teaching device 5.
- the conversion of the data format may include, as so-called preprocessing, correction processing that removes unnecessary point cloud data (for example, noise) included in the input data (that is, point cloud data), or may be a process of generating a 3D model of the workpiece Wk.
- the data processing unit 36 may perform edge enhancement correction that emphasizes the peripheral portion of the workpiece Wk in order to emphasize the position and shape of the workpiece Wk.
- the data processing unit 36 transmits the input data regarding the shape of the workpiece Wk after conversion to the offline teaching device 5 via the communication unit 30 .
- the first inspection determination unit 371 performs the first inspection determination (that is, a bead appearance inspection based on a comparison between the input data regarding the shape of the weld bead acquired by the sensor 4 and the master data of a non-defective workpiece predetermined for each workpiece) to inspect the shape reliability of the weld bead (for example, whether it follows a straight or curved weld line), bead chipping, and bead misalignment.
- in this case, the first inspection determination unit 371 compares the data converted by the data processing unit 36 for the first inspection determination (for example, image data generated based on the point cloud data) with the master data of the non-defective workpiece (so-called image processing).
- the first inspection determination unit 371 can inspect the weld bead shape reliability, bead chipping, and bead positional deviation with high accuracy.
- the first inspection determination unit 371 calculates an inspection score indicating the shape reliability of the weld bead, bead chipping, and bead misalignment, and creates the calculated value of the inspection score as the first inspection result. Further, the first inspection determination unit 371 compares the created first inspection result with the threshold value for the first inspection result stored in the memory 32 .
- the first inspection determination unit 371 outputs the first inspection result, including information on the comparison result (that is, whether the acquired first inspection result passes or fails the bead appearance inspection), to the comprehensive determination unit 370 or to the second inspection determination unit 372 to the N-th inspection determination unit 37N.
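As an illustration only, a first-inspection-style comparison against master data might reduce to computing a deviation-based score and comparing it with a stored threshold; the scoring formula and the 0.8 threshold below are invented for the sketch, not the patent's method:

```python
def first_inspection_score(profile, master):
    """Hypothetical shape-reliability score: 1.0 when the scanned bead profile
    matches the master profile exactly, approaching 0.0 as deviation grows."""
    err = sum(abs(a - b) for a, b in zip(profile, master)) / len(master)
    return 1.0 / (1.0 + err)

def first_inspection_result(profile, master, threshold=0.8):
    """Compare the computed score against a stored pass/fail threshold."""
    score = first_inspection_score(profile, master)
    return {"score": score, "passed": score >= threshold}

# A bead profile that deviates only slightly from the master data
good = first_inspection_result([2.0, 2.1, 2.0], [2.0, 2.0, 2.0])
```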
- each of the second inspection determination unit 372 to the N-th inspection determination unit 37N performs a bead appearance inspection that determines the presence or absence of welding defects based on AI, targeting the input data related to the shape of the weld bead acquired by the sensor 4 or the input data after preprocessing by the data processing unit 36, and inspects the weld bead for perforations, pits, undercuts, spatters, and protrusions. Weld bead perforations, pits, undercuts, spatters, and protrusions are listed merely as examples, and the defect types inspected by the second inspection determination unit 372 to the N-th inspection determination unit 37N are not limited to these.
- when each of the second inspection determination unit 372 to the N-th inspection determination unit 37N determines that the corresponding type of welding defect has been detected, it specifies the position of the weld bead where the welding defect was detected.
- each of the second inspection determination unit 372 to the N-th inspection determination unit 37N uses a learning model (AI) obtained in advance by learning processing for each type of welding defect or for each group of welding defect types to determine the presence or absence of welding defects.
- note that the second inspection determination unit 372 to the N-th inspection determination unit 37N do not perform the inspections of weld bead shape reliability, bead chipping, and bead positional deviation that are performed by the first inspection determination unit 371.
- the second inspection determination unit 372 to the N-th inspection determination unit 37N calculate inspection results (in other words, inspection scores indicating the probability of occurrence) for perforations, pits, undercuts, spatters, and protrusions of the weld bead, and create the calculated inspection score values as the second inspection determination result.
- the inspection result determination unit 37 may determine, based on the inspection scores included in the first inspection result or the second inspection result, whether repair welding by the welding robot MC1 is possible (in other words, whether repair welding by the welding robot MC1 or manual repair welding is preferable), and may include the result of that determination in the visual inspection report described above and output it.
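One way to picture how such a unit could combine the first inspection result with per-defect-type scores into a pass/fail verdict and a repair-route recommendation is sketched below; the 0.5 pass cutoff and 0.7 repairability threshold are arbitrary illustrative values, not figures from the patent:

```python
def overall_judgement(first_result, second_scores, repairable_threshold=0.7):
    """Combine the first inspection result with per-defect-type AI scores
    (probability of occurrence) into a pass/fail and a repair recommendation."""
    worst = max(second_scores.values(), default=0.0)
    passed = first_result["passed"] and worst < 0.5
    # A high defect probability suggests manual rather than automatic repair.
    repair = None if passed else (
        "automatic" if worst < repairable_threshold else "manual")
    return {"passed": passed, "repair": repair}

verdict = overall_judgement(
    {"passed": True},
    {"pit": 0.1, "undercut": 0.6, "spatter": 0.05},
)
```

Here the undercut score fails the inspection but stays below the repairability threshold, so automatic repair welding is recommended.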
- the repair welding program creation unit 38 creates a repair welding program for the work Wk to be executed by the welding robot MC1, using the appearance inspection report of the work Wk by the data processing unit 36.
- the repair welding program may include various parameters for controlling the power supply device 500, the manipulator 200, the wire feeding device 300, the welding torch 400, and the like during execution of the repair welding, such as the welding current, welding voltage, offset amount, welding speed, and attitude of the welding torch 400. Note that the generated repair welding program may be stored in the processor 31, may be stored in the RAM in the memory 32, or may be associated with the visual inspection report and transmitted to the host device 1 or the robot control device 2 via the communication unit 30.
- the repair welding program creation unit 38 uses the visual inspection report of the workpiece Wk (for example, workpiece or repair workpiece) produced by the inspection result determination unit 37 and the workpiece information (for example, information such as coordinates indicating the position of the detected defective welding of the workpiece or repair workpiece) to create a repair welding program for the workpiece Wk (for example, workpiece or repair workpiece) to be executed by the welding robot MC1.
- the repair welding program may include various parameters for controlling the power supply device 500, the manipulator 200, the wire feeding device 300, the welding torch 400, and the like during execution of the repair welding, such as the welding current, welding voltage, offset amount, welding speed, and attitude of the welding torch 400.
- the generated repair welding program may be stored in processor 31 or may be stored in RAM in memory 32 .
- the sensor 4 is, for example, a three-dimensional shape sensor, is attached to the tip of the welding robot MC1, and acquires a plurality of point cloud data that can identify the shape of the workpiece Wk or the welding location on the workpiece Wk. Based on the obtained point cloud data, the sensor 4 generates point cloud data that can identify the three-dimensional shape of the welded portion, and transmits the generated point cloud data to the inspection control device 3 .
- the sensor 4 is not attached to the tip of the welding robot MC1 and is arranged separately from the welding robot MC1, the position information of the workpiece Wk or the welding point transmitted from the inspection control device 3 is used.
- in this case, the sensor 4 may include a laser light source (not shown) configured to scan the workpiece Wk or the welding point on the workpiece Wk (for example, a workpiece or a repaired workpiece), and a camera (not shown) arranged so as to image an imaging area including the workpiece Wk or the periphery of the welding point, which captures the trajectory of the laser light reflected from the workpiece Wk or the welded portion (that is, the shape line of the welded portion).
- the sensor 4 transmits to the inspection control device 3 shape data of the work Wk or the welded portion (in other words, image data of the work Wk or the weld bead) based on the laser light imaged by the camera.
- the camera described above includes at least a lens (not shown) and an image sensor (not shown).
- the image sensor is a solid-state imaging device such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor), and converts an optical image formed on an imaging surface into an electrical signal.
- CCD Charge Coupled Device
- CMOS Complementary Metal Oxide Semiconductor
- the offline teaching device 5 is connected to the robot control device 2, the inspection control device 3, the monitor MN3, and the input device UI3 so that they can communicate with each other.
- the offline teaching device 5 creates a teaching program for the welding operation and a teaching program for the scanning operation of the workpiece Wk based on various data transmitted from the input device UI3, such as welding line position information, welding operation setting information, and scan operation setting information.
- the offline teaching device 5 includes a communication section 50 , a processor 51 , a memory 52 and an input/output section 53 .
- the offline teaching device 5 in Embodiment 1 is described using an example in which teaching programs for both the welding operation and the scanning operation are created; however, creation of the welding operation teaching program is not essential and may be omitted.
- it is sufficient that the offline teaching device 5 can create a scanning operation teaching program for a robot that is provided with the sensor 4 and can execute a scanning operation (that is, a bead visual inspection) with the sensor 4.
- the communication unit 50 is connected to enable data communication with the robot control device 2, the inspection control device 3, the input device UI3, and the monitor MN3.
- the communication unit 50 associates the created teaching programs for the welding operation and the scanning operation with the various data used to create them (for example, welding line position information, welding operation setting information, scan operation setting information, workpiece information of the workpiece Wk, etc.) and transmits them to the robot control device 2.
- the processor 51 is configured using, for example, a CPU or FPGA, and cooperates with the memory 52 to perform various types of processing and control. Specifically, the processor 51 functionally implements the 3D calculation unit 54 and the program creation unit 55 by referring to the program held in the memory 52 and executing the program.
- the memory 52 has, for example, a RAM as a work memory that is used when executing the processing of the processor 51, and a ROM that stores a program that defines the processing of the processor 51. Data generated or acquired by the processor 51 is temporarily stored in the RAM. A program that defines the processing of the processor 51 is written in the ROM. In addition, the memory 52 stores the welding operation teaching program created by the program creating unit 55, the scanning operation teaching program, and the workpiece information in association with each other.
- an input/output unit 53, which is an example of an input unit and an acquisition unit, acquires the execution command, the 3D model of the workpiece Wk, the welding operation setting information, and the scanning operation setting information transmitted from the input device UI3, as well as the position information of the welding line transmitted from the robot control device 2, the inspection control device 3, or the input device UI3, and outputs them to the processor 51.
- the input/output unit 53 also transmits to the monitor MN3 images of virtual production equipment (for example, virtual welding robots, virtual workpieces, virtual stages, etc.) generated by the 3D computing unit 54, and images in which the motion trajectory of the sensor 4 or of the welding torch 400 is superimposed on the image of the virtual production equipment.
- the 3D calculation unit 54, as an example of a generation unit, is configured to virtually construct the production equipment based on, for example, input data regarding the shape of the workpiece Wk or the weld bead (that is, three-dimensional shape data), data of the 3D model of the workpiece Wk, workpiece information of the workpiece Wk, and information on the production equipment (for example, position information of the stage STG, and robot information or position information of the welding robot MC1).
- the 3D computing unit 54 converts the data of the virtually configured production equipment into image data, outputs the image data to the input/output unit 53, and displays it on the monitor MN3.
- the 3D computing unit 54 also generates image data by virtually superimposing on the production equipment one or more teaching points included in the welding operation teaching program created by the program creation unit 55 and the operation trajectory of the welding torch 400 (specifically, the idle running section, the welding section, etc.).
- similarly, the 3D computing unit 54 acquires one or more teaching points included in the scanning operation teaching program created by the program creation unit 55, and generates image data by virtually superimposing on the production equipment the motion trajectory of the sensor 4 (specifically, operation trajectories indicating its various operations, the idle running section, the scanning section, etc.).
- the 3D computing unit 54 converts the data of the virtual production equipment on which the data included in the various teaching programs are superimposed into image data, outputs the image data to the input/output unit 53, and displays it on the monitor MN3. Note that the 3D computing unit 54 may, based on the teaching programs for the welding operation and the scanning operation, collectively superimpose the teaching points for the welding operation and the scanning operation and the operation trajectories of the welding torch 400 and the sensor 4 (specifically, the idle running section, the welding section, the scanning section, etc.) on the virtual production equipment to generate image data.
- a program creation unit 55, as an example of a control unit, creates a teaching program for the welding operation and a teaching program for the scanning operation based on the welding line position information (for example, 3D model data of the workpiece Wk, input data related to the shape of the workpiece Wk or the weld bead, and coordinate information of the start and end points of each welding line), the welding operation setting information, and the scanning operation setting information.
- the program generator 55 includes a welding motion generator 551 and a scan motion generator 552 .
- the welding operation creation unit 551 creates a welding operation teaching program for performing the main welding process on the workpiece Wk based on the input welding line position information and welding operation setting information.
- the welding operation setting information referred to here may be a group of various parameters necessary for the welding operation, such as various welding conditions for the main welding and retracted positions of the welding torch 400 before the start of welding and after the end of welding.
- the scan motion creation unit 552 creates, based on the motion trajectory of the welding operation, the weld line position information, the 3D model, one or more scan effective areas arranged on the 3D model, the scan operation setting information, and the like, a scanning operation teaching program for executing a visual inspection process on a weld bead or other visual inspection portion formed on the workpiece Wk.
- the scan operation setting information referred to here may be any parameter group necessary for the scanning operation of the object to be inspected, such as the distance between the sensor 4 and the workpiece Wk and the information of the sensor 4 (for example, the scan effective range AR0 (see FIG. 4) and the scan effective area AR1 (see FIG. 5)).
- approach information for example, approach start position and approach end position information, instruction information for instructing approach, etc.
- scan run-up section, scan section
- retract information for example, retract start position and retract end position information , instruction information for instructing retraction, etc.
- avoidance information for example, information on the avoidance start position and avoidance end position, position information of the original work that is an obstacle to be avoided, jigs, etc.
- based on the operator's operation obtained via the input device UI3 and a teaching program for a welding operation or scanning operation that has already been created for the same or another workpiece, the offline teaching device 5 creates a teaching program for a new scanning operation.
- FIG. 3 is a diagram showing an example of the 3D model MD1. Note that the work Wk indicated by the 3D model MD1 in FIG. 3 is an example and is not limited to this.
- based on the operator's operation, the offline teaching device 5 requests from the robot control device 2, the inspection control device 3, or the input device UI3 the welding operation and scanning operation information and the 3D model data of the workpiece Wk for which a teaching program for a new scanning operation is to be created. Specifically, the offline teaching device 5 first acquires the operation trajectory of the welding operation of the workpiece Wk for which the scanning operation teaching program is to be created (that is, the operation trajectory of the welding torch 400 during the main welding), the data of the 3D model of the workpiece Wk that is the object of the bead visual inspection (that is, data of the three-dimensional shape of the workpiece Wk), and the information of the scan effective range AR0 of the sensor 4 (for example, distance information between the sensor 4 and the scan effective range AR0, and three-dimensional information such as range information of the scan effective range AR0).
- the offline teaching device 5 superimposes the operation trajectory RT1 of the welding operation on the acquired data of the 3D model MD1 of the work Wk.
- the offline teaching device 5 generates an image in which the operation trajectory RT1 of the welding operation is superimposed on the 3D model MD1 of the workpiece Wk (that is, FIG. 3) and transmits it to the monitor MN3 for display.
- based on the 3D model MD1 on which the operation trajectory RT1 of the welding operation is superimposed, the offline teaching device 5 can present to the operator, in a visually recognizable manner, the idle running sections RT11 and RT12 and the scanning sections WL11 and WL12 during the bead visual inspection of the workpiece Wk (that is, the welded sections indicated by the weld lines WLM11 and WLM12, respectively).
- the offline teaching device 5 may omit the acquisition of the position information of the welding line when creating the scanning operation teaching program.
- the offline teaching device 5 only needs to be able to acquire at least the data of the 3D model MD1 of the workpiece Wk and the motion trajectory RT1 of the welding motion.
- the offline teaching device 5 can also acquire various motion information related to the welding motion (for example, approach motion, retract motion, avoidance motion, etc.) associated with the motion trajectory RT1 of the welding motion. It should be noted that if the avoidance motion is unnecessary, the information regarding the avoidance motion may be omitted.
- the offline teaching device 5 executes processing for generating a scan effective area of the sensor 4, which will be described later, based on the operator's operation. Further, the offline teaching device 5 may execute processing for generating a new scan effective area based on the operator's operation.
- the scan effective area generation processing will be described with reference to FIGS. 4 and 5, respectively.
- FIG. 4 is a diagram illustrating an example of the scanning effective range AR0 of the sensor 4.
- FIG. 5 is a diagram illustrating an example of the scan effective area AR1 of the sensor 4. It goes without saying that the scan effective range AR0 shown in FIG. 4 and the scan effective area AR1 shown in FIG. 5 are examples, and are not limited to these.
- a scan effective range AR0 shown in FIG. 4 is a range in which the sensor 4 can scan the three-dimensional shape of an object (for example, a weld bead to be inspected for bead appearance) on the YZ plane.
- the sensor 4 is moved in the traveling direction by driving the manipulator 200 of the welding robot MC1, and scans to obtain the three-dimensional shape of an object located within the scan effective area AR1 shown in FIG. 5.
- the offline teaching device 5 receives an operator's operation on the scan effective range AR0 of the sensor 4 and generates the scan effective area AR1. Specifically, the offline teaching device 5 accepts an operator's operation to move the scan effective range AR0 on the YZ plane in any one direction in which the sensor 4 of the welding robot MC1 can scan. The offline teaching device 5 generates the scan effective area AR1 based on the direction of the movement operation performed by the operator and the section of this movement operation (that is, the distance between the movement start position and the movement end position).
- for example, the offline teaching device 5 generates the scan effective area AR1 extending from the scan effective range AR0 over the distance of the scan section SR1 to the scan effective range AR01 located at its end.
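- the sweep described above can be sketched as extruding the 2D scan range along the movement direction; the axis-aligned box geometry below is a simplifying assumption for illustration, not the sensor's true scanning frustum:

```python
def generate_scan_effective_area(ar0_width, ar0_height, move_dir, distance):
    """Extrude the 2D scan effective range AR0 (lying on the YZ plane,
    width along Y, height along Z) along a unit movement direction over
    `distance`, returning the (dx, dy, dz) extents of the resulting
    box-shaped scan effective area AR1. Simplified model only."""
    ux, uy, uz = move_dir
    assert abs((ux * ux + uy * uy + uz * uz) - 1.0) < 1e-9, "move_dir must be a unit vector"
    return (abs(ux) * distance,
            ar0_width + abs(uy) * distance,
            ar0_height + abs(uz) * distance)

# Sweep a 60 mm x 40 mm scan range 100 mm along the X (travel) direction.
ar1 = generate_scan_effective_area(60.0, 40.0, (1.0, 0.0, 0.0), 100.0)
```

with a travel of 100 mm along X, the area spans 100 mm in the travel direction while keeping the original 60 mm x 40 mm cross-section.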
- when one or more pieces of scan effective area information (for example, the position, three-dimensional shape, size, angle, etc. of each scan effective area) are linked to the acquired 3D model MD1 or to the scan teaching program, the offline teaching device 5 may accept an editing operation such as copying (duplicating), deleting, or dividing any one of the one or more scan effective areas.
- the offline teaching device 5 may also accept the operator's operation via the input device UI3 and generate one scan effective area having an arbitrary size and angle.
- the offline teaching device 5 accepts an operation such as copying (replicating) or deleting one generated scan effective area.
- thus, even when one or more pieces of scan effective area information (for example, the position, three-dimensional shape, size, angle, etc. of each scan effective area) are not linked to the 3D model or the teaching program, the offline teaching device 5 can create a teaching program for a new scanning operation by generating a scan effective area based on the operator's operation.
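- the editing operations on scan effective areas (copying and deleting) can be sketched as below; the dict-based area representation with `position`, `size`, and `angle` fields is a hypothetical model for illustration:

```python
import copy

# Minimal sketch of scan effective area editing: copy (duplicate) an
# area to a new position, or delete an area. The area representation
# (a dict of position/size/angle) is an illustrative assumption.
def copy_area(areas, name, new_name, new_position):
    """Duplicate area `name` as `new_name`, placed at `new_position`."""
    duplicate = copy.deepcopy(areas[name])
    duplicate["position"] = new_position
    areas[new_name] = duplicate
    return areas

def delete_area(areas, name):
    """Remove area `name` if present."""
    areas.pop(name, None)
    return areas

areas = {"AR11": {"position": (0, 0, 0), "size": (100, 60, 40), "angle": 0.0}}
copy_area(areas, "AR11", "AR13", (250, 0, 0))   # e.g. place copy on 3D model MD2
delete_area(areas, "AR11")
```

`copy.deepcopy` is used so that editing the duplicate never mutates the original area's shape data.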
- the offline teaching device 5 creates a teaching program for a new scanning operation based on the scan effective area generated or edited based on the operator's operation, the motion trajectory of the welding operation, and the 3D model MD1 of the workpiece Wk.
- FIG. 6 is a diagram illustrating an example of copy processing of the scan effective areas AR11 and AR12 according to the first embodiment.
- the scan effective area AR11 shown in FIG. 6 is a scan effective area corresponding to the scan section WL11 of the welding line WLM11 shown in FIG.
- a scan effective area AR11 indicates a scannable area of the sensor 4 when scanning a weld bead formed based on the weld line WLM11.
- the scan effective area AR12 shown in FIG. 6 is the scan effective area corresponding to the scan section WL12 of the weld line WLM12 shown in FIG.
- a scan effective area AR12 indicates a scannable area of the sensor 4 when scanning a weld bead formed based on the weld line WLM12.
- each of the two scan effective areas AR11 and AR12 is an example and need not be limited to this.
- Copy processing of the scan effective area is useful, for example, for teaching the scanning operation of a workpiece that includes two or more workpieces having the same shape.
- the copy processing of the scan effective area is also useful for teaching scanning operations used to execute appearance inspections of locations other than welding points (for example, inspections for determining the presence or absence of parts, such as screws, already attached to the workpiece).
- first, the offline teaching device 5 acquires 3D model data of the workpiece (for example, in the example shown in FIG. 6, one piece of 3D model data in which the relative positions of the 3D model MD1 and the 3D model MD2 are defined).
- alternatively, the offline teaching device 5 may copy (duplicate) the 3D model MD1, accept a definition operation for defining the relative positions of the 3D model MD1 and the 3D model MD2, and thereby acquire the 3D model data.
- based on the operator's operation acquired via the input device UI3, the offline teaching device 5 copies (duplicates) one or more scan effective areas specified by the operator (here, the two scan effective areas AR11 and AR12), and arranges each of the copied (duplicated) scan effective areas AR13 and AR14 at the designated positions.
- the offline teaching device 5 acquires a specified position based on the operator's operation via the input device UI3, it identifies the position of the welding robot MC1 corresponding to this specified position from the motion trajectory of the welding robot MC1.
- the offline teaching device 5 calculates the position and orientation of the sensor 4 included in the welding robot MC1 at the specified position of the welding robot MC1, and, based on the calculated position and orientation of the sensor 4, calculates the position and angle (orientation) of the scan effective area of the sensor 4 at the specified position.
- the offline teaching device 5 generates an image in which the scan effective area is superimposed on the 3D model MD1 based on the calculated position and angle (orientation) of the scan effective area.
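- the pose calculation above can be illustrated with a planar sketch; the fixed sensor-to-area offset and the yaw-only rotation are simplifying assumptions, not the device's actual kinematics:

```python
import math

# Hedged sketch: derive a scan effective area's world position and angle
# from the sensor pose at a specified point on the robot's motion
# trajectory. Assumes a fixed offset from sensor to area in the sensor
# frame, and planar (yaw-only) rotation for simplicity.
def scan_area_pose(sensor_pos, sensor_yaw_deg, area_offset):
    """Rotate the sensor-frame offset by the sensor yaw, translate it to
    world coordinates, and let the area inherit the sensor's yaw."""
    yaw = math.radians(sensor_yaw_deg)
    ox, oy = area_offset
    world_x = sensor_pos[0] + ox * math.cos(yaw) - oy * math.sin(yaw)
    world_y = sensor_pos[1] + ox * math.sin(yaw) + oy * math.cos(yaw)
    return (world_x, world_y), sensor_yaw_deg

# Sensor at (100, 50) facing +Y (yaw 90 deg), area 30 mm ahead of it.
pose, angle = scan_area_pose((100.0, 50.0), 90.0, (30.0, 0.0))
```

a full implementation would use 3D rotation matrices or quaternions derived from the manipulator's forward kinematics; the 2D form keeps the idea visible.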
- the offline teaching device 5 transmits the generated image to the monitor MN3 for display.
- thus, the offline teaching device 5 can visualize for the operator each scan effective area based on the motion trajectory of the welding robot MC1 (for example, each of the four scan effective areas AR11 to AR14 superimposed on the 3D models MD1 and MD2 after the copy processing shown in FIG. 6).
- the offline teaching device 5 can superimpose on the 3D model MD1 a scan effective area corresponding to the position and orientation of the sensor 4 at a specified position, based on the position of the welding robot MC1 associated with the motion trajectory of the welding robot MC1. Therefore, the offline teaching device 5 in Embodiment 1 can reduce the difference between the scan effective area of the sensor 4 when the actual welding robot MC1 is operated and the virtual scan effective area constructed by the offline teaching device 5. As a result, the offline teaching device 5 can present to the operator the scan effective area that can be scanned by the sensor 4 during operation, thereby efficiently supporting the operator's teaching work.
- by presenting the operator with the scan effective area that can be scanned by the sensor 4 during operation, the offline teaching device 5 can further improve the scanning accuracy of the teaching location (that is, the scan effective area) during operation, and can more efficiently reduce the load required for teaching work such as correction of the teaching location (scan effective area).
- FIG. 7 is a diagram illustrating example 1 of deletion processing of the scan effective area AR12 according to the first embodiment.
- FIG. 8 is a diagram illustrating example 2 of the deletion processing of the scan effective area AR17 according to the first embodiment. It goes without saying that the scan effective area deletion processing examples shown in FIGS. 7 and 8 are examples, and the present invention is not limited to these.
- the 3D model MD1 of the workpiece Wk shown in FIG. 7 is the same as the 3D model MD1 shown in FIG. 6, so the description is omitted.
- the deletion processing of the scan effective area is useful for teaching the scanning operation in cases where, due to the shape of the workpiece Wk, the welding torch 400 can weld at the time of main welding but the sensor 4 cannot approach a position from which it can scan during the visual inspection, where interference occurs with an obstacle (such as the workpiece Wk or a jig of the workpiece Wk), or where the scan effective area of the sensor 4 does not reach the target portion of the visual inspection.
- when the offline teaching device 5 can acquire 3D model data of production equipment or of an obstacle as information about the production equipment of the workpiece Wk, it may generate an image including the virtual 3D model of the production equipment or obstacle, the virtual 3D model MD1 of the workpiece Wk, and one or more scan effective areas, and transmit it to the monitor MN3 for display. Accordingly, the operator can visually confirm whether or not each of the scan effective areas arranged on the 3D model MD1 of the workpiece Wk interferes with the production equipment or obstacles, and can easily find the scan effective area to be deleted.
- the offline teaching device 5 deletes the scan effective area AR12 specified by the operator's operation obtained via the input device UI3.
- each of the plurality of visual inspection locations referred to here is not limited to only the weld bead, and may be, for example, the presence or absence of a component included in the workpiece.
- the 3D model MD21 of the workpiece shown in FIG. 8 is 3D model data of a workpiece produced by welding three original workpieces Wk1, Wk2, and Wk3 along two welding lines WLM21 and WLM22, shown in a top view (viewed from the Z direction).
- illustration of the motion trajectory of the welding motion is omitted.
- the original workpiece Wk1 and the original workpiece Wk2 are welded together along the welding line WLM21.
- a bead visual inspection of the weld bead formed corresponding to the weld line WLM21 is performed by scanning the scan effective area AR16 with the sensor 4.
- further, the original workpiece Wk2 and the original workpiece Wk3 are welded together along the welding line WLM22.
- a bead visual inspection of the weld bead formed corresponding to the weld line WLM22 is performed by the sensor 4 scanning the scan effective area AR17.
- the scan effective area AR16 is an area including the two welding lines WLM21 and WLM22, and partially overlaps the scan effective area AR17. In such a case, the offline teaching device 5 deletes the scan effective area AR17 when receiving a control command to delete the scan effective area AR17 transmitted from the input device UI3.
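- the situation above, where AR16 already covers everything AR17 scans, can be sketched as a redundancy check; modeling each area's coverage as a set of weld line names is an illustrative assumption:

```python
# Sketch of the redundancy check motivating the deletion of AR17: a scan
# effective area is unnecessary if every weld line it covers is already
# covered by the other areas. Coverage is modeled as a set of weld line
# names per area (an assumption for illustration).
def find_redundant_areas(coverage):
    """coverage: dict mapping area name -> set of weld lines it scans.
    Returns the names of areas whose coverage is fully provided by others."""
    redundant = []
    for name, lines in coverage.items():
        covered_by_others = set()
        for other, other_lines in coverage.items():
            if other != name:
                covered_by_others |= other_lines
        if lines and lines <= covered_by_others:
            redundant.append(name)
    return redundant

# AR16 spans both weld lines; AR17 only repeats WLM22, so it is redundant.
coverage = {"AR16": {"WLM21", "WLM22"}, "AR17": {"WLM22"}}
unneeded = find_redundant_areas(coverage)
```

a deletion candidate list like `unneeded` could then be presented to the operator rather than deleted automatically.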
- based on the virtual 3D model MD21 of the workpiece created by the offline teaching device 5 and displayed on the monitor MN3 and on each of the scan effective areas superimposed on the workpiece, the operator can find and delete an unnecessary teaching location (for example, the scan effective area AR17 shown in FIG. 8).
- the offline teaching device 5 can support the teaching work performed by the operator so that it can be executed more efficiently, and can create a more efficient teaching program for the scanning operation based on the operator's operation.
- FIG. 9 is a diagram for explaining an example of dividing the scan effective area AR15 according to the first embodiment. It goes without saying that the scan effective area division processing example shown in FIG. 9 is merely an example, and the present invention is not limited to this.
- the 3D model MD1 of the workpiece Wk shown in FIG. 9 shows an example in which one scan effective area AR15 including the respective positions of the two welding lines WLM11 and WLM12 shown in FIG. 3 is arranged.
- one scan effective area AR15 is arranged at a position including the obstacle OB1 (original workpiece).
- the division processing of the scan effective area is useful for teaching the scanning operation in cases where, due to the shape of the workpiece Wk, the welding torch 400 can weld at the time of main welding but the sensor 4 cannot approach a position from which it can scan during the visual inspection, or where interference occurs with an obstacle (the workpiece Wk or a jig of the workpiece Wk).
- the offline teaching device 5 divides one scan effective area AR15 into two scan effective areas AR151 and AR152 based on a control command instructing division of the scan effective area AR15 and a control command designating the division position, both transmitted from the input device UI3. Note that here, as in the scan effective area change processing described later in Embodiment 2, the offline teaching device 5 may accept designation by the operator of the scan section of each of the two scan effective areas AR151 and AR152 after the division processing.
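- the division processing can be sketched as splitting a scan section at the designated position; modeling the scan effective area as a 1D interval along the travel axis is a simplifying assumption for illustration:

```python
# Minimal sketch of the division processing: split one scan effective
# area (modeled as a 1D scan section along the travel axis) into two at
# a designated split position, e.g. AR15 -> AR151 and AR152.
def divide_scan_area(start, end, split):
    """Split the scan section [start, end] at `split`, returning the two
    resulting sections. The split must lie strictly inside the section."""
    if not (start < split < end):
        raise ValueError("split position must lie strictly inside the section")
    return (start, split), (split, end)

# Divide a 200 mm scan section at the 80 mm mark (e.g. to skip obstacle OB1).
ar151, ar152 = divide_scan_area(0.0, 200.0, 80.0)
```

after the split, the operator could further trim each section's endpoints, matching the per-section designation mentioned above.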
- thus, the operator can divide a teaching location (for example, the scan effective area AR15 shown in FIG. 9) based on the virtual 3D model MD1 of the workpiece created by the offline teaching device 5 and displayed on the monitor MN3 and on each of the scan effective areas superimposed on the workpiece.
- the offline teaching device 5 can support the teaching work performed by the operator so that it can be executed more efficiently, and can create a more efficient teaching program for the scanning operation based on the operator's operation.
- FIG. 10 is a diagram illustrating an example of various operations associated with the scan effective areas AR11 and AR12 according to the first embodiment.
- FIG. 11 is a flow chart showing an operation procedure example of the offline teaching device 5 according to the first embodiment. It goes without saying that the various operations shown in FIG. 10 are examples and the present invention is not limited to these.
- the 3D model MD1 of the workpiece shown in FIG. 10 shows an example in which the two scan effective areas AR11 and AR12 shown in FIG. 6 are arranged.
- illustration of the two scan effective areas AR11 and AR12 is omitted in order to facilitate understanding of various operations.
- based on the data of the 3D model MD1 and the welding operation teaching program, the offline teaching device 5 acquires the operation trajectory RT1 of the welding operation and various pieces of operation information (specifically, approach information, retraction information, avoidance information, etc.), and creates a teaching program for a new scanning operation from these and the two scan effective areas AR11 and AR12 arranged on the 3D model MD1.
- the offline teaching device 5 determines whether or not, among the one or more scan effective areas, there is a scan effective area for which a scan operation for causing the sensor 4 to scan it has not yet been created (St10).
- when the offline teaching device 5 determines in the process of step St10 that there is a scan effective area for which no scan operation has been created (St10, YES), it determines, based on the welding operation teaching program, whether or not there is approach information necessary for the sensor 4 to scan this scan effective area (St11).
- on the other hand, when the offline teaching device 5 determines in the process of step St10 that there is no scan effective area for which a scan operation has not yet been created (St10, NO), it creates a new scanning operation teaching program corresponding to the 3D model MD1 by linking the scanning operation teaching programs corresponding to all the scan effective areas arranged on the 3D model MD1.
- the offline teaching device 5 determines that there is approach information in the process of step St11 (St11, YES), it creates an approach motion corresponding to the scan effective area (St13).
- for example, as shown in FIG. 10, when approach information for the approach start position PT1 and the approach end position PT2 is linked to the 3D model MD1 or the motion trajectory RT1, the offline teaching device 5 creates an approach operation in which the sensor 4 approaches the workpiece Wk in the section APR11 from the approach start position PT1 to the approach end position PT2.
- note that the approach information linked to the scan effective area AR11 may include at least the approach end position PT2.
- when the offline teaching device 5 determines in the process of step St11 that there is no approach information (St11, NO), or after the process of step St13, it creates a scan operation corresponding to the scan effective area (St14).
- for example, as shown in FIG. 10, the offline teaching device 5 creates a scanning operation that causes the sensor 4 to scan the section from the scan start position PT3 to the scan end position PT4 (here, the section corresponding to the weld line WLM11).
- similarly, when creating the teaching program for the scanning operation of the scan effective area AR12, the offline teaching device 5 creates a scanning operation that causes the sensor 4 to scan the section from the start position PT7 to the end position PT8 (here, the section corresponding to the weld line WLM12).
- note that in the example shown in FIG. 10, the start and end positions of the weld line WLM11 are located at substantially the same positions as the scan start position PT3 and the scan end position PT4 of the scan effective area AR11, respectively, and the start and end positions of the weld line WLM12 are located at substantially the same positions as the start position PT7 and the end position PT8 of the scan effective area AR12; however, the present invention is not limited to this.
- the start position of the weld line and the start position of the scan effective area may be different.
- the end position of the weld line and the end position of the scan effective area may be different.
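- the "substantially the same position" relationship above can be illustrated with a simple tolerance check; the tolerance value and the point representation are assumptions for illustration only:

```python
import math

# Hedged sketch of checking whether a weld line's endpoints coincide
# (within a tolerance) with a scan effective area's scan start/end
# positions. The 1.0 mm tolerance is an illustrative assumption.
def endpoints_match(weld_start, weld_end, scan_start, scan_end, tol_mm=1.0):
    """True if both endpoint pairs lie within `tol_mm` of each other."""
    return (math.dist(weld_start, scan_start) <= tol_mm
            and math.dist(weld_end, scan_end) <= tol_mm)

# Weld line WLM11-like case: scan endpoints nearly coincide with the line.
same = endpoints_match((0, 0, 0), (100, 0, 0), (0.2, 0, 0), (99.9, 0, 0))
# Differing end position (scan section shorter than the weld line).
different = endpoints_match((0, 0, 0), (100, 0, 0), (0, 0, 0), (80, 0, 0))
```

when the check fails, the scan section simply starts or ends away from the weld line, which the text above explicitly permits.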
- the offline teaching device 5 determines whether or not there is retraction information for the sensor 4 to leave the workpiece based on the welding operation teaching program (St15).
- the offline teaching device 5 determines that there is retraction information in the process of step St15 (St15, YES), it creates a retraction motion corresponding to the scan effective area (St16).
- for example, as shown in FIG. 10, when retract information for the retract start position PT9 and the retract end position PT10 is linked to the 3D model MD1 or the motion trajectory RT1, the offline teaching device 5 creates a retract operation that separates the sensor 4 from the workpiece Wk in the section RTR11 from the retract start position PT9 to the retract end position PT10.
- note that the retraction information associated with the scan effective area AR12 may include at least the retraction start position PT9.
- when the offline teaching device 5 determines that there is no retraction information in the process of step St15 (St15, NO), or after the process of step St16, it determines, based on the welding operation teaching program, whether or not there is avoidance information for causing the sensor 4 to avoid an obstacle (St17).
- when the offline teaching device 5 determines in the process of step St17 that there is avoidance information (St17, YES), it creates an avoidance action corresponding to the scan effective area (St18).
- when the avoidance start position PT5 and the avoidance end position PT6 are associated with the 3D model MD1 or the motion trajectory RT1 as shown in the figure, the offline teaching device 5 creates an avoidance motion for causing the sensor 4 to avoid the obstacle OB1 in the section ESC11 from the avoidance start position PT5 to the avoidance end position PT6.
- when the offline teaching device 5 determines that there is no avoidance information in the process of step St17 (St17, NO), or after the process of step St18, it repeats the scan operation creation process (that is, the process of step St10) for the next (that is, another) scan effective area.
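- The decision flow of steps St15 to St18 above can be sketched as follows. This is a minimal illustrative sketch (the `ScanArea` class and function names are hypothetical, not the patent's implementation): a scan motion is always created for the area, and a retract or avoidance motion is appended only when the corresponding information is linked to the welding operation teaching program.

```python
from dataclasses import dataclass, field

@dataclass
class ScanArea:
    """Hypothetical stand-in for one scan effective area (e.g. AR11)."""
    name: str
    motions: list = field(default_factory=list)

def build_scan_motions(area, retract_info=None, avoidance_info=None):
    """Sketch of steps St10-St18: always create the scan motion, then append a
    retract motion (St15 YES -> St16) and an avoidance motion (St17 YES -> St18)
    only when the corresponding information exists."""
    area.motions.append(("scan", area.name))
    if retract_info is not None:      # retraction information present (St15, YES)
        area.motions.append(("retract", retract_info))
    if avoidance_info is not None:    # avoidance information present (St17, YES)
        area.motions.append(("avoid", avoidance_info))
    return area.motions
```

The same function would then be called once per scan effective area, mirroring the loop back to step St10 for each remaining area.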
- as described above, the offline teaching device 5 creates scanning motions corresponding to each of one or more teaching locations (scan effective areas) generated based on the operator's operation via the input device UI3, and by linking the scanning motions corresponding to all of these teaching locations, a teaching program for a new scanning motion can be automatically created.
- since the offline teaching device 5 can create a teaching program for a new scanning motion based on the motion trajectory of the welding motion, the scanning accuracy of the teaching location (that is, the scan effective area) during operation can be improved, and the load required for teaching work such as correction of a teaching location (scan effective area) can be reduced more efficiently.
- as described above, the offline teaching device 5 according to Embodiment 1 includes: the input/output unit 53 (an example of the input unit) capable of receiving operator operations; the input/output unit 53 or the communication unit 50 (an example of an acquisition unit) that acquires the data of the 3D model MD1 of the workpiece Wk produced by welding (an example of three-dimensional shape data), the welding operation trajectory, and the scan effective range AR0 (an example of a scan range) of the sensor 4 that scans the external shape of the workpiece Wk; a generation unit that generates the scan effective area (for example, the scan effective area AR11 shown in FIG. 6) to be scanned by the sensor 4 based on the acquired scan effective range AR0 and the scan section (for example, the scan section SR1 shown in FIG. 5); and the scan motion creation unit 552 (an example of a control unit) that creates and outputs a teaching program for causing the welding robot MC1 that performs welding to scan the scan effective area.
- as a result, the offline teaching device 5 can arrange, on the 3D model MD1, the scan effective area corresponding to the position and orientation of the sensor 4 at a specified position, based on the position of the welding robot MC1 associated with the motion trajectory RT1 of the welding robot MC1, and can create a scan motion for each of the arranged scan effective areas. Therefore, the off-line teaching device 5 can more efficiently create a teaching program for the scanning motion to be executed by the welding robot MC1 using the created scan motions, and can further improve the positional accuracy between the location scanned during operation based on the teaching program and the teaching location (that is, the scan effective area). Therefore, the offline teaching device 5 can more efficiently create a scanning operation teaching program that reduces the load required for teaching work such as correction of a teaching location (scan effective area).
- the scan motion creation unit 552 of the offline teaching device 5 creates the teaching program based on the arranged scan effective areas (for example, the scan effective areas AR11 and AR12 shown in FIG. 10), the welding motion trajectory RT1, and the operation information (for example, approach information, retraction information, avoidance information, etc.) of the welding robot MC1 that performs welding linked to the data of the 3D model MD1.
- the offline teaching device 5 in the first embodiment can create scan motions for the arranged scan effective areas based on the welding motion trajectory RT1 and the motion information of the welding robot MC1. Therefore, the off-line teaching device 5 can more efficiently create a teaching program for the scanning motion to be executed by the welding robot MC1 using the created scanning motion.
- in addition, the scan motion creation unit 552 of the offline teaching device 5 in the first embodiment creates, based on the motion information, various motions (for example, an approach motion, a retract motion, an avoidance motion, etc.) of the welding robot MC1 with respect to the workpiece Wk, and a scan motion for each scan effective area to be executed by the welding robot MC1.
- the scan motion creation unit 552 creates a teaching program by associating scan motions corresponding to each of the created scan effective areas with various motions.
- as a result, the offline teaching device 5 according to the first embodiment can create a teaching program for the scanning motions for the workpiece Wk based on the created various motions of the welding robot MC1 and the scan motions created for each scan effective area.
- in addition, the scan motion creation unit 552 of the offline teaching device 5 acquires the welding lines (for example, the welding lines WLM11 and WLM12 shown in FIG. 10) linked to the data of the 3D model MD1, and creates and outputs a teaching program in which the welding line included in the scan effective area is used as the scan location (that is, the teaching location) of the sensor 4.
- the offline teaching device 5 in the first embodiment can create a scanning operation teaching program capable of performing bead visual inspection of the weld bead formed on the produced workpiece Wk.
- the 3D calculation unit 54 of the offline teaching device 5 in Embodiment 1 duplicates (copies) and arranges the scan effective area based on the operator's operation.
- the scan motion creation unit 552 creates a teaching program for scanning the scan effective areas based on all the scan effective areas (for example, the scan effective areas AR11 to AR14 shown in FIG. 6), including the duplicated scan effective areas (for example, the scan effective areas AR13 and AR14 shown in FIG. 6), and the welding operation trajectory RT1.
- as a result, the offline teaching device 5 according to the first embodiment can duplicate each of the scan effective areas indicating the teaching locations, thereby virtually generating a scan effective area, superimposing it on the 3D model MD1, and displaying it.
- the offline teaching device 5 can more efficiently create a scanning operation teaching program that further improves the positional accuracy between the scanning location scanned during operation and the teaching location (that is, the scan effective area).
- in addition, the 3D calculation unit 54 of the offline teaching device 5 deletes, based on the operator's operation, at least one of the two or more generated scan effective areas (see, for example, FIG. 7).
- the scan motion creation unit 552 creates a teaching program for scanning the scan effective areas based on the welding motion trajectory RT1 and at least one scan effective area among all the scan effective areas remaining after excluding the deleted scan effective area (for example, the scan effective area AR11 shown in FIG. 7).
- the off-line teaching device 5 according to the first embodiment can create a teaching program for scanning motions that does not include unnecessary scanning motions by making it possible to delete each of the scan effective areas indicating teaching locations.
- the 3D calculation unit 54 of the offline teaching device 5 in Embodiment 1 divides the scan effective area (for example, the scan effective area AR13 shown in FIG. 9) based on the operator's operation.
- the scan motion creation unit 552 arranges the plurality of divided scan effective areas (for example, the scan effective areas AR131 and AR132 shown in FIG. 9), and creates a teaching program for scanning the scan effective areas based on at least one of the arranged scan effective areas and the welding operation trajectory RT1.
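- The division described above can be illustrated with a minimal sketch (the function name and the equal-length split are assumptions for illustration, not the patent's actual division rule): one scan section is split into equal sub-sections along the travel direction, each of which becomes its own scan effective area.

```python
def split_scan_section(start, end, n_parts):
    """Sketch of dividing one scan effective area (e.g. AR13) into several
    areas (e.g. AR131, AR132) by splitting its scan section into n_parts
    equal sub-sections along the travel direction."""
    step = (end - start) / n_parts
    return [(start + i * step, start + (i + 1) * step) for i in range(n_parts)]
```

For example, splitting a 100 mm section into two parts yields two 50 mm sub-sections that can then be arranged and taught independently.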
- as a result, the offline teaching device 5 according to the first embodiment can divide each of the scan effective areas indicating the teaching locations, and can therefore more efficiently create a teaching program for scanning operations in which the sensor 4 does not interfere with obstacles (production equipment, the original works constituting the workpiece Wk, etc.).
- in addition, the offline teaching device 5 in Embodiment 1 includes one or more computers communicably connected to the input device UI3 operated by the operator.
- the worker inputs into the computer the data of the 3D model MD1 of the work Wk produced by welding (an example of three-dimensional shape data) and the scan section for scanning the external shape of the work Wk (for example, the scan section SR1 shown in FIG. 5).
- the offline teaching device 5 creates a teaching program for causing the welding robot MC1 that performs welding to scan a three-dimensional area on the data of the 3D model MD1 based on the scan locations corresponding to the scan section (for example, the welding lines WLM11 and WLM12 shown in FIG. 10).
- the offline teaching device 5 can automatically create a teaching program for the scanning operation by acquiring the data of the 3D model MD1 of the work Wk and the scanning section for scanning the external shape of the work Wk. .
- Embodiment 2: the offline teaching device 5 according to Embodiment 1 shows an example in which it accepts an operator's operation via the input device UI3, executes editing such as duplication (copying), deletion, or division of scan effective areas having the same scan section based on the accepted operation, and creates a teaching program for a new scan operation for causing the sensor 4 to scan each of the one or more edited scan effective areas.
- in contrast, the offline teaching device 5 according to the second embodiment will be described with an example in which it accepts an operator's operation via the input device UI3, executes editing of the scan section, rotation angle, and position (arrangement) of each scan effective area based on the accepted operation, and creates a new scan operation teaching program for causing the sensor 4 to scan each of the one or more edited scan effective areas.
- the welding system 100 according to the second embodiment has substantially the same internal configuration as the welding system 100 according to the first embodiment, and the offline teaching device 5 according to the second embodiment has substantially the same internal configuration as the offline teaching device 5 according to the first embodiment.
- the same reference numerals are used for the same components as in the first embodiment, and the description thereof is omitted.
- the offline teaching device 5 accepts an operator's operation via the input device UI3 and generates each of one or more scan effective areas based on the scan section, rotation angle, and position (arrangement) specified by the accepted operation. That is, the scan effective areas in Embodiment 2 may each be generated with different scan sections, rotation angles, and positions (arrangements).
- note that when one or more pieces of scan effective area information (for example, the scan section, rotation angle, position (arrangement), etc. of the scan effective area) are linked to the acquired 3D model MD1 or the scan teaching program, the offline teaching device 5 may accept an operator's editing operation on the scan section, rotation angle, and position (arrangement) of any of the linked scan effective areas.
- FIG. 12 is a diagram illustrating an example of the movement processing and the rotation processing of the scan effective area AR2 according to the second embodiment. FIG. 13 is a diagram for explaining scan effective area change processing example 1, change processing example 2, and change processing example 3 according to the second embodiment. Note that the scan effective area AR2 shown in FIGS. 12 and 13 is an example, and needless to say, the present invention is not limited to this.
- a scan effective area AR2 shown in FIG. 12 has a scan section SR2 along the traveling direction (X direction).
- the offline teaching device 5 accepts an operator's operation to move the scan section of the scan effective area AR2 in the X direction, the Y direction, or the Z direction with respect to the scan effective area AR2.
- the offline teaching device 5 accepts the operator's operation via the input device UI3 and changes the position of the scan effective area AR2 based on the accepted operation (specifically, the amount of movement in a given direction).
- when the offline teaching device 5 determines, based on the operation trajectory of the welding operation and the scan effective range (see FIG. 4), that at least part of the scan effective area AR2 after the movement processing based on the operator's operation cannot be scanned, it may generate a notification to the effect that the scan effective area AR2 after the movement processing cannot be scanned, or a screen in which only the area determined to be unscannable in the scan effective area AR2 after the movement processing is highlighted in a color such as red, and transmit it to the monitor MN3 for display. Accordingly, the operator can confirm at a glance whether or not the scan effective area AR2 after the movement processing is an area that can be scanned by the sensor 4.
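- The scannability check described above can be sketched as follows. This is an illustrative assumption, not the patent's actual algorithm: the sensor's scan effective range is treated as an axis-aligned box, and the points of a moved scan effective area that fall outside it are collected as candidates for the red highlighting.

```python
def unscannable_points(area_points, range_min, range_max):
    """Return the subset of area_points (x, y, z) lying outside the scan
    effective range, modeled here as an axis-aligned box [range_min, range_max].
    Points returned by this check would be highlighted (e.g. in red)."""
    outside = []
    for p in area_points:
        if any(c < lo or c > hi for c, lo, hi in zip(p, range_min, range_max)):
            outside.append(p)
    return outside
```

An empty result means the whole moved area remains scannable; a non-empty result triggers the notification or highlighted screen.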
- the offline teaching device 5 generates an image in which each of the plurality of rotation reference points RP is superimposed on the 3D model of the scan effective area AR2, and transmits the image to the monitor MN3 for display. Note that in FIG. 12, the reference numerals for all the rotation reference points RP are omitted. In the example shown in FIG. 12, 16 rotation reference points RP are shown, but the positions and number of rotation reference points for rotating the scan effective area AR2 are not limited to this.
- the off-line teaching device 5 accepts an operator's operation of designating a rotation reference point RP and rotating in the rotation direction RRX about the X axis, the rotation direction RRY about the Y axis, or the rotation direction RRZ about the Z axis.
- the offline teaching device 5 performs a rotation process for rotating the scan effective area AR2 in the rotation direction RRX, the rotation direction RRY, or the rotation direction RRZ with the specified rotation reference point RP as the origin based on the operator's operation.
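- The rotation processing about a designated rotation reference point can be sketched with standard rotation matrices. This is a hedged illustration (function name and point representation are assumptions): each point of the scan effective area is translated so the reference point RP becomes the origin, rotated about the chosen axis, and translated back.

```python
import math

def rotate_about_point(points, origin, axis, angle_rad):
    """Rotate 3-D points of a scan effective area about the designated rotation
    reference point `origin`, around the X, Y, or Z axis (directions RRX, RRY,
    RRZ in the document)."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    rotated = []
    for x, y, z in points:
        dx, dy, dz = x - origin[0], y - origin[1], z - origin[2]
        if axis == "x":      # rotation direction RRX
            dy, dz = dy * c - dz * s, dy * s + dz * c
        elif axis == "y":    # rotation direction RRY
            dz, dx = dz * c - dx * s, dz * s + dx * c
        else:                # rotation direction RRZ
            dx, dy = dx * c - dy * s, dx * s + dy * c
        rotated.append((origin[0] + dx, origin[1] + dy, origin[2] + dz))
    return rotated
```

Choosing a different rotation reference point RP changes the translation step only; the axis rotation itself is identical.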
- when the offline teaching device 5 determines, based on the 3D model MD1 of the workpiece Wk, the operation trajectory of the welding operation, and the scan effective range (see FIG. 4), that at least part of the scan effective area AR2 after the rotation processing based on the operator's operation cannot be scanned, it may generate a notification to the effect that the scan effective area AR2 after the rotation processing cannot be scanned, or a screen in which only the area determined to be unscannable in the scan effective area AR2 after the rotation processing is highlighted in a color such as red, and transmit it to the monitor MN3 for display. Accordingly, the operator can confirm at a glance whether or not the scan effective area AR2 after the rotation processing is an area that can be scanned by the sensor 4.
- a scan effective area AR2 shown in FIG. 13 has a scan section SR2 along the traveling direction (X direction).
- in change processing examples 1 to 3, the offline teaching device 5 accepts an operator's operation to extend the scan section of the scan effective area AR2 in the X direction, the Y direction, or the Z direction.
- the offline teaching device 5 accepts the operator's operation via the input device UI3 and changes the length (scan section) of the scan effective area AR2 based on the accepted operation (specifically, the extension in the X direction).
- the scan effective area AR21 shown in change processing example 1 is a scan effective area generated by executing change processing that extends the scan effective area AR2 by a distance SR211 in the X direction and by a distance SR212 in the -X direction, and has a scan section SR213.
- the scan effective area AR22 shown in change processing example 2 is a scan effective area generated by executing change processing that extends the scan effective area AR2 by a distance SR221 in the X direction, and has a scan section SR222.
- the scan effective area AR23 shown in change processing example 3 is a scan effective area generated by executing change processing that extends the scan effective area AR2 by a distance SR231 in the -X direction, and has a scan section SR232. It goes without saying that each of the scan effective areas AR21, AR22, and AR23 after the change processing shown in FIG. 13 is an example and the present invention is not limited to this.
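- The three change processing examples above differ only in which end of the scan section is extended, which can be captured in one small sketch (the function and parameter names are illustrative assumptions): example 1 extends both ends, example 2 only the +X end, example 3 only the -X end.

```python
def extend_scan_section(start_x, end_x, plus_x=0.0, minus_x=0.0):
    """Sketch of change processing examples 1-3: extend a scan section along the
    travel (X) direction by `plus_x` in the +X direction and `minus_x` in the
    -X direction, yielding the new section (e.g. SR213, SR222, or SR232)."""
    return start_x - minus_x, end_x + plus_x
```

For instance, extending a section [0, 100] by 20 in +X and 10 in -X (as in change processing example 1) yields the section [-10, 120].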
- when the offline teaching device 5 determines, based on the operation trajectory of the welding operation and the scan effective range (see FIG. 4), that at least part of the scan section of the scan effective area AR2 changed based on the operator's operation cannot be scanned, it may generate a notification to the effect that the scan effective area AR2 after the change processing is outside the scan effective range, or a screen in which the area of the changed scan effective area AR2 outside the scan effective range is highlighted in a color such as red, and transmit it to the monitor MN3 for display. Accordingly, the operator can confirm at a glance whether or not the scan effective area AR2 after the change processing is an area that can be scanned by the sensor 4.
- next, with reference to FIGS. 14, 15, and 16, a specific description will be given of the scan effective area editing process and the new scanning operation teaching program creation process of the offline teaching device 5 according to the second embodiment.
- the 3D model MD3 shown in FIGS. 14 to 16 is merely an example, and the present invention is not limited to this.
- the 3D model MD3 shown in each of FIGS. 14 to 16 omits illustration of the operation trajectory RT3 of the welding operation in order to facilitate the explanation of the scan effective area editing process.
- the taught location WLM3 shown in FIG. 14 is shown to facilitate understanding of the teaching locations (scanning locations) taught by the arrangement of each of the plurality of scan effective areas AR31, AR32, AR33, AR34, and AR35, which will be described later, and may be omitted on the screen displayed on the monitor MN3.
- the off-line teaching device 5 acquires, from the robot controller 2, the inspection control device 3, or the input device UI3, based on the operator's operation, the teaching programs for the welding operation and the scanning operation of the workpiece Wk for which a teaching program for a new scanning motion is to be created, and the data of the 3D model MD3.
- the offline teaching device 5 superimposes the operation trajectory RT3 of the welding operation on the acquired data of the 3D model MD3 of the work Wk.
- the operator may be able to select display or non-display of the motion trajectory RT3 of the welding motion superimposed on the data of the 3D model MD3.
- the offline teaching device 5 generates an image in which the motion trajectory RT3 of the acquired welding motion is superimposed on the 3D model MD3 of the work Wk, and transmits the image to the monitor MN3 for display.
- the offline teaching device 5 also superimposes, on the data of the 3D model MD3, the scan effective area AR31 linked to the acquired 3D model MD3 of the workpiece Wk or to the scanning operation teaching program. If there is no scan effective area linked to the 3D model MD3 of the workpiece Wk or to the scanning operation teaching program, the offline teaching device 5 executes a new scan effective area generation process based on the operator's operation.
- the offline teaching device 5 shown in FIG. 15 generates two scan effective areas AR32 and AR33 by copying (replicating) the scan effective area AR31 based on the operator's operation via the input device UI3.
- the offline teaching device 5 moves the two scan effective areas AR32 and AR33 to positions specified by the operator and places them on the 3D model MD3.
- the offline teaching device 5 shown in FIG. 16 generates two scan effective areas AR34 and AR35 by copying (replicating) the scan effective area AR31 based on the operator's operation via the input device UI3, and generates a 3D model. Place on MD3.
- the illustration of the scan effective area AR35 is omitted in order to make the explanation easier to understand.
- the offline teaching device 5 moves each of the two scan effective areas AR34 and AR35 to positions specified by the operator.
- the offline teaching device 5 accepts an operation of designating a rotation reference point (not shown) in each of the two scan effective areas AR34 and AR35, and an operation of specifying a rotation direction and a rotation amount with this rotation reference point as the origin.
- the offline teaching device 5 rotates each of the scan effective areas AR34 and AR35 based on each of the accepted designation operations and arranges them on the 3D model MD3.
- the offline teaching device 5 may further accept a designation operation regarding the shape of the teaching location corresponding to each of the five scan effective areas AR31 to AR35. Specifically, the off-line teaching device 5 may accept, for each scan effective area, a designation operation designating whether the teaching location (that is, the scanning location) is a linear shape, a curved shape, or a shape including a straight line and a curve.
- in the example of the 3D model MD3 shown in the figure, the offline teaching device 5 accepts a designation operation to the effect that the taught location is a line including a straight line and a curve, and that the taught location corresponding to the scan effective area AR33 is a curve.
- the offline teaching device 5 acquires the taught location WLM3 taught by each of the five generated scan effective areas AR31 to AR35, based on the shape information of the teaching location corresponding to each designated scan effective area and on the three-dimensional shape of the 3D model MD3 (specifically, the surface shapes of the one or more original works constituting the workpiece Wk, the intersections or contact points between original works, etc.) or the operation trajectory of the welding operation.
- the offline teaching device 5 may acquire the taught points included in each of the plurality of continuous scan effective areas as one continuous taught point.
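- Treating taught locations spanning several contiguous scan effective areas as one continuous taught location can be sketched as a merge of polyline segments whose endpoints coincide. This is a hedged illustration (the segment representation and tolerance are assumptions):

```python
def merge_contiguous(segments, tol=1e-6):
    """Merge taught-location segments (each a list of 3-D points) whose end and
    start points coincide within `tol`, producing one continuous taught
    location per contiguous run of scan effective areas."""
    merged = []
    for seg in segments:
        if merged and all(abs(a - b) <= tol for a, b in zip(merged[-1][-1], seg[0])):
            # the previous segment ends where this one starts: join them
            merged[-1] = merged[-1] + list(seg[1:])
        else:
            merged.append(list(seg))
    return merged
```

Two adjacent areas whose taught segments share an endpoint thus contribute a single continuous taught location, while disjoint segments remain separate.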
- the offline teaching device 5 in the second embodiment acquires the taught point WLM3 taught by each of the five generated scan effective areas AR31 to AR35 even when there is no weld line position information.
- a teaching program for a new scanning operation for causing the sensor 4 to scan the five scan effective areas AR31 to AR35 is created using the acquired teaching point WLM3.
- a procedure for creating a teaching program for a new scan operation by the offline teaching device 5 is the same as the flow chart showing an example of the operation procedure of the offline teaching device 5 according to the first embodiment shown in FIG.
- FIG. 17 is a diagram illustrating an example of various operations associated with the scan effective areas AR31 to AR35 according to the second embodiment. Note that the specific examples of various operations shown in FIG. 17 are merely examples, and needless to say, the present invention is not limited to these.
- the offline teaching device 5 executes creation and linking processing of various actions of the 3D model MD3 based on each of the five scan effective areas AR31 to AR35, based on the action procedure shown in the flowchart of FIG.
- the offline teaching device 5 creates, based on the information on various motions linked to the 3D model MD3 or the motion trajectory RT3 of the welding motion, an approach motion for bringing the sensor 4 closer to the workpiece Wk in the section APR31 from the approach start position PT11 to the approach end position PT12.
- the offline teaching device 5 creates, based on the information on various motions linked to the 3D model MD3 or the motion trajectory RT3 of the welding motion, a retract motion for moving the sensor 4 away from the work Wk in the section RTR31 from the retraction start position PT15 to the retraction end position PT16.
- the offline teaching device 5 links the motions of all the created scan effective areas. Specifically, the offline teaching device 5 links the approach motion, the retract motion, and the scan motion (that is, the motion for causing the sensor 4 to scan the taught location WLM3). Note that FIG. 17 illustrates an example in which avoidance information is not associated with the 3D model MD3 or the welding operation trajectory RT3.
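- The linking of motions into one teaching program can be sketched minimally (the function name and the flat list representation are assumptions for illustration): the approach motion comes first, followed by the scan motions for each taught location in order, and finally the retract motion; no avoidance motion appears, matching the example of FIG. 17.

```python
def link_motions(approach, scan_motions, retract):
    """Link the created motions into one scan teaching program: approach,
    then each scan motion in order, then retract (no avoidance motion in
    this example)."""
    return [approach, *scan_motions, retract]
```

The resulting ordered list corresponds to the sequence the welding robot MC1 would execute when running the new scanning operation teaching program.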
- the offline teaching device 5 creates a scanning operation corresponding to each of one or more teaching locations (scanning effective regions) generated based on the operator's operation via the input device UI3. , by linking the scan operations corresponding to all of these teaching locations (scanning effective areas), a teaching program for new scan operations can be automatically created.
- since the offline teaching device 5 can create a teaching program for a new scanning motion based on the motion trajectory of the welding motion, the scanning accuracy of the teaching location (that is, the scan effective area) during operation can be improved, and the load required for teaching work such as correction of a teaching location (scan effective area) can be reduced more efficiently.
- as described above, even when there is no weld line data (weld line position information), the offline teaching device 5 can create a new scanning operation teaching program based on each of the generated one or more scan effective areas.
- note that when welding lines, intersections, contact points, teaching locations (scanning locations), or the like are linked in advance to the data of the 3D model MD3, the offline teaching device 5 acquires the position information of the linked welding lines, intersections, contact points, or teaching locations (scanning locations).
- in such a case, the offline teaching device 5 generates one or more scan effective areas based on the operation trajectory RT3 of the welding operation and the acquired welding lines, intersections, contact points, or teaching locations (scanning locations), executes copy (duplication) processing, rotation processing, movement processing, and the like on the scan effective areas, automatically generates each of a plurality of scan effective areas including the acquired welding lines, intersections, contact points, or teaching locations (scanning locations), and arranges them on the 3D model MD3.
- the off-line teaching device 5 can more efficiently create a scanning operation teaching program for scanning the acquired welding line, intersection, contact point, or teaching point (scanning point) with the sensor 4 .
- in addition, the offline teaching device 5 may automatically calculate the length of straight lines, the curvature of curves, and the like of the teaching locations (scanning locations), based on the information on the surface shapes of the original works constituting the work Wk linked to the 3D model MD3, the information on the points at which the original works intersect (intersections) or touch (contact points) each other, and the shape information of the teaching locations (scanning locations) corresponding to each scan effective area. As a result, the off-line teaching device 5 can more efficiently create a teaching program for scanning the acquired welding lines, intersections, contact points, or teaching locations (for example, the taught location WLM3) with the sensor 4.
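- The automatic calculation of a straight taught location's length and a curved taught location's curvature can be sketched with elementary geometry. This is an illustrative assumption (sampled 2-D points, circumscribed-circle curvature), not the patent's stated method:

```python
import math

def polyline_length(points):
    """Total length of a straight-line taught location given as sampled points."""
    return sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def circle_curvature(p1, p2, p3):
    """Curvature (1/radius) of the circle through three 2-D points of a curved
    taught location; returns 0.0 for collinear points."""
    a, b, c = math.dist(p2, p3), math.dist(p1, p3), math.dist(p1, p2)
    # twice the triangle area via the shoelace formula
    area2 = abs((p2[0] - p1[0]) * (p3[1] - p1[1]) - (p3[0] - p1[0]) * (p2[1] - p1[1]))
    return 0.0 if area2 == 0 else 2.0 * area2 / (a * b * c)
```

Three points sampled on a unit circle give a curvature of 1, i.e. a radius of 1, which could then parameterize the scan motion along a curved taught location.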
- as described above, the offline teaching device 5 according to the second embodiment includes: the input/output unit 53 (an example of the input unit) capable of receiving operator operations; the input/output unit 53 or the communication unit 50 (an example of an acquisition unit) that acquires the data of the 3D model MD3 of the workpiece Wk produced by welding (an example of three-dimensional shape data), the welding operation trajectory RT3, and the scan effective range AR0 (an example of a scan range) of the sensor 4 that scans the external shape of the workpiece Wk; the 3D calculation unit 54 (an example of the generation unit) that generates the scan effective areas (for example, the scan effective areas AR21, AR22, and AR23 shown in FIG. 13) to be scanned by the sensor 4, based on the acquired scan effective range AR0 (an example of a three-dimensional area) and the scan sections specified by the operator's operation (for example, the scan sections SR213, SR222, and SR232 shown in FIG. 13); and the scan motion creation unit 552 (an example of a control unit) that arranges at least one scan effective area on the data of the 3D model MD3 of the workpiece and, based on the arranged scan effective area and the welding operation trajectory RT3, creates and outputs a teaching program for causing the welding robot MC1 that performs welding to scan the scan effective area.
- as a result, the offline teaching device 5 can arrange, on the 3D model MD3, the scan effective areas corresponding to the position and orientation of the sensor 4 at a specified position, based on the position of the welding robot MC1 associated with the motion trajectory RT3 of the welding robot MC1, and can create a scan motion for each of the arranged scan effective areas. Therefore, the off-line teaching device 5 can more efficiently create a teaching program for the scanning motion to be executed by the welding robot MC1 using the created scan motions, and can further improve the positional accuracy between the location scanned during operation based on the teaching program and the teaching location (that is, the scan effective area). Therefore, the offline teaching device 5 can more efficiently create a scanning operation teaching program that reduces the load required for teaching work such as correction of a teaching location (scan effective area).
- in addition, the scan motion creation unit 552 of the offline teaching device 5 creates the teaching program based on the arranged scan effective areas, the welding motion trajectory RT3, and the operation information (for example, approach information, retraction information, avoidance information, etc.) of the welding robot MC1 that performs welding linked to the data of the 3D model MD3.
- the offline teaching device 5 in the second embodiment can create scan motions for the arranged scan effective areas based on the welding motion trajectory RT3 and the motion information of the welding robot MC1. Therefore, the offline teaching device 5 can more efficiently create a teaching program for the scan motion to be executed by the welding robot MC1 using the created scan motions.
- the scan motion creation unit 552 of the offline teaching device 5 according to the second embodiment creates, based on the motion information, various motions of the welding robot MC1 with respect to the workpiece (for example, an approach motion, a retraction motion, an avoidance motion, etc.) and a scan motion for each scan effective area to be executed by the welding robot MC1, and creates a teaching program by associating the scan motion corresponding to each of the created scan effective areas with the various motions.
- the offline teaching device 5 according to the second embodiment can create a teaching program for the scan motions for the workpiece Wk based on the created various motions of the welding robot MC1 and the scan motions created for each scan effective area.
- the scan motion creation unit 552 of the offline teaching device 5 accepts, by the operator's operation, designation of the shape of the scan location (that is, the taught location) to be scanned in each scan effective area (for example, a linear shape, a curved shape, or a shape including straight lines and curves), and creates and outputs a teaching program for causing the welding robot MC1 to scan the scan effective areas based on the shape of each scan location in the designated scan effective area, the arranged scan effective areas, and the welding motion trajectory RT3.
- the offline teaching device 5 can thus automatically create a teaching program for a scan motion suited to the shape of the scan location (taught location) designated in each scan effective area.
- the 3D calculation unit 54 of the offline teaching device 5 in Embodiment 2 duplicates and arranges a scan effective area (for example, the scan effective area AR31 shown in FIG. 15) based on the operator's operation.
- the scan motion creation unit 552 creates a teaching program for scanning the scan effective areas (for example, the scan effective areas AR31 to AR35 shown in FIG. 17) based on at least one scan effective area among all the scan effective areas including the duplicated scan effective areas (for example, the scan effective areas AR32 to AR35 shown in FIGS. 15 to 17) and the welding motion trajectory RT3.
- by making each of the scan effective areas indicating taught points duplicable, the offline teaching device 5 can suppress positional deviation between the virtually generated scan effective area displayed superimposed on the 3D model MD3 and the scan effective area that can actually be scanned by the sensor 4. Therefore, the offline teaching device 5 can more efficiently create a scan-motion teaching program that further improves the positional accuracy between the location scanned during operation and the taught location (that is, the scan effective area).
- the 3D calculation unit 54 of the offline teaching device 5 accepts designation, by the operator's operation, of one point on the scan effective area (one of the plurality of rotation reference points RP shown in FIG. 12) and an amount of rotation about that point as the rotation center, and rotates the scan effective area accordingly.
- the scan motion creation unit 552 creates a teaching program for scanning the scan effective areas based on at least one scan effective area among all the scan effective areas (for example, the scan effective areas AR31 to AR35 shown in FIG. 17) including the rotated scan effective area (for example, the scan effective area AR34 shown in FIG. 16) and the welding motion trajectory RT3.
- the offline teaching device 5 according to the second embodiment can generate a scan effective area more suitable for the taught point by making each of the scan effective areas indicating the taught point rotatable. Therefore, the offline teaching device 5 according to the second embodiment can more efficiently create a teaching program for scanning operations more suitable for the workpiece Wk scanned by the sensor 4 based on the generated scan effective area.
- the 3D calculation unit 54 of the offline teaching device 5 moves the position of a scan effective area (for example, the scan effective area AR34 shown in FIG. 16) based on the movement amount designated by the operator's operation.
- the scan motion creation unit 552 creates a teaching program for scanning the scan effective areas based on at least one scan effective area among all the scan effective areas (for example, the scan effective areas AR31 to AR35 shown in FIG. 17) including the moved scan effective area (for example, the scan effective area AR34 shown in FIG. 16) and the welding motion trajectory RT3.
- the offline teaching device 5 according to the second embodiment can generate the scan effective area at a position more suitable for the teaching point by enabling each of the scan effective areas indicating the teaching point to be moved. Therefore, the offline teaching device 5 according to the second embodiment can more efficiently create a teaching program for scanning operations more suitable for the workpiece Wk scanned by the sensor 4 based on the generated scan effective area.
- the offline teaching device 5 in the second embodiment includes one or more computers communicably connected to the input device UI3 operated by the operator.
- the operator inputs into the computer the data of the 3D model MD3 (an example of three-dimensional shape data) of the workpiece Wk produced by welding, the scan sections for scanning the external shape of the workpiece Wk (for example, the scan sections SR213, SR222, and SR232 shown in FIG. 13), and the shape of the scan location to be scanned in each scan section (for example, a linear shape, a curved shape, or a shape including straight lines and curves).
- the offline teaching device 5 creates a teaching program for causing the welding robot MC1 that performs welding to scan a scan effective area (an example of a three-dimensional area) based on the shape of the scan location.
- the offline teaching device 5 in the second embodiment can automatically create a teaching program for scan motions by acquiring the data of the 3D model MD3 of the workpiece Wk, the scan sections for scanning the external shape of the workpiece Wk, and the shape of the scan location scanned in each scan section.
- the present disclosure is useful as an offline teaching device and offline teaching method for more efficiently creating a teaching program for sensor scanning operations executed by a welding robot.
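Taken end to end, the workflow summarized above — take in a 3D model of the workpiece, scan sections, and scan-location shapes, then emit a scan-motion teaching program — can be caricatured in a few lines. This is purely an illustrative sketch and not part of the disclosure: the step vocabulary (`APPROACH`/`SCAN`/`RETRACT`), the function name, and the program format are invented here for demonstration.

```python
def create_teaching_program(scan_sections):
    """Emit an approach -> scan -> retract step sequence for each scan
    section, a minimal stand-in for a scan-motion teaching program."""
    program = []
    for section_id, shape in scan_sections:
        program.append(f"APPROACH {section_id}")
        program.append(f"SCAN {section_id} SHAPE={shape}")
        program.append(f"RETRACT {section_id}")
    return program

# Scan sections SR213 and SR222 with linear / curved scan-location shapes
prog = create_teaching_program([("SR213", "line"), ("SR222", "curve")])
```

Each scan section contributes three steps, so real teaching programs would interleave these with the welding motions described in the embodiments.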
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Geometry (AREA)
- Plasma & Fusion (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
Description
As in Patent Document 1, device configurations that can construct a virtual production facility using an offline teaching device have been known. By simultaneously displaying some of the position detection commands and some of the welding commands corresponding to the motion trajectory of the welding robot, such an offline teaching device makes it easy for the operator to identify the portions to edit when creating a teaching program, and can help improve both the efficiency and the accuracy of program creation.
FIG. 1 is a schematic diagram showing a system configuration example of the welding system 100 according to Embodiment 1. The welding system 100 includes a host device 1 connected to an external storage ST, an input interface UI1, and a monitor MN1, as well as a robot control device 2, an inspection control device 3, a sensor 4, an offline teaching device 5, a monitor MN3, an input device UI3, a welding robot MC1, and a monitor MN2. Although the sensor 4 is illustrated in FIG. 1 as separate from the welding robot MC1, it may be provided integrally with the welding robot MC1 (see FIG. 2). The monitor MN2 is not an essential component and may be omitted.
The copy processing of scan effective areas in Embodiment 1 will be described with reference to FIG. 6. FIG. 6 is a diagram explaining an example of copy processing of the scan effective areas AR11 and AR12 in Embodiment 1.
Next, the deletion processing of a scan effective area in Embodiment 1 will be described with reference to FIGS. 7 and 8. FIG. 7 is a diagram explaining deletion processing example 1 of the scan effective area AR12 in Embodiment 1. FIG. 8 is a diagram explaining deletion processing example 2 of the scan effective area AR17 in Embodiment 1. Needless to say, the deletion processing examples shown in FIGS. 7 and 8 are merely examples and are not limiting.
Next, the division processing of a scan effective area in Embodiment 1 will be described with reference to FIG. 9. FIG. 9 is a diagram explaining an example of division processing of the scan effective area AR15 in Embodiment 1. Needless to say, the division processing example shown in FIG. 9 is merely an example and is not limiting.
The offline teaching device 5 in Embodiment 1 was described using an example in which it accepts an operator operation via the input device UI3 and, based on the accepted operation, executes editing such as duplication (copying), deletion, or division of scan effective areas having the same scan section, and creates a new scan-motion teaching program for causing the sensor 4 to scan each of the one or more edited scan effective areas. For the offline teaching device 5 in Embodiment 2, an example will be described in which it accepts an operator operation via the input device UI3 and, based on the accepted operation, executes editing of the scan section, rotation angle, and position (placement) of each scan effective area, and creates a new scan-motion teaching program for causing the sensor 4 to scan each of the one or more edited scan effective areas.
The scan effective area AR2 shown in FIG. 12 has a scan section SR2 along the advancing direction (X direction). For this scan effective area AR2, the offline teaching device 5 accepts an operator operation for moving the scan section of the scan effective area AR2 in the X, Y, or Z direction. The offline teaching device 5 accepts the operator operation via the input device UI3 and changes the position of the scan effective area AR2 based on the accepted operation (specifically, the amount of movement in one of the directions).
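The position edit described here amounts to translating the scan effective area by an operator-specified offset along one axis. A minimal sketch, outside the patent text — the `ScanArea` class and its `center` field are assumptions made for the illustration:

```python
from dataclasses import dataclass

@dataclass
class ScanArea:
    # Center of the scan effective area on the 3D model (X, Y, Z)
    center: tuple

def move_scan_area(area, dx=0.0, dy=0.0, dz=0.0):
    """Translate the scan effective area by the operator-specified
    movement amounts along the X, Y, or Z direction."""
    x, y, z = area.center
    return ScanArea(center=(x + dx, y + dy, z + dz))

ar2 = ScanArea(center=(10.0, 0.0, 5.0))
moved = move_scan_area(ar2, dx=3.0)  # move along the advancing (X) direction
```

Returning a new `ScanArea` rather than mutating in place mirrors the device's behavior of keeping the original area until the edit is confirmed, though the disclosure does not prescribe either choice.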
The offline teaching device 5 generates an image in which each of the plurality of rotation reference points RP is superimposed on the 3D model of the scan effective area AR2, and transmits it to the monitor MN3 for display. In FIG. 12, reference signs for all of the rotation reference points RP are omitted. Also, although the example shown in FIG. 12 has 16 rotation reference points RP, the positions and the number of rotation reference points for rotating the scan effective area AR2 are not limited to this.
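Rotating a scan effective area about a selected rotation reference point RP is an ordinary rigid-body rotation about a pivot. The following 2D sketch is illustrative only; the function name and the vertex-list representation of the area are assumptions, not the patent's implementation:

```python
import math

def rotate_about_point(vertices, pivot, angle_deg):
    """Rotate the vertices of a scan effective area about one rotation
    reference point (pivot) by the operator-specified rotation amount."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    px, py = pivot
    return [(px + c * (x - px) - s * (y - py),
             py + s * (x - px) + c * (y - py))
            for x, y in vertices]

# Rotate a rectangular scan area 90 degrees about its corner at (0, 0)
corners = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]
rotated = rotate_about_point(corners, pivot=(0.0, 0.0), angle_deg=90)
```

The pivot corresponds to whichever rotation reference point RP the operator selects; in 3D the same idea applies with a rotation axis through that point.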
The scan effective area AR2 shown in FIG. 13 has a scan section SR2 along the advancing direction (X direction). For this scan effective area AR2, the offline teaching device 5 accepts an operator operation for changing the scan effective area AR2 by extending its scan section in the X, Y, or Z direction. The offline teaching device 5 accepts the operator operation via the input device UI3 and changes the length (scan section) of the scan effective area AR2 based on the accepted operation (specifically, an extension in the X direction).
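Extending the scan section changes only the area's length along the chosen axis. As a hedged illustration (the interval representation of a scan section is an assumption of this sketch):

```python
def extend_scan_section(start, end, extension):
    """Extend a scan section, given as (start, end) coordinates along the
    advancing (X) direction, by the operator-specified amount."""
    return (start, end + extension)

sr2 = (0.0, 50.0)
extended = extend_scan_section(*sr2, extension=20.0)  # lengthen along X
```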
2 Robot control device
3 Inspection control device
4 Sensor
5 Offline teaching device
10, 20, 30, 50 Communication unit
11, 21, 31, 51 Processor
12, 22, 32, 52 Memory
53 Input/output unit
54 3D calculation unit
55 Program creation unit
551 Welding motion creation unit
552 Scan motion creation unit
100 Welding system
200 Manipulator
300 Wire feeding device
301 Welding wire
400 Welding torch
500 Power supply device
AR0 Scan effective range
AR11, AR12, AR13, AR14, AR15, AR151, AR152, AR21, AR22, AR23, AR31, AR32, AR33, AR34, AR35 Scan effective area
MC1 Welding robot
MD1, MD2, MD3 3D model
MN1, MN2, MN3 Monitor
RP Rotation reference point
SR1, SR2, SR213, SR222, SR232 Scan section
UI3 Input device
WLM11, WLM12, WLM21, WLM22 Welding line
Wk Workpiece
Claims (9)
- An input unit capable of accepting an operator operation;
an acquisition unit that acquires three-dimensional shape data of a workpiece produced by welding, a motion trajectory of the welding, and a scan range of a sensor that scans an external shape of the workpiece;
a generation unit that generates a three-dimensional area to be scanned by the sensor based on the acquired scan range and a scan section; and
a control unit that arranges at least one of the three-dimensional areas on the three-dimensional shape data of the workpiece based on the operator operation input to the input unit, and creates and outputs, based on the arranged three-dimensional area and the motion trajectory of the welding, a teaching program for causing a welding robot that performs the welding to scan the three-dimensional area,
An offline teaching device. - The control unit creates the teaching program based on the arranged three-dimensional area, the motion trajectory of the welding, and motion information of the welding robot that performs the welding linked to the three-dimensional shape data,
The offline teaching device according to claim 1. - The control unit creates, based on the motion information, various motions of the welding robot with respect to the workpiece and a scan motion for each three-dimensional area executed by the welding robot, and creates the teaching program by associating the scan motion corresponding to each of the created three-dimensional areas with the various motions,
The offline teaching device according to claim 2. - The control unit extracts a welding line of the welding linked to the three-dimensional shape data, and creates and outputs the teaching program in which the welding line included in the three-dimensional area is set as a scan location of the sensor,
The offline teaching device according to claim 1. - The generation unit duplicates and arranges the three-dimensional area based on the operator operation, and
the control unit creates a teaching program for scanning the three-dimensional area based on at least one three-dimensional area among all three-dimensional areas including the duplicated three-dimensional area and the motion trajectory of the welding,
The offline teaching device according to claim 1. - The generation unit deletes one of the two or more generated three-dimensional areas based on the operator operation, and
the control unit creates a teaching program for scanning the three-dimensional area based on at least one three-dimensional area among all three-dimensional areas remaining after the deleted three-dimensional area is excluded and the motion trajectory of the welding,
The offline teaching device according to claim 1. - The generation unit divides the three-dimensional area based on the operator operation and arranges the plurality of divided three-dimensional areas, and
the control unit creates a teaching program for scanning the three-dimensional area based on at least one three-dimensional area among all three-dimensional areas including the plurality of divided three-dimensional areas and the motion trajectory of the welding,
The offline teaching device according to claim 1. - An offline teaching method performed by an offline teaching device including one or more computers communicably connected to an input device capable of accepting an operator operation, the method comprising:
acquiring three-dimensional shape data of a workpiece produced by welding, a motion trajectory of the welding, and a scan range of a sensor that scans an external shape of the workpiece;
generating a three-dimensional area to be scanned by the sensor based on the acquired scan range and a scan section;
arranging at least one of the three-dimensional areas on the three-dimensional shape data of the workpiece based on the operator operation acquired from the input device; and
creating and outputting, based on the arranged three-dimensional area and the motion trajectory of the welding, a teaching program for causing a welding robot that performs the welding to scan the three-dimensional area,
An offline teaching method. - An offline teaching method performed by an operator operating an input device and using an offline teaching device including one or more computers communicably connected to the input device, the method comprising:
inputting three-dimensional shape data of a workpiece produced by welding into the computer;
inputting into the computer a scan section for scanning an external shape of the workpiece; and
creating a teaching program for causing a welding robot that performs the welding to scan a three-dimensional area based on a scan location corresponding to the scan section in the three-dimensional shape data,
An offline teaching method.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22828496.4A EP4360826A4 (en) | 2021-06-23 | 2022-06-23 | OFFLINE LEARNING DEVICE AND OFFLINE LEARNING METHOD |
CN202280044292.1A CN117580688A (zh) | 2021-06-23 | 2022-06-23 | 离线示教装置和离线示教方法 |
JP2023530117A JPWO2022270580A1 (ja) | 2021-06-23 | 2022-06-23 | |
US18/394,121 US20240123625A1 (en) | 2021-06-23 | 2023-12-22 | Offline teaching device and offline teaching method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021104354 | 2021-06-23 | ||
JP2021-104354 | 2021-06-23 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/394,121 Continuation US20240123625A1 (en) | 2021-06-23 | 2023-12-22 | Offline teaching device and offline teaching method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022270580A1 (ja) | 2022-12-29 |
Family
ID=84544418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/025097 WO2022270580A1 (ja) | 2021-06-23 | 2022-06-23 | オフライン教示装置およびオフライン教示方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240123625A1 (ja) |
EP (1) | EP4360826A4 (ja) |
JP (1) | JPWO2022270580A1 (ja) |
CN (1) | CN117580688A (ja) |
WO (1) | WO2022270580A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024156002A1 (en) * | 2023-01-20 | 2024-07-25 | Path Robotics, Inc. | Scan planning and scan operations for welding an object |
WO2024162338A1 (ja) * | 2023-01-31 | 2024-08-08 | リンクウィズ株式会社 | システム、プログラム、及び製造方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006247677A (ja) * | 2005-03-09 | 2006-09-21 | Fanuc Ltd | レーザ溶接教示装置及び方法 |
JP2007098464A (ja) * | 2005-10-07 | 2007-04-19 | Nissan Motor Co Ltd | レーザー加工ロボット制御装置、レーザー加工ロボット制御方法およびレーザー加工ロボット制御プログラム |
WO2016021130A1 (ja) | 2014-08-05 | 2016-02-11 | パナソニックIpマネジメント株式会社 | オフラインティーチング装置 |
JP2021104354A (ja) | 2017-09-15 | 2021-07-26 | 株式会社三洋物産 | 遊技機 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10065318B2 (en) * | 2014-09-15 | 2018-09-04 | The Boeing Company | Methods and systems of repairing a structure |
KR20180103467A (ko) * | 2017-03-10 | 2018-09-19 | 아진산업(주) | 3차원 카메라를 이용한 용접 로봇 끝단의 위치 추정 시스템 및 방법 |
JP7245603B2 (ja) * | 2017-11-10 | 2023-03-24 | 株式会社安川電機 | 教示装置、ロボットシステムおよび教示方法 |
- 2022
- 2022-06-23 EP EP22828496.4A patent/EP4360826A4/en active Pending
- 2022-06-23 CN CN202280044292.1A patent/CN117580688A/zh active Pending
- 2022-06-23 WO PCT/JP2022/025097 patent/WO2022270580A1/ja active Application Filing
- 2022-06-23 JP JP2023530117A patent/JPWO2022270580A1/ja active Pending
- 2023
- 2023-12-22 US US18/394,121 patent/US20240123625A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006247677A (ja) * | 2005-03-09 | 2006-09-21 | Fanuc Ltd | レーザ溶接教示装置及び方法 |
JP2007098464A (ja) * | 2005-10-07 | 2007-04-19 | Nissan Motor Co Ltd | レーザー加工ロボット制御装置、レーザー加工ロボット制御方法およびレーザー加工ロボット制御プログラム |
WO2016021130A1 (ja) | 2014-08-05 | 2016-02-11 | パナソニックIpマネジメント株式会社 | オフラインティーチング装置 |
JP2021104354A (ja) | 2017-09-15 | 2021-07-26 | 株式会社三洋物産 | 遊技機 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024156002A1 (en) * | 2023-01-20 | 2024-07-25 | Path Robotics, Inc. | Scan planning and scan operations for welding an object |
WO2024162338A1 (ja) * | 2023-01-31 | 2024-08-08 | リンクウィズ株式会社 | システム、プログラム、及び製造方法 |
Also Published As
Publication number | Publication date |
---|---|
EP4360826A1 (en) | 2024-05-01 |
EP4360826A4 (en) | 2024-10-23 |
JPWO2022270580A1 (ja) | 2022-12-29 |
US20240123625A1 (en) | 2024-04-18 |
CN117580688A (zh) | 2024-02-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2022270580A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
JP7369982B2 (ja) | リペア溶接装置およびリペア溶接方法 | |
JP7422337B2 (ja) | リペア溶接制御装置およびリペア溶接制御方法 | |
WO2020262049A1 (ja) | リペア溶接制御装置およびリペア溶接制御方法 | |
JP7289087B2 (ja) | リペア溶接装置およびリペア溶接方法 | |
WO2022270579A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
US20220410323A1 (en) | Bead appearance inspection device, bead appearance inspection method, program, and bead appearance inspection system | |
JP6990869B1 (ja) | 外観検査方法および外観検査装置 | |
JP7555041B2 (ja) | ビード外観検査装置、ビード外観検査方法、プログラムおよびビード外観検査システム | |
WO2023105980A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
WO2023105977A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
WO2023105978A1 (ja) | オフライン教示装置 | |
WO2022270578A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
WO2023105979A1 (ja) | オフライン教示装置およびオフライン教示方法 | |
JP2021139771A (ja) | 制御装置、表示装置の制御方法およびプログラム | |
WO2023199620A1 (ja) | ロボット制御装置およびオフライン教示システム | |
JP7365623B1 (ja) | オフライン教示装置およびオフライン教示システム | |
JP2021062441A (ja) | リペア溶接装置およびリペア溶接方法 | |
CN118541236A (zh) | 离线示教装置 | |
WO2021177361A1 (ja) | ビード外観検査装置およびビード外観検査システム | |
WO2022091543A1 (ja) | リペア溶接区間検出方法およびリペア溶接区間検出装置 | |
JP2021137848A (ja) | ビード外観検査装置およびビード外観検査システム | |
JP2021137849A (ja) | ビード外観検査装置、ビード外観検査方法、ビード外観検査プログラムおよびビード外観検査システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22828496 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2023530117 Country of ref document: JP |
WWE | Wipo information: entry into national phase |
Ref document number: 202280044292.1 Country of ref document: CN |
WWE | Wipo information: entry into national phase |
Ref document number: 202347087978 Country of ref document: IN |
WWE | Wipo information: entry into national phase |
Ref document number: 2022828496 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2022828496 Country of ref document: EP Effective date: 20240123 |